
A Methodology to Address the Gap Between Calculated and Actual Energy Performance in Deep Renovations of Offices and Hotels

  • Conference paper
Improving Energy Efficiency in Commercial Buildings and Smart Communities

Part of the book series: Springer Proceedings in Energy ((SPE))


Abstract

A European Commission-funded Horizon 2020 project named ALDREN (ALliance for Deep RENovation in buildings, https://aldren.eu/) aims to establish the business case for deep renovation. The 30-month programme, which started in November 2017, intends to encourage investment and accelerate the movement towards a nearly zero energy non-residential building stock across the EU by 2050, as targeted to meet Paris Agreement commitments. The backbone of ALDREN is the EVCS (European common Voluntary Certification Scheme) (Ribeiro Serrenho, T., Rivas Calvete, S., & Bertoldi, P., Cost-benefit analysis of the EVCS implementation, EUR Scientific and Technical Research Reports, 2017), which will be used to track the deep renovation process. This paper describes the processes and tools being developed to close the gap between calculated and measured energy performance (EP):

  1. A framework allowing measured (operational) performance to be compared with predicted (design) performance across all the countries in the ALDREN consortium, using a harmonised approach and common language fed by a glossary of terms.

  2. A “design for measurability” protocol that tracks the actions required during the deep renovation process, to ensure that performance predictions are as realistic as possible, that the construction and commissioning process is true to the design intent, and that the predicted performance can be verified through measurements.

  3. A performance verification tool, which allows the predicted and actual (measured) performance to be compared at different levels of granularity.

The paper concludes that nearly zero energy performance targets can become measured outcomes when driven by client leadership and wider team buy-in, and when the power of advanced simulation of HVAC systems is used to optimise design and ensure operation is aligned with the design intent.
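
To illustrate the kind of comparison the verification tool is intended to support, the sketch below compares predicted and measured energy intensities at end-use and whole-building granularity. It is a minimal illustration only: the end-use names, the figures and the percentage-gap metric are assumptions for this example, not the ALDREN tool itself.

```python
# Minimal sketch of a predicted-vs-measured comparison at two levels of
# granularity (whole building and individual end use). All names and figures
# below are illustrative assumptions, not ALDREN project data.

def gap_percent(predicted: float, measured: float) -> float:
    """Performance gap as a percentage of the predicted value."""
    return 100.0 * (measured - predicted) / predicted

def compare(predicted: dict, measured: dict) -> None:
    # End-use level
    for end_use, pred in predicted.items():
        meas = measured[end_use]
        print(f"{end_use:<12} predicted {pred:5.1f}  measured {meas:5.1f}  "
              f"gap {gap_percent(pred, meas):+5.1f}%  (kWh/m2/year)")
    # Whole-building level
    total = gap_percent(sum(predicted.values()), sum(measured.values()))
    print(f"{'whole bldg':<12} gap {total:+5.1f}%")

predicted = {"heating": 35.0, "cooling": 20.0, "fans/pumps": 15.0, "lighting": 18.0}
measured = {"heating": 42.0, "cooling": 27.0, "fans/pumps": 21.0, "lighting": 17.0}
compare(predicted, measured)
```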


Notes

  1. Adrian Joyce, Secretary General, EuroACE, presentation to ALDREN partners meeting, Brussels, 10 April 2018.

  2. ALDREN will also be addressing indoor environmental quality, but this is not covered in the present paper.

  3. Base building energy ratings and the NABERS star rating scale are described in detail in Appendix 1.

  4. We note that some MEP consulting engineers in Australia insist on minimum tenant ratings also being achieved (e.g. 1 star) before signing up to stretching base building targets, to give themselves perceived protection against excessive tenant energy intensity affecting the efficiency of the base building services. Modelling studies demonstrate this to be unnecessary (although truly agile working does sweat the system), but the peace of mind it affords is understandable.

  5. The NABERS base building rating defines energy efficiency using the principle that a building should receive no benchmark “allowance” from lettable space for any period it is unlet.

  6. Base building energy covers the following energy end uses; sub-meters should be provided to measure the energy consumed by fuel type in supplying each of these building central services:

    • Heating, domestic hot water, cooling and ventilation, for example, to a BCO Guide specification.

    • Common-area lighting and power (including lift lobbies, plant rooms and common-area toilets).

    • Vertical transportation (e.g. lifts and escalators).

    • Exterior lighting.

    • Exterior signage provided by the building owner for the benefit of office occupiers.

    • Generator fuel where it serves central services.

    • Car park ventilation and lighting, where internal or external car parks within the legal boundaries of the site are provided for occupier use.

    Supplementary HVAC services to a tenant’s energy-intensive areas, including server rooms, dealer rooms and laboratories, should draw their energy from the tenant’s meter, not from the landlord’s HVAC.

  7. Climate change, security of supply and affordability (minimising energy costs).

  8. The underlying logic is that a better rating is associated with a building that has been better designed, better constructed, better commissioned and better operated and maintained.

  9. To calculate total energy use in kWh of “electricity equivalent” (kWhe), kWh of electricity are added to kWh of any fuel multiplied by 0.4 and to kWh of hot or chilled water delivered to the building multiplied by 0.5 (an illustrative calculation is given at the end of these notes). The kWhe metric enables timeless, international comparisons of a building’s energy performance, and allows the intrinsic energy efficiency of a building to be rated independently of local, regional or national grid factors. Furthermore, with electricity usually the dominant energy carrier for commercial offices, kWhe has the enormous merit of using unity as the weighting factor for electricity, so a unit of electricity retains the same value regardless of the building’s location or the period being analysed.

  10. In 2012, BBP commissioned Verco and the UBT to develop the LER, a NABERS-style energy rating scheme for UK offices. Its application on about 85 buildings exposed challenges with the configuration and sub-metering of existing building services systems. This led BBP to focus on the concept of base building performance agreements for new buildings, where it was potentially possible to design out the obstacles to a harmonised investment-grade rating presented by the variability of engineering services and sub-metering configurations encountered in the existing stock.

References

  1. ALDREN (ALliance for Deep RENovation in buildings). Retrieved from https://aldren.eu/.

  2. Ribeiro Serrenho, T., Rivas Calvete, S., & Bertoldi, P. (2017). Cost-benefit analysis of the Voluntary Common European Union Certification Scheme (EVCS) implementation. EUR - Scientific and Technical Research Reports.


  3. European Parliament. (2016). Energy efficiency of buildings. A nearly zero energy future? http://www.europarl.europa.eu/thinktank/en/document.html?reference=EPRS_BRI(2016)582022

  4. European Union. (2012). Directive 2012/27/EU of the European Parliament and of the Council on Energy Efficiency. Official Journal of the European Union, 315, 1–56.


  5. The Economist. (2013). Investing in energy efficiency in Europe’s buildings. A view from the construction and real estate sectors.


  6. The Renovate Europe campaign. Retrieved from https://www.renovate-europe.eu/.

  7. Tsoutsos, T., Tournaki, S., & Frangou, M. (2016). Nearly zero energy hotels in Europe (NEZEH). Technical University of Crete. Retrieved from www.nezeh.eu.

  8. New South Wales Office of Environment and Heritage. (2019, February). Handbook for estimating NABERS ratings. Version 1.1. Hurstville: New South Wales Office of Environment and Heritage


  9. ASHRAE Standard 90.1. (2016). Energy standard for buildings except low-rise residential buildings. https://www.ashrae.org/technical-resources/bookstore/standard-90-1.

  10. U.S. affiliate of the International Building Performance Simulation Association (IBPSA-USA) and the Illuminating Engineering Society (IES). BEMP – Building Energy Modelling Professional Certification. https://www.ashrae.org/professional-development/ashrae-certification/certification-types/bemp-building-energy-modeling-professional-certification

  11. Mechanical and Electrical Engineering Working Party of National, Regional and Local Authorities (AMEV) for QUANTUM project. (2017). Technical monitoring as an instrument for quality assurance.


  12. Better Buildings Partnership (BBP). (2014, June). Landlord energy rating documentation. Better Buildings Partnership (BBP)


  13. National Australian Built Environment Rating System (NABERS). Retrieved from https://www.nabers.gov.au/.

  14. Australian Building Greenhouse Rating (ABGR). Retrieved from https://www.nabers.gov.au/about/our-story.

  15. New South Wales State government decree. (2004). M2004-04 Greenhouse Performance of Government Office Buildings and Rental Properties.


  16. Blundell, L. (2011, August). NABERS Energy goes to 6 stars as (most of) the industry moves on. The Fifth Estate.


  17. NABERS. (2016, April). Supporting policies and initiatives. Version 0.6.


  18. NABERS. (2018, September 30). Annual report 2017–18 (p. 2). Version 1.


  19. Bordass, W. T., Cohen, R. R., & Bannister, P. (2016). Design for performance: UK Commitment Agreements: Making measured energy in-use the objective for new office buildings. Feasibility Study Final Report. Better Buildings Partnership (BBP).


  20. Harrison, S. (2017, June). NABERS and mandatory disclosure. Presentation to UCL/CIBSE NABERS workshop, London.


  21. Australian Government. (2010). Building Energy Efficiency Disclosure Act No. 67 2010, to promote the disclosure of information about the energy efficiency of buildings, and for related purposes.


  22. IPD/Department of Industry. (2013, December). NABERS Energy Office Market Analysis, Figure 16. IPD/Department of Industry.


  23. The Property Council/IPD Australia, Green Property Index (2015, March). https://www.msci.com/documents/1296102/1672377/MSCI_AU+Green+snapshot+Flyer.pdf/e2548b3f-6809-4732-bd4d-281483e81256

  24. UK government, Energy Performance Certificates for your business premises. Retrieved from https://www.gov.uk/energy-performance-certificate-commercial-property.

  25. HM Government. (2006). The Building Regulations 2000 Approved Document L2A: Conservation of fuel and power in new buildings other than dwellings (2006 Edition). London: HM Government.


  26. HM Government. (2002). The Building Regulations 2000 Approved Document L2: Conservation of fuel and power in buildings other than dwellings (2002 Edition). London: HM Government.


  27. HM Government. (1984). Building Act 1984. Retrieved from http://www.legislation.gov.uk/ukpga/1984/55.

  28. Ratcliffe, S. (2016, November). Performance targeting & measurement is essential for effective energy efficiency policy & driving market transformation. London: CIBSE Building Performance Conference.


  29. ODPM. (2004). Proposals for amending Part L of the Building Regulations and Implementing the EPBD. Consultation Document.


  30. HM Government. (2007). The energy performance of buildings (certificates and inspections) (England and Wales) regulations 2007. London: HM Government.


  31. Australian Government. What is a BEEC? Retrieved from http://cbd.gov.au/get-and-use-a-rating/what-is-a-beec.

  32. Waring, G., & Bordass, W. T. (2013, September). LER Phase 2: Case studies. Report to the Better Buildings Partnership.


  33. Austin, B. (2013). The performance gap—Causes and solutions. Green Construction Board—Buildings Working Group.


  34. UK Green Building Council (UKGBC). (2016, May). Delivering building performance.



Acknowledgements

The ALDREN project has received funding from the European Union’s Horizon 2020 research and innovation programme under grant agreement No 754159.

Author information


Corresponding author

Correspondence to Robert Cohen.


Appendices

Appendix 1: What Has Been Achieved in Australia?

Some 15 years ago in Australia, “base building” energy ratings (Footnote 6) had started to influence investment decisions for existing and new buildings, sales and purchases. The scheme that measured and verified this base building performance was called the National Australian Built Environment Rating System or NABERS [13]. Some of the key steps have been:

  • 1999: New South Wales introduced a voluntary system (the Australian Building Greenhouse Rating, ABGR) to measure and benchmark the energy use of existing office buildings. This developed into the national NABERS scheme [14].

  • 2002: Commitment Agreements were conceived for developers to ensure new offices could operate at their target energy performance levels and enable occupiers to sign up to pre-lets for space with the in-use energy performance they wanted.

  • 2004: State governments started to set minimum standards for space they occupied. New South Wales took the lead in March 2004, decreeing that its existing owned buildings and tenancies had to be rated by the year end, should attain a 3 star base building rating (on a 1 to 5 star scale, with the empirical median performance at 2.5 stars) by July 2006, and that new leases should require 3.5 stars from 2006 [15]. They also required 4 stars for major upgrades and 4.5 stars for new buildings. Other States gradually introduced their own minimum standards.

  • 2006: the Federal Commonwealth (Australian) Government mandated 4.5 star base buildings for new buildings, major refurbishments and new leases over 2000 m2. Most States have since ratcheted up their requirements to the 4.5 star level for all their stock over 2000 m2. In the same year, the Property Council of Australia introduced minimum NABERS base building energy ratings into their definitions of new offices: 4.5 stars for grade A, 4 stars for grade B.

  • 2010: the federal government introduced the Building Energy Efficiency Disclosure Act, to mandate disclosure of Base Building ratings on sale or let of office premises over 2000 m2 NLA.

  • 2011: NABERS extended the top of their scale to 6 stars, stating 5 stars represented excellent performance, and 6 stars market leading [16]. The new 6 star level was set by taking a theoretical 7 star level as zero emissions and applying a 50% reduction in the emissions at 5 stars. Similarly, 5.5 stars is a 25% reduction from the 5 star level.

  • 2012: the energy performance bar for grade A offices was raised: to 5 stars for new buildings and to at least 4 stars for existing buildings [17].

  • 2017: the threshold for mandatory disclosure was reduced from 2000 to 1000 m2 NLA [18].

A feasibility study [19] published by the Better Buildings Partnership (BBP) in May 2016 confirmed that, in the commercial office property market in Australia, better base building operational energy performance has become aligned with investor, developer and occupier interests. Over the last 15 years, this has driven a systemic change in the design, construction and operation of office buildings, with innovation flourishing across the supply chain. As a result, base building services in today’s new buildings in Australia use on average half the energy they did when measurements started in 1998, and the best use one fifth. The nexus of financial and property industry interests has also driven a remarkable uplift in the base building energy performance of the existing stock in Australia (see Fig. 3).

Fig. 3  Growth in rated commercial office floor area and improvement in the existing stock average base building energy rating from 2006 to 2016. (Source: NABERS, OEH [20])

The context for Fig. 3 is that when base building ratings were initiated in 1998 (on a voluntary basis), a scale from 1 to 5 stars was set using empirical data to position the average performance at 2.5 stars. The blue line in Fig. 3 shows that by 2006 the average rating had crept up to 2.7 stars (right hand scale). By then some 3 million m2 of commercial office floor space had a rating (the blue-filled area on the graph and left hand scale).

Progress was boosted in 2007 by an Energy Efficiency in Government Offices policy requiring office buildings leased by the Commonwealth government to be a minimum of 4.5 stars. By 2010, the average rating had climbed to 3.6 stars, a 24% improvement on the 2006 position.

In 2010, policy makers had sufficient confidence in the approach to mandate disclosure for sale or let transactions [21], which widened the empirical data from a voluntary cohort of buildings. Not surprisingly, the overall effect was a reduction in the average rating over the next year to 3.3 stars as poorer performing buildings were obliged to lodge their rating data, as well as those that had been doing so voluntarily.

However, the hiatus in average rating improvement was short-lived and within a couple of years the market average was exceeding its previous record high and indeed grew continuously every year, reaching 4.2 stars by 2016. Over the ten year period from 2006 to 2016, the improvement from 2.7 to 4.2 stars represented a 41% reduction in energy intensity for the whole of the rated stock, which by then had climbed to 16 million m2 of commercial office floor space, an almost complete penetration into the overall market for tenancies over 2000 m2.

In 2017, the mandatory disclosure requirement was extended to tenancies over 1000 m2. It will be interesting to track the impact on the stock average rating once it includes these smaller tenancies which can lack the economies of scale supporting energy management activities in larger buildings. With the allocation of floor space in the market highly skewed to larger buildings, it seems unlikely this step will undermine the upward march of the headline statistic for the overall average rating.

In the context of tackling the energy trilemma (Footnote 7), the scale of these improvements is striking. But the market transformation in Australia is being driven by commercial interest: investors and developers get better yields from better rated buildings because occupiers associate them with better buildings (Footnote 8). Statistics demonstrate that occupiers stay in better rated buildings longer (voids are lower), as shown in Fig. 4a [22]. Occupiers are also willing to pay higher rents for better rated buildings, so income return is higher, as shown in the left hand set of data in Fig. 4b [23]. Better rated buildings also produce stronger capital growth (middle set of data in Fig. 4b).

Fig. 4  (a) Offices with higher NABERS Energy ratings have lower voids (y-axis shows % vacancy rate at end of given period). (Source: NABERS, IPD). (b) Offices with higher NABERS Energy ratings deliver stronger financial returns (y-axis shows % financial return). (Source: The Property Council/IPD Green Property Index, MSCI, March 2015)

Government’s role has been to develop and operate an online public rating and disclosure platform, to create the infrastructure for independent and authoritative ratings to be produced by accredited assessors, and to lead by example by setting minimum ratings for the space it leases. Once the rating had become established in the market, government was moved to make performance disclosure mandatory. It is apparent that technical innovation usually needs policy intervention to extend market take-up beyond early adopters. But the experience in Australia demonstrates how performance transparency can be powerful in driving improvement, both at the top and the bottom of the efficiency scale: there are no mandated minimum energy standards.

Appendix 2: How Does the UK Compare?

By contrast with Australia, sale and let transactions in the UK are informed by an Energy Performance Certificate (EPC) [24], a theoretical calculation which rates a building’s energy efficiency on a scale from A (most efficient) to G but does not reflect real performance, and so gives limited insight to decision makers. Full compliance with Building Regulations Part L2A [25] does support a direction of travel which should make it possible to measure the performance outcomes for all the energy uses regulated by Part L2, using sub-metering which has been mandated for new buildings since 2002 [26]. However, there is no requirement, nor a pervading culture, for a comparison to be made between the measured outcomes and the predictions made at the design stage, let alone for this to be disclosed to stakeholders. In many respects, this is a perverse situation: why is there no guidance suggesting this would be a useful purpose for the metering system? The absence of such a culture (or requirement) means that this comparison is almost never made. It certainly prevents policy makers from obtaining the evidence base for the ratcheting up of Part L2 requirements that has occurred roughly every 5 years since energy efficiency regulations were first introduced for commercial buildings (offices and shops) in 1985 [27]. And it prevents a light being shone on the notorious performance gap between the predicted and measured values for regulated energy end uses.

This failure to tackle the performance gap using evidence that could be collected from equipment installed to comply with regulations is especially stark in a building with a single occupier, where all sub-meter data can reasonably be expected to be collected at a single central point.

In multi-let buildings, the Part L2 metering requirements are less well aligned with the objective of quantifying the energy performance gap. Individual tenants might not install their own sub-metering system for their own energy use. But even if they did, this data on energy end use breakdown would not normally be made available to a landlord, making it difficult to aggregate whole building energy use for each category of regulated loads and creating a barrier for making a comparison with the predictions at the design stage.

Unlike in Australia, the UK does not have a mentality of designing for measurability. The best empirical evidence available for commercial multi-let buildings is collected by the BBP from its members. This data enables a comparison to be made between metered whole building energy intensity and the building’s EPC grade (see Fig. 5). The data suggest a limited correlation: the median energy intensity values do get better (lower) as the grade improves, but there is so much variability that this marginal trend is of limited statistical significance.

Fig. 5  Comparing whole building energy intensity for buildings with different EPC grades. (Source: Sarah Ratcliffe (BBP) at the CIBSE Building Performance Conference, London, 17 November 2016 [28])

The UK’s non-residential building construction supply industry is notoriously fragmented, a situation often cited as a cause of poor energy performance outcomes. It is true that responsibility for energy efficiency is often passed from the building’s MEP designers to the appointed Design and Build contractor. And once the shell-and-core is completed, often with a placeholder Category A fit-out, it is then handed over again to a whole new set of businesses to deliver the desired Category B fit-out for each tenant. However, new building procurement in the Australian market is not materially different in these respects, and yet, because the energy performance outcome is a critical KPI for the developer in Australia, the baton is not dropped at each handover point in the energy efficiency ‘relay’.

It is material to note that both jurisdictions share the aim to provide the market with relevant information about the energy performance of a building at the moment of a property transaction, when the data can inform buying and letting decisions. The arrangements in the UK arose from the implementation of the European Energy Performance of Buildings Directive [29], whilst those in Australia evolved from experience of applying the initially voluntary NABERS scheme. However, their approaches to the same end could not be more different: the UK’s EPC [30] and the Australian Building Energy Efficiency Certificate (BEEC) [31]—see Fig. 6.

Fig. 6  Comparing how the markets in the UK and Australia are informed about building energy performance at the moment of a sale or lease property transaction

The alternative representations of a building’s energy efficiency in each country for the purpose of market transparency (theoretical “asset rating” EPC vs. measured “operational rating” BEEC) give rise to the idea of considering the different approaches as if they were medicines being tested in a blind medical trial to treat a disease. After at least 10 years of each jurisdiction applying its different “medicine”, how have the two respective cohorts of patients (buildings) responded to the treatment they received? The 2016 feasibility study published by the BBP delved into the data to determine what, if any, differences there were in outcomes in the UK and Australia.

To make the comparison on a like-for-like basis, the energy performance of buildings in London and Melbourne was plotted on the same graph, where the x-axis is the NABERS 1 to 6 star scale and the y-axis is the measured base building energy intensity (see Fig. 7). Although there are significant differences between Melbourne’s climate and London’s, this factor would not be enough to drive dramatic variances in annual energy intensity. For much of a typical year, the weather in London and Melbourne is similar. Melbourne tends to have much hotter peak summer months, requiring more cooling energy, but this is compensated by milder peak winter months, requiring less heating energy.

Fig. 7  Base building performance of new offices in Melbourne and London compared

The black line on the graph in Fig. 7 shows the relationship between base building energy intensity, measured in units of kWh of electricity equivalent (kWhe, Footnote 9) per m2 of net lettable area per year, and the 1 to 6 star NABERS scale for the State of Victoria, of which Melbourne is the capital. The scale is linear from 1 to 5 stars with a 38 kWhe/m2 NLA bandwidth. Base building energy must be <204 kWhe/m2 to get on the scale with a 1 star rating. 5 stars is at 52 kWhe/m2 NLA. 6 stars is at 26 kWhe/m2 NLA, that is, half-way from 5 stars to net zero. Half stars are available between the integer values; official ratings are rounded down to the nearest half star.
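
For concreteness, the Victoria scale described above can be sketched as a simple mapping from base building intensity to a star rating. This is a minimal sketch using only the figures quoted in this paragraph; the treatment of the exact 204 kWhe/m2 boundary and of intensities below 26 kWhe/m2 is an assumption, and the official NABERS methodology involves further normalisations not shown here.

```python
# Minimal sketch of the Victoria base building scale quoted above:
# linear from 1 star (204 kWhe/m2 NLA) to 5 stars (52), in 38 kWhe/m2 bands;
# 6 stars at 26 kWhe/m2 (half-way from 5 stars to zero); ratings rounded
# down to the nearest half star. Boundary handling is an assumption.

import math

def victoria_base_building_stars(intensity: float) -> float:
    """Map base building energy intensity (kWhe/m2 NLA/year) to a star rating."""
    if intensity > 204.0:
        return 0.0                                  # off the bottom of the scale
    if intensity > 52.0:
        stars = 1.0 + (204.0 - intensity) / 38.0    # 1 to 5 star region
    else:
        stars = 5.0 + (52.0 - intensity) / 26.0     # 5 to 6 star region
    return math.floor(min(stars, 6.0) * 2) / 2      # round down to half stars

for e in (204, 150, 70, 52, 40, 26):
    print(f"{e:>3} kWhe/m2/year -> {victoria_base_building_stars(e):.1f} stars")
```

On these assumptions, the 40–70 kWhe/m2/year range quoted below for new Melbourne offices maps to roughly 4.5 to 5 stars, consistent with the text.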

The graph illustrates a reality in which new office buildings in Melbourne are never worse than 4.5 stars, and a significant proportion achieve 5 or 5.5 stars. Two have actually achieved 6 stars, confirming this level as market leading. In terms of base building energy intensity, this places new buildings in Melbourne in the range 40–70 kWhe/m2/year, with the best at 26 kWhe/m2/year.

The average base building energy intensity of 160 kWhe/m2/year for London offices shown in Fig. 7 covers data collected for 85 assets by Verco in 2013. Because base building energy use is not specifically measured, the quality of this bulk data was recognised to be weak. To address this concern, detailed energy audits were undertaken for four of these assets, with the findings written up as case studies [32]. The results for these case studies were scattered around the average level, giving confidence in the average value, which was also anecdotally corroborated as plausible by individuals with everyday exposure to data from office building portfolios. This exercise was done as part of work to develop and test a Landlord Energy Rating (LER) scheme for the BBP (Footnote 10).

With base building energy use not generally measured in the UK, it was estimated, probably optimistically, that the energy intensity range for new buildings in London might be 80–160 kWhe/m2/year. The conclusion is that the most efficient new office buildings in London are three times more energy intensive than the best in Melbourne, whilst the least good in London are using over six times more energy than the best in Melbourne. With no visibility of actual base building performance outcomes, it is no coincidence that the base building energy efficiency of new UK commercial offices compares so unfavourably with that for their counterparts in Australia. Harking back to the medical trial, the patients in Australia have fared very well, whilst those in the UK remain critically ill.

What proves that climate is not the critical factor is the data for base building energy intensity for Melbourne offices in 2002. Back then, the average rating was 2.5 stars or about 150 kWhe/m2/year. It can thus be seen that the least good new buildings in Melbourne are now using less than half the average in 2002, whilst the best are using six times less. The differences in today’s outcomes between London and Melbourne are clearly being driven by the huge improvements that Melbourne has achieved in the last 15 years, not climate differences. The EPC has not driven corresponding improvements in the operational energy performance of the UK’s commercial office buildings. However, Australia’s experience suggests that with the right drivers, the energy use of base building services in new UK offices could typically be halved, and best practice four to five times lower.

Following reports on the performance gap by the Green Construction Board in 2013 [33] and the UK Green Building Council in 2016 [34], the UK property market has woken up to the potential of buildings which perform as intended and to the risks with those that do not. The ability to demonstrate that energy efficient operation can be achieved in new buildings can help to identify exemplar pathways for deep retrofits of the existing stock, on a trajectory towards net zero energy in operation.


Copyright information

© 2020 Springer Nature Switzerland AG


Cite this paper

Cohen, R., Waring, G. (2020). A Methodology to Address the Gap Between Calculated and Actual Energy Performance in Deep Renovations of Offices and Hotels. In: Bertoldi, P. (eds) Improving Energy Efficiency in Commercial Buildings and Smart Communities. Springer Proceedings in Energy. Springer, Cham. https://doi.org/10.1007/978-3-030-31459-0_10
