A guest post from Graham Coleman concerning the 2017 SPAB building condition survey by Archimetrics Ltd, with his responses to their observations on the review he carried out.



The reply from Archimetrics is given in full. My response to each section of the reply, where appropriate, is given in bold italics.

“We thank Graham Coleman for taking the time to feedback on this work and appreciate that all comment is welcomed.

In summary we understand that Graham has highlighted a data handling error and he feels the data collected for all 3 properties simply shows that the walls rapidly find their equilibrium with external conditions regardless of wall material and thickness, retrofit measures taken, orientation and other variables which may have an effect on performance.”

There is more than "a data handling error": for example, the recorded data includes absolute humidities which cannot occur in the UK and a relative humidity of 0%, which cannot occur, and there are concerns over the performance of the data loggers and the interpretation of the data.
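The kind of impossible readings described above can be caught with simple plausibility checks before any analysis. The sketch below is illustrative only: the thresholds (relative humidity strictly above 0% and at most 100%; absolute humidity within a broadly UK-plausible 1–20 g/m³ band) are my assumptions for illustration, not values taken from the report.

```python
def flag_implausible(readings):
    """Return the readings that fail basic physical plausibility checks.

    Each reading is a dict with 'rh' (relative humidity, %) and
    'ah' (absolute humidity, g/m^3). Thresholds are illustrative.
    """
    flagged = []
    for r in readings:
        if not (0 < r["rh"] <= 100):        # 0% RH cannot occur in service
            flagged.append(r)
        elif not (1.0 <= r["ah"] <= 20.0):  # outside a UK-plausible band
            flagged.append(r)
    return flagged

# Hypothetical logger output, not figures from the report.
readings = [
    {"rh": 65.0, "ah": 9.2},   # plausible
    {"rh": 0.0,  "ah": 5.1},   # impossible relative humidity
    {"rh": 80.0, "ah": 0.02},  # absurdly low absolute humidity
]
print(flag_implausible(readings))  # flags the last two readings
```

Checks of this sort cost almost nothing to run at collection time and would have surfaced the impossible values long before publication.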

“In response we would like to offer the following:

The data handling error concerns external conditions data. Data collection can of course be a difficult undertaking in the real world and when allowing for uncontrolled events. Our external data has indeed suffered outages for variety of reasons which include one home owner turning the mains off every night until the morning without prior notification, damage to our external wireless logger from building maintenance again without notification, sensor failure due to in-service conditions etc. These data holes are both declared and visible.”

I have over 45 years of data collection and analysis in the real world, and fully appreciate the comments about data losses, etc. It is important to show how the data was handled where it is missing or intermittent. This is not done in the report, and as a consequence the results can and will lead to flawed outcomes.

On page 5 the report states, "Where data is missing from an analysis values are shown as unchanging or as a gap and where this impinges on the written discussion the absence is noted within the text." There are only 2 instances where a "gap" has obviously been left, correctly, for absence of data (tables 3 and 9). In tables 5 and 11 these same gaps have been filled with 0.00 figures, that is, with data, and these figures have been included in the averages. If there are missing data, it is certainly not clear in which data series they occur or how they were actually handled. According to the report, absent data can be made up because "analysis values are shown as unchanging": in other words, false data is added. The data holes are not properly declared, and only in 2 tables are they clearly visible.
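A minimal sketch (with made-up numbers, not figures from the report) of why substituting 0.00 for missing readings corrupts an average, whereas excluding the gaps does not:

```python
def mean_zero_filled(values):
    """Average after replacing missing readings (None) with 0.00,
    as the report's tables 5 and 11 appear to have done."""
    filled = [0.0 if v is None else v for v in values]
    return sum(filled) / len(filled)

def mean_excluding_gaps(values):
    """Average over only the readings actually recorded."""
    present = [v for v in values if v is not None]
    return sum(present) / len(present)

# Hypothetical relative-humidity series with two data holes.
series = [78.0, 81.0, None, 79.0, None, 80.0]

print(mean_zero_filled(series))     # 53.0 - dragged down by false zeros
print(mean_excluding_gaps(series))  # 79.5 - gaps properly excluded
```

Treating a gap as a reading of zero does not merely lose information; it injects a physically meaningless value into every statistic computed downstream.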

“We acknowledge that these data holes have been incorrectly handled by the code we have written to analyse the data, this simple error is clear to see and easy to correct. This has obviously slipped through the review process.”

This statement is of concern and may have serious consequences for the project as a whole.

If the "code" you have written has mishandled data holes (and thus probably some of the data itself, e.g. producing some absurdly low absolute humidity figures), it is seriously flawed. If the "code" has mishandled data in the latest report to give erroneous results, then a serious question must be asked: has the same mishandling of data via faulty "code" occurred from 2011 to the present?

The reply also states that "this simple error is clear to see and easy to correct. This has obviously slipped through the review process." If the "simple error" is now that "clear to see", then why wasn't it "clear to see" at the time of collection, during data handling and during the review? This leads to the question of how many simple, "clear to see" data errors have been missed over previous years. If such "clear to see" simple errors are being missed, this may call into question the experience and expertise of the researchers and reviewers, and it also raises the question of the 'not quite so clear to see' data errors. As for "easy to correct", this is not going to be the case. If there are such data errors, then correcting them will be more than a simple adjustment, since the data sets will need to be reviewed, as, of course, will the "code" that has led to the errors, present and probably past.

“The purpose of this work comes about from the widely accepted concerns that insulating buildings, especially traditional buildings can have unintended consequences. This long-term observation of 3 buildings is intended to shed a little more light on this area of concern with the view that it is better to look than assume.”

Notwithstanding the likely data errors, the published data from past years to the present does not appear to identify anything except internal/external atmospheric changes. There were only 2–4 weeks of data collection in February/March 2011 prior to insulation. Subsequent to that there appear to have been nearly 6 full years of data collection for each property, certainly consistently from 2013 to the present. The data during this period does not reflect any apparent change to the walls.

“We do not hold the view that all walls insulated or not, quickly find equilibrium with the external environment regardless.”

The report’s own data shows that walls find equilibrium moderately rapidly following changes in internal/external atmospherics.

“We do hold the view that insulating buildings may have negative impacts on some building fabric and occupants and that we should endeavour to understand the implications as best we can so as to improve the potential outcomes of improving the energy efficiency of our building stock.”

Unfortunately the monitoring data as reported does not throw any light on this; the data shows that there has been no effective change since at least 2013. There is no data over the years of monitoring that records any “impact to the occupants”.

“We do hold the view that this work, along with any other study, has its flaws and that there is always scope for improvement-we are constantly evaluating, developing and improving through experience. It is not uncommon that the act of undertaking to answer a question often reveals further questions, acknowledgement of this fact forms an important part of our ethos.”

Any study will have "limitations" but should not have "flaws", in other words wrong information; there is a significant difference between limitations and flaws. If a study is shown to have flaws, its value is diminished: any flaws should have been identified and resolved prior to publication, and if found subsequently, the paper should be withdrawn for correction and re-evaluation.

“As part of a constructive review process, we would welcome a face to face meeting with Graham to discuss his views, in the hope that this might further understanding of this important field.”

I would be most pleased to meet to discuss this.

G. R. Coleman, B.Sc.(Hons), C.Biol., M.R.S.B., A.I.M.M.M.

Copyright © 2010 Preservation Expert. Legal Stuff: All the advice and information in the posts on my blog is made in good faith and is based on my experience and knowledge at the time of writing. However, nobody is infallible and whilst I’m confident that most of what I write about preservation issues is accurate, there’s a good chance there’ll be an error or two somewhere. I do change my mind about stuff, as I gain more experience. In view of this you must make your own decisions on whether to follow any advice I write and think about this; I could be wrong. No responsibility will be accepted by the author for any losses anyone may suffer as a result of any mistake or for the consequence of any action you take as a result of reading this blog. If you do suffer a loss, resulting from anything I’ve written, a verbal heartfelt apology will be your only compensation.