Mahmoud Abdelaal joins HKA as Director in the European Advisory, Claims and Commercial Management team
9th November 2022
HKA announced today that Mahmoud Abdelaal has joined HKA’s European ACCM team, based in the London office.
Mahmoud is a delay and programming expert with a civil engineering background and over 15 years’ experience. He has been appointed as the delay expert on various matters in the UK and the Middle East, and has also advised contractors and developers alike as part of strategic advisory and dispute avoidance processes.
Mahmoud’s experience comprises substantial building, infrastructure, power plant and mixed-use developments exceeding £1 billion in value.
Mahmoud was recognised in Who’s Who Legal as “enormously impressive” in his expert work, and consistently demonstrates “attention to detail, detailed technical understanding and a thorough and rigorous approach”.
In addition to his expert and advisory work, Mahmoud has delivered training and speeches in a number of law firms and universities on techniques of delay analysis and other delay matters.
“We are delighted to welcome Mahmoud to HKA. His skills, knowledge and expertise continue to strengthen our position as the leading source of expertise in the capital project and infrastructure space, providing highly experienced experts to respond to our clients’ needs across EMEA.”
Toby Hunt, Partner and Head of Europe
ABOUT HKA
HKA is the world’s leading consultancy of choice for multi-disciplinary expert and specialist services in risk mitigation and dispute resolution within the capital projects and infrastructure sector. We also have particular experience advising clients on the economic impact of commercial and investment treaty disputes and in forensic accounting matters. In addition, HKA supports companies that conduct business with the US Federal Government, providing them with consulting services on complex government contracting matters.
As trusted independent consultants, experts and advisors, we deliver solutions amid uncertainty, dispute and overrun, and provide the insights that make the best possible outcomes a reality for public and private sector clients worldwide.
HKA has in excess of 1,000 consultants, experts and advisors in more than 40 offices across 18 countries.
Alex Lee joins HKA as Principal to lead new Environmental and Climate Change discipline
26th October 2022
HKA announced today that Alex Lee has joined HKA in Glasgow to lead its new Environment and Climate Change expert capabilities within its global Forensic Technical Services (FTS) team.
Alex is a Chartered Geologist and Scientist with 25+ years’ experience in his field. He holds a PhD in Geology and an MSc in Environmental Modelling and Monitoring. He is a technical specialist in risk assessment, contaminated land, hydrogeology and remediation design.
Before joining HKA, Alex was a Technical Director at WSP in the UK, where he was a long-term member of the senior management team, involved in all aspects of management. He was also a national technical lead, regularly acting as an expert witness and providing opinion on technically complex matters.
“I am thrilled and excited to join HKA as global lead of its new Environment & Climate Change team. This new Environmental service offering will add to, as well as complement, many of the services already offered by HKA. Environmental disputes are growing rapidly yet present a very diverse arena, and we will consequently be providing a portfolio of the best global experts.”
Alex Lee, Principal
As part of the FTS team, Alex will lead the recently launched Environment and Climate Change discipline, offering clients access to expert witnesses skilled in a variety of specialisms, including environmental science, regulatory and compliance, land remediation, climate performance, ecology, waste and contaminated land.
“We are delighted to welcome Alex to the team. His vast technical and leadership experience is a great fit to support our growth ambitions to further develop HKA’s Environment & Climate Change expert capabilities. His experience acting as an expert witness will also be a great source of knowledge for this growing team.”
Gerry Brannigan, Partner and Head of Forensic Technical Services
Werner Luüs returns to HKA, joining the Quantum team as Director in Europe
25th October 2022
HKA announced today that Werner Luüs rejoins the European Quantum team based out of the London office.
Werner Luüs is an experienced chartered quantity surveyor who has worked for contractors, dispute resolution consultancy firms, and professional quantity surveying firms on a wide range of projects in the buildings, infrastructure, resources, power and utilities, and ship building sectors in the UK, Europe, Africa, Middle East, and Asia regions.
Werner has been appointed to prepare expert reports on quantum matters in dispute, and worked as lead support to quantum and delay experts during proceedings of complex, high-value disputes in international arbitration.
Werner specialises in quantity surveying, commercial management, expert witness and advisory services, and he has extensive experience working with project teams on live projects.
Werner’s impressive range of project experience, including projects exceeding US$1bn in value, spans major infrastructure schemes (an airport and rail systems), healthcare facilities, a nuclear power plant, onshore and subsea oil and gas systems, and shipbuilding.
“I’m delighted to welcome Werner as a Director in the Quantum team. We look forward to working with Werner again on large infrastructure and offshore projects.
Together with all colleagues, Werner will play an active role in our exciting future growth plans at HKA.”
Emyr Evans, Partner and Joint Service Line Lead – Quantum
I considered the potential applications of machine learning technology in the construction industry in a previous article, ‘Future of forecast: machine learning’, in which I reflected on how reliable forecasts and insights generated via machine learning technology may noticeably affect the efficiency of construction projects.
One of the challenges of using this emerging technology is the lack of efficient and reliable data[1]. The advancement of this technology therefore relies on the determination of construction practitioners to record and standardise project data diligently, to support this core element of machine learning.
This article focuses on the necessity of capturing data into structured datasets – which are at the heart of machine learning – and explains the different data types, the importance of structured and standardised data and overall data quality.
Data types
The construction industry has a broad range of data types and sources.
Data, or information, can be anything: for instance, correspondence, photos, site diaries, reports, schedules and transactions. This data is stored on different platforms, such as emails, ERP systems, spreadsheets, planning software and databases. Unfortunately, in the majority of projects, data and data sources are not integrated with each other, which causes practitioners to duplicate work and waste time and, most importantly, to record the same information in different sources under different formats. Recent research indicates that 96% of data generated in projects is not used at all, that the data sources of 30% of engineering and construction companies are not integrated, and that employees spend 13% of their working hours searching for information in data sources[2].
In the literature, data is classified into two main groups: structured and unstructured. Unstructured data is stored in undefined and native formats, and includes videos, images, audio, emails, Word and PDF documents, and native project scheduling files such as MS Project (.mpp) and Primavera (.xer). Currently, 90% of generated data (in construction and other industries) is classified as unstructured, requiring data science expertise before machine learning algorithms can produce reliable predictions[3].
Conversely, structured data is held in a standardised format that is meaningfully organised, checked and cleansed, allowing it to be processed quickly in databases or even MS Excel spreadsheets. Consequently, structured data can be used by the average business person and easily managed with a query language such as SQL (Structured Query Language)[4].
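To make the contrast concrete, here is a minimal sketch of how structured data can be interrogated directly with SQL, using Python’s built-in sqlite3 module. The progress table, its columns and the sample figures are hypothetical illustrations, not drawn from any real project.

```python
# Illustrative sketch: once project data sits in a structured table,
# everyday questions become one-line SQL queries.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE progress (activity TEXT, planned_pct REAL, actual_pct REAL)"
)
conn.executemany(
    "INSERT INTO progress VALUES (?, ?, ?)",
    [("Piling", 100.0, 100.0), ("Superstructure", 60.0, 45.0), ("MEP", 20.0, 10.0)],
)

# A non-specialist can answer "which activities are behind plan?" directly:
behind = conn.execute(
    "SELECT activity FROM progress WHERE actual_pct < planned_pct"
).fetchall()
print(behind)  # [('Superstructure',), ('MEP',)]
```

The same question asked of the equivalent unstructured records (emails, PDFs, photos) would require parsing and interpretation before any query could run, which is precisely the gap the article describes.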
Structured and standardised data
One of the challenges in the adoption of machine learning in the construction industry is the availability of highly skilled people with strong domain, data analytics and data science knowledge and know-how[5]. Having structured, high-quality data is therefore imperative in the short and medium term[6], as it is easier for construction industry practitioners to manage and use.
This then raises two questions: (1) how can the proportion of structured data in construction projects be increased; and (2) how can the quality of data be raised and then maintained at a high level?
(1): One way to increase the proportion of data that is captured in structured form is to convert it from an unstructured to a structured format once it is first deemed to be valuable. In the literature, this process is called ETL, which stands for Extract, Transform and Load. In this traditional process, the raw data is first extracted from the data source, the extracted data is then reformatted and cleansed, and in the final stage the reformatted data is loaded into the target database[7]. As a practical example, plain text is extracted from emails, reformatted into .xml format, and finally loaded into the target database.
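The email example above can be sketched as a toy ETL pipeline in Python. The target schema, field names and the “delay” keyword rule are illustrative assumptions, not part of any particular toolchain; a production pipeline would of course need far more robust cleansing.

```python
# Toy ETL sketch: Extract plain text from a raw email, Transform it into a
# structured record, and Load it into a database.
import sqlite3
from email import message_from_string

RAW_EMAIL = """From: site.manager@example.com
Subject: Concrete pour delayed
Date: Mon, 14 Nov 2022 08:30:00 +0000

Pour to level 3 slab delayed by 2 days due to pump breakdown."""

# Extract: pull the raw data out of its native (unstructured) format.
msg = message_from_string(RAW_EMAIL)

# Transform: reformat and cleanse into a fixed schema.
record = {
    "sender": msg["From"],
    "subject": msg["Subject"].strip(),
    "body": msg.get_payload().strip(),
    "mentions_delay": "delay" in msg["Subject"].lower(),
}

# Load: write the structured record into the target database.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE correspondence "
    "(sender TEXT, subject TEXT, body TEXT, mentions_delay INTEGER)"
)
conn.execute(
    "INSERT INTO correspondence VALUES (:sender, :subject, :body, :mentions_delay)",
    record,
)
print(conn.execute("SELECT subject, mentions_delay FROM correspondence").fetchone())
```

Once loaded, the correspondence can be queried alongside any other structured project data, rather than sitting unused in an inbox.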
However, this process requires data science expertise. This reinforces the need for companies across the construction industry to accelerate the training of practitioners and to attract skilled people capable of taking on such roles, so that data can be captured and turned into knowledge for subsequent re-use and application.
(2): High-quality data improves the accuracy of forecasts[8]; it follows that low-quality data would generate significant decision-making mistakes[9]. Ensuring that data is of high quality is therefore vital. Several researchers have suggested different dimensions for assessing and maintaining data quality, the most common being accuracy, validity and completeness[10]. Similarly, the Data Management Association (DAMA) defines data quality dimensions under six main headings: completeness, uniqueness, consistency, timeliness, validity and accuracy[11].
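Three of these dimensions can be illustrated with simple checks over a set of records. The record schema and the validity rule (progress must lie between 0 and 100) are hypothetical; real quality rules would come from the project’s own data standards.

```python
# Sketch of simple checks for three of the DAMA quality dimensions:
# completeness, uniqueness and validity.
records = [
    {"id": "A-001", "activity": "Piling", "progress": 100.0},
    {"id": "A-002", "activity": "Superstructure", "progress": 45.0},
    {"id": "A-002", "activity": None, "progress": 130.0},  # duplicate id, bad data
]

def completeness(rows):
    """Share of fields that are populated (not None)."""
    cells = [v for row in rows for v in row.values()]
    return sum(v is not None for v in cells) / len(cells)

def uniqueness(rows, key="id"):
    """Share of rows carrying a distinct key value."""
    keys = [row[key] for row in rows]
    return len(set(keys)) / len(keys)

def validity(rows, field="progress", lo=0.0, hi=100.0):
    """Share of populated values falling inside the permitted range."""
    vals = [row[field] for row in rows if row[field] is not None]
    return sum(lo <= v <= hi for v in vals) / len(vals)

print(completeness(records))  # one empty field out of nine
print(uniqueness(records))    # two distinct ids across three rows
print(validity(records))      # 130.0 falls outside the 0-100 range
```

Scoring each dimension separately, as here, makes it clear which remedial action (filling gaps, de-duplicating, correcting out-of-range entries) a dataset actually needs.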
There is no doubt that construction practitioners have the capacity to establish and maintain high-quality datasets, and one way to increase the overall quality of data is standardisation. Recent research has found that standardised records increase data quality[12]; therefore, the more standardised the data input onto platforms, the more structured, high-quality data will be on hand for use by the parties.
There are different ways to standardise information. For instance, during project schedule preparation, data quality, project communication and productivity all increase if a specific English vocabulary of classified words and adjunct words is used[13]. The same logic of using specific words can be applied to site diaries, progress reports, quality and health and safety reports, or even email subject lines. Additionally, tabulating, formatting and integrating the data in reports with one another would increase overall data quality and reduce human error[14]. Data standardisation would therefore allow companies not only to accelerate their digitisation and machine learning strategies, but also to communicate information more effectively, with the potential to save 7.5% of a project’s total expenditure[15].
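A vocabulary-based standardisation step of the kind described above might be sketched as follows. The controlled vocabulary and keyword mapping are invented for illustration; in practice they would come from an agreed project or company data standard.

```python
# Sketch of vocabulary standardisation for site-diary entries: free-text
# descriptions are mapped onto a controlled list of activity terms, so the
# same work is always recorded the same way.
CONTROLLED_VOCABULARY = {
    "rebar": "Reinforcement fixing",
    "reinforcement": "Reinforcement fixing",
    "pour": "Concrete placement",
    "concreting": "Concrete placement",
    "excavation": "Earthworks",
    "dig": "Earthworks",
}

def standardise(entry: str) -> str:
    """Return the controlled term for a free-text entry, or flag it."""
    lowered = entry.lower()
    for keyword, term in CONTROLLED_VOCABULARY.items():
        if keyword in lowered:
            return term
    return "UNCLASSIFIED: " + entry

diary = [
    "Fixing rebar to level 2 columns",
    "Pour to slab completed",
    "Started dig for pile caps",
]
print([standardise(e) for e in diary])
# ['Reinforcement fixing', 'Concrete placement', 'Earthworks']
```

Flagging unmatched entries rather than discarding them lets the vocabulary grow as new activity types appear, while keeping the standardised dataset clean.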
Summary
The amount of data generated daily is increasing exponentially, with much of it unstructured and therefore not easy to extract, analyse and use to provide valuable insight into construction activity. Given the scarcity of highly skilled data analytics knowledge, it is not surprising that companies in the construction industry waste over 90% of their data. Construction practitioners would make great gains by standardising data inputs, integrating data sources, and working innovatively to reduce the volume of unusable data. This would eliminate errors, improve data quality and add value to decision-making across our industry.
In subsequent articles I will consider data ecosystems, common data environments, and whether data can be an economic asset.
[1] Curry, E. et al. (2021) ‘Technical Research Priorities for Big Data’, The Elements of Big Data Value. Springer International Publishing. doi:10.1007/978-3-030-68176-0_5.
[2] Snyder, J., Menard, A. and Spare, N. (n.d.) Big Data = Big Questions for the Engineering and Construction Industry. FMI.
[3] Curry, E. et al. (2021) ‘Technical Research Priorities for Big Data’, The Elements of Big Data Value. Springer International Publishing. doi:10.1007/978-3-030-68176-0_5.
[4] Praveen, S. and Chandra, U. (2017) ‘Influence of Structured, Semi-Structured, Unstructured data on various data models’, International Journal of Scientific & Engineering Research, 8(12).
[5] Zillner, S. et al. (2021) ‘A Roadmap to Drive Adoption of Data Ecosystems’, The Elements of Big Data Value. Springer International Publishing. doi:10.1007/978-3-030-68176-0_3.
[6] McCord, S.E. et al. (2022) ‘Ten practical questions to improve data quality’, Rangelands. Elsevier BV. doi:10.1016/j.rala.2021.07.006.
[7] Saradava, H., Patel, A. and Aluvalu, R. (2016) ‘A survey on ETL strategy for Unstructured Data in Data Warehouse using Big Data Analytics’, International Conference on Research & Entrepreneurship.
[8] Sadiq, S. and Indulska, M. (2017) ‘Open data: Quality over quantity’, International Journal of Information Management. Elsevier BV. doi:10.1016/j.ijinfomgt.2017.01.003.
[9] Cai, L. and Zhu, Y. (2015) ‘The Challenges of Data Quality and Data Quality Assessment in the Big Data Era’, Data Science Journal. Ubiquity Press. doi:10.5334/dsj-2015-002.
[10] Ramasamy, A. and Chowdhury, S. (2020) ‘Big Data Quality Dimensions: A Systematic Literature Review’, Journal of Information Systems and Technology Management. TECSI. doi:10.4301/s1807-1775202017003.
[11] DAMA UK (2013) The Six Primary Dimensions for Data Quality Assessment.
[12] Ni, K. et al. (2019) ‘Barriers and facilitators to data quality of electronic health records used for clinical research in China: a qualitative study’, BMJ Open. BMJ. doi:10.1136/bmjopen-2019-029314.
[13] Li, C.F. (2012) ‘The Researches on the Standardization of Petroleum Exploration and Development Structured Data’, Advanced Materials Research. Trans Tech Publications. doi:10.4028/www.scientific.net/amr.461.749.
[14] Curry, E. et al. (2021) ‘Technical Research Priorities for Big Data’, The Elements of Big Data Value. Springer International Publishing. doi:10.1007/978-3-030-68176-0_5.
[15] Hong, Y. et al. (2022) ‘Improving the accuracy of schedule information communication between humans and data’, Advanced Engineering Informatics. Elsevier BV. doi:10.1016/j.aei.2022.101645.
HKA’s Jana Lefranc speaks to Lisa Dubot as part of Mayer Brown Frankfurt’s breakfast series.
Jana speaks about the new Equal Representation for Expert Witnesses (ERE) pledge which looks to promote equal opportunity for women to serve as expert witnesses in international disputes.
This article was the basis of a talk given by Franco Mastrandrea to the Society of Construction Arbitrators on 21 July 2022
Delay analysis can be performed using a variety of methods. The choice of methodology depends on many factors, such as the nature of the project, contractual requirements, the availability and reliability of the information provided, and the time allocated to conduct the analysis, among others.
This was a Part 8 Claim (i.e. a claim over an issue which does not involve a substantial dispute of fact) by Essential Living for declaratory relief arising out of an adjudication decision that I made on 22 July 2019 (“the Adjudication Decision”).
The issue for determination by the court was whether, and if so, to what extent, the Adjudication Decision was binding on the parties for the purpose of the ongoing final account process under the contract and any further adjudication, pending final resolution of the matters determined in the adjudication by legal proceedings or settlement.
The Adjudication Decision arose out of an amended JCT Construction Management Trade Contract 2011 (“the Trade Contract”) under which Elements (Europe) Limited as contractor agreed to carry out the design, supply, manufacture and installation of modular/volumetric units for a mixed use development for Essential Living (Greenwich) Limited as Employer. The original Trade Contract Sum was £25,751,956 excluding VAT, and the original Completion Date was set at 11 December 2017.
The Trade Contract provided for periodic interim payments and for (interim) adjustments to the Completion Date. It also provided for calculation of the Final Trade Contract Sum and for a final assessment of the Completion date.
In my decision I summarised the dispute before me as being over the “latest interim” valuation of completed works, and liability for contra charges and liquidated damages.
In summary, the Employer, who referred the adjudication, contended that it was due a payment from the Contractor of £11,126,306.13, by reference to the Construction Manager’s Interim Payment Certificate dated 15 March 2019, whereas the Contractor contended that it was due an interim payment from the Employer of £6,367,673.23. So in all, some £17.5m was in contention.
The decision that I made ran to 107 pages and determined that:
The sum due to the Contractor for the original scope of works performed to date was £24,673,360.34;
The sum due to the Contractor for variations was £2,346,650.46. This involved considering each of the variations in turn, considering the rival contentions, and providing a fairly full narrated rationale against each for reaching the conclusion/valuation that I did;
The amount to which the Employer was entitled for remedying defects in Elements’ works was £1,423,096.00, as against a claimed sum of £11,700,873. Again, this involved considering each of the items in turn, considering the rival contentions, and providing a fairly full narrated rationale against each for reaching the conclusion/valuation that I did;
The amount to which the Employer was entitled for liquidated damages. Naturally, this turned on responsibility as between the parties for delays. I found that liability was the capped sum of £1,287,598.00 (which would have been reached after 26 weeks of culpable delay on the part of the Contractor – my finding was that the Contractor was liable for 36.93 weeks’ delay). Again, this involved considering each of the delay items in turn, considering the rival contentions, and providing a fairly full narrated rationale against each for reaching the conclusion/time determination that I did;
The amount to which the Employer was entitled under clause 8.7A (being an agreed sum by way of repayment of the financing costs incurred by the Employer in relation to the performance security escrow account) was £300,000.00;
Taking all this into account, I ordered that the Contractor should pay to the Employer the sum of £1,842,360.64. So, although the Employer “won”, as against the sum that it was looking for the win was significantly short. I therefore ordered that payment of my charges be split equally between the parties.
It seems that, thereafter, Elements submitted its documents for the purpose of calculating the Final Trade Contract Sum to InnC, the Construction Manager, and then, on the demise of InnC, to Essential Living. On 10 February 2021, Mr David Somerset of Somerset Consult was appointed as Construction Manager. It seems that Mr Somerset never determined the Final Trade Contract Sum; this hiatus appears in turn to have led to the litigation.
The Employer’s stance in the litigation was that, in respect of any matters assessed and decided by the adjudicator, the Adjudication Decision was binding for the purposes of calculating the Final Trade Contract Sum, fixing the completion period under the contract, and any subsequent adjudication.
The Contractor opposed, on the ground that the adjudication related to an interim application for payment, shortly before the occurrence of practical completion; the Adjudication Decision did not impact on the final account process or the contractual review of the period for completion following practical completion (on 31 May 2019).
Mrs Justice O’Farrell decided that:
although an adjudication decision is binding temporarily on the parties, so that they must comply with and give effect to it (Paragraph 23(2) of the Scheme), in the absence of any agreement to the contrary, it does not affect the underlying rights and obligations of the parties under their contract or displace the agreed contractual procedures for determining those rights and obligations;
the consequence of the binding effect of an adjudication decision on a dispute or difference is that a subsequent adjudicator has no jurisdiction to determine matters which are the same or substantially the same;
the parties were bound by the Adjudication Decision on any dispute or difference determined in that Decision until it was finally determined by the court or by subsequent settlement;
the parties could not seek a further decision by an adjudicator on a dispute or difference if that dispute or difference had already been the subject of the Adjudication Decision;
it was a matter of fact and degree, requiring careful analysis of the evidence and argument on each disputed item, as to whether the Adjudication Decision was binding on any other discrete issue referred to and determined by the adjudicator, unless and until the Adjudication Decision was overturned, modified or altered by the court;
it was a matter of fact and degree as to whether any matters which the Contractor might seek to refer to a subsequent adjudication were the same, or substantially the same, as the matters determined by the Adjudication Decision; absent any Notice of Adjudication before the court, it was not possible for this issue to be determined;
the Adjudication Decision was binding in respect of variations considered and assessed by the adjudicator, unless and until the Adjudication Decision was overturned, modified or altered by the court, or unless either party identified a fresh basis of claim that permitted such variation claim to be opened up and reviewed under the terms of the Contract;
the Adjudication Decision was not binding on the parties for the purpose of the Construction Manager’s final determination of the Completion Period under clause 2.27.5, from which would flow any liability on the part of Elements for liquidated damages and finance charges;
the Adjudication Decision was not binding on the parties for the purpose of determining the Final Trade Contract Sum.
Takeaways
Whether what the adjudicator decides has application beyond the immediate decision depends to a substantial degree on what the governing contract says. Thus, there may, as in this case, be final review provisions for extensions of time and the Final Trade Contract Sum, which may in turn affect liquidated damages, prolongation and disruption claims.
It seems then that as adjudicator you should be clear what the dispute is and what it is that you are deciding.
It may be that matters such as the following, if decided, will be binding in a subsequent adjudication:
The meaning of a particular contract term e.g. whether particular heads of financial recovery were recoverable as “direct loss and or damage”.
The application of a contract term; e.g. whether a prospective or retrospective delay analysis methodology should be applied.
Whether further performance under the contract had been validly terminated.
Can findings of fact be binding? By way of example, in another matter in which I had rendered two decisions, I decided that the Contractor in that case would have been able to assemble the appropriate resources and skills to carry out further works, had it been appointed to those further works (to which, in breach of contract, it was not appointed). That case, Mallino Development Ltd v Essex Demolition Contractors Ltd [2022] EWHC 1418 (TCC) (10 June 2022), outlined in the article ‘Mallino: using the courts to attempt a break out from an adverse adjudication decision?’, was decided in the TCC two days after Essential. Might it be suggested that, had that issue arisen in a subsequent adjudication, that finding would, according to Essential, have been a matter decided as a discrete issue in the earlier adjudication? Whether that runs the risk of colliding with another principle, namely that factual findings and conclusions in one set of proceedings will not be treated as evidence of those facts in another set of proceedings (Hollington v F. Hewthorn & Co. Ltd [1943] KB 587, CA; cf. Hui Chi-Ming v R. [1992] 1 AC 34), is not entirely clear. That principle may itself not be entirely secure, whether at common law (see, for example, Lord Hoffmann in Arthur JS Hall & Co. v Simons [2002] 1 AC 615, [2000] 3 WLR 543, [2000] 3 All ER 673, [2000] BLR 407, HL) or as a result of the Civil Evidence Act 1968 (see, for example, Crypto Open Patent Alliance v Wright [2021] EWHC 3440 (Ch)); in any event, it appears to apply only between different parties.
Having said that, the learned judge in Mallino arrived at the same conclusion as I had, but by a different route: because Mallino had not formally challenged the Essex Demolition pleading which invoked my finding in the adjudication that it would have been able to carry out those further works, the finding in the adjudication had become an agreed fact.
Reasoning
The original adjudicator’s reasoning may in any event be of interest or assistance not only to the parties but also to the person charged with any final reviews, any subsequent adjudicator resolving final review disputes, and/or the tribunal charged with finally determining the disputes between the parties.
Quaere?
Will there be a move to amend contract terms so that all aspects of valuation are the subject of final review?
Can you, or should you, as adjudicator in a subsequent adjudication, in exercising your rights to investigate the facts and the law (Scheme, paragraph 13), seek the earlier adjudicator’s decision if it is not proffered by the parties, as a potential aid to your own decision-making?
Carl is a chartered engineer with over 30 years’ experience in the oil and gas, petrochemical, process, power, and water and wastewater treatment industries. He has an extensive technical background in piping and mechanical engineering, which includes experience in stress analysis and product/package specification and delivery.
Carl has held senior positions across engineering, proposals, project and operations management, and has well-established leadership and problem-solving skills.
Carl is an experienced manager of multidisciplinary teams, successfully tendering and delivering design, supply, fabrication, and construction projects.
“We are delighted to continue to grow the energy resources and industrial expert team at HKA with the addition of Carl. He brings a wealth of mechanical engineering and project management expertise gathered over 30 years from working on major, complex projects across a broad range of sectors and regions. Carl is a significant addition to forensic technical services, and we are delighted to have him on board.”
Zaffer Khan, Technical Director
HKA has in excess of 1,000 consultants, experts and advisors in more than 40 offices across 18 countries.
It is not uncommon in large construction projects for disputes to arise which involve several thousand alleged variations (or change orders) and/or alleged defects which need to be considered by the arbitral Tribunal or Court in the formal dispute resolution process chosen by the parties.
In addition to the often-extensive factual evidence, this usually gives rise to the need for expert evidence to be adduced dealing with the cause-and-effect liability issues as well as possible delay and quantum claims which arise as a result.
Parties and their appointed experts in many jurisdictions are now required to adopt a proportionate approach whereby they do not “use a sledgehammer to crack a nut” but this can mean different things in different situations. The concept of proportionality is not formulaic in nature.
Proportionality must often be weighed against allegations of abuse of process, in that not every claim may be considered on its own merits if shortcuts are taken. Despite this, the English Court at least has recently upheld[1] the established position that, in principle, a claimant can pursue a claim which relies on sampling to establish liability or causation of damage, although great care needs to be taken as to how this is executed, and whether the sampled answers can properly be extrapolated to the unsampled population requires careful attention too. This is in line with rule 32.1 of the Civil Procedure Rules, which confers powers on the Court to control evidence.
The Technology and Construction Court (TCC) Guide[2] in England requires the parties, before the first case management conference, to give careful thought to expert issues including any “…appropriate or necessary …sampling”. This will often require a discussion between the experts as to an appropriate sampling protocol to be adopted in the proceedings. In these joint expert discussions it is worth bearing in mind what are known as the “Whitford Guidelines”[3]: for a survey to have evidential value it must be shown that (in that case) the interviewees were selected from a cross-section of the population, the survey was of a sufficient size to produce statistically relevant results, and it was conducted fairly.
Statistical sampling involves selecting a subset of items from a larger group (or population) of claimed data and using the results of that sample to estimate the characteristics of the remainder of the population (i.e., the unsampled data claimed).
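The idea can be illustrated with a short sketch. The figures below are entirely invented for demonstration: a claim comprising 5,000 alleged variations, from which 200 are randomly sampled and the sample mean is extrapolated to the whole population.

```python
import random
import statistics

# Hypothetical illustration only: 5,000 alleged variations with invented values.
random.seed(42)
population = [random.uniform(1_000, 50_000) for _ in range(5_000)]

# Draw a simple random sample of 200 items. In a real exercise each sampled
# item would be individually assessed; here we simply read its known value.
sample = random.sample(population, 200)

# Extrapolate: sample mean x population size estimates the total value of
# all 5,000 variations, sampled and unsampled alike.
estimated_total = statistics.mean(sample) * len(population)
actual_total = sum(population)
error_pct = abs(estimated_total - actual_total) / actual_total * 100
print(f"Estimated total: {estimated_total:,.0f}")
print(f"Actual total:    {actual_total:,.0f}")
print(f"Error: {error_pct:.1f}%")
```

With a sample of this size the estimate typically lands within a few percent of the true total, which is the trade-off the proportionality argument relies on: far less assessment effort for a bounded loss of precision.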
The use of sampling and extrapolation may make the claim more difficult to establish at trial but this must be balanced against the cost savings on offer by not exhaustively examining every individual claim on its own merits. However, the claimant must still give the defending party an opportunity to properly understand the claim made against it and allow it to defend itself against both the sampled and extrapolated claim.
So, the obvious question is how one approaches a large and technically complex dispute involving thousands of issues on a proportionate basis using sampling and extrapolation as the preferred method. I have recently been involved with two such cases and make the following comments on the basis that I am not a statistician; I have used statistics as a tool in much the same way that I use Microsoft Excel as a tool, without knowing how the software works.
The first challenge any expert may face when employed to try to demonstrate cause and effect is a challenge to their credibility and expertise, and to whether they are suitably qualified to offer an opinion on the matters under consideration. As well as engaging the relevant rules of evidence, this is often referred to as a “Daubert” challenge, after a case in the USA[4] which provided that an expert must be qualified based on “knowledge, skill, experience, training and education” and questioned whether such evidence was admissible in that case. While that case related to scientific expert evidence, it now seems the principle can equally be applied (at least in the USA and Canada) to other expert evidence, such as technical or quantum experts using, for instance, sampling and extrapolation, which may be described as a scientific approach.
A Daubert challenge consists of two elements: the first is a question of reliability and the second one of relevance. Reliability asks: did the expert follow a sound method or any accepted protocols? There are usually at least five tests applied to answer this question. They include:
Whether the theory or technique presented as expert testimony and evidence can be tested;
Whether the theory or technique has been subjected to peer review and publication;
The known or potential rate of error;
The existence and maintenance of standards controlling the technique’s operation; and
Whether the approach is generally accepted by industry peers.
The second question is relevance: do the opinions fit the facts of the case? This is usually a matter for the Court or Tribunal to determine.
As for the five tests for reliability, the first one as to whether the theory or technique can be tested is perhaps less applicable to many construction disputes and is more applicable to scientific cases such as medical malpractice or pharmaceutical litigation. In construction cases it is often a question of whether there are other indicators that the expert’s analysis and evidence are reliable.
Likewise, in construction cases the lack of peer review rarely founds a successful Daubert challenge, because publications for quantum and delay experts are less widespread than for, say, medical experts. So, for example, a delay expert using a “windows” method, which is widely accepted in the industry nowadays, would not need to have been subjected to peer review.
The remaining three tests listed above should therefore be considered by an expert when selecting a methodology. The expert’s methodology should also consider a reasonable range of other potential causes of loss or damage, if they are likely to exist. These other potential causes should be identified in the expert report, with an explanation of why they were ruled out.
Assuming these challenges can be satisfied, the next step is to decide how best to go about a sampling and extrapolation exercise. The two do not automatically go together. There are two predominant types of sampling: random sampling and judgmental sampling. Random sampling is a widely used statistical method, with various techniques for how it should be done, and the results can be extrapolated if it is done properly.
Judgmental sampling on the other hand, is a non-statistical method used to obtain a broad coverage of the population in a sample. For the results to be extrapolated, it is essential to show the sample was representative and was free of any bias.
As referred to above, the cases where this has been used successfully are often those where both parties and/or their experts, working together, are involved in the sampling process. By agreeing a sampling protocol, the parties can avoid the validity of the sampling process itself becoming an issue at trial.
When large complex disputes are sought to be resolved by sampling, a useful guide as to how best to approach the exercise can be found in a document published by the U.S. Department of Health and Human Services Office of Inspector General titled “Statistical Sampling: A Toolkit for MFCUs”[5], though this is by no means legal guidance. It does, however, provide a useful response to some of the five tests listed above.
The Toolkit provides a step-by-step guide of how to select a statistical sample and calculate a valid statistical estimate. It sets out thirteen steps which can be followed and would be capable of being replicated by an opposing party. It is an important aspect, and a potential challenge to sampling, if it is not possible to replicate what was done in the sampling exercise.
Importantly, the Toolkit provides a means of generating a random sample which is essential if it is to be extrapolated to the remainder of the population (claim data). A useful suggestion which is made is that if a party is planning to seek assistance from a statistician, this should be done at an early stage in the sampling process.
However, while it may well be advisable to engage an expert statistician at an early stage in this process, it is by no means essential, provided great care is taken along the way and the right steps are followed, with the aim of achieving up to a 95% confidence level in the results generated.
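A 95% confidence level of the kind mentioned above can be sketched as follows. This is an illustrative calculation only, not the OIG Toolkit itself: it uses the normal approximation and invented sample values, with an assumed population of 3,000 claimed items.

```python
import math
import random
import statistics

# Illustrative sketch: a 95% confidence interval around an extrapolated
# total, using the normal approximation. All figures are invented.
random.seed(7)
population_size = 3_000  # assumed number of claimed items
sample = [random.uniform(500, 20_000) for _ in range(150)]  # invented sample

mean = statistics.mean(sample)
sd = statistics.stdev(sample)
se_total = population_size * sd / math.sqrt(len(sample))  # std error of total

z = 1.96  # two-sided 95% confidence under the normal approximation
point_estimate = population_size * mean
lower, upper = point_estimate - z * se_total, point_estimate + z * se_total
print(f"Point estimate: {point_estimate:,.0f}")
print(f"95% CI: {lower:,.0f} to {upper:,.0f}")
```

The width of the interval shrinks as the sample grows, which is why the Toolkit’s emphasis on sample-size selection matters: a sample too small to produce a usable interval undermines the extrapolated claim.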
Done properly, sampling and extrapolation can offer a proportionate approach to case management and will have a realistic chance of success in proving liability or causation in large complex cases. It must not, however, be seen as a shortcut to success and may well involve significant cost and time to achieve the desired results. The claimant still shoulders the burden of proof, and this must never be overlooked.
[1]Building Design Partnership Ltd v Standard Life Assurance Ltd [2021] EWCA Civ 1793, Cable v Liverpool Victoria Insurance Co Ltd [2020] EWCA Civ 1015, Amey LG v Cumbria County Council [2016] EWHC 2856 (TCC) and Imperial Chemical Industries Ltd v Merit Merrell Technology Ltd [2017] EWHC 1763 (TCC)
[2] Technology and Construction Court Guide (2nd edition, updated 2015) para 13.3.4
[3] See the Whitford Guidance in Champagne Louis Roederer v J Garcia Carrion, SA [2015] EWHC 2760 (Ch), para [29]
[4] Daubert v Merrell Dow Pharmaceuticals, Inc., 509 U.S. 579 (1993)
[5] “MFCU” stands for Medicaid Fraud Control Units
By September 2020, MicroStrategy CEO Michael Saylor realised the company had a problem. By his calculations, due to inflation and excessive central-bank money printing, the real yield on MicroStrategy’s sizeable cash reserves was negative 10%. Saylor described the technology company as having a “$500 million melting ice cube.” His solution? He exchanged the cash for Bitcoin and has been a buyer ever since, even to the point of issuing debt to acquire more. He is a firm believer that holding cash is far riskier than holding Bitcoin due to the latter’s inherent scarcity – only 21 million will ever be mined.
In some ways, the construction industry has the exact opposite problem to MicroStrategy. Instead of worrying about how to preserve the value of cash reserves, construction companies suffer from poor cash flow, arising from payment delays, cost overruns, low profitability, and disputes between parties.[1]
According to a recent survey of construction company owners,[2] 84% said they had cash flow problems from time to time, 20% said cash flow was a constant issue, and only 8% said they get paid on time. It may therefore come as no surprise that, according to UK statistics, the construction industry has the highest percentage of insolvencies of any industry.[3]
Can cryptocurrency offer hope to the beleaguered construction industry? In this article, we consider whether a particular type of cryptocurrency – stablecoins – could be part of the answer to addressing the construction industry’s cash flow issues.
Before discussing stablecoins specifically, however, we need a brief refresher/introduction to smart contracts and cryptocurrency.
Smart contracts and cryptocurrency within the construction industry
Many articles have been written about how smart contracts [4] could revolutionise the construction industry. [5] At the risk of over-simplification, smart contracts aim to improve efficiency, reduce disputes and add transparency. When a contract automatically self-executes as a result of certain conditions being met (e.g., materials are delivered on-site, an agreed-upon milestone deadline is met), there is no need for anyone to enforce the terms of the contract. This would allow, for example, for prompt payment to be made without recourse to a judge or an arbitrator.
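The self-executing payment logic described above can be sketched in a few lines. This is a conceptual illustration only, not real smart-contract code (which would typically be written for a blockchain platform); the condition names such as `materials_on_site` are invented for the example.

```python
from dataclasses import dataclass, field

@dataclass
class MilestonePayment:
    """Hypothetical sketch of smart-contract logic: payment releases
    automatically once all agreed conditions are confirmed (in practice,
    by an oracle feeding verified real-world events to the contract)."""
    amount: float
    conditions: dict = field(default_factory=dict)  # condition name -> met?
    paid: bool = False

    def confirm(self, condition: str) -> None:
        # Record that a condition has been verified, then re-check release.
        self.conditions[condition] = True
        self._try_release()

    def _try_release(self) -> None:
        # Self-executing: no human enforcement step once all conditions hold.
        if all(self.conditions.values()) and not self.paid:
            self.paid = True

milestone = MilestonePayment(
    amount=250_000,
    conditions={"materials_on_site": False, "milestone_certified": False},
)
milestone.confirm("materials_on_site")
assert not milestone.paid          # one condition still outstanding
milestone.confirm("milestone_certified")
assert milestone.paid              # payment released automatically
```

The point of the sketch is the absence of a discretionary step: once the agreed conditions are recorded as met, release follows mechanically, which is what removes the need for a judge or arbitrator to compel payment.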
In the legacy banking world, however, “prompt payment” can still involve significant time and money, particularly if there are cross-border payments to be made, or if it is a weekend or bank holiday and the bank is shut. For this reason, it may make good sense to consider incorporating cryptocurrency within the smart contract.
A cryptocurrency is a digital currency designed to work as a store of value and a medium of exchange; transactions are recorded on distributed ledger technology (a blockchain) and are cryptographically secured. Among the benefits of this technology are that transactions may be settled near-instantaneously, there is no risk of double-spending, and there is no need for a central authority or intermediary to ensure settlement takes place. Unlike with the legacy banking system, therefore, transactions using cryptocurrencies can be made on a 24/7 basis, can be significantly cheaper, and are fully transparent (since they are recorded on the blockchain).
Introducing stablecoins
Having explained certain basics, we now need to turn to selecting an appropriate digital currency for our smart contract. Although the author is a great fan of Bitcoin, one of its drawbacks is the sheer volatility in price. In October 2020, for example, it traded at around $10,000 per Bitcoin; the price rose to over $60,000 in April 2021 before retracing to around $29,000 in July 2021. It then hit an all-time high of $69,000 four months later in November 2021 and is currently around $30,000. Talk about a roller coaster! Clearly, it is hard to advocate for the use of Bitcoin in a construction (smart) contract if there remains fundamental uncertainty about the value (in fiat currency terms) of the eventual transaction.
Stablecoins were invented as a response to the need for a non-volatile cryptocurrency to facilitate everyday transactions. A stablecoin is a type of cryptocurrency that attempts to peg its price to an external reference point, usually a fiat currency or commodity. Unsurprisingly, given the importance of the US Dollar as the world’s de facto reserve currency, the overwhelming majority of stablecoins are pegged to the US Dollar.
Although a full guide to stablecoins is beyond the scope of this article, in general, there are three types of stablecoins:
(i) Centralised stablecoins – these coins are issued by a centralised authority and are purportedly fully backed by physical cash/cash equivalents, or commodities. They currently account for the largest share of the market by value. Examples: Tether, USDC.
(ii) Crypto-collateralised stablecoins – as per the description, such stablecoins use other digital cryptocurrencies as collateral. Example: Maker DAO.
(iii) Algorithmic stablecoins – these types of stablecoins are not directly backed by collateral but rather use smart contracts and an algorithmic mechanism to maintain the peg. Algorithmic stablecoins have historically had a high failure rate. For a while it seemed that one ‘algo’ (UST, issued by Terra Luna) might prove to be the exception, but it collapsed spectacularly in early May 2022. One possible reason for the failure of Luna was the decision of its founder, Do Kwon, to back this stablecoin with Bitcoin, a highly volatile asset.
Growth of stablecoins
The recent growth of stablecoins has been nothing short of phenomenal. As a result of a surge in adoption and demand, the total value of all stablecoins has increased from $1 billion in 2018 to around $160 billion today. [6]
Source: Author’s own research
Initially stablecoins were mostly used to purchase other cryptocurrencies on exchanges that were not connected to the banking network. Today, the use cases for stablecoins have greatly expanded; they are used for regular online payments, decentralised finance, gaming (e.g., in-app purchases) as well as crypto trading. With the advent of a new digital age, it seems likely that the market for stablecoins will grow exponentially.
Why stablecoins could benefit the construction industry
We have already explained why it might be beneficial to incorporate cryptocurrency within smart contracts, that stablecoins offer a non-volatile medium of exchange, and are cheaper and quicker than the legacy banking system allows. Beyond these advantages, however, are others which we discuss below.
(a) Stablecoins can easily be converted to and from fiat currency.
Due to advances in banking and network technology, it is relatively easy and cheap to switch between digital and physical forms of cash on a cross-border basis. For a construction company employing an army of foreign labour, for example, payments can be made efficiently to individuals via mobile banking. This can be especially beneficial to foreign workers who may not have traditional bank accounts. Such workers can then send money home using the same technology, without incurring the hefty fees of traditional money transfer agents.
(b) Stablecoins can provide high yield.
In today’s low interest rate environment, the returns on deposits in a ‘traditional’ bank account are meagre. Companies can use treasury management (e.g., overnight deposits) to maximise their returns, but will likely struggle to outpace banking fees, let alone inflation. One of the attractions cryptocurrencies offer is the yield/return that can be generated by staking digital assets. Today, there are countless different strategies one might pursue in the quest for generating yield; the choice of strategy will depend as always on one’s risk appetite and how long one is prepared to lock up funds.
One relatively low-risk option involves staking the USDC stablecoin to earn a 12-month fixed rate of 4.7%. This return drops to 4.2% for a 3-month deposit and 4.0% for a 1-month deposit. [7]
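The arithmetic behind those rates is simple to illustrate. The calculation below is a sketch only: it assumes simple interest pro-rated by term, and the deposit amount is invented for the example.

```python
# Illustrative arithmetic: returns on a staked stablecoin deposit at the
# fixed rates quoted above (4.7% over 12 months, 4.2% over 3 months,
# 4.0% over 1 month), assuming simple interest pro-rated by term.
deposit = 1_000_000  # hypothetical USDC amount

terms = {  # term in months -> annualised fixed rate
    12: 0.047,
    3: 0.042,
    1: 0.040,
}

for months, rate in terms.items():
    interest = deposit * rate * months / 12
    print(f"{months:>2}-month stake at {rate:.1%}: interest {interest:,.0f} USDC")
```

On these assumptions a $1m deposit earns $47,000 over the full year, against the near-zero return the same balance would generate in a traditional deposit account.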
Higher yields than the above are advertised on a variety of platforms, although the inevitable trade-off is assuming significantly greater risk.
It does not take much imagination to see how stakeholders in the construction industry could benefit from generating yield on stablecoin deposits. In a long-term construction project, for example, contractors/owners could earn a return on deposited funds until such time that these are released (e.g. when the conditions of a smart contract are met).
Conclusion
It is easy to forget how recently cryptocurrencies were invented. The Bitcoin whitepaper, which launched the world’s first and largest cryptocurrency, was only written in October 2008. Stablecoins are more recent still, with the first one invented only in 2014.
As a result, this is a technology that is still in its infancy, with much risk and uncertainty remaining about fundamental issues such as regulation. The collapse of the algorithmic stablecoin UST/LUNA in May will no doubt deter would-be participants in this new technology for some time to come. It is understandable that many will wish to exercise caution before taking the sort of leap of faith Michael Saylor took when he invested all of his company’s cash reserves in Bitcoin.
Putting that aside, however, the genie is truly out of the bottle and cryptocurrencies are likely to continue their incredible growth. For the reasons outlined in this article, it seems stablecoins may well offer a compelling solution to the chronic cash-flow issues endemic to the construction industry. Who Dares Wins?
The author wishes to thank Jennifer Carnegie for their kind assistance in carrying out research for this article.
___
[1] Carmichael, D. G., & Tran, H. (2012). Contractor’s financial estimation based on owner payment histories. Organization, Technology & Management in Construction: An International Journal, 4(2), 481–489. Source: https://hrcak.srce.hr/94292.
[3] According to ONS monthly insolvency statistics, 18% of all UK insolvencies between January 2019 and December 2021 were in construction. This compares with 13% for the hospitality & food sector and 8.4% for general manufacturing.
[4] A smart contract is a self-executing contract with the terms of the agreement between buyer and seller directly written into lines of code. The code and the agreements contained therein exist across a distributed, decentralized network. The code controls the execution, and transactions are trackable and irreversible. Source: Smart Contracts Definition (https://www.investopedia.com/terms/s/smart-contracts.asp).
[7] Circle | Institutional-Grade Crypto Yield. According to the product owner (Circle), this offer is ‘institutional grade’ with a ‘Clear regulatory framework’ and benefits from being ‘overcolleralized with bitcoin collateral’.
Various delay analysis methods can be used within disputes. The choice of methodology depends on many factors, such as the nature of the project, contractual requirements, the availability and reliability of information, and the time allocated to conduct the analysis, among others.
To justify the selection of a given delay analysis method, analysts and experts often refer to two guidance documents: the Delay and Disruption Protocol published by the UK Society of Construction Law (SCL Protocol) and Recommended Practice 29R-03 issued by the US-based Association for the Advancement of Cost Engineering International (RP 29R-03).
When a delay analysis is conducted during project execution, immediately or shortly after the occurrence of a delay event, the associated delay impact has yet to materialise. In this situation, delays can only be assessed on a prospective basis – i.e., through contemporaneous analysis of delay. The SCL Protocol recommends, in the absence of contradictory contractual or legal requirements, carrying out a time impact analysis.[1] RP 29R-03 provides no contemporaneous guidance, as the document focuses solely on delay analysis carried out retrospectively.
When delay is analysed a significant time after the delay event and its effect (retrospective analysis of delay), the 2nd edition of the SCL Protocol recognises that one size does not fit all, and provides a list of six commonly used delay analysis methodologies:
Impacted As-Planned Analysis;
Time Impact Analysis;
Time Slice Window Analysis;
As-Planned versus As-Built Window Analysis;
Retrospective Longest Path Analysis; and
Collapsed As-Built Analysis.
Except for the retrospective longest path analysis, the same delay analysis methodologies are also referenced in RP 29R-03.
The strengths and limitations of each of these methodologies have been commented on by the judiciary for years and tested to some degree in court. Articles discussing delay analysis methodologies tend to advocate the adoption of certain methods and the dismissal of others. Ultimately, it is difficult to get a clear picture of which of the six methods are most frequently used in practice to perform delay analysis retrospectively.
This article seeks to provide some answers by conducting a statistical analysis on delay analysis techniques based on a representative sample established using data from major capital projects around the globe on which HKA has provided delay analysis services. The purpose of this analysis is not to rank delay analysis methods based on how frequently they are employed, but rather to identify major trends and discuss why some methods have been rising or declining over the last decade.
Data Collection
To ensure the analysis produces the most meaningful results, only projects meeting all the following selection criteria were included within the representative sample:
The choice of delay methodology was not expressly prescribed in the contract.
The delay analysis prepared to determine causes of delay was:
conducted retrospectively;
diligently prepared and reasonably substantiated; and
transmitted to the opposing party on or after 1 January 2010.
The Responding party reviewed the delay analysis and provided a substantiated reply or an alternative delay analysis.
The amount claimed in relation to delay (prolongation) exceeded $5 million.
In addition, because the nature and location of the project may influence the choice of a delay analysis methodology, projects coming from all sectors and regions were considered as part of this analysis. Figure 1 below shows a breakdown by sector and regions of projects included within the representative sample.
To preserve confidentiality, data collection was limited to:
the date at which the Claimant served the delay analysis to the Respondent;
delay analysis methods used by the Claimant and the Respondent; and
the region where the project was located.
Analysis Results
Results from the statistical analysis show that all six methods listed in the SCL Protocol were applied between 2010 and 2021. As illustrated in Figure 2 below, these six delay analysis techniques were used about 85% of the time to analyse delay retrospectively by either the Claimant, the Respondent, or both. Other methods of analysing delay account for only 2%, and the remaining 13% represents situations where the Respondent simply issued a criticism of the Claimant’s delay analysis.
As shown in the above bar chart, an as-planned versus as-built approach was chosen about half of the time. This method was not always carried out in windows as described in the SCL Protocol, although on complex, multi-year projects the project duration was usually subdivided into smaller time periods.
The other five methods lag far behind. The second most used technique – time slice window analysis – was chosen only 12% of the time, and the remaining four methods were all below the 10 percent mark.
Yet these methods were not employed in the same proportion by the Claimant and the Respondent. The same method was selected by both parties only 34% of the time. As shown in Figure 3 below, the biggest difference concerns the impacted as-planned method, which is relied upon four times more often by the Claimant than by the Respondent. This delay analysis technique, which consists of modelling delay events in a logically linked baseline schedule to assess the (theoretical) impact on project completion, has a reputation for favouring the Claimant, as delay events at the Respondent’s risk are more likely to be identified and reflected in the impacted schedule than those for which the Claimant is liable.
It can be argued the analysis confirms this hypothesis. A closer review of the collected data shows the impacted as-planned method was employed by both the Claimant and the Respondent on only one occasion to determine causes of delay retrospectively. The reason invoked by both parties was a lack of reliable as-built information.
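The impacted as-planned mechanics described above can be sketched in miniature: delay events are inserted into a logically linked baseline programme, and the (theoretical) impact on completion is read from the recalculation. Activity names and durations below are invented for illustration.

```python
def completion(durations: dict, links: list) -> int:
    """Forward pass through a simple finish-to-start activity network.
    `durations` must be listed in a logical (topological) order."""
    finish = {}
    for act in durations:
        preds = [p for p, s in links if s == act]
        start = max((finish[p] for p in preds), default=0)
        finish[act] = start + durations[act]
    return max(finish.values())

# Hypothetical baseline programme (durations in days).
baseline = {"mobilise": 10, "foundations": 30, "structure": 60, "fit_out": 40}
links = [("mobilise", "foundations"), ("foundations", "structure"),
         ("structure", "fit_out")]

planned = completion(baseline, links)  # 140 days as planned

# Model an employer-risk delay event: 15 days added to foundations.
impacted = dict(baseline, foundations=baseline["foundations"] + 15)
print(f"Planned: {planned} days, impacted: {completion(impacted, links)} days")
```

The sketch also shows why the method attracts the criticism noted above: the impact is purely theoretical, derived from the baseline logic, and says nothing about what actually happened on site, which is why reliable as-built records push parties towards the effect-first methods.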
It is also interesting to review the evolution of delay analysis method selection over time. Dividing the timeframe of the analysis into two equal periods, it can be observed that the methods identifying delay impacts first (i.e., the effect) – namely as-planned versus as-built, time slice window analysis, and retrospective longest path – have all been on the rise during the decade. Conversely, methods modelling delay events first (i.e., the cause) to assess the associated delay impact later – time impact analysis, impacted as-planned, and collapsed as-built – have all declined sharply since 2016. This is exemplified by the time impact analysis, which fell from 20 percent to zero, as shown in Figure 4 below.
According to the above bar chart, the choice of the As-planned versus As-Built method nearly doubled over the last decade. This method was also more often formalised as a windows analysis since 2016; a detailed review of the data shows a three-fold increase occurred between the two analysis sub-periods.
Furthermore, looking at the results by geographical region, there is a notable difference between the Americas and the rest of the world. As shown in Figure 5 below, the time slice window analysis is preferred 31% of the time in the Americas, which is more than twice the global average (12%). All other delay analysis techniques are chosen less often in that region (except for the collapsed as-built analysis). Despite these differences, the as-planned versus as-built approach remains the most commonly used technique and the overall order of preference remains the same.
Discussion
The statistical analysis shows a large disparity between the different delay analysis techniques.
The As-Planned versus As-Built methodology is the most frequently used at any point in time, regardless of the region or the nature of the project. It could be argued this technique is the most versatile and preferred by the parties to analyse delay retrospectively whenever the project, contract and legal requirements allow it.
By contrast, the impacted as-planned and collapsed as-built methodologies are used less frequently and only in very specific situations.
The interpretation of results for the remaining three methods is less straightforward.
Time impact analysis
The usage of time impact analysis to conduct delay analysis retrospectively has been debated for a long period. This method is highly praised by some and heavily criticised by others.[2]
The first edition of the SCL Protocol, published in 2002, stated time impact analysis was “to be used wherever the circumstances permit, both for prospective and (where the necessary information is available) retrospective delay analysis”.[3] Abandonment of the preference for retrospective time impact analysis was announced by the SCL Protocol review committee in July 2015.[4]
Analysis results show the influence of this change of position, and more generally, of the SCL Protocol. Until 2015, the time impact analysis was the second most used technique (20% of the time). Between 2016 and 2021, the time impact analysis method was not used retrospectively on any of the sample projects.[5]
Irrespective of the position set forth in the SCL Protocol, and the amount of discussion regarding the reasonableness of this method to conduct retrospective delay analysis, the present analysis shows that time impact analysis has never been on the verge of replacing the As-planned versus As-built analysis as the most commonly used method. The time and effort required to perform a time impact analysis was probably a limiting factor for its widespread adoption.
Time slice window analysis
The analysis shows that time slice window analysis is the second most commonly used method for analysing delay retrospectively. Analysis results tend to confirm this method is on the rise, albeit mainly in the Americas.
Multiple reasons could explain the preference for the time slice window analysis over the other methods:
commonly accepted as a robust method when reliable programmes are available;
ability to determine the as-built critical path based on CPM contemporaneous programme;
less time-consuming and easier to explain than a time impact analysis;
more robust than an impacted as-planned or a retrospective as-built; or
a combination of the above.
Yet none of the above reasons are readily apparent from the results of the statistical analysis.
It could also be argued that the revision of the SCL Protocol rescinding the recommendation for the retrospective time impact analysis may have benefited the time slice window analysis. It should be underlined that the influence of the SCL Protocol is more limited in the United States as RP 29R-03 predominates[6], and no similar increase was witnessed on other continents. A possible influence of the SCL Protocol therefore remains to be established.
Retrospective longest path analysis
The influence of the SCL Protocol is more apparent for the retrospective longest path analysis. This method was not used at all between 2010 and 2016 on any of the projects in the sample; it was then applied 8% of the time in the subsequent period. The inclusion of this technique in the 2nd edition of the SCL Protocol has almost certainly contributed to its recent increase in adoption.
Conclusion
The statistical analysis presented in this article provides some trends on delay analysis methods used between 2010 and 2021.
The as-planned versus as-built analysis is the most commonly used technique for both claimants and respondents. The preference for this method has increased in recent years, and it remains prevalent across all sectors and regions.
More generally, methods that start by assessing the effects of delay have progressively gained preference over those that start with its causes. The use of time impact analysis for retrospective delay analysis has declined in recent years, and the impacted as-planned and collapsed as-built techniques have also been employed less frequently.
The revision of the SCL Protocol has most certainly played an important role in this transition, although its reach and influence are less noticeable in the Americas.
RESOLVE DISPUTES WITH HKA
HKA has over 40 years of extensive global consultancy experience, specialising in a diverse range of sectors and industries. Our key services cover dispute resolution, claims consultancy, and advisory support.
We distinguish ourselves by effectively addressing complex issues across a wide range of projects, offering in-depth analysis, guidance, and assistance throughout each phase of a dispute.
Moreover, we work with a diverse client base, including owners, operators, contractors, subcontractors, law firms, and government agencies, customising our services to cater to their specific needs.
[1] Society of Construction Law – “Delay & Disruption Protocol” (2017, 2nd edition), para. 4-2 to 4-12
[2] E.g., John Livengood – “Retrospective TIAs: Time to Lay Them to Rest” (2008, AACE International Annual Meeting); Robert M. D’Onofrio – “Ranking AACE International’s Forensic Schedule Analysis Methodologies”, Cost Engineering Magazine (July/August 2015, AACE International)
[3] Society of Construction Law – “Delay & Disruption Protocol” (2002, 1st edition), para. 3.2.11
[4] Society of Construction Law – “Delay & Disruption Protocol” (2015, Rider 1), para. 14-15
[6] Battrick et al. – “Bringing Order To The Delay Melee: Understanding the SCL Delay & Disruption Protocol and AACE RP 29R-03” (2018, SCL North America), p. 3, para. 1