Contents
Purpose and Value
“Software architectures are neither intrinsically good nor intrinsically bad; they can only be evaluated with respect to the needs and goals of the organizations which use them.” [1]
Software Engineering Institute
Introduction
A common task for architecture teams and architecture review boards is to evaluate how well a given solution’s architecture meets its needs. Architects often develop a list of “non-functional requirements” or “-ilities” (e.g. scalability, modifiability, accessibility, reliability) with which to develop evaluation criteria. Many organizations adopt a defined set of these “quality characteristics”, together known as a Quality Model, so that they can be used consistently and, over time, become a language with widespread shared understanding. Quality models, such as ISO 25010 [2], are often used to form an attribute-based methodology for evaluating software architectures. Whilst these approaches can justifiably be comprehensive, involving substantial time and effort and requiring the definition of a large set of scenarios or architecturally significant requirements, there is often a need for a lightweight approach: one that takes advantage of the 360-degree view provided by a quality model while avoiding the cost of developing and evaluating a tailored list of scenarios.
This Solution provides support for just such a lightweight approach to architecture evaluation. It can be used to conduct health checks of technology solutions that have already been implemented, or of future solutions that do not yet exist but whose architectures can be described. Support for comprehensive architecture review methods and trade-off analyses such as ATAM [3] and SARM [4], including comparative evaluation of alternative architectures, is outside the scope of this Solution, but addressing this related topic may be prioritized in the future.
Purpose
The purpose of this Solution is to support the conducting, reporting and recording of a lightweight evaluation of a solution architecture. A quality model provides the structure of the evaluation and the subsequent report; Ardoq surveys are used to set up the evaluation and capture its findings, and visualizations support the evaluators as they prepare their conclusions. New Risks (see Application Risk Management Solution) and Technical Debt Items (linking to the Technical Debt Management Solution) may be identified and recorded during the evaluation, and concluding recommendations may be supported by the creation of a new Initiative as an implementation vehicle (linking to Strategy to Execution Insights & Impacts Solution).
The Solution enables its users to address the following business questions:
Which quality characteristics (e.g. Reliability, Security, Usability) make up a solution’s quality criteria?
How well does a solution’s architecture meet its quality criteria?
How might a solution’s architecture be improved to meet its quality criteria more effectively?
Has the health check revealed any new technical debt that needs recording and addressing? *
Are there new risks that should be recorded associated with a solution’s current architecture? *
What new initiatives should be proposed to address the review team’s recommendations? *
Who has approved a given Solution Health Check?
When should a health check be conducted on this technology solution?
An automated process is provided to support regular evaluation of key solutions, ensuring that audit and compliance requirements for such evaluations can be met. The questions marked with an asterisk (*) can only be addressed if this Solution is used in conjunction with another Solution (i.e. Technical Debt Management, Application Risk Management or Compliance Assurance, and Strategy to Execution respectively).
Delivering value with this Solution
Evaluating business systems to determine how well their architectures support their requirements is a key activity for architecture teams. This Solution delivers value by providing a structured, repeatable, efficient and effective lightweight evaluation method. It gives all your evaluations a consistent approach, avoiding the need to design a bespoke agenda for each evaluation while ensuring, through its use of a standardized quality model, that solutions are viewed “in the round”, using a simple but expressive language that brings together business and IT stakeholders.
The use of a standardized quality model brings rigor and consistency to the adopted approach. Several example quality models are provided, including ISO 25010, and customers can additionally develop and use a bespoke quality model tailored to the specific needs of their organization. The Solution follows the approach described in How to use Architecture Reviews to identify Technical Debt, and supports the adoption of published methods and approaches, from their origins in “Active Design Reviews” [5] to more recent methods such as TARA from Eoin Woods [6], the Bank of England’s business system architectural health checks [7] and checklist-based approaches [8, p. 257].
By using a standardized quality model, the Solution encourages the spread of architectural thinking across an organization through the consistent use of a comprehensive but simple language for describing system quality. The evaluation approach, combined with the language of quality, can build shared understanding (among IT and business stakeholders alike) of both what qualities a solution needs to realize, and how well that solution achieves them. By following this approach, a team can decide the significance of any identified gaps between what is desired and what is achieved.
This Solution is specifically aimed at supporting evaluations of the architectures of IT solutions. The Business Health Check Solution is its business capability evaluation counterpart, and may often be the starting point of a chain of evaluations: a concern about the health of a business capability may lead to a Business Health Check, which in turn identifies a concern about the health of an application that contributes to the realization of that capability, triggering a request for a Solution Health Check of that application. Other common triggers for Solution Health Checks are audit reports, regulatory requirements, or governance processes that require periodic evaluations of business-critical applications.
Scope and Rationale
The Evaluation Process
Figure 1 - The evaluation process
The process supported by the Solution is as follows:
Step 1: Create a new Solution Health Check
A new Solution Health Check is set up using the New Solution Health Check survey. The Application that is the subject of the Health Check is identified, and the quality characteristics that will form the evaluation criteria are selected from the organization’s standard Quality Model (see Using a Quality Model below for more details). The relative importance of each of the chosen quality characteristics is recorded on a scale of 1.0 to 5.0.
Step 2: The panel reviews each quality characteristic
In a workshop, the panel of experts review the solution architecture, considering its ability to meet the needs of each relevant quality characteristic in turn. This review might involve exploration of architecture documentation, as well as interviewing key stakeholders, such as the solution architect, business sponsor, lead engineer or specialists in areas such as security, infrastructure and service design. They record their level of concern about each quality characteristic on a scale of 1.0 to 5.0, with accompanying comments and recommendations.
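To make the scoring model concrete, the sketch below (in Python, with hypothetical field names; in practice Ardoq captures these values through the surveys) shows the data that accumulates for each Health Check: an importance score per characteristic from step 1 and a concern score from step 2.

```python
from dataclasses import dataclass

@dataclass
class CharacteristicAssessment:
    """One quality characteristic as scored during a Health Check.

    Both scores use the Solution's 1.0-5.0 scales: importance is set
    when the Health Check is created (step 1); concern is agreed by
    the panel in the review workshop (step 2).
    """
    name: str            # e.g. "Reliability", or a sub-characteristic
    importance: float    # 1.0 (low) to 5.0 (high)
    concern: float       # 1.0 (low) to 5.0 (high)
    comments: str = ""   # the panel's notes and recommendations

# A hypothetical Health Check after the workshop:
assessments = [
    CharacteristicAssessment("Reliability", importance=5.0, concern=4.5),
    CharacteristicAssessment("Security",    importance=4.0, concern=2.0),
    CharacteristicAssessment("Usability",   importance=2.5, concern=3.5),
]
```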
Step 3: The panel document their conclusions
Having reviewed each quality characteristic in turn, the panel will wish to step back and reach an overall conclusion. This is supported with bubble chart visualizations in Ardoq that allow the panel to see the quality characteristics and their scores (importance and level of concern) together on a single map.
Figure 2 - Bubble chart visualization
Guidance is given on which characteristics need immediate attention, which should be improved over time, and which are satisfactory. Using this information, the panel can summarize their conclusions and, if the corresponding additional Solution has been deployed, decide whether there is a need to record specific debt items for remediation, add a new risk to the Risk Register, or create a new Initiative to implement their recommendations. Approval of the outcome can be formally recorded, and a date range can be provided indicating the period during which this Health Check is applicable.
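That guidance follows directly from the two scores each characteristic carries. Below is a minimal sketch of the quadrant logic behind the bubble chart, assuming a midpoint of 3.0 on each 1.0 to 5.0 axis; the thresholds are illustrative, not the Solution’s definitive rules.

```python
def triage(importance: float, concern: float, midpoint: float = 3.0) -> str:
    """Place a quality characteristic in a bubble-chart quadrant.

    High concern and high importance (top right): act now.
    High concern but lower importance: improve over time.
    Low concern: satisfactory.
    """
    if concern >= midpoint and importance >= midpoint:
        return "needs immediate attention"
    if concern >= midpoint:
        return "should be improved over time"
    return "satisfactory"

print(triage(importance=5.0, concern=4.5))  # needs immediate attention
print(triage(importance=2.5, concern=3.5))  # should be improved over time
print(triage(importance=4.0, concern=2.0))  # satisfactory
```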
Step 4: Repeat Health Checks are scheduled
If no date range with a future end date has been provided, the Health Check was effectively a one-off exercise that has now concluded. If a date range is provided and its end date is in the future, a message can be automatically triggered to the Enterprise Architect and/or the Application owner a month before the end date, asking them to schedule the next Health Check so that compliance is maintained without a break.
Broadcasts are provided to support step 4. They can be enabled or disabled depending on whether this step is required and whether automation of it is desired.
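The timing rule in step 4 is simple enough to state precisely. A sketch follows, assuming a 30-day reminder window (the Broadcast itself is configured in Ardoq; this code only illustrates the logic):

```python
from datetime import date, timedelta

def reminder_due(end_date: date | None, today: date) -> bool:
    """A reminder Broadcast is due when the Health Check has a future
    end date and today falls within a month of it."""
    if end_date is None or end_date <= today:
        return False  # a one-off Health Check, or one that has already lapsed
    return today >= end_date - timedelta(days=30)

# A Health Check valid until 1 August 2025 triggers its reminder from 2 July:
print(reminder_due(date(2025, 8, 1), today=date(2025, 7, 5)))  # True
print(reminder_due(date(2025, 8, 1), today=date(2025, 6, 1)))  # False
print(reminder_due(None,             today=date(2025, 7, 5)))  # False
```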
Using a Quality Model
In 1991, the International Organization for Standardization, ISO, published ISO 9126, a software product evaluation standard that “defines six characteristics that describe, with minimal overlap, software quality” [9]. A few years later, the Software Engineering Institute at Carnegie Mellon University published a technical report, Quality Attributes [10], introducing a generic taxonomy of software quality attributes. The idea of using a quality model to structure the evaluation of software solutions was gaining ground.
ISO 9126 was replaced by ISO 25010 [2], which goes beyond the evaluation of software products to provide a “System and software quality model”, and which was itself revised in 2023. We are grateful to the British Standards Institution for granting us permission to include a representation of the ISO 25010 quality model with this Solution, alongside the Ardoq quality model.
Figure 3 - ISO 25010 Quality Model
Figure 4 - Ardoq Quality Model
We also include the GOV.UK Service Standard [11], which has become widely adopted as a quality model for evaluating service designs.
Figure 5 - GOV.UK Service Standard
We recommend that an organization choose a preferred quality model, adopting and, where appropriate, modifying one of those provided with this Solution, and socialize it as a widely used language among those interested in the quality of the systems and services they provide and consume. This Solution focuses on using the quality model to evaluate solution architectures, but the same language can also be used to classify technical debt (see Technical Debt Management Solution) and to classify stories, use cases and requirements.
With the Solution Health Check, the quality model is used as a checklist. Step 1 of the evaluation process involves selecting the quality characteristics or sub-characteristics that will form the checklist. Alternative versions of the New Solution Health Check survey are provided; choose between them according to whether you wish to run a very lightweight agenda based on a selection of just the top-level Quality Characteristics, a more detailed agenda using a selection of the Quality Sub-characteristics, or a mix of the two.
More detailed architecture review methods, such as ATAM and SARM, also make use of a quality model. Whilst these approaches are beyond the scope of this Solution, addressing them may be prioritized in the future.
As an alternative to adopting a dedicated Quality Model, many organizations have defined and adopted a set of Architecture Guiding Principles. The TOGAF Standard [12] highlights the potential use of Architecture Guiding Principles “as a guide to establishing relevant evaluation criteria”, and some organizations choose to use their architecture guiding principles as a checklist or set of criteria against which to evaluate the quality of their solutions. See How to represent Policies, Principles, Standards and Frameworks in Ardoq for details of how to model your architecture guiding principles in Ardoq. The Quality Models provided with this Solution follow the same approach, and so substituting your architecture guiding principles for the supplied quality models is easily accomplished. See the Solution Health Check Configuration Guide for details of how to switch from the Ardoq Quality Model to an alternative.
Solution Health Check
Once the criteria have been selected for the Health Check, a panel of experts can be assembled to collectively review the solution architecture that is under the spotlight. The composition of this panel will likely vary depending on the nature of the solution under review and the quality criteria that have been selected. A wide-ranging review might include the following roles:
Business sponsor
User(s)
Technical owner
Business architect
Solution architect
Support specialist(s)
Security architect
Infrastructure, platform or cloud specialist
Service Management representative
Other peer solution and specialist architects
It is important that all members of the panel have a good understanding of the solution: its purpose, its use, and its architecture. The first part of the workshop should include a detailed explanation of these, perhaps reminding participants of documentation that may have been circulated in advance of the workshop.
The Assessment Survey is used as a checklist that can be completed collaboratively in the workshop, inviting the panel to discuss each significant quality characteristic or sub-characteristic in turn. It asks them to agree on a level of concern on a 1.0 to 5.0 scale, and provides space for them to elaborate on this with further comments and recommendations. The survey questions include the description of each characteristic for information purposes, but we chose not to include the importance score that was specified in step 1 of the process, as this might bias the panel’s collective opinion. It would be easy to include this as a read-only field in the survey, if desired.
Once the panel members have reviewed each of the criteria in turn, we recommend they take a step back and look at an overview of their levels of concern before reaching a final conclusion. This avoids them being overly influenced by the most recently discussed characteristics. The bubble chart, shown in Figure 2 above, provides a useful overview. Their attention should be focused on the upper half of the chart, which shows the characteristics about which they have expressed more serious concern, and in particular on the top right-hand quadrant, which contains the characteristics that are both important for this system and a matter of serious concern.
Now they can reach their overall recommendations and conclusions, which can be recorded in the Conclusions survey.
The true benefit of conducting a Health Check workshop is often derived from ideas sparked by the discussions the process triggers, rather than from the task of agreeing a set of scores. It is a good idea to document ideas as they occur: they may relate to improving the solution or the processes and services around it, or they may be concerns or issues that belong in other systems of record, such as risk registers or technical debt backlogs.
If the Compliance Assurance Solution or Application Risk Management Solution has been deployed, newly identified Risks can be added to the Risk Register. If the Technical Debt Management Solution has been deployed, new Debt Items can be added to the backlog. And if the Strategy to Execution Series has been adopted, a new Initiative can be created to address the panel’s recommendations. These are valuable steps if you want to ensure that Health Checks lead to action, rather than merely a record of a solution’s deficiencies.

The Solution’s dashboard contains a report of those Health Checks that have urgent concerns (i.e. qualities that are both important and have a high level of concern), and highlights which of these have related Initiative components taking action to address them, either directly associated with the Health Check or via a related Debt Item. Those with no related Initiatives are listed first. If the Technical Debt and Strategy to Execution Solutions have not been deployed, the dashboard’s report will still correctly list Live Solution Health Checks with Urgent Concerns, but the Action column will always contain the value “None”.
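The report’s selection and ordering rule can be sketched as follows, using hypothetical field names and an illustrative threshold of 4.0 (the real report is an Ardoq report over Health Check components and their references):

```python
from dataclasses import dataclass

@dataclass
class ConcernRow:
    health_check: str
    characteristic: str
    importance: float   # 1.0-5.0
    concern: float      # 1.0-5.0
    has_action: bool    # a related Initiative, directly or via a Debt Item

def urgent_concerns(rows: list[ConcernRow], threshold: float = 4.0) -> list[ConcernRow]:
    """Keep characteristics that are both important and concerning,
    listing those with no related Initiative first."""
    picked = [r for r in rows if r.importance >= threshold and r.concern >= threshold]
    return sorted(picked, key=lambda r: r.has_action)  # False sorts before True
```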
In some organizations with strict governance, a Health Check is like a test that is either “passed” or “failed”. We have chosen not to include this aspect in our default implementation, but this can be easily added. The Solution Health Check component type adopts the Architecture Record pattern of the Architecture Records Solution metamodel. This pattern includes a Decision YesNo field that can be added to the Solution Health Check component type to record the “Pass” or “Fail” verdict of the panel.
Some organizations will wish to record approval of the panel’s conclusion, and this is facilitated with a set of Approved By references to those with the authority to sign off the report.
Looking at the bigger picture
An organization that conducts regular Health Checks across its most important business systems can use the resulting information to assess the overall health of the estate and gain insights into the success of its architectural efforts. Each Health Check contains an assessment of the relative importance of the quality characteristics and sub-characteristics in the chosen Quality Model, and the panel members record their level of concern regarding each of these at the time of the assessment. Taken as a whole, these data can provide the architecture team with valuable insights that they can use to refine their work. They can also provide evidence of the success or failure of the team’s efforts.
The Solution’s Dashboard contains a report entitled Quality Characteristic Analysis. This aggregates the importance and level of concern scores for all Health Checks that have been conducted and shows them by Quality Characteristic, providing the minimum, average and maximum scores of both importance and level of concern. Where sub-characteristics have been evaluated, these are included in the calculations for their respective parent characteristics. The report allows you to see which are the most important quality characteristics across your estate, which have a wide range of importance, and whether any are consistently a source of serious concern. This last indicator can guide the team to explore improvements in, for example, specialist architectural skills, standards, patterns or reference architectures.
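A sketch of that aggregation, including the roll-up of sub-characteristics into their parents, is shown below (the row layout is hypothetical; the Solution computes this in an Ardoq report):

```python
from collections import defaultdict
from statistics import mean

# (characteristic, parent or None, importance, concern), one row per assessment:
rows = [
    ("Reliability",  None,          5.0, 4.5),
    ("Availability", "Reliability", 4.0, 3.0),  # sub-characteristic
    ("Security",     None,          4.0, 2.0),
]

def characteristic_analysis(rows):
    """Min/average/max of importance and concern per top-level characteristic,
    with sub-characteristic scores counted toward their parent."""
    scores = defaultdict(lambda: {"importance": [], "concern": []})
    for name, parent, importance, concern in rows:
        key = parent or name  # sub-characteristics count toward their parent
        scores[key]["importance"].append(importance)
        scores[key]["concern"].append(concern)
    return {
        key: {metric: (min(vals), mean(vals), max(vals))
              for metric, vals in metrics.items()}
        for key, metrics in scores.items()
    }

print(characteristic_analysis(rows)["Reliability"]["concern"])  # (3.0, 3.75, 4.5)
```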
The Dashboard also shows a line chart tracking the average level of concern, combined across all quality characteristics, for all live Health Checks (i.e. those that have been completed but not yet retired). If an organization’s architecture capability is increasing in effectiveness, one would expect this number to gradually decrease.
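The tracked number itself is just a flat average over every concern score in every live Health Check, as in this sketch:

```python
from statistics import mean

def estate_concern(live_health_checks: list[list[float]]) -> float:
    """Average level of concern across all characteristics of all live
    (completed, not yet retired) Health Checks."""
    return mean(score for hc in live_health_checks for score in hc)

# Two live Health Checks; the number should fall as concerns are addressed.
print(estate_concern([[4.5, 2.0, 3.5], [3.0, 2.5]]))  # 3.1
```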
Further Reading
Bibliography
[1] R. Kazman, G. Abowd, L. Bass, and M. Webb, “SAAM: A Method for Analyzing the Properties of Software Architectures,” presented at the 16th International Conference on Software Engineering, May 1994.
[2] ISO: the International Organization for Standardization, “ISO/IEC 25010:2023,” ISO. Accessed: Feb. 11, 2024. [Online]. Available: https://www.iso.org/standard/78176.html
[3] R. Kazman, M. Klein, and P. Clements, “ATAM: Method for Architecture Evaluation,” Software Engineering Institute, Carnegie Mellon University, Technical Report CMU/SEI-2000-TR-004, Aug. 2000.
[4] S. Field, “SARM,” SARM: A site devoted to the Solution Architecture Review Method. Accessed: Feb. 14, 2024. [Online]. Available: https://sarm.org.uk/welcome/
[5] D. L. Parnas and D. M. Weiss, “Active design reviews: principles and practices,” in Proceedings of the 8th international conference on Software engineering, in ICSE ’85. Washington, DC, USA: IEEE Computer Society Press, Aug. 1985, pp. 132–136.
[6] E. Woods, “Industrial architectural assessment using TARA,” J. Syst. Softw., vol. 85, no. 9, pp. 2034–2047, Sep. 2012, doi: 10.1016/j.jss.2012.04.055.
[7] BSI, “BSI Standards Awards 2022 Winners.” 2022. [Online]. Available: https://www.bsigroup.com/globalassets/documents/about-bsi/awards/bsi-standards-awards-2022-winners-announcement.pdf
[8] P. Clements, R. Kazman, and M. Klein, Evaluating Software Architectures: Methods and Case Studies. Addison Wesley, 2001.
[9] International Organization for Standardization, “ISO/IEC 9126 - Software engineering product quality,” International Organization for Standardization, ISO/IEC 9126, 1991.
[10] M. Barbacci, M. H. Klein, T. A. Longstaff, and C. B. Weinstock, “Quality Attributes,” Software Engineering Institute, Carnegie Mellon University, Technical Report CMU/SEI-95-TR-021, Dec. 1995.
[11] “Service Standard - Service Manual - GOV.UK.” Accessed: May 20, 2024. [Online]. Available: https://www.gov.uk/service-manual/service-standard
[12] “The TOGAF® Standard, 10th Edition | www.opengroup.org.” Accessed: Feb. 03, 2025. [Online]. Available: https://www.opengroup.org/togaf/10thedition