Transcript:


Jackie Erickson

You are listening to the Edge Case Research Self Driving Car Safety Series, and in this episode, Phil Koopman addresses conformance-based metrics. Many companies publicly state that they “incorporate processes from industry standards.” However, that statement does not provide insight into how well the standard has been covered, or whether the company actually conforms to the standard. Let’s hear Phil walk through questions that can be asked to determine whether a system partially or fully conforms to a standard.
Now over to Phil.

Phil Koopman

This is Phil Koopman from Edge Case Research with a series on self driving car safety. This time I’ll be talking about metrics based on how well you’ve covered a safety standard. A typical software or systems safety standard has a large number of requirements, often called clauses, that must be met to conform. An example of a clause might be “all hazards shall be identified,” and another clause might be “all identified hazards shall be mitigated.” There are often extensive tables of engineering techniques or technical mitigation measures that need to be performed based on the risk presented by each hazard. For example, a low risk hazard might just need normal software quality practices, whereas a life critical hazard might need dozens or hundreds of very specific safety and software quality techniques to make sure the software is not going to fail in use. The higher the risk, the more table entries need to be performed in design, validation, and deployment.
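As a rough illustration of how these risk-based tables work, here is a minimal Python sketch. The technique names and integrity levels are entirely hypothetical placeholders, not entries from any real standard; the point is only that a higher risk level requires more table entries to be performed.

```python
# Hypothetical mapping from integrity level to the engineering techniques
# its table requires; higher risk levels require strictly more techniques.
REQUIRED_TECHNIQUES = {
    "QM":     {"code review"},
    "ASIL-A": {"code review", "unit test coverage"},
    "ASIL-B": {"code review", "unit test coverage", "static analysis"},
    "ASIL-D": {"code review", "unit test coverage", "static analysis",
               "MC/DC coverage", "fault injection", "independent verification"},
}

def missing_techniques(integrity_level: str, performed: set[str]) -> set[str]:
    """Return the table entries still outstanding for the given risk level."""
    return REQUIRED_TECHNIQUES[integrity_level] - performed

# A life critical hazard with only normal quality practices performed
# leaves many required techniques outstanding.
print(missing_techniques("ASIL-D", {"code review", "static analysis"}))
```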

The simplest metric related to a safety standard is a simple yes/no question: do you actually conform to the standard? However, there are nuances that matter. Conforming to a standard might mean a lot less than you might think, for a number of reasons. So one way to measure the value of that conformance statement is to ask about the scope of the conformance and any assessment that was performed to confirm it. For example, does the conformance cover just the hardware components and not the software, or both hardware and software? It’s fairly common to see claims of conformance to a rigorous safety standard that only covered the hardware, and that’s a problem if a lot of the safety critical functionality is actually in the software.

If it does cover the software, what scope? Is it just the self test software that tests the hardware? Does it include the operating system? Does it include all the application software that’s relevant to safety? What is the claim of conformance actually made about? Is it just a single component within a very large system? Is it a subsystem? Is it the entire vehicle? Does it cover both the vehicle and its cloud infrastructure, and the communications to the cloud? Does it cover the system used to collect training data that is assumed to be accurate to create a safety critical machine learning based system? And so on. So if you see a claim of conformance, be sure to ask exactly what the claim applies to, because it might not be everything that matters for safety.
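To make these scope questions concrete, here is a hedged sketch of a record one might keep for each conformance claim so the answers are explicit rather than implied. Every field name and value here is illustrative, not taken from any standard.

```python
from dataclasses import dataclass, field

@dataclass
class ConformanceClaim:
    """Illustrative record of what a conformance claim actually covers."""
    standard: str                       # e.g. "ISO 26262"
    covers_hardware: bool
    covers_software: bool
    software_scope: list[str] = field(default_factory=list)  # e.g. ["self-test", "OS", "application"]
    system_boundary: str = "component"  # "component" | "subsystem" | "vehicle" | "vehicle+cloud"
    covers_training_data_pipeline: bool = False

claim = ConformanceClaim("ISO 26262", covers_hardware=True, covers_software=False)
if not claim.covers_software:
    print("Warning: claim omits software -- ask what covers the safety-critical code.")
```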

Also, conformance can have different levels of credibility, ranging from “well, it’s in the spirit of the standard”; or “we use an internal standard that we think is equivalent to this international standard”; or “our engineering team decided we think we meet it”; or “a team inside our company thinks we meet it, but they report to the engineering manager, so there’s pressure upon them to say yes”; or “it’s a robustly separated group inside our company”; or “it’s a qualified external assessment with a solid track record for technical integrity.” Depending on the system, any one of these categories might be appropriate, but for life critical systems, you need as much independence as you can get. If you hear a claim of conformance, it’s reasonable to ask: how do you know you conform, and is the group assessing conformance independent enough and credible enough for this particular application?
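One way to think about this credibility ladder is as an ordered scale. The sketch below paraphrases the levels just listed; the enum names and the threshold chosen for “independent enough” are assumptions for illustration, not defined terms from any standard.

```python
from enum import IntEnum

class AssessorIndependence(IntEnum):
    """Credibility ladder: higher value means more independence."""
    SPIRIT_OF_STANDARD = 1    # "it's in the spirit of the standard"
    INTERNAL_EQUIVALENT = 2   # internal standard claimed to be equivalent
    ENGINEERING_TEAM = 3      # the team assessed its own work
    SAME_REPORTING_CHAIN = 4  # internal group reporting to engineering
    SEPARATED_INTERNAL = 5    # robustly separated internal group
    QUALIFIED_EXTERNAL = 6    # qualified external assessor with a track record

def independent_enough(level: AssessorIndependence, life_critical: bool) -> bool:
    """For life critical systems, insist on the most independence available."""
    threshold = (AssessorIndependence.SEPARATED_INTERNAL if life_critical
                 else AssessorIndependence.ENGINEERING_TEAM)
    return level >= threshold
```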

Another dimension of conformance metrics is how much of the standard is actually being conformed to. Is it only some chapters or all of the chapters? Sometimes we’re back to where only the hardware conformed, so they really only looked at one chapter of a system standard that would otherwise cover both hardware and software. Is it only the minimum basics? Some standards have a significant amount of basically optional text; in some cases there’s more optional text than mandatory text. So did only the required text get addressed, or were the optional parts addressed as well?

Is the integrity level appropriate? A component might conform to a lower ASIL than you really need for your application, but it still has the conformance stamp to the standard on it. That can be a problem if, for example, you take something assessed for noncritical functions and want to use it in a life critical application. Is the scope of the claimed conformance appropriate? For example, you might have dozens of safety critical functions in a system, but only three or four were actually checked for conformance and the rest were not. You can say it conforms to a standard, but the problem is there are pieces that really matter that were never checked for conformance.
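A sufficiency check on integrity levels can be expressed as a simple comparison. This sketch assumes the conventional ASIL ordering QM < A < B < C < D; it is an illustration of the red flag described above, not a substitute for an actual ISO 26262 analysis.

```python
# Conventional ordering; QM means "quality managed", i.e. not safety rated.
ASIL_ORDER = {"QM": 0, "A": 1, "B": 2, "C": 3, "D": 4}

def asil_sufficient(assessed: str, required: str) -> bool:
    """A component assessed at a lower ASIL than the application needs is a red flag."""
    return ASIL_ORDER[assessed] >= ASIL_ORDER[required]

assert asil_sufficient("D", "B")       # over-qualified is fine
assert not asil_sufficient("QM", "C")  # noncritical part in a critical role
```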

Has the standard been aggressively tailored so that it weakens the value of the claimed conformance? Some standards permit skipping some clauses if they don’t matter to safety in that particular application, but with funding and deadline pressures, there might be some incentive to drop out clauses that really might matter. So it’s important to understand how tailored the standard was: was that the full standard, or were pieces left out that really should matter?

Now, to be sure, sometimes limited conformance along all these dimensions makes perfect sense. It’s okay to do that so long as, first of all, you don’t compromise safety, so you’re only leaving out things that don’t matter to safety. Second, you’re crystal clear about what you’re claiming, and you don’t ask more of the system than it can really deliver for safety. Typically, signs of aggressive tailoring or conformance to only part of a standard are problematic for life critical systems. It’s common to see misunderstandings based on one or more of these issues: somebody claims conformance to a standard but does not disclose the limitations, and somebody else gets confused and says, oh, well, the safety box has been checked, when in fact it has not, because the conformance claim is much narrower than is required for safety in that application.

During engineering, partial conformance and measuring progress against partial conformance can actually be quite helpful. Ideally, there’s a safety case that documents the conformance plan and has a list of how you plan to conform to all the aspects of the standard you care about. Then you can measure progress against the completeness of the safety case. Now, the progress is somewhat nonlinear and not every clause takes the same amount of effort, but still, just looking at what fraction of the standard you’ve achieved conformance to internally can be very helpful for managing the engineering process.
Near the end of the design validation process, you can do mock conformance checks, and the metric there is the number of problems found with conformance, which basically amounts to bug reports against the safety case rather than against the software itself.
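As a sketch of both metrics, here is how the progress fraction and the mock-check finding count might be computed. The clause IDs, status values, and finding text are made up for illustration.

```python
def conformance_progress(safety_case: dict[str, str]) -> float:
    """safety_case maps clause IDs to a status: 'open', 'evidence', or 'accepted'.

    Returns the fraction of planned clauses with accepted conformance evidence.
    """
    done = sum(1 for status in safety_case.values() if status == "accepted")
    return done / len(safety_case)

# Hypothetical safety case and one open finding from a mock conformance check
# (a bug report against the safety case, not against the software).
safety_case = {"7.4.1": "accepted", "7.4.2": "evidence", "8.1": "open"}
mock_check_findings = ["clause 7.4.2 evidence cites a stale test report"]

print(f"conformance progress: {conformance_progress(safety_case):.0%}")
print(f"open safety-case findings: {len(mock_check_findings)}")
```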

Summing up, conforming to relevant safety standards is an essential part of ensuring safety, especially in life critical products. There are a number of metrics, measures, and ways to assess how well that conformance is actually going to help your safety. It’s important to make sure you’ve conformed to the right standards, you’ve conformed with the right scope, and that you’ve done the right amount of tailoring, so that you’re actually hitting all the things that you need to in the engineering, validation, and deployment process to ensure you’re appropriately safe.

Jackie Erickson
You have just heard Phil Koopman address conformance-based metrics.
A key takeaway from this episode: conforming to safety standards is essential to ensure the safety of a self driving car. As outlined in this episode, conforming to the right standards, within the appropriate scope and integrity levels, can advance this innovation safely.

At Edge Case, our experts are deeply involved in the ongoing development of current industry standards, including UL 4600, SOTIF (ISO 21448), IEEE P2846, and many others. Our experts can provide an overall analysis of how well a system conforms and what is needed to reach an appropriate level of safety. To learn more or to receive an analysis, please email us at info@ecr.ai.

We thank you for listening and we look forward to working with you on delivering the promise of autonomy.