Certification Breakout Session 2

Attendees: Jim Krodel, Richard Robinson, David von Oheimb, Tucker Taft, Hal Peirson, Jim Alfoss, Kelly Hayhurst, Patrick Graydon, Scott Beecher, Natasha Neogi, Robin Bloomfield

Jim K notes that Working Group 72 (EUROCAE) is being looked at for coordination with SC-205.

- The "top 3" needs from yesterday were bulleted out, consolidated somewhat, and categorized by topic. The first portion of time was directed at review of yesterday's notes.
- Context from one of this morning's talks, on certification vs. qualification: certification means airworthy, i.e., can fly; qualification is support of a component toward certification of a system that can fly. Evidence for qualification can be cited in support of certification.
- What about the IMA system approval basis (DO-297), a prescriptive and very process-oriented engagement? Some of it may be helpful here; configuration management, for example, is a big part of it. Maybe not so helpful in our context: it gives no mechanism for showing assurance of a system based on the qualities of its components.
- There is a gap between validating properties of individual components and demonstrating safety of a whole aircraft. This is a reason for liking the qualification/certification distinction.
- Added: consideration of certifying where legacy components are incorporated in new systems (per Natasha Neogi's talk this morning).
- In a Service-Oriented Architecture, interfaces are specified by "contract". Hal thinks this can be a mechanism supporting "argument", i.e., reasoning about how components behave in context.
- Define a certification framework. The example is motivated by the fact that current FAA regulations inhibit component-level qualification. We are talking about the process of certification (not some evidentiary bus). Illustrative point: a new technology can be used today, but the process is quite informal; no policy governs it, so it is quite ad hoc.
- Realistic goals: we hear a lot of "anomaly" stories. How many would be "expected" in a year? Can a certification framework accommodate the incorporation of realistic goals or targets, in addition to tools and methods?
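The "contract" notion raised in the SOA discussion above could be sketched as follows. This is a hypothetical illustration only, not anything proposed in the session: a component declares the assumptions it requires of its context and the commitments it makes in return, so that a deployment can be mechanically checked against those assumptions.

```python
# Illustrative sketch (hypothetical names and fields): a component interface
# "contract" recording assumptions, guarantees, and a timing commitment.
from dataclasses import dataclass


@dataclass(frozen=True)
class ComponentContract:
    name: str
    assumptions: tuple   # conditions the component requires of its context
    guarantees: tuple    # behavior it commits to when the assumptions hold
    wcet_ms: float       # worst-case execution time commitment
    side_effects: tuple = ()  # declared side effects (empty if none)

    def satisfied_by(self, environment: set) -> bool:
        """Check that the deployment context discharges every assumption."""
        return all(a in environment for a in self.assumptions)


# Hypothetical example: a navigation filter that assumes a valid sensor feed.
nav = ComponentContract(
    name="nav_filter",
    assumptions=("gps_feed_valid",),
    guarantees=("position_estimate_within_10m",),
    wcet_ms=4.0,
)
print(nav.satisfied_by({"gps_feed_valid", "power_nominal"}))  # True
print(nav.satisfied_by({"power_nominal"}))                    # False
```

The point of the sketch is that the contract, not the implementation, becomes the unit of evidence cited when arguing about the component's behavior in a particular context.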
- Proposed wording: "certification criteria consistent with systems analysis" in place of "certification framework".
- On certification of COTS vs. open source: for COTS there is (or may be) an organization that can be identified with a financial or other stake in providing evidence; not so for open source. For open source there may be behavioral or direct evidence, but not development-process evidence.
- Knowledge management is needed in addition to formal methods, etc. The certification process cannot be fully automated, because of the need for judgment; the same is true of system architecture.
- To what degree is the science of certification the same as the science of risk analysis? The decision of the level at which evidence should be examined is a policy matter.
- On the paradigm shift: if we are proposing a new science of certification, how can we know that the new processes introduced are "right"? How do we know when a verification system is done? How do we show that a new approach to certification gives better, cheaper, faster, safer results? If it is a "science" of certification, what is the role of experimentation? Again, how can it be proved that the revolutionized process is valid? Mainly, how can it be known that a new certification approach guarantees safe systems? What are the success criteria for a certification approach?
- Note: we would like to put priorities on these "needs". How are the outcomes of this workshop going to be used? The community here includes the research (academic) community, industry, suppliers, and regulatory institutions; it would be useful to direct "needs" at each segment of the community. Any approach needs acceptance from, and cooperation among, academia, regulators, and industry.
- Hal weighs in on prioritizing needs. His view of the top-level issue is the cost of certification, especially in light of the expanding effort required to individually qualify components in their contexts.
Top three needs (summarizing): see Jim's notes.

Top three research topics (reviewing points from John's talk this morning, "Science for Certification"):
- Need to reach consensus on the most important, highest-priority needs and research topics; the current list is unwieldy.

Top three challenges:
- A better, richer way of specifying component interfaces and component commitments (contracts for timing, assumptions, limitations, side effects).
- A language for reasoning about how components aggregate. Some argue that we already have adequate languages, but current methods only let us talk about component properties; we want a methodology for reasoning about emergent properties.
- As a starting point for a certification framework or discipline, toward making the process more rigorous: suggest collecting and analyzing "patterns" for certification.
- Need to engage certifiers with those seeking certification.

BREAK -----------------
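The second challenge above, reasoning about how components aggregate, can be illustrated with a deliberately simple case. This is a hypothetical sketch, not a method endorsed in the session: for a sequential pipeline, a system-level (emergent) latency claim can be derived purely from per-component worst-case timing commitments.

```python
# Hypothetical sketch: composing per-component timing commitments into a
# system-level property. All names and numbers are invented for illustration.

def pipeline_latency_bound(wcets_ms: list) -> float:
    """Upper bound on end-to-end latency of a sequential pipeline:
    the sum of each stage's worst-case execution time commitment."""
    return sum(wcets_ms)


def meets_deadline(wcets_ms: list, deadline_ms: float) -> bool:
    """A system-level claim derived only from component-level evidence."""
    return pipeline_latency_bound(wcets_ms) <= deadline_ms


# Three components qualified individually, each with a WCET commitment:
stages_ms = [4.0, 2.5, 3.0]
print(pipeline_latency_bound(stages_ms))    # 9.5
print(meets_deadline(stages_ms, 10.0))      # True
print(meets_deadline(stages_ms, 9.0))       # False
```

Real emergent properties (safety, fault propagation, resource contention) do not compose this neatly, which is exactly why the group identifies a methodology for reasoning about aggregation as an open research need.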