
Source: emma.dlr.de/temmaDocs/doc-SP6/D621_METHO_V1.0.pdf (retrieved 2006-10-10)

Contract No. TREN/04/FP6AE/SI2.374991/503192

Project Funded by European Commission, DG TREN The Sixth Framework Programme Strengthening the competitiveness

Contract No. TREN/04/FP6AE/SI2.374991/503192

Project Manager M. Röder

Deutsches Zentrum für Luft und Raumfahrt Lilienthalplatz 7, D-38108 Braunschweig, Germany

Phone: +49 (0) 531 295 3026, Fax: +49 (0) 531 295 2180 email: [email protected]

Web page: http://www.dlr.de/emma

© 2005 - All rights reserved - EMMA Project Partners. The reproduction, distribution and utilization of this document, as well as the communication of its contents to others without explicit authorization, is prohibited. This document and the information contained herein are the property of Deutsches Zentrum für Luft- und Raumfahrt and the EMMA project partners. Offenders will be held liable for the payment of damages. All rights reserved in the event of the grant of a patent, utility model or design. The results and findings described in this document have been elaborated under a contract awarded by the European Commission.

V&V Methodology for A-SMGCS

Konstantinos G. Zografos

RCAUEB/TRANSLOG

Document No: D6.2.1 Version No. V1.0

Classification: Confidential Number of pages: 44


EMMA

V&V Methodology for A-SMGCS RC-AUEB

Save date: 2005-12-12 Confidential Page 2 File Name: D621_METHO_V10.doc Version: V1.0

Distribution List

Member Type / No. / Name (PoC) / Distributed1

Web: Internet: http://www.dlr.de/emma
     Intranet: http://www.dlr.de/emma/members

Contractor:
   1  DLR     Joern Jakobi
   2  AENA    Patricia Pina Calafi
   3  AI      Philippe Berge
   4  AMS     Giuliano D'Auria
   5  ANS_CR  Miroslav Tykal
   6  BAES    Derek Jordan
   7  STAR    Max Koerte
   8  DNA     Nicolas Marcou
   9  ENAV    Antonio Nuzzo
  10  NLR     Frans van Schaik
  11  PAS     Alan Gilbert
  12  TATM    Stephane Paul
  13  THAV    Alain Tabard
  14  AHA     David Gleave
  15  AUEB    Konstantinos G. Zografos
  16  CSL     Libor Kurzweil
  17  DAV     Rolf Schroeder
  18  DFS     Klaus-Ruediger Täglich
  19  EEC     Stephane Dubuisson
  20  ERA     Jan Hrabanek
  21  ETG     Thomas Wittig
  22  MD      Phil Mccarthy
  23  SICTA   Claudio Vaccaro
  24  TUD     Christoph Vernaleken

Sub-Contractor:
      CSA     Karel Muendel
      N.N.

Customer:
      EC      Cesare Bernabei
      EC      Morten Jensen

Additional:
      EUROCONTROL  Paul Adamson

1 Please insert an X when the PoC of a company receives this document. Do not use the date of issue!


Document Control Sheet

Project Manager: Michael Roeder
Responsible Author: Konstantinos G. Zografos (RCAUEB/TRANSLOG)
Additional Authors: Frans van Schaik (NLR), Pina Calafi (AENA), Joern Jakobi (DLR), Philippe Berge and Marie-Christine Bressole (AIF)
Subject / Title of Document: V&V Methodology for A-SMGCS
Related Task(s): WP6.2
Deliverable No.: D6.2.1
Save Date of File: 2005-12-12
Document Version: V1.0
Reference / File Name: D621_METHO_V10.doc
Number of Pages: 44
Dissemination Level: Confidential
Target Date: 2005-01-31

Change Control List (Change Log)

Date        Release  Changed Items/Chapters  Comment
2004-07-09  0.01     Initial Draft           Deliverable outline
2004-07-23  0.02     Updated Draft           Updated deliverable outline
2004-09-20  0.03     First complete draft    Inclusion of first input from partners
2005-01-13  0.04     Second complete draft   Incorporation of partners' comments
2005-03-21  0.05     Third complete draft    Comments to be addressed
2005-04-08  0.06     Final                   For peer review
2005-08-08  0.07     Reviewed                For submission
2005-12-12  1.0      Approved


TABLE OF CONTENTS

1 Introduction
2 Overall Methodological Framework
  2.1 Verification and Validation objectives to be used in EMMA A-SMGCS evaluation
3 Verification & Validation Methodology for EMMA A-SMGCS
  3.1 Safety Assessment Methodology
    3.1.1 Aim of the Chapter
    3.1.2 Definition of safety
    3.1.3 Safety impact of the verification & validation activities
    3.1.4 Safety impact of the EMMA A-SMGCS tools and procedures
      3.1.4.1 Real-time Simulations
        3.1.4.1.1 Occurrence of hazards and controller reactions
        3.1.4.1.2 Experienced level of safety
      3.1.4.2 Shadow Mode Trials
        3.1.4.2.1 Occurrence of hazards and controller reactions
        3.1.4.2.2 Experienced level of safety
        3.1.4.2.3 Safety indicators in passive shadow-mode trials
      3.1.4.3 Operational Trials
        3.1.4.3.1 Occurrence of hazards and controller reactions
        3.1.4.3.2 Experienced level of safety
  3.2 Capacity & Efficiency Assessment Methodology
    3.2.1 Aim of the Chapter
    3.2.2 Impact of the EMMA A-SMGCS on capacity and efficiency
    3.2.3 Methodology to assess the impact on capacity and efficiency
    3.2.4 Capacity and Efficiency Measurements
      3.2.4.1 Capacity
        3.2.4.1.1 Real Time Simulations
        3.2.4.1.2 Shadow-mode Trials
        3.2.4.1.3 Operational Trials
      3.2.4.2 Efficiency
        3.2.4.2.1 Real Time Simulations
        3.2.4.2.2 Shadow-mode Trials
        3.2.4.2.3 Operational Trials
    3.2.5 Methods, techniques and tools
      3.2.5.1 Real-Time Simulations
      3.2.5.2 Questionnaires
  3.3 Human Factors Assessment Methodology
    3.3.1 Aim of the Chapter
    3.3.2 Impacts of EMMA A-SMGCS on the Human Factors (Tower environment)
    3.3.3 Impacts of EMMA A-SMGCS on the Human Factors (Cockpit Environment)
    3.3.4 Methods, techniques and tools
      3.3.4.1 Evaluation methods


      3.3.4.2 Tools supporting HF methods
      3.3.4.3 Data collection methods and techniques
  3.4 Cost-Benefit Assessment Methodology
    3.4.1 Aim of the Chapter
    3.4.2 Costs and Benefits of EMMA A-SMGCS
    3.4.3 Cost-Benefit Assessment Methodology
4 Concluding Remarks
5 References


LIST OF FIGURES

Figure 2-1: Overall methodological framework for the verification and validation of an A-SMGCS
Figure 3-1: Hierarchical decomposition of the Measures of Effectiveness of EMMA A-SMGCS
Figure 3-2: Hierarchical decomposition of the costs of EMMA A-SMGCS

LIST OF TABLES

Table 3-1: Objective indicators of safety in real-time simulations (realisation of non-nominal events)


1 Introduction

The objective of this report is to present the overall methodology that will be used for verifying and validating the EMMA A-SMGCS (Levels I & II) at the three project airport sites, that is, Prague, Malpensa, and Toulouse, as well as the on-board part of the EMMA A-SMGCS. This deliverable, along with the other two deliverables that will be developed within the framework of WP6.2 of the EMMA project, i.e.:

• D6.2.2 “Verification and Validation Indicators and Metrics for A-SMGCS”, and

• D6.2.3 “Verification and Validation Test Environment Descriptions for Prague, Malpensa, Toulouse and On-board",

constitute an integrated output that will describe the general methodology, the indicators and metrics, the associated experimental designs for the measurement of the indicators, as well as the instruments (e.g. questionnaires) that will be used for the verification and validation of the EMMA A-SMGCS.

It should be pointed out that the results of WP6.2 are interrelated with the results of WP6.1 “V&V Strategy”, and that the test environment descriptions will not be separate deliverables; instead, they will constitute chapters in:

• D6.1.2 “V&V Test Plan for Prague”

• D6.1.3 “V&V Test Plan for Toulouse”

• D6.1.4 “V&V Test Plan for Malpensa”

• D6.1.5 “V&V Test Plan for On-Board A-SMGCS applications”

The remainder of this document is organized as follows. Section 2 presents the overall methodological approach for the verification and validation of the proposed A-SMGCS (Levels I & II). Section 3 presents the methodologies proposed for assessing the performance of the proposed system in terms of safety, capacity and efficiency, human factors, and cost effectiveness, during both the verification and the validation of the EMMA A-SMGCS. Section 4 provides the major concluding remarks.


2 Overall Methodological Framework

The objective of this section is to present the overall methodological framework for the verification and validation of the proposed A-SMGCS (Levels I & II), as described in the ICAO Manual (ICAO, 2004), at the four EMMA project sites, that is, Malpensa, Prague, and Toulouse airports, and on-board. According to the EMMA Technical Annex, the testing of the EMMA A-SMGCS at the sites will involve: i) full implementation and operational use of the surveillance and control functions, while the more advanced guidance and route-planning functions will only be prepared, to be fully tested in EMMA2; ii) system verification and validation; and iii) two phases of testing (Phase I and Phase II) for EMMA A-SMGCS Levels I & II respectively.

Based on the project description, the objective of the system verification is to test the EMMA A-SMGCS against the technical and operational requirements (“did we build the system right?”). It should be stressed that the objective of the verification exercise within SP6 is to test the system at a functionality level (i.e., testing the performance of the surveillance and alerting functions, as opposed to the technical testing of the individual sensors, components, and devices needed to support these functions). The technical testing of the individual EMMA A-SMGCS components constitutes part of the work done in the vertical SPs of the EMMA project. The objective of the system validation is to perform operational, man-in-the-loop testing, which will assess the impacts of the system on safety, capacity, efficiency, human factors, and cost, as well as the overall performance of the EMMA A-SMGCS (“did we build the right system?”). In particular, the EMMA V&V process involves the following four sequential stages of system assessment:

i. Technical Testing: the verification of the technical feasibility of the EMMA A-SMGCS Levels I & II.

ii. Operational Feasibility: Given that the system passes the technical testing stage, it is further assessed in terms of its operational feasibility, which refers to the ability of the proposed A-SMGCS Levels I & II to cover the operational and user technical requirements. This stage of assessment is also considered part of the verification phase of the evaluation process.

iii. Operational Improvements: This stage constitutes an intermediate step towards the assessment of the direct impacts of the proposed system. It refers to the measurement of the potential improvements in the airport system's performance, in terms of safety, capacity, efficiency, and human factors, resulting from the use of the proposed A-SMGCS.

iv. Operational Benefits: The objective of this stage is to assess the benefits derived from the use of the proposed A-SMGCS Levels I & II. This task implies the conversion of the operational improvements determined in the previous stage into monetary values.

The EMMA V&V process will be performed in the following three environments: i) real-time simulations, ii) shadow-mode trials, and iii) operational trials. In particular, real-time simulations (RTS) will be used for all of the aforementioned stages of assessment. The shadow-mode trials platform can be used for the verification phase (i.e., technical testing and operational feasibility), and it may also be used for measuring the operational improvements. It should be pointed out that an essential prerequisite for assessing the operational benefits of the proposed system is the specification of appropriate conversion factors by each site. The criticality of this intermediate step lies in the lack of common conversion factors among different sites, or even between stakeholder groups at the same site.
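To make the role of the conversion factors concrete, the following sketch (with purely hypothetical factor values and improvement figures; neither is prescribed by EMMA) shows how measured operational improvements could be converted into a monetary benefit figure:

```python
# Hypothetical site-specific conversion factors (EUR per unit of
# improvement); in practice each site and stakeholder group would
# have to agree on its own values.
CONVERSION_FACTORS = {
    "taxi_minutes_saved": 25.0,     # EUR per minute of taxi time saved
    "delay_minutes_avoided": 72.0,  # EUR per minute of delay avoided
}

def operational_benefit(improvements, factors=CONVERSION_FACTORS):
    """Convert measured operational improvements into a single
    monetary value (EUR) using site-specific conversion factors."""
    return sum(amount * factors[name] for name, amount in improvements.items())

# Illustrative annual improvements measured for one site.
benefit = operational_benefit({"taxi_minutes_saved": 12000,
                               "delay_minutes_avoided": 1500})
print(benefit)  # 408000.0
```

Because the factors differ across sites, benefit figures computed this way are comparable across sites only if common conversion factors are agreed, which is exactly the caveat raised above.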

The verification and validation of the EMMA A-SMGCS is a complex evaluation problem, since it involves multiple stakeholders (i.e., individuals and/or organizations involved in and affected by the development and use of the EMMA A-SMGCS) and multiple sites where the EMMA A-SMGCS is demonstrated. The existence of multiple stakeholders suggests that:


• for the development of the EMMA A-SMGCS there is a wide variety of requirements which should be reflected in the design characteristics of the system, and

• for the evaluation of the EMMA A-SMGCS there is a variety of objectives reflecting the goals of the various stakeholders (e.g., capacity, efficiency, safety, working conditions, costs).

On the other hand, the existence of various sites implies a wide variety of local technological and organizational constraints that should be reflected in the system design characteristics and considered in the specification of the expected performance of the EMMA A-SMGCS. The identification of the verification and validation methods and means should address the above-mentioned requirements, and should identify methodologies able to cope with these problem characteristics. Figure 2-1 presents the overall methodological framework for the verification and validation of an A-SMGCS. Central issues for the implementation of the proposed methodological framework relate to the identification of:

i) the categories of impacts that will be used to verify and validate the system (i.e., verification and validation objectives),

ii) the methods that will be used to assess the verification and validation objectives, and

iii) the indicators that will be used to measure the validation objectives.

Items (i) and (ii) are addressed in this deliverable while item (iii) will be addressed in greater detail in deliverable D6.2.2.

Figure 2-1: Overall methodological framework for the verification and validation of an A-SMGCS.


2.1 Verification and Validation objectives to be used in EMMA A-SMGCS evaluation

The EMMA verification phase involves testing the EMMA A-SMGCS in terms of covering the technical specifications emerging from the user requirements and stakeholder aspirations identified in the OSED document of SP1. These verification activities will involve testing all subsystems and functionalities of the EMMA A-SMGCS according to ESA standards (ESA, 1995). This process involves the development of checklists and questionnaires according to the above verification specifications.

The validation of the EMMA A-SMGCS requires testing the performance of the system functions under consideration in terms of the following types of impacts: i) safety, ii) capacity & efficiency, iii) human factors, and iv) costs/benefits. In addition, the validation process includes the assessment of the overall system performance. In what follows, we identify the methodologies that will be used within the EMMA project in order to verify and validate the EMMA A-SMGCS in the four testing environments (i.e. the three airports and on-board).

The validation of the EMMA A-SMGCS (Levels I & II) will be based on a comparative assessment of the proposed system against the baseline system, i.e. the existing SMGCS at each project site, under a set of specified experimental factors involving the level of visibility, the traffic conditions, and the system version. This assessment will be performed in three testing environments: i) real-time simulations, ii) shadow-mode trials, and iii) operational trials. It is evident that the baseline system depends on the local operational conditions and constraints of each site, and therefore cross-site assessment is pointless within the context of shadow-mode and operational tests. However, real-time simulation enables the definition of a common baseline system, and cross-site assessment is therefore feasible under this testing platform.
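The experimental factors mentioned above (visibility, traffic conditions, system version) define a factorial design. As a minimal sketch, with hypothetical factor levels (the actual levels are defined per site in the D6.1.x test plans), the full set of experimental conditions can be enumerated as the Cartesian product of the levels:

```python
from itertools import product

# Hypothetical factor levels; the actual levels per site are
# specified in the test plans, not here.
factors = {
    "visibility": ["good", "low"],
    "traffic": ["medium", "high"],
    "system": ["baseline SMGCS", "A-SMGCS Level I", "A-SMGCS Level II"],
}

# Every combination of factor levels is one experimental condition,
# so baseline and A-SMGCS runs are compared under matched settings.
conditions = [dict(zip(factors, levels)) for levels in product(*factors.values())]

print(len(conditions))  # 2 * 2 * 3 = 12 conditions
```

Enumerating the conditions this way makes explicit that the baseline system appears as just another level of the "system" factor, which is what allows the comparative assessment described above.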


3 Verification & Validation Methodology for EMMA A-SMGCS

3.1 Safety Assessment Methodology

3.1.1 Aim of the Chapter

The aim of this chapter is:

• to show that the verification and validation activities at the EMMA project sites do not have a negative safety impact on the real life operations. This type of evaluation is concerned with the safety impact of the V&V activities; and

• to describe how objective and subjective data can be gathered during the verification and validation activities at the EMMA project sites that will be used for evaluation purposes (among which is the safety case for EMMA operations following [ref. D133]). This type of evaluation is concerned with the safety impact of A-SMGCS.

3.1.2 Definition of safety

According to a definition used by EUROCONTROL, safety is “the freedom from unacceptable risk or harm” (EUROCONTROL, 2003). Safety can be expressed in terms of the frequency or probability at which a hazard is likely to occur. A hazard, in turn, is defined by EUROCONTROL as “any condition, event, or circumstance which could induce an incident or an accident”. Hazards can develop into an incident or an accident; however, appropriate mitigation can prevent a hazard from developing into an incident or accident. The term “mitigation” refers to steps that are taken to control or prevent a hazard from causing harm and to reduce risk to a tolerable or acceptable level.
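As a minimal numerical illustration of this frequency-based notion of safety (the counts are invented, not EMMA data), an occurrence frequency can be normalised by the number of traffic movements:

```python
def incident_rate(incidents, movements, per=100_000):
    """Incidents per `per` traffic movements: a frequency-based
    safety indicator that allows comparing periods or sites with
    different traffic volumes."""
    return incidents / movements * per

# E.g. 3 runway incursions observed over 150,000 movements:
print(incident_rate(3, 150_000))  # 2.0 incursions per 100,000 movements
```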

3.1.3 Safety impact of the verification & validation activities

In case verification and validation activities are carried out at the test sites during real operations (either as shadow-mode trials or as operational trials), a safety assessment has to be conducted to ensure that the verification and validation activities do not have a negative impact on the safety of airport operations. The safety assessment should cover the following activities:

• The installation of the experimental system/platform at the airport

• The use of the experimental system/platform during the validation conduct (either in addition to or instead of the current system), and

• Other activities related to verification and validation (such as the presence of the validation team in the operational environment and the collection of measurements during operation).

Potential hazards concerning the above activities will be identified. For all identified hazards, the associated frequency and severity should be provided, and effective mitigation for the identified hazards should be recommended. Operational trials can be carried out only if the hazards are considered extremely unlikely, are not assessed as severe, or effective mitigation exists. It is very likely that the outcome of the safety impact analysis has to be presented to the airport authorities,


which will eventually set the criterion for the acceptability of the identified risks. Verification and validation activities that take place in a simulation environment do not require a prior safety impact analysis.
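The frequency/severity reasoning above is commonly organised as a risk matrix. The sketch below is a generic, hypothetical classification scheme: the category names, ranks, and thresholds are illustrative only, since the actual tolerability criteria are set by the airport authorities.

```python
# Illustrative (non-regulatory) frequency and severity categories,
# ordered from least to most critical.
FREQUENCY = ["extremely improbable", "remote", "occasional", "frequent"]
SEVERITY = ["negligible", "minor", "major", "catastrophic"]

def risk_class(frequency, severity):
    """Classify a hazard by summing its frequency and severity ranks;
    the thresholds below are illustrative, not regulatory values."""
    score = FREQUENCY.index(frequency) + SEVERITY.index(severity)
    if score <= 2:
        return "acceptable"
    if score <= 4:
        return "tolerable with mitigation"
    return "unacceptable"

print(risk_class("remote", "major"))  # tolerable with mitigation
```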

3.1.4 Safety impact of the EMMA A-SMGCS tools and procedures

Another issue concerns the measurement of safety as part of the validation and verification activities carried out within the EMMA project. Safety in this context has the status of a validation objective, i.e., the effect of the EMMA A-SMGCS (Levels I and II) on the safety level at the airport will be evaluated. The approaches that can be taken to assess the level of safety in airport operations strongly depend on the setting of the validation technique (e.g., simulations, shadow-mode trials, or operational trials) and will therefore be discussed separately for the different validation techniques. Note that emphasis is placed on safety measurements in ATC settings, rather than on safety measurements pertaining to the pilot. The reason is that in both EMMA A-SMGCS Levels I and II (which are within the scope of the EMMA project), it is the controller (and partially the driver) who is provided with new tools and procedures. For the pilot, in contrast, there is no change in tools or procedures in EMMA A-SMGCS Levels I and II.

3.1.4.1 Real-time Simulations

Real-time simulations are carried out in a simulation facility (e.g., an ATC simulator or flight simulator); they simulate events in the time frame in which they would naturally happen and involve the human operator (e.g., controller or pilot). Below, we discuss which approaches can be taken to assess the impact of the EMMA A-SMGCS on the level of safety in airport operations.

3.1.4.1.1 Occurrence of hazards and controller reactions

Safety is generally considered in terms of the number of incidents/accidents that occur relative to the number of traffic movements. With regard to airport operations, the following potential incidents or hazards can be identified (BETA, 2001):

• A runway incursion (i.e., the unintended presence of an aircraft, vehicle, or person on the runway or runway strip; cf. EUROCONTROL, 2004),

• A violation of safe separation,

• A wrong taxi route taken by the aircraft, and

• Opposite traffic on the aerodrome.

In operational trials the occurrence of any of the above incidents (in particular a runway incursion or a loss of separation) is extremely unlikely. This also holds for real-time simulations, at least if pseudo-pilots and pseudo-drivers conform to ATC instructions and controllers are not deliberately put into a situation in which they are overloaded.

2 EUROCONTROL (2004). Strategic Safety Action Plan Implementation: Runway incursions. http://www.eurocontrol.int/eatm/agas/runwayincursions/faq.html

The power of a real-time simulation, however, is that any of the safety-critical events can be produced by having the pseudo-pilots and pseudo-drivers in the simulation violate ATC instructions. In case safety-critical situations are experimentally induced, the measurements to be taken do not concern the frequency of safety-critical events, but the controller's or pilot's response to these safety-critical situations. These responses can include:

• The response time for detection of the safety-critical situation, and

• The response time of an instruction to solve the safety-critical situation.

Measurements will be collected both for a baseline condition (in which controllers work with current tools and procedures) and for an experimental condition (in which controllers work with the tools and procedures associated with EMMA A-SMGCS Levels I and II). Significant effects of the EMMA A-SMGCS on the safety measurements can be identified by carrying out a significance test on the differences between the baseline and the experimental condition3.
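A sketch of such a baseline-versus-experimental comparison on detection response times, using Welch's t-statistic for two independent samples (the response-time values are invented; a real analysis would also need the degrees of freedom, a chosen significance level, and, as footnote 3 notes, enough events for adequate statistical power):

```python
from math import sqrt
from statistics import mean, variance

def welch_t(a, b):
    """Welch's t-statistic for two independent samples with
    possibly unequal variances (sample variance, n-1 denominator)."""
    se = sqrt(variance(a) / len(a) + variance(b) / len(b))
    return (mean(a) - mean(b)) / se

# Hypothetical detection response times (seconds) for an induced
# runway incursion: baseline vs. A-SMGCS Level I/II condition.
baseline = [12.4, 15.1, 9.8, 14.0, 11.6, 13.3]
asmgcs = [6.2, 7.9, 5.5, 8.1, 6.8, 7.4]

t = welch_t(baseline, asmgcs)  # a large positive t suggests faster detection with A-SMGCS
```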

In addition to safety-critical events that relate to airport operations in general (and, thus, can be realised for both the baseline condition and the experimental condition), there are safety-critical events that relate to technical malfunctions of the EMMA A-SMGCS. Examples of such events are:

• Temporary unavailability of the surveillance function (no labelled ground traffic),

• An aircraft provided with a wrong label (due to a wrong correlation or a label swap between radar targets),

• A falsely detected runway incursion (false alert), and

• Runway incursion alert fails to issue an alert (missed alert).

For these technical malfunctions, the associated measurements to be taken include:

• The reactions towards failure of the surveillance function and the impact on controller performance,

• The reactions towards a false alert and the impact on controller performance,

• The response time for detecting a label swap or a wrong correlation, and the impact on controller performance,

• The response time for detecting a runway incursion in case of a missed alert, and the impact on controller performance.

Measurements can only be collected for the experimental condition, as the system components that fail are not present in the baseline condition. For this reason, data can only be analysed qualitatively, by describing in which way the system malfunction affected the controller's ability to handle the traffic at the airport, rather than by carrying out statistical tests. Table 3-1 summarises the above considerations on the realisation of non-nominal events in real-time simulation.
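As a purely illustrative sketch, the objective indicators listed above (response time for detection, response time for resolution, duration of the event) could be derived from a time-stamped simulation event log. The event names and timestamps below are hypothetical and do not reflect the actual EMMA log format:

```python
# Invented event log for a single runway-incursion scenario; times are
# seconds from the start of the simulation run.
events = [
    ("runway_incursion_onset",    412.0),
    ("controller_detection",      417.5),   # e.g. verbal call-out by the controller
    ("resolution_instruction",    421.0),   # ATC instruction issued to resolve it
    ("runway_incursion_resolved", 436.0),
]
t = dict(events)

# The three indicators from Table 3-1, all relative to the event onset.
detection_rt  = t["controller_detection"] - t["runway_incursion_onset"]
resolution_rt = t["resolution_instruction"] - t["runway_incursion_onset"]
duration      = t["runway_incursion_resolved"] - t["runway_incursion_onset"]

print(detection_rt, resolution_rt, duration)   # 5.5 9.0 24.0
```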

3 In practice, however, the number of events is often too small for sufficient statistical test power.


Examples of events | Baseline condition | Experimental condition | Examples of data to be measured

General non-nominal events:

(a) Runway incursion | No surveillance and alerting function | A-SMGCS Level I/II | Response time for detection; response time for resolution; duration of runway incursion

(b) Violation of safe separation | No surveillance and alerting function | A-SMGCS Level I/II | Response time for detection; response time for resolution; duration of separation loss

(c) Wrong taxi route taken | No surveillance and alerting function | A-SMGCS Level I/II | Response time for detection; response time for resolution; time on wrong taxi route

(d) Opposite traffic | No surveillance and alerting function | A-SMGCS Level I/II | Response time for detection; response time for resolution; duration of situation

A-SMGCS related non-nominal events:

(a) Failure of surveillance function | Not applicable | A-SMGCS Level I/II | Controller reactions

(b) Wrong a/c label | Not applicable | A-SMGCS Level I/II | Response time for detection; no. of R/T issued to wrong aircraft

(c) False incursion alert | Not applicable | A-SMGCS Level I/II | Controller reactions

(d) Missed runway incursion | Not applicable | A-SMGCS Level I/II | Response time for detection; response time for resolution

Table 3-1: Objective indicators of safety in real-time simulations (realisation of non-nominal events)

3.1.4.1.2 Experienced level of safety

Safety can also be measured subjectively in terms of the operators' perceived level of safety during the simulation. Subjective measurements are mostly collected on the basis of questionnaires or de-briefing questions at the end of a simulation run. For the EMMA verification and validation activities, it is suggested to use a safety de-briefing questionnaire at the end of a simulation session [see MFF, 2004]4.

This safety de-briefing questionnaire will contain the following items:

4 Mediterranean Free Flight Programme, Working Area 4 (MFF450TR003) D452C – RTS/3 Pilot Human Factors, Safety and Training Analysis in Self-Separation Simulation Trials. Edition 1.0. 17 August 2004.


• What kind of non-nominal / unsafe / potentially dangerous situations occurred during the simulation run?

• How was the situation diagnosed?

• Which factors contributed to the occurrence of the situation?

• What was the consequence of the situation?

• What could have been the worst consequence if the situation was not detected?

• What could have been the worst consequence if the situation was detected?

• Which factors could have worsened the situation?

• What could be the severity of the worst credible case (rated on a scale from 1 [major accident] to 5 [no immediate effect on safety])?

• How often could this hazard occur in real operation?

• Are there any fall-back actions which could mitigate the situation?

The safety questionnaire can refer to those non-nominal events that were specifically induced in the simulation, but can also refer to all other non-nominal events that occurred during the simulation. In case no non-nominal events occurred or were experimentally realised, the safety debriefing can also be carried out for safety-critical situations the operator did not directly experience, but considers as possible.

The data obtained on the basis of the safety-debriefing questionnaire will be analysed qualitatively. That is, no quantitative measures will be associated to the different responses (except for severity and frequency), and data will not be subjected to statistical tests. Rather, the safety debriefing will be predominantly used to identify hazards that are caused by the introduction of EMMA A-SMGCS as well as hazards that are not caused by the introduction of EMMA A-SMGCS but whose frequency or severity changes as a result of the introduction of EMMA A-SMGCS. For these hazards, the possibility of an early detection, contributing factors, and mitigation strategies will be identified.
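The two quantifiable questionnaire items (severity and frequency) could, for instance, be collated across runs as sketched below. The hazard names, ratings, and record layout are invented for illustration and do not stem from the EMMA deliverables:

```python
from collections import Counter

# Invented de-briefing responses; "severity" follows the 1 (major accident)
# to 5 (no immediate effect on safety) scale defined in the questionnaire,
# and "emma_related" flags hazards caused by the introduction of EMMA A-SMGCS.
responses = [
    {"hazard": "label swap unnoticed",  "severity": 2, "emma_related": True},
    {"hazard": "false incursion alert", "severity": 4, "emma_related": True},
    {"hazard": "wrong taxi route",      "severity": 3, "emma_related": False},
    {"hazard": "false incursion alert", "severity": 4, "emma_related": True},
]

# Tally severity ratings and list the distinct EMMA-related hazards,
# supporting the qualitative hazard identification described above.
severity_counts = Counter(r["severity"] for r in responses)
emma_hazards = sorted({r["hazard"] for r in responses if r["emma_related"]})

print(severity_counts, emma_hazards)
```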

3.1.4.2 Shadow Mode Trials

In shadow mode trials, the experimental system/platform is installed and used in parallel to the operational system/platform. Depending on the task distribution between the operational team (working with the operational system) and the trial team (working with the experimental system), three kinds of shadow mode can be distinguished: Passive, Active and Advanced Shadow Mode trials5.

In Passive Shadow Mode, the operational team (working with the home system) controls traffic. The trial team (working with the experimental system) listens to the operational R/T communication and shadows the operational team's behaviour by entering the instructions given by the operational team into the experimental platform. The trial team cannot act on the traffic, nor make R/T and phone communications.

In Active Shadow Mode, it is also the operational team (working with the home system) that controls traffic. As in passive shadow mode, the trial team listens to the operational R/T communication and shadows the operational team's behaviour, but cannot act on the traffic, nor make R/T and phone communications. What distinguishes active from passive shadow mode, though, is that the trial team can advise the operational team on expected conflicts and suggest solutions for these conflicts.

5 NATS (2001). A methodology for validating EATCHIP III MTCD using shadow mode trials. NATS, 25 April 2001.

In Advanced Shadow Mode, it is the trial team (working with the experimental system) rather than the operational team who controls the traffic. The operational team serves as a safety back-up, monitoring the R/T communications and the trial team behaviour, in order to update the home system and to take back control from the trial team if the operational team considers this necessary.

Regardless of who is in charge of the control of traffic, though, the traffic situations that occur during the shadow mode trials are dictated by real traffic at the airport.

3.1.4.2.1 Occurrence of hazards and controller reactions

The safety indicators that can be measured in shadow mode trials are described in more detail in D6.2.2. In the present document, only a brief overview of the measurement of safety in shadow mode trials is given.

In principle, safety in shadow mode trials can be measured in terms of the incidents that occur during the trials. Incidents such as the violation of safe separation that occur during the shadow mode trials can only be attributed to the EMMA A-SMGCS in case of advanced shadow mode trials. In passive and active shadow mode trials, traffic is not controlled with the experimental system, and thus, incidents would be an indicator of the safety level of the current system.

Furthermore, current systems and procedures related to the handling of traffic at an airport are already optimised for safety as far as possible. This means that the likelihood of any safety-critical situation (such as the ones listed above) is so small that its occurrence during the evaluation period is extremely unlikely. In order to detect any significant differences between an operation with and without EMMA A-SMGCS, validation activities would need to be carried out over an extensive period of time (far longer than the time period available for verification and validation activities).

In contrast to real-time simulations, the experimental realisation of safety-critical situations is not recommended. Although it is theoretically possible to instruct the drivers of a test van or a test aircraft to disregard ATC instructions in order to create a non-nominal situation, such a procedure is regarded as far too risky in an operational environment.

It can be expected that hazards relating to the EMMA A-SMGCS functionality, such as incorrect label information or missed and false runway incursion alerts, are much more frequent than incidents that already constitute a violation of safe separation. These malfunctions of the EMMA A-SMGCS functionality can be measured as an indicator of safety in the shadow mode trials.

Furthermore, non EMMA A-SMGCS related incidents that do not directly compromise safety (such as an incorrect pilot read-back) should be recorded as a safety indicator. Because they can be expected to occur with a higher frequency, such incidents might be better suited as a safety indicator.

To summarise, the objective measurement of safety in shadow mode trials has to take into account the following aspects:

• “Safety-critical” incidents should not be experimentally induced in a field test such as a shadow mode trial,

• In order to obtain a sufficient data basis, the recording of incidents should extend to those incidents that have no safety impact or only a negligible one,


• Unlike in real-time simulations, the relevant measurement for incidents is their occurrence rather than the operator's response to them.

• A further set of incidents to be used as measurements for safety concerns degraded performance of the EMMA A-SMGCS functionality.

When recording non EMMA A-SMGCS related incidents in shadow mode trials, the following needs to be considered: In passive and active shadow mode trials, non EMMA A-SMGCS related incidents are indicators for the safety level of the current ATM system. In advanced shadow mode, in contrast, EMMA A-SMGCS related and non EMMA A-SMGCS related incidents are indicators for the safety level of the experimental ATM system.
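The incident recording described above could be summarised as a rate per traffic movement, kept separate for EMMA A-SMGCS related and non-related incidents. The counts below are invented for illustration only:

```python
# Invented shadow-mode trial figures: total observed traffic movements and
# recorded incidents, split by whether they relate to EMMA A-SMGCS.
movements = 1840
incidents = {"emma_related": 7, "non_emma_related": 23}

# Express each incident class as a rate per movement, so that trials of
# different duration and traffic volume remain comparable.
rates = {kind: count / movements for kind, count in incidents.items()}

for kind, rate in rates.items():
    print(f"{kind}: {rate * 1000:.1f} incidents per 1000 movements")
```

In passive and active shadow mode these rates would characterise the current ATM system, and in advanced shadow mode the experimental one, as noted in the paragraph above.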

3.1.4.2.2 Experienced level of safety

The experienced level of safety in shadow mode trials can be assessed in the same way as in real-time simulations. Thus, it is suggested to use a safety-debriefing questionnaire at the end of a simulation session containing the following items:

• What kind of non-nominal / unsafe / potentially dangerous situations occurred during the simulation run?

• How was the situation diagnosed?

• Which factors contributed to the occurrence of the situation?

• What was the consequence of the situation?

• What could have been the worst consequence if the situation was not detected?

• What could have been the worst consequence if the situation was detected?

• Which factors could have worsened the situation?

• What could be the severity of the worst credible case (rated on a scale from 1 [major accident] to 5 [no immediate effect on safety])?

• How often could this hazard occur in real operation?

• Are there any fall-back actions which could mitigate the situation?

Because there will be no experimentally produced safety-critical situations, the number of such events will be far smaller than in the real-time simulations. For this reason, the safety questionnaire will focus to a larger extent on situations the operator did not directly experience, but considers possible. Data obtained on the basis of the safety-debriefing questionnaire will be analysed qualitatively. In particular, it can be analysed which hazards are associated with the introduction of EMMA A-SMGCS (and thus are only experienced or conceived of by the operational team), and which hazards are not associated with the introduction of EMMA A-SMGCS but can change in severity or frequency (and thus are reported by both the operational team and the trials team).

3.1.4.2.3 Safety indicators in passive shadow-mode trials

As already mentioned in the previous paragraph, in passive shadow-mode trials the controller has no direct influence on the traffic, so the number of hazards (that is, of any conditions, events or circumstances which could induce an incident or accident) cannot be recorded directly. In this section, an alternative methodology is proposed to perform a partial safety assessment of the EMMA A-SMGCS in passive shadow-mode trials.

3.1.4.2.3.1 Overview

The objective of this analysis is to measure the number of potentially hazardous situations when using the EMMA A-SMGCS in non-nominal conditions (visibility condition 2). These hazardous situations can occur each time the controller is provided with incorrect surveillance information by the EMMA A-SMGCS display and issues a clearance to a pilot using this information. The following EMMA A-SMGCS procedures and types of clearance have been described in the Operational Requirement Document:

− ATC/Start-up clearance

− Push-back clearance

− Taxi clearance

− Control of taxiway intersection

− Handover between Approach, Departure, Aerodrome, Ground and Apron Control

− Taxiing on the runway

− Line-up procedures

− Take-off clearances

− Landing clearance

According to the Operational Requirement Document, in order to issue each of these clearances to the aircraft/vehicle, the ATCO has to check that:

− the aircraft/mobile requesting the clearance is positioned at the right place (i.e. an identification procedure shall be performed)

− there is no potential conflict with other mobiles, in air or on ground.

In visibility condition 1, these tasks will be performed using direct outside view of the traffic. In visibility condition 2, these tasks will be performed using the EMMA A-SMGCS display.

3.1.4.2.3.2 Test procedure

During passive shadow-mode trials, these scenarios can be used to determine whether the EMMA A-SMGCS performance is acceptable for issuing each clearance (push-back clearance, taxi clearance, etc.) in each corresponding area of the airport (apron, runway entry, taxiway intersection, etc.) under each set of conditions (visibility good [1] and bad [2]). Generic open questions can then be formulated for each combination of procedure/area/condition. For instance, a generic open question could be: “Using the EMMA A-SMGCS HMI, in visibility condition 2, does the current performance of the surveillance system allow for correctly identifying an aircraft at its stand, in order to give it a push-back clearance?”

The objective of the system safety assessment process is to answer each of these generic questions. In passive shadow-mode trials, the generic procedure will be the following:

1. The observer asks the controller to perform a given task, allowing him/her to determine whether s/he is able to give a clearance to an aircraft/vehicle, using first the EMMA A-SMGCS HMI (Level I), then the outside view (baseline):

- Identify a given aircraft, or

- Detect potential conflicts with a given aircraft.

(For example, in order to issue a push-back clearance, identify a given aircraft and its stand.)


2. The observer counts the number of situations in which the controller is misled by insufficient surveillance performance or by unsuitable procedures and creates a potentially unsafe situation (issues a wrong clearance, does not properly identify an aircraft/vehicle, does not detect a conflict, etc.). As the controller is not actually controlling the traffic, his/her workload will be much reduced; it is assumed that, in such situations, the controller is not the main cause of the error.

3. If the result of the task using the EMMA A-SMGCS display is not consistent with the result of the task using the outside view, the observer can use a safety debriefing questionnaire, as defined in section 3.1.4.2.2, in order to answer the following questions:

• Which factors contributed to the occurrence of the potentially unsafe situation? (technical factors, procedures)

• What was the consequence of the situation? (wrong identification, not detected conflict)

• What could have been the worst consequence if the situations were not detected?

• What could have been the worst consequence if the situations were detected?

• Which factors could have worsened the situations?

• What could be the severity of the worst credible case (rated on a scale from 1 [major accident] to 5 [no immediate effect on safety])?

• How often could this hazard occur in real operation?

• Are there any fall-back actions which could mitigate the situation?

At the end of the process, the following quantitative data will be available:

• Number of times the controller was asked to issue a clearance, for each type of clearance (total number of situations: TNS),

• Number of times the controller was misled by insufficient surveillance performance or by unsuitable procedures and created a potentially unsafe situation (issued a wrong clearance, did not properly identify an aircraft/vehicle, did not detect a conflict, etc.) (number of potentially unsafe situations: NUS),

• For each of these potentially unsafe situations, identification of the situation type: wrong identification or not detected conflict (number of potentially unsafe situations for each situation type),

• For each of these potentially unsafe situations, identification of the main factor: technical factor or procedures (number of potentially unsafe situations caused by each factor),

• For each of these potentially unsafe situations, identification of the severity (number of potentially unsafe situations in each category of severity).
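The TNS/NUS counts described above could be summarised per clearance type as sketched below. All figures and clearance labels are invented for illustration:

```python
# Invented passive shadow-mode counts per clearance type:
# TNS = total number of situations, NUS = number of potentially unsafe situations.
counts = {
    "push-back": (40, 2),
    "taxi":      (55, 3),
    "line-up":   (30, 1),
    "take-off":  (35, 0),
}

# Proportion of potentially unsafe situations per clearance type; a high
# ratio would flag a procedure/area/condition combination for closer review.
ratios = {}
for clearance, (tns, nus) in counts.items():
    ratios[clearance] = nus / tns
    print(f"{clearance}: {nus}/{tns} potentially unsafe ({ratios[clearance]:.1%})")
```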

3.1.4.3 Operational Trials

Operational trials are carried out by installing the new system/platform in the operational environment (in this case, the airport) and using it for the control of traffic. As the previous system/platform will – at least for a certain amount of time – co-exist with the new system, operational trials show a close resemblance to advanced shadow mode trials.

For this reason, the suggested approaches to safety measurement – both in terms of objective and subjective indicators – are broadly identical to those suggested for shadow mode trials.

3.1.4.3.1 Occurrence of hazards and controller reactions


Like in shadow mode trials, safety in operational trials can be measured in terms of the number of safety-critical events that occur relative to the number of traffic movements (for the list of safety-critical events see previous section). As mentioned for shadow mode trials, though, the occurrence of safety-critical situations during the evaluation period is extremely unlikely (and certainly not desirable). Because of the safety risks involved, the experimental realisation of non-nominal situations is also not recommended.

What can be measured, though, are hazards that do not directly compromise safety, as they are usually easily mitigated (such as an incorrect pilot read-back). Furthermore, hazards relating to a sub-optimal functioning of EMMA A-SMGCS can be measured. However, for the EMMA A-SMGCS related hazards, no baseline data will be available.

3.1.4.3.2 Experienced level of safety

For operational trials, the same way of obtaining subjective measures of safety is recommended as for real-time simulations and shadow mode trials. That is, a safety-debriefing questionnaire should be administered at the end of a simulation session. This safety-debriefing questionnaire should contain the same questions as for Real-time Simulations and Shadow mode trials.

3.2 Capacity & Efficiency Assessment Methodology

3.2.1 Aim of the Chapter

The objective of this chapter is to present the methodology that should be used to assess the capacity and efficiency impacts of the technical performance of EMMA A-SMGCS functions and concepts during the validation activities at the EMMA Project sites. More specifically, this chapter:

• Identifies the technical performance characteristics of the EMMA A-SMGCS functions under consideration which affect the capacity and efficiency performance of the system.

• Identifies and presents the method(s) that will be used to assess capacity and efficiency related technical performance characteristics of the EMMA A-SMGCS .

• Provides suggestions as to how the proposed method(s) should be implemented.

3.2.2 Impact of the EMMA A-SMGCS on capacity and efficiency

The main objective of the validation to be performed in the EMMA project is to ensure that the EMMA A-SMGCS contributes positively to the surface management at airports. In particular, greater benefits are expected under low visibility conditions and in congested traffic situations.

EMMA A-SMGCS Level I is primarily intended to enhance the safety and efficiency of ground surface operations through the introduction of the surveillance service. The main objective is to enhance ATM operations, in particular visual surveillance (as performed in SMGCS), with an automated system capable of providing the same level of service in all-weather operations.


EMMA A-SMGCS Level II aims at complementing the surveillance service (Level I) with a control service whose objective is to detect potentially dangerous conflicts in order to improve the safety, capacity and efficiency of ground movement areas.

The main benefits of implementation of EMMA A-SMGCS level I will be associated with, but not limited to, the increase of airport efficiency and the maintenance of airport throughput in bad visibility conditions.

As mentioned in the ICAO A-SMGCS Manual, savings are expected in terms of reduced delays, reduced taxi-in and taxi-out times, reduced aircraft turnaround time at the gate, improved aircraft servicing, improved aerodrome capacity (in terms of throughput, expressed as operations per hour), and reduced operating costs.

3.2.3 Methodology to assess the impact on capacity and efficiency

According to MAEVA’s V&V methodology, capacity and efficiency assessment includes the following tasks:

• The installation of the experimental system/platform at the airport.

• The use of the experimental system/platform during the validation conduct (either in addition to or instead of the current system).

• Other activities related to validation (such as presence of validation team in the operational environment and the collection of measurements during operation).

There will be two types of data analysis (“absolute” and “relative”) due to the fact that there will be two different operational scenarios: Baseline (no A-SMGCS) and EMMA A-SMGCS in operation. Thus, assessment of EMMA A-SMGCS benefits could be performed as follows:

• In “absolute terms”; it shall refer to one scenario only, e.g. maximum number of movements with EMMA A-SMGCS in bad visibility conditions

• In “relative terms”; it shall be the outcome of the comparison between the two situations in order to evaluate an improvement or a change, e.g. number of extra movements in low visibility conditions that can be handled with the aid of the EMMA A-SMGCS.

Operational scenarios shall be compared in terms of the metrics defined. A distinction is made between objective and subjective measures of capacity and efficiency. The evaluation of the potential capacity and efficiency benefits for an airport equipped with EMMA A-SMGCS should include metrics for both quantitative (e.g. number of movements) and qualitative (e.g. complexity of handling the traffic) data collection methods. Objective data collection methods refer to performance data of the system that can be directly measured, while subjective data collection methods refer to user judgements on the overall system performance or on specific aspects of it.
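For illustration, the "absolute" and "relative" analyses described above could be sketched as follows. The movement counts are invented, not measured EMMA results:

```python
# Invented hourly movement counts under low visibility (visibility condition 2).
baseline_movements = 22   # baseline scenario: no A-SMGCS
emma_movements = 30       # experimental scenario: EMMA A-SMGCS in operation

# "Absolute terms": a figure referring to one scenario only, e.g. the
# maximum number of movements with EMMA A-SMGCS in bad visibility.
absolute = emma_movements

# "Relative terms": the comparison between the two scenarios, e.g. the
# number of extra movements enabled by EMMA A-SMGCS, also as a percentage.
relative = emma_movements - baseline_movements
change_pct = 100 * relative / baseline_movements

print(absolute, relative, f"{change_pct:.1f}%")
```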

3.2.4 Capacity and Efficiency Measurements

Measurements taken strongly depend on the validation environment (real-time simulations, shadow mode trials, operational trials). Emphasis is put on capacity and efficiency measurements in ATC settings, rather than on measurements pertaining to the pilot, because in both EMMA A-SMGCS levels I and II it is the controller who will be provided with new tools and procedures. For the pilot, there will be no changes in tools or procedures in EMMA A-SMGCS levels I and II.

3.2.4.1 Capacity

In order to analyse the impact of the EMMA A-SMGCS on capacity, it is necessary to consider the airport as a whole, which means considering the totality of the ground movements on the runway, taxiway and apron sub-systems. Nevertheless, this impact on capacity shall also be analysed in terms of the impact of EMMA A-SMGCS on each separate sub-system. According to the ICAO A-SMGCS Manual, the following different capacities should be taken into account:

1. Apron capacity - maximum number of aircraft (with indication of types) that can be parked;

2. Taxiway capacity - maximum number of aircraft that can be operated on the taxiways at the same time;

3. Runway capacity - maximum number of runway movements per hour

4. Approach capacity - maximum number of transfers between the aerodrome controller and approach controller.

Each of the above-mentioned capacities should be calculated using both design and operational values, and should be expressed in predetermined units of time. These capacities should be calculated under different visibility conditions.
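As a hypothetical sketch of expressing such figures per unit of time and per visibility condition, throughput (movements per hour) could be derived from time-stamped movements as below; the data and the two-condition split are invented:

```python
# Invented movement log: (timestamp in hours from start of observation,
# visibility condition in force at that time).
movements = [
    (0.10, 1), (0.25, 1), (0.40, 1), (0.55, 1),
    (1.20, 2), (1.60, 2), (2.10, 2),
]

# Assumed observation time spent under each visibility condition (hours).
hours_observed = {1: 1.0, 2: 2.0}

# Count movements per visibility condition, then normalise to movements/hour.
counts = {}
for _, vis in movements:
    counts[vis] = counts.get(vis, 0) + 1

throughput = {vis: counts[vis] / hours_observed[vis] for vis in counts}
print(throughput)   # movements per hour, keyed by visibility condition
```

Note that this yields throughput (what actually happened), not capacity (the maximum achievable), in line with the distinction drawn in the following sub-sections.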

3.2.4.1.1 Real Time Simulations

If the traffic sample is correctly prepared and allows the system to be tested at the airport's limit in terms of number of movements, apron, taxiway, runway and approach capacities can be measured. It is important to bear in mind that specific exercises with specific traffic samples would be necessary for each indicator. If real-time simulations are performed with real traffic samples in which traffic has not been prepared, then throughputs (i.e. numbers of movements) instead of capacities (i.e. maximum numbers of movements) would be measured.

3.2.4.1.2 Shadow-mode Trials

Shadow mode is a validation technique in which the system being validated operates in parallel with the real system. The only exchange of information between the two systems is that the real system feeds the shadow-mode system. The advantage of shadow mode is that the evaluators use real information in a real operating environment, without interfering with airport operations.

Due to the nature of shadow mode, this is not an appropriate validation technique to validate capacity benefits. No capacity measurements will be taken during shadow-mode trials.

In case advanced shadow mode is conducted, which means that the trial team (working with the experimental system) rather than the operational team controls the traffic, the indicators defined for the field trials would be applicable.

3.2.4.1.3 Operational Trials

During operational trials it will not be possible to measure capacities, but only throughputs. In order to measure capacity, the traffic would need to be increased up to the maximum acceptable by the system, and this cannot be done during operational trials.


The traffic sample will correspond to real traffic, which has been regulated according to the infrastructure and airspace capacity. This constraint makes it impossible to measure capacity; however, the apron, taxiway, runway and approach throughputs are measurable.

3.2.4.2 Efficiency

Validation should focus on demonstrating that EMMA A-SMGCS improves the overall efficiency of the airport. Efficiency will be quantified by breaking it down into a number of measurable indicators that reflect the impact on operations when operating with EMMA A-SMGCS.

3.2.4.2.1 Real Time Simulations

The impact of EMMA A-SMGCS levels I and II on efficiency will be analysed from the point of view of ATCOs' activities and flight scheduling, regarding the following aspects:

• Delays and punctuality:

o Average delay per aircraft: the difference between the scheduled time of operation and the time at which the operation is actually performed (several kinds of delays are described in D6.2.2).

o Punctuality index: the percentage of operations performed within the agreed/tolerated delay limits (to be defined in D6.2.2)

• Arrival (Departure) Taxi Times: the time an aircraft spends from the Landing Time (Off-Block Time) to the On-Block Time (Take-Off Time). The introduction of EMMA A-SMGCS should show an improvement in these taxi times under low visibility conditions.

• Ground Conflicts: the introduction of EMMA A-SMGCS into the airport control system should improve the speed of detection and the resolution capability for potentially dangerous ground conflicts. These effects will be reflected in a more efficient strategy of conflict resolution actions.

• Aircraft turn-around time: the improved aircraft servicing will lead to a reduction of the aircraft turn-around time at the gate.
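For illustration only, the snippet below sketches how the delay, punctuality and taxi-time indicators above could be computed from per-flight records. The record fields, the 10-minute tolerance and all figures are hypothetical placeholders; the actual definitions and limits are set in D6.2.2.

```python
from statistics import mean

# Hypothetical per-flight records (times in minutes since midnight)
flights = [
    {"sched": 600, "actual": 604, "off_block": 604, "take_off": 615},
    {"sched": 615, "actual": 630, "off_block": 630, "take_off": 652},
    {"sched": 640, "actual": 641, "off_block": 641, "take_off": 655},
]

delays = [f["actual"] - f["sched"] for f in flights]
avg_delay = mean(delays)                       # average delay per aircraft
TOLERANCE = 10                                 # assumed limit (TBD in D6.2.2)
punctuality = 100 * sum(d <= TOLERANCE for d in delays) / len(delays)
taxi_times = [f["take_off"] - f["off_block"] for f in flights]  # departure taxi

print(round(avg_delay, 1), round(punctuality, 1), round(mean(taxi_times), 1))
# 6.7 66.7 15.7
```

Comparing these values between baseline and EMMA runs, under matched visibility conditions, yields the efficiency deltas the validation is after.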

3.2.4.2.2 Shadow-mode Trials

As already explained, it is not possible to measure in shadow mode efficiency indicators such as those described in the previous section. However, users can provide their impressions in terms of delays, taxi times, etc.

In addition, during shadow-mode trials, questionnaires will be used to obtain the qualitative opinion of EMMA A-SMGCS operators on different aspects that affect capacity and efficiency, such as:

• Improvement of the global airport situation awareness

• Higher capability to make better decisions on time


3.2.4.2.3 Operational Trials

The indicators defined for real time simulations are also applicable for operational trials.

3.2.5 Methods, techniques and tools

3.2.5.1 Real-Time Simulations

Real-time simulations are carried out in a simulation facility (ATC simulator) and simulate events in the time frame in which they would naturally happen. Results will be collected under baseline operational conditions (in which pseudo-controllers work without EMMA A-SMGCS) and under experimental operational conditions with the same tools and procedures but with EMMA A-SMGCS implemented.

RTS output can be objective and subjective. Objective outputs refer to the metrics described above, which are detailed in D6.2.2. Subjective data gathered in real-time simulations complement users' opinions and comments about the impact of EMMA A-SMGCS on capacity and efficiency.

One of the advantages of real-time simulations is that all ATC events can be simulated by the pseudo-controllers during the simulation, so extreme situations in which capacity or efficiency could suffer due to bad weather conditions can easily be reproduced and contrasted. Experimentally induced critical situations allow measurements to be taken as many times as desired, more frequently than actual bad weather situations may occur.

3.2.5.2 Questionnaires

Subjective measurements are usually collected by means of questionnaires. These questionnaires are to be defined within the Measurement and Analysis Specification step. They shall contain both user comments and ratings. Ratings shall be provided on a point evaluation scale applied to statements built around the proposed indicators.

For example:

The number of take offs in a period of time increases:

strongly disagree / disagree / slightly disagree / slightly agree / agree / strongly agree
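Such a six-point scale has no neutral midpoint and forces a directional response. A minimal sketch of how the ratings could be coded numerically for aggregation follows; the 1-6 coding and the sample responses are assumptions, as the actual coding will be fixed with the questionnaires in D6.2.2.

```python
# Assumed numeric coding for the six-point scale above
SCALE = {
    "strongly disagree": 1, "disagree": 2, "slightly disagree": 3,
    "slightly agree": 4, "agree": 5, "strongly agree": 6,
}

def mean_rating(answers):
    """Average numeric score over a set of questionnaire responses."""
    scores = [SCALE[a] for a in answers]
    return sum(scores) / len(scores)

# Four hypothetical controller responses to the take-off statement
print(mean_rating(["agree", "slightly agree", "agree", "strongly agree"]))  # 5.0
```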

Subjective measurements should also be complemented with interviews with all actors involved. This type of feedback from stakeholders would be a good indication of the impact of A-SMGCS on efficiency.

3.3 Human Factors Assessment Methodology

3.3.1 Aim of the Chapter

The objective of this chapter is to present how the ergonomic aspects of EMMA A-SMGCS are evaluated in order to determine a level of suitability.


3.3.2 Impacts of EMMA A-SMGCS on the Human Factors (Tower environment)

EMMA A-SMGCS Level 1 and Level 2 are expected to provide substantial assistance to the controller. The controller is provided with the position and identification of all movements on the maneuvering area and of all aircraft on the movement area (Level 1). In addition, the controller is warned by the system of conflict situations between aircraft, or between aircraft and vehicles, and of intrusions of movements into restricted areas (Level 2).

Without such assistance the controller is forced to obtain this information (position and identification) from the position reports of pilots and vehicle drivers, which are verified by the controller's outside view, analogue radar targets (SMR), the flight strip information and her/his mental map of the traffic situation. Particularly under low visibility conditions and/or a high traffic load, when the outside view is very restricted (or lacking) and the number of aircraft to be controlled increases, the demand on the controller's resources increases too. The means to verify aircraft positions by looking outside are no longer available, so the controller is forced to maintain a picture of the traffic situation on the basis of a mental representation, which works only up to a certain limit. The controller's workload increases to a level where her/his performance decreases, so that, to maintain safety, the amount of traffic has to be reduced.

The EMMA A-SMGCS Level 1/2 supports the controller in maintaining awareness of the actual traffic situation. It is expected that this technical assistance increases the controller's situation awareness and enables the controller to act more efficiently, particularly in adverse conditions (low visibility, high traffic load), handling more traffic at the same or even an increased level of safety. The following impacts on the controller are conceivable and assessable:

• Situation Awareness

• Workload

• Fatigue

• Usability

• Acceptance

• Errant behavior

• Head down / Head up time

• Conflicts detected in time to prevent an accident

• Time for processing an EFS

Verification

Verification means the process of testing whether 'we build the system right'. In terms of Human Factors it means checking whether the system fulfills the Human Factors related requirements. These requirements are derived from standard checklists and will be supplemented by EMMA-specific requirements that are developed in collaboration with the users and are laid down in the EMMA 'Operational Requirement Document' (D135_ORD_V1.0). This verification must be performed during the real-time simulation (RTS) phase and also during the on-site testing phase to ensure that the technical system works as expected.


Validation

In contrast to verification, validation means the process of testing whether 'we build the right system'. Thus it is tested whether the human-machine interaction performance can contribute to 'operational improvements'. There are three different validation platforms: RTS (real-time simulation), operational trials, and shadow-mode trials. The three platforms provide different possibilities for testing; that is, different methodologies and different validation objectives can be identified for each.

The RTS environment is a well-known test platform for realizing 'ceteris paribus' (all else being equal) test conditions. It is a very appropriate environment for comparing human performance between different scenarios (independent variable) under the same set of side conditions. As the side conditions can always be kept constant, the measured effect (or HF performance) can easily be attributed to the variable that has been varied (e.g. with or without EMMA A-SMGCS support, or good vs. low visibility). A second advantage of the RTS environment is that safety-critical situations, such as runway incursions or losses of system information, can be tested, and more intrusive measurement equipment, such as eye-tracking devices, can be used. Methods applied here are mid-run and post-run questionnaires, debriefings, eye-point-of-gaze measurements, behavior observation and physiological measurements.
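The ceteris paribus design lends itself to paired comparisons: each controller serves as his or her own control across the baseline and EMMA scenarios. A minimal sketch with entirely invented workload ratings computes the paired-differences t statistic; in an actual analysis the value would be compared against a t distribution with n-1 degrees of freedom.

```python
from statistics import mean, stdev
from math import sqrt

# Hypothetical workload ratings (e.g. 0-100 scale) for the same six
# controllers in a baseline run and an EMMA A-SMGCS run, with all other
# side conditions held constant.
baseline = [72, 65, 80, 70, 68, 75]
emma = [60, 58, 71, 62, 59, 66]

diffs = [b - e for b, e in zip(baseline, emma)]
t = mean(diffs) / (stdev(diffs) / sqrt(len(diffs)))  # paired t statistic
print(round(mean(diffs), 1), round(t, 2))  # mean reduction and t value
```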

The operational trials environment will be used to evaluate the new system in interaction with the human under real-life conditions. The overall real environment cannot be copied by simulation (e.g. real tower light conditions), and there are circumstances that cannot be predicted without operational trials (e.g. the behavior of the controller with pseudo-pilots vs. real pilots). However, many restrictions have to be taken into account here. Environmental conditions (visibility, traffic load, system running or not) cannot be influenced by the validation test leader. In most cases, only parts of the system can be tested (e.g. only one controller working position (CWP) with restricted functionality). Additionally, it is not permitted to induce safety-critical situations to test the human reaction. Altogether, the variables cannot be assessed over the complete spectrum, only prevailing test conditions can be used, and human performance can only be assessed with non-intrusive methods, mainly post-run questionnaires and debriefings.

Passive shadow-mode trials (the controller works with the system by monitoring the regular traffic but without interacting with it) are well suited to investigating human-machine interactions or to evolving new procedures. Human performance is hard to measure here because the human is not controlling but only monitoring. However, performance can be measured for tasks that result from interaction with the system without direct influence on the regular traffic (e.g. filling in an EFS while monitoring the R/T communication).

HF Performance | Methodology | Test Platform | Objective/Subjective | Quantitative/Qualitative

Situation Awareness | Questionnaires | RTS/Operational Trials | Subjective | Quantitative



Workload | Questionnaires | RTS/Operational Trials | Subjective | Quantitative

Fatigue | Questionnaires | RTS/Operational Trials | Subjective | Quantitative

Usability | Questionnaires | RTS/Operational Trials | Subjective | Quantitative/Qualitative

Acceptance | Questionnaires | RTS/Operational Trials | Subjective | Quantitative/Qualitative

Errant behavior | Behavior Observation | RTS | Objective | Quantitative/Qualitative

Head down / Head up time | Behavior Observation | RTS | Objective | Quantitative

Conflicts detected in time to prevent an accident | Behavior Observation | RTS | Objective | Quantitative

Time for processing an EFS | Behavior Observation | Shadow Mode | Objective | Quantitative
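As an example of how an objective indicator such as head-down/head-up time could be derived from a behavior-observation log, the sketch below assumes a hypothetical event format of (seconds into the run, gaze state); the actual observation protocol is defined in the measurement specification.

```python
# Hypothetical observation log: each entry marks the moment the
# controller's gaze state changed.
events = [(0, "head_up"), (40, "head_down"), (70, "head_up"),
          (120, "head_down"), (150, "head_up")]
RUN_END = 200  # total run length in seconds (assumed)

# Sum the duration of every head-down interval
down = 0
for (t, state), (t_next, _) in zip(events, events[1:] + [(RUN_END, None)]):
    if state == "head_down":
        down += t_next - t

print(down, round(100 * down / RUN_END, 1))  # 60 seconds head-down, 30.0 %
```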

3.3.3 Impacts of EMMA A-SMGCS on the Human Factors (Cockpit Environment)

EMMA also considers onboard equipment that shall provide better situation awareness to the pilots or, in terms of 'operational improvements', onboard systems (like Airport Moving Map Displays) that shall support the pilots in reacting more safely and more efficiently. The onboard systems in EMMA are technically independent of the ground equipment installations, and validation activities referring to onboard systems focus mainly on the Human Factors aspects. As the indicators used here are slightly different from those used for tests with ATCOs in a tower environment, the HF section is split into two parts, one for the Tower and one for the Cockpit environment.

The literature proposes a great variety of HF criteria and guidelines focusing on different viewpoints of human-machine performance. The Airbus approach to investigating HF is based on an HF grid addressing more specifically the use of the cockpit by a crew (Robert & Buratto, 1998). It relies on a classical HF criteria approach, a crew-task-based approach, and an interface-based approach. It mainly considers usability and utility.

Within the EMMA project scope, the following HF dimensions, which affect performance at different levels, may be considered: ease of performing the task, ease of use and of learning, safety (human errors), etc.


HF dimensions (Topics) and HF criteria (Sub-topics):

• Compliance with operational needs: operational utility (impact on situation awareness, on task achievement, etc.); information and functions utility; information completeness; task chronology (consistency between system use and task context)

• Interface usability: respect of human capacities (perception, working memory, etc.); information legibility; commands usability; application transparency and ease of learning; work status transparency and appropriate feedback

• Errors management: errors occurrence; errors detection and recovery means; errors consequences

• Crew task sharing: task definition and repartition; information exchange

• Global consistency: consistency with other cockpit displays and equipment philosophy; respect of known codes; existing procedures (consistency with, modifications); training conditions and content

These HF dimensions (topics) may be considered at different levels in the design process.

The following table gives the main activities and methods according to development phases:

• Operational needs analysis (need analysis, preliminary task analysis). Activities: operational requirements analysis; requirements definition. Methods (not exhaustive): meetings; working groups; users' interviews; knowledge of current tasks and activities.


• General design (general design and concept evaluation). Activities: general HMI concepts; system HMI concepts; space allocation for systems control panels; directives for control panels HMI; HMI mock-ups development; mock-ups evaluation. Methods: participatory design meetings; use of Airbus standards; evaluations.

• Detailed design (iteration on task analysis, design and evaluation of detailed interface functions). Activities: HMI design for every system; evaluation of the general design and of every system HMI design. Methods: operational design reviews; general and local tests.

• Validation (validation of detailed interfaces and functions). Activities: validation of the general design and of every system HMI design. Methods: general and local tests.

The evaluation process is iterative; depending on the maturity of the application and the design step, it may be preferable to perform some specific and rapid tests or a more complete evaluation. Evaluations can be performed very early in the design process, based on rapid prototyping, and performance can be considered on the basis of task descriptions and high-level requirements. Specific detailed objectives (sub-topics) may be defined for the different evaluation steps from general topics such as accordance of functions with operational needs, consistency with existing cockpit applications, understanding of functions, relevance of information content for task achievement, etc. Tests of human-machine performance play a very important role in order to:

• Identify operational needs and assess the consistency of the application and HMI with them.

• Test the ease of use and the performance of use of new interfaces. Usability is in general an important construct underlying successful performance.

• Establish knowledge about how a system can be used by the user and by the crew and what kind of training and procedures have to be produced.

• Identify actual and expected errors. In a complex system, errors may be relatively rare but they can have large consequences. It is necessary to evaluate the predisposition toward errors inherent in a system design.

Advanced evaluation steps may also be performed to investigate the expected benefits of EMMA A-SMGCS, for instance with regard to:

• Improving pilots’ awareness,

• Improving ground security,

• Easing aircraft taxiing and reducing taxiing time.


HF measurements yield data to be analyzed in order to reach the evaluation objectives. Depending on the topic, several techniques (see Data collection methods and techniques) may be considered. For example, situation awareness may be assessed on the basis of:

• Users' debriefing (based on how much situation awareness an operator feels he or she has when using the system) and/or specific debriefings allowing detailed information about the users' perceptions to be gathered and compared against a baseline.

• Observation of user actions during the test (e.g., if the scenario includes specific situations where a lack of situation awareness should affect performance (task accomplishment)).

The following table gives examples of usable HF measurements:

• Situation Awareness: subjective feedback; decisions taken in situations considered complex (TBD)

• Task achievement: subjective feedback; time to accomplish the tasks (taxiing time, delay of action after selected events, etc.); requests made to ATC; compliance with ATC instructions (in situations considered complex, such as lack of signaling on the airport, high traffic, low visibility; TBD); detection time (after selected events); further measures (TBD)

• Errors: subjective feedback; types, occurrence (individual or collective), detection and recovery

• System usability: subjective feedback; relative usage of displays; errors; time to find or process information on the display (or an area)

• Satisfaction level: subjective feedback

• Workload: subjective feedback

• Etc.

The HF measures shall be defined in more detail with operational experts, consistent with the maturity of the application design, and the available evaluation means and resources.


In addition to objective measurements, users' comments (on how they describe their operational needs, on how they use the HMI in performing their tasks, on the information content they expect, etc.) are essential for deriving recommendations to improve the HMI. Free comments or unexpected observed uses of the application often provide useful hints for improving the system design. Some results may not be statistically significant but may still be operationally significant, because they constitute meaningful findings (for example, affecting safety or impacting the usability of the function in a given context).

3.3.4 Methods, techniques and tools

3.3.4.1 Evaluation methods

Evaluations can be performed for several reasons, such as comparing alternative designs or assessing HMI usability. No method is universally applicable; the choice has to be determined by a combination of the objective at the given design stage, the tool available to support the evaluation, the time available, and the availability of test participants. The following table sums up the applicable methods:

• Documentation study: very useful for cockpit design, it consists of studying the operational procedures and the flight manual. The aim is to check the continuity of the design throughout the aircraft program development.

• Expert judgment: this method consists of comparing a design solution with norms, standards and recommendations, or delivering a subjective evaluation based on academic training, field experience, etc.

• Heuristic evaluation by human factors experts or specialists: several designers (engineers, human factors experts) assess the usability of the system. The inspection of the interface is based on usability heuristics such as 'simple and natural dialogue', 'minimize user memory load', 'good error messages', etc.

• Survey: a survey is used to collect users' opinions from a large population.

• Participatory design meetings: performed with a panel of experts or future users, participatory design meetings are required to identify relevant function definitions, appropriate HMI solutions, etc. from an operational point of view. They may be PowerPoint- or mock-up-based, and static, dynamic or exercise-based.


• Walkthrough: performed by experts with knowledge of the technology and of the intended user, or in cooperation with users, this method is an expert review to detect problems early in the design process. After defining the users' task, the expert uses the prototype (or paper) to walk through the task and predict user behaviour and possible problems. The method can be focused on cognitive tasks. It can be associated with cooperative evaluation, which is also used early in the design process to detect usability problems: the user carries out tasks on a prototype while the designer or the human factors specialist interviews the user directly; the user can be asked to think aloud, to comment, and to propose alternatives. A scenario helps to explore ideas, solutions and design decisions in concrete situations; it considers the set of activities for which the users will use the system. Scenarios can be developed for normal or abnormal situations of varying criticality, and they support the experiments.

• Local tests: local evaluations aim at evaluating the application and HMI through scenarios focused on a given crew primary task and display management.

• Global tests: global evaluations aim at evaluating the general HMI concept, the HMIs involved in all the crew's primary tasks (defined in the Cockpit Philosophy), the coherency of HMIs between systems, etc., in a full cockpit simulator, through several mission scenarios involving all the crew's primary tasks and display management.

3.3.4.2 Tools supporting HF methods

The following table gives the tools used for cockpit design:

• Mock-ups: often supported by a PC or workstation (with PowerPoint or software), by physical mock-ups or simply by paper, they represent the global cockpit interface or a local system interface as a prototype. They allow for static demonstration/evaluation in a given 'single user' configuration and are especially useful early in the design process.

• Virtual simulator: it represents a set of aircraft systems and controls and allows for static demonstration/evaluation in a given 'single user' configuration. The advantage compared to mock-ups is that the user can manipulate the main controls. The O3P simulator is a prototyping tool used in the first stages of cockpit HMI design and the human factors process to assess the high-level orientation of the HMI design.


• Engineering simulator: it represents a large part of the cockpit system and environment and is equipped with a visual representation of the external environment as well as simulated aircraft equipment. It allows for demonstration/evaluation in a 'crew' configuration.

• Integration simulator: it has all the characteristics of the engineering simulator but is equipped with actual aircraft equipment, giving users a close-to-reality reproduction of the actual system behaviour. It allows for demonstration/evaluation in a 'crew' configuration.

• Actual aircraft: it can be used as an evaluation tool and also as a means of experience feedback. It has the advantage of being fully representative of reality.

3.3.4.3 Data collection methods and techniques

The selection of methods has to be done according to the evaluation objectives and design step.

Data collection during test run:

• Direct observation by human factors specialist and designer. Objective: direct observation allows gathering users' actions during the test and comments given during the run; these data are later analyzed to take into account the way users used the HMI, possible difficulties, errors performed, etc. How: according to the evaluation objectives, the observation includes actions from users (in particular errors, their context and recovery, task sharing, difficulties encountered, etc.) and comments from users during the test. The observer watches the users performing a representative task and records the relevant user/system interactions (actions, problems of understanding); this method is also valuable for experience feedback, as it helps in understanding users' tasks and use of the system. Support: observation table.

• Experimental observation. Objective: the experiment is designed according to predetermined hypotheses; the aim is to observe the behavior of the users to find evidence for the hypotheses or to discover/explore new user behaviors. Support: definition of observed events, plus records.


• Direct observation by operational experts. Objective: it provides external operational insight into the strategy and efficiency of the users' activity, the application of known procedures, etc. It helps with a detailed understanding of the tasks performed and the HMI information required, as well as the impact of novelties on procedures.

• Data recording through external equipment (eye tracker, video, etc.). Objective: it allows studying visual movements on the HMI (eye tracker) or other observable behaviour such as the user's position (for example, standing up), and can give relevant elements about the use of HMI information or HMI parts. Video or audio recording in particular captures users' verbal comments and communications during the test. How: such recording may be intrusive (e.g. video); this set of techniques is usually used when the evaluation scope includes a specific interest in the recorded parameters. The material must be transcribed or transformed into a form usable by the analysts. Video recording is very valuable for collective tasks (it allows studying the information exchanged between users) and for complex tasks; it may also record data for additional analysis and can support debriefing (self-confrontation). Support: external equipment.

• System data recording. Objective: recording data directly from the system is needed for an objective and systematic description of the actions performed by users and of the recorded events. Depending on the evaluation objectives, quantitative descriptions of users' actions provide inputs that are combined with other data in the analysis (for example, specific actions not performed in the expected context). How: the events or actions to be recorded have to be defined; a more systematic recording is also worthwhile, since it gives the opportunity to analyze features not planned in the experimental protocol but considered relevant after a first analysis of the data. Support: system data recording.

• Data capture through specific requested tasks (such as instantaneous self-assessment, additional task, etc.). Objective: to obtain additional information on subjective workload during the run (instantaneous self-assessment) or to gather information by adding specific tasks. How: a limitation of this method is the impact of the additional task, which is intrusive in the users' activity; due to this intrusiveness, the method shall be justified by the evaluation scope. Support: the specific tasks requested.

Data collection after test run (or set of runs):


• Users' debriefing (interview). Objective: it allows gathering users' points of view and operational explanations, comments on difficulties encountered with the HMI, as well as error risks, detection and recovery. The debriefing gathers users' explanations that, combined with the other data analyses, support the understanding of the problems encountered and of the impact of the functions on human performance. How: the debriefing is performed after the run. The debriefing guide is provided as experimental support and is used after a run, or at the end of the evaluation session, to help gather users' inputs on the evaluation scope; the questions are defined according to the detailed objectives (sub-topics). The debriefing duration is planned as a specific phase of the evaluation. The interviewer's role is to invite users to give additional explanations when possible, providing material for the subsequent analysis. The users' comments may be recorded and transcribed later, or written down directly during the debriefing; in both cases, they are noted as exhaustively as possible, using a defined notation code. The data collected by interview are often qualitative; rating scales can be associated with the questions. Interviews are used to learn what the users think about the evaluated system (in terms of usability as well as requirements definition). Support: debriefing guide.

• Written questionnaire. Objective: it allows gathering the points of view of a large panel of users and may be used if the evaluation involves a large panel. How: like the interview, questionnaires are used to collect users' opinions, but the questions have to be prepared so as to be unambiguous. Some additional data may be required to complete the understanding of the answers (more than in a direct debriefing, where explanations can be asked for on the spot). Quantitative analysis can be performed, as more people can be surveyed and the users' answers can be compared. Support: questionnaire.

The objectives of the tests determine the type of information to be collected from the users. Among the numerous available methods, those usable in the earlier steps of system design and evaluation may be local mock-up evaluations or local tests. Techniques for data collection may be:

• Paper and pencil, appropriate in most evaluation cases, associated with an observation table and debriefing

• Video or audio recording

• Recording of simulator parameters


3.4 Cost-Benefit Assessment Methodology

3.4.1 Aim of the Chapter

The objective of this chapter is to present the major categories of the costs and benefits emerging from the development and use of the EMMA A-SMGCS and provide the method for measuring them.

3.4.2 Costs and Benefits of EMMA A-SMGCS

The introduction of the surveillance and control functions of the EMMA A-SMGCS (Levels I & II) at all EMMA project sites entails costs for the acquisition, maintenance, and operation of the system. On the other hand, the introduction of the EMMA A-SMGCS is expected to generate numerous benefits. Therefore, the cost-benefit assessment of the EMMA A-SMGCS at all EMMA sites requires the quantification of the expected costs and benefits.

The process of identifying and measuring the expected costs and benefits constitutes a basic prerequisite for performing a benefit-cost analysis to evaluate the investment in the proposed A-SMGCS (Levels I & II). In particular, the benefit-cost analysis for assessing the introduction of the proposed A-SMGCS involves estimating the present worth of the monetary value of the costs and benefits of this investment over the specific economic lifetime of the system. The objective of this section is to present the methodologies that will be used to assess the costs and benefits encountered by all stakeholders in introducing the proposed A-SMGCS (involving the implementation of the surveillance and alerting functions) at every EMMA project site.
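The present-worth computation described above can be sketched as follows. The discount rate, economic lifetime, and cash-flow figures below are purely illustrative assumptions, not EMMA project values:

```python
def present_worth(cash_flows, discount_rate):
    """Present worth of a stream of annual net cash flows (benefits minus
    costs); cash_flows[t] occurs at the end of year t, with t = 0 meaning
    'now', so the first entry is not discounted."""
    return sum(cf / (1.0 + discount_rate) ** t
               for t, cf in enumerate(cash_flows))

# Illustrative figures (kEUR): an up-front capital outlay followed by a
# constant net annual benefit over an assumed 10-year economic lifetime.
flows = [-1000.0] + [180.0] * 10
npv = present_worth(flows, 0.06)
print(f"Net present worth: {npv:.1f} kEUR")  # positive => benefits outweigh costs
```

A positive present worth indicates that, under the chosen discount rate and lifetime, the monetized benefits outweigh the investment and operating costs.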

The governing principle of the cost assessment is to ensure that reliable cost estimates can be obtained for all cost components associated with the introduction of the EMMA A-SMGCS (Level I & II). The estimation of costs will take place during the validation stage of the project since it is not possible to have reliable cost estimates at the verification stage.

‘Cost’ is defined as a comprehensive figure of measured input. It must be determined from the full amount of resources, time, and staff used to obtain results. Costs should be expressed in a common monetary unit (i.e. euros). A fundamental problem in cost estimation is to determine the total economic costs of proposed future investments; the life-cycle cost model accomplishes this objective. Since the major cost categories are likely to be ‘uniform’ across most airport projects, EMMA A-SMGCS improvements can be organized and specified in detail under four general cost categories:

EMMA A-SMGCS Planning, Research and Development

This cost category includes:

• any necessary research and development expense associated with the project
• detailed project design and engineering plans
• government approval

EMMA A-SMGCS Investment / Capital


This cost category would inter alia include:

• all opportunity costs associated with getting the investment implemented
• construction costs for the new system, expansion, or modernization work, including labor, materials, services, and administration
• total financing costs

EMMA A-SMGCS Operations and Maintenance

This cost category would inter alia include:

• transition costs from the existing to the new system
• all recurring material costs to operate, maintain, or repair the system over its full economic life
• all types of personnel costs, such as for operating the system
• environmental impact costs
• training costs

EMMA A-SMGCS Termination

This cost category would inter alia include:

• dismantling
• site restoration, minus income from the sale of divested assets at the end of the operational period

The benefits emerging from the introduction of the EMMA A-SMGCS Levels I & II refer to the savings derived from the expected reduction in delays, the increase in airport throughput, and the improvement of safety. In particular, the following categories of benefits have been identified:

• Energy savings: monetary savings from the reduction in fuel use

• Time savings for the airlines due to the reduction in delays

• Airport throughput: monetary benefits from the increase in airport throughput

• Time savings for passengers

• Safety impacts

• Reduction of environmental impacts

The EMMA project should systematically identify the range of stakeholders, as the total A-SMGCS spending will, in economic terms, be compared to its benefits (effectiveness) per stakeholder category.

This stakeholder approach should therefore include:

• Cost assessments to EMMA A-SMGCS users (ATM / airports)
• Cost assessments to supporters (airlines / government)
• Cost assessments to interest takers (air transport passengers)

The identification, quantification and monetization of costs for various EMMA A-SMGCS investment types should be developed in constant € for a limited number of focus years.

In order to do so the cost assessment should consider:

• how to interpolate or extrapolate focus-year costs (and benefits) over the full operating period


• how to quantify and compare multi-period streams of costs (and benefits) taking risk and inflation impacts into account

• how to select ‘low cost’ configurations
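One simple way to spread focus-year figures over the full operating period, as the first consideration above suggests, is linear interpolation between focus years, holding the nearest focus-year value outside the covered range. This is only a sketch with assumed figures, not a prescribed EMMA procedure:

```python
def interpolate_focus_years(focus_values, years):
    """Linearly interpolate annual values (in constant EUR) between a
    small set of focus years; years outside the focus range take the
    value of the nearest focus year."""
    known = sorted(focus_values)
    result = {}
    for y in years:
        if y <= known[0]:
            result[y] = focus_values[known[0]]
        elif y >= known[-1]:
            result[y] = focus_values[known[-1]]
        else:
            # find the two focus years bracketing y
            for a, b in zip(known, known[1:]):
                if a <= y <= b:
                    frac = (y - a) / (b - a)
                    result[y] = (focus_values[a]
                                 + frac * (focus_values[b] - focus_values[a]))
                    break
    return result

# Illustrative: O&M costs (kEUR) estimated for two focus years only,
# spread over an assumed 2006-2012 operating period.
annual = interpolate_focus_years({2006: 100.0, 2010: 140.0}, range(2006, 2013))
```

The resulting per-year series can then feed directly into a multi-period present-worth calculation.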

Determination of Types of Cost (Benefits) Measurement

Correct assignment of realistic EMMA A-SMGCS project costs requires an explicit understanding of the cost concepts applied and of other single cost factors ‘at work’, as they impact heavily on final cost assessments.

Cost estimation will require some insight into the methodological cost (and benefit) measurement approach. Therefore, critical assumptions and the type of measuring costs must be defined and/or explained - at least with respect to calculating:

• Full cost
• Incremental cost
• Opportunity cost
• Sunk cost
• Depreciation
• Principal and interest expense
• Inflation

As most of the EMMA A-SMGCS spending will cover multi-year periods, elementary aspects of ‘uncertainty’ must be identified and dealt with in the cost measurement. Sensitivity analyses should be introduced, their assumptions discussed, and the respective results with regard to cost (and benefit) volumes illustrated thoroughly.


3.4.3 Cost-Benefit Assessment Methodology

The benefit-cost assessment of the EMMA A-SMGCS (Levels I & II) is characterized by multidimensional impacts and multiple stakeholders. The introduction of the proposed A-SMGCS (Levels I & II) involves impacts affecting safety, capacity, efficiency, and human factors performance for all stakeholders. The assessment of the overall performance of the system implies the measurement of the emerging impacts, and these impacts are measured on the basis of different metrics. On the other hand, the existence of multiple stakeholders implies different values, objectives, and aspirations in relation to the expected impacts of introducing the EMMA A-SMGCS. This diversity of values, objectives, and aspirations leads to different priorities in achieving the various system objectives, as well as to different weight assignments expressing the importance of each impact category for achieving the overall EMMA A-SMGCS goal. For instance, different stakeholders may assign different weights to the safety, efficiency, capacity, or human factors impacts of the system. Therefore, the overall assessment of an A-SMGCS requires the development and use of an evaluation methodology able to cope with the characteristics of the evaluation problem at hand. The final outcome of the overall assessment is the selection of the “most desirable system” among a set of competing alternatives at each EMMA site. For the purposes of the EMMA project validation, the following two competing alternatives will be assessed: i) the baseline surface movement guidance and control system available before the introduction of the EMMA A-SMGCS, and ii) the EMMA A-SMGCS.

The following two alternative methods are available for the overall assessment of complex technological systems such as the EMMA A-SMGCS: i) cost-benefit analysis, and ii) cost-effectiveness analysis. The objective of the cost-benefit method is to provide an overall assessment of the performance of a system by converting all costs and all benefits into a single measurement in common monetary units. The objective of the cost-effectiveness analysis, on the other hand, is to provide an overall assessment of the performance of a system by measuring all costs (inputs) in a single monetary unit and all other system impacts (outputs) in their corresponding metrics (e.g. number of reduced incidents, taxiway time savings per arriving/departing A/C). The requirement of the cost-benefit analysis to express all system benefits (outputs) in monetary terms makes its application difficult for the overall assessment in the EMMA A-SMGCS validation. The major difficulty stems from the fact that certain types of outputs are very difficult to convert into monetary values. For instance, it is difficult to assess the monetary value of the different types of incidents that will potentially be avoided due to the introduction of the surveillance and control functions of the EMMA A-SMGCS (Levels I & II) at the EMMA project sites. However, provided that the sites supply the appropriate conversion factors for converting the operational improvements into monetary values, the cost-benefit analysis is applicable; otherwise, the cost-effectiveness method should be used.

The application of the cost-effectiveness analysis in a multiple-stakeholder decision-making environment requires: i) the measurement of the system's operational improvements in their respective units, and ii) the establishment of weights, commonly agreed by all stakeholders, expressing the importance of each category of system outputs. The outputs of the competing systems (i.e. the baseline system and the EMMA A-SMGCS) can be measured objectively in absolute terms and/or subjectively, using expert judgment to express the relative performance of the competing systems on a ratio scale.


For the implementation of the cost-effectiveness method, it is suggested to use the Analytic Hierarchy Process (AHP) technique. In what follows, we present a brief overview of the AHP as well as a step-by-step description of its application for the overall assessment of EMMA A-SMGCS Level II, which will take place within the framework of the validation at the EMMA project sites.

The AHP (Saaty, 1990) provides a practical way to deal quantitatively with complex decision-making problems. It also provides an effective framework for group decision-making (i.e. multiple decision-makers). The goal of the AHP model for the needs of the cost-effectiveness analysis of the EMMA project will be to assign priorities to the identified performance indicators and to the alternative systems (i.e. effectiveness and cost indicators), based on their relative importance according to the stakeholders' judgment.

The AHP is based on three principles: 1) the principle of constructing hierarchies, 2) the principle of establishing priorities, and 3) the principle of logical consistency. According to the method, a complex decision-making problem is decomposed hierarchically into its components. The top of the hierarchy consists of a single element that expresses the overall goal of the evaluation problem. The overall goal is decomposed into a set of criteria that constitute the second level of the hierarchy. This hierarchical decomposition is continued until a set of measurable indicators is determined. The bottom level of the hierarchy consists of the alternatives. Given this hierarchical structure, each element (parent) at a level of the hierarchy (apart from the bottom level) is linked to a set of elements (children) at the level below. The AHP mathematical model calculates a weight for each child element from the set of pairwise comparisons of the child elements in terms of their importance with respect to their parent element. The synthesis of these weights in a bottom-up fashion yields the overall priorities of the alternatives, which express their relative degree of importance with respect to the overall goal.
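As a concrete illustration of the pairwise-comparison step, the sketch below derives priority weights from a judgment matrix using the geometric-mean approximation (one common way to estimate AHP priorities) and checks Saaty's consistency ratio. The matrix entries are hypothetical, not elicited EMMA stakeholder judgments:

```python
import math

# Saaty's random consistency index (RI) for matrix sizes 1..5
# (standard published values).
RANDOM_INDEX = {1: 0.0, 2: 0.0, 3: 0.58, 4: 0.90, 5: 1.12}

def ahp_priorities(matrix):
    """Derive priority weights from a reciprocal pairwise comparison
    matrix using the geometric mean of each row, normalised to sum to 1."""
    n = len(matrix)
    geo_means = [math.prod(row) ** (1.0 / n) for row in matrix]
    total = sum(geo_means)
    return [g / total for g in geo_means]

def consistency_ratio(matrix, weights):
    """Approximate lambda_max from the weighted row sums and return
    Saaty's consistency ratio CR = CI / RI; CR < 0.1 is acceptable."""
    n = len(matrix)
    lam = sum(
        sum(matrix[i][j] * weights[j] for j in range(n)) / weights[i]
        for i in range(n)
    ) / n
    ci = (lam - n) / (n - 1)
    return ci / RANDOM_INDEX[n]

# Hypothetical judgments: safety vs. efficiency/capacity vs. human factors.
pairwise = [
    [1.0,   3.0, 5.0],
    [1 / 3, 1.0, 2.0],
    [1 / 5, 1 / 2, 1.0],
]
w = ahp_priorities(pairwise)
cr = consistency_ratio(pairwise, w)
```

With these example judgments, safety receives the largest weight and the consistency ratio is well below the 0.1 acceptability threshold.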

In order to effectively address the evaluation problem, the following methodological steps must be implemented:

• Define system performance dimensions: identification of all relevant dimensions of the “performance” (assessment criteria) upon which the EMMA A-SMGCS applications and the baseline system are compared.

• Identify the cost-effectiveness performance measures: identification of the various indicators that will be used in order to evaluate the effectiveness and the costs of the alternative cases.

• Development of the AHP models: based on the information provided in the above two steps, an AHP model for the benefits and an AHP model for the costs are developed.

• Development of a questionnaire: in order to collect the data required for the implementation of the AHP methodology (pairwise comparisons), a questionnaire is developed.

• Data Collection: for the collection of the required data a survey is conducted. The survey takes the form of interviews and is addressed to the various groups of stakeholders involved in and affected by the EMMA A-SMGCS.

• Data Analysis: the data collected is analyzed using the AHP mathematical model [5]. The priorities in terms of the benefits and the cost of the alternative systems are calculated and the cost-effectiveness ratios are formed. The process is performed for the total group of stakeholders and for each group of stakeholders separately.


• Sensitivity Analysis: The objective of this step is to determine the robustness of the resulting overall priorities of the alternative systems through minor changes of the associated priorities of the criteria or indicators.
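The weight-perturbation idea behind the sensitivity-analysis step can be sketched as follows: shift one criterion weight by a small delta, rescale the others so the weights still sum to one, and check whether the preferred alternative changes. All weights and alternative scores below are hypothetical placeholders:

```python
def perturb_weights(weights, index, delta):
    """Shift one criterion weight by delta and rescale the remaining
    weights proportionally so that they still sum to 1."""
    new = list(weights)
    new[index] = min(1.0, max(0.0, new[index] + delta))
    rest = sum(w for i, w in enumerate(new) if i != index)
    scale = (1.0 - new[index]) / rest
    return [w if i == index else w * scale for i, w in enumerate(new)]

def best_alternative(weights, scores):
    """scores[c][a]: priority of alternative a under criterion c;
    returns the index of the alternative with the highest weighted sum."""
    n_alts = len(scores[0])
    totals = [sum(w * s[a] for w, s in zip(weights, scores))
              for a in range(n_alts)]
    return totals.index(max(totals))

def ranking_is_robust(weights, scores, deltas=(-0.05, 0.05)):
    """True if the preferred alternative is unchanged under small
    perturbations of every criterion weight."""
    base = best_alternative(weights, scores)
    return all(
        best_alternative(perturb_weights(weights, i, d), scores) == base
        for i in range(len(weights)) for d in deltas)

# Hypothetical example: two alternatives (baseline, EMMA Level I&II)
# scored under three criteria (safety, efficiency/capacity, human factors).
w = [0.5, 0.3, 0.2]
s = [[0.30, 0.70], [0.40, 0.60], [0.45, 0.55]]
robust = ranking_is_robust(w, s)
```

If the preferred alternative survives all the perturbations, the overall priorities can be reported as robust; otherwise the critical weights should be examined with the stakeholders.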

Given the above analysis, the hierarchical decomposition of the evaluation problem of the EMMA A-SMGCS includes two dimensions, that is, cost and effectiveness. The two hierarchies constructed are illustrated schematically in the following figures.

[Figure: hierarchy with the overall goal "Identify the most beneficial A-SMGCS" at the top; the criteria Safety (S), Efficiency and Capacity (EC), and Human Factors (HF) at the second level; the indicators S1…Sn, EC1…ECn, and HF1…HFn at the third level; and the alternatives Baseline and EMMA System (Level I & II) at the bottom.]

Figure 3-1: Hierarchical decomposition of the Measures of Effectiveness of EMMA A-SMGCS


Figure 3-2: Hierarchical decomposition of the costs of EMMA A-SMGCS


4 Concluding Remarks

An integrated methodological framework for assessing the performance of the EMMA A-SMGCS (Levels I & II) at the different EMMA project sites (i.e., three airports and the flight deck) has been developed. The proposed framework covers both the verification and the validation phase of the EMMA A-SMGCS assessment and suggests methodologies for assessing all relevant types of impacts resulting from the implementation of the surveillance and control functions of the EMMA A-SMGCS. Depending on the situation, different types of methodologies are suggested for assessing the EMMA A-SMGCS during the verification and validation phases. In addition, the rationale for selecting the specific methodologies, along with issues regarding their implementation at the EMMA sites, is identified. The implementation of the suggested methodologies requires: i) the identification of the indicators and metrics involved in the assessment of each impact category, and ii) the description of the experimental design that will be used during the verification and validation phases for measuring the proposed indicators. These implementation issues will be addressed by the subsequent tasks of WP6.2, that is, WP6.2.2 and WP6.2.3.


5 References

BETA Project. General Test Concept. Operational Benefit Evaluation by Testing an A-SMGCS, Project of EC 5th Framework Programme COMPETITIVE AND SUSTAINABLE GROWTH PROGRAMME, Contract number: 1999-RD.10804, 2003.

EUROCONTROL. Operational Concept & Requirements for A-SMGCS Implementation Level I, 2003.

EUROCONTROL. Operational Concept & Requirements for A-SMGCS Implementation Level II, 2003.

ESA Board for Software Standardization and Control, “Guide to Software Verification and Validation”, Issue 1, Revision 1, European Space Agency, March 1995.

International Civil Aviation Organization (ICAO). Advanced Surface Movement Guidance and Control System (A-SMGCS) Manual, Doc 9830, First Edition. ICAO, Montreal, 2004.

Saaty, T.L. The Analytic Hierarchy Process: Planning, Priority Setting, Resource Allocation. The Analytic Hierarchy Process Series, Vol. I. RWS Publications, Pittsburgh, USA, 1990.

Roberto, J.M. and F. Buratto. “Towards a Grid for Integrating Ergonomic Requirements into Cockpit Interface Design”, Proceedings of the IHO-Aero Conference, 1998.