Advanced Testing of Systems-of-Systems 2

Practical Aspects

Bernard Homès
First published 2022 in Great Britain and the United States by ISTE Ltd and John Wiley & Sons, Inc.

Apart from any fair dealing for the purposes of research or private study, or criticism or review, as
permitted under the Copyright, Designs and Patents Act 1988, this publication may only be reproduced,
stored or transmitted, in any form or by any means, with the prior permission in writing of the publishers,
or in the case of reprographic reproduction in accordance with the terms and licenses issued by the
CLA. Enquiries concerning reproduction outside these terms should be sent to the publishers at the
undermentioned address:

ISTE Ltd
27-37 St George’s Road
London SW19 4EU
UK
www.iste.co.uk

John Wiley & Sons, Inc.
111 River Street
Hoboken, NJ 07030
USA
www.wiley.com

© ISTE Ltd 2022


The rights of Bernard Homès to be identified as the author of this work have been asserted by him in
accordance with the Copyright, Designs and Patents Act 1988.

Any opinions, findings, and conclusions or recommendations expressed in this material are those of the
author(s), contributor(s) or editor(s) and do not necessarily reflect the views of ISTE Group.

Library of Congress Control Number: 2022944148

British Library Cataloguing-in-Publication Data


A CIP record for this book is available from the British Library
ISBN 978-1-78630-750-7
Contents

Dedication and Acknowledgments . . . . . . . . . . . . . . . . . . . . . . . . . . xiii

Preface . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . xv

Chapter 1. Test Project Management . . . . . . . . . . . . . . . . . . . . . . . . . 1


1.1. General principles . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 1
1.1.1. Quality of requirements . . . . . . . . . . . . . . . . . . . . . . . . . . . . 2
1.1.2. Completeness of deliveries . . . . . . . . . . . . . . . . . . . . . . . . . . . 3
1.1.3. Availability of test environments . . . . . . . . . . . . . . . . . . . . . . . 3
1.1.4. Availability of test data . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 4
1.1.5. Compliance of deliveries and schedules . . . . . . . . . . . . . . . . . . . 5
1.1.6. Coordinating and setting up environments . . . . . . . . . . . . . . . . . . 6
1.1.7. Validation of prerequisites – Test Readiness Review (TRR) . . . . . . . . 6
1.1.8. Delivery of datasets (TDS) . . . . . . . . . . . . . . . . . . . . . . . . . . . 7
1.1.9. Go-NoGo decision – Test Review Board (TRB) . . . . . . . . . . . . . . . 7
1.1.10. Continuous delivery and deployment . . . . . . . . . . . . . . . . . . . . 8
1.2. Tracking test projects . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 9
1.3. Risks and systems-of-systems . . . . . . . . . . . . . . . . . . . . . . . . . . . 10
1.4. Particularities related to SoS . . . . . . . . . . . . . . . . . . . . . . . . . . . . 11
1.5. Particularities related to SoS methodologies . . . . . . . . . . . . . . . . . . . 11
1.5.1. Components definition . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 12
1.5.2. Testing and quality assurance activities . . . . . . . . . . . . . . . . . . . . 12
1.6. Particularities related to teams . . . . . . . . . . . . . . . . . . . . . . . . . . . 12

Chapter 2. Testing Process . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 15


2.1. Organization . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 17
2.2. Planning . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 18
2.2.1. Project WBS and planning . . . . . . . . . . . . . . . . . . . . . . . . . . . 19

2.3. Control of test activities . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 21


2.4. Analyze . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 22
2.5. Design . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 23
2.6. Implementation . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 24
2.7. Test execution . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 25
2.8. Evaluation . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 26
2.9. Reporting . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 28
2.10. Closure . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 29
2.11. Infrastructure management . . . . . . . . . . . . . . . . . . . . . . . . . . . . 29
2.12. Reviews . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 30
2.13. Adapting processes . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 31
2.14. RACI matrix . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 32
2.15. Automation of processes or tests . . . . . . . . . . . . . . . . . . . . . . . . . 33
2.15.1. Automate or industrialize? . . . . . . . . . . . . . . . . . . . . . . . . . . 33
2.15.2. What to automate? . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 33
2.15.3. Selecting what to automate . . . . . . . . . . . . . . . . . . . . . . . . . . 34

Chapter 3. Continuous Process Improvement . . . . . . . . . . . . . . . . . . . 37


3.1. Modeling improvements . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 37
3.1.1. PDCA and IDEAL . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 38
3.1.2. CTP . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 39
3.1.3. SMART . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 41
3.2. Why and how to improve? . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 41
3.3. Improvement methods . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 42
3.3.1. External/internal referential . . . . . . . . . . . . . . . . . . . . . . . . . . 42
3.4. Process quality . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 46
3.4.1. Fault seeding . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 46
3.4.2. Statistics . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 46
3.4.3. A posteriori . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 47
3.4.4. Avoiding introduction of defects . . . . . . . . . . . . . . . . . . . . . . . 47
3.5. Effectiveness of improvement activities . . . . . . . . . . . . . . . . . . . . . . 48
3.6. Recommendations . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 50

Chapter 4. Test, QA or IV&V Teams. . . . . . . . . . . . . . . . . . . . . . . . . . 51


4.1. Need for a test team . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 52
4.2. Characteristics of a good test team . . . . . . . . . . . . . . . . . . . . . . . . . 53
4.3. Ideal test team profile . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 54
4.4. Team evaluation . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 55
4.4.1. Skills assessment table . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 56
4.4.2. Composition . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 58
4.4.3. Select, hire and retain . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 59
4.5. Test manager . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 59

4.5.1. Lead or direct? . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 60


4.5.2. Evaluate and measure. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 61
4.5.3. Recurring questions for test managers . . . . . . . . . . . . . . . . . . . . 62
4.6. Test analyst . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 63
4.7. Technical test analyst . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 64
4.8. Test automator . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 65
4.9. Test technician . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 66
4.10. Choose our testers . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 66
4.11. Training, certification or experience?. . . . . . . . . . . . . . . . . . . . . . . 67
4.12. Hire or subcontract? . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 67
4.12.1. Effective subcontracting . . . . . . . . . . . . . . . . . . . . . . . . . . . 68
4.13. Organization of multi-level test teams . . . . . . . . . . . . . . . . . . . . . . 68
4.13.1. Compliance, strategy and organization . . . . . . . . . . . . . . . . . . . 69
4.13.2. Unit test teams (UT/CT) . . . . . . . . . . . . . . . . . . . . . . . . . . . 70
4.13.3. Integration testing team (IT) . . . . . . . . . . . . . . . . . . . . . . . . . 70
4.13.4. System test team (SYST) . . . . . . . . . . . . . . . . . . . . . . . . . . . 70
4.13.5. Acceptance testing team (UAT) . . . . . . . . . . . . . . . . . . . . . . . 71
4.13.6. Technical test teams (TT) . . . . . . . . . . . . . . . . . . . . . . . . . . . 71
4.14. Insourcing and outsourcing challenges . . . . . . . . . . . . . . . . . . . . . . 72
4.14.1. Internalization and collocation . . . . . . . . . . . . . . . . . . . . . . . . 72
4.14.2. Near outsourcing . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 73
4.14.3. Geographically distant outsourcing . . . . . . . . . . . . . . . . . . . . . 74

Chapter 5. Test Workload Estimation . . . . . . . . . . . . . . . . . . . . . . . . 75


5.1. Difficulty to estimate workload. . . . . . . . . . . . . . . . . . . . . . . . . . . 75
5.2. Evaluation techniques . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 76
5.2.1. Experience-based estimation . . . . . . . . . . . . . . . . . . . . . . . . . . 76
5.2.2. Based on function points or TPA . . . . . . . . . . . . . . . . . . . . . . . 77
5.2.3. Requirements scope creep . . . . . . . . . . . . . . . . . . . . . . . . . . . 79
5.2.4. Estimations based on historical data. . . . . . . . . . . . . . . . . . . . . . 80
5.2.5. WBS or TBS . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 80
5.2.6. Agility, estimation and velocity . . . . . . . . . . . . . . . . . . . . . . . . 81
5.2.7. Retroplanning . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 82
5.2.8. Ratio between developers – testers . . . . . . . . . . . . . . . . . . . . . . 82
5.2.9. Elements influencing the estimate . . . . . . . . . . . . . . . . . . . . . . . 83
5.3. Test workload overview . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 85
5.3.1. Workload assessment verification and validation . . . . . . . . . . . . . . 86
5.3.2. Some values . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 86
5.4. Understanding the test workload . . . . . . . . . . . . . . . . . . . . . . . . . . 87
5.4.1. Component coverage . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 87
5.4.2. Feature coverage . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 88
5.4.3. Technical coverage . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 88

5.4.4. Test campaign preparation . . . . . . . . . . . . . . . . . . . . . . . . . . . 89


5.4.5. Running test campaigns . . . . . . . . . . . . . . . . . . . . . . . . . . . . 89
5.4.6. Defects management . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 90
5.5. Defending our test workload estimate . . . . . . . . . . . . . . . . . . . . . . . 91
5.6. Multi-tasking and crunch . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 92
5.7. Adapting and tracking the test workload. . . . . . . . . . . . . . . . . . . . . . 92

Chapter 6. Metrics, KPI and Measurements . . . . . . . . . . . . . . . . . . . . 95


6.1. Selecting metrics . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 96
6.2. Metrics precision. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 97
6.2.1. Special case of the cost of defects . . . . . . . . . . . . . . . . . . . . . . 97
6.2.2. Special case of defects . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 98
6.2.3. Accuracy or order of magnitude? . . . . . . . . . . . . . . . . . . . . . . . 98
6.2.4. Measurement frequency . . . . . . . . . . . . . . . . . . . . . . . . . . . . 99
6.2.5. Using metrics . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 99
6.2.6. Continuous improvement of metrics . . . . . . . . . . . . . . . . . . . . . 100
6.3. Product metrics . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 101
6.3.1. FTR: first time right . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 101
6.3.2. Coverage rate . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 102
6.3.3. Code churn . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 103
6.4. Process metrics . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 104
6.4.1. Effectiveness metrics . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 104
6.4.2. Efficiency metrics . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 107
6.5. Definition of metrics. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 108
6.5.1. Quality model metrics . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 109
6.6. Validation of metrics and measures . . . . . . . . . . . . . . . . . . . . . . . . 110
6.6.1. Baseline . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 110
6.6.2. Historical data . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 111
6.6.3. Periodic improvements . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 112
6.7. Measurement reporting . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 112
6.7.1. Internal test reporting . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 113
6.7.2. Reporting to the development team . . . . . . . . . . . . . . . . . . . . . . 114
6.7.3. Reporting to the management . . . . . . . . . . . . . . . . . . . . . . . . . 114
6.7.4. Reporting to the clients or product owners . . . . . . . . . . . . . . . . . . 115
6.7.5. Reporting to the direction and upper management . . . . . . . . . . . . . . 116

Chapter 7. Requirements Management . . . . . . . . . . . . . . . . . . . . . . . 119


7.1. Requirements documents . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 119
7.2. Qualities of requirements . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 120
7.3. Good practices in requirements management . . . . . . . . . . . . . . . . . . . 122
7.3.1. Elicitation . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 122
7.3.2. Analysis . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 123

7.3.3. Specifications . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 123


7.3.4. Approval and validation . . . . . . . . . . . . . . . . . . . . . . . . . . . . 124
7.3.5. Requirements management . . . . . . . . . . . . . . . . . . . . . . . . . . 124
7.3.6. Requirements and business knowledge management . . . . . . . . . . . . 125
7.3.7. Requirements and project management . . . . . . . . . . . . . . . . . . . . 125
7.4. Levels of requirements . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 126
7.5. Completeness of requirements . . . . . . . . . . . . . . . . . . . . . . . . . . . 126
7.5.1. Management of TBDs and TBCs . . . . . . . . . . . . . . . . . . . . . . . 126
7.5.2. Avoiding incompleteness. . . . . . . . . . . . . . . . . . . . . . . . . . . . 127
7.6. Requirements and agility . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 127
7.7. Requirements issues . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 128

Chapter 8. Defects Management . . . . . . . . . . . . . . . . . . . . . . . . . . . . 129


8.1. Defect management, MOA and MOE . . . . . . . . . . . . . . . . . . . . . . . 129
8.1.1. What is a defect? . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 129
8.1.2. Defects and MOA . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 130
8.1.3. Defects and MOE . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 130
8.2. Defect management workflow . . . . . . . . . . . . . . . . . . . . . . . . . . . 131
8.2.1. Example . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 131
8.2.2. Simplify . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 132
8.3. Triage meetings . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 133
8.3.1. Priority and severity of defects . . . . . . . . . . . . . . . . . . . . . . . . 133
8.3.2. Defect detection . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 134
8.3.3. Correction and urgency . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 135
8.3.4. Compliance with processes . . . . . . . . . . . . . . . . . . . . . . . . . . 136
8.4. Specificities of TDDs, ATDDs and BDDs . . . . . . . . . . . . . . . . . . . . 136
8.4.1. TDD: test-driven development. . . . . . . . . . . . . . . . . . . . . . . . . 136
8.4.2. ATDD and BDD . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 137
8.5. Defects reporting. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 138
8.5.1. Defects backlog management . . . . . . . . . . . . . . . . . . . . . . . . . 139
8.6. Other useful reporting . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 141
8.7. Don’t forget minor defects . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 141

Chapter 9. Configuration Management . . . . . . . . . . . . . . . . . . . . . . . 143


9.1. Why manage configuration? . . . . . . . . . . . . . . . . . . . . . . . . . . . . 143
9.2. Impact of configuration management . . . . . . . . . . . . . . . . . . . . . . . 144
9.3. Components . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 145
9.4. Processes . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 145
9.5. Organization and standards . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 146
9.6. Baseline or stages, branches and merges . . . . . . . . . . . . . . . . . . . . . 147
9.6.1. Stages . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 148
9.6.2. Branches . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 148

9.6.3. Merge . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 148


9.7. Change control board (CCB) . . . . . . . . . . . . . . . . . . . . . . . . . . . . 149
9.8. Delivery frequencies . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 149
9.9. Modularity . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 150
9.10. Version management . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 150
9.11. Delivery management . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 151
9.11.1. Preparing for delivery . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 153
9.11.2. Delivery validation . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 154
9.12. Configuration management and deployments . . . . . . . . . . . . . . . . . . 155

Chapter 10. Test Tools and Test Automation . . . . . . . . . . . . . . . . . . . 157


10.1. Objectives of test automation . . . . . . . . . . . . . . . . . . . . . . . . . . . 157
10.1.1. Find more defects . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 158
10.1.2. Automating dynamic tests . . . . . . . . . . . . . . . . . . . . . . . . . . 159
10.1.3. Find all regressions . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 160
10.1.4. Run test campaigns faster . . . . . . . . . . . . . . . . . . . . . . . . . . . 161
10.2. Test tool challenges . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 161
10.2.1. Positioning test automation . . . . . . . . . . . . . . . . . . . . . . . . . . 162
10.2.2. Test process analysis . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 162
10.2.3. Test tool integration . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 162
10.2.4. Qualification of tools . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 163
10.2.5. Synchronizing test cases . . . . . . . . . . . . . . . . . . . . . . . . . . . 164
10.2.6. Managing test data . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 164
10.2.7. Managing reporting (level of trust in test tools). . . . . . . . . . . . . . . 165
10.3. What to automate? . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 165
10.4. Test tooling . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 166
10.4.1. Selecting tools . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 167
10.4.2. Computing the return on investment (ROI) . . . . . . . . . . . . . . . . . 169
10.4.3. Avoiding abandonment of tools and automation . . . . . . . . . . . . . . 169
10.5. Automated testing strategies. . . . . . . . . . . . . . . . . . . . . . . . . . . . 170
10.6. Test automation challenge for SoS . . . . . . . . . . . . . . . . . . . . . . . . 171
10.6.1. Mastering test automation . . . . . . . . . . . . . . . . . . . . . . . . . . 171
10.6.2. Preparing test automation . . . . . . . . . . . . . . . . . . . . . . . . . . . 173
10.6.3. Defect injection/fault seeding . . . . . . . . . . . . . . . . . . . . . . . . 173
10.7. Typology of test tools and their specific challenges . . . . . . . . . . . . . . . 174
10.7.1. Static test tools versus dynamic test tools . . . . . . . . . . . . . . . . . . 175
10.7.2. Data-driven testing (DDT) . . . . . . . . . . . . . . . . . . . . . . . . . . 176
10.7.3. Keyword-driven testing (KDT). . . . . . . . . . . . . . . . . . . . . . . . 176
10.7.4. Model-based testing (MBT) . . . . . . . . . . . . . . . . . . . . . . . . . 177
10.8. Automated regression testing . . . . . . . . . . . . . . . . . . . . . . . . . . . 178
10.8.1. Regression tests in builds . . . . . . . . . . . . . . . . . . . . . . . . . . . 178
10.8.2. Regression tests when environments change . . . . . . . . . . . . . . . . 179

10.8.3. Prevalidation regression tests, sanity checks and smoke tests . . . . . . . 179
10.8.4. What to automate? . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 180
10.8.5. Test frameworks . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 182
10.8.6. E2E test cases . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 183
10.8.7. Automated test case maintenance or not? . . . . . . . . . . . . . . . . . . 184
10.9. Reporting . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 185
10.9.1. Automated reporting for the test manager . . . . . . . . . . . . . . . . . . 186

Chapter 11. Standards and Regulations . . . . . . . . . . . . . . . . . . . . . . . 187


11.1. Definition of standards . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 189
11.2. Usefulness and interest. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 189
11.3. Implementation . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 190
11.4. Demonstration of compliance – IADT . . . . . . . . . . . . . . . . . . . . . . 190
11.5. Pseudo-standards and good practices . . . . . . . . . . . . . . . . . . . . . . . 191
11.6. Adapting standards to needs . . . . . . . . . . . . . . . . . . . . . . . . . . . . 191
11.7. Standards and procedures . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 192
11.8. Internal and external coherence of standards. . . . . . . . . . . . . . . . . . . 192

Chapter 12. Case Study . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 195


12.1. Case study: improvement of an existing complex system . . . . . . . . . . . 195
12.1.1. Context and organization . . . . . . . . . . . . . . . . . . . . . . . . . . . 196
12.1.2. Risks, characteristics and business domains . . . . . . . . . . . . . . . . 198
12.1.3. Approach and environment . . . . . . . . . . . . . . . . . . . . . . . . . . 200
12.1.4. Resources, tools and personnel . . . . . . . . . . . . . . . . . . . . . . . . 210
12.1.5. Deliverables, reporting and documentation . . . . . . . . . . . . . . . . . 212
12.1.6. Planning and progress . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 213
12.1.7. Logistics and campaigns . . . . . . . . . . . . . . . . . . . . . . . . . . . 216
12.1.8. Test techniques . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 217
12.1.9. Conclusions and return on experience . . . . . . . . . . . . . . . . . . . . 218

Chapter 13. Future Testing Challenges . . . . . . . . . . . . . . . . . . . . . . . 223


13.1. Technical debt . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 223
13.1.1. Origin of the technical debt . . . . . . . . . . . . . . . . . . . . . . . . . . 224
13.1.2. Technical debt elements . . . . . . . . . . . . . . . . . . . . . . . . . . . 225
13.1.3. Measuring technical debt . . . . . . . . . . . . . . . . . . . . . . . . . . . 226
13.1.4. Reducing technical debt. . . . . . . . . . . . . . . . . . . . . . . . . . . . 227
13.2. Systems-of-systems specific challenges . . . . . . . . . . . . . . . . . . . . . 228
13.3. Correct project management. . . . . . . . . . . . . . . . . . . . . . . . . . . . 229
13.4. DevOps . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 230
13.4.1. DevOps ideals . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 231
13.4.2. DevOps-specific challenges . . . . . . . . . . . . . . . . . . . . . . . . . 231
13.5. IoT (Internet of Things) . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 232

13.6. Big Data . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 233


13.7. Services and microservices . . . . . . . . . . . . . . . . . . . . . . . . . . . . 234
13.8. Containers, Docker, Kubernetes, etc. . . . . . . . . . . . . . . . . . . . . . . . 235
13.9. Artificial intelligence and machine learning (AI/ML). . . . . . . . . . . . . . 235
13.10. Multi-platforms, mobility and availability . . . . . . . . . . . . . . . . . . . 237
13.11. Complexity . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 238
13.12. Unknown dependencies . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 238
13.13. Automation of tests . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 239
13.13.1. Unrealistic expectations . . . . . . . . . . . . . . . . . . . . . . . . . . . 240
13.13.2. Difficult to reach ROI . . . . . . . . . . . . . . . . . . . . . . . . . . . . 241
13.13.3. Implementation difficulties . . . . . . . . . . . . . . . . . . . . . . . . . 242
13.13.4. Think about maintenance . . . . . . . . . . . . . . . . . . . . . . . . . . 243
13.13.5. Can you trust your tools and your results? . . . . . . . . . . . . . . . . . 244
13.14. Security . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 245
13.15. Blindness or cognitive dissonance . . . . . . . . . . . . . . . . . . . . . . . . 245
13.16. Four truths . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 246
13.16.1. Importance of Individuals . . . . . . . . . . . . . . . . . . . . . . . . . . 247
13.16.2. Quality versus quantity . . . . . . . . . . . . . . . . . . . . . . . . . . . 247
13.16.3. Training, experience and expertise . . . . . . . . . . . . . . . . . . . . . 248
13.16.4. Usefulness of certifications . . . . . . . . . . . . . . . . . . . . . . . . . 248
13.17. Need to anticipate . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 249
13.18. Always reinvent yourself . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 250
13.19. Last but not least . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 250

Terminology . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 253

References . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 261

Index. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 267

Summary of Volume 1 . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 269


Dedication and Acknowledgments

Inspired by a dedication from Boris Beizer¹, I dedicate these two books to the many very bad software and systems-of-systems development projects where I had the opportunity to – for a short time – act as a consultant. They taught me multiple lessons on the difficulties that these books try to identify and led me to realize the need for this book. Their failure could have been prevented; may they rest in peace.

I would also like to thank the many managers and colleagues I had the privilege
of meeting during my career. Some, too few, understood that quality is really
everyone’s business. We will lay a modest shroud over the others.

Finally, paraphrasing Isaac Newton: if I have been able to reach this level of knowledge, it is thanks to all the giants who came before me and on whose shoulders I could stand. Among these giants, I would like to mention (in alphabetical order) James Bach, Boris Beizer, Rex Black, Frederick Brooks, Hans Buwalda, Ross Collard, Elfriede Dustin, Avner Engel, Tom Gilb, Eliyahu Goldratt, Dorothy Graham, Capers Jones, Paul Jorgensen, Cem Kaner, Brian Marick, Edward Miller, John Musa, Glenford Myers, Bret Pettichord, Johanna Rothman, Gerald Weinberg, James Whittaker and Karl Wiegers.

After 15 years in software development, I had the opportunity to focus on software testing for over 25 years. Specialized in testing process improvements, I founded and participated in the creation of multiple associations focused on software testing: AST (Association for Software Testing), ISTQB (International Software Testing Qualifications Board), CFTL (Comité Français des Tests Logiciels, the French Software Testing Committee) and GASQ (Global Association for Software Quality). I also dedicate these books to you, the reader, so that you can improve your testing competencies.

1. Beizer, B. (1990). Software Testing Techniques, 2nd edition. ITP Media.
Preface

Implementation

In the first part of these two books on systems-of-systems testing, we identified the impacts of software development cycles, testing strategies and methodologies, and we saw the benefit of using a quality referential and the importance of test documentation and reporting. We identified the impact of test levels and test techniques, whether static or dynamic. We ended with an approach to test project management that allowed us to identify that human actors and their interactions are essential elements that must be considered.

In this second part of the book on systems-of-systems testing, we will focus on more practical aspects, such as managing test projects, testing processes and how to improve them continuously. We will see additional but necessary processes, such as the management of requirements, defects and configurations, and we will also see a case study allowing us to ask ourselves several useful questions. We will end with a perilous prediction exercise by listing the challenges that testing will have to face in the years to come.

August 2022
1

Test Project Management

We do not claim to replace the many contributions of illustrious authors on good practices in project management. Standards such as PMBOK (PMI 2017) or CMMI and methodologies such as ITIL and PRINCE2 comprehensively describe the tasks, best practices and other activities recommended to properly manage projects. We focus on certain points associated with the testing of software, components, products and systems within systems-of-systems projects.

At the risk of writing a tautology, the purpose of project management is to manage projects, that is, to define the tasks and actions necessary to achieve the objectives of these projects. The purpose, the ultimate objective of the project, takes precedence over any other aspect, even if the budgetary and time constraints are significant. To limit the risks associated with systems-of-systems, the quality of the deliverables is very important, and therefore tests (verifications and validations that the objective of the project has been achieved) are necessary.

Project management must ensure that development methodologies are correctly implemented (see Chapter 2) to avoid inconsistencies. Similarly, project management must provide all stakeholders with an image of the risks and the progress of the system-of-systems, its dependencies and the actions to be taken in the short and medium term, in order to anticipate the potential hazards.

1.1. General principles

Management of test projects, whether on components, products, systems or systems-of-systems, has a particularity that other projects do not have: they depend – for their deadlines, scope and level of quality – on other parts of the projects: the development phases. Requirements are often unstable, information arrives late, deadlines shrink because they depend on developments that evolve and overrun, the scope initially considered increases, the quality of input data – requirements, components to be tested, interfaces – is often lower than expected and the number of faults or anomalies is greater than anticipated. All of this occurs under tighter budgetary and calendar constraints because, even if the developments take longer than expected, the production launch date is rarely postponed.

The methodologies offered by ITIL, PRINCE2, CMMI, etc. bring together a set
of good practices that can be adapted – or not – to our system-of-systems project.
CMMI, for example, does not have test-specific elements (only IVV), and it may be
necessary to supplement CMMI with test-specific tasks and actions as offered by
TMM and TMMi.

Let us see the elements specific to software testing projects.

1.1.1. Quality of requirements

Any development translates requirements (needs or business objectives) into a component, product or system that will implement them. In an Agile environment, requirements are defined in the form of User Stories, Features or Epics. The requirements can be described in so-called specification documents (e.g. General Specifications Document or Detailed Specifications Document). Requirements are primarily functional – they describe expected functionality – but can be technical or non-functional. We can classify the requirements according to the quality characteristics they cover, as proposed in Chapter 5 of Volume 1 (Homès 2022a).

Requirements are provided to development teams as well as test teams. Production teams – design, development, etc. – use these requirements to develop components, products or systems and may propose or request adaptations of these requirements. Test teams use requirements to define, analyze and implement, or even automate, test cases and test scenarios to validate these requirements. These test teams must absolutely be informed – as soon as possible – of any change in the requirements to proceed with the modifications of the tests.

The requirements must be SMART, that is:


– Specific: the requirements must be clear, there must be no ambiguity and the
requirements must be simple, consistent and with an appropriate level of detail.
– Measurable: it must be possible, when the component, product or system is
designed, to verify that the requirement has been met. This is directly necessary for
the design of tests and metrics to verify the extent to which requirements are met.
– Achievable: the requirements must be able to be physically demonstrated under given conditions. If the requirements are not achievable (e.g. the system will have 100% reliability and 100% availability), the result will be that the component, product or system will never be accepted or will be cost-prohibitive. Achievable also implies that the requirement can be developed in a specific time frame.
– Realistic: in the context of software development – and testing – is it possible
to achieve the requirement for the component, product or system, taking into
account the constraints in which the project is developed? We add to this aspect the
notion of time: are the requirements achievable in a realistic time?
– Traceable: requirements traceability is the ability to follow a requirement from
its design to its specification, its realization and its implementation to its test, as well
as in the other direction (from the test to the specification). This helps to understand
why a requirement was specified and to ensure that each requirement has been
correctly implemented.

1.1.2. Completeness of deliveries

The completeness of the software, components, products, equipment and systems delivered for the tests is obviously essential. If the elements delivered are incomplete, it will be necessary to come back to them to modify and complete them, which will increase the risk of introducing anomalies.

This aspect of completeness is ambiguous in incremental and iterative methodologies. On the one hand, it is recommended to deliver small increments; on the other hand, waste (“losses”) should be eliminated. Small increments imply partial deliveries of functionality, and thus the generation of “losses” regarding both releases and testing (e.g. regression testing) – in fact, all the overhead related to these multiple releases and multiple test runs to be performed on these components. Any evolution within the framework of an iteration will lead to a modification of the functionalities and therefore a change compared to the results obtained during the previous iterations.

1.1.3. Availability of test environments

The execution of the tests is carried out in different test environments according
to the test levels envisaged. It will therefore be necessary to ensure the availability
of environments for each level.

The test environment is not limited to a machine on which the software component is executed. It also includes the settings necessary for the proper execution of the component, the test data and other applications – in the appropriate versions – with which the component interacts.

Test environments, as well as their data and the applications they interface with
must be properly synchronized with each other. This implies an up-to-date definition
of the versions of each system making up the system-of-systems and of the
interfaces and messages exchanged between them.

Automating backups and restores of test environments allows testers to self-manage their environments so that they are not a burden on production systems management teams.

In DevOps environments, it is recommended to enable automatic creation of environments to test builds as they are created by developers. As proposed by Kim et al. (2016), it is necessary to be able to recreate the test environments automatically rather than trying to repair them. This automatic creation ensures a test environment identical to the previous one, which will facilitate regression testing.
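
As an illustration only, the following sketch shows one way such automatic recreation could be wired into a test session. It assumes – purely for the example – that the environment is described by a Docker Compose file named docker-compose.test.yml and that pytest is used; a real project would plug in its own provisioning tooling.

# Minimal sketch: recreate the test environment for a test session rather
# than repairing it. Assumes a hypothetical docker-compose.test.yml file
# describing the services needed by the component under test.
import subprocess
import pytest

COMPOSE_FILE = "docker-compose.test.yml"  # hypothetical file name

def _compose(*args):
    # Run a docker compose command and fail loudly if it does not succeed.
    subprocess.run(["docker", "compose", "-f", COMPOSE_FILE, *args], check=True)

@pytest.fixture(scope="session", autouse=True)
def fresh_test_environment():
    # Destroy any leftover environment (including volumes), then rebuild it
    # from scratch so that every campaign starts from a known, identical state.
    _compose("down", "--volumes", "--remove-orphans")
    _compose("up", "--detach", "--wait")
    yield
    # Tear the environment down again at the end of the session.
    _compose("down", "--volumes")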

1.1.4. Availability of test data

It is obvious that the input test data of a test case and the expected data at the
output of a test case are necessary, and it is also important to have a set of other data
that will be used for testing:
– data related to the users who will run the tests (e.g. authorization level,
hierarchical level, organization to which they are attached, etc.);
– information related to the test data used (e.g. technical characteristics, composition, functionalities present, etc.), which is grouped in legacy systems interfaced with the system-of-systems under test;
– historical information allowing us to make proposals based on it (e.g. purchase suggestions based on previous purchases);
– information based on geographical positioning (e.g. GPS position), supply times and consumption volumes to anticipate stock replenishment needs (e.g. the need to refill the fuel tank according to driving style and fuel consumption, making it possible to offer – depending on the route and GPS information – one or more nearby service stations);
– etc.

The creation and provision of quality test data is necessary before any test
campaign. Designing and updating this data, ensuring that it is consistent, is
extremely important because it must – as far as possible – simulate the reality of the
exchanges and information of each of the systems of the system-of-systems to be
tested. We will therefore need to generate data from monitoring systems (from
sensors, via IoT systems) and ensure that their production respects the expected
constraints (e.g. every n seconds, in order to identify connection losses or deviations
from nominal operating ranges).

Test data should be realistic and consistent over time. That is, they must either
simulate a reference period and each of the campaigns must ensure that the systems
have modified their reference date (e.g. use a fixed range of hours and reset systems
at the beginning of this range) or be consistent with the time of execution of the test
campaign. This last solution requires generating the test data during the execution of
the test campaign, in order to verify the consistency of the data with respect to the
expected (e.g. identification of duplicate messages, sequencing of messages, etc.)
and therefore the proper functioning of the system-of-systems as a whole.
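
As a simple illustration of this last point, the sketch below generates timestamped, sequenced sensor-like messages at a fixed period and then checks them for duplicates, missing messages and out-of-range values. The message structure, the period and the nominal range are assumptions made for the example, not the book's data model.

# Minimal sketch: generate time-consistent sensor test data and check it.
# The message fields, the 5-second period and the 10-80 nominal range are
# illustrative assumptions only.
import datetime as dt
import random

def generate_messages(count, period_s=5, start=None):
    # Produce sequenced, timestamped measurements, as an IoT sensor might.
    start = start or dt.datetime.now(dt.timezone.utc)
    return [
        {
            "sequence": i,
            "timestamp": start + dt.timedelta(seconds=i * period_s),
            "value": random.uniform(10, 80),
        }
        for i in range(count)
    ]

def check_messages(messages, low=10, high=80):
    # Report duplicates, missing sequence numbers and out-of-range values,
    # the kinds of deviations the test campaign must be able to detect.
    issues = []
    seen = set()
    for msg in messages:
        if msg["sequence"] in seen:
            issues.append(f"duplicate message {msg['sequence']}")
        seen.add(msg["sequence"])
        if not low <= msg["value"] <= high:
            issues.append(f"value out of nominal range in message {msg['sequence']}")
    expected = set(range(max(seen) + 1)) if seen else set()
    for missing in sorted(expected - seen):
        issues.append(f"possible connection loss: message {missing} missing")
    return issues

if __name__ == "__main__":
    data = generate_messages(20)
    print(check_messages(data) or "dataset consistent")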

1.1.5. Compliance of deliveries and schedules

Development and construction projects are associated with often strict delivery
dates and schedules. A late delivery of a component generates cascading effects that impact the delivery of the system and the system-of-systems.
Timely delivery, with the expected features and the desired level of quality, is
therefore very important. In some systems-of-systems, the completeness of the
functionalities and their level of quality are often more important than the respect of
the delivery date. In others, respecting the schedule is crucial in order to meet
imperatives (e.g. launch window for a rocket aiming for another planet).

Test projects depend on the delivery of requirements and components to be tested within a specific schedule. Indeed, testers can only design tests based on the requirements, user stories and features delivered to them and can only run tests on the components, products and systems delivered to them in the appropriate test environments (i.e. including the necessary data and systems). The timely delivery of deliverables (contracts, requirements documents, specifications, features, user stories, etc.) and of components, products and systems in a usable state – that is, with the expected and working information or functionality – is crucial, or testers will not be able to perform their tasks properly.

This involves close collaboration between test manager and project managers in
charge of the design and production of components, products or systems to be
tested, as well as managers in charge of test environments and the supply of test
data.

In the context of Agile and Lean methods, any delay in deliveries and any
non-compliance with schedules is a “loss of value” and should be eliminated. It is
however important to note that the principles of agility propose that it is the
development teams that define the scope of the functionalities to be delivered at each
iteration.

1.1.6. Coordinating and setting up environments

Depending on the test levels, environments will include more and more components, products and systems that will need to be coordinated to form test environments representative of real life. Each environment includes one or more
systems, components, products, as well as interfaces, ETLs and communication
equipment (wired, wireless, satellite, optical networks, etc.) of increasing
complexity. The design of these various environments quickly becomes a full-time
job, especially since it is necessary to ensure that all the versions of all the software
are correctly synchronized and that all the data, files, contents of databases and
interfaces are synchronized and validated in order to allow the correct execution of
the tests on this environment.

The activity of coordinating and setting up environments interacts strongly with all the other projects participating in the realization of the system-of-systems. Some test environments will only be able to simulate part of the target environment (e.g. simulation of space vacuum and sunlight with no ability to simulate zero gravity), and therefore there may be, for the same test level, several test execution campaigns, each on different technical or functional domains.

1.1.7. Validation of prerequisites – Test Readiness Review (TRR)

Testing activities can start effectively and efficiently as soon as all their
prerequisites are present. Otherwise, the activities will have to stop and then start
again when the missing prerequisite is provided, etc. This generates significant
waste of time, not to mention everyone’s frustration. Before starting any test task,
we must make sure that all the prerequisites are present, or at the very least that they
will arrive on time with the desired level of quality. Among the prerequisites, we
have among others the requirements, the environment, the datasets, the component
to be tested, the test cases with the expected data, as well as the testers, the tools and
procedures for managing tests and anomalies, the KPIs and metrics allowing the
reporting of the progress of the tests, etc.

One solution to ensure the presence of the prerequisites is to set up a TRR (Test
Readiness Review) milestone, a review of the start of the tests. The purpose of this
milestone is to verify – depending on the test level and the types of test – whether or
not the prerequisites are present. If prerequisites are missing, it is up to the project
managers to decide whether or not to launch the test activity, taking into account the
identified risks.

In Agile methods, such a review can be informal and only apply to one user story
at a time, with the acronym DOR for definition of ready.
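
As an illustration only, the prerequisites listed above can be captured in a simple checklist evaluated at the TRR milestone. The item names below are assumptions for the example; each project defines its own list.

# Minimal sketch of a Test Readiness Review (TRR) checklist. The item names
# are illustrative assumptions; each project defines its own prerequisites.
TRR_CHECKLIST = [
    "requirements approved and stable",
    "test environment available and synchronized",
    "test datasets delivered and validated",
    "component to be tested delivered",
    "test cases designed with expected results",
    "testers, tools and defect management procedures in place",
    "KPIs and metrics defined for progress reporting",
]

def test_readiness_review(status):
    # status maps each checklist item to True (present) or False (missing).
    missing = [item for item in TRR_CHECKLIST if not status.get(item, False)]
    if missing:
        # The project managers decide whether to start anyway, based on risk.
        print("TRR: missing prerequisites ->", ", ".join(missing))
        return False
    print("TRR: all prerequisites present, the test level may start")
    return True

# Example usage: every prerequisite reported as present.
test_readiness_review({item: True for item in TRR_CHECKLIST})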

1.1.8. Delivery of datasets (TDS)

The delivery of test datasets (TDS) is not limited to the provision of files or
databases with information usable by the component, product or system. This also
includes – for the applications, components, products or systems with which the
component, product or system under test interacts – a check of the consistency and
synchronization of the data with each other. It will be necessary to ensure that the
interfaces are correctly described, defined and implemented.

Backup of datasets or automation of dataset generation processes may be necessary to allow testers to generate the data they need themselves.

The design of coherent and complete datasets is a difficult task requiring a good
knowledge of the entire information system and the interfaces between the
component, product or system under test on the one hand and all the other systems
of the test environment on the other hand. Some components, products or systems
may be missing and replaced by “stubs” that will simulate the missing elements. In
this case, it is necessary to manage these “stubs” with the same rigor as if they were
real components (e.g. evolution of versions, data, etc.).
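
By way of illustration, a “stub” can be as simple as a small service that answers in place of the missing system, carrying its own version identifier so that it can be managed in configuration like a real component. The endpoint, port and response content below are assumptions made for the example.

# Minimal sketch of a stub standing in for a missing system. The /inventory
# endpoint, the port and the canned response are illustrative assumptions.
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

STUB_VERSION = "stub-inventory-1.2.0"  # versioned like a real component

class InventoryStub(BaseHTTPRequestHandler):
    def do_GET(self):
        # Answer any /inventory request with a fixed, documented dataset.
        if self.path.startswith("/inventory"):
            body = json.dumps({"stub_version": STUB_VERSION,
                               "items": [{"id": "A1", "quantity": 42}]}).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_response(404)
            self.end_headers()

if __name__ == "__main__":
    # Run the stub locally so the system under test can reach it during tests.
    HTTPServer(("localhost", 8080), InventoryStub).serve_forever()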

1.1.9. Go-NoGo decision – Test Review Board (TRB)

A Go-NoGo meeting is used to analyze the risks associated with moving to the
next step in a process of designing and deploying a component, product, system or
system-of-systems, and to decide whether to proceed to the next step.

This meeting is sometimes split into two reviews over time:

– A TRB (Test Review Board) meeting analyzes the results of the tests carried out at the level and determines the actions to take according to these results. This technical meeting ensures that the planned objectives have been achieved for the level.
– A management review to obtain – from the hierarchy, the other stakeholders, the MOA and the customers – a decision (the “Go” or the “NoGo” decision) accepted by all, with consideration of business risks, marketing, etc.

The Go-NoGo meeting includes representatives from all business stakeholders, such as operations managers, deployment teams, production teams and marketing teams.

In an Agile environment, the concept of Go-NoGo and TRB is detailed under the
concept of DOD (definition of done) for each of the design actions.

1.1.10. Continuous delivery and deployment

The concept of continuous integration and continuous delivery (CI/CD) is interesting and deserves to be considered in systems-of-systems where software is preponderant. However, such concepts have particular constraints that we must study, beyond the use of an Agile design methodology.

1.1.10.1. Continuous delivery


The continuous delivery practices mentioned in Kim et al. (2016) focus primarily on the aspects of continuous delivery and deployment of software that depend on automated testing, performed to ensure developers have quick (immediate) feedback on the defect, performance, security and usability concerns of the components placed under configuration management. In addition, the principle is to have a limited number of configuration branches.

In the context of systems-of-systems, where hardware components and subsystems including software must be physically installed – and tested on physical test benches – the ability to deliver daily and ensure the absence of regressions becomes more complex, if not impossible, to implement. This is all the more true since the systems-of-systems are not produced in large quantities and the interactions are complex.

1.1.10.2. Continuous testing


On-demand execution of tests as part of continuous delivery is possible for unit
testing and static testing of code. Software integration testing could be considered,
but anything involving end-to-end (E2E) testing becomes more problematic because
installing the software on the hardware component should generate a change in the
configuration reference of the hardware component.

Among the elements to consider, we have an ambiguity of terminology: the term ATDD (Acceptance Test-Driven Development) relates to the acceptance of the software component alone, not to its integration, nor to the acceptance of the system-of-systems, the subsystem or the equipment.

Another aspect to consider is the need for test automation and (1) the continued
increase in the number of tests to be executed, which will mean increasing test
execution time as well as (2) the need to ensure that the test classes in the software
(case of TDD and BDD) are correctly removed from the versions used in integration
tests and in system tests.
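
As an illustration, one simple way to keep such test classes out of the delivered versions is to exclude test packages when the component is packaged. The sketch below assumes a setuptools-based Python build with hypothetical package names; other build technologies offer equivalent exclusion mechanisms.

# Minimal sketch: keep test code out of the versions delivered to integration
# and system testing by excluding test packages at build time. The component
# name, version and package names are illustrative assumptions.
from setuptools import setup, find_packages

setup(
    name="example-component",
    version="1.4.0",
    # Ship only production packages; unit test and BDD step packages stay out.
    packages=find_packages(exclude=["tests", "tests.*", "features", "features.*"]),
)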

One of the temptations associated with testing in a CI/CD or DevOps environment is to pool the tests of the various software components into a single test batch for the release, instead of processing the tests separately for each component. This solution makes it possible to pool the regression tests of software components, but poses a difficult practical problem for the qualification of systems-of-systems, as mentioned in Sacquet and Rochefolle (2016).

1.1.10.3. Continuous deployment


Continuous deployment depends on continuous delivery, and therefore on automated validation of tests, the presence of complete documentation – for component usage and administration – as well as the ability to run end-to-end tests on an environment representative of production.

According to Kim et al. (2016), in companies like Amazon and Google, the
majority of teams practice continuous delivery and some practice continuous
deployment. There is wide variation in how to perform continuous deployment.

1.2. Tracking test projects

Monitoring test projects requires monitoring the progress of each of the test
activities for each of the systems of the system-of-systems, as well as on each of the
test environments of each of the test levels of each of these systems. It is therefore
important that the progress information of each test level is aggregated and
summarized for each system and that the test progress information of each system is
aggregated at the system-of-systems level. This involves defining the elements that
must be measured (the progress), against which benchmark they must be measured
(the reference) and identifying the impacts (dependencies) that this can generate.
Reporting of similar indicators from each of the systems will facilitate
understanding. Automated information feedback will facilitate information retrieval.
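
To illustrate this aggregation, the short sketch below rolls test progress up from test levels to systems and then to the system-of-systems; the system names, levels and figures are invented for the example.

# Minimal sketch: aggregate test progress per test level, per system, and
# for the whole system-of-systems. All names and numbers are illustrative.
progress = {
    "System A": {"unit": (120, 150), "integration": (40, 80), "system": (5, 60)},
    "System B": {"unit": (200, 200), "integration": (10, 50)},
}

def summarize(levels):
    # Sum (executed, planned) pairs across the test levels of one system.
    executed = sum(done for done, _ in levels.values())
    planned = sum(total for _, total in levels.values())
    return executed, planned, 100.0 * executed / planned

sos_executed = sos_planned = 0
for system, levels in progress.items():
    executed, planned, pct = summarize(levels)
    sos_executed += executed
    sos_planned += planned
    print(f"{system}: {executed}/{planned} test cases executed ({pct:.0f}%)")

print(f"System-of-systems: {sos_executed}/{sos_planned} "
      f"({100.0 * sos_executed / sos_planned:.0f}%)")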

1.3. Risks and systems-of-systems

Systems-of-systems projects are subject to more risk than other systems in that they may inherit upstream-level risks, and tolerance for risk may vary by organization and by the delivered product. In Figure 1.1, we can see that the further we advance in the design and production of components by the various organizations, the more risks accumulate, and organizations with a low risk tolerance will be more strongly impacted than others.

Figure 1.1. Different risk tolerance

In Figure 1.2, we can identify that an organization will be impacted by all the
risks it can inherit from upstream organizations and that it will impose risks on all
downstream organizations.

Figure 1.2. Inherited and imposed risks

We realize that risk management in systems-of-systems is significantly more
complex than in the case of complex systems and may need to be handled at
multiple levels (e.g. interactions between teams, between project managers or
between the managers – or leaders – of the organizations).
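The inheritance shown in Figure 1.2 can be pictured with a small sketch
(organization names and risk lists are purely illustrative assumptions): each
organization has to manage its own risks plus all those inherited from upstream,
and imposes the whole set on the organizations downstream.

```python
# Minimal sketch of the idea behind Figure 1.2: inherited and imposed risks.
chain = ["equipment supplier", "subsystem integrator", "system integrator", "SoS integrator"]

own_risks = {
    "equipment supplier":   ["late component delivery"],
    "subsystem integrator": ["interface mismatch"],
    "system integrator":    ["test environment unavailability"],
    "SoS integrator":       ["end-to-end performance shortfall"],
}

def risks_to_manage(organization):
    """All risks an organization faces: its own plus those inherited from upstream."""
    position = chain.index(organization)
    risks = []
    for upstream in chain[: position + 1]:
        risks.extend(own_risks[upstream])
    return risks

for organization in chain:
    print(f"{organization}: {risks_to_manage(organization)}")
```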

1.4. Particularities related to SoS

According to Firesmith (2014), several pitfalls should be avoided in the context
of systems-of-systems, including:
– inadequate system-of-systems test planning;
– unclear responsibilities, including liability limits;
– inadequate resources dedicated to system-of-systems testing;
– lack of clear systems-of-systems planning;
– insufficient or inadequate systems-of-systems requirements;
– inadequate support of individual systems and projects;
– inadequate cross-project defect management.

To this we can add:


– different quality requirements according to the participants/co-contractors,
including regarding the interpretation of regulatory obligations;
– the need to take long-term evolutions into account;
– the multiplicity of level versions (groupings of software working and delivered
together), multiple versions and environments;
– the fact that systems-of-systems are often unique developments.

1.5. Particularities related to SoS methodologies

Development methodologies generate different constraints and opportunities.


Sequential developments have demonstrated their effectiveness, but involve
rigidity and a lack of responsiveness if the context changes. Agility offers
better responsiveness, at the expense of a more restricted analysis phase and an
organization that does not guarantee that all the requirements will be developed.
The choice of a development methodology will therefore imply adaptations in the
management of the project and in the testing of the components of the
system-of-systems.

Iterative methodologies involve rapid delivery of components or parts of
components, followed by refinement phases if necessary. That implies that:
– The planned functionalities are not fully provided before the last delivery of
the component. Validation by the business may be delayed until the final delivery of
the component. This reduces the time for detecting and correcting anomalies and
can impact the final delivery of the component, product or system, or even the
system-of-systems.
– Side effects may appear on other components, so it will be necessary to retest
all components each time a component update is delivered. This retesting can be
limited to the components interacting directly with the modified component(s) or
extended to the entire system-of-systems, and it is recommended to automate it.
– The interfaces between components may not be developed simultaneously, and
therefore the tests of these interfaces may be delayed.

Sequential methodologies (e.g. V-cycle, Spiral, etc.) focus on a single delivery,
so any evolution – or need for clarification – of the requirements will have an
impact on lead time and workload, both in terms of development (redevelopment or
adaptation of components, products or systems) and in terms of testing (design
and execution of tests).

1.5.1. Components definition

Within the framework of sequential methodologies, the principle is to define the
components and deliver them finished and validated at the end of their design phase.
This involves a complete definition of each product or system component and the
interactions it has with other components, products or systems. These exhaustive
definitions will be used both for the design of the component, product or system and
for the design of the tests that will validate them.

1.5.2. Testing and quality assurance activities

It is not possible to envisage retesting all the combinations of data and actions of
the components of a level of a system-of-systems; this would generate a workload
disproportionate to the expected benefits. One solution is to verify that the design
and test processes have been correctly carried out, that the proofs of execution are
available and that the test activities – static and dynamic – have correctly covered
the objectives. These verification activities are the responsibility of the quality
assurance teams and are mainly based on available evidence (paper documentation,
execution logs, anomaly dashboards, etc.).

1.6. Particularities related to teams

In a test project, whether software testing or systems-of-systems testing, one
element to take into account is the management of team members and of their
relationships with each other and with the outside. This information is grouped
into what NASA calls CRM (Crew Resource Management). Developed in the
1970s–1980s, CRM is a mature discipline that applies to complex projects and is
ideal for decision-making processes in project management.

It is essential to:
– recognize the existence of a problem;
– define what the problem is;
– identify probable solutions;
– take the appropriate actions to implement a solution.

While CRM is mainly used where human error can have devastating effects, it is
important to take into account the lessons that CRM can bring to the
implementation of decision-making processes. Contrary to a common view, people
with responsibilities (managers and decision-makers) or with the most experience
are sometimes blinded by their vision of a solution and do not consider
alternative solutions. Among the points to keep in mind are communication between
the different members of the team, mutual respect – which entails listening to
the information provided – and then measuring the results of the solutions
implemented in order to ensure their effectiveness. Team members can all
communicate important information that will help the project succeed.

The specialization of the members of the project team, the confidence we have in
their skills and that they have in their experience, the management methods and
the constraints – contractual or otherwise – mean that the decision-making method
and the decisions made can be negatively impacted in the absence of this
technique. CRM has been successfully implemented in aeronautics and space, and
its lessons should be applied to complex projects.
2

Testing Process

Test processes are nested within the set of processes of a system-of-systems.


More specifically, they prepare and provide evidence to substantiate compliance
with requirements and provide feedback to project management on the progress of
test process activities. If CMMI is used, other process areas than VER and VAL will
be involved: PPQA (Process and Product Quality Assurance), PMC (Project
Monitoring and Control), REQM (Requirements Management), CM (Configuration
Management), TS (Technical Solution), MA (Measurement and Analysis), etc.

These processes will all be involved to some degree in the testing processes.
Indeed, the test processes will break down the requirements – whether or not they
are defined in documents describing the conformity needs – and the way in which
these requirements will be demonstrated (type of IADT proof); they will split the
types of demonstration according to the test and integration levels (system,
subsystem, sub-subsystem, component, etc.) and into static types (Analysis and
Inspection for static checks during design) or dynamic types (Demonstration and
Test during the integration and testing levels of subsystems, systems and
systems-of-systems). Similarly, the activities of the test process will report
information and progress metrics to project management (CMMI PMC, Project
Monitoring and Control process) and will be impacted by the decisions descending
from this management.
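As a minimal illustration of this breakdown (the identifiers, proof types and
levels below are assumptions, not the author's data), a compliance matrix can
record, for each requirement, the IADT proof type and the level that will produce
the evidence, so that the levels can coordinate and avoid demonstrating the same
requirement twice.

```python
# Minimal sketch: requirements mapped to IADT proof types and test/integration levels.
from dataclasses import dataclass

@dataclass
class VerificationEntry:
    requirement: str   # requirement identifier (assumed naming)
    proof_type: str    # "I", "A", "D" or "T"
    level: str         # equipment, subsystem, system or system-of-systems

compliance_matrix = [
    VerificationEntry("REQ-001", "A", "subsystem"),           # static analysis during design
    VerificationEntry("REQ-001", "T", "system"),               # dynamic test at system level
    VerificationEntry("REQ-002", "D", "system-of-systems"),    # end-to-end demonstration
]

def levels_covering(requirement):
    """Which levels already plan to provide evidence for a given requirement."""
    return {entry.level for entry in compliance_matrix if entry.requirement == requirement}

print(levels_covering("REQ-001"))  # the subsystem and system levels
```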

The processes described in this chapter apply to a test level and should be
repeated at each of the test levels, for each piece of software or each component
containing software. Any modification of the interfaces and/or the performance of
a component interacting with the component(s) under test will involve an analysis
of the impacts and, if necessary, an adaptation of the test activities (including
the tests themselves) and of the evidence to be provided to show the conformity
of the component (or system, subsystem, equipment or software) to its
requirements. Each test level should coordinate with the other levels to limit
the execution of tests on the same requirements.

The ISO/IEC/IEEE 29119-1 standard describes the following processes, grouping
them into organizational processes (in yellow), management processes (pink) and
dynamic processes (green).

Figure 2.1. Test processes. For a color version of this figure,
see www.iste.co.uk/homes/systems2.zip

Defined test processes are repeatable at each test level of a system-of-systems:


– the general organization of the test level;
– planning of level testing activities;
– control of test activities;
– analysis of needs, requirements and user stories to be tested;
– design of the test cases applicable to the level;
– implementation of test cases with automation and provision of test data;
– execution of designed and implemented test cases, including the management
of anomalies;
– evaluation of test execution results and exit criteria;
– reporting;
– closure of test activities, including feedback and continuous improvement
actions;
– infrastructure and environment management.

An additional process can be defined: the review process, which can be carried
out several times on a test level, on the one hand, on the input deliverables, and on
the other hand, on the deliverables produced by each of the processes of the level.
Review activities can occur within each defined test process.

The proposed test processes are applicable regardless of the development mode
(Agile or sequential). In the case of an Agile development mode, the testing
processes must be repeated for each sprint and for each level of integration in a
system-of-systems.

The processes must complement each other and – even if they may partially
overlap – it must be ensured that the processes are completed successfully.

2.1. Organization

Objectives:
– develop and manage organizational needs, in accordance with the company’s
test policy and the test strategies of higher levels;
– define the players at the level, their responsibilities and organizations;
– define deliverables and milestones;
– define quality targets (SLA, KPI, maximum failure rate, etc.);
– ensure that the objectives of the test strategy are addressed;
– define a standard RACI matrix.

Actor(s):
– CPI (R+A), CPU/CPO (I), developers (C+I);
– experienced “test manager” acting as pilot of the test project (R).

Prerequisites/inputs:
– calendar and budgetary constraints defined for the level;
– actors and subcontractors envisaged or selected;
– repository of lessons learned from previous projects.

Deliverables/outputs:
– organization of level tests;
– high-level WBS with the main tasks to be carried out;
– initial definition of test environments.

Entry criteria:
– beginning of the organization phase.
Exit criteria:
– approved organizational document (ideally a reduced number of pages).

Indicators:
1) efficiency: writing effort;
2) coverage: traceability to the quality characteristics identified in the project test
strategy.

Points of attention:
– ensure that the actors and meeting points (milestones and level of reporting)
are well defined.

2.2. Planning

Objective:
– plan test activities for the project, level, iteration or sprint considering existing
issues, risk levels, constraints and objectives for testing;
– define the tasks (durations, objectives, inputs and outputs, responsibilities,
etc.) and their sequencing;
– define the exit criteria (desired quality level) for the level;
– identify the prerequisites, resources (environment, personnel, tools, etc.)
necessary;
– define measurement indicators and frequencies, as well as reporting.

Actor(s):
– CPI (R+A), CPU/CPO (I), developers (C+I);
– experienced tester (“test manager”), acting as manager of the test
project (R);
– testers (C+I).

Prerequisites/inputs:
– information on the volume, workload and deadlines of the project;
– information on available environments and interfaces;
– objectives and scope of testing activities.

2.2.1. Project WBS and planning

Objective:
– plan test activities for the project or level, iteration or sprint considering
existing status, risk levels, constraints and objectives for testing;
– define the tasks (durations, objectives, inputs and outputs, responsibilities,
etc.) and their sequencing;
– define the exit criteria (desired quality level) for the level;
– identify prerequisites, resources (environment, personnel, tools, etc.)
necessary;
– define measurement indicators and frequencies, as well as reporting.

Actor(s):
– CPI (R+A), CPU/CPO (I), developers (C+I);
– experienced tester (“test manager”), acting as manager of the test project
(R);
– testers (C+I).

Prerequisites/inputs:
– REAL and project WBS defined in the investigation phase;
– lessons learned from previous projects (repository of lessons learned).

Deliverables/outputs:
– master test plan, level test plan(s);
– level WBS (or TBS for Test Breakdown Structure), detailing – for the
applicable test level(s) – the tasks to be performed;
– initial definition of test environments.

Entry criteria:
– start of the investigation phase.

Exit criteria:
– test plan approved, all sections of the test plan template are completed.

Indicators:
1) efficiency: writing effort vs. completeness and size of the deliverables
provided;
2) coverage: coverage of the quality characteristics selected in the project Test
Strategy.

Points of attention:
– ensure that test data (for interface tests, environment settings, etc.) will be well
defined and provided in a timely manner;
– collect lessons learned from previous projects.

Deliverables/outputs:
– master test plan, level test plan(s);
– level WBS, detailing – for the test level(s) – the tasks to be performed;
– detailed Gantt of test projects – each level – with dependencies;
– initial definition of test environments.

Entry criteria:
– start of the investigation phase.

Exit criteria:
– approved test plan, all sections of the applicable test plan template are
completed.

Indicators:
1) efficiency: writing effort;
2) coverage: coverage of the quality characteristics selected in the project’s test
strategy.

Points of attention:
– ensure that test data (for interface testing, environment settings, etc.) will be
well defined and provided in a timely manner.

2.3. Control of test activities

Objective:
– throughout the project: adapt the test plan, processes and actions, based on the
hazards and indicators reported by the test activities, so as to enable the project to
achieve its objectives;
– identify changes in risks, implement mitigation actions;
– provide periodic reporting to the CoPil and the CoSuiv;
– escalate issues if needed.

Actor(s):
– CPI (A+I), CPU/CPO (I), developers (I);
– test manager with a test project manager role (R);
– testers (C+I) [provide indicators];
– CoPil, CoSuiv (I).

Prerequisites/inputs:
– risk analysis, level WBS, project and level test plan.

Deliverables/outputs:
– periodic indicators and reporting for the CoPil and CoSuiv;
– updated risk analysis;
– modification of the test plan and/or activities to allow the achievement of the
“project” objectives.

Entry criteria:
– project WBS, level WBS.

Exit criteria:
– end of the project, including end of the software warranty period.

Indicators:
– dependent on testing activities.

2.4. Analyze

Objective:
– analyze the repository of information (requirements, user stories, etc. usable
for testing) to identify the test conditions to be covered and the test techniques to be
used. A risk or requirement can be covered by more than one test condition. A test
condition is something – a behavior or a combination of conditions – that may be
interesting or useful to test.

Actor(s):
– testers, test analysts, technical test analysts.

Prerequisites/inputs:
– initial definition of test environments;
– requirements and user stories (depending on the development method);
– acceptance criteria (if available);
– analysis of prioritized project risks;
– level test plan with the characteristics to be covered, the level test environment.

Deliverables/outputs:
– detailed definition of the level test environment;
– test file;
– prioritized test conditions;
– requirements/risks traceability matrix – test conditions.

Entry criteria:
– validated and prioritized requirements;
– risk analysis.

Exit criteria:
– each requirement is covered by the required number of test conditions
(depending on the RPN of the requirement).

Indicators:
1) Efficiency:
- number of prioritized test conditions designed,
- updated traceability matrix extended to the test conditions.
2) Coverage:
- percentage of requirements and/or risks covered by one or more test conditions,
- for each requirement or user story analyzed, implementation of traceability to
the planned test conditions,
- percentage of requirements and/or risks (by risk level) covered by one or more
test conditions (a minimal computation sketch follows this list).
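The sketch below is a minimal, hypothetical illustration of this last coverage
indicator; the requirement identifiers, risk levels and traceability links are
assumptions. It computes, per risk level, the percentage of requirements covered
by at least one test condition.

```python
# Minimal sketch: coverage of requirements by test conditions, per risk level.
requirements = {
    # requirement identifier -> risk level (e.g. derived from the RPN)
    "REQ-001": "high", "REQ-002": "high", "REQ-003": "medium", "REQ-004": "low",
}

traceability = {
    # requirement identifier -> identifiers of the test conditions covering it
    "REQ-001": ["COND-01", "COND-02"],
    "REQ-003": ["COND-03"],
}

def coverage_by_risk_level():
    """Ratio of requirements covered by at least one test condition, per risk level."""
    coverage = {}
    for level in set(requirements.values()):
        reqs = [req for req, risk in requirements.items() if risk == level]
        covered = [req for req in reqs if traceability.get(req)]
        coverage[level] = len(covered) / len(reqs)
    return coverage

for level, ratio in sorted(coverage_by_risk_level().items()):
    print(f"{level}: {ratio:.0%} of requirements covered by at least one test condition")
```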

2.5. Design

Objective:
– convert test conditions into test cases and identify the test data to be used
to cover the various combinations. A test condition can be converted into one or
more test cases (a minimal derivation sketch is given at the end of this section).

Actor(s):
– testers, test technicians.

Prerequisites/inputs:
– prioritized test conditions;
– requirements/risks traceability matrix – test conditions.

Deliverables/outputs:
– prioritized test cases, definition of test data for each test case (input and
expected);
– prioritized test procedures, taking into account the execution prerequisites;
– requirements/risks traceability matrix – test conditions – test cases.

Entry criteria:
– test conditions defined and prioritized;

– risk analysis.

Exit criteria:
– each test condition is covered by one or more test cases (according to the
RPN);
– partitions and typologies of test data defined for each test;
– defined test environments.

Indicators:
1) Efficiency:
- number of prioritized test cases designed,
- updated traceability matrix for extension to test cases.
2) Coverage:
- percentage of requirements and/or risks covered by one or more test cases
designed.
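As announced in the objective above, here is a minimal derivation sketch; the
test condition, inputs and partitions are purely illustrative assumptions. One
test condition is expanded into several test cases by combining equivalence
partitions of its input data.

```python
# Minimal sketch: derive test cases from one test condition via equivalence partitions.
from itertools import product

condition = "COND-12: the system accepts a track update"  # hypothetical test condition

partitions = {
    "message_size": ["nominal", "maximum", "oversized"],
    "link_quality": ["good", "degraded"],
}

test_cases = [
    {"condition": condition, "message_size": size, "link_quality": quality}
    for size, quality in product(partitions["message_size"], partitions["link_quality"])
]

for number, case in enumerate(test_cases, start=1):
    print(f"case {number}: {case['message_size']} message over a {case['link_quality']} link")
```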

2.6. Implementation

Objective:
– finely describe – if necessary – the test cases;
– define the test data for each of the test cases generated by the test design
activity;
– automate the test cases that need to be automated;
– set up the test environments.

Actor(s):
– testers, test automators, data and systems administrators.

Prerequisites/inputs:
– prioritized test cases;
– risk analysis.

Deliverables/outputs:
– automated or non-automated test scripts, test scenarios, test procedures;
– test data (input data and expected data for comparison);


– requirements/risks traceability matrix – test conditions – test cases – test
data.

Entry criteria:
– prioritized test cases, defined with their data partitions.

Exit criteria:
– test data defined for each test;
– test environments defined, implemented and verified.

Indicators:
1) Efficiency:
- number of prioritized test cases designed with test data,
- updated traceability matrix for extension to test data,
- number of test environments defined, implemented and verified vs. number
of environments planned in the test strategy.
2) Coverage:
- percentage of test environments ready and delivered,
- coverage of requirements and/or risks by one or more test cases with data,
- coverage of requirements and/or risks by one or more automated test cases.

2.7. Test execution

Objective:
– execute the test cases on the elements of the application to be tested, as
delivered by the development team;
– identify defects and write anomaly sheets;
– report monitoring and coverage information.

Actor(s):
– testers, test technicians.

Prerequisites/inputs:
– system to be tested is available and managed in delivery (configuration
management), accompanied by a delivery sheet.

Deliverables/outputs:
– anomaly sheets filled in for any identified defect;
– test logs.

Entry criteria:
– testing environment and resources (including testing tools) available for the
level, and tested;
– anomaly management tool available and installed;
– test cases and test data available for the level;
– component or application to be tested available and managed in delivery
(configuration management);
– delivery sheet provided.

Exit criteria:
– coverage of all test cases for the level.

Indicators:
1) Efficiency:
- percentage of tests passed, skipped (not passed) and failed, by level of risk
(computed as in the sketch after this list),
- percentage of test environment availability for test execution,
- test execution workload achieved vs. planned.
2) Coverage:
- percentage of requirements/risks tested with at least one remaining defect,
- percentage of requirements/risks tested without any defect remaining.
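A minimal sketch of the first efficiency indicator follows; the test case
identifiers, risk levels and verdicts are illustrative assumptions. Execution
verdicts are grouped by risk level and the percentages are printed.

```python
# Minimal sketch: share of passed/skipped/failed tests, per risk level.
from collections import Counter, defaultdict

executions = [
    # (test case, risk level, verdict) - illustrative values
    ("TC-01", "high",   "passed"),
    ("TC-02", "high",   "failed"),
    ("TC-03", "medium", "passed"),
    ("TC-04", "medium", "skipped"),
    ("TC-05", "low",    "passed"),
]

verdicts_by_risk = defaultdict(Counter)
for _test_case, risk, verdict in executions:
    verdicts_by_risk[risk][verdict] += 1

for risk, counts in verdicts_by_risk.items():
    total = sum(counts.values())
    shares = ", ".join(f"{verdict}: {count / total:.0%}" for verdict, count in counts.items())
    print(f"{risk}: {shares}")
```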

2.8. Evaluation

Objective:
– identify whether the test execution results show that the execution campaign
will be able to achieve the objectives;
– ensure that the acceptance criteria defined for the requirements or user stories
are met;
– if scope or quality changes impact testing, identify and select the mitigation
actions.
Actor(s):
– person responsible for project testing activities.

Prerequisites/inputs:
– definition of acceptance criteria;
– project workload estimation data;
– actual workload data for the project;
– progress data (coverage, deadlines, anomalies, etc.) of the project.

Deliverables/outputs:
– progress graphs, identification of trends;
– progress comments (identification of causes and proposals for mitigation).

Entry criteria:
– start of the project.

Exit criteria:
– end of the duration of each test task and of the test campaign;
– complete coverage achieved for the features or components to be tested.

Indicators:
1) Efficiency:
- identify the workload used, the anomalies identified – including priority and
criticality – as well as the level of coverage achieved, and compare them against
the objectives defined in the planning part (a minimal comparison sketch follows
this section).
2) Coverage:
- all the activities planned for the test task or for the test campaign have been
carried out,
- to be defined based on the planned objectives and their achievement for each
requirement or user story.
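To close the section, here is a minimal comparison sketch; the planned and actual
figures are assumptions. It computes the drift between the actual indicators and
the objectives defined during planning, which can feed the progress graphs and
mitigation proposals mentioned above.

```python
# Minimal sketch: drift between actual indicators and the planning objectives.
planned = {"workload_days": 40, "coverage_ratio": 1.00, "open_blocking_defects": 0}
actual  = {"workload_days": 52, "coverage_ratio": 0.85, "open_blocking_defects": 3}

def drift(planned_values, actual_values):
    """Signed difference between actual and planned values for each indicator."""
    return {key: actual_values[key] - planned_values[key] for key in planned_values}

for indicator, delta in drift(planned, actual).items():
    if indicator == "coverage_ratio":
        # for coverage, being below the plan is the problem
        status = "on target" if delta >= 0 else "needs mitigation"
    else:
        status = "on target" if delta <= 0 else "needs mitigation"
    print(f"{indicator}: delta {delta:+g} -> {status}")
```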