423-446
https://doi.org/10.17304/ijil.vol18.3.818
Abstract
The introduction of advanced new technologies is transforming the space industry. Artificial
intelligence is offering unprecedented possibilities for space-related activities because it enables
space objects to gain autonomy. The increasing autonomy level of space objects does not come
without legal implications. The lack of human control challenges existing liability frameworks.
This paper reviews the provisions of the Outer Space Treaty and the Liability Convention as the
main legal documents introducing the legal grounds for attributing liability in case of damages
caused by autonomous space objects. Looking at the limitations of these legal frameworks in
what concerns the attribution of liability, this paper identifies the conditions that could cause
a liability gap. The amendment of the Liability Convention, the concept of “international
responsibility” introduced by Article VI of the Outer Space Treaty and several international law
principles are analysed as potential solutions for preventing the liability gap and mitigating the
risks posed by autonomous space objects.
I. INTRODUCTION
Ever since Gagarin entered space and, a little later, Armstrong set foot on
the Moon, governments have spent large amounts of money on space-related
activities. The size of the global space economy is estimated yearly by the Space
Foundation, an organization actively advocating on behalf of the global space
community, and by the Satellite Industry Association, which advocates
on behalf of the commercial satellite industry of the United States (US). Accord-
ing to these two sources, the global space economy is steadily growing.
In 2018, the global space economy was estimated at $360 billion
(according to the Space Foundation) and approximately $415 billion (accord-
ing to the Satellite Industry Association).1
1
“ESPI Yearbook 2019: Space Policies, Issues and Trends,” European Space Policy Institute,
https://espi.or.at/?view=article&id=468:espi-yearbook-2019&catid=29.
The increasing investments in the space industry also come from the private
sector, the so-called “New Space”, with major players such as SpaceX
and Blue Origin. This phenomenon encompasses the emerging trends in the
private space business, which aims to engage in space-related activities inde-
pendently of governments. New entrants in the space industry usually fall
into one of two categories: the first is existing large companies, such as Google
or Facebook, interested in diversifying their portfolios and creating a symbiosis
between their current business activities and space applications; the sec-
ond is new space companies: start-ups.2 In terms of the space budgets allocated
by the private space sector, the last decade has shown a significant increase in
investments. Over the last 15 years, total investment in space-related start-
up ventures has amounted to $13.3 billion, and more than 80 new space companies
have been set up.3
In addition to these new actors, New Space also encompasses innovative
industrial approaches, particularly with regard to advanced new technolo-
gies. Space-related technologies have contributed to the growth of the private sec-
tor and, through the development of innovative technologies, will continue to do so in the
near future, as indicated by the Jet Propulsion Laboratory (JPL), a global leader in
planetary exploration and space-based astronomy that supports the missions
of the National Aeronautics and Space Administration (NASA). Among these
innovative technologies, developing autonomous systems is a top priority.4
Autonomous systems are equipped with artificial intelligence (AI) capabilities
and function without human intervention.
The introduction of autonomous systems in space activities does not come
without legal implications, especially with regard to liability. As
far as we know, there are no cases yet requiring the application of space law
in the context of damages caused by an AI system. However, if we look at
the evolution of AI systems in other industries, such as the self-driving car
industry, many incidents have already occurred.5 We believe we should anticipate
future incidents involving autonomous space objects and consider a frame-
work for liability regimes, in order to avoid situations in which liability cannot
be attributed, in other words: a liability gap. Therefore, this paper analyses
whether the existing legal frameworks dealing with liability for damages caused
by space objects are capable of dealing with incidents caused by AI, specifi-
2
Ibid.
3
Alessandra Vernile, The Rise of Private Actors in the Space Sector (Springer International
Publishing, 2018), 2.
4
Strategic Technologies: Science and Technology, Jet Propulsion Laboratory (California
Institute of Technology, 2019), 2.
5
“Self-Driving Car Statistics for 2021: Policy Advice,” accessed February 14, 2021, https://
policyadvice.net/insurance/insights/self-driving-car-statistics/.
6
Virginia Dignum, “What Is Artificial Intelligence?” in Responsible Artificial Intelligence:
How to Develop and Use AI in a Responsible Way, Virginia Dignum, ed. (Cham: Springer
International Publishing, 2019), 9–34.
7
Sofia Samoli et al., “AI Watch: Defining Artificial Intelligence” (Luxembourg: Publications
Office of the European Union, 2020).
8
European Commission, “A Definition of Artificial Intelligence: Main Capabilities and
Scientific Disciplines,” in Independent High-Level Expert Group on Artificial Intelligence set
up by the European Commission, 8 April 2019.
9
Nathalie Nevejans, European Civil Law Rules in Robotics (Brussels: European Parliament,
2016).
10
“Communication from the Commission to the European Parliament, the European Council,
the European Economic and Social Committee and the Committee of the Regions, Coordinated
Plan on Artificial Intelligence,” European Commission Brussels, 7 December 2018, https://eur-
lex.europa.eu/legal-content/EN/TXT/PDF/?uri=CELEX:52018DC0795&from=EN.
personnel recruitment, etc.11 AI systems can be descriptive, telling you what
happened; diagnostic, telling you why something happened; predictive,
forecasting what will (statistically probably) happen; or prescriptive, in the
sense of being capable of actual decision-making and implementation.12
The decision-making and action-taking of AI systems is enabled by feeding
the system a relevant set of data or by equipping it with appropriate sensors,
for example cameras or microphones, which allow the system to collect the
data required for achieving the goal for which it was designed.13
Subsequently, the collected data are interpreted and the system takes a deci-
sion, which may translate into either pursuing an action or not. If it de-
cides to act, this decision is executed through the system’s physical or
software actuators.14
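This sense-interpret-decide-act cycle can be sketched in a few lines of code. The sketch below is purely illustrative: the class, method names, threshold and sensor readings are assumptions for the example, not part of any real spacecraft system.

```python
# Minimal sketch of the sense -> interpret -> decide -> act cycle described
# above. All names and values here are illustrative assumptions.

class AutonomousAgent:
    def __init__(self, threshold):
        # Goal parameter the system was designed around (assumed example:
        # react when a sensed value exceeds a safety threshold).
        self.threshold = threshold

    def interpret(self, raw_readings):
        # Interpretation step: reduce raw sensor data to a single feature.
        return sum(raw_readings) / len(raw_readings)

    def decide(self, feature):
        # Decision step: pursue an action or refrain from acting.
        return feature > self.threshold

    def act(self):
        # Actuation step: in a real system this would drive a physical or
        # software actuator; here it just reports the action taken.
        return "manoeuvre executed"

    def step(self, raw_readings):
        feature = self.interpret(raw_readings)
        if self.decide(feature):
            return self.act()
        return "no action"


agent = AutonomousAgent(threshold=0.5)
print(agent.step([0.2, 0.9, 0.8]))  # average 0.63 exceeds the threshold
print(agent.step([0.1, 0.2, 0.3]))  # average 0.2 does not
```

The point of the sketch is the separation of stages: data collection, interpretation, decision, and actuation are distinct steps, which matters later when asking where human control (and hence fault) can attach.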
Depending on the type of AI system, the final decision is made either
by humans or autonomously, sometimes with some degree of human control.
AI systems can make decisions and improve their capabilities without human
intervention, depending instead on the available data. The process in which possi-
ble new actions are considered through an analysis of desired outcomes based
on previous failure or success is known as machine learning (ML).15 The in-
spiration for this comes from the neural networks of the human brain. As a
general classification, there are two main categories of ML: supervised and
unsupervised. Supervised ML relies on algorithms that have been trained to
calculate outcomes based on examples, i.e. the AI system was “trained” with
sets of input data and corresponding output data previously identified
as correct.16 In unsupervised learning, algorithms are not trained and do not re-
ceive instructions identifying which data sets are correct; they will
11
“What Is Artificial Intelligence and How Is It Used?” News European Parliament, 9 April
2020, https://www.europarl.europa.eu/news/en/headlines/society/20200827STO85804/what-
is-artificial-intelligence-and-how-is-it-used.
12
Humberto Farias, “Machine Learning vs. Predictive Analytics: What’s the Difference?”
Concepta, 10 October 2017, accessed 11 February 2021, https://www.conceptatech.com/blog/
machine-learning-vs-predictive-analytics-what-is-the-difference.
13
Basheer Qolomany et al., “Leveraging Machine Learning and Big Data for Smart Buildings:
A Comprehensive Survey,” IEEE Access 7 (2019): 90316–90356.
14
“How Artificial Intelligence Works,” European Parliament Think Tank, 14 March 2019,
accessed 14 February 2021, https://www.europarl.europa.eu/thinktank/en/document.
html?reference=EPRS_BRI(2019)634420.
15
“Machine Learning: What It Is and Why It Matters,” SAS, accessed 14 February 2021,
https://www.sas.com/en_us/insights/analytics/machine-learning.html.
16
Stuart Russell and Peter Norvig, Artificial Intelligence: A Modern Approach (Third edition)
(Harlow: Pearson, 2014), 695.
search independently for relevant data sets required for achieving their goal.17
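The contrast between the two categories can be illustrated with a toy sketch. Everything below is an illustrative assumption (the data, the function names, the halfway-threshold rule): a supervised learner fits a decision rule from labelled examples, while an unsupervised learner finds structure in unlabelled data on its own.

```python
# Toy contrast between supervised and unsupervised ML, as described above.
# The data, function names and rules here are illustrative assumptions only.

def supervised_fit(examples):
    """Learn a decision threshold from (input, correct-label) pairs."""
    positives = [x for x, label in examples if label == 1]
    negatives = [x for x, label in examples if label == 0]
    # Place the threshold halfway between the two labelled groups.
    return (min(positives) + max(negatives)) / 2

def unsupervised_group(values):
    """Group unlabelled values into two clusters with simple 1-D k-means."""
    centers = [min(values), max(values)]
    clusters = [[], []]
    for _ in range(10):  # a few refinement passes suffice for a toy data set
        clusters = [[], []]
        for v in values:
            # Assign each value to its nearest cluster centre.
            nearest = min(range(len(centers)), key=lambda i: abs(v - centers[i]))
            clusters[nearest].append(v)
        centers = [sum(c) / len(c) if c else centers[i]
                   for i, c in enumerate(clusters)]
    return clusters

# Supervised: the "correct" outputs (labels) are provided in advance.
print(supervised_fit([(1.0, 0), (2.0, 0), (8.0, 1), (9.0, 1)]))  # 5.0

# Unsupervised: the system finds the structure of the data by itself.
print(unsupervised_group([1.0, 2.0, 8.0, 9.0]))  # [[1.0, 2.0], [8.0, 9.0]]
```

The same input data yield a decision rule in the supervised case and a grouping in the unsupervised case; the difference is only in whether "correct" answers were supplied during training.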
In the space industry, one area in which applications of AI are cur-
rently being investigated is satellite operations; in particular, supporting the
operation of large satellite constellations, including positioning, communica-
tion and end-of-life management. In addition, it is becoming more common
to find ML systems analysing the huge amounts of data that come from each
space mission.18
a decision and changes the course of the rover in order to avoid it.29 AI is also
used for space sustainability, i.e. for removing space junk. ESA plans to launch
the world’s first debris-removal space mission, ClearSpace-1, which will use
an AI-powered camera to find the debris. Its robotic arms will then grab the
object and drag it back into the atmosphere, where it will burn up.30 Another ap-
proach to using AI for space sustainability involves collision avoidance ma-
noeuvres based on ML techniques.31 This collision avoidance system, currently
under development by ESA, will automatically assess the risk and likelihood
of in-space collisions, improve the decision-making process on whether or not
a manoeuvre is needed, and may even send the orders for at-risk satellites to
get out of the way.32
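The core decision logic of such a system can be caricatured as a risk-threshold rule. The sketch below is an assumption for illustration, not ESA's actual criteria: the system estimates a collision probability for a predicted close approach and orders a manoeuvre only when that probability exceeds an accepted risk level.

```python
# Caricature of a thresholded collision-avoidance decision. The threshold
# value and the probabilities are illustrative assumptions, not ESA's criteria.

RISK_THRESHOLD = 1e-4  # an often-cited order of magnitude for manoeuvre decisions

def manoeuvre_needed(collision_probability):
    """Order an avoidance manoeuvre only when the estimated risk is too high."""
    return collision_probability > RISK_THRESHOLD

# A predicted close approach with an estimated 1-in-1,000 collision
# probability triggers a manoeuvre; a 1-in-1,000,000 one does not.
print(manoeuvre_needed(1e-3))  # True
print(manoeuvre_needed(1e-6))  # False
```

Even in this caricature, the legally salient point is visible: once the threshold and the risk estimate are both produced by the system, no human judgement enters the manoeuvre decision.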
These examples of how AI can be used in space are paving the way towards
higher levels of autonomy for this technology, which will be required to achieve
other important milestones in space-related activities, for example reaching
neighbouring solar systems such as Alpha Centauri. This would imply
traversing a distance of over four light-years. Upon arrival, the spacecraft
would need to operate independently for years, even decades, exploring mul-
tiple planets in that solar system. This ambition is not far from becoming a
reality, given recent precedents: for example, in 2017, an autonomous space-
craft completed almost a dozen years of nearly continuous Earth observation
operations, using both onboard and ground-based AI.33
Space, Including the Moon and Other Celestial Bodies, 610 U.N.T.S. 205 [Outer Space Treaty].
35
United Nations Convention on the International Liability for damage caused by space objects,
opened for signature on 29 March 1972, 961 UNTS 13810 (entered into force September 1972).
36
Fabio Tronchetti, Fundamentals of Space Law and Policy (New York: Springer, 2013), 72.
37
Bin Cheng, “International Responsibility and Liability for Launch Activities,” Air and Space
Law 20, no. 6 (December 1, 1995): 297-310.
Outer Space Treaty and the corresponding Articles II and III of the Liability
Convention, but from the state technically responsible under Article VI of the
Outer Space Treaty.40
Figure 1. Accountability Concepts under Core International Space Law
care in outer space, the breach of which would constitute fault.48 As a general
rule, fault can only be asserted when the act violates a duty of care or standard
of conduct. The Convention does not provide a clear obligation to act or to
abstain from acting, making it difficult to implement this notion in practice.49
Some legal scholars have referred to Black’s Law Dictionary in an attempt to define
fault as “an error or defect of judgement or of conduct; any deviation from
prudence or duty resulting from inattention (…); the intentional or negligent
failure to maintain some standard of conduct when the failure results in harm
to another person”.50 As such, “fault liability” can then be defined as imply-
ing a certain degree of blameworthiness or, alternatively, as a type of liability
in which the plaintiff must prove that the defendant’s conduct was either neg-
ligent or intentional.51 Other legal scholars considered that the Liability Con-
vention deliberately left certain details unresolved, for instance the definition of
“fault”, in order to avoid an overly specific approach. Had the negotiations of
the Liability Convention moved in that direction, the Convention would
most likely never have been agreed upon. Fortunately, it was possible
to agree on a formal process for the resolution of disputes; therefore, some of
the lacunae in the Convention can be resolved through the use of this process.
Thus, it may, for example, be possible to obtain the required understanding of
the meaning of “fault,” if necessary.52
48
Marc S. Firestone, “Problems in the Resolution of Disputes Concerning Damage Caused
in Outer Space,” Tulane Law Review: Devoted to the Civil Law, Comparative Law and
Codification, 1985.
49
Yun Zhao, “The 1972 Liability Convention: Time for Revision?” Space Policy 20, no. 2
(May 1, 2004): 117–22, https://doi.org/10.1016/j.spacepol.2004.02.008.
50
Frans von der Dunk, “Too-Close Encounters of the Third Party Kind: Will the Liability
Convention Stand the Test of the Cosmos 2251-Iridium 33 Collision?,” Space, Cyber,
and Telecommunications Law Program Faculty Publications, January 1, 2010, https://
digitalcommons.unl.edu/spacelaw/28.
51
Ibid.
52
Carl Q. Christol, “International Liability for Damage Caused by Space Objects,” American
Journal of International Law 74, no. 2 (1980): 346-371, doi:10.2307/2201505.
53
Anne-Sophie Martin and Steven Freeland, “The Advent of Artificial Intelligence in Space
Activities: New Legal Challenges,” Space Policy 55 (February 1, 2021): 101408, https://doi.
org/10.1016/j.spacepol.2020.101408.
54
Liability Convention, art I (b).
55
von der Dunk and Tronchetti, Handbook of Space Law, 86.
56
Eric Berger, “Meet Ravn X—A fully autonomous, air-launched rocket for small satellites,”
Ars Technica, accessed 20 December 2020, https://arstechnica.com/science/2020/12/meet-
ravn-x-a-fully-autonomous-air-launched-rocket-for-small-satellites/.
57
Vladimir Kopal, “Some Remarks on Issues Relating to Legal Definitions of Space Objects,
Space Debris and Astronaut,” in Proceedings of the 37th on the Law of Outer Space (1994), 99.
58
Stephan Hobe, Cologne Commentary on Space Law / Vol. 2, Rescue Agreement, Liability
Convention, Registration Convention, Moon Agreement. (Köln: Heymann, 2013), 34.
59
United Nations Convention on Registration of Objects Launched into Outer Space, Nov. 12,
1975, 28 U.S.T. 695, 1023 U.N.T.S. 15. [Registration Convention]
60
Yoon Lee, “Registration of Space Objects: ESA Member States’ Practice,” Space Policy 22,
no. 1 (February 1, 2006): 42–51, https://doi.org/10.1016/j.spacepol.2005.11.007.
61
Registration Convention, art. II
62
Registration Convention, art. IV
63
Martin and Freeland, “The Advent of Artificial Intelligence in Space Activities”
64
Ibid.
that the launching state had not violated international law or the international
UN treaties.65
The Liability Convention does not provide a definition of “gross negli-
gence”, and no indication is given for attributing negligent conduct to others
or for allocating a principal’s vicarious liability for an agent or employ-
ee.66 In the absence of clear criteria applicable to gross negligence, the concept may
prove difficult to apply in practice. The Cambridge Diction-
ary defines gross negligence as a serious lack of care or attention
towards a person or thing that another person is responsible for.67 Analysis
of this definition reveals that the concept is related to the mental element of an
act or omission: it is a product of human thought, associated with an action
or omission that forms part of a human activity. The concept is challenged in the case
of damage resulting wholly or partially from an act or omission of an au-
tonomous space object deployed or controlled by a claimant state. Depending
on the autonomy level of the space object, invoking exoneration from liability
based on the concept of gross negligence may not be applicable to
space objects equipped with AI capabilities,68 because the conduct no longer involves a
human activity.
65
Liability Convention, art. VI (2)
66
Christol, “International Liability for Damage Caused by Space Objects”.
67
“Gross Negligence,” Cambridge Dictionary, accessed 20 December 2020, https://dictionary.
cambridge.org/dictionary/english/gross-negligence.
68
George Anthony Gal, Cristiana Santos, Lucien Rapp, Réka Markovich, Leendert van
der Torre, “Artificial intelligence in space,” available at https://www.researchgate.net/
publication/342377395_Artificial_intelligence_in_space.
69
See Supra 3.2, II
70
Ibid.
71
David J. Gunkel, “The Other Question: Can and Should Robots Have Rights?,” Ethics and
Information Technology 20, no. 2 (2017): pp. 87-99, https://doi.org/10.1007/s10676-017-9442-
4.
72
Publications Office of the European Union, “Liability for Artificial Intelligence and other
Emerging Digital Technologies.” (Publications Office of the European Union, November
27, 2019), http://op.europa.eu/en/publication-detail/-/publication/1c5e30be-1197-11ea-8c1f-
01aa75ed71a1/language-en/format-PDF.
73
George Anthony Gal et al., “Artificial Intelligence in Space.”
74
von der Dunk and Tronchetti, “International Space Law,” 51 – 52.
75
Ibid.
aware of the risk that it takes by launching an autonomous space object.
Knowingly taking this risk would then be the justification for assigning liability
to that state, should the risk materialise in the sense that the autonomous object
causes damage.
In a distinct case, concerning the Application of the Convention on the Preven-
tion and Punishment of the Crime of Genocide (Bosnia and Herzegovina v.
Serbia and Montenegro),88 the ICJ extended the application of due diligence
beyond the Corfu Channel case. This means that the due diligence obligation is
not exclusively connected to a state’s control over its territory. The due dili-
gence obligation also covers elements under a state’s jurisdiction and control
that it has power over or has the capacity to influence.89 In an outer space con-
text, the launch and operation of space objects are presumed to be activities that
launching states have control over. This suggests that the best-efforts obliga-
tion of due diligence to prevent acts that would cause damage to another state,
such as causing space object collisions, is a duty incumbent on launching
states.90
The launching state’s responsibility also applies in cases where it is not the
state itself that is involved in the launching but a New Space private party, as
the Treaties do not yet contain a provision to hold a private company liable for
damages caused in space.
VI. CONCLUSION
In recent years, the space industry has been revolutionised. Increas-
ing budgets have laid the groundwork for technological advancement. Space objects
launched by states and private actors are becoming increasingly sophisticated
given their AI capabilities. The autonomy of space objects has become a prior-
ity for states as well as for private actors. Autonomous space objects are used,
among other things, to monitor the operation of satellites and climate change; they are
implemented in space station operations as virtual assistants for astronauts;
and they support exploration on planets where on-site conditions are still too
dangerous for humans. Moreover, there is an increasing number of situations
in which human control over a space object is no longer economically or prac-
tically feasible.
88
International Court of Justice, “Latest Developments: Application of the Convention on the
Prevention and Punishment of the Crime of Genocide (Bosnia and Herzegovina v. Serbia and
Montenegro),” accessed February 14, 2021, https://www.icj-cij.org/en/case/91.
89
Dennerley, “State Liability for Space Object Collisions,” 281.
90
Ibid.
BIBLIOGRAPHY
Articles in journals and periodicals
Bishop, William W. J., G. Guerrero, and E. Hambro. “The Corfu Channel Case (Mer-
its).” American Journal of International Law 43, no. 3 (1949): 558–89.
Cheng, Bin. “International Responsibility and Liability for Launch Activities.” Air
and Space Law 20, no. 6 (December 1, 1995): 297-310.
Chien, Steve, and Kiri L. Wagstaff. “Robotic Space Exploration Agents,” in Science
Robotics 2, no. 7 (2017): 1-2.
Chien, Steve, et. al. “The Future of AI in Space.” IEEE Intelligent Systems 21, no. 04
(July 1, 2006): 64–69.
Christol, Carl Q. “International Liability for Damage Caused by Space Objects.” Amer-
ican Journal of International Law 74, no. 2 (1980): 346-371.
Dennerley, Joel A. “State Liability for Space Object Collisions: The Proper Interpreta-
tion of ‘Fault’ for the Purposes of International Space Law.” European Journal of
International Law 29, no. 1 (May 8, 2018): 281–301.
Finch, Edward R. “Outer Space Liability: Past, Present and Future.” The International
Lawyer 14, no. 1 (1980): 123–127.
Firestone, Marc S. “Comment: Problems in the Resolution of Disputes Concerning
Damage in Outer Space.” Tulane Law Review, (January 1985): 747-772.
Gorove, Stephen. “Liability in Space Law: An Overview Space Law.” Annals of Air
and Space Law 8, (1983): 373–380.
Gunkel, David J. “The Other Question: Can and Should Robots Have Rights?” Ethics
and Information Technology 20, no. 2 (2017): 87-99.
Lee, Yoon. “Registration of Space Objects: ESA Member States’ Practice.” Space
Policy 22, no. 1 (2006): 42–51.
Martin, Anne-Sophie, and Steven Freeland. “The Advent of Artificial Intelligence in
Space Activities: New Legal Challenges.” Space Policy 55 (2021): 101408.
Qolomany, Basheer, et al. “Leveraging Machine Learning and Big Data for Smart
Buildings: A Comprehensive Survey.” IEEE Access 7 (2019): 90316–90356.
Reis, Herbert.“Some Reflections on the Liability Convention for Outer Space.” Jour-
nal of Space Law 6, no. 2 (1978): 125–128.
Weil, Prosper. “The Court Cannot Conclude Definitively . . . Non Liquet Revisited.
Chapter 1: Questions of Theory.” Columbia Journal of Transnational Law 36,
no. 1–2 (1998): 109–119.
Zhao, Yun. “The 1972 Liability Convention: Time for Revision?” Space Policy 20,
no. 2 (2004): 117–122.
Treaty on Principles Governing the Activities of States in the Exploration and Use
of Outer Space, Including the Moon and Other Celestial Bodies. 610 UNTS 205
(opened for signature 27 January 1967, entered into force 10 October 1967).
United Nations Convention on Registration of Objects Launched into Outer Space.
1023 UNTS 15 (opened for signature 12 November 1975, entered into force on
15 September 1976).
United Nations Convention on the International Liability for Damage Caused by
Space Objects. 961 UNTS 13810 (opened for signature on 29 March 1972, en-
tered into force 20 March 1975).
Web sources
“Floating Robot Cimon sent to International Space Station.” BBC News, 29 June 2018,
accessed 12 February 2021, https://www.bbc.com/news/technology-44655675.
“What Is Artificial Intelligence and How Is It Used?” News European Parlia-
ment, 9 April 2020, https://www.europarl.europa.eu/news/en/headlines/
society/20200827STO85804/what-is-artificial-intelligence-and-how-is-it-used.
Airbus. “Astronaut Assistant CIMON-2 Is on its way to the International Space Sta-
tion.” Accessed 12 February 2021, https://www.airbus.com/newsroom/press-re-
leases/en/2019/12/astronaut-assistant-cimon2-is-on-its-way-to-the-international-
space-station.html.
Berger, Eric. “Meet Ravn X—A fully autonomous, air-launched rocket for small sat-
ellites.” Ars Technica, accessed 20 December 2020, https://arstechnica.com/sci-
ence/2020/12/meet-ravn-x-a-fully-autonomous-air-launched-rocket-for-small-
satellites/.
Berquand, Audrey and Deep Bandivadekar. “Five Ways Artificial Intelligence Can
Help Space Exploration,” The Conversation, 25 January 2021, accessed February
12, 2021, http://theconversation.com/five-ways-artificial-intelligence-can-help-
space-exploration-153664.
Cambridge Dictionary. “Gross Negligence.” Accessed 20 December 2020, https://dic-
tionary.cambridge.org/dictionary/english/gross-negligence.
European Parliament Think Tank. “How Artificial Intelligence Works.” Accessed
14 February 2021, https://www.europarl.europa.eu/thinktank/en/document.
html?reference=EPRS_BRI(2019)634420.
European Space Agency. “Automating Collision Avoidance.” accessed February 12,
2021, https://www.esa.int/Safety_Security/Space_Debris/Automating_collision_
avoidance.
European Space Agency. “Robots in Space.” Accessed 12 February 2021, https://
www.esa.int/Enabling_Support/Preparing_for_the_Future/Discovery_and_Prep-
aration/Robots_in_space2
Farias, Humberto. “Machine Learning Vs Predictive Analytics: What’s the Differ-
ence? Data Science.” Accessed 11 February 2021, https://www.conceptatech.
com/blog/machine-learning-vs-predictive-analytics-what-is-the-difference.
Gal, George Anthony, Cristiana Santos, Lucien Rapp, Réka Markovich, Leendert van
der Torre. “Artificial intelligence in space.” available at https://www.research-
gate.net/publication/342377395_Artificial_intelligence_in_space.
Information Commissioner’s Office. “Big Data, Artificial Intelligence, Machine
Learning and Data Protection.” https://ico.org.uk/media/for-organisa-
tions/documents/2013559/big-data-ai-ml-and-data-protection.pdf.
Others
Samoli, Sofia, et.al. “AI Watch: Defining Artificial Intelligence.” (Luxembourg: Pub-
lications Office of the European Union, 2020).
von der Dunk, Frans. “Liability versus Responsibility in Space Law: Misconception
or Misconstruction?” In Proceedings of the 34th Colloquium on the Law of Outer
Space, 363-371, 1992.
Kopal, Vladimir. “Some Remarks on Issues Relating to Legal Definitions of Space
Objects, Space Debris and Astronaut.” In Proceedings of the 37th Colloquium on
the Law of Outer Space, 99-106, 1994.
European Commission Brussels. “Communication from the Commission to the Eu-
ropean Parliament, the European Council, the European Economic and Social
Committee and the Committee of the Regions, Coordinated Plan on Artificial In-
telligence.” 7 December 2018, https://eur-lex.europa.eu/legal-content/EN/TXT/
PDF/?uri=CELEX:52018DC0795&from=EN.
European Space Policy Institute. “ESPI Yearbook 2019: Space Policies, Issues
and Trends.” https://espi.or.at/?view=article&id=468:espi-yearbook-
2019&catid=29.
von der Dunk, Frans. “Too-Close Encounters of the Third Party Kind: Will the Liabil-
ity Convention Stand the Test of the Cosmos 2251-Iridium 33 Collision?” Space,
Cyber, and Telecommunications Law Program Faculty Publications, January 1,
2010, https://digitalcommons.unl.edu/spacelaw/28.
Mathieu, Pierre-Philippe, Sveinung Loekken, et al. “Towards a European AI for
Earth Observation Research & Innovation Agenda.” In Workshop at ESA Φ-lab
(European Space Agency, 2018), 1-20, https://blogs.esa.int/philab/files/2018/07/
Towards-a-European-AI-for-Earth-Observation-Research-Innovation-Agenda-.pdf.
European Commission. “A Definition of Artificial Intelligence: Main Capabilities and
Scientific Disciplines,” in Independent High-Level Expert Group on Artificial In-
telligence set up by the European Commission, 8 April 2019.