SpringerBriefs in Philosophy

SpringerBriefs present concise summaries of cutting-edge research and practical applications across a wide spectrum of fields. Featuring compact volumes of 50 to 125 pages, the series covers a range of content from professional to academic. Typical topics might include:
A timely report of state-of-the art analytical techniques
A bridge between new research results, as published in journal
articles, and a contextual literature review
A snapshot of a hot or emerging topic
An in-depth case study or clinical example
A presentation of core concepts that students must understand in
order to make independent contributions
SpringerBriefs in Philosophy cover a broad range of philosophical
fields including: Philosophy of Science, Logic, Non-Western Thinking
and Western Philosophy. We also consider biographies, full or partial, of
key thinkers and pioneers.
SpringerBriefs are characterized by fast, global electronic
dissemination, standard publishing contracts, standardized manuscript
preparation and formatting guidelines, and expedited production
schedules. Both solicited and unsolicited manuscripts are considered
for publication in the SpringerBriefs in Philosophy series. Potential
authors are warmly invited to complete and submit the Briefs Author
Proposal form. All projects will be submitted to editorial review by
external advisors.
SpringerBriefs are characterized by expedited production schedules
with the aim for publication 8 to 12 weeks after acceptance and fast,
global electronic dissemination through our online platform
SpringerLink. The standard concise author contracts guarantee that
an individual ISBN is assigned to each manuscript
each manuscript is copyrighted in the name of the author
the author retains the right to post the pre-publication version on
his/her website or that of his/her institution.
More information about this series at http://​www.​springer.​com/​
series/​10082
David Ellerman

New Foundations for Information Theory
Logical Entropy and Shannon Entropy
1st ed. 2021

David Ellerman
Faculty of Social Sciences, University of Ljubljana, Ljubljana, Slovenia

ISSN 2211-4548 e-ISSN 2211-4556


SpringerBriefs in Philosophy
ISBN 978-3-030-86551-1 e-ISBN 978-3-030-86552-8
https://doi.org/10.1007/978-3-030-86552-8

Mathematics Subject Classification (2010): 03A05, 94A17

SpringerBriefs in Philosophy

© The Author(s), under exclusive license to Springer Nature Switzerland AG 2021

This work is subject to copyright. All rights are solely and exclusively
licensed by the Publisher, whether the whole or part of the material is
concerned, specifically the rights of translation, reprinting, reuse of
illustrations, recitation, broadcasting, reproduction on microfilms or in
any other physical way, and transmission or information storage and
retrieval, electronic adaptation, computer software, or by similar or
dissimilar methodology now known or hereafter developed.

The use of general descriptive names, registered names, trademarks, service marks, etc. in this publication does not imply, even in the absence of a specific statement, that such names are exempt from the relevant protective laws and regulations and therefore free for general use.

The publisher, the authors and the editors are safe to assume that the
advice and information in this book are believed to be true and accurate
at the date of publication. Neither the publisher nor the authors or the
editors give a warranty, expressed or implied, with respect to the
material contained herein or for any errors or omissions that may have
been made. The publisher remains neutral with regard to jurisdictional
claims in published maps and institutional affiliations.

This Springer imprint is published by the registered company Springer Nature Switzerland AG
The registered company address is: Gewerbestrasse 11, 6330 Cham, Switzerland
To the memory of Gian-Carlo Rota—mathematician, philosopher, mentor,
and friend.
About this book
This book presents a new foundation for information theory where the
notion of information is defined in terms of distinctions, differences,
distinguishability, and diversity. The direct measure is logical entropy
which is the quantitative measure of the distinctions made by a
partition. Shannon entropy is a transform or re-quantification of logical
entropy that is especially suited for Claude Shannon’s “mathematical
theory of communications.” The interpretation of the logical entropy of a partition is the two-draw probability of getting a distinction of the partition (a pair of elements distinguished by the partition), so it realizes a dictum of Gian-Carlo Rota: probability is to subsets as information is to partitions. Andrei

Kolmogorov suggested that information should be defined


independently of probability, so logical entropy is first defined in terms
of the set of distinctions of a partition and then a probability measure
on the set defines the quantitative version of logical entropy. We give a
history of the logical entropy formula that goes back to Corrado Gini’s
1912 “index of mutability” and has been rediscovered many times. In
addition to being defined as a (probability) measure in the sense of
measure theory (unlike Shannon entropy), logical entropy is always
non-negative (unlike three-way Shannon mutual information) and
finitely-valued for countable distributions. One perhaps surprising
result is that in spite of decades of MaxEntropy efforts based on maximizing Shannon entropy, the maximization of logical entropy gives
a solution that is closer to the uniform distribution in terms of the usual
(Euclidean) notion of distance. When generalized to metrical
differences, the metrical logical entropy is just twice the variance, so it
connects to the standard measure of variation in statistics. Finally, there
is a semi-algorithmic procedure to generalize set-concepts to vector-
space concepts. In this manner, logical entropy defined for set
partitions is naturally extended to the quantum logical entropy of
direct-sum decompositions in quantum mechanics. This provides a new
approach to quantum information theory that directly “measures” the
results of quantum measurement.
Acknowledgements
After Gian-Carlo Rota’s untimely death in 1999, many of us who had
worked or coauthored with him set about to develop parts of his
unfinished work. My own work focused on developing his initial efforts
on the logic of equivalence relations or partitions. Since the notion of a partition on a set is the category-theoretic dual to the notion of a subset of a set, the first product was the logic of partitions (see the Appendix)
that is dual to the Boolean logic of subsets (usually presented as
“propositional logic”). Rota had additionally emphasized that the
quantitative version of subset logic was logical probability theory and
that probability is to subsets as information is to partitions. Hence the
next thing to do in this Rota-suggested program was to develop the
quantitative version of partition logic as a new approach to information
theory. The topic of this book is this logical theory of information based on the notion of the logical entropy of a partition, which then transforms in a uniform way into the well-known entropy formulas of Shannon. The
book is dedicated to Rota’s memory to acknowledge his influence on
this whole project.
Contents
1 Logical Entropy
1.​1 Introduction
1.​2 Logical Information as the Measure of Distinctions
1.​3 Classical Logical Probability and Logical Entropy
1.​4 Information Algebras and Joint Distributions
1.​5 Brief History of the Logical Entropy Formula
References
2 The Relationship Between Logical Entropy and Shannon Entropy
2.​1 Shannon Entropy
2.​2 Logical Entropy, Not Shannon Entropy, Is a (Non-negative)
Measure
2.​3 The Dit-Bit Transform
References
3 The Compound Notions for Logical and Shannon Entropies
3.​1 Conditional Logical Entropy
3.​2 Shannon Conditional Entropy
3.​3 Logical Mutual Information
3.​4 Shannon Mutual Information
3.​5 Independent Joint Distributions
3.​6 Cross-Entropies, Divergences, and Hamming Distance
3.​6.​1 Cross-Entropies
3.​6.​2 Divergences
3.​6.​3 Hamming Distance
3.​7 Summary of Formulas and Dit-Bit Transforms
References
4 Further Developments of Logical Entropy
4.​1 Entropies for Multivariate Joint Distributions
4.​2 An Example of Negative Mutual Information for Shannon
Entropy
4.​3 Entropies for Countable Probability Distributions
4.​4 Metrical Logical Entropy =​(Twice) Variance
4.​5 Boltzmann and Shannon Entropies:​A Conceptual
Connection?​
4.​6 MaxEntropies for Discrete Distributions
4.​7 The Transition to Coding Theory
4.​8 Logical Entropy on Rooted Trees
References
5 Quantum Logical Information Theory
5.​1 Density Matrix Treatment of Logical Entropy
5.​2 Linearizing Logical Entropy to Quantum Logical Entropy
5.​3 Theorems About Quantum Logical Entropy
5.​4 Quantum Logical Entropies with Density Matrices as
Observables
5.​5 The Logical Hamming Distance Between Two Partitions
5.​6 The Quantum Logical Hamming Distance
References
6 Conclusion
6.​1 Information Theory Re-founded and Re-envisioned
6.​2 Quantum Information Theory Re-envisioned
6.​3 What Is to Be Done?​
References
A Basics of Partition Logic
A.​1 Subset Logic and Partition Logic
A.​2 The Lattice Operations on Partitions
A.​3 Implication and Negation in Partition Logic
A.​4 Relative Negation in Partition Logic and the Boolean Core
A.​5 Partition Tautologies
References
Index
About the Author
David P. Ellerman
has a BS from MIT (Philosophy of Science), and two Masters
(Philosophy and Economics) and a PhD (Mathematics) from Boston
University. He is now associated with the University of Ljubljana,
Slovenia after retiring from 20 years of teaching Economics,
Mathematics, and Computer Science in the Boston area and after a stint
in the World Bank as Senior Advisor to the Chief Economist Joseph
Stiglitz. He has published articles in information theory, economics,
logic, mathematics, physics, philosophy, and law (see website www.​
ellerman.​org) and eight books including:
Putting Jurisprudence Back into Economics: On What Is Really Wrong
with Today’s Neoclassical Theory. Cham, Switzerland: SpringerNature.
2021.
The Uses of Diversity: Essays on Polycentricity. Lexington Books,
2020.
Intellectual Trespassing as a Way of Life: Essays in Philosophy,
Economics, and Mathematics. Lanham MD: Rowman & Littlefield. 1995.
Neo-Abolitionism: Abolishing Human Rentals in Favor of Workplace
Democracy. Cham, Switzerland: SpringerNature. 2021.
Helping People Help Themselves: From the World Bank to an
Alternative Philosophy of Development Assistance. Ann Arbor: University
of Michigan Press, 2005.
© The Author(s), under exclusive license to Springer Nature Switzerland AG 2021
D. Ellerman, New Foundations for Information Theory, SpringerBriefs in Philosophy
https://doi.org/10.1007/978-3-030-86552-8_1

1. Logical Entropy
David Ellerman1
(1) Faculty of Social Sciences, University of Ljubljana, Ljubljana,
Slovenia

For in the general we must note, That whatever is capable of a competent Difference, perceptible to any Sense, may be a sufficient Means whereby to express the Cogitations.
John Wilkins 1641

Abstract
This book presents a new foundation for information theory where the
notion of information is defined in terms of distinctions, differences,
distinguishability, and diversity. The direct measure is logical entropy
which is the quantitative measure of the distinctions made by a
partition. Shannon entropy is a transform or re-quantification of logical
entropy for Claude Shannon’s “mathematical theory of
communications.” The interpretation of the logical entropy of a partition is the two-draw probability of getting a distinction of the partition (a pair of elements distinguished by the partition), so it realizes a dictum of Gian-Carlo Rota: probability is to subsets as information is to partitions. Andrei

Kolmogorov suggested that information should be defined


independently of probability, so logical entropy is first defined in terms
of the set of distinctions of a partition and then a probability measure
on the set defines the quantitative version of logical entropy. We give a
history of the logical entropy formula that goes back to Corrado Gini’s
1912 “index of mutability” and has been rediscovered many times.
Keywords Information-as-distinctions – Logical entropy – History of
the formula

1.1 Introduction
This book develops the logical theory of information-as-distinctions.
The atom of information is: “This is different from that.” It can be seen
as the application of the logic of partitions [7] to information theory.
Partitions are dual (in a category-theoretic sense) to subsets. George
Boole developed the notion of logical probability [5] as the normalized
counting measure on subsets in his logic of subsets. This chapter
develops the normalized counting measure on partitions as the
analogous quantitative treatment in the logic of partitions—so that
measure will be called “logical entropy. ” That measure is a new logical
derivation of an old formula measuring diversity and distinctions, e.g.,
Corrado Gini’s index of mutability or diversity [10], that goes back to
the early twentieth century.1
This raises the question of the relationship of logical entropy to
Claude Shannon’s entropy [24] . The entropies are closely related since
they are both ultimately based on the concept of information-as-distinctions—but they represent two different ways to quantify distinctions. Logical entropy directly counts the distinctions (as defined
in partition logic) whereas Shannon entropy, in effect, counts the
minimum number of binary partitions (or yes/no questions) it takes,
on average, to make the same distinctions. Since that gives (in standard
examples) a binary code for the distinct entities, the Shannon theory is
perfectly adapted for applications to the theory of coding and
communications.
The logical theory and the Shannon theory are also related in their
compound notions of joint entropy, conditional entropy, and mutual
information. Logical entropy is defined as a (probability) measure on a set in the usual mathematical sense (e.g., it is non-negative) so, as with
any measure, the compound formulas satisfy the usual Venn diagram
relationships. The compound notions of Shannon entropy were defined,
without any mention of a set, so that they also satisfy similar Venn
diagram relationships. When extended to three or more partitions or
random variables, the Shannon mutual information can have negative
values. There is a more general measure notion, a signed measure, that
can take on negative values and signed measures also satisfy all the
usual Venn diagram relationships (where some areas in the diagrams
may be negative). Thus logical entropy is defined as a certain
(probability) measure on a set, while Shannon entropy is defined
independently of any given set. But given the Venn diagram
relationships between the compound Shannon entropies, a set can be
constructed ex post so that those Shannon entropies are the values of a
signed measure on that set.
There is also a monotonic but non-linear transformation of formulas
that transforms each of the logical entropy compound formulas into the
corresponding Shannon entropy compound formula, and the transform
preserves the Venn diagram relationships. This “dit-bit transform” is
heuristically motivated by showing how average counts of distinctions
(“dits”) can be converted into average counts of binary partitions (“bits”).

1.2 Logical Information as the Measure of Distinctions
There is now a widespread view that information is fundamentally
about differences, distinguishability, distinctions, and diversity. As
Charles H. Bennett, one of the founders of quantum information theory,
put it:
So information really is a very useful abstraction. It is the notion
of distinguishability abstracted away from what we are
distinguishing, or from the carrier of information. [3, p. 155]

This view even has an interesting history. In James Gleick’s book, The
Information: A History, A Theory, A Flood, he noted the focus on
differences in the seventeenth century polymath, John Wilkins , who
was a founder of the Royal Society. In 1641, the year before Isaac
Newton was born, Wilkins published one of the earliest books on
cryptography, Mercury or the Secret and Swift Messenger, which not only
pointed out the fundamental role of differences but noted that any
(finite) set of different things could be encoded by words in a binary
code.
For in the general we must note, That whatever is capable of a
competent Difference, perceptible to any Sense, may be a
sufficient Means whereby to express the Cogitations. It is more
convenient, indeed, that these Differences should be of as great
Variety as the Letters of the Alphabet; but it is sufficient if they
be but twofold, because Two alone may, with somewhat more
Labour and Time, be well enough contrived to express all the
rest. [27, Chap. XVII, p. 69]

Wilkins explains that a five-letter binary code would be sufficient to code the letters of the alphabet since 2^5 = 32.

Thus any two Letters or Numbers, suppose A.B. being transposed through five Places, will yield Thirty Two
Differences, and so consequently will superabundantly serve for
the Four and twenty Letters….[27, Chap. XVII, p. 69]

As Gleick noted:

Any difference meant a binary choice. Any binary choice began the expressing of cogitations. Here, in this arcane and
anonymous treatise of 1641, the essential idea of information
theory poked to the surface of human thought, saw its shadow,
and disappeared again for [three] hundred years. [11, p. 161]

Thus counting distinctions would seem the right way to measure information,2 and that is the measure which emerges naturally out of
partition logic—just as finite logical probability emerges naturally as
the measure of counting elements in Boole’s subset logic.
In addition to the philosophy of information literature [2], there is a
whole sub-industry in mathematics concerned with different so-called
“measures” of ‘entropy’ or ‘information’ [1, 26] that is long on formulas
and ‘intuitive axioms’ but short on interpretations and short on actually
being measures. Out of that plethora of definitions, logical entropy is
the measure (in the non-negative technical sense of measure theory) of
information that arises out of partition logic just as logical probability
theory arises out of subset logic.
The logical notion of information-as-distinctions supports the view
that the notion of information is independent of the notion of
probability and should be based on finite combinatorics. As Andrei
Kolmogorov put it:

Information theory must precede probability theory, and not be based on it. By the very essence of this discipline, the
foundations of information theory have a finite combinatorial
character. [15, p. 39]

Logical information theory precisely fulfills Kolmogorov’s dictum. It starts simply with a set of distinctions defined by a partition on a finite
set U, where a distinction is an ordered pair of elements of U in distinct
blocks of the partition. Thus the “finite combinatorial” object is the set
of distinctions (“ditset” ) or information set (“infoset” ) associated with
the partition, i.e., the complement in U × U of the equivalence relation
associated with the partition. To get a quantitative measure of
information, any probability distribution on U defines a product
probability measure on U × U, and the logical entropy is simply that
probability measure on the information set. This motivational
description of logical information theory will now be developed in
detail.

1.3 Classical Logical Probability and Logical Entropy
A partition π = {B_1, …, B_n} on a set U is a set of subsets of U, called the “blocks” of π, that are jointly exhaustive (union is U) and mutually exclusive (disjoint). A distinction or dit of a partition π is an ordered pair (u, u′) of elements of U that are in different blocks of π. The ditset dit(π) is the subset of U × U consisting of all the distinctions of π. A binary relation R ⊆ U × U on U is said to be a partition relation (also an apartness relation) if it is anti-reflexive, i.e., no self-pairs (u, u) are in R, symmetric, i.e., (u, u′) ∈ R implies (u′, u) ∈ R, and intransitive in the sense that if (u, u′) ∈ R, then for any sequence u = u_0, u_1, u_2, …, u_n, u_{n+1} = u′ of elements of U, at least one of the pairs (u_i, u_{i+1}) for i = 0, …, n must be in R. All ditsets are partition relations and all partition relations are the ditset of some partition on U. An indistinction or indit of π is an ordered pair (u, u′) of elements in the same block of π, so the inditset indit(π) is the set of indits of π. The inditset of a partition is the equivalence relation associated with π, i.e., the equivalence relation whose equivalence classes are the blocks of π. Equivalence relations and partition relations are complementary subsets of U × U.
George Boole [5] extended his logic of subsets to finite logical probability theory where, in the equiprobable case, the probability of a subset S (event) of a finite universe set U (outcome set or sample space) was the number of elements in S over the total number of elements: Pr(S) = |S|/|U|. Pierre-Simon Laplace’s classical finite probability theory [18] also dealt with the case where the outcomes u_j were assigned real point probabilities p = (p_1, …, p_n), so rather than summing the equal probabilities 1/|U|, the point probabilities of the elements were summed: Pr(S) = Σ_{u_j ∈ S} p_j—where the equiprobable formula is for p_j = 1/|U| for j = 1, …, n. The conditional probability of an event T ⊆ U given an event S is Pr(T|S) = Pr(T ∩ S)/Pr(S).

In Gian-Carlo Rota’s Fubini Lectures [23] (and in his lectures at MIT), he remarked, in view of the duality between partitions and subsets, that, quantitatively, the “lattice of partitions plays for information the role that the Boolean algebra of subsets plays for size or probability” [17, p. 30] or symbolically:

information : partitions ≈ probability : subsets.

Since “Probability is a measure on the Boolean algebra of events” that gives quantitatively the “intuitive idea of the size of a set”, we may ask by “analogy” for some measure to capture a property for a partition like “what size is to a set.” Rota goes on to ask:

How shall we be led to such a property? We have already an inkling of what it should be: it should be a measure of information provided by a random variable. Is there a candidate for the measure of the amount of information? [23, p. 67]

Our claim is quite simple; the analogue to the size of a subset is the size of the ditset, the set of distinctions, of a partition.3 The normalized size of a subset is the logical probability of the event, and the normalized size of the ditset of a partition is, in the sense of measure theory, “the measure of the amount of information” in a partition. Thus we define the logical entropy of a partition π = {B_1, …, B_n}, denoted h(π), as the size of the ditset dit(π) normalized by the size of U × U:

h(π) = |dit(π)| / |U × U|.

This is just the product probability measure of the equiprobable or uniform probability distribution on U applied to the information set or ditset dit(π). The inditset of π is indit(π) = ∪_{i=1}^{n} (B_i × B_i), so where Pr(B_i) = |B_i|/|U| in the equiprobable case, we have:

h(π) = (|U × U| − |indit(π)|) / |U × U| = 1 − Σ_{i=1}^{n} Pr(B_i)^2.

Consider the partition π = {{u_1}, {u_2, u_3}} on U = {u_1, u_2, u_3}. The set of distinctions is dit(π) = {(u_1, u_2), (u_1, u_3), …} where the ellipsis stands for the reversed ordered pairs. Hence |dit(π)| = 4 and h(π) = 4/9.

The corresponding definition for the case of point probabilities p = (p_1, …, p_n) is to just add up the probabilities of getting a particular distinction:

h(π) = Σ_{(u_i, u_j) ∈ dit(π)} p_i p_j.

Taking Pr(B) = Σ_{u_i ∈ B} p_i for each block B ∈ π, the logical entropy with point probabilities is:

h(π) = Σ_{B ≠ B′ in π} Pr(B) Pr(B′) = 1 − Σ_{B ∈ π} Pr(B)^2.
Instead of being given a partition π = {B_1, …, B_n} on U with point probabilities p_j defining the finite probability distribution of block probabilities (Pr(B_1), …, Pr(B_n)), one might be given only a finite probability distribution p = (p_1, …, p_n). Then substituting p_i for Pr(B_i) gives the:

Logical entropy of a finite probability distribution p: h(p) = 1 − Σ_{i=1}^{n} p_i^2.

Since 1 = (Σ_{i=1}^{n} p_i)^2 = Σ_i p_i^2 + Σ_{i ≠ j} p_i p_j, we again have the logical entropy as the probability Σ_{i ≠ j} p_i p_j of drawing a distinction in two independent samplings of the probability distribution p.
In the previous example of π = {{u_1}, {u_2, u_3}}, the probability distribution of block probabilities (with equiprobable points) is p = (1/3, 2/3) and so h(p) = 1 − ((1/3)^2 + (2/3)^2) = 4/9 as before. The distribution can be illustrated in the following logical entropy box diagram. The probabilities are measured off on the width and height so the whole area is 1. The areas of the squares along the diagonal are (1/3)^2 and (2/3)^2, so the logical entropy is the area off the diagonal, which is divided into two symmetrical halves as in Fig. 1.1.

Fig. 1.1 Logical entropy box diagram
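To make the two equivalent computations of logical entropy concrete, here is a minimal Python sketch (the function names are illustrative, not from the book) that computes h(π) for the example above both by counting the ditset and by the block-probability formula 1 − Σ Pr(B_i)^2.

```python
from itertools import product

def ditset(partition, U):
    """All ordered pairs of elements of U lying in different blocks."""
    block_of = {u: i for i, block in enumerate(partition) for u in block}
    return {(u, v) for u, v in product(U, repeat=2) if block_of[u] != block_of[v]}

def logical_entropy_counting(partition, U):
    """h(pi) = |dit(pi)| / |U x U| for equiprobable points."""
    return len(ditset(partition, U)) / len(U) ** 2

def logical_entropy_blocks(partition, U):
    """h(pi) = 1 - sum of Pr(B)^2 over the blocks B."""
    return 1 - sum((len(block) / len(U)) ** 2 for block in partition)

U = ['u1', 'u2', 'u3']
pi = [['u1'], ['u2', 'u3']]
print(logical_entropy_counting(pi, U))  # 4/9 ~ 0.444...
print(logical_entropy_blocks(pi, U))    # 4/9 ~ 0.444...
```

Both routes give 4/9, the two-draw probability of getting a distinction of π.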

That two-draw probability interpretation follows from the important fact that logical entropy is always the value of a probability measure. The product probability measure on the subsets S ⊆ U × U is:

μ(S) = Σ {p_i p_j : (u_i, u_j) ∈ S}.

Since dit(π) ⊆ U × U, we have that:

h(π) = μ(dit(π)).

And since dit(π ∨ σ) = dit(π) ∪ dit(σ), we have the joint logical entropy:

h(π ∨ σ) = μ(dit(π) ∪ dit(σ)).
Since we have an easy set-of-blocks definition for the join (the blocks of π ∨ σ are the nonempty intersections B ∩ C of blocks B ∈ π and C ∈ σ), we also have:

h(π ∨ σ) = 1 − Σ_{B ∈ π, C ∈ σ} Pr(B ∩ C)^2.

For the other operations on partitions (see the Appendix), the logical entropies are likewise obtained by applying the product probability measure μ to the corresponding ditsets.
There are also parallel “element ↔ distinction” probabilistic interpretations:
Pr(S) = Σ_{u_j ∈ S} p_j is the probability that a single draw from U gives an element u_j of S, and
h(π) = μ(dit(π)) is the probability that two independent (with replacement) draws from U give a distinction of π.

The duality between logical probabilities and logical entropies based on the parallel roles of ‘its’ (elements of subsets) and ‘dits’ (distinctions of partitions) is summarized in Table 1.1.

Table 1.1 Logical probability theory and logical information theory

Logical probability theory      | Logical information theory
Elements u ∈ U finite           | Dits (u, u′) ∈ U × U finite
Subsets S ⊆ U                   | Ditsets dit(π) ⊆ U × U
1-draw prob. of S-element       | 2-draw prob. of π-distinction

1.4 Information Algebras and Joint Distributions


Consider a joint probability distribution {p(x, y)} on the finite sample space X × Y (where to avoid trivialities, assume |X|, |Y| ≥ 2), with the marginal distributions {p(x)} and {p(y)} where p(x) = Σ_y p(x, y) and p(y) = Σ_x p(x, y). For notational simplicity, the entropies can be considered as functions of the random variables or of their probability distributions, e.g., h(p(x, y)) = h(X, Y), h(p(x)) = h(X), and h(p(y)) = h(Y). For the joint distribution, we have the:

Joint logical entropy: h(X, Y) = 1 − Σ_{x, y} p(x, y)^2,

which is the probability that two samplings of the joint distribution will yield a pair of distinct ordered pairs (x, y) ≠ (x′, y′), i.e., with an X-distinction x ≠ x′ or a Y-distinction y ≠ y′ (since ordered pairs are distinct if distinct on one or more of the coordinates). The logical entropy notions for the probability distribution {p(x, y)} on X × Y are all product probability measures μ(S) of certain subsets S ⊆ (X × Y) × (X × Y), where:

μ(S) = Σ {p(x, y) p(x′, y′) : ((x, y), (x′, y′)) ∈ S}.

These information sets or infosets are defined solely in terms of equations and inequations (the ‘calculus of identity and difference’) independent of any probability distributions.
For the logical entropies defined so far, the infosets are:

S_X = {((x, y), (x′, y′)) : x ≠ x′} with h(X) = μ(S_X),
S_Y = {((x, y), (x′, y′)) : y ≠ y′} with h(Y) = μ(S_Y), and
S_X∨Y = S_X ∪ S_Y with h(X, Y) = μ(S_X ∪ S_Y).
The infosets S_X and S_Y, as well as their complements4 S_¬X = (S_X)^c and S_¬Y = (S_Y)^c, generate a Boolean subalgebra ℐ(X × Y) of the powerset Boolean algebra of (X × Y) × (X × Y), which will be called the information algebra of X × Y. It is defined independently of any probability measure {p(x, y)} on X × Y, and any such measure defines the product measure μ on (X × Y) × (X × Y), and the corresponding logical entropies are the product measures on the infosets in ℐ(X × Y).
The four atoms S_X ∩ S_Y = S_X∧Y, S_X ∩ S_¬Y = S_X∧¬Y, S_¬X ∩ S_Y = S_¬X∧Y, and S_¬X ∩ S_¬Y = S_¬X∧¬Y in the information Boolean algebra are nonempty and correspond to the four rows in the truth table for the two propositions x ≠ x′ and y ≠ y′ and to the four disjoint atomic areas in the Venn diagram for the compound logical entropies as in Table 1.2.

Table 1.2 Truth table for atoms and some formulas in ℐ(X × Y)

Atoms      | x ≠ x′ | y ≠ y′ | X ≢ Y | X ⊃ Y
S_X∧Y      |   T    |   T    |   F   |   T
S_X∧¬Y     |   T    |   F    |   T   |   F
S_¬X∧Y     |   F    |   T    |   T   |   T
S_¬X∧¬Y    |   F    |   F    |   F   |   T
For n = 2 variables X and Y, there are 2^4 = 16 ways to fill in the T’s and F’s to define all the possible Boolean combinations of the two propositions, so there are 16 subsets in the information algebra ℐ(X × Y). The 15 nonempty subsets in ℐ(X × Y) are defined in disjunctive normal form by the union of the atoms that have a T in their row. For instance, the set S_X≢Y corresponding to the symmetric difference or inequivalence is S_X≢Y = S_X∧¬Y ∪ S_¬X∧Y.
The information algebra ℐ(X × Y) is a finite combinatorial structure defined solely in terms of X × Y using only the distinctions and indistinctions between the elements of X and Y. Any equivalence between Boolean expressions that is a tautology, e.g., X ∨ Y being equivalent to the disjoint disjunction (X ∧ Y) ∨ (X ∧ ¬Y) ∨ (¬X ∧ Y), gives a set identity in the information Boolean algebra, e.g., S_X∨Y = S_X∧Y ∪ S_X∧¬Y ∪ S_¬X∧Y. Since that union is disjoint, any probability distribution on X × Y gives the logically necessary identity μ(S_X ∪ S_Y) = μ(S_X∧Y) + μ(S_X∧¬Y) + μ(S_¬X∧Y) (see below).
In addition to the logically necessary relationships between the logical entropies, other relationships may hold depending on the particular probability distribution on X × Y. Even though all the 15 subsets in the information algebra aside from the empty set ∅ are always nonempty, some of the logical entropies can still be 0. Indeed, h(X) = 0 iff the marginal distribution on X has p(x) = 1 for some x ∈ X. These more specific relationships will depend not just on the infosets but also on their positive supports (which depend on the probability distribution), e.g., Supp(S_X) = {((x, y), (x′, y′)) : x ≠ x′ and p(x, y) p(x′, y′) > 0}, and similarly Supp(S_Y).

Now Supp(S_X) ⊆ S_X and Supp(S_Y) ⊆ S_Y, and for the product probability measure μ on (X × Y) × (X × Y), the sets S_X − Supp(S_X) and S_Y − Supp(S_Y) are of measure 0 so:

μ(Supp(S_X)) = μ(S_X) = h(X) and μ(Supp(S_Y)) = μ(S_Y) = h(Y).

Consider S_X⊃Y = S_X∧Y ∪ S_Y∧¬X ∪ S_¬X∧¬Y and suppose that the probability distribution gives μ(S_X∧¬Y) = 0 so that μ(S_X⊃Y) = 1. That means in a double draw of (x, y) and (x′, y′), if x ≠ x′, then there is zero probability that y = y′, so x ≠ x′ implies (probabilistically) y ≠ y′. In terms of the Venn diagram, the h(X) area is a subset of the h(Y) area, i.e., Supp(S_X) ⊆ Supp(S_Y) in terms of the underlying sets.
In the different setting of two partitions π and σ on the same sample space U = {u_1, …, u_n} with point probabilities p = (p_1, …, p_n), then instead of h(X), h(Y), and h(X, Y), we have h(π), h(σ), and h(π ∨ σ) given respectively by the product probability measure μ on dit(π), dit(σ), and dit(π ∨ σ) = dit(π) ∪ dit(σ).
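The product-measure description of the compound logical entropies can be checked directly in code. The sketch below uses a made-up joint distribution and hypothetical helper names; it computes h(X), h(Y), h(X, Y), and the measures of the atoms S_X∧Y, S_X∧¬Y, and S_¬X∧Y (the mutual and conditional logical entropies treated in Chap. 3) as product measures on the infosets, and verifies the logically necessary identity μ(S_X ∪ S_Y) = μ(S_X∧Y) + μ(S_X∧¬Y) + μ(S_¬X∧Y) noted above.

```python
from itertools import product

# A small made-up joint distribution p(x, y) on X x Y (probabilities sum to 1).
p = {('x1', 'y1'): 0.3, ('x1', 'y2'): 0.2, ('x2', 'y1'): 0.1, ('x2', 'y2'): 0.4}

def mu(in_infoset):
    """Product measure of the subset of (X x Y)^2 picked out by the predicate."""
    return sum(p[a] * p[b] for a, b in product(p, repeat=2) if in_infoset(a, b))

h_X  = mu(lambda a, b: a[0] != b[0])                          # mu(S_X)
h_Y  = mu(lambda a, b: a[1] != b[1])                          # mu(S_Y)
h_XY = mu(lambda a, b: a[0] != b[0] or a[1] != b[1])          # mu(S_X U S_Y)
m_XY = mu(lambda a, b: a[0] != b[0] and a[1] != b[1])         # mu(S_X & S_Y)
h_X_not_Y = mu(lambda a, b: a[0] != b[0] and a[1] == b[1])    # mu(S_X & S_notY)
h_Y_not_X = mu(lambda a, b: a[1] != b[1] and a[0] == b[0])    # mu(S_Y & S_notX)

# The atoms are disjoint, so the Venn-diagram identity holds exactly.
assert abs(h_XY - (m_XY + h_X_not_Y + h_Y_not_X)) < 1e-12
print(h_X, h_Y, h_XY, m_XY)
```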
1.5 Brief History of the Logical Entropy Formula
The logical entropy formula h(p) = Σ_{i ≠ j} p_i p_j = 1 − Σ_i p_i^2 is the probability of getting distinct values u_i ≠ u_j in two independent samplings of the random variable u. The complementary measure 1 − h(p) = Σ_i p_i^2 is the probability that the two drawings yield the same value from U. Thus h(p) = 1 − Σ_i p_i^2 is a measure of heterogeneity or diversity in keeping with our theme of information as distinctions, while the complementary measure Σ_i p_i^2 is a measure of homogeneity or concentration. Historically, the formula can be found in either form


depending on the particular context. The p i’s might be relative shares
such as the relative share of organisms of the ith species in some
population of organisms, and then the interpretation of p i as a
probability arises by considering the random choice of an organism
from the population.
According to I. J. Good, the formula has a certain naturalness: “If p_1, …, p_t are the probabilities of t mutually exclusive and exhaustive events, any statistician of this century who wanted a measure of homogeneity would have take about two seconds to suggest Σ p_i^2 which I shall call ρ.” [13, p. 561] As noted by Bhargava and Uppuluri [4], the formula 1 − Σ_i p_i^2 was used by Gini in 1912 [10] as a measure of “mutability” or diversity. But another development of the formula (in the complementary form Σ_i p_i^2) in the early twentieth century was in cryptography. The American cryptologist, William F. Friedman, devoted a 1922 book [9] to the “index of coincidence” (i.e., Σ_i p_i^2). Solomon

Kullback (see the Kullback-Leibler divergence treated later) worked as


an assistant to Friedman and wrote a book on cryptology which used
the index [16].
During World War II, Alan M. Turing worked for a time in the Government Code and Cypher School at the Bletchley Park facility in England. Probably unaware of the earlier work, Turing used ρ = Σ_i p_i^2 in his cryptoanalysis work and called it the repeat rate since it is the probability of a repeat in a pair of independent draws from a population with those probabilities (i.e., the identification probability Σ_i p_i^2). Polish cryptoanalysts had independently used the repeat rate in their work on the Enigma [21].
After the war, Edward H. Simpson, a British statistician, proposed Σ_{B ∈ π} Pr(B)^2 as a measure of species concentration (the opposite of diversity) where π is the partition of animals or plants according to species and where each animal or plant is considered as equiprobable. And Simpson gave the interpretation of this homogeneity measure as “the probability that two individuals chosen at random and independently from the population will be found to belong to the same group.” [25, p. 688] Hence 1 − Σ_{B ∈ π} Pr(B)^2 is the probability that a random ordered pair will belong to different species, i.e., will be distinguished by the species partition. In the biodiversity literature [22], the formula is known as “Simpson’s index of diversity” or sometimes, the “Gini-Simpson diversity index” [19]. However, Simpson along with I. J. Good
worked at Bletchley Park during WWII, and, according to Good, “E. H.
Simpson and I both obtained the notion [the repeat rate] from Turing.”
[12, p. 395] When Simpson published the index in 1948, he (again,
according to Good) did not acknowledge Turing “fearing that to
acknowledge him would be regarded as a breach of security.” [13, p.
562]
In view of the frequent and independent discovery and rediscovery of the formula Σ_i p_i^2 or its complement 1 − Σ_i p_i^2 by Gini, Friedman, Turing, and no doubt others, I. J. Good wisely advises that
Thus it is unjust to associate ρ with any one person. It would be better to use such names as “repeat rate” or “quadratic index of homogeneity” for ρ and perhaps “quadratic index of heterogeneity or diversity” for 1 − ρ. [13, p. 562]
Two elements from U = {u_1, …, u_n} are either identical or distinct. Gini introduced d_ij as the logical distance between the ith and jth elements where d_ij = 1 for i ≠ j and d_ii = 0 (i.e., the complement d_ij = 1 − δ_ij of the Kronecker delta). Since Σ_{i, j} d_ij p_i p_j = Σ_{i ≠ j} p_i p_j = 1 − Σ_i p_i^2, the logical entropy, i.e., Gini’s index of mutability, h(p) = Σ_{i ≠ j} d_ij p_i p_j, is the average logical distance between a pair of independently drawn elements. In 1982, C. R. (Calyampudi Radhakrishna) Rao generalized that by allowing other non-negative distances d_ij = d_ji for i ≠ j (but always d_ii = 0) so that the quadratic entropy [20] Q = Σ_{i ≠ j} d_ij p_i p_j would be the average distance between a pair of independently drawn elements from U. The logical entropy is also the quadratic special case of the Tsallis-Havrda-Charvat entropy formulas [14, 26].
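As a small illustration (the probabilities and the metrical distance matrix below are made up for the example), the following sketch computes Rao's quadratic entropy Q = Σ_{i ≠ j} d_ij p_i p_j and confirms that with the logical distance d_ij = 1 − δ_ij it reduces to the logical entropy 1 − Σ_i p_i^2.

```python
def quadratic_entropy(p, d):
    """Rao's quadratic entropy: average distance between two independent draws."""
    n = len(p)
    return sum(d[i][j] * p[i] * p[j] for i in range(n) for j in range(n) if i != j)

p = [0.5, 0.3, 0.2]

# Logical distance: d_ij = 1 for i != j, d_ii = 0 (complement of the Kronecker delta).
logical_d = [[0 if i == j else 1 for j in range(3)] for i in range(3)]
print(quadratic_entropy(p, logical_d))      # 0.62
print(1 - sum(pi ** 2 for pi in p))         # 0.62, the logical entropy

# Any other symmetric non-negative distances give a Rao quadratic entropy.
metric_d = [[0, 1, 4], [1, 0, 2], [4, 2, 0]]
print(quadratic_entropy(p, metric_d))
```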

References
1. Aczel, J., and Z. Daroczy. 1975. On Measures of Information and Their
Characterization. New York: Academic Press.
2.
Adriaans, Pieter, and Johan van Benthem, ed. 2008. Philosophy of Information. Vol.
8. Handbook of the Philosophy of Science. Amsterdam: North-Holland.
3.
Bennett, Charles H. 2003. Quantum Information: Qubits and Quantum Error
Correction. International Journal of Theoretical Physics 42: 153–176. https://doi.org/10.1023/A:1024439131297.
[Crossref]
4.
Bhargava, T. N., and V. R. R. Uppuluri. 1975. On an axiomatic derivation of Gini
diversity, with applications. Metron 33: 41–53.
5.
Boole, George. 1854. An Investigation of the Laws of Thought on which are founded
the Mathematical Theories of Logic and Probabilities. Cambridge: Macmillan and
Co.
6.
Ellerman, David. 2009. Counting Distinctions: On the Conceptual Foundations of
Shannon’s Information Theory. Synthese 168: 119–149. https://​doi.​org/​10.​1007/​
s11229-008-9333-7.
[Crossref]
7.
Ellerman, David. 2014. An introduction to partition logic. Logic Journal of the
IGPL 22: 94–125. https://​doi.​org/​10.​1093/​j igpal/​j zt036.
[Crossref]
8.
Ellerman, David. 2017. Logical Information Theory: New Foundations for
Information Theory. Logic Journal of the IGPL 25 (5 Oct.): 806–35.
9.
Friedman, William F. 1922. The Index of Coincidence and Its Applications in
Cryptography. Geneva IL: Riverbank Laboratories.
10.
Gini, Corrado 1912. Variabilità e mutabilità. Bologna: Tipografia di Paolo Cuppini.
11.
Gleick, James 2011. The Information: A History, A Theory, A Flood. New York:
Pantheon.
12.
Good, I. J. 1979. A.M. Turing’s statistical work in World War II. Biometrika 66:
393–6.
[Crossref]
13.
Good, I. J. 1982. Comment (on Patil and Taillie: Diversity as a Concept and its
Measurement). Journal of the American Statistical Association 77: 561–3.
[Crossref]
14.
Havrda, Jan, and Frantisek Charvat. 1967. Quantification Methods of
Classification Processes: Concept of Structural α-Entropy. Kybernetika (Prague)
3: 30–35.
15.
Kolmogorov, Andrei N. 1983. Combinatorial Foundations of Information Theory
and the Calculus of Probabilities. Russian Math. Surveys 38: 29–40.
[Crossref]
16.
Kullback, Solomon 1976. Statistical Methods in Cryptoanalysis. Walnut Creek CA:
Aegean Park Press.
17.
Kung, Joseph P. S., Gian-Carlo Rota, and Catherine H. Yan. 2009. Combinatorics:
The Rota Way. New York: Cambridge University Press.
[Crossref]
18.
Laplace, Pierre-Simon. 1995. Philosophical Essay on Probabilities. Translated by
A.I. Dale. New York: Springer Verlag.
19.
Rao, C. R. 1982a. Gini-Simpson Index of Diversity: A Characterization,
Generalization and Applications. Utilitas Mathematica B 21: 273–282.
20.
Rao, C. R. 1982b. Diversity and Dissimilarity Coefficients: A Unified Approach.
Theoretical Population Biology. 21: 24-43.
[Crossref]
21.
Rejewski, M. 1981. How Polish Mathematicians Deciphered the Enigma. Annals of
the History of Computing 3: 213–34.
[Crossref]
22.
Ricotta, Carlo, and Laszlo Szeidl. 2006. Towards a unifying approach to diversity
measures: Bridging the gap between the Shannon entropy and Rao’s quadratic
index. Theoretical Population Biology 70: 237–43. https://​doi.​org/​10.​1016/​j .​tpb.​
2006.​06.​003.
[Crossref]
23.
Rota, Gian-Carlo. 2001. Twelve problems in probability no one likes to bring up.
In Algebraic Combinatorics and Computer Science: A Tribute to Gian-Carlo Rota, ed.
Henry Crapo and Domenico Senato, 57–93. Milano: Springer.
[Crossref]
24.
Shannon, Claude E. 1948. A Mathematical Theory of Communication. Bell System
Technical Journal 27: 379–423; 623–56.
[Crossref]
25.
Simpson, Edward Hugh. 1949. Measurement of Diversity. Nature 163: 688.
[Crossref]
26.
Tsallis, Constantino. 1988. Possible Generalization of Boltzmann-Gibbs Statistics. J. Stat. Physics 52: 479–87.
[Crossref]
27.
Wilkins, John 1707 (1641). Mercury or the Secret and Swift Messenger. London.

Footnotes
1 Many of the results about logical entropy were developed in [6] and [8].

2 Logical information theory is about what Adriaans and van Benthem call
“Information B: Probabilistic, information-theoretic, measured quantitatively”, not
about “Information A: knowledge, logic, what is conveyed in informative answers”
where the connection to philosophy and logic is built-in from the beginning.
Likewise, this book is not about Kolmogorov-style “Information C: Algorithmic, code
compression, measured quantitatively.” [2, p. 11]

3 The lattice of partitions on U is isomorphically represented by the lattice of partition relations or ditsets on U × U, so in that sense, the size of the ditset of a partition is its ‘size.’

4 Note that S_¬X = (S_X)^c and S_¬Y = (S_Y)^c.


© The Author(s), under exclusive license to Springer Nature Switzerland AG 2021
D. Ellerman, New Foundations for Information Theory, SpringerBriefs in Philosophy
https://doi.org/10.1007/978-3-030-86552-8_2

2. The Relationship Between Logical Entropy and Shannon Entropy
David Ellerman1
(1) Faculty of Social Sciences, University of Ljubljana, Ljubljana,
Slovenia

Abstract
This chapter is focused on developing the basic notion of Shannon entropy and its interpretation in terms of distinctions, i.e., the minimum
average number of yes-or-no questions that must be answered to
distinguish all the “messages.” Thus Shannon entropy is also a
quantitative indicator of information-as-distinctions, and, accordingly, a
“dit-bit transform” is defined that turns any simple, joint, conditional,
or mutual logical entropy into the corresponding notion of Shannon
entropy. One of the delicate points is that while logical entropy is
always a non-negative measure in the sense of measure theory (indeed,
a probability measure), we will later see that for three or more random
variables, the Shannon mutual information can be negative. This means
that Shannon entropy can in general be characterized only as a signed
measure, i.e., a measure that can take on negative values.

Keywords Shannon entropy – Logical entropy – Dit-bit transform – Measure – Venn diagrams

2.1 Shannon Entropy


For a partition π = {B_1, …, B_n} with block probabilities p = (Pr(B_1), …, Pr(B_n)) (obtained using equiprobable points or with point probabilities), the Shannon entropy of the partition (using logs to base 2) is:

H(π) = Σ_{i=1}^{n} Pr(B_i) log_2(1/Pr(B_i)).

Or if given a finite probability distribution p = (p_1, …, p_n), the Shannon entropy of the probability distribution is:

H(p) = Σ_{i=1}^{n} p_i log_2(1/p_i).

The Shannon entropy is often presented as being the same as the Boltzmann entropy S = k ln(W); it is even called the “Shannon-Boltzmann entropy” by many authors. The story is that when presented with Shannon’s formula, John von Neumann suggested calling it “entropy” since it occurs in statistical mechanics—and since Shannon could always win arguments since no one really knows what entropy is. But it only occurs in statistical mechanics as a numerical approximation based on taking the first two terms in the Stirling approximation formula for ln(N!). The first two terms in the Stirling approximation for ln(N!) are: ln(N!) ≈ N ln(N) − N.
If we consider a partition on a finite U with |U| = N, with n blocks of size N_1, …, N_n, then the number of ways of distributing the individuals in these n boxes with those numbers N_i in the ith box is: W = N!/(N_1! × … × N_n!). The normalized natural log of W, (1/N) ln(W), is one form of entropy in statistical mechanics. Indeed, the formula S = k log(W) is engraved on Boltzmann’s tombstone.
The entropy formula can then be developed using the first two terms in the Stirling approximation:

(1/N) ln(W) = (1/N)[ln(N!) − Σ_i ln(N_i!)] ≈ (1/N)[N ln(N) − N − Σ_i (N_i ln(N_i) − N_i)] = (1/N)[N ln(N) − Σ_i N_i ln(N_i)] = Σ_i p_i ln(1/p_i) = H_e(p)

where p_i = N_i/N (and where the formula with logs to the base e only differs from the usual base 2 formula by a scaling factor). Shannon’s entropy H_e(p) is in fact an excellent numerical approximation to (1/N) ln(W) for large N (e.g., in statistical mechanics). But the common claim is that Shannon’s entropy has the same functional form as entropy in statistical mechanics, and that is simply false. Taking a few more terms would give a humongous formula that is a “more accurate approximation” [8, p. 2].
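The point that Shannon entropy (in nats) is only a large-N numerical approximation to the normalized Boltzmann count (1/N) ln(W) can be checked directly; the following sketch (illustrative function names, made-up block proportions) compares the two as N grows.

```python
import math

def normalized_log_W(N, proportions):
    """(1/N) * ln( N! / (N_1! ... N_n!) ) for block sizes N_i = p_i * N."""
    counts = [round(p * N) for p in proportions]
    log_W = math.lgamma(N + 1) - sum(math.lgamma(c + 1) for c in counts)
    return log_W / N

def shannon_entropy_nats(proportions):
    return sum(p * math.log(1 / p) for p in proportions)

p = (0.25, 0.75)
print(shannon_entropy_nats(p))            # ~0.5623 nats
for N in (20, 200, 2000, 20000):
    print(N, normalized_log_W(N, p))      # approaches ~0.5623 only as N grows
```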
There is another intuitive approach to Shannon entropy that might be mentioned. When an outcome with probability p_i occurs, it is said that the “surprise factor” is 1/p_i. These surprise factors then need to be averaged over the whole distribution p = (p_1, …, p_n). Since 1/p_i is the multiplicative inverse of p_i, the appropriate average could be taken as the multiplicative (or weighted geometric) average E(p) = Π_{i=1}^{n} (1/p_i)^{p_i} rather than the usual additive average. It should be noted that this multiplicative form of Shannon entropy is log-free, and one obtains the usual forms of Shannon entropy by taking the log to some base of the multiplicative form: H(p) = log_2(E(p)) and H_e(p) = ln(E(p)). Moreover, it is easily shown that for two random variables X and Y with the joint distribution {p(x, y)}, the multiplicative Shannon entropy E(X, Y) is multiplicative when X and Y are independent (p(x, y) = p(x) p(y)), i.e., E(X, Y) = E(X) × E(Y). Then taking the log of the multiplicative Shannon entropy yields the usual result that Shannon entropy is additive for independent random variables.
What is the interpretation of the multiplicative Shannon entropy? When a probability has the form p_0 = 1/n, then 1/p_0 = n has the “numbers-equivalent” interpretation as the number of equiprobable outcomes so that each has that probability p_0. Extending that to any p_0, the multiplicative Shannon entropy E(p) is the multiplicative average of the numbers-equivalents 1/p_i for the distribution p = (p_1, …, p_n). For instance, in the case of m equiprobable outcomes so that p_i = 1/m, E(p) = Π_{i=1}^{m} m^{1/m} = m, so H(p) = log_2(m). In the biodiversity literature, the multiplicative Shannon entropy is the number of equally-numbered species that would have the “same diversity” as a given distribution of proportions for the species.

Notice that if all N species are equally common, each is a proportion 1∕N of the total. Thus the [additive Shannon entropy] measure …equals log(N), so the [additive Shannon entropy] measure of equally common species is simply the logarithm of the number of equally common species, meaning that E [multiplicative Shannon entropy] equally common species would have the same diversity as the N unequally common species in our census. [7, p. 514]

The move from the multiplicative Shannon entropy to the additive form using logs to the base 2 means just recalibrating the multiplicative average numbers-equivalent to the number of 0, 1-bits it takes to yield that number, i.e., if E(p) = 2^c, then H(p) = log_2(E(p)) = c. For instance, it is the multiplicative Shannon entropy that occurs naturally in the usual statistical derivation of the usual (additive) Shannon entropy, and the usual entropy is obtained by recalibrating in bits—which establishes the connection in coding theory to the word length of a message in a binary code.
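A short sketch of the log-free, multiplicative form of Shannon entropy (illustrative function names, made-up distributions): it computes E(p) = Π_i (1/p_i)^{p_i}, checks that log_2 E(p) recovers the additive entropy H(p), and checks that E is multiplicative over an independent joint distribution.

```python
import math

def multiplicative_entropy(p):
    """E(p) = product of (1/p_i)^(p_i): the average numbers-equivalent."""
    return math.prod((1 / pi) ** pi for pi in p)

def shannon_entropy(p):
    return sum(pi * math.log2(1 / pi) for pi in p)

p_X = [0.5, 0.25, 0.25]
p_Y = [0.1, 0.9]
p_joint = [px * py for px in p_X for py in p_Y]   # independent joint distribution

print(multiplicative_entropy(p_X))                # 2^1.5 ~ 2.828
print(math.log2(multiplicative_entropy(p_X)),
      shannon_entropy(p_X))                       # both 1.5 bits
# Multiplicative over independence: E(X, Y) = E(X) * E(Y)
print(multiplicative_entropy(p_joint),
      multiplicative_entropy(p_X) * multiplicative_entropy(p_Y))
```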
For that statistical derivation, let p = (p_1, …, p_m) be the probabilities over an m-letter set of possible messages {a_1, …, a_m}. In a sequence of N messages, the probability of a particular sequence u_1 u_2 … u_N is Π_{i=1}^{N} Pr(u_i) where u_i could be any of the symbols in the alphabet so if u_i = a_j then Pr(u_i) = p_j. In a typical sequence, the jth symbol a_j will occur p_j N times (law of large numbers) so the probability of a typical sequence is (note the change of indices from the positions to the messages):

Π_{j=1}^{m} p_j^{p_j N} = (Π_{j=1}^{m} p_j^{p_j})^N = P^N.

Since the probability of a typical sequence is P^N for P = Π_{j=1}^{m} p_j^{p_j} = 1/E(p), the typical sequences are equiprobable. Hence the number of typical sequences is 1/P^N = E(p)^N and assigning a unique binary code to each typical sequence requires X bits where 2^X = E(p)^N so:

X = N log_2(E(p)) = N × H(p).

Hence the Shannon entropy H(p) = X/N is interpreted as the limiting average number of bits necessary per message. In terms of distinctions, this is the average number of binary partitions necessary to distinguish the messages.
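The "limiting average number of bits per message" interpretation can be illustrated with a small simulation (a sketch with a made-up message distribution, not the book's derivation): by the law of large numbers, −(1/N) log_2 of the probability of a sampled sequence of N messages converges to H(p), so a typical sequence needs about N × H(p) bits of binary code.

```python
import math
import random

p = {'a': 0.5, 'b': 0.25, 'c': 0.125, 'd': 0.125}
H = sum(q * math.log2(1 / q) for q in p.values())     # 1.75 bits per message

random.seed(0)
letters, weights = zip(*p.items())
for N in (100, 10_000, 1_000_000):
    seq = random.choices(letters, weights=weights, k=N)
    log2_prob = sum(math.log2(p[u]) for u in seq)     # log2 of the sequence probability
    print(N, -log2_prob / N)                          # approaches H = 1.75

print(H)
```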
Consider a three level binary tree where each branch splits into two equiprobable branches at each level as in Fig. 2.1. The 2^3 = 8 leaves are the messages, each with probability 1/8. The multiplicative Shannon entropy is the number of equiprobable messages E(p) = 2^3 = 8 and the Shannon entropy is the number of binary decisions or bits, H(p) = log_2(8) = 3, necessary to determine each message which, in this canonical example, is the length of the binary code for each message.

Fig. 2.1 Three level binary tree

If we think of the tree as a Galton bean machine with a marble falling down from the root and taking one of the branches with equal probability, then the probability of reaching any particular leaf is, of course, 1/8. The logical entropy is the probability that in two different trials the marble will reach different leaves:

h(p) = 1 − Σ_{i=1}^{8} (1/8)^2 = 1 − 1/8 = 7/8.

2.2 Logical Entropy, Not Shannon Entropy, Is a (Non-negative) Measure
Shannon entropy and the many other suggested ‘entropies’ (Rényi,
Tsallis, etc.) are routinely called “measures” of information in the
general sense of a real-valued quantification—but not in the sense of
measure theory. The formulas for mutual information, joint entropy,
and conditional entropy are defined so these Shannon entropies satisfy
Venn diagram formulas. As Lorne Campbell put it:

Certain analogies between entropy and measure have been noted by various authors. These analogies provide a convenient
mnemonic for the various relations between entropy,
conditional entropy, joint entropy, and mutual information. It is
interesting to speculate whether these analogies have a deeper
foundation. It would seem to be quite significant if entropy did
admit an interpretation as the measure of some set. [1, p. 112]

For any finite set U, a measure μ is a function μ: ℘(U) → ℝ such that:
1. μ(∅) = 0,
2. for any E ⊆ U, μ(E) ≥ 0, and
3. for any disjoint subsets E_1 and E_2, μ(E_1 ∪ E_2) = μ(E_1) + μ(E_2).

The standard usage in measure theory [4, 3] seems to be that a


“measure” is defined to be non-negative, and the extension to allow
negative values is a “signed measure.” That definition is used here
although a few authors [10] define a measure to allow negative values
and then call the restriction to non-negative values a “positive
measure.” The point to notice is that both measures and signed
measures can be represented as Venn diagrams (allowing negative
areas). As we will see, for three or more random variables, the Shannon
mutual information can have negative values—which has no known
interpretation. But there is an interesting difference. The logical
entropies are defined in terms of a probability measure on a set. The
compound Shannon entropies are defined ‘directly’ so as to satisfy the
Venn diagram relationships without any mention of a set on which the
Venn diagram is defined. As one author put it: “Shannon carefully
contrived for this ‘accident’ to occur” [11, p. 153]. But it is possible ex
post to then define a set so that the already-defined Shannon entropies
are the appropriate values on subsets of that set. The ex post
construction for the Shannon entropies was first carried out by Kuo Ting Hu [6] but was also noted by Imre Csiszar and Janos Körner [2], and redeveloped by Raymond Yeung [14]. Outside the context of
Shannon entropies, the underlying mathematical facts about additive
set functions and the inclusion-exclusion principle were known at least
from the 1925 first edition of the Polya-Szego classic [9] and Ryser’s
treatment of combinatorics [12]—all well-developed in [13].
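A negative value for the three-variable Shannon mutual information (the signed-measure phenomenon mentioned above and treated in Sect. 4.2) can be seen in a standard example that is not taken from the book: with X and Y independent fair bits and Z their exclusive-or, the inclusion-exclusion value I(X; Y; Z) = H(X) + H(Y) + H(Z) − H(X, Y) − H(X, Z) − H(Y, Z) + H(X, Y, Z) works out to −1 bit. A minimal sketch:

```python
import math
from collections import Counter
from itertools import product

# X, Y independent fair bits, Z = X xor Y: each triple (x, y, x^y) has probability 1/4.
joint = {(x, y, x ^ y): 0.25 for x, y in product((0, 1), repeat=2)}

def H(*axes):
    """Shannon entropy (bits) of the marginal on the given coordinate axes."""
    marginal = Counter()
    for outcome, pr in joint.items():
        marginal[tuple(outcome[i] for i in axes)] += pr
    return sum(pr * math.log2(1 / pr) for pr in marginal.values())

I3 = (H(0) + H(1) + H(2)
      - H(0, 1) - H(0, 2) - H(1, 2)
      + H(0, 1, 2))
print(I3)   # -1.0: the Shannon 'area' assigned to the triple overlap is negative
```

By contrast, the corresponding logical quantity is the product measure of S_X ∩ S_Y ∩ S_Z, which is non-negative by construction since logical entropy is an ordinary (non-negative) measure.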

2.3 The Dit-Bit Transform


The logical entropy formulas for various compound notions (e.g.,
conditional entropy, mutual information, and joint entropy) stand in
certain Venn diagram relationships because logical entropy is a
measure. The Shannon entropy formulas for these compound notions,
e.g., the Shannon mutual information I(X, Y) = H(X) + H(Y) − H(X, Y) [1], are defined so as to satisfy
the Venn diagram relationships. There is a deeper connection that helps
to explain the connection between the two entropies, the dit-bit
transform. This transform can be heuristically motivated by
considering two ways to treat the standard set U_n of n elements with the equal probabilities 1/n. In that basic case of an equiprobable

set, we can derive the dit-bit connection, and then by using a


probabilistic average, we can develop the Shannon entropy, expressed
in terms of bits, from the logical entropy, expressed in terms of
(normalized) dits.
Given U_n with n equiprobable elements, the number of dits (of the discrete partition on U_n) is n^2 − n so the normalized dit count is:

h(1/n) = (n^2 − n)/n^2 = 1 − 1/n.

That is the (normalized) dit-count or logical measure of the information in a set of n distinct elements (think of it as the logical entropy of the discrete partition on U_n with equiprobable elements).
But we can also measure the information in the set by the number of binary partitions it takes (on average) to distinguish the elements, and that bit-count is [5]:

H(1/n) = log_2(n).

The dit-bit connection is that the Shannon-Hartley entropy H(1/n) = log_2(n) will play the same role in the Shannon formulas that the dit-count h(1/n) = 1 − 1/n plays in the logical entropy formulas—when both are formulated as probabilistic averages or expectations.
The common thing being measured is an equiprobable U_n where n = 1/p_i. The dit-count for U_n is h(1/n) = 1 − 1/n and the bit-count for U_n is H(1/n) = log_2(n), and the dit-bit transform converts one count into the other. This dit-bit transform should not be interpreted as if it was


just converting a length using centimeters to inches or the like since it
is highly nonlinear.
We start with the logical entropy of a probability distribution p = (p_1, …, p_n):

h(p) = Σ_{i=1}^{n} p_i (1 − p_i).

It is expressed as the probabilistic average of the dit-counts or logical entropies 1 − p_i of the sets U_{1/p_i} with 1/p_i equiprobable elements. But if we switch to the binary-partition bit-counts of the information content of those same sets U_{1/p_i} of equiprobable elements, then the bit-counts are log_2(1/p_i) and the probabilistic average is the Shannon entropy: H(p) = Σ_{i=1}^{n} p_i log_2(1/p_i). Both entropies have the mathematical form of a probabilistic average or expectation:

Σ_{i=1}^{n} p_i × (amount of information in U_{1/p_i})

and differ by using either the dit-count or bit-count conception of information in U_{1/p_i}.
The graph of the dit-bit transform is familiar in information theory from the natural log inequality: ln(p) ≤ p − 1. Taking negatives of both sides gives the graph of the dit-bit transform, 1 − p ≤ ln(1/p), since it replaces the left-hand side 1 − p of the inequality by the right-hand side ln(1/p) to give the dit-bit transform: 1 − p ⇝ ln(1/p), as indicated in Fig. 2.2.


Fig. 2.2 The dit-bit transform (natural logs)
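A minimal sketch of the two expectations being compared (illustrative function names, made-up distribution): both entropies are the same probabilistic average, with the dit-count 1 − p_i of U_{1/p_i} swapped for the bit-count log_2(1/p_i).

```python
import math

def entropy_as_average(p, count):
    """Probabilistic average of a per-outcome information count."""
    return sum(pi * count(pi) for pi in p)

dit_count = lambda pi: 1 - pi                 # normalized dit-count of U_{1/p_i}
bit_count = lambda pi: math.log2(1 / pi)      # bit-count of U_{1/p_i}

p = [0.5, 0.25, 0.125, 0.125]
h = entropy_as_average(p, dit_count)   # logical entropy = 1 - sum p_i^2 = 0.65625
H = entropy_as_average(p, bit_count)   # Shannon entropy = 1.75 bits
print(h, 1 - sum(pi ** 2 for pi in p))
print(H)
```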

The dit-bit connection carries over to all the compound notions of entropy since it preserves Venn-diagrams. Since the logical notions are
the values of a probability measure, the compound notions of logical
entropy have the usual Venn diagram relationships. And then by the dit-
bit transform, those Venn diagram relationships carry over to the
compound Shannon formulas since the dit-bit transform preserves
sums and differences.
The logical entropy formula h(p) = Σ_i p_i (1 − p_i) (and the corresponding compound formulas) are put into that form of an average or expectation to apply the dit-bit transform. But logical entropy is the exact measure of the information set S = {(i, j) : i ≠ j} ⊆ {1, …, n} × {1, …, n} for the product probability measure μ where, for S ⊆ {1, …, n} × {1, …, n}, μ(S) = Σ {p_i p_j : (i, j) ∈ S}, i.e., h(p) = μ({(i, j) : i ≠ j}).

References
1. Campbell, L. Lorne. 1965. Entropy as a Measure. IEEE Trans. on Information
Theory IT-11: 112–114.
[Crossref]
2.
Csiszar, Imre, and Janos Körner. 1981. Information Theory: Coding Theorems for
Discrete Memoryless Systems. New York: Academic Press.
3.
Doob, J. L. 1994. Measure Theory. New York: Springer Science+Business Media.
[Crossref]
4.
Halmos, Paul R. 1974. Measure Theory. New York: Springer-Verlag.
5.
Hartley, R.V.L. 1928. Transmission of information. Bell System Technical Journal 7:
535–63.
[Crossref]
6.
Hu, Kuo Ting. 1962. On the Amount of Information. Probability Theory & Its
Applications 7: 439–447. https://​doi.​org/​10.​1137/​1107041.
[Crossref]
7.
MacArthur, Robert H. 1965. Patterns of Species Diversity. Biol. Rev. 40: 510–33.
[Crossref]
8.
MacKay, D. J. C. 2003. Information Theory, Inference, and Learning Algorithms.
Cambridge UK: Cambridge University Press.
9.
Polya, George, and Gabor Szego. 1998. Problems and Theorems in Analysis Vol. II.
Berlin: Springer-Verlag.
[Crossref]
10.
Rao, K. P. S. Bhaskara, and M. Bhaskara Rao. 1983. Theory of Charges: A Study of
Finitely Additive Measures. London: Academic Press.
11.
Rozeboom, William W. 1968. The Theory of Abstract Partials: An Introduction.
Psychometrika 33: 133–167.
[Crossref]
12.
Ryser, Herbert John. 1963. Combinatorial Mathematics. Washington DC:
Mathematical Association of America.
[Crossref]
13.
Takacs, Lajos. 1967. On the Method of Inclusion and Exclusion. Journal of the
American Statistical Association 62: 102–113. https://​doi.​org/​10.​1080/​01621459.​
1967.​10482891.
14.
Yeung, Raymond W. 1991. A New Outlook on Shannon’s Information Measures.
IEEE Trans. on Information Theory 37: 466–474. https://​doi.​org/​10.​1109/​18.​
79902.
[Crossref]


Father Martien.

"Oh, we carried them to England with us," I answered,


inwardly rejoicing to give an answer so little satisfactory.
"My mother sold them in London, and invested a part of the
money in a little estate at Tre Madoc—the Welles House.
The rest is in the hands of my lord, unless he has put it into
land."

My uncle stamped his foot and bit his lip with vexation.
It seems he thought his brother had left his treasures
concealed, and hoped by my means to lay hands upon
them. No doubt they would have made a very welcome
supply at that time.

"Are you telling the exact truth, daughter?" asked


Father Martien sternly.

"I am telling the exact truth, so far as I know it," I


answered, with some spirit. "There may be some of the
silver still concealed at the Tour d'Antin, but if so, I do not
know where it is."

"Where should you think it would be most likely to be


hidden?" was the next question.

"That I cannot tell either," I answered; "but I suppose


the vault under the tower would be as probable a place as
any."

"Is there not a vault under the old chapel?" asked


Father Martien.

"Yes," I answered; "a burial vault."

"Have you ever been in it?"

"Yes, once; at the burial of one of our servants," I


replied, availing myself of the orthodox Jesuit doctrine of
mental reservation.

"Do you think your father would be likely to place the


treasure in that vault?" was the next question.

"I do not know as to that," I answered. "I suppose he


might. But I rather imagine that what silver there was, was
buried somewhere about the orchard."

My uncle continued questioning me for some time, but


as I could not tell him what I did not know, he was not
much the wiser for my replies. He did not half believe that
we had carried off the jewels, and declared that he meant
to write to Lord Stanton on the subject of "my property," as
I called it.

"It is mine," I replied indignantly; "my mother left it to


me."
My uncle laughed contemptuously.

"Your mother had no more right to it than you have,"


said he. "Being married to your father, as they presume to
say, by a Protestant minister, the marriage is no marriage
by law. It was not worth a pin. You are an illegitimate child,
and as such have no rights whatever. My brother's
succession belongs to me, and I intend to have it."

"It is the truth, my daughter," said Father Martien, as I


looked at him. "The blasphemous parody of the holy
sacrament of marriage with which your wretched and guilty
parents were united was not only invalid but was of itself a
grievous crime in the eye of the law as well as of the
Church."

If I had had the power of life and death in my hands, I


should at that moment have laid both of these men dead at
my feet. In my rage, I actually looked around for a weapon,
and it was well for all parties that there was none at hand.
Then, as the conviction of my utter helplessness and
desolation came over me, I burst into an agony of tears.

"Hush, my daughter!" said the priest. "These tears do


not become you. Let your natural affection for your parents
be laid upon the altar of that church which they spurned,
and it may become a merit. Indulged, it is in your case a sin
against God and our holy religion."

My uncle, devout Catholic as he was, had not lost all


feeling, nor was he a man to be put down in his own house
even by a priest. He silenced the Father by a look, and then
set himself to soothe me, saying that though he had
thought it needful to tell me the truth, he should not visit
upon me the misfortune of my birth, but should continue to
regard me as a daughter so long as I showed myself dutiful
to him and to my aunt. He bade me retire to my room and
compose myself, saying that he would make my excuses.

"I am going into the country for a few days," said he.
"When I return I shall hope to find that you have recovered
your spirits and are prepared to submit to any arrangement
your friends may think it best to make for your welfare."

My uncle gave me his hand as far as the door of my


apartment, and parted from me with a fatherly salute,
recommending me to lie down and rest awhile. He was very
kind to me for the rest of the day, and seemed by his
manner to wish to make me forget the harshness he had
used toward me.

The next morning he called me into his own room and


put into my hands a letter he had written to my Lord
Stanton. It was to the effect that, having embraced the
Catholic faith, and being resolved in future to make my
home with my father's brother, I desired to have all my
property put into the hands of Monsieur de Fayrolles, my
natural guardian.

"You will copy this letter," said my uncle, "and I will


inclose it in one of my own to my Lord Stanton. If he is an
honest man, he will see the justice and wisdom of such an
arrangement. If he is not, I must take other measures, for I
am resolved not to be cheated of my right. Sit down here
and copy the letter."

I had nothing for it but to obey, my uncle all the time


standing by and observing me. When the copy was finished,
he inclosed both letters in an envelope, and was just about
sealing them, when my aunt called upon him. With an
expression of impatience, he laid down the unsealed letter
and went into the next room. In a moment I had turned
down the corner of the sheet and written in small
characters, "Don't, for Heaven's sake."

It was the work of a moment, and when my uncle


returned he found me reading a book I had taken from the
table. He reproved me for opening a book without leave,
but seeing that it was only a play of Monsieur Racine's I had
taken up, told me to keep it if I liked. He sealed the letter
without looking at it again, told me he was pleased with my
compliance, and gave me a gold piece to buy ribbons with,
as he said. I was not sorry to receive it, for I was already
turning over in my head plans of escape, and I knew that
any plan I could form would need money to carry it out.

My uncle was absent several days, and came back in


anything but a good humor. He had not succeeded in finding
the treasure, if there was any to find, neither had he
succeeded in letting the land. The house, having been
reduced by the fire to a mere empty shell, had partly fallen
in and filled up the cellars, while of the vault under the
chapel, he said the whole floor seemed to have sunk into
some abyss.

"So there really was a cavern underneath," I said.


"There was a tradition to that effect, and my father always
believed that such a cavern existed, and that it had some
connection with the sea."

"It might have a connection with the infernal regions,


judging from the sounds which proceed from it," said my
uncle. "I was near falling into it headlong. It is the more
vexatious because there are niches around the wall which
have evidently been built up—one even quite lately."

"And they are quite inaccessible?" said my aunt.


"Oh, entirely. The whole building is so ruinous that one
enters it only at the risk of one's life."

"The niches are only burial-places," I ventured to say,


thinking at the same time that poor Grace's grave would
now at least be safe from insult.

"Yes, but they may have been used for deposits of


another sort. However, there is no use in thinking more
about the matter. You are looking better, Vevette. I am glad
to see you try to put on a more cheerful face. Your
countenance lately has been a perpetual kill-joy—fit only for
a convent of Carmelites."

Indeed, my health had improved. The very thought of


escape, impracticable as it seemed at present, had put new
life into me. I began to take a little care of myself, and to be
anxious to acquire strength.

"I do not think, my friend, that the convent of the


Carmelites will be Vevette's vocation," said my aunt,
smiling. "I have an affair of great importance to lay before
you when we are at leisure."

A cold chill struck to my heart as I heard these words


and guessed what they might mean. The event proved that
my forebodings were well founded. There was a certain
Monsieur de Luynes, an elderly gentleman of good family,
and very wealthy, who often visited my aunt, being indeed
some sort of connection. This gentleman had lost his wife
many years before, and having married off all his
daughters, he had conceived the idea of providing a
companion and nurse for his declining years. He was
hideously ugly—tall, shambling, with bushy gray eyebrows,
and a great scar on his cheek which had affected the shape
of one of his eyes; but his manners were amiable and kind,
and he had the reputation of leading a remarkably good life.
He had always taken a good deal of notice of me, and had
once or twice drawn me into conversation as I sat at my
aunt's side, and I had thought him very agreeable. It was
this Monsieur de Luynes who now made a formal proposal
for my hand. I was not at all consulted in the matter. I was
simply called into my aunt's boudoir, told of the proposal
which had been made, and ordered to consider myself the
future wife of Monsieur de Luynes.

"There is no reason for any delay," said my aunt.


"Monsieur, who is himself very wealthy, does not ask for any
dot with you. The trousseau can be prepared in a few days,
and I will engage it shall be a fine one. You will be a happy
woman, petite."

"Yes indeed; you may consider yourself most fortunate,"


added my uncle. "Considering the misfortune of your birth
and your state of poverty and dependence, it is a match far
beyond anything we have a right to expect for you. It will
give me great pleasure to see you established at the head
of Monsieur de Luynes' fine house before I leave Paris."

My resolution was taken in a moment. If Monsieur de


Luynes' offer had come to me at the beginning of my
residence in France, I should instantly have accepted it, and
rejoiced in the opportunity of doing so. But my mind had
changed. I know not how, but I was just as certain that
Andrew had remained faithful to me as if he himself had
told me, and being so assured, I would have suffered
myself to be thrown into the fire rather than marry any one
else. I waited till my uncle had done speaking, and then,
with a calmness which amazed myself, I told him of my
determination.
"Tut!" said he. "Let me hear no such girlish folly. You
will do what I consider best for you, and take care you do it
with a good grace or it will be the worse for you."

"Nay, do not be severe with Vevette," said my aunt. "All


girls think it necessary to put on such airs and make such
declarations. Leave her to me."

Left to my aunt I was, and I set myself to soften her


heart toward me. I begged only to be allowed to remain
single, promising to be guided by her in everything else—to
perform any menial service, to work my fingers to the bone.
All was in vain. My aunt laughed at my entreaties,
considering them only as the wilfulness of a child; told me
the time would come when I would thank her for not
yielding to my folly. Finally losing patience, as I continued
weeping, she let me feel the iron hand masked under the
velvet glove. She told me in severe tones that my wilfulness
was unbearable, and that unless I gave way and did what
was thought best for me, I should be sent to that same
Carmelite convent to be brought to my senses.

"We have wished to be kind to you," she added; "but


there are means of subduing refractory girls which the good
sisters well know how to practise, and of which you shall
make trial if you are disobedient. Now go to your room, dry
your eyes, and let me see you looking your best when
Monsieur de Luynes comes this evening."

I had nothing for it but to comply. My resolution was


fixed as ever. They might send me to the Carmelites, starve
me, bury me alive if they would, but I would never marry—
never! However, I thought best to temporize. The evening
found me dressed, my aunt herself looking over my toilette
and commending my docility.
"I thought you would see the propriety of giving way,"
said she; "and I am glad for your sake you have done so.
You would not have liked to be shut up alone in the charnel-
house of the convent, without light or food for twenty-four
hours together, as happened to a cousin of my own who set
herself up against her father's authority. No, it is much
better to be in my salon than in the company of mouldy
skeletons."

I held my tongue; but I could have said that I should


have preferred the society of the mouldiest Carmelite ever
buried in sackcloth to that of Monsieur de Luynes. The kind
old man was very attentive to me, made many gallant
speeches, and presented me with a magnificent box of
bonbons and preserved fruits, containing also a beautiful
pearl clasp. I almost wished I could have loved him, and
indeed if my heart had not been full of another, I believe I
should have married him, if only to escape from my present
state of servitude. But there it was: I loved Andrew. I
should always love him, and I could never marry any one
else, whether I ever saw him again or not.

Under ordinary circumstances, I should never have


been left alone with my intended bridegroom till after the
ceremony; but my aunt had a great opinion of the
discretion and goodness of Monsieur de Luynes, which
indeed he well deserved. She also trusted a good deal, I
fancy, to his powers of persuasion, for she allowed him
more than once to remain tête-à-tête with me for an hour
or two at a time in the little salon, while she entertained her
visitors or gave audience to the tradespeople who were
busied with my wedding outfit. On one of these occasions I
took a desperate resolution and opened to Monsieur de
Luynes my whole heart. Monsieur tried hard to shake me,
promising me every sort of good, and even going so far as
to hint that I should, in the course of nature, outlive him;
and then, being a widow, I could go where I liked and do as
I pleased.

Finding, however, that even this agreeable prospect


failed to move me, and that I was settled in my resolution,
after two or three interviews, he bade me farewell with
much kindness, and going to my uncle formally retracted
his suit, saying that he would never wed an unwilling bride.

My aunt's anger was loud and voluble; my uncle's more


silent and much more terrible. He said little except to bid
me retire to my room. Here I remained till evening, without
notice of any kind.

That night my lodging was changed to a bare attic at


the top of the house, lighted only by a window in the roof,
and furnished with a pallet bed, a straw chair, and a crucifix,
with its vessel of holy water underneath. Into this cell, I
was locked by my uncle's own hands, and here I remained
prisoner for a fortnight, seeing nobody but my aunt's
women, who once a day brought me a meagre supply of
coarse food. I had but one companion—an ugly gray cat,
which lived in the neighboring garret, and made her way to
my cell through a hole in the wainscot, attracted, I suppose,
by the smell of my soup. She shared my meals by day and
my bed at night, and, I doubt not, sincerely regretted my
departure. I have always loved and patronized ugly gray
cats for her sake.

I was happier in this garret than I had been before in a


long time. I had lived absolutely without prayer ever since
my illness, for my repetitions of the rosary might as well
have been repetitions of "Cruel Barbara Allen," for all the
devotion there had been in them. But somehow my firm
decision not to marry any one but my first love had brought
help and comfort to me. It had been a step in the right
direction.

When first locked into my prison cell, I had thrown


myself on my knees and besought help from heaven to hold
firm my resolution. That prayer had opened the way for
others. I began to review my life and sincerely to repent the
sins which had brought me into such straits. I saw and
recognized the fact that the double-mindedness which had
always been my bane, had in this instance lain at the root
of my apostasy. I confessed the justice of my Heavenly
Father, and was enabled wholly to surrender myself into his
hands for time and eternity; and I received comfort, and
even joy, such as I had never found before.

At the end of a fortnight, my uncle visited me again and


inquired whether I were now ready to submit my will to his.
Modestly, I hope, but certainly with firmness, I declared my
determination unchanged, and was ordered instantly to
prepare for a journey.
CHAPTER XX.
"YOU SHALL HAVE NO CHOICE."

I EXPECTED nothing less than to be taken at once to the


Carmelite convent with which I had been threatened. I was
therefore agreeably surprised when, on being led to the
courtyard I found all my uncle's servants assembled, and
his own travelling carriage waiting, with my aunt already
seated in it. There were three vacant places, one of which I
was desired to take, while my uncle placed himself in the
other, and my aunt's gentlewoman took the fourth. There
was a great deal of running back and forth for small
packages which had been forgotten, a great deal of ordering
and counter-ordering, of pulling at straps and examining of
buckles; but at last all was ready, and we set out. My aunt
had not spoken to me at all, but as we passed a fine house
which was being newly repaired and decorated she broke
out with—

"And to think, ungrateful girl, that all that might have


been yours, and you must throw it all away for a whim!"

I made no answer, for there was nothing to be said.

"There is no use in talking further on that matter,


madame," said my uncle. "Vevette has made her decision,
and she must abide the consequences. Henceforth she will
have no choice as to what she will do. All will be decided for
her, and it is possible she may come to regret Monsieur de
Luynes."

"That may well be, my uncle, since Monsieur de Luynes


was a true friend, who did not expect to gain any hidden
treasures by his kindness," I answered. "But I shall never
regret having acted honorably by him, whatever happens."

My uncle bit his lip, as well he might, and I saw the


waiting-woman look out of the window to hide a smile. She
knew all about my uncle's Journey to Normandy, and, like
others of her class, she enjoyed a hit at her betters.

"Be silent!" said my uncle sternly. "Nobody wishes to


hear the sound of your voice. Speak only when you are
spoken to."

I obeyed, and, indeed, I had no inclination to talk. The


morning was beautiful, and the spring was just coming on,
and, forlorn as it looked, I was delighted to see the open
country once more, and to breathe an air not poisoned with
the thousand and one smells—not to use a stronger word—
of Paris. The king could indeed crush and impoverish his
poor people to maintain his armies and his mistresses, but
he could not hinder the wild flowers from blooming nor the
birds from singing. My spirits rose insensibly, and I more
than once caught myself on the point of breaking out into a
song.

My uncle sat back in the corner and said nothing. My


aunt kept up a perpetual prattle with Susanne, now
bewailing her banishment from Paris and the court, now
remarking upon this or that fine lady, and listening to the
tittle-tattle in which Susanne was a proficient.

At last my uncle said he would ride on horseback a


while, so his groom was called up with the spare horse, and
we women were left to ourselves. Then my aunt fell upon
me, and such a rating as she gave me! I have heard English
women scold, but I never heard any fishwoman equal my
elegant, double-refined aunt, Madame de Fayrolles. She
worked herself up into such a passion that she told a good
deal more than she meant, and thus I learned that my Lord
Stanton had returned a very short and sharp answer to
Monsieur de Fayrolles' letter, absolutely refusing to let him
have any of the property intrusted to him, and requiring
that I should at once return to England.

So my friends had not quite forgotten or forsaken me.


That was some comfort, but I dared not say so. My aunt
went on, growing more and more excited, till she ended
with—

"And when I thought at least I should have you to help


me at Fayrolles, to draw patterns for my embroidery, and
sing and read aloud to me, I cannot even enjoy that."

"But indeed, aunt, I will do anything for you—singing or


reading, or whatever you please," I said soothingly, for I
was afraid of one of her fits of nerves. "There is nothing in
my power that I will not do for you if you will let me, at
Fayrolles or anywhere else. You know we agreed to work
chairs upon satin for the salon, and I have several patterns
drawn already."

"Yes, but you won't be at Fayrolles!" said my aunt


between her sobs. And then, catching a warning glance
from Susanne, she said no more.

Then I was not to go to Fayrolles! What did they mean


to do with me? To send me back to England? That was not
likely, after what my uncle had said about my having no
choice. Probably I should be placed in some country
convent, where I should be out of reach of all help,
whatever happened, and where no one would ever hear
from me again. This was what I dreaded of all things. I had
almost given up any belief in the faith I had lately
professed, and the question occurred to me whether I ought
not openly to confess the change which had come over me.
I knew only too well what such a confession involved—
either a life-long imprisonment or a horrible death—perhaps
being left to perish by inches in some underground cell,
amid rats and vermin. Such things happened all the time.

Worse even than that, I knew that many of the


convents were sinks of iniquity—places of resort for idle
young gentlemen and wickeder women, like that of Port
Royal, which afterward passed through so many
vicissitudes. I am very far from saying that they were all of
this character, but a great many of them were so, even
taking the accounts of Roman Catholics themselves.* A
residence in such a community was no pleasant prospect.

* See Racine's "Memoirs of Port Royal." Letters of St.


Francis de Sales, and almost any free-spoken memoirs of
the time.

And was I, after all, ready to die for my faith? Had I


indeed any assured faith to die for? Might not Father
Martien be right after all? My mind was tossed upon a sea of
doubt and conjecture, and for a time found no rest; but at
last I was enabled to pray, and to cast myself, more
completely than I had ever done before, upon the arms of
mercy. I asked for light and help above all things, and light
and help were given me, not all at once, but by degrees. I
became sensible of a sweet calm and clearness of mind, in
which I saw all things more plainly. I felt sure that my many
sins had been forgiven and washed away, and that when
the time came for action, I should have strength given me
to act for the best.
I had plenty of time for my own thoughts, for my uncle
soon reentered the carriage, and after that my aunt did not
venture to speak to me again, though she talked at me
whenever there was a chance. She was a woman who bore
discomfort of any kind very ill, and the more weary she
grew with her journey, the more unbearable grew her
peevish fretfulness. At last my uncle was moved to speak
sharply to her, whereupon she fell into one of her nervous
fits, and I had to exert all my skill to keep her from
throwing herself out of the carriage. With much expenditure
of coaxing and soothing I got her quieted at last, and
persuaded her to take some refreshment, after which she
fell asleep.

I fancy my attentions softened her heart toward me, for


she was much more kind to me during the rest of the day,
and I thought even interceded for me with her husband; but
if so it was without avail, and even increased my troubles.

For the whole of the next day I travelled with my former


maid, Zelie, and the old woman I had spoken of. They
understood my disgrace well enough, and did all in their
power to make me feel it, treating me with the utmost
insolence and neglect, so that at the inns where we
stopped, I had the most wretched lodgings imaginable, and
really went hungry while my jailers, for such they were,
feasted upon dainties at my uncle's expense. In this,
however, they overshot the mark and brought themselves
into trouble. My uncle, remarking in the morning upon my
extreme paleness, asked whether I was ill.

"No, monsieur," I answered, "I am not ill, but I am


hungry. I have had not a morsel since yesterday noon but
some crusts of mouldy black bread, which I could not have
eaten if I had been starving."
Monsieur turned angrily upon Zelie, who stammered
and denied and charged me with falsehood; but my uncle
knew me well enough to believe what I said, and my face
spoke for itself. I was once more removed to my aunt's
carriage, and fared as she did, and the rest of the way was
more comfortable. Susanne had always been friendly to me.

During my imprisonment, she had more than once


smuggled comforts into my cell, and when we were alone
together she spoke to me with kindness and pity. My aunt's
heart evidently softened to me more and more, but my
uncle was implacable. To cross him once was to make him
an enemy forever; I had disappointed him in every way,
and he meant to make me feel the full force of his
displeasure.

I had gathered from the servants that our first


destination was Marseilles. As we drew near that city, we
passed company after company of unhappy wretches
destined for the galleys, laboring along, chained together,
and driven like cattle to the slaughter. Many of them were
condemned for no crime but that of having attended a
preaching, or prayed in their own families—that of being
Protestants, in short—and these were linked oftentimes to
the most atrocious criminals, whose society must have been
harder to bear than their chains. But more than once or
twice man's cruelty was turned to the praise of God, and
the criminal was converted by the patience and the
instructions of his fellow.

As we passed one of these sad cavalcades my uncle


stopped the coach to ask some questions, and I was
brought face to face with one I knew right well. It was an
aged preacher who had long known my father, and had
often been at our home in Normandy. I had no mind to have
him recognize me, and I turned away my face to hide my
overflowing tears. Monsieur did not at first recognize the old
preacher, but the other knew him in a moment, and called
him by name.

"What, Monsieur Morin, is this you!" said my uncle. "I


thought you were dead!"

"I soon shall be," answered the old man calmly.


"Happily for me, I am more than seventy years old, and my
prison-doors must soon be opened. Then I shall receive my
reward; but you—ah, Henri, my former pupil, whom I so
loved, how will it be with you? Oh, repent, while there is yet
time! There is mercy even for the denier and the apostate!"

For all answer, my uncle, transported with rage, lifted


his cane and struck the old man a severe blow. The very
criminals cried shame upon him, and the young officer in
charge hastened to the spot, and with expressions of pity
offered his own handkerchief to the poor old man, whose
brow was cut and bleeding.

"Well," said my uncle, turning to me, and seeing, I


suppose, what I thought, "how do you like the way your
former friends are treated? How would you like to share
their lot?"

"I would rather be that old man than you, monsieur!" I


returned, on fire with indignation. "I would rather be the
helpless prisoner than the coward that abuses him!"

"Coward!" repeated my uncle, white with rage.

"Dastard, if you like it better!" I returned, reckless of


consequences. "To strike a helpless man is cowardly; to
strike an old and feeble man is dastardly!"

You might also like