Computational Materials Science 217 (2023) 111894

Contents lists available at ScienceDirect

Computational Materials Science


journal homepage: www.elsevier.com/locate/commatsci

Full length article

Graph neural networks for efficient learning of mechanical properties of polycrystals
Jonathan M. Hestroffer a,∗, Marie-Agathe Charpagne b, Marat I. Latypov c,d,∗, Irene J. Beyerlein a,e

a Materials Department, University of California, Santa Barbara, CA, USA
b University of Illinois at Urbana-Champaign, Urbana, USA
c Department of Materials Science and Engineering, University of Arizona, Tucson, AZ 85721, USA
d Graduate Interdisciplinary Program in Applied Mathematics, University of Arizona, Tucson, AZ 85721, USA
e Department of Mechanical Engineering, University of California, Santa Barbara, Santa Barbara, CA, USA

ARTICLE INFO

Dataset link: https://github.com/jonathanhestroffer/PolyGRAPH

Keywords: Titanium, Texture, Graphs, Deep learning, Strength, Stiffness

ABSTRACT

We present graph neural networks (GNNs) as an efficient and accurate machine learning approach to predict mechanical properties of polycrystalline materials. Here, a GNN was developed based on a graph representation of polycrystals incorporating only fundamental features of grains, including their crystallographic orientation, size, and grain-neighbor connectivity information. We tested our method on modeling stiffness and yield strength of 𝛼-Ti microstructures varying in their crystallographic texture. We find the GNN predicts both properties with high accuracy, with mean relative errors of ∼1% for unseen microstructures from a given set of textures and <2% for microstructures of unseen texture, even when presented with limited training data. This accuracy is comparable to methods that require high-resolution three-dimensional (3D) microstructure data, such as 3D convolutional neural networks (3D-CNNs) and models that depend on the computation of spatial statistics. The present results show that graph-based deep learning is a promising framework for property prediction, especially considering the high cost associated with obtaining high-resolution 3D microstructure data and the general scarcity of experimental materials datasets.

1. Introduction

A crucial element in materials design for advanced structural components is the prediction of mechanical properties of materials based on their microstructure. In the case of polycrystalline materials, many microstructural features are known to affect the mechanical response, such as grain size, shape, and crystallographic orientation, to name a few. Explicit account for crystallographic orientation and grain morphology, along with grain-neighbor interactions and non-uniform intragranular stress fields, is possible with full-field, spatially resolved three-dimensional (3D) polycrystalline techniques such as crystal plasticity finite element (CPFE) or fast Fourier transform (CPFFT) methods [1]. While powerful, these computational models come at a significant computational cost [2–5]. The problematic trade-off between explicit, complete 3D polycrystalline microstructure representation and computational cost calls for data-driven surrogate models that can consider microstructure details and replace computationally expensive micromechanical simulations.

Currently, 3D convolutional neural networks (3D-CNNs) [6–10] and statistical learning models (also referred to as materials knowledge systems, or MKS) [11–15] are the prevalent approaches to surrogate modeling with explicit account for microstructure. Belonging to a class of deep neural networks, 3D-CNNs automatically learn relevant spatial correlations in microstructure through repeatedly applied spatial convolutions. Statistical learning models, on the other hand, are machine learning models based on microstructure statistics and depend on the computation of spatial statistics (e.g., 𝑛-point correlations) as features [16]. Both approaches have been increasingly applied to model mechanical properties, and are capable of both local and effective property prediction, as demonstrated for heterogeneous (e.g., two-phase composite) and polycrystalline materials [17–22]. These models have also been shown to be robust and orders of magnitude faster than physics-based simulations.

A promising addition to these state-of-the-art frameworks is graph-based deep learning [23], a set of techniques that generalize the operations of structured deep learning models, such as CNNs, to the non-Euclidean domain of graphs. Graphs present a lightweight, versatile, and highly interpretable data structure for digitizing polycrystalline microstructure information. Typically, microstructure graphs consist of nodes representing individual grains, while graph edges connecting the nodes represent the grain boundaries that they share [24].

∗ Corresponding authors.
E-mail addresses: jonathanhestroffer@ucsb.edu (J.M. Hestroffer), latmarat@arizona.edu (M.I. Latypov).

https://doi.org/10.1016/j.commatsci.2022.111894
Received 28 June 2022; Received in revised form 8 October 2022; Accepted 31 October 2022
Available online 12 November 2022
0927-0256/© 2022 Elsevier B.V. All rights reserved.

Most importantly, through this representation, graphs naturally capture


topological characteristics of microstructures without the need for 3D
high-resolution spatial microstructure data. This makes graphs particu-
larly suitable representations of microstructure when high-resolution
spatial information is not experimentally accessible. Far-field high-
energy X-ray diffraction microscopy (HEDM) is a representative case,
where higher confidence is placed on measurements of grain centroid,
and crystallographic orientation, but less so on exact microstructure
geometry [25–27]. Graphs offer effective representation of microstruc-
ture data consistent with such measurements, which also maintains
information about the grain neighborhood and connectivity. Even sim-
ple structural analyses of microstructure graphs, outside of any ma-
chine learning, have yielded important insight into structure–property
relationships. Examples include analysis of node degree (number of
neighboring nodes) and node eccentricity (maximum distance to any
other node) to predict strain localization and fatigue hotspots in nickel-
base superalloy René 88DT [24] and analyzing various measures of
centrality (node importance) to reveal extended networks of grains that
drive plastic strain in Ti-7Al [28].
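Graph measures such as node degree and eccentricity are simple to compute even outside any machine learning pipeline. A minimal pure-Python sketch on a toy grain-adjacency graph (the adjacency values below are illustrative, not data from the cited studies):

```python
from collections import deque

# Toy grain-adjacency graph: grain id -> neighboring grain ids
# (illustrative values, not data from the cited studies).
adj = {
    0: [1, 2],
    1: [0, 2, 3],
    2: [0, 1, 3],
    3: [1, 2, 4],
    4: [3],
}

def degree(adj, v):
    """Node degree: number of neighboring grains."""
    return len(adj[v])

def eccentricity(adj, v):
    """Maximum shortest-path distance from grain v to any other grain."""
    dist = {v: 0}
    queue = deque([v])
    while queue:                      # breadth-first search from v
        u = queue.popleft()
        for w in adj[u]:
            if w not in dist:
                dist[w] = dist[u] + 1
                queue.append(w)
    return max(dist.values())

print(degree(adj, 1))        # 3
print(eccentricity(adj, 4))  # 3: grain 4 sits on the graph's periphery
```

High-degree grains have many boundary neighbors, while low-eccentricity grains sit near the topological center of the microstructure, which is the kind of structural signal the cited hotspot analyses exploit.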
Graph neural networks (GNNs) have become a widely applied deep learning method given the expressive power of graph data, with well-known applications including recommendation systems, traffic prediction, and molecule design [29–32]. Operating much like CNNs, GNNs adaptively learn multiscale localized spatial features of graph data through layers of spatial graph convolution operations [32]. GNNs are also very versatile, capable of producing low-dimensional vector embeddings at multiple scales for graph-, edge-, or even node-level inference tasks with little or no change to model architecture or convolution algorithm [33–35]. Only recently have GNNs been used for modeling microstructure–property relationships, with some early successes being the prediction of stored elastic energy functionals [36], magnetostriction of Tb0.3Dy0.7Fe2 alloys [37], elongation of magnesium alloys [38], and grain-scale elasticity in Ni and Ti alloys [39]. These studies, however, present limited evaluations of model generalization capability with respect to the similarity of training and test set distributions, and offer minimal insight into model sensitivity with regard to feature selection and GNN design. Given how nascent the application of graph-based deep learning is to the micromechanics of polycrystalline materials, cementing its usage in the field requires further exploration of different inference tasks, model architectures, and microstructure graph designs.

In this work, a GNN is developed for predicting the stiffness and yield strength of 𝛼-Ti based on 3D grain-level microstructure data. Using crystallographic orientation and grain size as the only grain-level attributes, the trained model achieves high prediction accuracy for both regression tasks using a single architecture and hyper-parameter optimization campaign. Additionally, a comprehensive assessment of the GNN's generalization performance was conducted, showing that the model is capable of generalizing predictions of stiffness and strength to new microstructures generated from texture groups both seen and unseen during training. Accuracy of the GNN is comparable to current state-of-the-art frameworks, including 3D-CNNs and statistics-based methods, even in a limited-training-data setting, which is advantageous given the current scarcity of experimental materials datasets. Lastly, sensitivity to feature and hyper-parameter selection is analyzed, and potential extensions of the GNN are discussed. Altogether, our work highlights the importance of grain neighborhood connectivity to the effective mechanical response of polycrystals and the utility of their graphical representation.

Fig. 1. Sample MVE of 𝛼-Ti (a), with corresponding (0002) pole figure (b), microstructure graph (c), and graph with edge-bundling (d). For each graph, nodes represent individual grains with corresponding inverse pole figure coloring, while edges represent boundaries shared between grains.

2. Methods

2.1. Microstructure graph design

As a brief remark on notation, scalars are denoted by non-bold lowercase letters, vectors by bold lowercase letters, matrices by bold uppercase letters, and sets by calligraphic font. We represent the 3D polycrystalline microstructure of microstructure volume elements (MVEs) as a homogeneous, undirected graph 𝒢(𝒱, ℰ), with nodes, 𝑣 ∈ 𝒱, constituting individual grains connected by edges, 𝑒 ∈ ℰ, signifying the boundaries that are shared between grains, accounting for any periodicity. Additionally, features are assigned to the individual nodes, {𝐱𝑣, 𝑣 ∈ 𝒱}. These features include the corresponding grain-average crystallographic orientation, reduced to the hexagonal fundamental region and defined by the four-component unit quaternion vector, 𝐪, as well as grain size, 𝑑. This results in a 5-dimensional feature vector for each node, 𝐱𝑣 = [𝑞0, 𝑞1, 𝑞2, 𝑞3, 𝑑]^T. A microstructure graph for a sample MVE is shown in Fig. 1c, visualized in a 2D force-directed layout [40]. The same graph is provided in Fig. 1d with edge-bundling [41] applied to alleviate visual edge clutter.

2.2. GNN model architecture

Once the graph representation of microstructures was established, the next step was to design a relevant GNN model architecture. As seen in Fig. 2, the GNN developed here consisted of one fully-connected layer for feature pre-processing, two message-passing layers, a global mean pooling layer (graph readout), and two fully-connected post-processing layers. This architecture was inspired by the general GNN design space proposed by You et al. [42]. For the message-passing layers, we employ SAGE convolutional layers as implemented in the PyTorch Geometric library [43]. These layers, which are based on the GraphSAGE algorithm [44], train aggregator functions that best combine feature information of a node's local neighborhood. This method enables generalization across graphs with node features of the same form. Finally, we implement an artificial neural network (ANN) which ignores grain–grain interaction to serve as a point of comparison for the GNN. The ANN shares the same architecture and hyper-parameters as the GNN except that node neighborhood aggregation is not performed, essentially reducing the message-passing layers to fully-connected layers. See Appendix for details of the GNN and ANN along with descriptions of model training, evaluation, and optimization.
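As a concrete illustration of this graph design, the snippet below builds a tiny microstructure graph with the 5-dimensional node features of Section 2.1 and performs one GraphSAGE-style mean-aggregation step in plain Python. The paper's model uses PyTorch Geometric's SAGEConv with trained weights; this sketch, with made-up feature values, only illustrates the neighborhood-aggregation data flow.

```python
# Node features x_v = [q0, q1, q2, q3, d]: grain-average orientation as a
# unit quaternion plus grain size d (values are illustrative).
features = {
    0: [1.0000, 0.0000, 0.0000, 0.0, 12.5],
    1: [0.9239, 0.3827, 0.0000, 0.0, 8.0],
    2: [0.7071, 0.0000, 0.7071, 0.0, 15.2],
}
# Undirected edges = shared grain boundaries.
adj = {0: [1, 2], 1: [0, 2], 2: [0, 1]}

def mean_aggregate(features, adj, v):
    """Elementwise mean of the feature vectors of v's neighbors, h_N(v)."""
    neigh = [features[u] for u in adj[v]]
    return [sum(col) / len(neigh) for col in zip(*neigh)]

h_n0 = mean_aggregate(features, adj, 0)
print(h_n0)  # elementwise mean of the features of grains 1 and 2
```

In the full model, this aggregated neighborhood vector is linearly transformed, combined with the node's own transformed features, and passed through a nonlinearity at each message-passing layer.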


Fig. 2. GNN architecture consisting of three fully-connected layers (FLs) and two
message-passing layers (MPLs).

Fig. 3. Mean training and validation loss curves from 10-fold cross-validation for (a) stiffness and (b) strength.

3. Results

We trained and critically assessed our GNN model using a synthetic 𝛼-Ti dataset generated by Priddy et al. [45,46]. The dataset contained
1200 MVEs of polycrystalline 𝛼-Ti microstructures and their corre-
sponding effective mechanical properties. Each MVE was stochastically
generated from one of 12 distinct crystallographic textures (orientation
distributions) with equiaxed grain morphology and log-normal grain
size distribution. These textures, labeled A through L, are provided
in Fig. A.1 in Appendix for reference. Textures A through H were
inspired by literature [47–52], while textures I through L represented
different combinations of texture components present in textures A
through H. Effective mechanical properties of the MVEs were obtained
from CPFE simulations of uniaxial tension in x, y, and 𝑧-directions with
periodic boundary conditions. The simulations were performed under
displacement-controlled loading to a total strain of 1.5% where both
elastic stiffness and yield strength at 0.2% offset strain were determined
for each loading direction.
With this dataset, we first evaluated the accuracy of the GNN for modeling stiffness and yield strength in the 𝑥-direction for texture groups A through G. 10 MVEs of each texture group were reserved for testing, and 10-fold cross-validation was performed on the remaining MVEs. Hyper-parameters of the GNN were optimized for stiffness regression and then directly applied to strength regression; see Appendix for details. Results of cross-validation and testing for stiffness and strength can be found in Fig. 4 in the form of parity plots, and the respective regression metrics are reported in Table 1. The GNN is able to accurately predict both stiffness and strength and generalize to new MVEs of texture groups seen during training. This is evident in the low values of MeanARE and MaxARE for both the validation and test sets, performing significantly better for both inference tasks than a baseline dummy model that predicts the mean ground truth value for a given validation or test set. In addition, the GNN performs marginally better than the ANN in terms of MeanARE, but shows more substantial improvement in terms of MaxARE. While accuracy of the GNN is high for both regression tasks, it performs relatively better at predicting stiffness than strength, with lower values of MeanARE and MaxARE, respectively. The same insight can be gathered from the reported loss curves, with ultimately lower mean validation loss achieved for stiffness (Fig. 3a). This is expected partly because the GNN was optimized for stiffness, but may also indicate that yield strength is a more challenging property to learn.

Fig. 4. Ground truth versus GNN predicted values of stiffness and strength for MVEs of textures A through G. Results of 10-fold cross-validation (a, b), and those for the test set (c, d).

Next, we evaluated the predictions of the GNN for unseen textures. Here, the exact same model, including architecture and hyper-parameter configuration, was used to predict stiffness and strength for MVEs of textures H through L that were left out during the original evaluation. The model received the same set of graphs from textures A through G for training, but a test set of 500 graphs from the never before seen textures H through L. This was a particularly useful evaluation of the GNN's capacity for interpolation and generalization, as a large fraction of MVEs from textures H through L exhibit modulus and strength values which are largely absent from the training data, in the ranges 130–145 GPa and 950–1150 MPa, respectively (Fig. 4). As shown in Fig. 5 and Table 2, results of this experiment show that the GNN predicts stiffness and yield strength for MVEs of unseen textures with accuracy comparable to the first experiment, with slightly increased MeanARE for each regression task, but still performing better than the ANN and dummy model.

Finally, we assessed how well the GNN performs with limited training data. Especially in settings where obtaining ground truth labels/values from experiment or simulation is time-consuming, machine learning models which can learn from limited data are very attractive. In a manner similar to cross-validation, the GNN was trained for both stiffness and strength regression using only 20 of the 100 MVEs from each texture group A through G. The GNN was then tested using all 500 MVE graphs from texture groups H through L. This was repeated five times, each time providing a separate set of 20 MVEs for training. Results of this experiment can be found in Fig. 5 and Table 2. Even with a reduced training set, the GNN still predicts stiffness and strength with high accuracy. The most noticeable degradation of model performance is visible when comparing MaxARE values, particularly when predicting strength (7.70% MaxARE). These increased deviations are also evident in the parity plots; however, they appear modest in the context of a near 5-fold reduction in training data. As demonstrated


Table 1
Regression metrics for stiffness and strength inference of textures A through G. Results of the 10-fold cross-validation (valid.) and test set (test) reported for the GNN, ANN, and dummy regressor model.

Model           | Stiffness                          | Strength
                | MeanARE (%)  MaxARE (%)  R²        | MeanARE (%)  MaxARE (%)  R²
GNN (valid.)    | 0.25         1.05        0.998     | 0.92         5.44        0.994
GNN (test)      | 0.27         0.95        0.998     | 1.20         3.32        0.990
ANN (valid.)    | 0.59         3.00        0.985     | 1.35         9.59        0.985
ANN (test)      | 0.71         5.72        0.982     | 1.65         7.20        0.978
Dummy (valid.)  | 5.40         14.65       0         | 10.16        25.89       0
Dummy (test)    | 5.49         14.32       0         | 10.20        25.22       0

Table 2
Regression metrics for stiffness and strength inference of textures H through L. Values reported for GNN, ANN, and dummy regressor models with entire and reduced (red.) training data.

Model       | Stiffness                          | Strength
            | MeanARE (%)  MaxARE (%)  R²        | MeanARE (%)  MaxARE (%)  R²
GNN         | 0.43         1.87        0.986     | 1.36         5.91        0.950
GNN (red.)  | 0.65         3.29        0.967     | 1.67         7.70        0.937
ANN         | 1.24         5.08        0.898     | 2.15         14.31       0.911
ANN (red.)  | 1.03         5.61        0.910     | 2.19         11.71       0.901
Dummy       | 3.81         11.89       0         | 7.15         23.70       0
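For reference, the MeanARE and MaxARE reported in these tables are the mean and maximum absolute relative error over a validation or test set, expressed in percent. A minimal implementation on toy values (illustrative numbers, not the paper's data):

```python
def are_metrics(y_true, y_pred):
    """Mean and maximum absolute relative error, in percent."""
    rel = [abs(p - t) / abs(t) * 100.0 for t, p in zip(y_true, y_pred)]
    return sum(rel) / len(rel), max(rel)

# Toy ground-truth and predicted stiffness values in GPa
# (illustrative numbers, not the paper's data).
y_true = [100.0, 200.0, 400.0]
y_pred = [101.0, 198.0, 400.0]

mean_are, max_are = are_metrics(y_true, y_pred)
print(round(mean_are, 2), round(max_are, 2))  # 0.67 1.0
```

The dummy-model rows correspond to setting every prediction to the mean of the ground-truth values, which also fixes R² at 0 by construction.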

in the previous assessments for seen textures, more substantial gains in accuracy are made with the GNN compared to the ANN for the unseen textures in terms of the maximum prediction error, indicating that the treatment of grain–grain interactions is particularly beneficial for reducing prediction outliers and increasing model precision.

4. Discussion

4.1. Generalization performance of the GNN

The GNN developed in this work achieves very high accuracy for both stiffness and strength regression tasks with a fixed architecture and hyper-parameter configuration. Results of the 10-fold cross-validation and corresponding testing indicate great generalization capability in the typical evaluation scenario where training and test datasets are drawn from the same distribution, in this case from the same texture groups. More impressive, however, is the model's generalization performance for new microstructures of unseen textures, even with reduced training data. Such tests do not reach the level of extrapolation, given that the test textures H through L represent various combinations of texture groups A through G; however, they do represent more difficult and probing tests of interpolation than what has been performed in prior studies involving GNNs. The accuracy achieved by the ANN suggests that the effective properties of MVEs can largely be predicted through non-linear relationships of orientation and grain size; the performance of the GNN, however, indicates that incorporating grain–grain interactions results in consistent improvement of model accuracy across all assessments.

To contextualize our model's accuracy, valuable comparisons can be made against work done using 3D-CNNs as well as statistics-based models. The value comes from the very distinct representations of microstructure used: the GNN developed here contains only rudimentary descriptors of grains, and the only spatial information available is the connectivity of grains, while both CNNs and statistics-based models require full-field discretization of the microstructure. These various methods have largely been used for predicting effective properties of high-contrast composite materials, reporting MeanAREs of approximately 0.4% to 3.0% for both stiffness and strength inference tasks [7,13,18]. The results of our GNN sit comfortably within this range; however, as these studies are not for polycrystalline materials, the results are not directly comparable.

Fig. 5. Ground truth versus GNN predicted values of stiffness and strength for MVEs of textures H through L. Results when all training data is available (a, b), and those for the limited data case (c, d).

The best comparison that can be made is with work by Paulson and coauthors [11], who used a statistics-based model formulated on the computation of 2-point statistics to predict stiffness and yield strength for the same dataset presented here. There, they conducted similar experiments, training and testing their model on texture groups A through G as well as testing on unseen textures H through L. When performing calibration/validation of their model on texture groups A through G, they reported very low MeanAREs of approximately 0.2% and 1.5%, respectively, for stiffness and strength, compared to our GNN's 0.25% and 0.92%. When testing on unseen textures H through L, they reported MaxAREs of 1% and 5%, respectively, for stiffness and strength, compared to our model's MaxAREs of 1.87% and 5.91%. The accuracy achieved by the GNN model suggests that grain connectivity suffices to represent essential microstructural information for modeling effective properties of polycrystalline materials without the need of


Table 3
Regression metrics for stiffness and strength of GNN variants. Values reported for 10-fold cross-validation of textures A through G.

Model           | Stiffness                          | Strength
                | MeanARE (%)  MaxARE (%)  R²        | MeanARE (%)  MaxARE (%)  R²
GNN (base)      | 0.25         1.05        0.998     | 0.92         5.44        0.994
Euler angles    | 0.23         1.35        0.998     | 0.82         3.93        0.995
PReLU           | 0.25         1.34        0.998     | 1.14         6.96        0.991
Max aggregator  | 0.34         1.50        0.996     | 1.18         6.05        0.990

high-resolution spatial data. This is a desirable feature of our GNN, since such high-resolution data is very time-consuming and expensive to acquire.

4.2. Effects of feature selection and intra-layer GNN design

While many hyper-parameters of the GNN were optimized, others were not tuned by the authors. Notably, these are the chosen representation of crystallographic orientation and the intra-layer hyper-parameters of the GNN, including the nonlinear activation function and the neighborhood aggregator function of the message-passing layers. It is worthwhile to analyze the model's sensitivity to these parameters, as their effects on model accuracy are not determinable a priori and can guide future GNN design. For this, the same 10-fold cross-validation experiment was performed for different GNN variants, each variant representing the base GNN model with only one feature or hyper-parameter difference. The three variants tested included using Bunge Euler angles instead of quaternions, parametric ReLU (PReLU) in place of the ReLU activation function, and a max rather than mean aggregator function. Overall, the results presented in Table 3 indicate only minor changes occur in model performance in terms of MeanARE and MaxARE for stiffness and strength regression across the different variants tested. The marginal performance differences suggest the GNN developed here is relatively insensitive to these hyper-parameters.

Choice of crystallographic orientation representation and its effect on model performance was of special interest. In the literature, potential challenges in training deep learning models with orientation space representations like Euler angles and quaternions have been discussed, as they exhibit multiple degeneracies due to inherent crystal symmetries, with generalized spherical harmonics as an alternative representation of orientations [53]. In this work, all grain orientations were reduced to the hexagonal fundamental region to avoid symmetry equivalence issues. Our findings show the GNN exhibited high accuracy in cross-validation experiments for both stiffness and strength regardless of the orientation space representation.

4.3. Extensions and limitations

Polycrystalline microstructures lend themselves to intuitive graph design, as grains and the physical boundaries they share with their neighbors can be described naturally as a multi-relational network. Intuition can also be applied when assigning features to the graph. For the inference tasks presented here, grain orientation and size proved to be sufficient node features. However, one can introduce more detailed information as needed, such as grain shape or even positional information in the form of x-y-z coordinates. Characteristics of grain boundaries can also be incorporated into the graph. As an example, it might be desirable to weigh the individual messages passed by neighboring nodes according to the physical area two grains share. In this example, edge-weights, 𝛼𝑢,𝑣, proportional to the boundary area, can be assigned to the microstructure graph and affect the calculation of the aggregated neighborhood representation in Eq. (B.3) in the following manner:

𝐡_𝒩(𝑣)^(𝑘) ← mean({𝛼𝑢,𝑣 ⋅ 𝐡_𝑢^(𝑘−1), ∀𝑢 ∈ 𝒩(𝑣)})    (1)

Additionally, the specific microstructure graph design method and GNN architecture presented in this work are strongly suited for learning multiscale properties of polycrystalline materials. The same node embedding vectors, 𝐡_𝑣^(𝑘), used here to perform graph-level inference can be leveraged to perform node-level inference, as done by Pagan et al. [39]. This versatility allows GNNs to potentially address both homogenization and localization problems with a single architecture.

One limitation of the GNN developed here is its ability to generalize to new load directions. Currently, additional training would be required for the GNN to predict effective properties of the MVEs in either the y- or z-direction, as both the graphs and input graph features would be identical between inference tasks, with only the target variables changing. This limitation is not unique to our GNN, as inference in a new load direction would present a very difficult extrapolation problem for any type of neural network. While the GNN can be trained for multi-output regression, predicting effective properties in all three directions simultaneously, the network lacks prior physical domain knowledge of the material's elastic anisotropy to generalize well to new load directions.

Another limitation worth mentioning is that while the GNN presented here is well-suited for polycrystalline materials, it is not easily employable for every type of microstructure. In particular, training a similar regression model for a heterogeneous or two-phase microstructure poses certain challenges. For these kinds of microstructures, a direct application of our graph construction method would result in a graph with only two nodes, one for each phase, connected by a single edge. It is unknown how such a graph and accompanying GNN would perform in predicting effective mechanical properties, even with added node features. What is certain is that the application of multiple convolutional layers would be meaningless for such a graph. One way around this might be to discretize the two-phase microstructure into connected subregions. This, however, calls into question whether 3D-CNNs should be used instead, since they generalize well to any regular-grid 3D image data, including two-phase microstructures. That being said, non-Euclidean discretizations could serve as the basis of a microstructure graph, such as an unstructured mesh with variable element size [54]. Such a scenario might support the usage of GNNs over 3D-CNNs depending on how compact the input graph is compared to the regular grid representation of the microstructure [55].

The major benefit of GNNs compared to CNNs is that the computational cost of many spatial convolution algorithms scales linearly with the number of edges in the graph. For graphs based on polycrystalline microstructure, this scaling is advantageous given that the number of edges is largely independent of microstructure resolution [32,56]. This contrasts greatly with 3D-CNNs, where computational cost scales cubically with microstructure volume and dataset resolution. This scaling benefit, when combined with the streamlined representation of microstructure offered by graphs, results in significant reductions in training time and computational resource dependence compared to CNNs. Recent studies reported GNN training times 35 times faster than comparative CNNs for microstructure volumes of moderate size (120³ voxels, containing 300 grains) [37]. This makes GNNs particularly attractive for deep learning on very large microstructures containing tens or hundreds of millions of voxels and thousands of grains [57].
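The boundary-area weighting of Eq. (1) can be sketched in the same toy style; here alpha[(u, v)] stands in for the shared boundary area between grains u and v, and all numbers are hypothetical:

```python
# Toy sketch of the boundary-area weighting in Eq. (1): neighbor features
# are scaled by an edge weight alpha[(u, v)] standing in for the shared
# boundary area, then mean-aggregated. All values are hypothetical.

features = {0: [1.0, 2.0], 1: [3.0, 4.0], 2: [5.0, 6.0]}
adj = {0: [1, 2], 1: [0], 2: [0]}
alpha = {(0, 1): 0.75, (1, 0): 0.75, (0, 2): 0.25, (2, 0): 0.25}

def weighted_mean_aggregate(features, adj, alpha, v):
    """Mean of neighbor features scaled by shared boundary area."""
    scaled = [[alpha[(u, v)] * x for x in features[u]] for u in adj[v]]
    return [sum(col) / len(scaled) for col in zip(*scaled)]

print(weighted_mean_aggregate(features, adj, alpha, 0))  # [1.75, 2.25]
```

With this weighting, a grain sharing a large boundary with its neighbor contributes more to the aggregated message than one sharing only a small facet, which is the physical intuition behind the proposed extension.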


5. Conclusions

In summary, we demonstrate that GNNs can be used to predict effective mechanical properties of synthetic 𝛼-Ti microstructures of various
crystallographic textures. The GNN developed was capable of predict-
ing the stiffness and yield strength in a particular loading direction, and
generalizing well to unseen microstructure and textures. Accuracy of
the GNN is comparable to other machine learning frameworks, which
rely on full-discretization of microstructure such as 3D-CNNs or those
founded on explicit computation of 2-point statistics. The framework
developed here can be easily extended to other effective property
inferences without changing the selected design of the microstructure
graph. With that said, the total design space of microstructures, GNN
architectures, and choice of message-passing layers is vast and remains
largely unexplored. This makes graph-based deep learning an area of
great potential research for a wide variety of applications relying on
microstructure–property predictions. The success of our model inspires
a different approach to studying the micromechanics of polycrystals,
transitioning from the perspective of spatially resolved microstructures
to that of a network arrangement of grains. Fig. A.1. Sample MVE (0002) pole figures for each texture class A through L.

CRediT authorship contribution statement


with activation function, 𝜎, and 𝐡(𝑘) (𝑘−1)
𝑣 , 𝐡𝑣 , and 𝐡(𝑘)
 (𝑣)
indicating the
Jonathan M. Hestroffer: Conceptualization, Software, Formal updated node, current node, and aggregated neighborhood represen-
Analysis, Writing – original draft. Marie-Agathe Charpagne: tations respectively at the kth layer. Additionally for each layer, there
Conceptualization, Software, Writing – review & editing. Marat I. are separate trainable weight matrices, 𝐖(𝑘) , 𝐖(𝑘)
(𝑘)
∈ R𝑑 ×𝑑
(𝑘−1)
, and
self neigh
Latypov: Conceptualization, Writing – review & editing. Irene J. (𝑘)
biases 𝐛(𝑘) ∈ R𝑑 , which linearly transform the current node and
Beyerlein: Project administration, Funding acquisition, Writing –
aggregated neighborhood representations respectively. In this work,
review & editing.
neighborhood representations are calculated via element-wise mean
aggregation as,
Declaration of competing interest
( )
(𝑘)

(𝑘−1)
The authors declare that they have no known competing finan- 𝐡 (𝑣) ← mean 𝐡𝑢 (B.3)
𝑢∈ (𝑣)
cial interests or personal relationships that could have appeared to
influence the work reported in this paper. where {𝐡(𝑘−1)
𝑢 , 𝑢 ∈  (𝑣)} represent all immediate node neighbors within
a 1-hop distance. This contrasts with the original GraphSAGE algorithm
Data availability that sub-samples neighbors to reduce training times of exceedingly
large graphs. Sub-sampling is not needed here as the microstructure
Data and codes necessary to reproduce these findings can be ac- graphs are small, averaging only 200 nodes each. This reduces our
cessed via https://github.com/jonathanhestroffer/PolyGRAPH. update equation to a simple extension of the famous self-loop graph
convolutional network (GCN) introduced by Kipf et al. [56]. The exten-
Acknowledgment sion being that the weights applied to a node and its neighbors are no
longer shared as in the classic GCN, rather they are learned separately;
This work is funded by the U.S. Dept. of Energy, Office of Basic this increases the potential expressivity of the GNN [58].
Energy Sciences Program DE-SC0018901. After updating, node representations calculated in Eq. (B.5) undergo
𝓁 2 normalization and the final graph embedding vector of our GNN,
Appendix A (𝐾)
𝐱 ∈ R𝑑 , is calculated during graph-readout at layer depth 𝐾 by
taking the element-wise average of all final node representations,
1 ∑ (𝐾)
Texture classes A through L 𝐱 ← 𝐡 (B.4)
|| 𝑣∈ 𝑣
Appendix B which is then passed through two additional fully-connected layers to
the final output layer. ReLU nonlinearity is applied after every fully-
connected and message-passing layer except for the output. Models for
Details of the GNN both stiffness and yield strength prediction were based on the GNN
model architecture described above.
Let 𝑘 = 1, … , 𝐾 denote the message-passing layer index. Be-
fore graph convolution layers are applied, the fully-connected pre-
Details of the ANN
processing layer transforms input node features, 𝐱𝑣 , into initial node
representations, 𝐡(0)
𝑣 , (i.e. 0th layer representation). With the graphs
now prepared for convolution, the representations of each node are The GNN and ANN differ only in their message-passing layer update
updated at each message-passing layer according to the following equations, with the ANN ignoring neighborhood aggregation. In the
update equation, case of the ANN, the message-passing layer update equation becomes,
( ) ( )
(𝑘)
𝐡(𝑘)
𝑣 ← 𝜎 𝐖self ⋅ 𝐡𝑣
(𝑘−1)
+ 𝐖(𝑘)
neigh
⋅ 𝐡(𝑘)
 (𝑣)
+ 𝐛(𝑘) (B.2) 𝐡(𝑘) (𝑘)
𝑣 ← 𝜎 𝐖self ⋅ 𝐡𝑣
(𝑘−1)
+ 𝐛(𝑘) (B.5)

6
J.M. Hestroffer et al. Computational Materials Science 217 (2023) 111894
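To make the update, aggregation, and readout steps above concrete, the sketch below implements one message-passing layer and the graph-readout in plain NumPy. This is an illustrative sketch only — the layer sizes and the toy three-grain chain graph are invented for the example, and the actual implementation in the paper is PyTorch-based [61] — but the arithmetic mirrors Eqs. (B.2)–(B.4): separate self and neighbor weights, element-wise mean aggregation over 1-hop neighbors, ℓ2 normalization, and a mean readout.

```python
import numpy as np

def message_passing_layer(H, neighbors, W_self, W_neigh, b):
    """One GraphSAGE-style update with mean aggregation, as in Eqs. (B.2)-(B.3)."""
    # Eq. (B.3): element-wise mean of each node's 1-hop neighbor representations
    H_agg = np.stack([H[nbrs].mean(axis=0) for nbrs in neighbors])
    # Eq. (B.2): separate trainable weights for self and neighborhood, ReLU activation
    H_new = np.maximum(0.0, H @ W_self.T + H_agg @ W_neigh.T + b)
    # l2-normalize each updated node representation
    norms = np.linalg.norm(H_new, axis=1, keepdims=True)
    return H_new / np.clip(norms, 1e-12, None)

def graph_readout(H):
    """Eq. (B.4): graph embedding as the element-wise mean over all nodes."""
    return H.mean(axis=0)

# Toy 3-grain graph (hypothetical): grains 0-1-2 connected in a chain,
# 4 input features per node, 8 hidden channels
rng = np.random.default_rng(0)
H0 = rng.normal(size=(3, 4))          # initial node representations h^(0)
neighbors = [[1], [0, 2], [1]]        # 1-hop neighbor lists
W_self = rng.normal(size=(8, 4))
W_neigh = rng.normal(size=(8, 4))
b = np.zeros(8)

H1 = message_passing_layer(H0, neighbors, W_self, W_neigh, b)
x = graph_readout(H1)                 # embedding passed on to dense layers
# The ANN baseline of Eq. (B.5) simply drops the W_neigh @ H_agg term.
```

Because the loop runs over neighbor lists rather than voxels, the cost of a layer scales with the number of grain-boundary edges, consistent with the scaling argument made in the main text.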
Fig. B.1. Parallel coordinates plot of the 200, 10-fold cross-validation experiments trained on stiffness data of textures A through G. Hyper-parameters listed left to right in decreasing order of importance, based on the predicted complexity of MSE response relative to perturbations of the hyper-parameter. Optimal hyper-parameters highlighted in orange.
Training and evaluation

All fully-connected and message-passing layers have learnable weights and biases associated with them that must be trained. Weights of the GNN were initialized using Kaiming uniform initialization [59] and biases were initialized from the uniform distribution, 𝒰(0, 1). Parameters of the model were trained using an adaptive learning rate optimization algorithm (Adam) [60], implemented in PyTorch [61], using a mean-squared error (MSE) loss function,

MSE = (1/n) Σ_{i=1}^{n} ( y_i − ŷ_i )²    (B.6)

where y_i and ŷ_i denote the ground truth and predicted target values respectively of the ith sample of n total samples in a training or validation batch. Evaluation metrics reported to analyze model performance are the mean absolute relative error (MeanARE), the maximum absolute relative error (MaxARE), and the coefficient of determination (R²), defined below,

MeanARE = (1/n) Σ_{i=1}^{n} |y_i − ŷ_i| / |y_i| × 100%    (B.7)

MaxARE = max_i ( |y_i − ŷ_i| / |y_i| ) × 100%    (B.8)

R² = 1 − [ Σ_{i=1}^{n} ( y_i − ŷ_i )² ] / [ Σ_{i=1}^{n} ( y_i − ȳ )² ]    (B.9)

where ȳ = (1/n) Σ_{i=1}^{n} y_i. These metrics are reported for 10-fold cross-validation as well as testing. In the case of 10-fold cross-validation, or for the five repeated evaluations of reduced training data, ground truth and predicted values of all folds are included in the calculations. These metrics are evaluated for the GNN model as well as for a dummy regressor model. The dummy model predicts the mean ground truth value for a given validation or test set, ŷ ← ȳ.

Table B.1
Hyper-parameters of the GNN. Listed are sets of possible values for each as well as the optimal values determined via Bayesian optimization using SigOpt.

Hyper-parameter      Possible values                        Optimal value
NFL1                 [8, 16, 32]                            32
NMPL1 = NMPL2        [16, 32, 64]                           64
NFL2                 [16, 32, 64, 128]                      64
NFL3                 [8, 16, 32, 64]                        16
batch size           [16, 32, 64]                           16
learning rate        [1, 5, 10, 50] × 10⁻⁴                  10 × 10⁻⁴
weight decay         [1, 5, 10, 50] × 10⁻⁵                  1 × 10⁻⁵
number of epochs     [100, 150, 200, 300, 400, 600]         600

Hyper-parameter Tuning

To maximize the performance of our GNN model, we carried out hyper-parameter optimization, where all model parameters apart from the base architecture of the GNN were tuned. We selected 10-fold cross-validation MSE on a portion of the dataset as the loss metric to minimize during hyper-parameter tuning. We utilized the optimization tool SigOpt [62] to perform the optimization over a grid of possible values for each hyper-parameter; parameters and possible values are listed in Table B.1. A total of 200 separate 10-fold cross-validation experiments were performed, each with a unique hyper-parameter configuration; results of these experiments are provided in Fig. B.1. Once all experiments were completed, the top candidate model was selected for a more extensive analysis of its respective training and validation loss histories to ensure the GNN was not overfitting at the optimal number of epochs. No overfitting was observed, as shown in Fig. 3, and therefore no early stopping procedure was adopted. Training and inference times of the optimized GNN range from 30 to 180 s and 0.004 to 0.024 s respectively, depending on the amount of training data (10,000 to 100,000 grains), using an NVIDIA RTX 2060 GPU with 6 GB of memory.

References

[1] J. Segurado, R.A. Lebensohn, J. Llorca, Chapter one - computational homogenization of polycrystals, in: M.I. Hussein (Ed.), Advances in Crystals and Elastic Metamaterials, Part 1, in: Advances in Applied Mechanics, vol. 51, Elsevier, 2018, pp. 1–114, http://dx.doi.org/10.1016/bs.aams.2018.07.001.
[2] I.J. Beyerlein, M. Knezevic, Review of microstructure and micromechanism-based constitutive modeling of polycrystals with a low-symmetry crystal structure, J. Mater. Res. 33 (22) (2018) 3711–3738, http://dx.doi.org/10.1557/jmr.2018.333.
[3] I.J. Beyerlein, M. Knezevic, Mesoscale, Microstructure-Sensitive Modeling for Interface-Dominated, Nanostructured Materials, Springer International Publishing, Cham, 2018, pp. 1–42, http://dx.doi.org/10.1007/978-3-319-42913-7_82-1.
[4] M. Knezevic, I.J. Beyerlein, Multiscale modeling of microstructure-property relationships of polycrystalline metals during thermo-mechanical deformation, Adv. Energy Mater. 20 (4) (2018) 1700956, http://dx.doi.org/10.1002/adem.201700956.
[5] W. Andreoni, S. Yip (Eds.), Handbook of Materials Modeling, Springer International Publishing, 2020, http://dx.doi.org/10.1007/978-3-319-42913-7.
[6] A. Cecen, H. Dai, Y.C. Yabansu, S.R. Kalidindi, L. Song, Material structure-property linkages using three-dimensional convolutional neural networks, Acta Mater. 146 (2018) 76–84.
[7] Z. Yang, Y.C. Yabansu, R. Al-Bahrani, W. keng Liao, A.N. Choudhary, S.R. Kalidindi, A. Agrawal, Deep learning approaches for mining structure-property linkages in high contrast composites from simulation datasets, Comput. Mater. Sci. 151 (2018) 278–287, http://dx.doi.org/10.1016/j.commatsci.2018.05.014.
[8] X. Li, Z. Liu, S. Cui, C. Luo, C.-F. Li, Z. Zhuang, Predicting the effective mechanical property of heterogeneous materials by image based modeling and deep learning, Comput. Methods Appl. Mech. Engrg. 347 (2019) http://dx.doi.org/10.1016/j.cma.2019.01.005.
[9] C. Herriott, A.D. Spear, Predicting microstructure-dependent mechanical properties in additively manufactured metals with machine- and deep-learning methods, Comput. Mater. Sci. 175 (2020) 109599, http://dx.doi.org/10.1016/j.commatsci.2020.109599.
[10] X. Liu, Z. Yan, Z. Zhong, Predicting elastic modulus of porous La0.6Sr0.4Co0.2Fe0.8O3-δ cathodes from microstructures via FEM and deep learning, Int. J. Hydrogen Energy 46 (42) (2021) 22079–22091, http://dx.doi.org/10.1016/j.ijhydene.2021.04.033.
[11] N.H. Paulson, M.W. Priddy, D.L. McDowell, S.R. Kalidindi, Reduced-order structure-property linkages for polycrystalline microstructures based on 2-point statistics, Acta Mater. 129 (2017) 428–438, http://dx.doi.org/10.1016/j.actamat.2017.03.009.
[12] A. Gupta, A. Cecen, S. Goyal, A.K. Singh, S.R. Kalidindi, Structure–property linkages using a data science approach: Application to a non-metallic inclusion/steel composite system, Acta Mater. 91 (2015) 239–254.
[13] M.I. Latypov, S.R. Kalidindi, Data-driven reduced order models for effective yield strength and partitioning of strain in multiphase materials, J. Comput. Phys. 346 (2017) 242–261, http://dx.doi.org/10.1016/j.jcp.2017.06.013.
[14] M.I. Latypov, L.S. Toth, S.R. Kalidindi, Materials knowledge system for nonlinear composites, Comput. Methods Appl. Mech. Engrg. 346 (2019) 180–196, http://dx.doi.org/10.1016/j.cma.2018.11.034.
[15] S.R. Kalidindi, A Bayesian framework for materials knowledge systems, MRS Commun. 9 (2) (2019) 518–531.
[16] S.R. Kalidindi, Feature engineering of material structure for AI-based materials knowledge systems, J. Appl. Phys. 128 (4) (2020) 041103.
[17] Z. Yang, Y.C. Yabansu, D. Jha, W. keng Liao, A.N. Choudhary, S.R. Kalidindi, A. Agrawal, Establishing structure-property localization linkages for elastic deformation of three-dimensional high contrast composites using deep learning approaches, Acta Mater. 166 (2019) 335–345, http://dx.doi.org/10.1016/j.actamat.2018.12.045.
[18] C. Rao, Y. Liu, Three-dimensional convolutional neural network (3D-CNN) for heterogeneous material homogenization, Comput. Mater. Sci. 184 (2020) 109850, http://dx.doi.org/10.1016/j.commatsci.2020.109850.
[19] R. Pokharel, A. Pandey, A. Scheinker, Physics-informed data-driven surrogate modeling for full-field 3D microstructure and micromechanical field evolution of polycrystalline materials, JOM 73 (2021) http://dx.doi.org/10.1007/s11837-021-04889-3.
[20] D. Montes de Oca Zapiain, E. Popova, F. Abdeljawad, J.W. Foulk, S.R. Kalidindi, H. Lim, Reduced-order microstructure-sensitive models for damage initiation in two-phase composites, Integr. Mater. Manuf. Innov. 7 (3) (2018) 97–115, http://dx.doi.org/10.1007/s40192-018-0112-0.
[21] A. Marshall, S.R. Kalidindi, Autonomous development of a machine-learning model for the plastic response of two-phase composites from micromechanical finite element models, JOM 73 (7) (2021) 2085–2095, http://dx.doi.org/10.1007/s11837-021-04696-w.
[22] D.M. de Oca Zapiain, S.R. Kalidindi, Localization models for the plastic response of polycrystalline materials using the material knowledge systems framework, Modelling Simulation Mater. Sci. Eng. 27 (7) (2019) 074008, http://dx.doi.org/10.1088/1361-651x/ab37a5.
[23] M.M. Bronstein, J. Bruna, Y. LeCun, A. Szlam, P. Vandergheynst, Geometric deep learning: Going beyond Euclidean data, IEEE Signal Process. Mag. 34 (4) (2017) 18–42, http://dx.doi.org/10.1109/msp.2017.2693418.
[24] W.C. Lenthe, M.P. Echlin, J.C. Stinville, M. De Graef, T.M. Pollock, Twin related domain networks in René 88DT, Mater. Charact. 165 (2020) 110365, http://dx.doi.org/10.1016/j.matchar.2020.110365.
[25] H.F. Poulsen, Three-Dimensional X-Ray Diffraction Microscopy: Mapping Polycrystals and their Dynamics, Vol. 205, Springer Science & Business Media, 2004.
[26] H.F. Poulsen, An introduction to three-dimensional X-ray diffraction microscopy, J. Appl. Crystallogr. 45 (6) (2012) 1084–1097, http://dx.doi.org/10.1107/S0021889812039143.
[27] M.P. Miller, M. Obstalecki, E. Fontes, D.C. Pagan, J.P.C. Ruff, A.J. Beaudoin, Insit@CHESS, a resource for studying structural materials, Synchrotron Radiat. News 30 (3) (2017) 4–8, http://dx.doi.org/10.1080/08940886.2017.1316124.
[28] D.C. Pagan, K.E. Nygren, M.P. Miller, Analysis of a three-dimensional slip field in a hexagonal Ti alloy from in-situ high-energy X-ray diffraction microscopy data, Acta Mater. 221 (2021) 117372, http://dx.doi.org/10.1016/j.actamat.2021.117372.
[29] Y. Koren, R. Bell, C. Volinsky, Matrix factorization techniques for recommender systems, Computer 42 (8) (2009) 30–37, http://dx.doi.org/10.1109/MC.2009.263.
[30] W. Jiang, J. Luo, Graph neural network for traffic forecasting: A survey, 2021, arXiv:2101.11174.
[31] D. Duvenaud, D. Maclaurin, J. Aguilera-Iparraguirre, R. Gómez-Bombarelli, T. Hirzel, A. Aspuru-Guzik, R.P. Adams, Convolutional networks on graphs for learning molecular fingerprints, in: Proceedings of the 28th International Conference on Neural Information Processing Systems - Volume 2, NIPS '15, MIT Press, Cambridge, MA, USA, 2015, pp. 2224–2232.
[32] J. Zhou, G. Cui, S. Hu, Z. Zhang, C. Yang, Z. Liu, L. Wang, C. Li, M. Sun, Graph neural networks: A review of methods and applications, AI Open 1 (2020) 57–81, http://dx.doi.org/10.1016/j.aiopen.2021.01.001.
[33] P. Veličković, G. Cucurull, A. Casanova, A. Romero, P. Liò, Y. Bengio, Graph Attention Networks, in: International Conference on Learning Representations, 2018, URL https://openreview.net/forum?id=rJXMpikCZ.
[34] M. Sun, S. Zhao, C. Gilvary, O. Elemento, J. Zhou, F. Wang, Graph convolutional networks for computational drug development and discovery, Brief. Bioinform. 21 (3) (2019) 919–935, http://dx.doi.org/10.1093/bib/bbz042.
[35] M. Zhang, P. Li, Y. Xia, K. Wang, L. Jin, Revisiting graph neural networks for link prediction, 2021, URL https://openreview.net/forum?id=8q_ca26L1fz.
[36] N.N. Vlassis, R. Ma, W. Sun, Geometric deep learning for computational mechanics part I: Anisotropic hyperelasticity, Comput. Methods Appl. Mech. Engrg. 371 (2020) 113299, http://dx.doi.org/10.1016/j.cma.2020.113299.
[37] M. Dai, M.F. Demirel, Y. Liang, J.-M. Hu, Graph neural networks for an accurate and interpretable prediction of the properties of polycrystalline materials, Npj Comput. Mater. 7 (1) (2021) 103, http://dx.doi.org/10.1038/s41524-021-00574-w.
[38] C. Shu, J. He, G. Xue, C. Xie, Grain knowledge graph representation learning: A new paradigm for microstructure-property prediction, Crystals 12 (2) (2022) http://dx.doi.org/10.3390/cryst12020280.
[39] D.C. Pagan, C.R. Pash, A.R. Benson, M.P. Kasemer, Graph neural network modeling of grain-scale anisotropic elastic behavior using simulated and measured microscale data, 2022, arXiv:2205.06324.
[40] S.G. Kobourov, Spring embedders and force directed graph drawing algorithms, 2012, arXiv:1201.3011.
[41] H. Zhou, P. Xu, X. Yuan, H. Qu, Edge bundling in information visualization, Tsinghua Sci. Technol. 18 (2) (2013) 145–156, http://dx.doi.org/10.1109/TST.2013.6509098.
[42] J. You, R. Ying, J. Leskovec, Design space for graph neural networks, 2020, arXiv:2011.08843.
[43] M. Fey, J.E. Lenssen, Fast graph representation learning with PyTorch Geometric, 2019, arXiv:1903.02428.
[44] W.L. Hamilton, R. Ying, J. Leskovec, Inductive representation learning on large graphs, NIPS (2017).
[45] M.W. Priddy, N. Paulson, Synthetic alpha-Ti microstructures and associated elastic stiffness and yield strength properties, 2016, URL https://matin.gatech.edu/resources/52. (Accessed Jan 2022).
[46] M.W. Priddy, N. Paulson, D. McDowell, S.R. Kalidindi, Synthetic alpha-Ti microstructures and associated elastic stiffness and yield strength properties - extended, 2017, URL https://matin.gatech.edu/resources/187. (Accessed Jan 2022).
[47] M. Peters, G. Lütjering, G. Ziegler, Control of microstructures of (α+β)-titanium alloys, Int. J. Mater. Res. 74 (5) (1983) 274–282.
[48] M. Peters, A. Gysler, G. Lütjering, Influence of texture on fatigue properties of Ti-6Al-4V, Metall. Mater. Trans. A 15 (8) (1984) 1597–1605, http://dx.doi.org/10.1007/bf02657799.
[49] G. Lütjering, Influence of processing on microstructure and mechanical properties of (α+β) titanium alloys, Mater. Sci. Eng. A 243 (1–2) (1998) 32–45.
[50] Y. Wang, J. Huang, Texture analysis in hexagonal materials, Mater. Chem. Phys. 81 (1) (2003) 11–26, http://dx.doi.org/10.1016/S0254-0584(03)00168-8.
[51] G. Lütjering, J. Williams, Titanium, in: Engineering Materials and Processes, Springer Berlin Heidelberg, 2007.
[52] B.D. Smith, Microstructure-Sensitive Plasticity and Fatigue of Three Titanium Alloy Microstructures (Ph.D. thesis), Georgia Institute of Technology, 2013.
[53] D. Montes de Oca Zapiain, A. Shanker, S.R. Kalidindi, Convolutional Neural Networks for the Localization of Plastic Velocity Gradient Tensor in Polycrystalline Microstructures, J. Eng. Mater. Technol. 144 (1) (2021) 011004, http://dx.doi.org/10.1115/1.4051085.
[54] A.L. Frankel, C. Safta, C. Alleman, R. Jones, Mesh-based graph convolutional neural networks for modeling materials with microstructure, J. Mach. Learn. Model. Comput. 3 (1) (2022) 1–30.
[55] R. Hanocka, A. Hertz, N. Fish, R. Giryes, S. Fleishman, D. Cohen-Or, MeshCNN, ACM Trans. Graphics 38 (4) (2019) 1–12, http://dx.doi.org/10.1145/3306346.3322959.
[56] T.N. Kipf, M. Welling, Semi-supervised classification with graph convolutional networks, 2016, arXiv:1609.02907.
[57] J.C. Stinville, J.M. Hestroffer, M.A. Charpagne, A.T. Polonsky, M.P. Echlin, C.J. Torbet, V. Valle, K.E. Nygren, M.P. Miller, O. Klaas, A. Loghin, I.J. Beyerlein, T.M. Pollock, Multi-modal dataset of a polycrystalline metallic material: 3D microstructure and deformation fields, Sci. Data 9 (1) (2022) 460.
[58] W.L. Hamilton, Graph representation learning, Synth. Lect. Artif. Intell. Mach. Learn. 14 (3) (2020) 1–159.
[59] K. He, X. Zhang, S. Ren, J. Sun, Delving deep into rectifiers: Surpassing human-level performance on ImageNet classification, in: 2015 IEEE International Conference on Computer Vision, ICCV, 2015, pp. 1026–1034.
[60] D. Kingma, J. Ba, Adam: A method for stochastic optimization, in: International Conference on Learning Representations, 2014.
[61] A. Paszke, S. Gross, F. Massa, A. Lerer, J. Bradbury, G. Chanan, T. Killeen, Z. Lin, N. Gimelshein, L. Antiga, A. Desmaison, A. Köpf, E. Yang, Z. DeVito, M. Raison, A. Tejani, S. Chilamkurthy, B. Steiner, L. Fang, J. Bai, S. Chintala, PyTorch: An imperative style, high-performance deep learning library, 2019, arXiv:1912.01703.
[62] S. Clark, P. Hayes, SigOpt web page, 2019, URL https://sigopt.com.
Another random document with
no related content on Scribd:
disease, and died June 12th, 1863. The men became very much
attached to him during his brief sojourn with the Regiment.

CHAPLAIN AUGUSTUS H. LUNG


Was born in Rush, Susquehanna County, Pa., November 1st, 1827.
After devoting several years to study and teaching, he entered the
Harford Academy, where he remained two years and a half. At the
expiration of that time, he was admitted into the Sophomore Class of
the Lewisburg University, and graduated in 1853. In the fall of the
same year he became a student in the Theological Seminary at
Rochester, and completed his studies July, 1855. During the year
1857 he was settled Pastor of “the First Baptist Church of
Canandaigua Village,” laboring with marked success until
commissioned, January 2nd, 1862, Chaplain of the Thirty-third.
While on the Peninsula, he was attacked with a severe fit of
sickness, and lay at the point of death for several days. He, however,
recovered his health, and rejoining the Regiment, remained until its
return home, when he resumed his pastoral labors at Canandaigua.

SURGEON T. RUSH SPENCER


Was appointed Surgeon of the Thirty-third on its organization,
afterwards promoted to Brigade Surgeon.

SURGEON SYLVANUS S. MULFORD,


Who resided at Cherry Valley, was chosen Assistant Surgeon of the
Thirty-third on its organization and was afterwards promoted to full
Surgeon. He resigned at Stafford Court House.

SURGEON D’ESTAING DICKINSON


Was born in the town of Watertown, Jefferson County, May 19th,
1836, and graduated from Albany Medical College in 1858. After
practising in Watertown and vicinity four years, he was chosen
Surgeon to Sing Sing Prison, which position he held when appointed
Assistant Surgeon of the Thirty-third. He was promoted to full
Surgeon in the fall of 1862, and remained with the Regiment until its
return home. While in charge of the Liberty Hall Hospital on the
Peninsula, containing nearly five hundred men, he was made
prisoner, refusing to leave his patients. After being detained by the
rebel authorities for several weeks, he was set at liberty and rejoined
the Regiment. During the winter of 1863 he was placed in charge of
Howe’s Division Hospital at Acquia Creek, and when General
Hooker’s series of battles occurred, was given, the entire supervision
of all the hospitals of the Sixth Army Corps.

ASSISTANT SURGEON RICHARD CURRAN


Was born in Carrahill, Clare County, Ireland, January 4th, 1838, and
coming to this country at the age of twelve years, settled at Seneca
Falls with his parents. He graduated from the Medical Department of
Harvard College in 1860, and enlisted as a private in the Thirty-third.
He was appointed Hospital Steward, when the Regiment reached
Washington, and promoted to Assistant Surgeon, August 15, 1862.
General Smith complimented him, after the battle of Antietam, in a
special order, for having advanced with his Regiment into the
thickest of the fray.

ASSISTANT-SURGEON DUNCAN M’LACHLEN


Was born in Caledonia, Livingston County, January 30th, 1832.
Studied medicine with Dr. Chamberlain, of Le Roy, New York.
Graduated at the Buffalo Medical College, and was appointed
Assistant Surgeon of the Thirty-third, January 22nd, 1863.

CAPTAIN GEORGE M. GUION,


Co. A, was engaged in the druggist business at Seneca Falls, on the
outbreak of the rebellion. Remained with the Regiment until
September, 1862, participating in the various battles of the
Peninsula, when he was promoted to the Lieutenant-Colonelcy of the
148th New York Volunteers, which position he still retains.

CAPTAIN EDWIN J. TYLER,


Co. A, was born in Onondaga County, New York, April 1st, 1828. He
moved at an early age to Seneca Falls, which has been his home
until the present time. Engaged in the mercantile business at
eighteen years of age and, followed it until 1847, when he sailed for
California, being nine months and twenty-two days on the voyage.
Returned to Seneca Falls, in the spring of 1851, and re-engaged in
business. Was elected as First Lieutenant of Company A, on its
formation. Acted as Adjutant of the Regiment from May until August
1862, and on the following October was promoted to the Captaincy.

FIRST LIEUTENANT PRICE WESLEY BAILEY,


Co. A, was born in Newtown, North Wales, August 18th, 1837.
Emigrated to this country in 1847, settled at Skaneateles; moved
from thence successively to Auburn, New York City and Utica.
Returning home, attended school one year, and then proceeded to
Seneca Falls. Enlisted as a private in Company A, afterwards
elected to Second Lieutenant, and detached to General Davidson’s
Staff at Yorktown. Promoted to First Lieutenant, May 21st, 1862; was
relieved from Staff at Harrison’s Landing, and took charge of his
Company. Appointed on General Neill’s Staff, January 16th, 1863.

SECOND LIEUTENANT THOMAS H. SIBBALDS,


Co. A, on the organization of the company, was elected Second
Sergeant, and promoted to Second Lieutenant, October 1862,
immediately after the battle of Antietam.
CAPTAIN HENRY J. WHITE,
Co. B, succeeded to the command of the Company when Captain
Corning was promoted to Lieutenant-Colonel, November 1861.
Afterwards resigned and was succeeded by First Lieutenant Draime.

CAPTAIN HENRY J. DRAIME,


Co. B, was born in the City of Sadan, France, and coming to
America in 1832, settled with his parents in Canton, Ohio. March
27th, 1839, he enlisted in the Fifth Regular Artillery, was soon
promoted to non-commissioned officer, and remained in the service
five years. During that time, he was stationed in Detroit, Chicago,
Buffalo, Sackett’s Harbor, Fort Columbus, Fort Adams, &c. Fired the
minute guns at Sackett’s Harbor on the death of President Harrison,
and was ordered to Rhode Island with his battery, to assist in
suppressing the insurrection known as the “Dorrite War.”
After leaving the army, he resided, among other places, in
Rochester, Fredonia, &c., employed in superintending engineering
operations. He was living in Palmyra when the war commenced;
enlisted as a private in Co. B, was elected Second Lieutenant, and
promoted to First Lieutenant, and then Captain.

FIRST LIEUTENANT L. C. MIX,


Co. B, was born in New Haven, Conn., in 1829. Removed to Ithaca,
N. Y. Having early developed a taste for drawing, he was induced to
go to New York to learn the “art and mysteries” of engraving. After
five years’ practice, started business in Rochester. Was engaged for
ten years, until the war, when he went with the Thirty-third as
Commissary Sergeant. Promoted to Second Lieutenant of Co. C,
July 29th, 1861. Acted in that capacity until the battle of Antietam,
when he was wounded. Promoted to First Lieutenant, and assigned
to Co. B, October 17th, 1862. Afterwards rejoined the Regiment and
was detailed Aid-de-Camp to Colonel Taylor, Colonel Commanding
First Brigade, Second Division, Sixth Corps, not being sufficiently
recovered to resume active duties as a line officer.

SECOND LIEUTENANT JOHN J. CARTER,


Co. B, was born in Troy, June 16th, 1842. His parents dying when he
was quite young, he was removed to Buffalo, and sent to school.
Two years later he was placed under the care of Rev. John Sherdan,
of Portageville. Soon after Cyrus Rose, of Nunda, adopted him,
receiving him into his family as his own son. He was nearly prepared
for College when the war commenced, but laying aside his books,
enlisted as a private in Co. F. On reaching the field, was appointed
Quartermaster-Sergeant. Nine months afterwards he was promoted
to a Second Lieutenancy, and assigned to Co. B. General Smith
mentioned Lieutenant Carter, among others, after the battle of
Antietam, “whose conduct was particularly gallant under my own
observation.”

CAPTAIN JOHN F. AIKENS,


Co. C, was born at Newburg, N. Y. Removed at the age of fourteen
to Waterloo. Was employed in various pursuits until the breaking out
of the war, when he was made Captain of Co. C. Resigned at
Washington, July 28th, 1861.

CAPTAIN CHESTER H. COLE,


Co. C, was born in Leray, Jefferson County, October 3rd, 1828.
When sixteen years of age he proceeded to the West. Returned to
Leray, and soon removed to Pillar Point, where he became
employed as a ship-carpenter. Afterwards worked in Oswego,
Syracuse, and New York. When the war broke out, he was residing
in Waterloo; enlisted as a private in Co. C, was elected First
Lieutenant on the formation of the Company, and promoted to
Captain, July 29th, 1861. He was with the Thirty-third in all its
engagements up to the storming of Marye’s Heights, when he was
severely wounded in the thigh. He recovered sufficiently, however, to
return home with the Regiment.

FIRST LIEUTENANT ROBERT H. BRETT,


Co. C, was born in Yorkshire, England, May 17th, 1829, came to this
country when above five years of age, and settled at Utica. At
twenty-one years of age sailed for California, where he remained two
years. Returned to Utica, and engaged in the machinist business. In
1860 moved to Waterloo. Was elected Orderly Sergeant of Co. C,
and promoted to First Lieutenant, July 29th, 1861.

SECOND LIEUTENANT J. E. STEBBINGS,


Co. C, was born at Madrid, St. Lawrence County, August 2, 1833.
Enlisting as a private in Co. C, at Waterloo, was elected Third
Sergeant; promoted to Orderly Sergeant, July 29th, 1861, and to
Second Lieutenant, October 17th, 1862.

SECOND LIEUTENANT ANDREW J. SCHOTT,


Co. C, was elected Second Lieutenant on the organization of the
Company, and resigned, July 29th, 1861. Since died at Waterloo, N.
Y.

CAPTAIN HENRY J. GIFFORD,


Co. D, succeeded John R. Cutler to the command of Co. D, at Camp
Griffin. Was educated for the law. Enlisted as a private in the
Thirteenth New York Volunteers, and afterwards promoted to First
Lieutenant of Co. D, Thirty-third New York. On the departure of the
Regiment from the field, the one hundred and sixty-three three years’
recruits were organized into one Company under him, and attached
to the Forty-ninth New York.

FIRST LIEUTENANT STEPHEN T. DUEL,


Co. D, was chosen First Lieutenant on the formation of the
Company; since resigned.

SECOND LIEUTENANT WM. E. ROACH,


Co. D, was born in Colchester, Vermont, October 9th, 1825, moved,
at seven years of age, to Troy, and from thence to Rochester in
1838. Proceeded to California in 1849, and returned to Rochester.
Was appointed Second Lieutenant, Co. D, during the fall of 1862.
Has since performed service, at battles of Fredericksburg and
Chancellorsville, in the Ambulance Corps, to which position he was
assigned in February, 1863.

CAPTAIN WILSON B. WARFORD,


Co. E, was born in Hunterdon, New Jersey, July 27th, 1822.
Removed to Geneseo in 1839, where he remained until the breaking
out of the rebellion. Enlisted as a private in Co. E, and was
immediately elected Captain. Was very fond of military pursuits, and
for many years served as Captain of a Military Company.

FIRST LIEUTENANT JOHN GUMMER,


Co. E, was born in Dorsetshire, England, July 23d, 1819; came to
this country during the spring of 1851, and settled in Geneseo.
Enlisted as a private in Co. E, and was elected Second Lieutenant.
Promoted to First Lieutenant, July 28th, 1862.

SECOND LIEUTENANT WALTER H. SMITH,


Co. E, resigned, March 18th, 1863.

CAPTAIN JAMES M. McNAIR,


Co. F, was born on the 8th of June, 1835, in Nunda, N. Y. His earlier
years were spent at school, and upon the farm. He prepared for
College under the tutorship of Prof. Judson A. Bassett, at the Nunda
Literary Institute, and after teaching a term, entered the University of
Rochester in July, 1857. He graduated with his class in July 1860,
and immediately entered the law office of Orlando Hastings, in
Rochester. During the winter of 1860 and 1861 he taught the
Academy in West Bloomfield, N. Y., where he was engaged when
the rebellion broke out. He immediately enlisted with a company
forming at Nunda, of which he was elected Captain, May 10th, 1861.
He continued with the company until it was mustered out of service,
June 2nd, 1863, at Geneva, N. Y. The degree of Master of Arts was
conferred upon Captain McNair at the Commencement of the
University of Rochester in July, 1863.

FIRST LIEUTENANT H. A. HILLS,


Co. F, was born at Nunda, Livingston Co., Feb. 1st, 1834. Proceeded
to Shelby County, Ky., in the year 1856, and afterwards to Illinois,
Kansas, Missouri, and Nebraska, being employed most of the time in
teaching. Returning to Nunda, enlisted as a private in Co. F, elected
Orderly Sergeant on the organization of the Company, promoted to
Second Lieutenant, February 6th, 1862, and to First Lieutenant at
White Oak Church, December 27th, 1862.

FIRST LIEUTENANT GEORGE T. HAMILTON,


Co. F, enlisted as a private in Co. F, was elected First Lieutenant on
its permanent organization, and resigned at Camp Griffin.

FIRST LIEUTENANT HENRY G. KING,


Co. F, was born at Mount Morris, August 15th, 1835. When the war
commenced, enlisted as a private in Co. F, was chosen Second
Lieutenant, and promoted to First Lieutenant, February, 1862. He
was detailed for a time as Acting Quartermaster.

SECOND LIEUTENANT JOHN F. WINDSHIP,


Co. F, was born in Queensbury, Warren County, June 11th, 1832. At
nineteen years of age removed to Angelica, Allegany County;
afterwards resided in Illinois, Missouri, and Michigan. Returned to
Wyoming, Pike County. During the winter of 1861, enlisted as a
private in Co. F; promoted to Second Sergeant, May 6th, 1861,
afterwards to First Sergeant, and, December 27th, 1862, to Second
Lieutenant.

CAPTAIN THEODORE B. HAMILTON,


Co. G, was elected Captain of the Company on the organization, and
participated in all the earlier engagements in which the Regiment
was engaged. During the month of December, 1862, he was
promoted to the Lieutenant-Colonelcy of the Sixty-second New York,
which position he still holds.

CAPTAIN GEORGE A. GALE,


Co. G, was born in London, Canada West, November 1st, 1839. In
1845, removed to Watertown, and three years later to Buffalo;
attended school until sixteen years of age, and then became
employed in the printing establishment of Jewett, Thomas & Co.,
where he remained four years. When hostilities commenced, he
enlisted in a three months’ Regiment, and afterwards in Co. G, as a
private. Was immediately elected First Sergeant, and served in that
capacity until May 20th, 1862, when he was promoted to Second
Lieutenant. October 16th, 1862, he was promoted to First Lieutenant,
and during the following December, to Captain of the Company. He
was wounded in the left leg before Yorktown, but remained only a
brief time away from the Regiment.

FIRST LIEUT. ALEXANDER E. EUSTAPHEIVE,


Co. G, was elected First Lieutenant on the organization of the
Company, and resigned October 14th, 1862.

FIRST LIEUTENANT G. W. MARSHALL,


Co. G, was born in Elizabethtown, N. J., March 1st, 1840. Removed
to Buffalo with his parents at an early age, where he remained until
the breaking out of the rebellion. Enlisted as a private in the Buffalo
Company, elected Fifth Sergeant on the formation of the Company.
Promoted to First Sergeant, May 20th, 1862, Second Lieutenant,
October 15th, 1862, and to First Lieutenant, December 27th, 1862.

SECOND LIEUTENANT BYRON F. CRAIN,


Co. G, was born at Manchester, Ontario Co., April 26th, 1836. At ten
years of age he removed with his parents to Shortsville; enlisted as a
private in the Canandaigua Co. D; promoted to Second Lieutenant,
December 27th, 1862, and assigned to Co. G.

CAPTAIN CALVIN C. WALKER,


Co. H, was elected Captain of Co. H, on its formation. When the
Regiment was organized at Elmira, he was chosen Lieutenant-
Colonel, but resigned not long after reaching Washington.

CAPTAIN ALEXANDER H. DRAKE,


Co. H, was born at Starkey, Yates County, October 18th, 1832. At
the age of fifteen, removed with his parents to Steuben County,
where he remained until the year 1858. Then became employed as a
clerk in Canandaigua and afterwards at Geneva. Enrolled himself as
a private in Co. H, elected Second Lieutenant on the organization of
the Company, and promoted to First Lieutenant, May 25th, 1861. He
was taken prisoner at Williamsburg, and after several months’
confinement in Salisbury, North Carolina, was exchanged. Returning
to the Regiment, January 24th, 1862, he was promoted to the
Captaincy of Co. H.

FIRST LIEUTENANT REUBEN C. NILES,


Co. H, was elected Orderly Sergeant on the formation of the
Company, promoted to Second Lieutenant, Jan. 24th, 1862, and
resigned December 27th, owing to ill health.

FIRST LIEUTENANT MARSHALL J. GUION,


Co. H, was appointed Commissary Sergeant at organization of
Regiment. Was transferred from Co. A, and made Second
Lieutenant of Co. H, January 24th, 1862; resigned December 27th,
1862.

FIRST LIEUTENANT OTIS COLE,


Co. H, was born in Perinton, Monroe Co., Sept. 14th, 1834. At
nineteen years of age, entered the Rochester University, remaining
there nearly two years. Returning home, engaged in farming and
stock growing until twenty-four years of age, then became employed
two years on the Michigan Southern railroad. Returning home again,
engaged in nursery and vineyard business until August 27th, 1862,
when he enlisted as a private in a body of recruits for the Thirty-third.
Was commissioned First Lieutenant, October 13th, and assigned to
Co. H. Remained with the Company until Jan. 27th, when he was
appointed A. A. G., First Brigade, Howe’s Division. Served in this
capacity, and also as Acting Brigade Commissary, until March 23d.
April 14th, was transferred to General Russell’s Staff.

SECOND LIEUTENANT SYLVESTER PORTER,


Co. H, was born in the town of Seneca, Ontario County, April 17th,
1842, where he resided with his parents until the outbreak of the
rebellion. Enlisted as a private in Co. H, and was elected Second
Sergeant, May 23d, 1861. He was afterwards promoted to First
Sergeant, and to Second Lieutenant, October 16th, 1862. Was
wounded, at the battle of White Oak Swamp, in the left shoulder, and
confined to the hospital for two months. At the end of that time he
rejoined the Regiment, and was again wounded in the right thigh,
during the sanguinary struggle on Salem Heights. Returned home,
and was mustered out with his Company.

CAPTAIN JAMES M. LETTS,


Co. I, was engaged in the Daguerrean business at Penn Yan on the
outbreak of the rebellion; was chosen Captain of Co. I, on its
organization, and resigned at Camp Griffin.

CAPTAIN EDWARD E. ROOT,


Co. I, was born in Washington County, August 24th, 1839. Removed
at an early age to Yates County. After spending several years at the
Prattsburg Academy and Rochester Commercial College, he
became employed in the Stationery House of George R. Cornwell,
Penn Yan, as confidential clerk. Was elected First Lieutenant of Co.
I, on its organization, and promoted to Captain, December 27th,
1861. He received a severe wound in the left thigh while leading his
Company in the charge on Marye’s Heights. For several weeks his
life was despaired of, but after lying two months in the hospital, he
recovered sufficiently to return home.

CAPTAIN WILLIAM HALE LONG,


Co. I, was born in New York City, February 22nd, 1835. At fifteen
years of age went to sea. Returned in three years; engaged in
mercantile pursuits till the outbreak of the rebellion, then joined the
Seventh Massachusetts as a private. Was afterwards elected
Second Lieutenant Fifth New York, and a few days succeeding,
Second Lieutenant, Co. I, Thirty-third. After being promoted to First
Lieutenant, served as Provost Marshal under General Brooks, and
as Aid-de-Camp to General Davidson. October, 1862, was promoted
to Captain, and A. A. General, and assigned to duty under General
Vinton. Remained with him until General Neill took Command of the
Brigade, when he became his A. A. General.

FIRST LIEUTENANT GEORGE BRENNAN,


Co. I, was born in Penn Yan, December 18th, 1838. Remained there
until the outbreak of the rebellion, when he enlisted as a private in
Co. I. Promoted to Sergeant, August 1st, 1861, to Orderly Sergeant,
January 1st, 1862, and to First Lieutenant, December 1st, 1862.

SECOND LIEUTENANT CHARLES HOWE,


Co. I, was elected Orderly Sergeant at its organization. Promoted to
Second Lieutenant, December 31st, 1861, and resigned, October
1862.

CAPTAIN PATRICK McGRAW,


Co. K, was born in the county of Down, Ireland, June 16th, 1824.
When seventeen years of age, enlisted in the Eighty-ninth Regiment
English Infantry, serving in Canada three years, and in England,
Ireland and Scotland eleven more. Came to this country in the winter
of 1853, and settled at Seneca Falls. Was elected Captain of Co. K,
and remained with the Regiment till the close of its two years’
campaign.

FIRST LIEUTENANT BARNARD BYRNE,


Co. K, served as First Lieutenant of Co. K, until severely wounded
while charging up Marye’s Heights.

SECOND LIEUTENANT PATRICK RYAN,


Co. K, was elected on the formation of the Company, and afterwards
resigned.

SECOND LIEUTENANT EDWARD CAREY,


Co. K, was appointed to fill Lieutenant Ryan’s place, but was
immediately assigned to General Smith’s Staff.

DECORUM EST PRO PATRIA MORI.

FIRST LIEUT. GEORGE W. BROWN,


Co. D, born in Rochester, was an only son, and, employed
as a mercantile clerk, proved a most efficient and
trustworthy young business man.
Regiment as a private. Promoted to Lieutenant of Co. D,
he fell mortally wounded at the battle of Williamsburg. His
agreeable manners and gallant conduct had endeared him
to the Regiment, every member of whom mourned his loss
as if he had been a brother.

FIRST LIEUTENANT MOSES CHURCH,


Co. E, was born in New England, about the year 1817. He
was residing at Geneseo when the war commenced,
engaged in the hardware business. Fond of military
pursuits, he connected himself with a Militia company, and
was chosen Lieutenant. On the organization of Co. E, he
was elected First Lieutenant, participated in the various
skirmishes near Washington, and the battles at
Williamsburg and Mechanicsville. At the battle of Golden’s
Farm he exhibited great bravery, going fearlessly out in
front of the breastwork, and firing round after round at the
enemy, until he fell dead, pierced through the head by a
minie-ball. He was universally beloved by the Regiment. A
brave soldier, and skilful officer, he died, leaving a bright
record behind him.

FIRST LIEUT. CHARLES D. ROSSITER,


Company D, the youngest son of William and Electa B.
Rossiter, was born in Rochester, New York, March 4th,
1842. His parents soon after removed to Little Falls,
Herkimer County, N. Y., where his father died. In the year
1856 he entered the Farmers and Mechanics’ Bank at
Rochester, and, writes the Cashier, “though quite young,
soon learned to count money, and became a very rapid
and accurate accountant.”
In 1861 he enlisted in Co. G, Fifty-fourth Regiment N. Y.
S. M., and was soon after promoted to Sergeant. During
the following spring he was again promoted to Orderly,
and served in that capacity with his home regiment until
September, when he was authorized with others to raise a
Company for the war.
Lieutenant Rossiter was wounded by a ball, entering
the left side and passing completely through his body, in
the fatal retreat of Sedgwick’s Corps, after storming
Fredericksburg Heights. He was carried in a blanket a
short distance by four of his faithful men, but owing to the
extreme pain it gave him, he asked to be left behind, and
was accordingly left in the rebel hospital at Banks’ Ford.
He lived just a week from the day he was wounded, and
owing to a merciful peculiarity of the wound, his sufferings
were not excessive. Lieutenant Roach, at the risk of his
own life, succeeded in finding his body, and at dead of
night carried it on his shoulders nearly a mile. Lieutenant
Rossiter’s remains were taken to Rochester and interred
at Mount Hope, May 20th, 1863.
In a communication written since his death, his Captain
says of him, “Charlie was ardent and enthusiastic, firmly
devoted to his country’s good, and he fell nobly, a martyr
to her cause. I have never seen an officer to whom the
trying scenes of a battle-field were new, bear himself with
more bravery and cool courage than did Charlie.”

Lines written on his death.

Aye! Lay the banner across his breast,
With chaplets twine the marble brow,
It will be calmer now.
What boon but this demand the brave,
A warrior’s fame, a warrior’s grave?

This land, where peace and plenty reign,
He left for a field of death and strife,
To offer up, in Freedom’s fane,
A sacrifice—his life.
More glorious gift could mortal give?
He died, but oh! his name shall live.

But hark! though death has brought relief,
An honor saved, a glory won;
The voice of woe, “My son! my son!”
No wonder if her grief be wild,
He was the widow’s only child.

Loved ones, bereaved ones, no more from sleep
Wake in the silent hours wildly to weep;
All does not die with the swift-fleeting breath,
There is light in the darkness; even in death.


SERGEANT-MAJOR GEORGE W. BASSETT


Was born in the town of Barrington, Yates County,
November 6, 1838. When the war broke out, he was a
Law Student at Penn Yan. Enlisting as a private in Co. I,
he was chosen Third Sergeant, and promoted to
Sergeant-Major, May 22nd, 1861. Having borne off
Lieutenant Mix from the battle-field of Antietam, he
returned to the front and was immediately shot through the
head. By his winning ways and zealous attendance to
duties, he had won the esteem of his officers and
commanders, and fell universally regretted.

TO THE NEW YORK THIRTY-THIRD REGIMENT.


BY A. A. H.

Oh! where are those heroes; the first in the fight,
The brave Thirty-third with their standard so bright,
Unfurled to the breeze in the enemy’s view,
As they shouted aloud for the Red, White and Blue?
We saw them depart like a host from our shore;
Their guns on their shoulders they gallantly bore.
The path of their fathers they fearlessly trod;
Their bosoms beat proudly, their trust was in God.
Their steps never faltered, their hearts never failed,
At the glance of the traitors their eye never quailed.
On the red field of glory they fought undismayed;
On the red field of glory their relics are laid.
Now chant we their requiem, mournful and slow,
In deep thrilling tones let its melody flow;
Ah! well may we tell of their triumphs with pride,
Like warriors they fought, and like heroes they died;
Farewell to the dauntless, farewell to the brave!
Unshrouded they sleep in a far distant grave;
But fadeless, immortal their memory shall bloom,
And freedom with roses shall scatter their tomb.
Of the brave Thirty-third doth a remnant remain,
Whose gallant commander shall lead them again,
And the heart of rebellion grow cold as it feels
The plunge of their weapons, the wounds of their steel.
Their bright swords are gleaming, their banner unfurled
By the soft floating zephyrs, is gracefully curled;
They are restless, impatient the charge to renew,
They are shouting aloud for the Red, White and Blue.
GENERAL ORDERS PERTAINING TO
THE ARMY.
PICKET DUTY.
This most important feature, for the safety of an army, is perhaps
the least understood of all that appertains to the art of war. As the
same system is germane to all armies, the following explanation will,
no doubt, be acceptable in this volume, as the duty has been seldom
described, though often spoken of, in the numerous details of
midnight attacks, and skirmishes. In the disposition of, say two
hundred men, they go forth to a point designated as the grand
reserve, varying in distance from two hundred rods to nearly a mile
from the outer or picket line, where are left half of the number as a
reserve, in case the pickets are driven in, and also for mutual relief in
their fatiguing duties (often out on picket for three days). Then, to the
right and nearer the line, is stationed an officer and forty-eight men,
who immediately relieve the line of men (who are out in front) sixteen
in number, leaving thirty-two men on the support, so-called—or two
more reliefs, relieving each other every two hours; the same on the
left support.
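The disposition just described reduces to simple arithmetic, and may be sketched for the modern reader in a few lines of code (a hypothetical illustration added by the editor; the names, and the six-hour cycle derived from three reliefs of two hours each, are inferred from the figures above, not stated in the original):

```python
# Picket arithmetic for a 200-man detail, as described above:
# half remain at the grand reserve; each of two supports receives an
# officer and forty-eight men, of whom sixteen stand post at a time,
# the remaining thirty-two forming two further reliefs that rotate
# every two hours.

DETAIL = 200
GRAND_RESERVE = DETAIL // 2         # 100 men resting at the grand reserve
SUPPORT = 48                        # men on each support (right and left)
ON_POST = 16                        # men actually out on the picket line
RELIEFS = SUPPORT // ON_POST        # three reliefs of sixteen men each
HOURS_PER_RELIEF = 2                # each relief stands post two hours
CYCLE = RELIEFS * HOURS_PER_RELIEF  # six hours before supports are relieved

def relief_on_post(hour):
    """Which of the three reliefs (0, 1, or 2) stands post at a given hour."""
    return (hour % CYCLE) // HOURS_PER_RELIEF

# The 100 men of the grand reserve suffice to furnish the 96
# (48 to each support) sent forward at the six-hour change.
assert GRAND_RESERVE >= 2 * SUPPORT
```

Run through, the sketch confirms the text's accounting: 100 men at the grand reserve and two supports of 48 make 196, the small remainder covering officers; and three reliefs of two hours each fill the six hours between changes of the supports.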
When six hours have passed, the three reliefs on each support
having stood on post their two hours each, the Grand Reserve sends
out the ninety-six men who have been resting—forty-eight to each
support—they, in turn, going through the same routine—the first
ninety-six men going back to the main reserve to rest, &c. Thus the
whole thing is systematized, the Grand Reserve and the supports
alternately relieving each other, until the whole time for which they
are detailed expires, when another detail from some other Regiment
relieves the whole picket. The Picket Guard is always commanded
by a Staff Officer. The following is a specimen of an order from
Brigade Headquarters, detailing a Picket Guard from the Thirty-third
Regiment:

Headquarters 3d Brigade, 2d Division, 6th Corps.


(Special Orders.)
Ten Commissioned Officers, fifty Non-Commissioned
Officers, and three hundred and fifty Privates, will be detailed
from the Thirty-third Regiment N. Y. S. Volunteers, for Picket
Guards, and will mount at 9.30 A. M.
Major John S. Platner, 33d N. Y. S. V., and Assistant-
Surgeon Richard Curran, of the same Regiment, will
accompany the detail, which is to remain on duty for three
successive days.
Grand Guard Mounting will be had according to Butterfield’s
System, on which a Division Staff Officer will perform the
