Thermal Analysis - Testing - Labs - Guide - EN - LR
Applications
Booklet
Testing Labs Guide
Thermal analysis (TA) is an effective tool for materials characterization found in testing laboratories
worldwide. It encompasses a broad range of specialized characterization techniques used to investigate samples
from almost any industry sector – from pharmaceuticals & food to automotive, construction and electronics.
At the same time, local, state and federal regulatory agencies are placing increasing emphasis on the need for
testing laboratories to be GLP compliant and/or accredited to ISO/IEC 17025. These standards demonstrate that a
laboratory operates to a global standard and has passed a rigorous examination of methods, facilities and staff.
They also assure that the laboratory is capable of producing data that are accurate, traceable and reproducible.
This complicated process can be simplified and streamlined with a STARe thermal analysis system, making
accreditation an achievable goal for all testing laboratories.
This guide provides a comprehensive overview of Thermal Analysis (TA) and its applications for testing
laboratories; it includes:
• An introduction to GLP (Good Laboratory Practice), ISO/IEC 17025 standards, and the unique
features available in STARe software that help meet technical requirements for compliance/accreditation.
• A description of the main TA techniques
• Main effects and properties measured by TA
• Automated solutions for seamless TA workflows
• Real application examples with tips and hints for proper result interpretation
• Measurement uncertainty
• How to properly calibrate and adjust your instrument
• Standards in Thermal Analysis
• Data integrity
• Useful accessories
• Services provided by METTLER TOLEDO
Disclaimer
The selected application examples presented in this guide were conducted with the utmost care and in accordance
with our present knowledge. METTLER TOLEDO does not assume responsibility for any experiments carried out
using the methods or instruments described in this handbook; the analyst alone is responsible for ensuring the
safety and accuracy of their measurements.
When chemicals, solvents and gases are used, general safety rules and the instructions given by the manufacturer
or supplier must be observed.
A trademark (denoted by the ® or TM symbol) identifies commercial products that have been registered.
Introduction and Overview
1. What Can Thermal Analysis Do for Testing Laboratories
1.1. Purpose of thermal analysis measurement
1.2. Main effects and properties measured by thermal analysis
1.2.1. Glass transition
1.2.2. Melting temperature and melting behavior
1.2.3. Purity determination
1.2.4. Compositional analysis, decomposition reactions
1.2.5. Stability
1.2.6. Phase diagrams
1.2.7. Polymorphism
2. Computerized Systems and Their Validation
2.1. Introduction
2.2. What is a computerized system?
2.3. Computerized system “validation”
2.3.1. Validation of the computer system and the equipment
2.3.2. Qualification of operators
2.3.3. Method validation
3. An Overview of the Various Thermal Analysis Techniques
3.1. Differential Scanning Calorimetry
3.2. Thermogravimetric Analysis
3.3. Thermomechanical Analysis
3.4. Dynamic Mechanical Analysis
3.5. Evolved Gas Analysis
3.6. Thermomicroscopy
4. Applications and Their Thermal Analysis Techniques
5. Seamless Automated Workflows with Comprehensive TA Software
5.1. The automation solution for DSC, TGA and TGA/DSC
5.2. Some practical software options
6. Application Examples
6.1. Identity check of PP/PE copolymer
6.2. DSC and hot stage – Investigation of polymorphism
6.3. Compositional analysis of rubber
6.4. DSC – Thermal stability of edible fats and oils
6.5. TMA – Linear expansion coefficient of inorganic material
6.6. DMA – Young’s modulus of a composite material
7. Calibration and Adjustment
8. Uncertainty of Measurement
8.1. What is uncertainty of measurement?
8.2. What are the sources of measurement uncertainty?
8.3. Estimation of the uncertainty
9. Standardization in Thermal Analysis
10. Data Integrity
11. Excellent Crucibles, Accessories and Reference Materials
11.1. Crucibles
11.2. Reference materials
12. More Educational Resources
Thermal Analysis Application Handbooks
1. What Can Thermal Analysis do for Testing Laboratories
Typical tasks include final goods release, product failure analysis and comparison of competitive materials.
Thermal analysis applications for testing labs are perhaps as versatile as the samples
received. The following section briefly summarizes the various effects or properties –
in polymers, chemicals and petrochemicals, pharmaceuticals and food – that can be
investigated by thermal analysis.
• Failure analysis
Thermal analysis techniques contribute substantially to failure analysis, providing relevant details on material
properties such as glass transition, melting behavior, purity, curing status, oxidation stability, filler content,
and more.
• Material Comparisons
Material comparisons are used for several different purposes:
• Batch-to-batch comparisons in quality control
• Fail/pass analysis
• Evaluation of ageing processes
• Competition analysis
The new Reference Library option in STARe software is an ideal tool for comparing materials.
Thermal analysis includes a number of versatile techniques firmly established as analytical methods for
materials characterization.
1.2.1. Glass transition
The glass transition occurs when an amorphous material is heated or cooled in a particular temperature range,
resulting in a change in the specific heat (cp; www.mt.com/ta-cp). Above the glass transition temperature (Tg),
polymers or glasses become soft and can be plastically deformed. Below Tg, the material turns hard and brittle.
Knowledge of the glass transition is important for optimizing production parameters and the properties of products.
In addition, the glass transition can be used to identify and compare materials and is therefore important for
quality assurance and failure analysis.
www.mt.com/ta-dsc
www.mt.com/ta-tma
1.2.2. Melting temperature and melting behavior
The melting peak contains the information needed to determine the melting point or range, the melting behavior,
and the enthalpy of fusion. Polymorphic and eutectic transitions can also be identified from their respective
melting curves.
www.mt.com/ta-dsc
1.2.3. Purity determination
Ensuring the purity of fine and specialty chemicals is essential, as undesirable contaminants may have serious
consequences for their further processing and use. Purity determination is performed routinely during the
production of chemicals by evaluating the peak shape of the DSC melting curve. The method is based on the
van’t Hoff law of melting point depression caused by the presence of impurities. Purity levels between 90 and
100 mole percent can be determined with excellent accuracy. Purity, the melting point and the heat of fusion
of a sample can all be determined from a single DSC melting curve.
www.mt.com/ta-dsc
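The evaluation rests on the van’t Hoff relation for melting-point depression. In its standard simplified textbook form (the exact formulation implemented in commercial evaluation software may differ):

```latex
% Van't Hoff melting-point depression (standard simplified form)
T_s \;=\; T_0 \;-\; \frac{R\,T_0^{2}\,x_{\mathrm{imp}}}{\Delta H_f}\cdot\frac{1}{F}
```

where T_s is the sample temperature during melting, T_0 the melting point of the pure substance, R the gas constant, x_imp the mole fraction of impurity, ΔH_f the molar enthalpy of fusion, and F the fraction of the sample that is molten. Plotting T_s against 1/F gives a straight line whose slope is proportional to the impurity content.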
1.2.4. Compositional analysis, decomposition reactions
Thermogravimetry (TGA) is ideal for compositional analyses of a variety of samples, including polymers, plastics,
composites, laminates, adhesives, food, coatings, pharmaceuticals, organic materials, rubber, petroleum,
chemicals, explosives and biological samples. The method is based on the stepwise degradation of the sample
in an inert and/or oxidative atmosphere. For example, the Noack method (ASTM D6375) is a widely used TGA
application for the analysis of lubricants. Evolved gases from TGA can be analyzed by Fourier transform infrared
spectroscopy, mass spectrometry, gas chromatography-mass spectrometry or micro gas chromatography-mass
spectrometry (FTIR spectroscopy, MS, GC/MS and Micro GC/MS, respectively).
www.mt.com/ta-tga
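The step-evaluation principle described above can be sketched in a few lines of code. All numeric values and step limits below are hypothetical; in practice they are read off the DTG curve of the actual measurement.

```python
import bisect

# Sketch: compositional analysis from TGA step heights.
# Curve values and step limits are hypothetical; real evaluations take
# the limits from plateaus identified on the DTG curve.

def mass_at(temps, mass_pct, t):
    """Linearly interpolated mass signal (% of initial mass) at t (deg C)."""
    i = bisect.bisect_left(temps, t)
    if i == 0:
        return mass_pct[0]
    if i == len(temps):
        return mass_pct[-1]
    t0, t1 = temps[i - 1], temps[i]
    m0, m1 = mass_pct[i - 1], mass_pct[i]
    return m0 + (m1 - m0) * (t - t0) / (t1 - t0)

# Hypothetical TGA curve of a carbon-black-filled polymer
temps    = [25, 150, 300, 550, 600, 700, 800]   # deg C
mass_pct = [100, 99, 98, 38, 38, 8, 8]          # % of initial mass

volatiles    = mass_at(temps, mass_pct, 25) - mass_at(temps, mass_pct, 300)
polymer      = mass_at(temps, mass_pct, 300) - mass_at(temps, mass_pct, 550)
carbon_black = mass_at(temps, mass_pct, 600) - mass_at(temps, mass_pct, 700)
residue      = mass_at(temps, mass_pct, 800)    # inorganic filler

print(volatiles, polymer, carbon_black, residue)  # 2.0 60.0 30.0 8.0
```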
1.2.5. Stability
Stability testing is necessary to ensure that the quality of a product, for example a pharmaceutical, remains
acceptable throughout its entire storage lifetime. TGA permits the determination of shelf life and requires only
a small amount of sample.
In the case of long-term tests, samples that have been stored for variable periods of time and under different
conditions are analyzed at regular intervals. Product alterations can be easily recognized by shifts or changes in
the measurement results. In addition, decomposition reactions can be investigated kinetically by measuring
samples under different dynamic conditions (heating rates). These results enable the prediction of decomposition
behavior under conditions where measurements are difficult to perform or where the reaction times are very short
or very long.
www.mt.com/ta-dsc
www.mt.com/ta-tga
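As an illustration of kinetic analysis from measurements at several heating rates, the classic Kissinger evaluation (one common approach; commercial TA software typically offers more elaborate model-free kinetics) fits ln(β/Tp²) against 1/Tp, the slope yielding the apparent activation energy. All values below are hypothetical.

```python
import math

# Sketch: Kissinger evaluation of decomposition kinetics.
# Heating rates and DTG peak temperatures below are purely illustrative.

R = 8.314  # gas constant, J/(mol*K)

def kissinger_activation_energy(heating_rates, peak_temps_c):
    """Apparent Ea (J/mol) from the slope of ln(beta/Tp^2) versus 1/Tp."""
    xs, ys = [], []
    for beta, tp_c in zip(heating_rates, peak_temps_c):
        tp = tp_c + 273.15  # convert to kelvin
        xs.append(1.0 / tp)
        ys.append(math.log(beta / tp ** 2))
    n = len(xs)
    xbar, ybar = sum(xs) / n, sum(ys) / n
    slope = (sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys))
             / sum((x - xbar) ** 2 for x in xs))
    return -slope * R

betas = [2, 5, 10, 20]                  # heating rates, K/min
peaks = [210.0, 221.5, 230.6, 240.1]    # DTG peak maxima, deg C (hypothetical)
ea = kissinger_activation_energy(betas, peaks)
print(f"Apparent activation energy: {ea / 1000:.0f} kJ/mol")
```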
1.2.6. Phase diagrams
Questions about the melting behavior of mixtures can be answered with the help of a phase diagram, which
describes the relationship between the melting temperature and the composition of a multi-component system.
To construct such a phase diagram, mixtures of the components with different compositions are measured by DSC.
www.mt.com/ta-dsc
1.2.7. Polymorphism
Many substances exhibit polymorphic behavior, which is the ability of a single substance to exist in different
crystalline forms. Despite having the same chemical composition, polymorphs exhibit differences in their physical
properties such as melting point, heat of fusion, solubility, or bioavailability. For example, one polymorph may
be more readily absorbed than another, which may be inactive or even toxic. It is important to be aware of such
modifications in order to optimize production and storage conditions and to ensure only the desired polymorphic
form is present.
www.mt.com/ta-dsc
2. Computerized Systems and Their Validation
2.1. Introduction
Laboratories providing services in or for industries whose products may directly affect human and animal health
and safety, or the environment (e.g. pharmaceutical, pesticide and cosmetic products, veterinary drugs, food
additives, and industrial chemicals), are regulated by Good Laboratory Practices (GLPs, e.g. the FDA's GLP or
OECD GLP).
GLPs in general are focused on product safety. Another approach, adopted by the International Organization
for Standardization, is to set a standard defining the “General requirements for the competence of testing and
calibration laboratories”. This standard is known as ISO/IEC 17025, and it focuses on the procedural aspects of
conducting measurements in a laboratory. Today, ISO/IEC 17025 is a widely accepted standard followed by many
testing and calibration labs, independent of whether they must follow GLP regulations or not.
Despite the fundamentally different intentions of the OECD/FDA GLP regulations (focused on products) and
ISO/IEC 17025 (focused on procedures), there is a common aim: the focus on “documented evidence”, i.e. the
requirement that all activities in daily laboratory practice (e.g. calibration, system changes, sampling,
measurements, evaluations, etc.) have to be documented, either in written form or electronically. In 2017,
ISO/IEC 17025 was updated to align with the current working environment of testing laboratories, which includes
computerized systems, electronic records and the production of electronic results and reports.
Compliance with both GLPs and ISO/IEC 17025 requires that computerized systems be validated before use.
In what follows, we describe what is meant by a computerized system, and how such systems are validated.
2.2. What is a computerized system?
“A computerized system is a function (process or operation) integrated with a computer system and performed
by trained personnel. The function is controlled by the computer system. The controlling computer system is
comprised of hardware and software. The controlled function is comprised of equipment to be controlled and
operating procedures performed by personnel”.
OECD Series on Principles of Good Laboratory Practice and Compliance Monitoring No. 17
A generally accepted model of a computerized system is depicted in Figure 1. It shows that a computerized system
consists of a computer system, which can be further subdivided into the computer, all kinds of peripheral devices
(printer, cables, network, etc.), and the software consisting of an operating system and some instrument specific
control software. On the other side, we have what are called the “Controlled functions” including the equipment
(i.e. analytical instrument), but also the operator(s) of the computerized system as well as the procedures to be
used (SOPs).
(Figure 1: Model of a computerized system.)
2.3. Computerized system “validation”
The aim of computerized system validation is to show that the system works and is fit for its intended purpose
according to certain specifications. Such specifications are written in the User Requirement Specification or URS.
The URS documents the functional requirements of the instrument (e.g. temperature range, automation needs),
application-specific requirements (e.g. temperature accuracy, sensitivity, etc.) and user-relevant software
features. The URS is probably the most important part of “Design Qualification” (DQ). In addition to the URS,
the DQ details certain “must” constraints regarding the vendor, e.g. certification according to ISO 9001, and
the vendor's offerings regarding service (e.g. calibration), training and application support.
The computerized system comprises a number of “elements” (see Figure 1). In principle, each “element” has to
be validated individually. In practice however, an integral validation of the computer system and the equipment
is performed. In addition, method validation and operator qualification are required.
2.3.1. Validation of the computer system and the equipment
Validation of the computer system and the equipment is done in three steps:
1. Installation qualification (IQ): Are all components present and properly installed?
2. Operational qualification (OQ): Do the computer system and the equipment operate according to the vendor's
specifications (e.g. temperature accuracy, drift, noise)?
3. Performance qualification (PQ): Does it work for your specific application? Can you solve your analytical task
with this equipment?
If the ensemble of the computer system and the equipment passes these three qualification steps, routine operation
can be started. Routine operation, however, will require periodic testing and calibration of the equipment.
Software validation, which is also part of validating the computer system and the equipment, poses a particular challenge:
• It is difficult to define specifications for software
• It is difficult to define test procedures and acceptance criteria
• With networks, firewalls, servers, clouds, etc., computer systems become more and more complex.
The Good Automated Manufacturing Practice (GAMP) guide on validation of computerized systems distinguishes
between five categories of control software for analytical instruments. STARe software is designed as a closed
system, more precisely an environment in which system access is controlled by persons who are responsible for
the content of electronic records present on the system. As such, STARe software belongs to Category 3, meaning
there is no fixed rule as to the validation approach.
2.3.2. Qualification of operators
GLP and ISO/IEC 17025 laboratories require measurements to be performed by qualified personnel. Thus, it has to
be documented that users have been trained on the system. METTLER TOLEDO offers basic and advanced training
courses. When successfully completed, participants obtain a certificate issued by METTLER TOLEDO.
www.mt.com/ta-validation-hb
2.3.3. Method validation
An analytical method is validated to ensure the quality and reliability of its results, i.e. to demonstrate that
the method is suitable for its intended purpose. During validation, method-relevant performance parameters are
determined (e.g. measurement uncertainty, limit of detection, selectivity) and compared with the respective
requirements of the analytical task. Validation is an ongoing process linked to the method's life cycle.
Remarks:
• Standard methods, i.e. methods published by national or international organizations (such as ISO, ASTM or
DIN), are considered validated methods and do not need any further validation. Comment: in many standard
methods, performance parameters are not specified at all, or only inadequately. This is particularly true for
generic methods (e.g. ASTM E928-01, purity determination by DSC).
• Interlaboratory studies in the validation process provide data on the repeatability and comparability of results.
3. An Overview of the Various Thermal Analysis Techniques
Thermal Analysis (TA) is the term used to describe the analytical techniques that measure the physical and
chemical properties of a sample as a function of temperature or time. The sample is subjected to a temperature
program, which consists of a series of preselected segments in which the sample is heated or cooled at a
constant rate or held isothermally. Samples can also be subjected to different atmospheres, for example air
(oxidizing) or nitrogen (inert), and to different pressures using a high-pressure DSC. In particular,
investigations at defined relative humidity are becoming more and more important. These can be carried out
with the Sorption Analyzer System available for TGA, TMA and DMA.
www.mt.com/ta-techniques
3.1. Differential Scanning Calorimetry
Differential scanning calorimetry (DSC) measures the heat flow produced in a sample when it is heated, cooled,
or held isothermally. The measurement signal is the energy absorbed or released by the sample, in milliwatts
(mW). Melting points, crystallization behavior, specific heat capacity and chemical reactions are just some of
the many properties and processes that can be measured by DSC.
(Figure: typical DSC curve – heat flow versus temperature, with the endothermal direction indicated.)
www.mt.com/ta-dsc
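The enthalpy associated with a DSC effect is obtained by integrating the baseline-corrected heat flow over time; a minimal sketch with a synthetic triangular peak (real evaluations offer several more elaborate baseline types):

```python
# Sketch: enthalpy of a DSC effect from the baseline-corrected heat flow.
# The triangular peak below is synthetic and stands in for measured data.

def enthalpy_j_per_g(times_s, heatflow_mw, baseline_mw, sample_mg):
    """Trapezoidal integral of (signal - baseline); mW*s per mg = J/g."""
    area_mj = 0.0
    for i in range(1, len(times_s)):
        dt = times_s[i] - times_s[i - 1]
        h0 = heatflow_mw[i - 1] - baseline_mw[i - 1]
        h1 = heatflow_mw[i] - baseline_mw[i]
        area_mj += 0.5 * (h0 + h1) * dt
    return area_mj / sample_mg

# Triangular peak: 10 mW high, 60 s wide, measured on a 5 mg sample
times  = [0.0, 30.0, 60.0]
signal = [0.0, 10.0, 0.0]
base   = [0.0, 0.0, 0.0]
print(enthalpy_j_per_g(times, signal, base, 5.0))  # 300 mJ / 5 mg = 60.0 J/g
```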
3.2. Thermogravimetric Analysis
Thermogravimetric analysis (TGA) measures the mass of a sample as it is heated, cooled or held at a constant
temperature in a defined atmosphere – usually nitrogen (inert) or air/oxygen (oxidative). The mass is measured
using a highly sensitive electronic balance, with any interfering buoyancy or gas flow effects being corrected
with blank curves.
(Figure: typical TGA curve – mass in % versus temperature, 25 to 400 °C.)
www.mt.com/ta-tga
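The blank-curve correction mentioned above amounts to a point-by-point subtraction of a run recorded with an empty crucible under the identical temperature program; a minimal sketch with hypothetical values:

```python
# Sketch: blank-curve correction of a TGA signal. The blank curve must be
# recorded with the same temperature program, crucible type and gas flow
# as the sample run. All numeric values are hypothetical.

def blank_corrected(sample_mg, blank_mg):
    if len(sample_mg) != len(blank_mg):
        raise ValueError("curves must come from the same temperature program")
    return [s - b for s, b in zip(sample_mg, blank_mg)]

sample = [10.000, 10.012, 9.515, 9.020]  # apparent mass, mg
blank  = [0.000, 0.012, 0.015, 0.020]    # buoyancy/gas-flow artifact, mg
print(blank_corrected(sample, blank))    # close to [10.0, 10.0, 9.5, 9.0]
```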
3.3. Thermomechanical Analysis
Thermomechanical analysis (TMA) is used to study the dimensional changes of a material as a function of
temperature or time. In TMA, the sample is subjected to a constant force, an increasing force, or a modulated
force, whereas in dilatometry, dimensional changes are measured using the smallest possible load. Depending on
the measurement mode used, TMA allows you to measure:
• thermal expansion and shrinkage behavior,
• softening, and
• changes in the mechanical properties of materials induced by physical or chemical transitions such as the
glass transition, crystallization, melting and curing.
www.mt.com/ta-tma
3.4. Dynamic Mechanical Analysis
Dynamic mechanical analysis (DMA) is used to study the viscoelastic properties and behavior of a wide range
of materials. Samples are subjected to a sinusoidal mechanical stress as a function of temperature or frequency.
Depending on the measurement mode used, DMA determines either the shear modulus or the Young’s modulus.
The most important results obtained from DMA are the temperatures that characterize a thermal effect, the loss
angle (the phase shift), the mechanical loss factor (the tangent of the phase shift), and the tensile or shear
storage and loss moduli.
DMA experiments can also be performed in liquids using the Fluid Bath option. The entire sample holder and
sample is immersed in the liquid. The Fluid Bath option consists of a special immersion bath and external
temperature control using a circulating heating bath or chiller.
www.mt.com/ta-dma
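The quantities listed above are related through the complex modulus; a minimal sketch with hypothetical values:

```python
import math

# Sketch: storage modulus, loss modulus and loss factor from the stress
# and strain amplitudes and the phase shift of a DMA measurement.
# All numeric values are hypothetical.

def dma_moduli(stress_amp_pa, strain_amp, delta_rad):
    e_star = stress_amp_pa / strain_amp           # |E*|, complex modulus magnitude
    e_storage = e_star * math.cos(delta_rad)      # E', storage modulus
    e_loss = e_star * math.sin(delta_rad)         # E'', loss modulus
    return e_storage, e_loss, e_loss / e_storage  # last value is tan(delta)

# 1 MPa stress amplitude, 0.1 % strain amplitude, 5 degree phase shift
e1, e2, tan_d = dma_moduli(1.0e6, 0.001, math.radians(5.0))
print(f"E' = {e1:.3e} Pa, E'' = {e2:.3e} Pa, tan(delta) = {tan_d:.4f}")
```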
3.5. Evolved Gas Analysis
In the broader definition, evolved gas analysis (EGA) investigates the nature of the volatile products released
by a substance as it is heated. This can be done using many different types of techniques and equipment. Coupling
a METTLER TOLEDO TGA or TGA/DSC to an FTIR, MS, GC/MS or Micro GC(/MS) instrument creates a TGA-EGA system.
The thermogravimetric analyzer records the mass loss of the sample while the evolved gas analyzer simultaneously
provides information about the gaseous products evolved (e.g. moisture, solvents or additives) from processes
such as evaporation, desorption, decomposition, and chemical reactions.
www.mt.com/ta-ega
3.6. Thermomicroscopy
It is often difficult from a thermoanalytical measurement alone (e.g. DSC) to know what exactly occurs in the
sample when it is heated. Additional information derived from the visual appearance can aid interpretation or
support hypotheses that have already been proposed.
Results from thermal analysis coupled with optical analysis (thermo-optical analysis) can be obtained by using
light transmission microscopy or reflected microscopy during the thermoanalytical measurement.
For thermal analysis by transmission microscopy, hot-stage microscopy is used, while for thermal analysis by
reflection microscopy, DSC-microscopy is the method of choice.
www.mt.com/ta-toa
4. Applications and Their Thermal Analysis Techniques
The table below summarizes the effects and properties of chemical compounds that can be investigated
by various thermal analysis techniques.
DSC: Differential Scanning Calorimetry; TMA: Thermomechanical Analysis; EGA: Evolved Gas Analysis;
TGA: Thermogravimetric Analysis; DMA: Dynamic Mechanical Analysis; HS: Hot-Stage Microscopy
Compositional analysis • • •
Chemical reactions • • •
Content determination •
Crystallization / Crystallinity • •
Enthalpy changes •
Evaporation / Drying • •
Expansion coefficients •
Glass transition • • •
Heat of transition •
Identification • • •
Interactions / Compatibility •
Kinetic analysis • •
Oxidative stability • • •
Phase diagrams •
Phase transitions • •
Polymorphism / Pseudopolymorphism • •
Purity determination • •
Safety investigations •
Shear Modulus •
Shrinking / Swelling •
Sorption / Desorption •
Thermal stability • • • •
Viscoelastic behavior • •
Young's Modulus • •
www.mt.com/ta-applications
5. Seamless Automated Workflows with Comprehensive TA Software
Today’s analytical laboratories face growing demand for increased testing capacity – often with limited resources.
This poses a huge challenge for testing and contract laboratories, which must still achieve reliable results and
compliant documentation. More and more labs are therefore turning to automated systems that carry out many
labor-intensive and error-prone tasks formerly performed by humans.
METTLER TOLEDO provides a unique combination of technical capabilities and Laboratory Information Management
Systems (LIMS) connectivity to support this trend in thermal analysis. Central to this is the innovative STARe
thermal analysis software, through which complete investigation of materials is managed and controlled – from
sample management and measurement to interpretation and validation of results. Within the software all
measurement methods, measurement data, calibration data and evaluations of results are stored in a secure
database with date and time stamp. Therefore, they are traceable at all times. It is impossible to delete or change
data by accident.
The ideal TA workflow detailed below seamlessly integrates automation with DSC technology. The workflow,
from the perspective of automation, can be structured into five steps.
(Figure: five-step automated TA workflow.)
Step 1: LIMS order
The LIMS system sends a request to the STARe software to run an experiment, evaluate results and generate
a report.
Step 2: Sample identification
Sample containers can be labeled with barcodes for quick, error-free sample identification and transfer to
STARe. The barcode reader may be connected to either the PC or a METTLER TOLEDO balance.
Step 3: Sample weigh-in via STAReX
Weighing results can be transferred electronically from a METTLER TOLEDO balance to STARe by interfacing
the TA software directly with LabX® — METTLER TOLEDO’s balance software. The STAReX™ link reduces
transcription errors, speeds up the analytical process and facilitates the second person review required by
regulated industries.
Step 4: Automatic result evaluation
Following a measurement, the resulting curve must be evaluated. Particularly where repeated analyses must
be performed, automated evaluations — e.g. via the STARe EvalMacro option — can speed up repetitive tasks
and eliminate operator bias. EvalMacro performs evaluations of the same type fully automatically and enables
graphical comparison and statistical evaluation of results to ensure they lie within predefined limits.
Step 5: LIMS result transfer
In the final step, results are transferred to the LIMS database, which supports most file formats, including
graphics (e.g. tiff), text files and List & Label reports.
www.mt.com/ta-software
5.1. The automation solution for DSC, TGA and TGA/DSC
The robust, endurance-tested sample robot functions reliably and efficiently throughout the day.
5.2. Some practical software options
Fully integrated automated workflows are supported by adding relevant software options such as:
EvalMacro – with the reliable sample changer and the automatic evaluation and assessment of results,
experiments can be fully automated from the measurement to the display and storage of the results.
Quality Control – sample measurements can be compared to known reference curves and results
transferred to a statistics table for quick and easy statistical evaluations and trend analysis.
www.mt.com/ta-qc
Reference Library – facilitates the interpretation of results and helps with materials’ identification,
e.g. for failure analysis.
www.mt.com/ta-libraries
Data Integrity – provides password access-control to the application, assigns user-rights for each user-level,
classifies data based on levels of sensitivity, and assigns users to groups or projects.
www.mt.com/ta-dataintegrity
LIMS Connectivity – automate workflows, integrate laboratory operations, and manage samples and
associated information.
Asset Management – obtain a detailed overview of a thermal analysis system, even if installed in multiple
laboratories or sites. Monitor each instrument, including its date of purchase, calibration history and service log.
www.mt.com/ta-swupdates
6. Application Examples
Deformulation analysis uses a combination of analytical methods and conventional extraction methods
to identify and quantify the components of a complex mixture. This process is often used for competitive or
failure analysis. In this example, the components of an unknown elastomer were identified and quantified using
thermogravimetric analysis (TGA) and TGA coupled to gas chromatography-mass spectrometry (TGA-GC/MS)
via a heated storage interface (IST).
TGA results
The TGA results are presented in Figure 1. The TGA and first-derivative (DTG) curves are shown as solid and
dashed lines, respectively. Curves under nitrogen atmosphere are displayed in black and curves under oxygen are
displayed in red. The small step at low temperature is due to the vaporization of volatile compounds,
corresponding to about 6% of the sample. Subsequently, the DTG curve shows two clear peaks caused by the
pyrolysis of the polymeric components. The second mass-loss step, with a DTG peak maximum at about 370 °C
(typical of the pyrolysis of NR), amounts to about 27%. The third decomposition step (DTG peak maximum at
about 440 °C), caused by the pyrolysis of other polymeric components, amounts to about 24%. After switching
from N2 to air, the TGA curve shows a mass loss of 24% at about 550 °C due to the combustion of the carbon
black filler. A further mass loss of about 5.4% is observed at about 650 °C. It probably corresponds to the
loss of carbon dioxide from the decomposition of a carbonate.
TGA-IST-GC/MS
In order to identify individual elastomeric components present in the unknown rubber, the TGA was coupled to
gas chromatography-mass spectrometer (GC/MS) by means of a heated storage interface (IST) [1].
The compounds in the GC/MS total ion chromatogram (TIC; not shown here) were identified using NIST/EPA/NIH
Mass Spectral Library 2011. The emission profiles of the most relevant compounds are shown in Figure 2. The
presence of limonene and isoprene among the gases evolved clearly confirms that natural rubber (NR) is one
of the constituents of this rubber sample. In addition, the detection of 1,3-pentadiene and styrene among others
indicates the presence of butadiene rubber (BR) and styrene-butadiene rubber (SBR).
Conclusion
In this example, a quantitative analysis of an unknown rubber sample was performed using TGA and TGA-GC/MS.
The sample was identified as a ternary blend of NR, SBR and BR. To identify the carbonate present in the
rubber, ATR-FTIR analysis of the residue can also be performed directly after the TGA experiments.
References
[1] N. Fedelich, Thermogravimetric analysis and gas analysis Part 4: TGA-GC/MS, METTLER TOLEDO
UserCom 48 (2018), pp. 1–7.
www.mt.com/ta-ega
6.1. Identity check of PP/PE copolymer
DSC measurements can be used to check whether different copolymer samples are identical. Evidence is based
on the melting point temperature, heat of fusion, glass transition temperature or other thermal effects.
Evaluation
Interpretation
Each curve displays two melting peaks, one for PE and the other for PP. An estimate of their relative amounts
is possible based on an assumed heat of fusion of 70 J/g for PP (corresponding to a degree of crystallinity
of 34%).
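The crystallinity estimate works by comparing the measured heat of fusion with that of a hypothetically 100% crystalline polymer; a minimal sketch (the roughly 207 J/g reference value for fully crystalline PP is a commonly quoted literature figure, used here as an assumption):

```python
# Sketch: degree of crystallinity from the measured heat of fusion.
# The 100 %-crystalline reference value (about 207 J/g for PP) is a
# commonly quoted literature figure; treat it as an assumption.

def crystallinity_pct(delta_h_measured_j_g, delta_h_100_j_g):
    return 100.0 * delta_h_measured_j_g / delta_h_100_j_g

print(round(crystallinity_pct(70.0, 207.0), 1))  # about 34 %, as quoted above
```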
Conclusion
The molded part is clearly not made of Stamylan P46M10, based on its assessment by DSC, which permits
qualitative, as well as semi-quantitative analysis of copolymers.
www.mt.com/ta-dsc
6.2. DSC and hot stage – Investigation of polymorphism
Sample: Sulfapyridine
Application: Active ingredient
Conditions:
  Measuring cells: DSC, HS82 microscope hot stage
  Crucibles: DSC: aluminum 40 µL, hermetically sealed; HS82: sample holder with cover glass
  Sample preparation: As received, no preparation
  DSC measurement: Shock-cooled from the melt, then heated from 40 °C to 200 °C at 5 K/min
  Atmosphere: Nitrogen, 50 cm3/min
Interpretation
The DSC curve shows the polymorphic transitions of sulfapyridine. The sample had first been melted and then
shock-cooled from the melt. In accordance with Ostwald's rule, the metastable phase (B) crystallizes above the
glass transition Tg (A). This is followed by a monotropic, solid-solid transition on heating (C). The metastable
modification melts at (D), before crystallizing from the liquid phase (E) to form a stable modification, which
finally melts (F).
Evaluation
Conclusion
DSC and Thermo-optical analysis were applied to study the polymorphic behavior of sulfapyridine. Adding
hot-stage microscopy facilitates the interpretation of the individual phase transitions that are not readily detected
on the DSC curve.
www.mt.com/ta-dsc
www.mt.com/ta-toa
6.3. Compositional analysis of rubber
TGA is frequently used to analyze the composition of rubbers. In just one run, TGA permits the determination
of volatiles, elastomeric compounds, carbon black and inorganic fillers.
Evaluation
The TGA curve exhibits three steps. The first (below about 300 °C) corresponds to the loss of a small quantity
of volatile compounds. This is followed by pyrolysis of the polymer between about 300 and 550 °C; the step
height corresponds to the polymer content. At 600 °C, the atmosphere was switched from nitrogen to oxygen,
which results in combustion of the carbon black additive. The height of the third step can therefore be taken
as a measure of the carbon black filler content. Inorganic components remain behind as a residue at 800 °C.
Conclusion
TGA is an excellent technique for characterizing the thermal properties of rubbers by permitting the assessment
of volatile compounds, filler content and inorganic residues.
www.mt.com/ta-tga
6.4. DSC – Thermal stability of edible fats and oils
Oxidation causes rancidity in edible oils and fats, giving them an unpleasant odor and taste and making them
unsuitable for cooking. Similarly, long-term oxidation may occur during storage and processing. The oxidative
stability of a fat or oil depends on its precise nature. For example, used oil can be distinguished from fresh
oil based on their respective oxidation onset temperatures (OOTs).
Evaluation
The diagram shows the OOT curves of soybean oil and palm fat. For soybean oil, oxidation begins at about
188 °C under oxygen, whereas no visible reaction occurs under nitrogen. Palm fat begins to oxidize at about
213 °C.
Sample        Weight in mg   Atmosphere   Start of oxidation (extrapolated onset) in °C
Soybean oil   1.13           O2           188
Soybean oil   2.06           N2           ---
Palm fat      0.57           O2           213
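The extrapolated onset is found graphically as the intersection of the pre-transition baseline with the tangent at the point of steepest slope. A sketch of this evaluation on constructed data (the curve and its 188 °C onset are synthetic, chosen only to mirror the soybean-oil value):

```python
# Constructed oxidation curve: flat baseline, then a sharp exothermic rise;
# the 188 °C onset is built in to mirror the soybean-oil value above.
step = 0.1
T = [150.0 + step * i for i in range(701)]                  # °C
y = [0.0 if t < 188.0 else (t - 188.0) * 0.5 for t in T]    # mW, idealized

def extrapolated_onset(T, y):
    """Intersect the pre-transition baseline (here y = 0) with the
    tangent at the point of steepest slope."""
    slopes = [(y[i + 1] - y[i - 1]) / (T[i + 1] - T[i - 1])
              for i in range(1, len(T) - 1)]
    i = slopes.index(max(slopes)) + 1        # steepest interior point
    return T[i] - y[i] / slopes[i - 1]       # tangent meets the baseline

print(f"Extrapolated onset: {extrapolated_onset(T, y):.1f} °C")
```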
Conclusion
Clearly, OOT determination is a useful method for comparing and characterizing edible fats and oils.
DSC can be used to measure the stability of fats and oils by determining the temperature at which oxidation
begins – the higher the DSC onset temperature, the more stable the fat or oil.
www.mt.com/ta-dsc
Materials expand or contract as the temperature changes; the extent depends on the material and on the
temperature range. This range must be taken into account when designing composites from different materials,
and in many engineering applications. Otherwise, cracks and damage can occur that eventually lead to product
failure. The following application uses TMA to determine the coefficient of expansion, and its temperature
dependence, in metals, crystals and glasses.
Evaluation
The coefficient of thermal expansion, CTE, of several different inorganic materials was determined by TMA.
The coefficient of expansion can be assessed in various ways:
1. As the mean value of the change in length over a temperature interval: mean CTE.
2. As a single value at a certain temperature: normalized slope of the TMA curve.
3. As a curve showing the temperature-dependence of CTE values.
The mean coefficients of expansion in the temperature range of 60 °C to 160 °C are indicated in the upper left of
the evaluation, which are the typical CTEs for the inorganic materials. CTE values taken at a specific temperature
were measured for aluminum and copper. The values are often used to verify the accuracy of CTE measurements
by comparing them to the reference values of the corresponding pure metals. The dependence of the expansion
coefficient on temperature is clearly shown in the example of invar in the upper right corner of the evaluation window.
Conclusion
The linear coefficient of thermal expansion of a material is a fundamental physical quantity that influences the
mechanical properties of composite materials. Where applicable, it is also vital to assess CTEs over large
temperature ranges, even for long objects such as pipelines, railway tracks or bridges. For example, a 100 m
long steel bridge can extend by up to 9 cm between a cold winter and a hot summer.
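The mean CTE is simply ΔL/(L0·ΔT), and the bridge figure quoted above follows directly from it. A quick check, assuming a typical handbook CTE for steel of 12 × 10⁻⁶ K⁻¹ and a seasonal swing of about 75 K (both assumed values, not from the measurement):

```python
def mean_cte(L0, dL, dT):
    """Mean coefficient of linear thermal expansion over the interval dT."""
    return dL / (L0 * dT)

# Check the bridge figure with assumed values:
# alpha ~ 12e-6 1/K for steel, winter-to-summer swing ~ 75 K
alpha_steel = 12e-6   # 1/K
L0 = 100.0            # m
dT = 75.0             # K
expansion = alpha_steel * L0 * dT
print(f"Expansion: {expansion * 100:.0f} cm")   # Expansion: 9 cm

# The same numbers recover the assumed CTE
print(f"Mean CTE: {mean_cte(L0, expansion, dT):.1e} 1/K")
```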
www.mt.com/ta-tma
Tg is an important aspect in the selection and quality control of materials for the aviation industry. Dynamic
mechanical analysis (DMA) is the most sensitive and hence preferred thermal analysis technique for the
determination of Tg.
Evaluation
The evaluation shows the DMA curves – storage modulus (E’) and loss modulus (E’’) – of a composite material
(material B) measured in single cantilever mode at a frequency of one hertz. Tg corresponds to the peak of the
loss modulus curve or the intersection of the two tangent lines applied to the storage modulus curve – 180 °C
and 193 °C, respectively.
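Reading Tg from the loss-modulus peak can be sketched as below; the curve is synthetic, with its peak placed at 180 °C only to mirror the value quoted above:

```python
import math

# Synthetic loss-modulus curve with a single broad peak (illustrative data)
temperatures = [100.0 + 0.1 * i for i in range(1501)]           # °C
loss_modulus = [120.0 * math.exp(-((t - 180.0) / 15.0) ** 2)    # MPa
                for t in temperatures]

# Tg taken as the temperature at the loss-modulus maximum
peak_index = loss_modulus.index(max(loss_modulus))
Tg = temperatures[peak_index]
print(f"Tg (loss modulus peak): {Tg:.0f} °C")   # 180 °C
```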
Conclusion
The DMA 1 meets the highest class criteria defined by the aviation company. The results achieved from the storage
and loss modulus curves lie within ±4 °C of the mean value defined by the aviation company’s standard for Tg
determination in material B.
www.mt.com/ta-dma
Calibration is the act of checking the accuracy of a measuring instrument by comparing measurement results
using a reference substance for which the “true” value of the measured property is known.
Adjustment is defined as modifying the specific instrument parameters so that the measurement results of the
calibration performed afterward are within the tolerance limit.
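The distinction can be illustrated with a two-point temperature adjustment against certified melting-point standards (indium 156.6 °C, zinc 419.5 °C). The "measured" onsets below are invented for illustration; STARe performs such adjustments internally, so this only shows the principle:

```python
# Literature melting points of common calibration metals (°C)
reference = {"indium": 156.6, "zinc": 419.5}
# Hypothetical measured onsets from a calibration run (°C)
measured = {"indium": 157.1, "zinc": 420.7}

# Adjustment: fit a linear correction T_true = a * T_measured + b
t1, t2 = measured["indium"], measured["zinc"]
r1, r2 = reference["indium"], reference["zinc"]
a = (r2 - r1) / (t2 - t1)
b = r1 - a * t1

def corrected(T):
    """Apply the stored adjustment to a measured temperature."""
    return a * T + b

# A calibration run after the adjustment should now hit the references
for name in reference:
    print(f"{name}: {corrected(measured[name]):.1f} °C")
```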
In thermal analysis, both the ordinate (Y-axis) and the abscissa (X-axis) need to be calibrated:
Ordinate
• Heat flow, peak area (DSC)
• Mass (TGA, automatically performed in the electronic microbalance)
• Length (displacement) and force (TMA and DMA)
Abscissa
• Temperatures
• Time (e.g. for isothermal measurements) derived from the quartz clock of a microprocessor (extremely accurate)
Within the scope of GLP and ISO 17025, testing laboratories must ensure their laboratory equipment is properly
maintained and calibrated. The STARe software now supports you in performing this process. You define the
calibration intervals. If you try to perform a measurement on an instrument whose calibration date has expired,
the instrument refuses to do the measurement. The instrument must first be calibrated as shown in the flow
diagram below.
Several settings facilitate your work:
• Calibration prompts or emails that tell you when a calibration is due
• Measurement refusal if the calibration has expired
You can also have the calibration and adjustment performed by trained METTLER TOLEDO service specialists,
who will issue a calibration certificate.
Conventional thermal analysis instruments must be readjusted each time experimental conditions such as
atmosphere, heating rate or crucible type are changed. STARe software’s FlexCal® calibration includes the
methods and database to store and handle the necessary adjustment parameters.
www.mt.com/ta-calibration
In this section, we describe how a concept for estimating the uncertainty of measurement can be developed
and applied.
We have seen that to identify the sources of measurement errors, it is helpful to distinguish between systematic
(determinate) and random (indeterminate) errors. However, it is becoming more usual to consider the sum of
all the contributions that affect the accuracy of measurement results and not just those from these two types of
error. This is known as the uncertainty of measurement (also called the measurement uncertainty or simply the
uncertainty). In this context, one uses the term partial uncertainty for a particular step of an analysis procedure
(e.g. sample preparation) and combined uncertainty for the sum of all the uncertainties.
A useful and commonly applied concept of uncertainty is described in the ISO publication “Guide to the Expression
of Uncertainty in Measurement (GUM)”, published in 1993, as well as in the revised edition of 1995. The guide
was published under the guidance of several different organizations, including the International Organization
for Standardization (ISO), the International Bureau of Weights and Measures (BIPM), the International
Organization of Legal Metrology (OIML) and the International Union of Pure and Applied Chemistry (IUPAC).
In this guide,
the uncertainty of measurement is described as follows:
“The uncertainty of the result of a measurement reflects the lack of exact knowledge of the value of the measurand.
The result of a measurement after correction for recognized systematic effects is still only an estimate of the
value of the measurand because of the uncertainty arising from random effects and from imperfect correction of
the result for systematic effects. Note – The result of a measurement (after correction) can unknowingly be very
close to the value of the measurand (and hence have negligible error) even though it may have a large uncertainty.
Thus the uncertainty of the result of a measurement should not be confused with the remaining unknown error.”
In simple terms, the uncertainty of measurement is the range of values within which the value of the quantity
being measured (the measurand) is expected to lie with a stated level of confidence. It is not the same as error
because to estimate the error the “true” value must be known. Although the term uncertainty of measurement
appears to have become widely accepted in laboratory practice and in international standards, it is important
to note that most analytical procedures refer to the uncertainty of the result rather than the uncertainty of
measurement.
Although the concept of the uncertainty of measurement does not distinguish between systematic and random
errors of measurement, the sources remain the same. The most important of these are:
• Influences of the procedure (often called method bias)
• Instrumental influences
• Sampling and sample preparation
• Environmental influences
• Experimental parameters
• Evaluation methodology
• Time-dependent interdependencies
• Shortcomings of the analyst
• Gross errors
The individual sources of uncertainty are identified for each of the main influence factors. This is most easily
done using a cause-and-effect diagram (see Figure 1). If the major causes for the uncertainty are influenced
by secondary causes, further branches are introduced. This leads to finer and finer branches and finally to
a “fishbone” diagram which provides a clear overall view of all the primary, secondary and tertiary factors that
contribute to the uncertainty. The contribution of each identified source of uncertainty, for example instrument
performance or sample preparation, is then quantified. If the uncertainties are mutually independent, the
combined standard uncertainty corresponds to the square root of the sum of the squares of the individual
uncertainties. If the individual uncertainties are not mutually independent (i.e. if measurands and other factors
influence each other), the rules of variance have to be applied to determine the combined uncertainty.
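For mutually independent contributions, the root-sum-of-squares combination amounts to a few lines; the individual uncertainty values below are hypothetical, chosen for an enthalpy-of-fusion result in J/g:

```python
import math

# Hypothetical standard uncertainties for an enthalpy-of-fusion result (J/g)
contributions = {
    "weighing (sample mass)": 0.4,
    "calibration":            0.8,
    "baseline choice":        0.5,
    "thermal contact":        0.3,
}

# Independent contributions combine as the square root of the sum of squares
u_combined = math.sqrt(sum(u ** 2 for u in contributions.values()))
print(f"Combined standard uncertainty: {u_combined:.2f} J/g")   # 1.07 J/g
```

Note how the largest contribution (0.8 J/g here) dominates the result, which is why reducing the biggest contributor first is the most effective remedy when the combined uncertainty exceeds the specification.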
Figure 1: Cause-and-effect (“fishbone”) diagram for an enthalpy-of-fusion measurement, with branches for
sample preparation (cutting the sample, weighing/sample mass, thermal contact with crucible), calibration,
method (heating rate, flow rate) and evaluation (baseline type).
The estimate of the uncertainty of measurement is an important step in validation. In this step, the overall
uncertainty is compared with the maximum allowable uncertainty specified for the particular measurement.
If the overall uncertainty exceeds the specified value, the largest contributions to the uncertainty must be reduced
through suitable measures such as greater care, increasing the number of measurements, more careful evaluation,
and so on. If the analytical procedure still does not satisfy the requirements, then another procedure may have
to be used. In this case, the uncertainty of measurement will have to be determined again.
www.mt.com/ta-validation-hb
In 1965, the first International Conference on Thermal Analysis (ICTA) established a Committee on Standardization
to promote the consistent reporting of accurate and reproducible data. Standards are pertinent to many industries
using thermal analysis techniques such as differential scanning calorimetry (DSC), thermogravimetry (TGA),
thermomechanical analysis (TMA) or dynamic mechanical analysis (DMA). As these vary widely in their mode
of operation and sample environments, reported results must include the necessary experimental information
so as to enable meaningful comparisons between results from different instruments and laboratories.
Today, standardization in thermal analysis is carried out by many national and international organizations, such
as ISO, ASTM, DIN and CEN. Standards should be based on sound knowledge derived from science, technology
and experience and ensure that products, processes and services are suitable and accepted world-wide for the
given purpose. Standardization can help maximize the compatibility, safety, repeatability and quality of products
and services.
A comprehensive list of standards and their corresponding thermal analysis methods and instruments can be
downloaded at www.mt.com/ta-standards.
www.mt.com/ta-standards
Data integrity is the degree to which data are complete, consistent, accurate, trustworthy and reliable, and to
which these characteristics are maintained throughout the data life cycle. Data integrity is a central issue in all
laboratories regulated by Good Manufacturing Practice (GMP) or Good Laboratory Practice (GLP). However, even
non-regulated research laboratories and industries can recognize that the benefits of establishing good data
management practices outweigh the costs.
For all systems which store, process, and retrieve data – such as thermal analyzers – data integrity
is paramount.
All of the necessary functionality for a total data integrity solution, including 21 CFR Part 11 for regulated industries,
is provided by STARe software (see Table 1). Electronic records are completely protected against unintentional or
intentional modification, as they are kept in a secure, relational database. Additionally, STARe software provides
password access-control to the application, assigns user-rights for each user-level, ensures file integrity with
electronic records stored in a secure database, and properly logs the audit trail and electronic signatures.
Customers’ data management needs are even more comprehensively supported by:
• User-group specific data access: assign users to groups or projects; configure STARe for each user to
match both company and project organization. Correctly deployed, user assignment protects data against
unauthorized viewing, access or modification.
• Data classification: ensure confidentiality of electronic data across the whole system. Electronic data are
automatically classified, and are accessible only by users with appropriate clearance. Such data cannot be
viewed by unauthorized users, even in a non-compliant system.
www.mt.com/ta-dataintegrity
11.1. Crucibles
Crucibles serve as containers for samples during thermoanalytical measurements. They protect the sensor
from contamination by the sample. The type of crucible used for a measurement can have a
large effect on the quality of the results obtained, and in addition, also influences important characteristics
of the DSC measuring cell. Considering the relevant factors before the measurement can often help to save
time later on when interpreting the curve.
Sample robot
All DSC and TGA models from METTLER TOLEDO can be automated. The sample robot can
process up to 34 samples even if every sample requires a different method and a different
crucible. The sample robot is very robust and operates reliably 24 hours a day and throughout the whole year.
www.mt.com/ta-crucibles
Handbooks
Written for thermal analysis users with background information, theory and practice, useful tables of material
properties and many interesting applications.
www.mt.com/ta-handbooks
Tutorial Kit
The Tutorial Kit handbook with 21 well-chosen application examples and the corresponding test substances
provides an excellent introduction to thermal analysis techniques and is ideal for self-study.
www.mt.com/ta-handbooks
Videos
Our technical videos explain complex issues concerning thermal analysis instrumentation and the STARe software –
whether it’s sample preparation, installation, creating experiments or evaluating measurement results.
www.mt.com/ta-videos
UserCom
Our popular, biannual technical customer magazine, where users and specialists publish applications from
different fields.
www.mt.com/ta-usercoms
Applications
If you have a specific application question, you may find the answer in the application database.
www.mt.com/ta-applications
Webinars
We offer web-based seminars (webinars) on different topics. After the presentation, you will have the opportunity
to discuss any points of interest with our thermal analysis experts.
www.mt.com/ta-webinars (Live Webinars)
www.mt.com/ta-ondemand (On Demand Webinars)
Training
Classroom training is still one of the most effective ways to learn. Our User Training Courses will help you get the
most out of your equipment. We offer a variety of one-day theory and hands-on courses aimed at familiarizing
you with our thermal analysis systems and their applications.
www.mt.com/ta-training (Classroom)
www.mt.com/ta-etraining (Web-based)
www.mt.com/ta-news
For more information