INDEX

1. Application of Advanced Instrumentation and In-vitro Tools in Drug Discovery: A Review (p. 01)
   Dr. Deepak Barot, M.V.Sc., Ph.D., CLAS (FELASA ‘C’), President & CEO, Prerana Bio-Innovations Research Pvt. Ltd., Gandhinagar

2. Advanced Transgenic Animal Models for Drug Discovery (p. 06)
   Dr. S.D. Patel, M.V.Sc., Ph.D., Senior Scientist, Zydus Research Centre, Cadila Healthcare, Ahmedabad

3. Role of Veterinarians in Biomedical Science (p. 13)
   Dr. K.B. Patel, M.V.Sc., Ph.D., Chief Veterinary Officer, Institute of Liver and Biliary Sciences, New Delhi

4. Robotics and Artificial Intelligence in Drug Discovery: A Review (p. 20)
   Dr. U.D. Patel, M.V.Sc., Ph.D., Associate Professor and Head, Dept. of Vet. Pharmacology & Toxicology, College of Veterinary Science and A.H., JAU, Junagadh

5. Automation and Biotechnology: Impact on Veterinary Research and Education (p. 26)
   Dr. B.S. Mathapati, M.V.Sc., Ph.D., Assistant Professor, Dept. of Vet. Microbiology, College of Veterinary Science and A.H., JAU, Junagadh

6. Automation in Chromatography: Application of HPLC in Veterinary Science (p. 37)
   Dr. H.B. Patel, M.V.Sc., Ph.D., Assistant Professor, Dept. of Vet. Pharmacology & Toxicology, College of Veterinary Science and A.H., JAU, Junagadh

7. Application of HPTLC in Ethnopharmacology (p. 57)
   Dr. C.M. Modi, M.V.Sc., Ph.D., Assistant Professor, Dept. of Vet. Pharmacology & Toxicology, College of Veterinary Science and A.H., Junagadh Agricultural University, Junagadh
Application of Advanced Instrumentation and In vitro Tools in Drug Discovery:
A Review
Dr. Deepak Barot, M.V.Sc., Ph.D., CLAS (FELASA ‘C’)
& Maulik Badmalia, Ph.D.
Prerana Bio-Innovations Research Pvt. Ltd.
www.prebioresearch.com
The process of drug discovery is both capital- and time-intensive: bringing a new drug to the market can take 12-15 years and more than $1 billion. The process starts with identification of a new target and screening of molecules as potential drugs. Once promising hits are identified, lead optimization is implemented. Lead optimization refers to identification of the molecule with the highest potency and selectivity, which can be further optimized to produce a safe and efficacious drug. This process involves high-throughput screening, which screens a large number of hits in relatively little time and uses numerous sophisticated techniques and robotics to identify lead molecules. The major techniques involved in this process are:
1. Mass Spectrometry
2. X-ray Diffraction (Protein Crystallography)
3. Molecular Docking
4. Cryo-Electron Microscopy
This talk shall attempt to briefly introduce you to these techniques.
1. X-ray diffraction
X-ray diffraction or macromolecular crystallography has become one of the most
successful and important methods when it comes to exploring drug protein interactions.
This method provides structural information of biomolecules such as protein, nucleic acids
and viruses up to atomic resolution. These structures provide an important insight into the
structure-function relationship of proteins in normal and diseased conditions. Use of these
structures for in silico selection of drugs is becoming a more popular option these days.
Though this technique has existed since the mid-20th century, the implementation of high-throughput systems (HTS) has made it far easier to operate. HTS involves the use of automated robots to undertake large-scale polymerase chain reactions to generate clones of proteins, truncates and mutants. These robots not only assist in the generation of protein clones but also in protein purification, crystallization and identification of crystals for processing. The advent of HTS in crystallography has reduced the turnaround time from crystal setup through diffraction to data analysis.
2. Molecular Docking
With improvements in macromolecular crystallography techniques there has been a marked increase in the deposition of crystal structures. These structures provide an important database for exploring drugs in silico. There are numerous drug libraries consisting of novel drug molecules, nutraceuticals and previously reported drugs; these libraries are screened sequentially against the target protein molecule in a simulated environment inside a computer. This provides a relatively cheap alternative method of drug screening with quick turnaround times. High-end computers run software that simulates the environment in which the drug and target protein will interact. This not only helps in identifying novel drug candidates but also provides insight into the possible conformations that the drug will adopt upon interacting with the target protein. The importance of molecular docking is not limited to drug discovery; it can also be exploited for lead optimization. The efficacy of drugs can be enhanced by comparing the interaction scores generated for the original drug and its in silico modified versions; the best-scoring modified drugs can be further tested in vitro or in vivo for their performance.
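The screen-and-rank step described above can be sketched in a few lines. The compound names and docking scores below are entirely invented placeholders; in practice the scores would come from a docking engine such as AutoDock Vina, which reports predicted binding affinities in kcal/mol, with more negative meaning stronger predicted binding:

```python
# Toy virtual-screening step: rank a compound library by docking score.
# All compound names and score values are hypothetical placeholders.

def rank_hits(scores, top_n=3):
    """Return the top_n compounds with the most favourable (lowest) scores."""
    return sorted(scores.items(), key=lambda kv: kv[1])[:top_n]

library = {
    "compound_A": -9.2,   # placeholder docking scores, kcal/mol
    "compound_B": -6.1,
    "compound_C": -10.4,
    "compound_D": -7.8,
}

for name, score in rank_hits(library):
    print(f"{name}: {score} kcal/mol")
```

The same ranking logic carries over to lead optimization: score the original drug and its in silico modified variants, then take the best-scoring variants forward to in vitro or in vivo testing.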
3. Cryo-electron Microscopy
Cryo-electron microscopy (cryo-EM) of non-crystalline single particles is a
biophysical technique that can be used to determine the structure of biological
macromolecules and assemblies. Cryo-EM allows structure determination of large and/or
dynamic macromolecular assemblies in solution without the need to obtain crystals and,
even more importantly, in their native state, including post-translational modifications;
moreover, different conformational states can be resolved in one experiment. Cryo-EM has
already demonstrated its usefulness in target identification, validation and characterization,
which typically involves target structure analysis to understand the mechanism of action
and assess druggability. This analysis requires access to the native structure, ideally in
complex with biological partners. More recently, single-particle cryo-EM analysis has
provided access to a rapidly growing number of high-resolution structures for objects not
amenable to crystallization, including large, dynamic assemblies and membrane proteins,
and technical advances have enabled the study of smaller objects and objects at higher
resolution, including complexes with small molecules. Steps involved in cryo-EM based structure determination are shown in Figure 1.
4. Mass Spectrometry
Recently, mass spectrometry (MS) has been recognized as a tool with huge potential for high-throughput discovery of lead compounds from herbal medicines. MS is critical to drug discovery and preclinical development, as it can effectively address a wide range of research questions concerned with the qualitative and quantitative analysis of target molecules. MS currently exists in numerous configurations with varying functionalities.
Since the turn of the century, mass spectrometry (MS) technologies have continued
to improve dramatically, and advanced strategies that were impossible a decade ago are
increasingly becoming available. The basic characteristics behind these advancements are
MS resolution, quantitative accuracy, and information science for appropriate data
processing. The spectral data from MS contain various types of information. Improving the resolution of MS data yields accurate structure-derived molecular information and, as a result, refined biomolecular structure determination in a sequential and large-scale manner. Moreover, in MS data, not only the accurate structural information but also the amount of ions generated plays an important role. This progress has greatly contributed to a research field that captures biological events as a system by comprehensively tracing the various changes in biomolecular dynamics. The sequential changes of proteome expression in biological pathways are essential, and the magnitudes of these changes often directly become targets of drug discovery or indicators of clinical efficacy. To take this proteomic approach, it is necessary to separate the individual MS spectra derived from each biomolecule in complex biological samples. MS alone cannot achieve complete peak separation, so methods for sample processing and purification should be improved to make samples suitable for injection into MS. The above-described characteristics can scarcely be achieved with any other analytical instrument. Moreover, MS is expected to expand into many fields, not only basic life sciences but also forensic medicine, plant sciences, materials, and natural products.
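As a concrete example of the mass arithmetic that underlies the PMF route described below (Figure 2), the monoisotopic mass of an unmodified peptide is the sum of its residue masses plus one water (the terminal H and OH). The sketch below uses standard monoisotopic residue masses; the example sequence is arbitrary:

```python
# Monoisotopic mass of an unmodified peptide, as matched in PMF database
# searches: sum of residue masses plus one water (terminal H and OH).

RESIDUE_MASS = {  # standard monoisotopic residue masses, daltons
    "G": 57.02146, "A": 71.03711, "S": 87.03203, "P": 97.05276,
    "V": 99.06841, "T": 101.04768, "C": 103.00919, "L": 113.08406,
    "I": 113.08406, "N": 114.04293, "D": 115.02694, "Q": 128.05858,
    "K": 128.09496, "E": 129.04259, "M": 131.04049, "H": 137.05891,
    "F": 147.06841, "R": 156.10111, "Y": 163.06333, "W": 186.07931,
}
WATER = 18.01056

def peptide_mass(seq):
    """Monoisotopic mass (Da) of an unmodified peptide sequence."""
    return sum(RESIDUE_MASS[aa] for aa in seq) + WATER

# Tryptic peptides end in K or R, since trypsin cleaves C-terminal to K/R
# (except before proline); an arbitrary example peptide:
print(round(peptide_mass("SAMPLER"), 4))
```

A PMF search engine compares a list of such measured masses against masses computed the same way for in silico digests of every protein in the database.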
Figure 1: Steps involved in structure elucidation using cryo-EM
Figure 2: Two proteomic approaches: shotgun LC-MS/MS analysis and 2D-PAGE/PMF. Proteins are first extracted from biological samples.
1) Proteins are digested with trypsin after denaturation and reductive alkylation. The generated peptides are separated by reverse-phase HPLC and introduced directly into the ESI interface. MS and MS/MS spectra are obtained by data-dependent acquisition.
2) Proteins are denatured in a reductive chaotropic solution with non-ionic detergent and separated by isoelectric point and molecular weight. Each protein spot is digested with trypsin after reductive alkylation. PMF spectra are obtained after purification and mixing with ionization matrix. Protein identification is performed on a proteome database server using the processed MS/MS and PMF spectra, for the determination of post-translational modifications or molecular annotation.
5. In-vitro tools for the evaluation of human xenobiotic metabolism and toxicity
early in drug discovery and development
Use of laboratory animals is a must for efficient drug discovery and development. However, one of the important aspects of animal welfare, i.e. “Reduction”, can be practiced efficiently early in drug discovery, thereby avoiding indiscriminate use of animals. In-vitro assays can be very useful alternatives to animal experimentation if practiced early in drug discovery.
practiced early in drug discovery. Human hepatocytes represent the gold standard of in
vitro experimental system for the evaluation of human drug metabolism, drug-drug
interactions, and hepatotoxicity. Successful cryopreservation to retain the ability to be
cultured (plateable) allows this experimental system to be used routinely in drug discovery
and development. Enterocytes are responsible for intestinal metabolism and are the key
cell type for assessing oral bioavailability. With CYP3A4 activity comparable to that of
human hepatocytes, enterocytes are a useful in vitro test system for the investigation of intestinal drug metabolism, drug-drug interactions and toxicity assessments. A novel medium, Li’s Differentiation Maintenance Medium (LDMM), has been developed for longer-term culturing of human hepatocytes with restoration of hepatocyte differentiation properties.
Because of species differences, laboratory animal results do not always accurately reflect human effects. In our laboratory, primary cell-based systems have been developed to aid the assessment of human xenobiotic metabolism and toxicity. We believe that experimental systems that can predict human effects must have human-specific as well
as organ-specific properties. For safety evaluation, human organ-specific drug metabolism
is especially important as it is required for metabolic activation of nontoxic parent
chemicals to toxic metabolites as well as metabolic detoxification of toxic parent
compounds to less toxic metabolites. The in vitro human experimental systems include
hepatocytes for hepatic events, enterocytes for intestinal events, as well as the patented
Integrated Discrete Multiple Organ Co-culture (IdMOC™) system for the whole human
organism.
A novel in vitro enteric experimental system, cryopreserved human intestinal
mucosa (CHIM), for the evaluation of enteric drug metabolism, drug-drug interaction, drug
toxicity, and pharmacology has been developed. CHIM was isolated from the small
intestines of human donors. The small intestines were first dissected into the duodenum,
jejunum, and ileum, followed by collagenase digestion of the intestinal lumen. The isolated
mucosa was gently homogenized to yield multiple cellular fragments, which were then
cryopreserved in a programmable liquid cell freezer and stored in liquid nitrogen. After
thawing and recovery, CHIM retained robust cytochrome P450 (P450) and non-P450 drug-
metabolizing enzyme activities and demonstrated dose-dependent induction of
transcription of CYP24A1 (approximately 300-fold) and CYP3A4 (approximately 3-fold)
by vitamin D3 as well as induction of CYP3A4 (approximately 3-fold) by rifampin after
24 hours of treatment. These results suggest that CHIM may be a useful in vitro
experimental model for the evaluation of enteric drug properties, including drug
metabolism, drug-drug interactions, and drug toxicity.
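The induction results above are reported as fold changes; the arithmetic is simply the ratio of treated to control expression, sketched here with invented illustrative values rather than measured CHIM data:

```python
# Fold induction = expression level in treated cells / level in vehicle control.
# The mRNA levels below are invented illustrative numbers, not measured data.

def fold_induction(treated, control):
    """Fold change of expression in treated cells relative to control."""
    return treated / control

control_level = 1.0    # arbitrary units
rifampin_level = 3.1   # hypothetical level after 24 h of rifampin treatment

print(round(fold_induction(rifampin_level, control_level), 1))
```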
Let us together understand these advancements in instrumentation to do better research, and let us also take a step towards better animal welfare and alternatives to animal experimentation by performing much-needed in-vitro assays early in drug discovery.
*********************************************************************
Advanced Transgenic Animal Models for Drug Discovery
Dr. S.D. Patel
Senior Scientist, Zydus Research Centre, Cadila Healthcare, Ahmedabad
1. Introduction
Pharmaceutical and biotechnology researchers have been increasingly applying
PD/PK modeling, especially mechanistic PD/PK modeling, to all stages of drug
development. This especially includes moving from preclinical animal studies to human
clinical trials. These models are the most widespread and important computer-based
mathematical models used in drug development today. In the transition from animal models
to human studies, an important focus is dealing with differences between animal models
and humans, not only with respect to size (e.g., allometric scaling) but also with respect to
other characteristics such as differences in metabolism and the heterogeneity of the human
population. Pharmaceutical companies face the challenge that only about 10% of compounds tested in clinical trials make it to the market and, of these, only a minority generate significant profit. The use of transgenic animals can therefore provide solutions for drug research, xenotransplantation and clinical trials, and offer new insight into drug development.
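The allometric scaling mentioned above can be illustrated with the widely used three-quarters-power relationship, under which total doses scale with body weight to the 0.75 power. The exponent, weights and dose below are illustrative assumptions; this sketches the principle and is not a regulatory dose-conversion procedure:

```python
# Allometric projection of a total dose between species, assuming the common
# relationship dose_human = dose_animal * (BW_human / BW_animal) ** 0.75.
# Exponent, body weights and dose are illustrative assumptions.

def scale_dose(dose_animal_mg, bw_animal_kg, bw_human_kg=70.0, exponent=0.75):
    """Project a total dose (mg) from an animal to a human by allometry."""
    return dose_animal_mg * (bw_human_kg / bw_animal_kg) ** exponent

# e.g. projecting a 2 mg total dose in a 0.25 kg rat to a 70 kg human:
print(round(scale_dose(2.0, 0.25), 1))
```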
2. Types of Animal Models
Rodent models: rat, mice, gerbils, guinea pigs, and hamsters.
Non-rodent mammalian models: dog, rabbit, cat, rhesus macaque, swine, chimpanzee, and ferret.
Spontaneous models: arise as a result of naturally occurring mutations.
Induced models: produced by a laboratory procedure, such as administration of drugs or chemicals, feeding of special diets, or a surgical procedure.
Transgenic models: animals that are genetically altered to have traits that mimic symptoms of specific human pathologies.
3. Transgenic animal model:
It is an organism whose genetic makeup has been altered by the addition of genetic material from an unrelated organism. Transgenic animals have either had DNA added or their existing DNA altered (to abolish or modify the expression of an existing gene).
They provide genetic models of various human diseases which are important in
understanding disease and developing new targets.
4. Method to develop a Transgenic Mouse:
This approach supports investigators doing biology-of-aging research by creating mice that have been genetically altered, either by inserting a new gene or by removing a normal gene. It has become one of the most exciting approaches to discovering the functions and interactions of genes in mammals. At the University of Washington Nathan Shock Center, this transgenic technology is used to develop new animal models for studying genetic mechanisms of the aging process (Fig. 1).
Figure 1: Method to produce transgenic mouse
Figure 2. Micromanipulator
During the previous year, transgenic mouse production focused on constructs with enhanced defense against free-radical injury in aging (e.g., catalase, superoxide dismutase, glutathione S-transferase), Werner syndrome, adult-onset diabetes, Alzheimer’s disease, thrombospondin, and rheumatoid arthritis in aging. Almost 4000 embryos, mainly of the C57BL/6 inbred strain, were transferred; 498 pups were analyzed and at least 40 contained the integrated construct.
In addition, this core concentrated an appreciable portion of its effort on embryonic stem (ES) cell methodologies for the generation of knockouts and targeted ES transgenics. This included work to generate mouse models of Werner syndrome, models for the study of presenilin genes related to Alzheimer’s disease, and models of thrombospondin in aging. In the past year, a total of 396 embryos were transferred and 79 pups were born, of which 37 were chimeric.
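The production figures quoted above reduce to simple success rates; a quick sketch of that arithmetic, using the numbers from the text:

```python
# Success rates implied by the transgenic-production figures in the text.

def rate(successes, total):
    """Success rate as a percentage."""
    return 100.0 * successes / total

# Pronuclear route: at least 40 of 498 analyzed pups carried the construct.
print(round(rate(40, 498), 1))   # integration rate among analyzed pups, %
# ES-cell route: 79 pups born from 396 transferred embryos, 37 chimeric.
print(round(rate(79, 396), 1))   # live births per transferred embryo, %
print(round(rate(37, 79), 1))    # chimeric pups among those born, %
```

Rates like these are why a large number of recipient animals is typically required to establish a transgenic line.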
The isolation of mammalian genes is of utmost importance to the biology and
medicine of aging because of the contributions these studies can make to the understanding
of physiology and development. Techniques for introducing foreign genes into the mouse
germ line provide novel approaches for modeling human genetic and chronic degenerative
diseases. Since the initial report in 1980 describing transgenic mice, methods for the direct
microinjection of DNA into the pronuclei of fertilized embryos have become established. Foreign genes can be incorporated into somatic and germ-line tissues, with expression of these elements in the progeny of founder mice.
The creation of “transgenic” animals that make a specified gene product presents a
spectrum of opportunities for basic studies in molecular pathogenesis and pre-clinical
investigations applicable to a wide variety of medical problems of aging. An additional
gene transfer technology developed in the 1980’s involved the use of stem cells from the
early embryo, so-called embryonic stem (ES) cells. The capacity of ES cells to undergo
differentiation makes them useful for investigating the effects of genetic modifications of
either the gain of function or loss of function.
These pluripotent genetically modified ES cells can then be used to make mice with
deleted genes (gene knockout) or targeted mutagenesis of genes thought to be involved in
the aging process. It is also possible to develop lines of transgenic mice carrying very large
DNA constructs (>600 kb) transfected into ES cells. The University of Washington Aging
Program provides a focused level of expertise and resources to enhance and facilitate the
development of transgenic animal models using ES cell transfer technology.
5. Two methods of producing transgenic mice are widely used:
1. Transforming embryonic stem cells (ES cells) growing in tissue culture with the
desired DNA
2. Injecting the desired gene into the pro-nucleus of a fertilized mouse egg.
6. Advantages of Transgenic Animals:
• Used for various purposes, from creating disease models, to producing vaccines, to providing alternatives for human organ transplants.
• They provide genetic models of various human diseases which are important in understanding disease and developing new targets.
• Animal models have been used to evaluate the role and function of genes, genetic diseases and the mechanisms of their progression, and the efficacy of drugs.
• They enable the construction of disease-resistant animals.
• This will lead to more discoveries in science.
7. Disadvantages of Transgenic Animals:
• Generation of transgenic animals is very expensive.
• High mortality rate and other deleterious effects on animals.
• A large number of recipients is required for embryo transfer because of the low transgenesis rate.
• Often leads to breeding problems.
• May lead to mutations and functional disorders.
• A new disease could be created.
• Ethical issues.
8. Methods for generation of Transgenic animals
• Pronuclear microinjection of foreign DNA into the pronucleus of the fertilized egg;
• Genetic modification of embryonic stem cells (ESCs) followed by microinjection
into 8-cell or blastocyst embryos.
• Other methods of DNA transfer are by lentivirus, sperms, pluripotent cells and
cloning.
• CRISPR/Cas9 Model Generation: Rapidly develop knock out and knock in
models on various genetic backgrounds, including historically difficult strain
backgrounds.
• Targeted mutation: a process whereby a specific gene (or part of a gene) is removed or made nonfunctional (knocked out) or, less frequently, made functional (knocked in).
9. Pronuclear Injection
• Genetically modified animals that arise from the injection of DNA into the pronucleus of fertilized eggs are commonly referred to as transgenic animals.
• In this case, a cDNA or gene of interest is fused to gene promoter sequences that drive the expression of the gene cassette either ubiquitously or in a cell- and tissue-specific manner. The cDNA or gene then enters the host genome through nonhomologous recombination.
Figure 3. Microinjection Method
10. Embryonic Stem Cells method
• It is used to derive traditional knockout (or KO) and knock-in (or KI) mice, in which
critical exons of the native gene have been replaced with a drug resistance gene
cassette (such as neomycin resistance) or a marker of gene activity [β-galactosidase
(encoded by LacZ) or green fluorescent protein (GFP)].
• The replacement cassette is flanked by sequences that are identical to portions of
the targeted gene so that during cell division the targeted exons are replaced through
homologous recombination.
• This replacement interrupts the normal exon sequences and ‘knocks out’ the
function of the targeted gene.
• In a second step, the recombined ESCs are injected into mouse blastocysts, thereby
incorporating the genetically modified cells into the transgenic animal.
• Animals Derived from Homologous Recombination in Embryonic Stem Cells
Figure 4: Embryonic stem cell method
• A knockout mouse is a genetically modified mouse in which researchers have
inactivated, or "knocked out", an existing gene by replacing it or disrupting it with
an artificial piece of DNA.
• They are important animal models for studying the role of genes which have
been sequenced but whose functions have not been determined.
• By causing a specific gene to be inactive in the mouse, and observing any differences from normal behaviour or physiology, researchers can infer its probable function, e.g. the nude mouse.
11. Important Transgenic rodent models
Nude mouse
• The nude mouse carries a genetic mutation that causes a deteriorated or absent thymus, resulting in an inhibited immune system and a greatly reduced number of T cells.
• Phenotype: a lack of body hair, which gives the mouse its "nude" nickname.
• Use: it can receive many different types of tissue and tumor grafts, as it mounts
no rejection response.
• These xenografts are commonly used in research to test new methods of imaging
and treating tumors.
• Genetics: Nude mice have a spontaneous deletion in the FOXN1 gene. (Humans
with mutations in FOXN1 also are athymic and immune deficient.)
• Mice with a targeted deletion in the FOXN1 ("knockout" mice) also show the
"nude" phenotype. Since nude females have underdeveloped mammary glands and
are unable to effectively nurse their young, nude males are bred
with heterozygous females.
• The life span of nude mice is normally 6 months to a year.
• In controlled, germ free environments and with antibiotic treatments found in many
laboratories that routinely use nude mice, they can live almost as long as normal
mice (18 months to two years).
Severe combined immunodeficiency (SCID) mouse
• It is a severe immunodeficiency genetic disorder that is characterized by the
complete inability of the adaptive immune system to mount, coordinate, and sustain
an appropriate immune response, usually due to absent or
atypical T and B lymphocytes.
• It is used for research into the basic biology of the immune system, cell transplantation strategies, and the effects of disease on mammalian systems.
• Genetics: due to a rare recessive mutation on Chromosome 16 responsible for
deficient activity of an enzyme involved in DNA repair (Prkdc or "protein kinase,
DNA activated, catalytic polypeptide").
• Because V(D)J recombination does not occur, the humoral and cellular immune
systems fail to mature.
• As a result, SCID mice have an impaired ability to make T or B lymphocytes, may
not activate some components of the complement system, and cannot efficiently
fight infections, nor reject tumors and transplants.
ob/ob mouse
• It is a mutant mouse, unable to produce leptin, that eats excessively, resulting in obesity.
• It is an animal model of type II diabetes.
• Identification of the gene mutated in ob led to the discovery of the hormone leptin,
which is important in the control of appetite.
• The first ob/ob mouse arose by chance in a colony at the Jackson Laboratory in
1949.
• The mutation is recessive.
• Mutant mice become phenotypically obese, reaching a weight three times that of unaffected mice.
• ob/ob mice develop high blood sugar, despite an enlargement of the pancreatic
islets and increased levels of insulin.
• The gene produces a hormone, called leptin that is produced predominantly
in adipose tissue.
• Role of leptin is to regulate appetite by signalling to the brain that the animal has
had enough to eat.
• Since the ob/ob mouse cannot produce leptin, its food intake is uncontrolled by this
mechanism.
(db/db) mice
• Leprdb is an autosomal-recessive mutation on chromosome 4.
• Obesity expressed at 4-5 weeks of age
• Elevation of plasma insulin demonstrated at 10-14 days
• Polyphagia, Proteinuria, Glycosuria
• Polyuria/Polydipsia
• Hyperinsulinemia despite severe depletion of pancreatic islet insulin-producing B-
cells
• Leptin receptor deficient
• Hyperglycemia develops at 4-8 weeks of age
Figure 5: ob/ob and db/db mouse
ZDF (Zucker Diabetic Fatty) Rat
• Nomenclature: ZDF-Leprfa/Crl
• Therapeutic Area: Metabolic
• Control: ZDF lean
• Origin: the mutation occurred in a colony of outbred Zucker rats in the laboratory of Dr. Walter Shaw at Eli Lilly Research Laboratories in Indianapolis, Indiana, in 1977.
• Characteristics: Obesity, Insulin Resistance, Hyperinsulinemia, Type 2 Diabetes,
Hyperglycemia, Hypertriglyceridemia, Hypercholesterolemia, Nephropathy,
Impaired Wound Healing, Mild Hypertension, Neuropathy.
• The ZDF rat is a popular obese, type 2 diabetes model.
• In the Zucker fatty (fa) rats, a missense A to C mutation in the Lepr gene on
chromosome 5 (Leprfa) causes a Gln to Pro change in all the identified isoforms of
Ob-R protein. A sub-strain of Zucker fatty rats, known as the ZDF rats, is
selectively inbred for hyperglycemia.
• They carry an autosomal recessive defect in β-cell transcription that is inherited
independently from the Lepr mutation.
************************************************************************
Role of Veterinarians in Biomedical Science
Dr. K.B. Patel
Chief Veterinary Officer, Institute of Liver and Biliary Sciences, New Delhi
1. Introduction
The veterinary profession has continually adapted to the changing needs of society.
In the nineteenth century, horses and their health were the primary focus of veterinary
medicine because of the important role horses played in the lives and livelihood of people,
including farming and transportation. In the early twentieth century, animal agriculture
rose in prominence. In the late twentieth century, veterinary medicine gave much greater
attention to the health of companion animals. The human–companion animal relationship
and bond continues to evolve. Pets often play a key role in the family structure, and the
importance of the human–animal bond is well recognized to have positive effects on human
well-being, health, stress, healing, and the immune system.
The question that is often asked is why veterinarians become involved with animals
used for scientific purposes. There are two main reasons. First, the animal research branch
of the profession is intellectually stimulating and encourages scientific curiosity. It
encourages use of all of the primary disciplines of veterinary science and adds unique skills
not often used in other veterinary pursuits including care of a wide variety of often unusual
species, facility design for intensive animal production and experimental holding, infection
control and zoonoses, scientific principles, ethics, philosophy, policy formation, animal
research compliance, gene technology and OHS. Second, the desire to become a
veterinarian usually stems from empathy for animals and it is that empathy that is critical
for veterinarians who fulfil a role in the monitoring and care of animals used for scientific
purposes. The recognition that these animals are sentient beings, whose welfare is
paramount, is imperative and the veterinarian plays a key role in ensuring that everyone
involved with the use of these animals understands this principle. The adverse effects of
stress on the immune system, for example, are well documented and it is the interface
between the researchers and the veterinarians that promotes the reality that good animal
welfare leads to good science!
2. Contributions of veterinarians to animal and human health
Veterinarians in research seek better ways to prevent, diagnose and treat animal and
human health problems. They study many diseases such as cancer and cardiovascular
disease using laboratory animals that are carefully bred, raised, and maintained under the
supervision of veterinarians. Laboratory animal veterinarians help select the best animal
models for particular research projects and carefully monitor the animals to ensure proper
care and attention to the animals’ well-being. In addition to developing ways to reduce or
eliminate the threat of animal diseases, research veterinarians have made many
contributions to human health. For example, research veterinarians were the first to isolate
filterable viruses, the first tumor-causing virus, Salmonella species, Brucella species, and
other pathogens. They also helped conquer malaria and yellow fever, solved the mystery
of botulism, produced an anticoagulant used to treat some people with heart disease, and
developed and refined surgical techniques for hip-joint replacement and limb and organ
transplants in people.
The use of animals as experimental models will continue to be an essential tool in
the immediate future, permitting innovative and increasingly molecular discoveries of
basic science to be translated into improvements in health care for humans and animals
alike. The laboratory animal veterinarian plays a critical role on behalf of society and the
research community by overseeing and helping to safeguard the use of animals in research.
In terms of societal expectations, the laboratory animal veterinarian works to promote
animal welfare by ensuring that experimental protocols maximize animal well-being and
minimize pain and distress. On behalf of the research community, laboratory animal
veterinarians must ensure that healthy animals are procured and maintained to produce
reliable research results. The veterinarian also assists the research team in developing
relevant models for study and to seek refinement, reduction, and replacement alternatives
for animal use. Factors such as public concern regarding animal welfare and increased
regulatory scrutiny of research animal use, along with the rapidly expanding use of
genetically engineered mice as research tools, have increased the need for veterinarians
with expertise in laboratory animal medicine in all sectors of employment including
academia, government, regulatory affairs, and industry in India.
3. Most challenging part of a research veterinarian’s job
Research veterinarians say the most challenging part of their job is the vast amount
of knowledge they must have readily available. These veterinarians must understand
everything currently known about a disease – from its cause(s), to the way it interacts with
the body’s cells, to research on potential cures.
4. Animals used in biomedical research
Almost all of the animals used in laboratory work are mice, rats, and other rodents.
Cats, dogs, monkeys, and other animals (sheep, goat, pigs, mini pigs, etc.) make up less
than ten percent of the research animal population. Mice and rats are ideal for research
projects because it is easy to study long-term degenerative diseases in an animal that has a
short lifespan. In addition, mice and rats are inexpensive and easy to obtain. Cats and dogs
are chosen to study diseases in the human population such as cancer and AIDS. Monkeys
may be research subjects because of their physical similarities to humans, making it easier
to study complex human ailments in their systems.
5. Profile of veterinarians in biomedical research
Veterinarians working in pharmaceutical and biomedical research firms develop,
test, and supervise the production of drugs, chemicals, and biological products, such as
antibiotics and vaccines for human and animal use. In both government laboratories and in
corporate research facilities, veterinarians provide daily veterinary medical care to the
animals involved in research, ensure that the animals are properly and humanely cared for,
and use their expertise to improve surgical techniques for humans and animals. Some
veterinarians who work for the Committee for the Purpose of Control and Supervision of
Experiments on Animals (CPCSEA) visit research laboratories to ensure that the treatment
of the lab animals adheres to the Indian laws designed to protect the lab animals. Research
veterinarians may also work with government institutions and pharmaceutical industry and
contract research organizations. Research veterinarians at these agencies work to find cures
for diseases and sometimes test potential new drugs on the animals to determine if the drugs
are effective and safe to meet regulatory requirements of the Drugs Controller General of India
(DCGI).
In addition to performing research themselves, veterinarians play crucial roles in
the support of biomedical research. Laboratory animal veterinarians support the medical
care and health of laboratory animals, promote animal well-being and humane care of
laboratory animals, maintain barrier conditions against diseases to improve laboratory
research, conduct collaborative research, provide technical instruction to scientists,
monitor compliance with state and federal regulations, and serve on institutional animal
research review panels. Veterinary and medical pathologists (who specialize in laboratory
animal pathology) are central to the diagnosis of spontaneous diseases, understanding the
mechanisms of disease induced by experimental procedures, and analyzing the phenotype
of spontaneous or induced genetic modifications in animals. When expert pathology support
is lacking on a research project, important information may be misinterpreted or go
undiscovered, which reduces the impact of animal-based research.
Animal models of human disease have become increasingly important in fostering
translational research. Research using spontaneous or experimental animal models of
disease can have benefits for both veterinary and human medicine. Clinical trials and
evidence-based medicine for spontaneous diseases in companion animals can have direct
relevance to similar human conditions. In some cases, clinical trials with spontaneously
occurring diseases in dogs, cats, horses, and other companion animals may better predict
pathogenesis and treatment outcomes in humans than experimental rodent studies.
6. Animal testing in advancing medicine
Virtually every major medical advance of the last century for both human and
animal health is the result of animal testing. From antibiotics to organ transplants,
practically every present day protocol for preventing, treating, curing, and controlling
diseases in humans is based upon knowledge gained through animal research.
7. The Role of the Laboratory Animal Veterinarian
Several key documents regarding adequate veterinary care were provided to
discussion group participants: the American College of Laboratory Animal Medicine
(ACLAM) position paper on adequate veterinary care (ACLAM 1996); the report of the
joint working group of FELASA/ECLAM and the European Society of Laboratory Animal
Veterinarians (ESLAV) on veterinary care for laboratory animals (Voipio et al. 2008); and
recommendations published by the Canadian Council on Animal Care (CCAC): Appendix
VI, Continuing Education (CE) for Consulting and Newly Hired Institutional Veterinarians
Working in Science, and an excerpt from Section 7.2, Qualifications and Continuing
Education for Veterinarians and Staff (CCAC 2008). The ACLAM position paper notes
that the complexity of the veterinary care program may vary depending on the number and
species of animals used and types of research conducted at the institution.
ACLAM outlines the following primary responsibilities for the veterinarian:
Disease detection and surveillance, prevention, diagnosis, treatment, and
resolution; guidance and monitoring in (1) handling and restraint of animals; (2)
anesthetics, analgesics, and tranquilizers; and (3) methods of euthanasia; review and
approval of all preoperative, surgical, and postoperative procedures; promotion and
monitoring of animal well-being before, during, and after experimentation or testing; and
involvement in the review and approval of all animal care and use at the institution.
Ancillary roles identified by ACLAM include providing training to staff in the care
and use of laboratory animals, providing input to the occupational health and safety
program, monitoring for zoonotic diseases, advising on standards of hygiene in the animal
facility, and advising on and monitoring biohazard control policies and procedures. The
FELASA/ECLAM/ESLAV guidelines for the veterinary care of laboratory animals
recognize the importance of postgraduate education for veterinarians who specialize in
laboratory animal medicine based on the many competencies necessary to fulfill the
veterinarian’s multiple roles in this field. The guidelines acknowledge that there may be
acceptable gradations of expertise in laboratory animal medicine, ranging from the
availability of a skilled mentor for veterinarians working in small institutions to specialty
board certification from ECLAM. Regardless of the specific circumstances of employment,
the guidelines identify
the following key areas of responsibility: assessment of animal welfare, including
behavioral evaluation; early recognition of signs of pain and distress, and their alleviation;
prevention, diagnosis, and treatment of disease; verification that animals procured for
research are derived from a licensed source or approved breeder; implementation of an
appropriate health monitoring program; appropriate record keeping; involvement in the
training and assessment of competence of those who conduct procedures that may affect
animal welfare; review of anesthetic procedures; consultation with the researcher about
proposed surgical procedures and adequate perioperative care; advice, training, and
oversight of euthanasia procedures; and participation in the review of proposed research
and implementation of the Three R’s concept viz. reduction, refinement, and replacement
(Russell and Burch 1959). More recently, the OIE Terrestrial Animal Health Code has been
revised to describe the various responsibilities of the veterinarian. In addition to provision
of an adequate program of veterinary care in a research environment, these responsibilities
involve clinical duties, including preventive medicine, disease surveillance, diagnosis and
management of disease, and management of controlled drugs; post-mortem examinations;
maintenance of veterinary medical records; and advice on zoonotic risks and notifiable
diseases, surgery and postoperative care, analgesia, anesthesia, euthanasia, and humane
endpoints. There was consensus among the discussion participants that, over and above the
roles and responsibilities described in these documents, the veterinarian served as a trainer
in laboratory animal medicine and science for the institution and was key to ensuring sound
management of the animal facility. They also noted that the veterinarian’s input in protocol
review was essential.
8. Scope for Veterinarians in Biomedical Research
The involvement of laboratory animals in medical research has yielded numerous
unique insights into the mechanisms of pathogenesis and, in turn, the ability to develop
new drug molecules. Recent advances in human genome research, together with the
introduction of molecular bio-imaging, have further increased the use of laboratory animals
to establish connections between diseases and molecular factors. This growing contribution
to medical science is widely appreciated, and due credit is owed to professional
veterinarians, whose active participation and thorough knowledge of the physiological and
anatomical peculiarities of different laboratory animal models have produced much
valuable information. Animal model studies have continually enhanced our understanding
of the genetic, molecular, and cellular components of human disease and have promoted
direct animal-to-human translational application. Many efforts are underway to find
suitable alternatives to animal experiments, to increase the usefulness of those that already
exist, and to refine animal research models and methods. For the present, however,
biomedicine, with its experimentation on animals, reveals laws of nature that the clinician
and patient can use to improve quality of life, prolong the life span, and relieve suffering.
We all want to lead healthy and enjoyable lives, and most of us want the benefits of modern
medical research, benefits that would have been unthinkable without the contribution of
animal research.
An increased emphasis on scientific methods of breeding and raising livestock,
poultry, and fish will contribute to a demand for more veterinary researchers. Research
veterinarians will also continue to support public health and disease control programs. We
can look to research veterinarians to develop new surgical and medical treatments for
animals that can be later used to help humans in need of the same type of treatment. There
are many opportunities available in the pharmaceutical industry, contract research
organizations, vaccine manufacturing companies, lab animal breeders and diagnostic
service providers, government research institutions, universities, infertility clinics etc.
9. Vivarium Automation:
At the special request of the organizers, this topic has been added to the
presentation to give a glimpse of the automation technologies used in vivaria and their
benefits. In a vivarium, where environmental controls are critical to research and where
animal and personnel health and welfare are a constant concern, automation can reduce
risk and increase efficiency. Introducing an automated process can improve vivarium
function and safety in many ways that save time and money. Automation can also introduce
new levels of control and monitoring that raise specific criteria and conditions to a higher
level of reliability.
The types of automation discussed in this article tend to be large systems; however,
automation isn’t just robots and controls. A standard rack washer or tunnel washer is a
form of automation, as it replaces manual cleaning, and a host of automation devices are
available to handle the delivery, dispensing and proportioning of disinfectants, sterilants
and cage-wash chemicals.
For some types of automation, adding new equipment to do a certain task can be an
initial expense but can result in labor savings and health risk reduction. Some automation,
however, is a systemic operation that requires careful planning. As with automated
equipment, facility or building related automation comes with long term benefits and
potentially high initial costs. This is all the more reason to approach automation as a part
of the overall facility plan.
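As an illustration of the monitoring side of vivarium automation, the sketch below flags out-of-range environmental readings that would otherwise require continuous manual checks. The acceptable ranges, parameter names, and readings are invented for illustration, not drawn from any regulatory standard or commercial system:

```python
# Minimal sketch of automated environmental monitoring for a vivarium.
# The acceptable ranges below are illustrative assumptions, not regulatory values.
ACCEPTABLE = {
    "temperature_c": (20.0, 26.0),   # assumed rodent-room range
    "humidity_pct": (30.0, 70.0),
}

def check_reading(parameter, value):
    """Return an alert string if the reading is out of range, else None."""
    low, high = ACCEPTABLE[parameter]
    if value < low or value > high:
        return f"ALERT: {parameter}={value} outside {low}-{high}"
    return None

# A batch of sensor readings; only the humidity reading should raise an alert.
readings = [("temperature_c", 22.5), ("humidity_pct", 75.0)]
alerts = [a for p, v in readings if (a := check_reading(p, v))]
```

In a real facility such checks would feed a building-management system that pages staff; the value of the automation is that every reading is evaluated, around the clock, against the same criteria.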
10. Summary
Because of their emphasis on animal welfare issues, laboratory animal veterinarians
are in a position to make direct and important improvements to the care, well-being, and
welfare of animals, which often become permanent standards for veterinary care of all
species, both nationally and internationally. Veterinarians can play a very important role in
biomedical research as laboratory animal medicine veterinarians, directors of laboratory
animal care and use programs, independent researchers in different areas of biomedical and
translational research, and comparative medicine experts. All these roles differ from what
veterinarians studied in their degree curriculum and require different kinds of skills and
competencies. Various courses are available in India and abroad to acquire these
competencies and skills; some are even available online.
Automation technologies are improving efficiency in the vivarium and helping
institutions keep up with the growing number of mutant rodents in their colonies while
maintaining consistent quality.
References:
1) https://www.avma.org/KB/K12/Documents/biomedical_bgnd.pdf
2) Tomas, J. R. et al. (2009). The need for veterinarians in biomedical research. J Vet
Med Educ. 36(1): 70-75.
3) Turner, P. V. et al. (2009). Laboratory animal medicine — needs and opportunities
for Canadian veterinarians.
https://www.ncbi.nlm.nih.gov/pmc/articles/PMC2643448/
4) Bayne, K. et al. (2011). Harmonizing veterinary training and qualifications in
laboratory animal medicine: a global perspective. ILAR J. 52(3): 393-403.
5) https://www.researchgate.net/publication/280923078_ROLE_OF_VETERINARI
AN_IN_LABORATORY_ANIMAL_RESEARCH_AND_MEDICINE
6) Zurlo, J. et al. (2009). Adequate veterinary care for animals in research: a
comparison of guidelines from around the world.
https://pdfs.semanticscholar.org/0bc6/a503260680fef1b383ca2182623e4a357b43.
7) Cosgrove, C. et al. (2003). Vivarium Automation Part 1.
https://www.laboratoryequipment.com/article/2003/06/vivarium-automation-part-1
************************************************************************
Robotics and Artificial Intelligence in Drug discovery: A review
U.D. Patel1, C.M. Modi1, H.B. Patel1, B.S. Mathapati2, Ladumor V.C. 1, Vaja R.K1.
1Department of Vet. Pharmacology and Toxicology, 2Department of Vet. Microbiology
College of Veterinary Science and A.H., Junagadh Agricultural University, Junagadh
1. Introduction
In the fields of medicine, biotechnology and pharmacology, drug discovery is the
process by which new candidate medications are discovered (Figure 1). Historically, drugs
were discovered through identifying the active ingredient from traditional remedies or by
serendipitous discovery. Later chemical libraries of synthetic small molecules, natural
products or extracts were screened in intact cells or whole organisms to identify substances
that have a desirable therapeutic effect, in a process known as classical pharmacology. Since
the sequencing of the human genome, which allowed rapid cloning and synthesis of large
quantities of purified proteins, it has become common practice to use high-throughput
screening of large compound libraries against isolated biological targets hypothesized to
be disease-modifying, in a process known as reverse pharmacology. Hits from these screens
are then tested in cells and then in animals for efficacy.
Discovering drugs that may be a commercial success, or a public health success,
involves a complex interaction between investors, industry, academia, patent laws,
regulatory exclusivity, marketing and the need to balance secrecy with communication
(Warren et al., 2011). Meanwhile, for disorders whose rarity means that no large
commercial success or public health effect can be expected, the orphan drug funding
process ensures that people who experience those disorders can have some hope of
pharmacotherapeutic advances.
Figure 1: Drug discovery and development (Source: https://medium.com/advances-in-biological-science/dynamic-undocking-a-structure-guided-tool-for-virtual-drug-discovery-bd380ff30435)
2. Screening methods and designing
It is very unlikely that a perfect drug candidate will emerge from these early
screening runs. One of the first steps is to screen for compounds that are unlikely to be
developed into drugs; for example, compounds that are hits in almost every assay, classified
by medicinal chemists as "pan-assay interference compounds", are removed at this stage,
if they were not already removed from the chemical library (Baell and Holloway, 2010;
Dahlin and Walters, 2014; Baker, 2017). It is often observed that several compounds are
found to have some degree of activity, and if these compounds share common chemical
features, one or more pharmacophores can then be developed. At this point, medicinal
chemists will attempt to use structure-activity relationships (SAR) to improve certain
features of the lead compound:
Increase activity against the chosen target
Reduce activity against unrelated targets
Improve the drug likeness or ADME properties of the molecule.
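The triage step described above can be sketched in code. The compound records, activity values, and boolean flag below are invented for illustration; real pipelines match curated substructure patterns (such as the published PAINS filters) against compound structures rather than consulting a precomputed flag:

```python
# Illustrative early-screening triage: discard promiscuous ("pan-assay
# interference") hits first, then keep compounds above an activity threshold.
# All compound data here are hypothetical.
compounds = [
    {"name": "cmpd_1", "activity": 0.85, "pains_flag": False},
    {"name": "cmpd_2", "activity": 0.90, "pains_flag": True},   # hits every assay
    {"name": "cmpd_3", "activity": 0.40, "pains_flag": False},  # too weak
]

ACTIVITY_CUTOFF = 0.5  # assumed threshold for calling a compound a hit

def triage(library):
    """Remove PAINS-flagged compounds, then filter by activity on the target."""
    clean = [c for c in library if not c["pains_flag"]]
    return [c for c in clean if c["activity"] >= ACTIVITY_CUTOFF]

hits = triage(compounds)  # only cmpd_1 survives both filters
```

Ordering matters in practice: removing interference compounds before ranking by activity prevents an artefactually "potent" promiscuous binder from displacing genuine leads.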
3. Challenges of Drug Development
Drug development is a lengthy, complex, and costly process, fraught with a
high degree of uncertainty as to whether a drug will actually succeed.
The unknown pathophysiology for many nervous system disorders makes target
identification challenging.
Animal models often cannot recapitulate an entire disorder or disease.
Challenges related to heterogeneity of the patient population might be alleviated
with increased clinical phenotyping and endotyping.
Greater emphasis on human data might lead to improved target identification and
validation.
There is a lack of validated diagnostic and therapeutic biomarkers to objectively
detect and measure biological states.
Unfamiliarity with current regulatory processes for investigational new drug (IND)
applications can be resolved through pre-IND meetings (Anonymous, 2014).
4. Robotics in Drug discovery
In recent years, high throughput screening (HTS) has been a major component in
advancing new lead discovery research in Pharma. High throughput screening is a great
concept and holds promise for academia, but to follow the pharmaceutical paradigm in
transforming a screen hit to a lead, and eventually into a drug is easier said than done.
While the automotive and electronics industries, among others, have been relatively fast in
adopting robotics, pharma has been lagging behind. However, as the technology available
becomes more advanced, flexible and affordable, robotics now stands as a pivotal element
to helping pharmaceutical companies reduce costs and increase efficiency, notably in the
drug discovery process.
As an example of how the use of robots can allow researchers to bypass menial
tasks and focus their time on more worthwhile activities such as real drug development and
research, Hogg points to the case of US genome research equipment manufacturer
SciGene, which has created a robot with the ability to prepare DNA samples. Hogg notes,
“The laboratory technician or researcher does not require engineering skills but can
program the robot using simple instructions. The precision is so high that robots today can
put 40,000 dots of DNA onto a single microscopic slide – such a feat cannot be rivalled by
human hands.” (Burton, 2018). With automation, running samples by hand has a potential
output of around 30-40 per day, whereas robotic systems can approach outputs in the
hundreds of thousands per day. A cornerstone of one such initiative is NiCoLA-B,
a drug discovery robot at AstraZeneca’s U.K. Centre for Lead Discovery, the company’s research
center on the Cambridge campus. Its name is a variation of the moniker for the entire robot
system: CoLAB (collaborative laboratory), which was developed by HighRes
Biosolutions. The robot can test more than 300,000 compounds a day.
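The throughput gap quoted above can be made concrete with a little arithmetic. The plate format and per-day plate count below are assumptions for illustration (1536-well plates are common in ultra-high-throughput screening), not figures from any named system:

```python
# Rough throughput comparison, manual handling vs a robotic HTS line.
WELLS_PER_PLATE = 1536              # assumed ultra-HTS plate format
PLATES_PER_DAY_ROBOT = 200          # assumed robotic plate capacity per day
MANUAL_SAMPLES_PER_DAY = 40         # upper end of the 30-40/day figure above

robot_samples_per_day = WELLS_PER_PLATE * PLATES_PER_DAY_ROBOT  # 307,200
speedup = robot_samples_per_day / MANUAL_SAMPLES_PER_DAY        # 7,680x
```

Under these assumptions a single robotic line tests several thousand times more samples per day than a bench scientist, which is consistent with the order of magnitude reported for systems like NiCoLA-B.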
Figure 2: Robotic system for screening of molecules. (Credit: HighRes Biosolutions; Source: https://www.rdmag.com/article/2018/03/using-robotics-and-automation-technology-control-drug-discovery-costs)
Figure 3: Global market in robotics at present and in future
Figure 4: Applications of robots in various fields
5. Bioinformatics or Artificial intelligence (AI) in Drug discovery
Drug discovery is an expensive process. The cost of bringing a drug from the early
discovery stages through to the market currently stands at an estimated average of $2.6 bn
(Avorn, 2015). Bioinformatics analysis can not only accelerate drug target identification
and drug candidate screening and refinement, but also facilitate characterization of side
effects and predict drug resistance. High-throughput data such as genomic, epigenetic,
genome architecture, cistromic, transcriptomic, proteomic, and ribosome profiling data
have all made significant contribution to mechanism-based drug discovery and drug
repurposing. Machine learning and other technologies are expected to make the hunt for
new pharmaceuticals quicker, cheaper and more effective.
London-based start-up firm BenevolentBio has its own AI platform, into which it
feeds data from sources such as research papers, patents, clinical trials and patient records.
This forms a representation, based in the cloud, of more than one billion known and
inferred relationships between biological entities such as genes, symptoms, diseases,
proteins, tissues, species and candidate drugs. This can be queried rather like a search
engine, to produce ‘knowledge graphs’ of, for example, a medical condition and the genes
that are associated with it, or the compounds that have been shown to affect it. Most of the
data that the platform crunches are not annotated, so it uses natural-language processing to
recognize entities and understand their links to other things. “AI can put all this data in
context and surface the most salient information for drug-discovery scientists” (Fleming, 2018).
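The "knowledge graph" idea can be sketched with a toy adjacency structure. The entities and relationships below are invented placeholders; a platform like the one described infers such links at the scale of more than a billion relationships mined from papers, patents, and records:

```python
# Toy knowledge graph: entities as nodes, typed relationships as edges.
# All biological "facts" below are illustrative placeholders.
graph = {
    "disease_X": {"associated_gene": ["GENE_A", "GENE_B"]},
    "GENE_A": {"targeted_by": ["compound_1"]},
    "GENE_B": {"targeted_by": ["compound_2", "compound_3"]},
}

def compounds_for_disease(disease):
    """Follow disease -> associated gene -> targeting compound edges,
    analogous to querying the graph 'like a search engine'."""
    results = set()
    for gene in graph.get(disease, {}).get("associated_gene", []):
        results.update(graph.get(gene, {}).get("targeted_by", []))
    return sorted(results)
```

Querying `compounds_for_disease("disease_X")` walks two edge types and returns every compound linked to the disease through its associated genes; the real systems add natural-language processing to build the graph and confidence scores to rank the paths.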
Figure 5: Artificial intelligence and SAR
Another example of a tech-giant making advances into healthcare through AI is
Google’s DeepMind Health, in this case, working in partnership with Moorfields Eye
Hospital NHS Foundation Trust in London, developing technology to address macular
degeneration in aging eyes. The results, recently published in Nature Medicine,
detail the process in which AI technology analyzed thousands of historic retinal scans to
identify signs of eye diseases — such as glaucoma, macular degeneration and diabetic
retinopathy. Boston-based biopharma company Berg is using AI to research and develop
diagnostics and therapeutic treatments in multiple areas, including oncology, by applying
an algorithm and probability-based artificial intelligence to analyze large numbers of
patients’ genotype, phenotype and other characteristics (Wnuk, 2018).
Conclusion:
Robotics has an important role in the process of drug discovery and
development. The process of drug development can be sped up by using robotics, and
precision and robustness can be achieved with robot-based in vitro screening laboratories.
In future, the pharma industry may expand its robotic setups for fast screening of libraries
of synthesized molecules across a variety of pharmacological activities.
References
Anonymous (2014). Forum on Neuroscience and Nervous System Disorders; Board on
Health Sciences Policy; Institute of Medicine. Washington (DC): National Academies
Press (US); 2014 Feb 6.
Avorn J. The $2.6 billion pill-methodologic and policy considerations. N Engl J Med 2015;
372: 1877-9.
Baell JB, Holloway GA (2010). "New substructure filters for removal of pan assay
interference compounds (PAINS) from screening libraries and for their exclusion in
bioassays". Journal of Medicinal Chemistry. 53 (7): 2719–40.
Baker M. (2017). "Deceptive curcumin offers cautionary tale for chemists". Nature. 541
(7636): 144–145.
Burton, P (2018). Robotics: Changing the Face of Drug Discovery.
https://pharmaboardroom.com/articles/robotics-changing-the-face-of-drug-discovery/
Accessed on 27th December, 2018.
Dahlin JL, Walters MA (2014). "The essential roles of chemistry in high-throughput
screening triage". Future Medicinal Chemistry. 6 (11): 1265–90.
Fleming, N. (2018). How artificial intelligence is changing drug discovery.
https://www.nature.com/articles/d41586-018-05267-x. Accessed on 27th December,
2018.
https://pharmaphorum.com/views-and-analysis/artificial-intelligence-in-drug-discovery-
and-diagnosis/. Accessed on 27th December, 2018.
Warren J (2011). "Drug discovery: lessons from evolution". British Journal of Clinical
Pharmacology. 71 (4): 497–503.
Wnuk, P (2018). Artificial intelligence in drug discovery and diagnosis
************************************************************************
Automation and Biotechnology: Impact on Veterinary Research and Education
B. S. Mathapati1, B. B. Javia1, D. B. Barad1, H. B. Patel2, U. D. Patel2, V. K. Singh3,
S. N. Ghodasara1 and A. M. Zala1
1Department of Veterinary Microbiology, 2Department of Veterinary Pharmacology &
Toxicology, 3 Department of Veterinary Physiology & Biochemistry College of
Veterinary Science & A.H., JAU, Junagadh
1. Introduction
In today’s advanced world, the terms automation, artificial intelligence (AI),
biotechnology and nano-technology are common words. Initially these were the routine
words in lexicon of engineering technology and industry. But now these words have also
taken their niche even in biology and related sciences. Automation is the use or introduction
of automatic equipment in a manufacturing or other process or facility. In other words,
there is some automatic equipment that does not need human inference to so some task.
Again it may be Artificial Intelligent or Fix Programmed. This means you can do
Automation by two ways either make an Artificial Intelligent Machine or Program Fix
instructions to a machine. This is what Automation is. Some people may consider
automation easy but actually it is very fascinating, you need to have decent amount of
knowledge in control engineering/control systems, signals and systems, signal processing,
electronic devices and a core programming language. On the other hand Artificial
Intelligence we create some sort of Machine in which we program the “Ability to acquire
and apply knowledge and skills. That means any set of work which the automated machine
does through one or more decision making points. So using this intelligence it learns by
previous experiences that means we do not need to program each and every instruction and
we use either Data or we reward the machine whenever it does some task that we want it
to do. And day by day it learns the task we want it to do. These advanced areas have brought
lot many changes in the areas of biology research in a larger extent along with few
expansions in the areas of education. Here the veterinary research and education in per se
is not at all different from other research & education. So, the topics and points discussed
here will be concentrated on research and education rather veterinary research & education
specifically.
2. Artificial intelligence driven Automation in research
Inspired by the word “automatic”, the term automation was coined in 1936 by D.
S. Harder from General Motors. Harder defined the term as the “automatic handling of
parts between each production process”. It refers to the entire process that involves the use
of machines, control systems etc. in order to optimize production processes in the
production of goods. Modern automation systems used in biotechnology consist of made-
to-measure, highly precise hardware and software components. They are employed
throughout, from single steps to complete processes, including management, regulation
and control. Because of the good cost-performance ratio and the flexibility, platform
solutions that can be adapted to certain requirements by both manufacturer and user have
become very popular. From a historical perspective, automation followed the
mechanization of work and production processes that were considerably modernized by
the industrial revolution. Automation goes far beyond mechanization as it involves not just
the pure execution of tasks but also the control and regulation being taken over by artificial
systems. Much like a bicycle leverages mechanical advantage to propel a person much
faster and further than on his or her own, automation machinery is amplifying both the
speed and accuracy with which biological research can be conducted. Machines are precise
and do not fatigue, unlike their human counterparts. For this reason, machines are slowly
being adopted for laborious, repetitive, and tedious laboratory tasks.
There are two ways of thinking about machines in the modern scientific
community: physical, mechanical machines and laboratories, institutes, and facilities to
which experiments can be outsourced. The former of these are often tailored to a single
specific task. Tasks can include phenotypic screening, liquid transfer, and even laboratory
organism maintenance. Mechanical machines are perfect for performing repetitive, tedious
lab work inside the laboratory. Many of these machines offer sensors that can detect and
measure traits that would be otherwise immeasurable by hand. When utilized properly,
mechanical machines can greatly increase precision, efficiency, and reproducibility while
reducing overall workload. Mechanical machines, while massively advantageous when
working properly, often lack the ability to sense their surroundings or steps in their
mechanical processes. Therefore, it is nearly impossible for these machines to adapt to
conditional changes or for certain segments of the machine to provide feedback to other
segments, either to adjust processes or alert their human keepers to an issue.
Great strides must be made in the design and engineering of experimental
machinery so as to reduce the innate variability of experimental setup and screening.
Humans, as sentient beings, can sense and adjust to errors made mid-experiment. If too much of a
solution is added to a reaction, a person can adjust the rest of the protocol on the fly or, at
worst, begin the reaction over again. If the same error is made by a machine, it may go
unnoticed until the output data are collected. Even worse, it may go completely unnoticed,
presenting false data that are interpreted as correct upon analysis. Likewise, if the error is
due to the improper functioning of a device, such as a pipette in the above example, a
human is more likely to notice the error in the first place as we are equipped with sensors,
i.e. eyes, that can detect that the volume transferred does not match the volume desired.
Such sensors and feedback networks are currently not available on many of the mechanical
machines being utilized in experiments.
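A feedback layer of the kind this passage calls for could look like the following sketch. The function name, 2% tolerance, and the idea of a per-transfer measurement are all assumptions for illustration, not the API of any real liquid handler:

```python
# Sketch of a liquid-handling step with a feedback check: compare each
# measured transfer against the requested volume and raise immediately,
# rather than letting the error propagate silently into downstream data.
TOLERANCE = 0.02  # assumed 2% relative tolerance

def verify_transfer(requested_ul, measured_ul):
    """Return the measured volume if within tolerance of the request;
    otherwise raise so a human is alerted mid-run instead of post-analysis."""
    deviation = abs(measured_ul - requested_ul) / requested_ul
    if deviation > TOLERANCE:
        raise RuntimeError(
            f"Transfer fault: requested {requested_ul} uL, got {measured_ul} uL"
        )
    return measured_ul
```

The design choice mirrors the pipetting example in the text: a sensor (here, a volume measurement) plus a hard stop converts a silent systematic error into an immediately visible fault.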
The second way to think about machine automation in science is through the
outsourcing of experiments to external research labs, institutions, or facilities. These
organizations will take, as input, reagents or experimental parameters and return the output
reagents or data back to the experimenter. These organizations can be thought of as
automation machines because they take input and serve output from and to experimenters
without any further involvement or work from the experimenter. The most common
example of this process is that of DNA sequencing. While DNA sequencing has become a
routine task in many modern laboratories, only a small minority of labs own and operate
their own sequencing machine, either due to financial cost or lacking an operator with the
requisite expertise and experience. Instead, most laboratories outsource this process to
external organizations that specialize in DNA sequencing.
Perhaps the greatest advantage of the utilization of automation machinery is the
ability to pipeline experimental segments into one another. For example, laboratories can
now outsource the creation of a specific reagent, the utilization of the reagent in a specific
experiment, and the measurement of the outcome of the experiment all without conducting
any work in their own laboratory. While the jury is still out on whether this pipelining
technique will ever lead to any “virtual laboratories” where investigators simply dream up
experiments, outsource their execution, then analyze the data, it is important to note that
the modularization of experiments will almost certainly lead to increased efficiency in
much of modern science. Instead of spending money on training and technicians to run
repetitive short-term experiments, researchers now have the capacity to outsource these
experiments to either mechanical machinery in their own lab or external research
organizations for a fixed, per-experiment cost. Automation can increase the productivity
of a single individual by orders of magnitude. Machines do not tire or vary innately in their
performance of a task; however, machines can break, suffer inaccuracy or variability in
their measurements, and not detect faults when they may be blindingly apparent to a
human. For these reasons, great care must be taken in protocol derivation and maintenance
scheduling when utilizing any form of automation equipment.
When utilizing machinery in the design and/or execution of an experiment, the most
important variable to consider is the trustworthiness of the data at each individual step. If
unnoticed, systematic biases in data collection resulting from measurement error of
machinery can lead researchers astray. Additionally, variability between experimental runs
must be considered. One sign of data trustworthiness is reproducibility of the data. If the
same experiment is replicated under the same conditions, the data resulting from each
round of the experiment should be, at best, identical or, at worst, comparable. Although
automation machinery has the incredible upside potential to streamline and parallelize
experimental workflows, scientists must be careful to thoroughly validate results, both data
and reagents.
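One simple and widely used check of run-to-run reproducibility is the percent coefficient of variation (%CV) across replicate measurements. The sketch below is illustrative only; the 15 % acceptance limit is an assumed placeholder, since real acceptance criteria are assay-specific and set during validation.

```python
from statistics import mean, stdev

def percent_cv(replicates):
    """Coefficient of variation (%) of one replicate set."""
    m = mean(replicates)
    return 100.0 * stdev(replicates) / m

def flag_irreproducible(runs, limit=15.0):
    """Return names of replicate sets whose %CV exceeds the limit.

    `limit` is an illustrative acceptance threshold, not a standard."""
    return [name for name, values in runs.items()
            if percent_cv(values) > limit]

runs = {
    "assay_A": [0.98, 1.01, 1.00],   # tight replicates
    "assay_B": [0.60, 1.10, 1.45],   # suspiciously scattered
}
print(flag_irreproducible(runs))      # -> ['assay_B']
```

A check like this can run automatically after every batch, so drifting instruments are caught before their data enter analysis.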
Processes that previously required pipetting, analysis and production to be carried
out manually are increasingly controlled by automated systems. However, this has not
required a complete reinvention of the wheel; instead, automation systems used
in the systems and mechanical engineering sectors are being adapted and optimized for
application in the life sciences.
Automation is the creation of technology and its application in order to control and
monitor the production and delivery of various goods and services. It performs tasks that
were previously performed by humans. Automation is being used in a number of areas such
as manufacturing, transport, utilities, defense, facilities, operations and lately, information
technology.
Automation can be performed in many ways in various industries. For example, in
the information technology domain, a software script can test a software product and
produce a report. There are also various software tools available in the market which can
generate code for an application. The users only need to configure the tool and define the
process. In other industries, automation is greatly improving productivity, saving time and
cutting costs. Automation is evolving quickly and business intelligence in applications is a
new form of high-quality automation. In the technology domain, the impact of automation
is increasing rapidly, both in the software/hardware and machine layer. However, despite
advances in automation, some manual intervention is always advised, even if the tool can
perform most of the tasks.
Automation is slowly taking over our lives, ranging from smart homes to self-
driving cars. With experiments becoming ever more complex and datasets more immense,
a question comes to mind: can we automate biotech research? Definitely yes. Most
pharma and biotech companies, and in part also academic labs, have automated sections of
their processes and make use of liquid-handling systems, especially those pursuing high-
throughput screenings. But compared with manufacturing and service industries, the life
sciences sector is lagging behind in terms of using laboratory automation to improve
productivity and quality.
The potential of laboratory automation becomes clear when looking at academia.
Here, PhD students are commonly the workhorses of the lab, often becoming absorbed in
repetitive pipetting tasks that keep them from spending time on the actual science. On top
of that, most researchers are still using handwritten lab notebooks. Synbio companies are
probably the most advanced when it comes to laboratory automation. Using software and
hardware automation, some of these companies aim to transform the process behind
engineering life. Co-founded by Tom Knight, the godfather of synthetic biology, Ginkgo
Bioworks was started with the dream of making biology easier to engineer. The company
designs and optimizes organisms to efficiently produce compounds according to the
customer’s desire. Today, the US company has built several ‘foundries’ — factories where
every step of genetic engineering has been automated, including high-throughput analytics
and small-scale fermentation. Running about 15,000 automated operations per month, the
so-called ‘organism engineers’ can build and test thousands of new prototypes for each
project. Thanks to automation, the company has significantly scaled up its throughput and
capacity.
In Europe, British LabGenius is another synbio company that applies AI and
automation to its gene synthesis technology to rapidly search through trillions of genetic
designs with the goal to find new biological solutions and create novel compounds. Liquid
handling systems and pipetting robots have been around for a while, but most of them are
simply too expensive for most laboratory settings. Also, they often require trained
personnel to program them according to the specifics of the experiment. Luckily, low-cost
liquid-handling robots are starting to become available thanks to companies like Opentrons
in the US. “Basically, if you’re a biologist you spend all of your time moving tiny amounts
of liquid around from vial to vial by hand with a little micro-pipette or you have a $100,000
robot that does it for you. We’re a $3,000 robot,” OpenTrons co-founder Will Canine told
TechCrunch. Their machines are controlled through a web browser and allow researchers
to download protocols from the cloud to run experiments. Taking it one step further,
London-based Synthace is developing Antha, “a programming language for biology”. The
technology consists of a software interface connecting hardware and wetware in the lab to
automate simple workflows of experimentation. It allows a researcher to build a protocol
and then communicate it to their lab equipment.
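Antha and similar tools express wet-lab protocols as code. As a rough illustration of the idea only, the pure-Python sketch below (all class and function names are hypothetical, not the Antha or Opentrons API) encodes a simple serial-transfer protocol and logs each pipetting step:

```python
class Pipette:
    """Toy single-channel pipette that logs each transfer step."""
    def __init__(self, max_volume_ul=300):
        self.max_volume_ul = max_volume_ul
        self.log = []

    def transfer(self, volume_ul, source, dest):
        if volume_ul > self.max_volume_ul:
            raise ValueError("volume exceeds pipette capacity")
        self.log.append((volume_ul, source, dest))

def serial_dilution(pipette, stock, wells, volume_ul=100):
    """Move `volume_ul` from stock to the first well, then
    well-to-well down the row -- a common automated protocol."""
    previous = stock
    for well in wells:
        pipette.transfer(volume_ul, previous, well)
        previous = well

p = Pipette()
serial_dilution(p, "stock", ["A1", "A2", "A3"])
print(p.log)
# -> [(100, 'stock', 'A1'), (100, 'A1', 'A2'), (100, 'A2', 'A3')]
```

The value of such a representation is that the step log can be validated, versioned and replayed on different hardware, which is exactly the portability these platforms promise.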
Many are still skeptical that robots and remote labs can actually do the trick. “I have
a hard time believing that a centralized automated lab will give you the freedom and
flexibility to experiment with all the parameters you need to do some innovation,” Roger
Chen, an associate at AlphaTech Ventures in San Francisco told Nature. “The thinking is
that we can solve a lot of the problems by taking the human out of the loop," says Fracchia.
“But biology is not computer science. Automated lab services are great for simple
experiments such as DNA sequencing or synthesis. But you cannot put data on everything.”
Fracchia believes that the solution to the reproducibility crisis is augmentation, not
necessarily automation. His company, BioBright, offers a suite of software, hardware, and
services to turn laboratories into ‘smart labs’ by enabling automated data collection using
sensors, cameras and a simple raspberry pi computer. The company also developed a voice
assistant that records researchers' voice notes, generating a reliable record that reduces
human error. In the future, Fracchia envisions scientists wearing lenses that project
holograms describing the contents of their test tubes. Automation alone might not be the sole
solution to make research more efficient and reproducible, yet innovations in this area
could greatly benefit the overall efficiency and quality of research in life science labs.
Automation and artificial intelligence as applied in biological research differ in
key respects from industrially used automation technologies. Because fluid handling is the
most essential basic operation in biological research, general-purpose robotics are of little
use here; instead, systems such as wet robotics or soft robots are needed. Designing soft
robots that can flexibly transform between
different morphologies and freely move in a controllable manner to perform desired tasks
is a long-sought goal spanning many scientific and technological disciplines ranging from
mechanics and physics to biology. A significant challenge in soft robotics is the
development of actuation schemes that reflect the required flexibility of a robot. Pneumatic
actuation is commonly used for actuation owing to its high degree of controllability, but it
remains complex and cumbersome. Over the past decade, many types of self-powered
machines, including biomolecular motors, hybrid microbes, and chemical fuel-driven
nano-micro motors, have been studied, but none of these methods have been found to be
compatible with soft robotics because their components are partially or entirely based on
solid materials. Room temperature liquid metals (LMs), which are generally composed of
gallium, gallium–indium eutectic alloys [EGaIn: 75 % Ga, 25 % In (wt/wt)], and gallium–
indium–tin alloys [Galinstan: 68.5 % Ga, 21.5 % In, and 10 % Sn (wt/wt)], have received
increasing attention as building blocks of soft robots owing to singular properties such as
superior conductivity, favorable flexibility, and increased biocompatibility relative to
mercury. In addition, the self-healing ability of LMs, which is an important feature of LM
systems, can benefit soft robotics.
The refinement of techniques for the remote manipulation of LM morphologies will
be essential in the further development of innovative soft robotics. The challenge in remote
manipulation is to develop methods for imparting energy (or fuel) to an LM to impart
physical motion and/or transformation. Once this goal is achieved, more complex tasks can
be envisioned for application-oriented LM soft robots. There is currently a high degree of
interest in the manipulation of LMs operating under physiological environments to carry
out biotechnological applications.
Robots are often designed on the basis of the ergonomics of organisms; as such, it
is inevitable that the concepts and designs underlying soft robotics will approach those of
biological systems, although much work remains to be done in this respect. We believe that
this convergence in design criteria will lead to the eventual use of soft robots in
biotechnological applications. Over the past few years, the biotechnological application of
LMs has featured prominently in four major areas: microfluidics, healthcare devices,
biomaterials, and theranostic nanomedicines. Here, the overview of only the major relevant
developments will be mentioned in these fields.
Micro-fabricated fluidic devices are promising for biotechnological applications,
performing efficient biochemical reactions and fast analyses on a chip. It is well known
that microfluidics can provide effective production methods for synthesis and chemical
functionalization of LM microspheres. LMs themselves are potentially a useful component
for these fluidic devices because of their unique abilities and highly controllable
manipulations. In fact, a recent review describes the potential applications of gallium LM
alloys for making various soft and metallic microfluidic components such as channels,
pumps, valves, mixers, electrodes, sensors, heat sinks, heaters and coolers. Traille et al.
fabricated a wireless passive temperature sensor using microfluidic technology and the
thermal volume expansion properties of Galinstan. Their study represented the first
demonstration of the remote measurement of temperature using the radar cross section
variability and thermal expansion of LM. Knoblauch et al. developed a microinjector to
introduce femto- to attoliter-range samples into cells and organelles with minimal cellular
damage. Their method, which relies on the heat-induced expansion of a Galinstan LM
within a pipette to force the sample from the capillary tip, is suitable for the delivery of
fluorochromes and DNA constructs into cyanobacteria and subcellular organelles such as
nuclei and chloroplasts. Controlling the temperature of a reaction mixture on a chip is of
particular importance for many microfluidic applications. Zhu et al. fabricated a Galinstan
enabled vortex generator using electrochemical actuation of LMs for homogeneous thermal
regulation in a fluidic chamber. Microfluidic technology based on LMs is a relatively new
field of research and is developing very rapidly; it holds high potential to further advance
the field of microfluidics for biotechnological applications.
Owing to their extremely high electrical conductivities, favorable fluidity, high
compliance, and benign biocompatibility, gallium-based LM alloys are promising
materials for use as stretchable and flexible circuits in wearable healthcare electronics.
Boley et al. demonstrated the applicability of EGaIn nanoparticles in stretchable
electronics. In their study, EGaIn nanoparticles were inkjet-printed directly onto an
elastomer glove surface to form arrays of strain gauges with intricate wiring and contact
pads. Hirsch et al. developed stretchable electronic conductors formed of biphasic solid–
liquid thin metal films.
Starting from a bilayer of solid–liquid thin metal film prepared from gold and liquid
gallium, a bilayer metallization sequence was initiated via the sputtering of an alloying
gold film onto a polydimethylsiloxane (PDMS) sheet. Liquid gallium (melting point ≈
29.8 °C) was then thermally evaporated onto the gold film. This process produced a
heterogeneous film forming a continuous network containing dispersed bulges composed
of clusters of the solid intermetallic alloy AuGa2 and liquid gallium. The conductor had a
large area and fine patterns comprising interconnections of light emitting diodes (LEDs).
All of the components remained functional, and the LEDs lit normally when the conductor
was stretched or folded into different shapes.
Yu et al. conducted biotechnological experiments to demonstrate
electrocardiogram measurement using gallium or gallium–indium alloy printed onto the
body surface and successfully verified the conformability and good attachability of the
resulting electrode.
Based on their advantages in terms of stretchability and rapid manufacturability,
fabrications based on flexible LM electronics represent a promising approach toward future
wearable healthcare electronics. It is widely agreed that the biggest advantage of using
gallium-based LMs in biotechnological applications is their high biocompatibility.
Additional positive qualities of LMs include convenience in terms of fabrication,
encapsulation, and surgical use, as these materials have zero stiffness and nearly infinite
stretchability as a result of their high fluidity. In this light, exploring the applicability of
LMs as implantable biomaterials is a key step to developing innovative biotechnology.
Zhang et al. reported the use of GaInSn alloy as an effective connecting or functional
recovery channel for transected sciatic bullfrog nerves. Based on measurements of
electroneurographic signals, they found that the favorable fluidity and high electrical
conductivity of GaInSn are particularly useful properties for such therapeutics.
The same group also demonstrated that the strong X-ray attenuation property and
high density of gallium were effective for use in in vitro X-ray imaging of cardiovascular
and kidney vessels. LM angiography offers the possibility of mega-contrast quality in the
reconstruction of significantly enhanced radiological vascular images. Although further
toxicological investigations of LMs will be required, the studies outlined above represent
the first indications of the potential of LMs as implantable biomaterials. Gallium-based
LMs are useful for various applications because of unique properties, including superior
conductivity, high flexibility, and low toxicity that suggest promising pathways to the
construction of future soft robotic designs. Continuous and vigorous development in this
area can be expected and exciting results yielding many innovations should be anticipated.
Given the amount of experimental work already undertaken and in progress in this area, it
is easy to envision that LMs will ultimately demonstrate strong functionality and
performance ability as building blocks in soft robotics. Overall, it seems likely that the
biomedical and pharmaceutical, analytical, and environmental sectors will eventually
benefit significantly from the development of LM soft robots that perform useful tasks.
3. Advantages of AI driven Automation in research
Reproducibility
Current laboratory protocols contribute approximately $3 billion annually to the
problem of preclinical irreproducibility. Laboratory automation as well as new digital tools
and technologies can contribute to solutions. Automated systems such as extractors,
automated ELISA set-ups and other laboratory workstations can generate sample data with
maximum reproducibility, which in turn cuts the cost of repeating experiments.
Increased throughput or productivity with efficiency:
Automation saves time, energy and resources through high-throughput operation. It
can work tirelessly, and any number of samples can be analyzed without loss of efficiency.
This frees productive time for researchers to pursue new questions rather than repeat
non-reproducible experiments.
Improved quality or increased predictability of quality:
Automation can lessen the burden of inter- and intra-laboratory variation and bring
uniformity to the quality of research, which hastens the overall output of scientific
research. It can also foster need-based laboratory collaborations and cooperation matched
to expertise and laboratory infrastructure.
Overall growth in Research & Its output:
Automation can speed up protocols in the laboratories of accreditation and regulatory
agencies, allowing standards and approvals to be issued in comparatively shorter time,
driving overall growth in research and its output.
There are a few hurdles which can be considered limitations of automation in
research. In no case can a machine fully match or replace the capacity of a human being.
An automated system may also have a limited level of intelligence and is therefore more
susceptible to committing errors outside its immediate scope of knowledge (e.g., it is
typically unable to apply the rules of simple logic to general propositions). Security
threats and vulnerabilities are further limitations, along with the high initial cost of
laboratory infrastructure.
4. AI driven Automation in education
Automation technologies are becoming a significant and meaningful part of our
daily lives. Robotic process automation (or RPA) is an emerging form of clerical process
automation technology based on the notion of software robots or artificial intelligence (AI)
workers. How can RPA and AI make an impact on the education system?
The educational domain is a broad one, encompassing teachers, students, and other
beneficiaries. Automation technologies, such as RPA, and AI, can deliver challenges as
well as benefits to trainers, students, and academic administrators. Saya is a robot teacher
that led a 2009 science and technology class in Tokyo, Japan. Despite the advances in these
technologies, McKinsey suggests the replacement of teachers by automation is highly
improbable. In fact, most education jobs are low on the spectrum of replacement by
automation, with only 20% of education positions being fully automatable. Potential tasks
for automation include ones that are troublesome for many trainers such as scheduling,
keeping track of attendance, and even assignment grading. Automating these tasks will
benefit trainers, allowing them in the long run to spend less time on routine work and
more time with their students.
The education system consists of both academic and non-academic activities. Many
tasks in education are paper-based, and most of them are repetitive, ponderous and
time-consuming. The scope for RPA and AI in the education system is massive, especially
at the administrative level. A couple of areas where they can make an immense impact are
given below.
Course Registration, Shortlisting and Enrolment Process
Course registration and enrolment are primary and hectic tasks at the
administration level. Once students have applied for a course, a person must check and
manually process the application forms and shortlist candidates. This falls squarely
within the purview of automation: if a system is set up to check the eligibility criteria,
it can produce the list of shortlisted candidates directly, avoiding the laborious process
of cross-checking the records one by one.
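As a toy illustration of such an eligibility check (the field names and cut-offs here are hypothetical, not from any real admission system):

```python
def shortlist(applicants, min_percent=60.0, required_subject="Biology"):
    """Return applicants meeting illustrative eligibility criteria:
    a minimum aggregate percentage and a required subject."""
    return [a["name"] for a in applicants
            if a["percent"] >= min_percent
            and required_subject in a["subjects"]]

applicants = [
    {"name": "Asha",  "percent": 72.5, "subjects": ["Biology", "Chemistry"]},
    {"name": "Ravi",  "percent": 55.0, "subjects": ["Biology"]},
    {"name": "Meena", "percent": 81.0, "subjects": ["Physics"]},
]
print(shortlist(applicants))  # -> ['Asha']
```

Once the criteria are codified like this, every application is screened by the same rules, which is the consistency gain the text describes.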
Attendance marking
Attendance management is one of the daily hectic tasks in the educational system.
Physically capturing attendance, informing parents of a student's absence, ascertaining
each student's attendance percentage and segregating students by the percentage obtained
is a chaotic job. Moving away from the typical manual attendance register, most
institutions now seek the help of technology to meet their attendance requirements.
Instead of marking attendance manually, students are provided with active RFID cards,
which automate the attendance process as soon as they enter the premises. Coupling such
features with SMS gateways improves the usability of the application: parents are assured
of their children's presence in the institution, and teachers can generate routine reports
whenever required.
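The percentage calculation and segregation described above are straightforward to automate once card scans are logged; a minimal sketch (the identifiers and the 75 % cutoff are illustrative):

```python
def attendance_report(scans, total_classes, cutoff=75.0):
    """Compute per-student attendance %, and list students below an
    illustrative cutoff.

    `scans` maps student id -> number of days the RFID card was read."""
    percentages = {sid: 100.0 * days / total_classes
                   for sid, days in scans.items()}
    below = sorted(sid for sid, pct in percentages.items() if pct < cutoff)
    return percentages, below

pct, below = attendance_report({"S1": 45, "S2": 30}, total_classes=50)
print(pct["S1"], below)  # -> 90.0 ['S2']
```

The `below` list is exactly what an SMS gateway would consume to notify parents automatically.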
Cognitive Learning
Cognitive psychology is the study of the mind and how we think, and its applications
can benefit the e-learning segment in particular. Cognitively integrated applications
capture expressions and emotions from students' faces, run them against the numerous
patterns built into the application, and provide instructions to the trainer. This can give
insight into which learning methods do and do not work for the targeted students, making a
substantial impact on learning outcomes. As AI and RPA increasingly take hold in the
education system, they will take it to higher levels and give the education domain a new
face. Interaction time between student and trainer will therefore increase, and the
student-trainer bond will become stronger than it is now.
5. Conclusion:
Robotics, biotechnology and automation significantly advance the life sciences
including research areas in veterinary sciences, e.g., via academic, industrial, diagnostics
and pharmaceutical liquid-handling robots and open source approaches. Consequently,
formal and informal education must convey these concepts. The Next Generation Science
Standards (NGSS) and other national initiatives promote cross-disciplinary approaches for
science, technology, engineering, and math (STEM) learning. Automation in research has
definite advantages like reproducibility, increased throughput or productivity with
efficiency, improved quality or increased predictability of quality and overall growth in
research & its output. There are a few hurdles which can be considered limitations of
automation in research. In no case can a machine match or replace the capacity of a human
being in totality. In education, robotic process automation (RPA) is an emerging form of
clerical process automation technology based on the notion of software robots or artificial
intelligence (AI) workers. Education system consists of both academic and non-academic
activities. In the field of education, various tasks are paper-based tasks and most of them
are repetitive, ponderous and time-consuming. The scope of RPA and AI is massive in
Education system, especially at the administrative level.
Reference:
1. Chapman T. Lab automation and robotics: Automation on the move. Nature. 2003;
421: 661–666. https://doi.org/10.1038/421661a PMID: 12571603
2. Kong F, Yuan L, Zheng YF, Chen W. Automatic liquid handling for life science: a
critical review of the current state of the art. J Lab Autom. 2012; 17: 169–185.
https://doi.org/10.1177/2211068211435302 PMID: 22357568
3. Hossain Z, Bumbacher E, Chung AM, Kim H, Litton C, Pradhan S, et al. A Real-
time Interactive, Scalable Biology Cloud Experimentation Platform. Nat Biotech.
2016; 34.12: 1293–1298.
4. Soft Robotics: Transferring Theory to Application (Eds.: A. Verl, A. Albu-Schaeffer,
O. Brock, A. Raatz), Springer, New York, 2015.
5. Yue Yu and Eijiro Miyako. Recent Advances in Liquid Metal Manipulation toward Soft
Robotics and Biotechnologies. Chem. Eur. J. 2018; 24: 9456–9462.
******************************************************************
Automation in Chromatography: Application of HPLC in Veterinary Science
H.B. Patel1. U.D. Patel1, C.M. Modi1, B.S. Mathapati2, Shreesha Rao S.1,
Patel Utsav N.1
1Department of Vet. Pharmacology and Toxicology, 2Department of Vet. Microbiology,
College of Veterinary Science and A.H., Junagadh Agricultural University
1. Introduction
What is Liquid Chromatography?
Liquid chromatography is the science of separating chemical compounds; it is used to
identify and quantify target compounds in a mixture.
High Performance Liquid Chromatography (HPLC)
Figure 1: Instrumental diagram of HPLC workflow
Principle of HPLC:
HPLC is a column chromatography technique in which a cartridge or column is
packed with a sorbent (stationary phase) and a liquid (mobile phase) is passed through the
packed column. A dissolved sample (in a liquid) is injected into the flow path of the mobile
phase. The sample mixture separates into individual analyte bands as it passes through the
HPLC column, and a chromatogram is generated in which the individual analyte bands appear
as “peaks” that are then quantified. Separation occurs through differences in the relative
speed of each analyte in the mixture, arising from competition between the mobile phase and
the stationary phase for binding the compounds. An injected sample divides into bands as
shown below.
Figure 2: Separation of analytes in the form of different bands
Yellow is the earliest-eluting analyte band, moving at the fastest rate because it
“likes” the mobile phase. Blue, which has more affinity for the stationary-phase particles,
is well retained near the inlet and moves the slowest through the column.
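The quantification of peaks mentioned above ultimately reduces to integrating detector signal over time. A minimal sketch using the trapezoidal rule on synthetic data (the baseline is assumed to be already subtracted; this is not any particular vendor's algorithm):

```python
def peak_area(times, signal):
    """Trapezoidal integration of detector response vs. time.
    Assumes the baseline has already been subtracted."""
    area = 0.0
    for i in range(1, len(times)):
        dt = times[i] - times[i - 1]
        area += 0.5 * (signal[i] + signal[i - 1]) * dt
    return area

# synthetic triangular peak: rises to 10 mAU at 1.0 min, falls back by 2.0 min
times  = [0.0, 0.5, 1.0, 1.5, 2.0]
signal = [0.0, 5.0, 10.0, 5.0, 0.0]
print(peak_area(times, signal))  # -> 10.0
```

Calibration then maps the area to concentration by comparison with standards of known concentration, which is what the data system does automatically after each run.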
2. History of chromatography
The word chromatography derives from the Greek chroma (color) and graphein
(writing). In 1903, Dr. Mikhail Tswett, a Russian scientist, carried out an experiment with
a tall open glass column filled with sand-like particles; powdered plant extract was poured
into the column, and colored “bands” developed as the extract percolated down through the
column, separating the different compounds.
Figure 3: Glass column filled with sand-like particles and plant extract separated in the form
of bands
1903 Tswett - Plant pigments separated on chalk columns
1931 Lederer & Kuhn – Liquid chromatography of carotenoids
1938 TLC and ion exchange
1950 Reverse phase liquid chromatography
1952 Martin & Synge (Nobel Prize)
1959 Gel permeation
1965 liquid chromatography instrument (Waters)
3. Specific applications of HPLC in various areas of research
A. Pharmaceutical applications
Tablet dissolution studies of drug dosage forms.
Shelf-life determinations of pharmaceutical products
Identification of active ingredients of various formulations
Quality assurance and quality control of drugs
B. Environmental and human health protection
Detection of toxic compounds in drinking water
Identification of adulteration in various products
Bio-monitoring of pollutants (Wiklund et al., 2005).
C. Forensic science
Quantification of the drug in biological samples.
Identification of anabolic steroids and doping agents in serum, urine, sweat, and
hair
Forensic analysis of textile dyes (Bowden and Madsen, 1986).
Determination of narcotics and metabolites in blood
D. Clinical use
Quantification of ions in human urine
Analysis of antibiotics in blood plasma.
Estimation of bilirubin and biliverdin in plasma in cases of hepatic disorders.
Detection of endogenous neuropeptides in extracellular fluids of brain
(Abidi,1991).
E. Food and Flavor
Ensuring the quality of soft drink and drinking water.
Analysis of adulterations in milk products.
Sugar analysis in fruit juices.
Analysis of polycyclic compounds in vegetables.
Analysis of pesticide traces in agricultural crops (Christie et al., 1991).
4. Basic components of HPLC system
a) Solvent or mobile phase reservoir
b) High pressure pump
c) Injector
d) Column
e) Detector
f) Data recording and interpretation unit
a) Solvent or Mobile phase reservoir:
The solvents, buffers, or homogeneous solvent-buffer mixtures are stored in the
solvent reservoir and are drawn into the mixing chamber by the mechanical pump through
tubing fitted with inline filters to prevent entry of impurities into the mobile phase.
Reservoirs are mainly properly covered glass bottles.
b) High pressure pump:
Different types of pumps are used in high pressure liquid chromatography which
can be classified as direct gas pressure pump, pneumatic intensifier pump, reciprocating
pump and syringe pumps. The role of the pump is to propel (force) a liquid (the mobile
phase) through the column and detector at a specific flow rate, expressed in mL/min.
Normal flow rates in HPLC are 1-2 mL/min achieved in the pressure range of 2000 – 9000
psi but in UHPLC mode operating pressure can be as high as 15000-18000 psi. An ideal
pump should have the following characteristics:
Solvent compatibility and resistance to corrosion
Constant flow delivery independent of back pressure
Low dead volume for minimum problems on solvent changeover
The pump should be inert to solvents, buffer salts and solutes; pumps are therefore
made of stainless steel, titanium and other resistant materials.
The main types of pumps used in HPLC (or in LC) are the following:
Constant Pressure Pumps: Provide consistent continuous flow rate through the column
with the use of pressure from a gas cylinder.
Constant Flow Pumps:
a. Reciprocating piston pumps: Deliver solvents through reciprocating motion of a
piston in a hydraulic chamber. The main drawback of a reciprocating pump is that it
produces a pulsing flow. With a flow-sensitive detector, such as a micro-adsorption
detector, a pulse-damping system must be used, which reduces detector sensitivity.
b. Syringe type pumps: Suitable for small bore columns and constant flow rate is
delivered to column by a motorized screw arrangement.
Different types of pump operation (solvent delivery systems):
1. Isocratic (iso = same)
Solvent composition stays the same for the entire run, e.g., 60:40 (alcohol:water);
constant mobile-phase composition.
2. Gradient
Solvent composition changes throughout the run with gradual Change or step Change;
variable mobile phase composition
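The two delivery modes can be sketched as data plus a small interpolation routine. A minimal illustration in Python; the function and the program tables are hypothetical, not part of any instrument API. An isocratic program holds a single composition, while a linear gradient interpolates %B between breakpoints.

```python
def gradient_composition(t, program):
    """Return the %B of the mobile phase at time t (minutes).

    program: sorted list of (time_min, percent_B) breakpoints.
    Between breakpoints the composition changes linearly; an
    isocratic run is simply a program with one constant %B.
    """
    if t <= program[0][0]:
        return program[0][1]
    for (t0, b0), (t1, b1) in zip(program, program[1:]):
        if t0 <= t <= t1:
            # linear interpolation between the two breakpoints
            return b0 + (b1 - b0) * (t - t0) / (t1 - t0)
    return program[-1][1]

# Isocratic: 40 % B for the whole run (e.g. a 60:40 water:alcohol mix)
isocratic = [(0.0, 40.0), (20.0, 40.0)]
# Linear gradient: 5 % B rising to 95 % B over 20 minutes
gradient = [(0.0, 5.0), (20.0, 95.0)]
```

A step gradient can be represented in the same way by placing two breakpoints with different %B values at (nearly) the same time.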
c) Injector:
The injector introduces the sample, in liquid or gaseous form, into the flowing mobile
phase at the head of the column. There are two types of injectors: a manual injection
valve (the Rheodyne injector) and a fully automated, electronically controlled device
(autoinjector/autosampler). With a manual injector, a sample loop is loaded by means of
a glass or plastic syringe while the mobile phase is pumped into the column. An
autoinjector automatically controls the number of injections, the injection volumes and
the time gap between injections, which ensures reproducibility of injections; large
numbers of samples can be injected at set intervals, eliminating the manual effort of
injecting after every run.
d) Columns:
Figure 4: Liquid Chromatography columns with different lengths and diameters.
The column is the heart of a high pressure liquid chromatography system which
separates the sample components of interest using various physical and chemical
parameters. HPLC columns are packed with either polymer-based or silica-based
materials. Silica-based packings are the most popular and most often used. The chemical
nature of these packings varies with the stationary phase; the average particle
diameter is between 3 and 10 µm with a narrow size distribution. HPLC column packings
should be based on spherical silica particles with good mechanical strength and a
narrow size distribution. Columns of smaller particles permit faster separations than
columns of larger particle size. Small molecules are usually chromatographed on
particles of 5-12 nm pore size, while large molecules are usually separated on
particles with a pore size of 12-30 nm; proteins, for example, are usually separated on
30 nm pore-size particles.
The silica surface contains various kinds of SiOH (silanol) groups, and free silanols
are undesirable for the separation of basic molecules. Adding an appropriate base such
as triethylamine can eliminate secondary interactions between basic compounds and
acidic silanols. The stability of the bonded phase is important for HPLC
reproducibility. Long-chain alkyl bonded-phase packings (e.g. C8 or C18) are generally
more stable than polar bonded phases (e.g. diols). End-capping, a subsequent reaction
with trimethylchlorosilane or hexamethyldisilazane, is often used to silanize the
packing more completely, increasing surface coverage and minimizing unwanted
interactions with free silanols. Reversed-phase separations are normally carried out on
C3, C4, C8, C18, phenyl, phenylethyl, cyano, amino or polystyrene packings, while
normal-phase separations use cyano, diol, amino or bare silica packings. Column
specifications should state particle size, length, internal diameter, etc. Typical
columns are 3-25 cm long, and the internal diameter of most analytical columns is
0.4-0.5 cm. Micro-bore columns (0.1-0.2 cm i.d.) are used for interfacing with
detectors such as mass spectrometers.
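Column geometry translates directly into the volumes that matter in practice (for example, a later rule of thumb in this chapter limits injection volume to a fraction of the column volume). A minimal sketch, assuming a total porosity of about 0.65, which is a common rule-of-thumb value for packed silica columns, not a figure from this text:

```python
import math

def column_volumes(length_cm, internal_diameter_cm, total_porosity=0.65):
    """Estimate the geometric and void volume (mL) of a packed column.

    total_porosity ~0.65 is an assumed rule-of-thumb value; the true
    value depends on the packing material.
    """
    # cylinder volume in cm^3, which equals mL
    geometric = math.pi * (internal_diameter_cm / 2) ** 2 * length_cm
    return geometric, geometric * total_porosity

# A typical 25 cm x 0.46 cm i.d. analytical column
geo, void = column_volumes(25.0, 0.46)  # ~4.15 mL geometric, ~2.7 mL void
```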
e) Detector:
UV/Vis detector
Fluorescence detector
Refractive index detector
Electrochemical/conductivity detector
Mass spectrometer (MS)
Detectors equipped with flow-through cells were a major breakthrough in the development
of modern liquid chromatography. Current LC detectors have a wide dynamic range and
high sensitivity, often allowing the detection of nanogram quantities of material. In
the last decade there has been significant progress in the development of LC-MS
interfacing systems. As an on-line HPLC detector, MS is the most sensitive, the most
selective and at the same time the most universal detector, but it is also the most
expensive.
UV detectors:
The most widely used detector in HPLC is the UV-absorption detector
(spectrophotometer), which measures the change in UV absorption as the solution passes
through a flow cell. UV detectors are concentration sensitive and have the advantage
that they do not destroy the solute. UV detection can utilize the fixed emission line
of a mercury lamp (254 nm) to detect molecules with some absorption at this wavelength.
Alternatively, the continuous emission of a deuterium lamp can be used in conjunction
with a monochromator to provide a variable-wavelength detector. The variable-wavelength
facility is extremely useful for improving sensitivity in difficult analyses, since
solutes can be monitored at their wavelength of maximum absorption. Not all molecules
possess a sufficiently strong UV chromophore for satisfactory UV absorption; bile
acids, lipids and sugars are examples of such compounds. Today such molecules are
detected after derivatisation, in which a chromophoric group is attached to the solute
at either the pre-column or post-column stage.
Fluorescence detector:
It is extremely sensitive and works on the principle that the fluorescent energy
emitted at a longer wavelength, known as the emission wavelength, is proportional to
concentration. The solute must first be excited at the excitation wavelength, which is
always shorter than the emission wavelength. If the solute does not fluoresce itself,
it can be reacted with a fluorescent derivatizing agent.
Refractive index (RI) Detector:
It functions by measuring the change in refractive index as the eluent passes through
the flow cell. The sample and reference cells should be thermostated if better
sensitivity is required. The RI detector is considerably less sensitive than the
UV-visible detector, and variations in ambient temperature produce baseline
fluctuations.
Electrochemical detectors:
Electrochemical detectors are very sensitive, but their use is restricted to solutes
that can be oxidized or reduced. The system works on the principle of polarography. A
serious drawback of this detector is that the electrodes become contaminated and
poisoned by adsorption of the oxidized or reduced compounds. To partially overcome this
defect, a moving electrode has been used in HPLC to detect solutes containing halogens:
the column effluent is volatilized in an oven purged with nitrogen gas, and halogenated
solutes such as chlorinated insecticides, polychlorinated biphenyls and DDT in milk
can, in favorable cases, be detected down to very low concentration levels.
Mass spectrometer (MS)
The idea of an LC-MS combination, though expensive, is undoubtedly attractive. Various
methods of stream splitting, transport and solute ionization have been reported; a
limitation of this detector is that non-volatile buffer solutions cannot be used. (In
GC, by contrast, the column effluent is vaporized and the solute pyrolysed in a
hydrogen/air flame before detection by a flame ionization detector.)
Various other detectors, though not commercially popular, have also been described:
microwave plasma, infrared, flame photometric, phosphorus/sulfur and dual
UV-fluorescence detectors, and radiometric detection systems.
f) Data Recording and Interpretation:
Once detection is complete, the detector signal is converted into an electronic signal,
amplified, and recorded as a chromatogram in the form of data points. The chromatogram
is then interpreted, manually or by software, and converted into a presentable format
as required.
5. Common modes/basis of liquid chromatography separation
A. Partition chromatography (Solubility):
The stationary phase of this technique is a liquid supported on an inert solid; the
mobile phase is either a liquid (liquid-liquid chromatography) or a gas (gas-liquid
chromatography). Separation is based on differences in the compounds' solubility, which
in turn arise from differences in polarity: more polar substances have a higher
affinity for the stationary phase, while less polar substances partition into the
mobile phase. Paper chromatography belongs to this type.
B. Adsorption chromatography (Polarity):
The stationary phase of this technique is a solid material on which the sample
compounds are adsorbed; the mobile phase is either a liquid (solid-liquid
chromatography) or a gas (gas-solid chromatography). Adsorption is completely different
from absorption: molecules attach to the surface but do not penetrate into the bulk of
the material. Adsorption chromatography is based on the interaction between solute
molecules and active sites on the stationary phase, and the strength of this
interaction depends on the polarity of the solutes, illustrating the rule that "polar
prefers polar".
Based on the retention mechanism of the analytes during separation:
a) Normal Phase
b) Reversed Phase
a) Normal-phase chromatography
Stationary phase is Polar (hydrophilic) (e.g. silica)
Mobile phase is Non-Polar (hydrophobic) (e.g. hexane)
Polar analytes
Polar analytes are more attracted to the polar stationary phase and less attracted to
the non-polar mobile phase, so they are retained more on a normal-phase column.
Non-Polar analytes
Non-polar analytes are more attracted to the non-polar mobile phase and less attracted
to the polar stationary phase, so they are retained less on a normal-phase column.
Ultimately, the non-polar analytes elute before the polar analytes and are detected
earlier on the chromatogram.
b) Reversed -phase chromatography
Stationary Phase is Non-polar (hydrophobic) (e.g. C18)
Mobile Phase is Polar (hydrophilic) (e.g. H2O)
Non-Polar Analytes
Non-polar analytes are more attracted to the non-polar stationary phase and less
attracted to the polar mobile phase, hence more retention on a reversed-phase column.
Polar Analytes
Polar analytes are more attracted to the polar mobile phase and less attracted to the
non-polar stationary phase, hence less retention on a reversed-phase column.
Ultimately, the polar analytes elute before the non-polar analytes and are detected
earlier on the chromatogram.
Figure 5: Polarity nature of different mobile phases used in liquid chromatography
Figure 6: Polarity nature of different stationary phases used in liquid chromatography
Analyte attraction to the mobile phase and the stationary phase is based on different
polarities of these two phases. Analytes attracted to the mobile phase will be moving faster,
while the analytes attracted to the stationary phase/packing material will slow down.
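The polarity rules above can be caricatured with a crude polarity score. This is a deliberately simplified sketch: the function and the scores below are invented for illustration, and real retention depends on far more than a single polarity number.

```python
def elution_order(polarity, mode):
    """Predict relative elution order from a crude polarity score.

    polarity: dict mapping analyte name -> polarity score
    (higher = more polar).  In reversed phase the most polar
    analytes elute first; in normal phase the least polar do.
    """
    most_polar_first = (mode == "reversed")
    return sorted(polarity, key=polarity.get, reverse=most_polar_first)

# Made-up polarity scores for illustration only
scores = {"caffeine": 8, "phenol": 5, "toluene": 1}
```

With these scores, `elution_order(scores, "reversed")` lists caffeine first, while `elution_order(scores, "normal")` lists toluene first, matching the elution orders described in the text.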
C. Ion exchange chromatography (Charge):
a) Anion Exchange (SAX, WAX)
b) Cation Exchange (SCX, WCX)
Ionic compounds are often better separated by ion exchange chromatography
(IEC). In this case, the stationary phase consists of acidic or basic functional groups bonded
to the surface of a polymer matrix (resin or silica gel). Charged species in the mobile phase
are attracted to appropriate functional groups on the ion exchanger and thereby separated.
Ion pairing chromatography is an alternative to ion exchange chromatography. Mixtures of
acids, bases and neutral substances are often difficult to separate by ion exchange
techniques. In these cases ion pairing chromatography is applied. The stationary phases
used are the same reversed phases as developed for reversed phase chromatography. An
ionic organic compound, which forms an ion pair with a sample component of opposite
charge, is added to the mobile phase. This ion-pair is, chemically speaking, a salt which
behaves chromatographically like a non-ionic organic molecule that can be separated by
reversed phase chromatography.
D. Size Exclusion Chromatography (Molecular size):
a) Size Exclusion Chromatography (SEC)
b) Gel Permeation Chromatography (GPC)
Size exclusion chromatography (SEC) or gel permeation chromatography (GPC)
uses as the stationary phase a porous matrix which is permeated by mobile phase
molecules. Sample molecules small enough to enter the pore structure are retarded, while
larger molecules are excluded and therefore rapidly carried through the column. Thus size
exclusion chromatography means separation of molecules by size.
E. Affinity chromatography (Biological activity):
The stationary phase of this chromatographic technique is a gel matrix carrying an
immobilized ligand, and the mobile phase is a buffered solution. The immobilized ligand
binds only those compounds in the mobile phase whose shape and chemistry match it, much
as a key fits a lock. Matching compounds bind strongly to the stationary phase and take
longer to elute, while all other compounds elute first; the interaction is thus between
one type of solute molecule in the test sample and the immobilized substance in the
stationary phase, and it is specific but reversible. Affinity chromatography is mainly
used in antibody-based assays: the test sample contains a mixture of proteins, and the
immobilized molecule is an antibody to one specific protein, so only that protein binds
in the stationary phase. The bound protein is then recovered by changing the pH of the
mobile phase, which weakens the binding interaction between the two substances.
Ultra Performance Liquid Chromatography (UPLC):
In 2004, further advances in instrumentation and column technology resulted in
significant increases in resolution, speed and sensitivity in liquid chromatography.
Columns with smaller particles (1.7 µm) and instrumentation designed to deliver mobile
phase at up to 15,000 psi (about 1,000 bar) achieve a new level of performance. A new
system had to be designed holistically to perform ultra-performance liquid
chromatography, now known as UPLC technology. Compared with a conventional HPLC column
of the same length, packing chemistry and mobile phase, a UPLC column packed with
particles roughly a third of the size gives a better separation in the same run time;
the back pressure, however, increases sharply.
Figure 7: Comparative particle size of HPLC and UPLC columns and sharpness of peak
separation
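The back-pressure penalty of smaller particles can be estimated from Darcy's law, under which the pressure drop across a packed bed scales roughly as 1/dp² at equal linear velocity and column length. This scaling argument is a standard simplification, not a statement from this text.

```python
def relative_back_pressure(dp_ref_um, dp_new_um):
    """Relative back pressure when only the particle diameter changes.

    By Darcy's law the pressure drop scales as ~1/dp^2, so halving
    the particle size roughly quadruples the back pressure.
    """
    return (dp_ref_um / dp_new_um) ** 2

# Moving from a 5 um HPLC packing to a 1.7 um UPLC packing
factor = relative_back_pressure(5.0, 1.7)  # about 8.7 times the pressure
```

This is why UPLC hardware must be rated for 15,000 psi and beyond, while conventional HPLC systems operate comfortably below 6,000 psi.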
6. HPLC parameters
The result of the chromatographic separation is the chromatogram, which is the
function of the detector response versus the time which shows some important parameters.
Figure 8: HPLC separation of a multi-component mixture
Figure 9: The chromatogram illustrates the most important parameters which characterize
a separation.
Peak widths:
W1/2 = peak width at half height
W = band width of the peak (intersection point of the inflectional tangents with the zero
line)
Peak symmetry is measured at 10 % of peak height with the parameters:
A = peak front at 10 % of peak height to peak maximum
B = peak maximum to peak end at 10 % of peak height
Retention times
t0 = dead time of a column = retention time of an unretarded substance
tR1, tR2…..= retention times of components 1, 2, . . .
t’R1, t’R2…= net retention times of components 1, 2 . .
Retention:
In an elution chromatographic separation substances differ from each other only in
their residence time in or at the stationary phase. From this the following time definitions
arise:
The total retention time (tR1 or tR2):
It is the time, which is needed by a sample component to migrate from column inlet
(sample injection) to the column end (detector).
The dead time (t0):
It is the time required by an inert compound to migrate from column inlet to column
end without any retardation by the stationary phase. Consequently, the dead time is
identical with the residence time of the sample compound in the mobile phase.
The net retention time (t’R1 or t’R2):
It is the difference between total retention time and dead time. That is the time the
sample component remains in the stationary phase.
The capacity factor (k’):
It is a measure of the position of a sample peak in the chromatogram. It is specific
for a given substance. k’ depends on the stationary phase, the mobile phase, the
temperature, quality of the packing etc.
k' = t'R / t0 = (tR - t0) / t0
so that, for components 1 and 2, k'1 = (tR1 - t0)/t0 and k'2 = (tR2 - t0)/t0.
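The capacity factor can be computed directly from measured retention times and the dead time. A minimal sketch; the retention times below are illustrative values, not measurements from this text.

```python
def capacity_factor(t_r, t0):
    """k' = (tR - t0) / t0: net retention time relative to dead time."""
    return (t_r - t0) / t0

# Illustrative retention times (minutes), with a dead time of 1.0 min
k1 = capacity_factor(4.0, 1.0)  # k'1 = 3.0
k2 = capacity_factor(6.0, 1.0)  # k'2 = 5.0
```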
The relative retention (α):
Also known as the separation factor, it is the ratio of two capacity factors, with the
reference compound in the denominator: α = k'2/k'1. The relative retention describes
the ability of a given pair of stationary and mobile phases to discriminate between two
compounds. It is independent of column construction (length, quality of packing) and
flow velocity, but depends on the temperature and the properties of the mobile and
stationary phases; impurities in the mobile phase (e.g. water content) strongly
influence it. Instead of the mobile phase volumetric flow rate F (mL/min), it is often
advantageous to use the linear velocity u (cm/s). The linear velocity is independent of
the column cross-section and proportional to the pressure drop in the column; it can be
calculated from the dead time as u = L/t0, where L is the column length in cm and t0
the dead time in seconds.
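The relative retention and the linear velocity follow directly from these definitions. A minimal sketch with illustrative values (a k' pair of 3.0 and 5.0, and a 25 cm column with a 60 s dead time):

```python
def relative_retention(k_first, k_second):
    """alpha = k'2 / k'1, with the earlier-eluting (reference)
    compound in the denominator, so that alpha >= 1 by convention."""
    return k_second / k_first

def linear_velocity(length_cm, t0_sec):
    """u = L / t0 (cm/s), from column length and dead time."""
    return length_cm / t0_sec

alpha = relative_retention(3.0, 5.0)   # ~1.67
u = linear_velocity(25.0, 60.0)        # ~0.42 cm/s
```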
The permeability (K) of a column:
It describes the column's transmittance for the mobile phase and characterizes its
hydraulic resistance. The permeability of a column depends on the mobile phase,
temperature, column length and pressure. A change in permeability indicates a change in
the packing (e.g. swelling of ion exchangers, silica gel, etc.).
The number of theoretical plates (n):
It characterizes the quality of a column packing and the mass-transfer behaviour. The
larger n is, the more complex the sample mixtures that can be separated on the column.
The column plate number N is the single most important characteristic of a column: it
defines the ability of the column to produce sharp, narrow peaks and to achieve good
resolution for band pairs with small α values. N depends on the particle size and is
usually expressed per metre of column length; N for a 25 cm column should be
approximately 10,000. The shape of the peaks produced by the column is equally
important: the peak symmetry should be between 0.9 and 1.1. Retention-time
reproducibility is a further criterion of a good column.
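One common way to estimate N from a chromatogram is the half-height formula N = 5.54·(tR/W1/2)², which is standard practice though not spelled out in the text. A minimal sketch, with illustrative numbers chosen to land near the ~10,000 plates expected of a 25 cm column:

```python
def plate_number(t_r, w_half):
    """Half-height formula: N = 5.54 * (tR / W1/2)^2."""
    return 5.54 * (t_r / w_half) ** 2

def hetp(length_cm, n_plates):
    """h = L / N: height equivalent of a theoretical plate (cm)."""
    return length_cm / n_plates

# Illustrative: tR = 10 min, width at half height = 0.236 min, 25 cm column
n = plate_number(10.0, 0.236)  # close to 10,000 plates
h = hetp(25.0, n)              # plate height in cm; smaller is better
```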
The height equivalent of a theoretical plate (h):
HETP is the length of column in which the chromatographic equilibrium between mobile
and stationary phase is established. Since a large number of theoretical plates is
desired, h should be as small as possible. There are, of course, no real plates in a
chromatographic column, since the packing is homogeneous; the value of h is simply a
criterion of column quality. h depends on the particle size, the flow velocity, the
mobile phase (viscosity) and especially on the quality of the packing. For practical
reasons, peak symmetry is measured at 10 % of peak height, where A is the distance from
the peak front to the peak maximum and B the distance from the peak maximum to the peak
end. Ideally the symmetry should be 1, i.e. A = B. Values below 1 indicate fronting;
values above 1 indicate peak tailing.
Figure 10: Shape of ideal peak showing proper symmetry and good separation
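The symmetry rule (A = B ideally, with an acceptance window of 0.9-1.1) can be turned into a small check. A minimal sketch; the function names and example distances are illustrative:

```python
def peak_symmetry(a, b):
    """Symmetry = B / A, both measured at 10 % of peak height
    (A: peak front to maximum, B: maximum to peak end)."""
    return b / a

def classify_symmetry(s):
    """Apply the 0.9-1.1 acceptance window used for good columns."""
    if s < 0.9:
        return "fronting"
    if s > 1.1:
        return "tailing"
    return "acceptable"
```

For example, `classify_symmetry(peak_symmetry(1.0, 1.5))` reports "tailing", the most common silanol-related peak-shape problem on silica columns.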
7. HPLC troubleshooting
The following are common problems in HPLC and their possible solutions:
Uncommon peak shapes
Lack of sensitivity
Poor sample recovery
Pressure problems
Baseline problems
Leaks
Changing retention times
1. Uncommon peak shapes
A. Broad peaks
a) Early-eluting analyte, column overload: Dilute the sample 1:10 and repeat the separation.
b) Injection volume too large: Inject smaller volumes, or reduce the solvent strength of the injection solvent to focus the sample components.
c) Viscosity of mobile phase too high: Increase the column temperature or use a solvent of lower viscosity.
d) Retention times too long: Use gradient elution, or a stronger mobile phase for isocratic elution.
e) Poor column efficiency: Use mobile phases of lower viscosity, elevated column temperature, a lower flow rate or a packing with smaller particle size.
f) Peak broadening in the injection valve: Decrease the size of the sample loop, or introduce an air bubble in front of and behind the sample in the loop.
g) Extra-column volume of the LC system too large: Use zero-dead-volume fittings and connectors; use the smallest possible tubing diameter (<0.25 mm) and matched fittings.
h) Volume of detector cell too large: Use the smallest cell volume compatible with the required sensitivity; use a detector without a heat exchanger.
i) Detector time constant too slow: Adjust the time constant to the peak width.
j) Sampling rate of the data system too low: Increase the sampling rate.
k) Only some peaks broad (late elution of analytes from a previous run): Flush the column with a strong eluent after each run, or end the gradient at a higher concentration.
B. Peak fronting
a) Column overload: Decrease the sample amount; increase the column diameter; use a stationary phase with higher capacity.
b) Formation of channels in the column: Replace the column or have it repacked.
C. Peak tailing
a) Basic analytes: interactions with silanol groups: Use silica-based, base-deactivated RP phases; use a competing base such as triethylamine; use a stronger mobile phase; switch to polymer-based columns.
b) Sample components which can form chelates: metal traces in the packing: Use only high-purity silica packings with very low metal-ion content; add EDTA or another chelating compound to the mobile phase; switch to polymer columns.
c) Silica-based column: silanol interactions: Decrease the pH of the mobile phase to suppress ionization of the silanol groups; increase the buffer concentration; derivatize the sample to avoid polar interactions.
d) Silica-based column: degradation at high pH values: Use RP columns with good surface shielding, polymer columns or sterically protected phases.
e) Silica-based column: degradation at high temperatures: Use temperatures below 50 °C.
f) Dead volume at the column head: Rotate the injection valve quickly; use an injection valve with a pressure bypass; avoid pressure pulses. Replace the deteriorated column or, if possible, open the upper end fitting and fill the void with column packing or silanised glass-fibre wadding.
g) Unswept dead volume: Minimise the number of connections; ensure that the rotor seal and all fittings are tight.
D. Peak doubling
a) Simultaneous elution of an interfering substance: Use sample clean-up or fractionation prior to injection (e.g. solid-phase extraction), or improve selectivity by choosing another mobile or stationary phase.
b) Simultaneous late elution of a substance from a previous run: Flush the column with a strong eluent after each run, or end the gradient at a higher concentration.
c) Column overload: Decrease the sample amount; increase the column diameter; use a stationary phase with higher capacity.
d) Injection solvent too strong: Use a weaker solvent for the sample, or a stronger mobile phase.
e) Sample volume too large: If the sample is dissolved in the mobile phase, the injection volume should be smaller than one-sixth of the column volume.
f) Dead volume or formation of channels in the column: Replace the column or, if possible, open the upper end fitting and fill the void with the same packing; have the column repacked.
g) Plugged frit: Install a 0.5 µm in-line filter between pump and injector to remove solids from the mobile phase, or between injector and column to filter particulate matter from the sample; if possible, clean or replace the plugged frit.
h) Unswept volume in the injector: Replace the rotor of the injection valve.
E. Negative peaks
a) RI detector: refractive index of the analyte lower than that of the mobile phase: Reverse the detector polarity to obtain positive peaks.
b) UV detector: absorption of the analyte lower than that of the mobile phase: Use a mobile phase with lower UV absorption; if recycling solvent, switch to fresh HPLC-grade eluent when the recycled mobile phase starts to affect detection.
2. Pressure problems:
A. High back pressure
a) Viscosity of mobile phase too high: Use a solvent of lower viscosity or increase the temperature.
b) Particle size of packing too small: Use a packing with a larger particle size (e.g. 7 µm instead of 5 µm).
c) Polymer-based columns: swelling of the adsorbent caused by eluent changes: Use only solvents compatible with the column; check the eluent composition against the instructions for use; use a column with a higher degree of cross-linking.
d) Salt precipitation (especially in reversed-phase chromatography with high proportions of organic solvent in the mobile phase): Ensure that the solvent composition is compatible with the buffer concentration; reduce the ionic strength and the organic:aqueous ratio of the mobile phase; premix the mobile phase.
e) Contamination at the column inlet: Improve sample clean-up; use guard columns; back-flush the column with a strong solvent to dissolve the impurity.
f) Microbial growth in the column: Use a mobile phase with at least 10 % organic solvent; prepare fresh buffer daily; add 0.02 % sodium azide to aqueous mobile phases; for storage, equilibrate the column with at least 25 % organic solvent and without buffer.
g) Plugged frit in the in-line filter or guard column: Replace the frit or guard column.
h) Plugged frit at the column inlet: Replace the end fitting or the frit.
i) Injector still plugged when disconnected from the column: Clean the injector or replace the rotor.
B. Pressure fluctuations
a) Air bubbles in the pump: Degas the solvent; sparge the solvent with helium.
b) Leak in the liquid lines between pump and column: Tighten all fittings; replace defective fittings; tighten the rotor in the injection valve.
C. Increasing pressure
a) Accumulation of solids at the column head: Filter the sample and mobile phase; use a 0.5 µm in-line filter; disconnect the contaminated column and clean it by back-flushing; replace plugged inlet frits; replace the guard column.
b) Precipitation of buffer components in aqueous/organic solvent systems: Ensure that the solvent composition is compatible with the buffer concentration; reduce the ionic strength and the organic:aqueous ratio of the mobile phase.
c) Plugged liquid lines: Systematically disconnect system components from the detector end back to the blockage; clean or replace the plugged component.
D. Decreasing pressure
a) Insufficient flow from the pump: Loosen the cap on the mobile phase reservoir.
b) Leak in the liquid lines between pump and column: Tighten all fittings; replace defective fittings; tighten the rotor in the injection valve.
c) Leaking pump check valve or seals: Clean the check valve; replace defective check valves or seals.
d) Air bubbles in the pump: Degas all solvents; check for blockage between solvent reservoir and pump; if necessary, replace the frit in the inlet line.
3. Baseline problems
A. Baseline drifting to lower absorption
a) With gradient elution: UV absorption of mobile phase A: Use non-UV-absorbing, HPLC-grade solvents for the mobile phases; if a UV-absorbing solvent is unavoidable, add a UV-absorbing additive to mobile phase B.
B. Baseline drifting to higher absorption
a) Accumulation and elution of impurities: Use sample clean-up or fractionation prior to injection; use only HPLC-grade solvents; clean the contaminated column with a strong solvent.
b) With gradient elution: UV absorption of mobile phase B: Use a higher detection wavelength; use non-UV-absorbing, HPLC-grade solvents for the mobile phases; if a UV-absorbing solvent is unavoidable, add a UV-absorbing additive to mobile phase A.
C. Undulating baseline
a) Temperature changes in the room: Monitor or avoid changes in room temperature; insulate the column or use a column oven; cover the RI detector to protect it from air currents.
D. Baseline noise
a) Continuous: detector lamp problem or dirty detector cell: Replace the UV lamp or clean the detector cell.
b) Periodic: pump pulses: Repair or replace the pulse damper; purge any air from the pump; clean or replace the check valves.
c) Random: accumulation of impurities: Use sample clean-up or fractionation prior to injection; use only HPLC-grade solvents; back-flush the contaminated column with a strong solvent.
d) Spikes: air bubble in the detector: Degas the mobile phase; install a back-pressure restrictor at the detector outlet; ensure that all fittings are tight.
e) Spikes: column temperature higher than the boiling point of the solvent: Use a lower working temperature.
f) Occasional sharp spikes: external electrical interference: Use a voltage stabiliser for the LC system, or put the chromatography equipment on an independent electrical circuit.
4. Changing retention times
A. Decreasing retention times
a) Column overloaded with sample: Reduce the amount of sample or use a column with a larger diameter.
b) Increasing flow rate: Check and, if necessary, adjust the pump flow rate.
c) Active groups on the stationary phase: Use a mobile phase containing an organic modifier or a competing base; increase the buffer strength; use a packing with higher surface coverage.
d) Loss of bonded stationary phase: Replace the column; for silica adsorbents, use mobile phases between pH 2 and pH 8.
B. Increasing retention times
a) Changing mobile phase composition: Cover the solvent reservoirs; ensure that the gradient system supplies the proper composition; if possible, mix the mobile phase by hand.
b) Decreasing flow rate: Check and, if necessary, adjust the pump flow rate; check for pump cavitation; check for leaking pump seals and other leaks in the system.
c) Loss of bonded stationary phase: For silica adsorbents, use mobile phases between pH 2 and pH 8.
C. Fluctuating retention times
a) Only during the first few injections: active groups: Condition the column with concentrated sample.
b) Insufficient buffer capacity: Use buffer concentrations above 20 mM.
c) Insufficient mixing of the mobile phase: Ensure that the gradient system supplies a mobile phase of constant composition; compare with manually mixed eluents; use partially premixed mobile phases.
d) Selective evaporation of one component of the mobile phase: Cover the mobile phase reservoirs; avoid vigorous flushing with helium; prepare fresh mobile phase.
e) Accumulation of impurities: Flush the column occasionally with a strong solvent; replace the guard column more frequently.
f) Fluctuating column temperature: Ensure that the room temperature is constant; if necessary, thermostat or insulate the column.
Conclusion:
HPLC is a versatile, reproducible chromatographic technique used for the estimation of
various drug products. It has wide applications in many fields for the quantitative and
qualitative estimation of active molecules. Its operation, however, requires
considerable care and attention to achieve proper separation of the desired compounds
from a mixture.
************************************************************************
Application of HPTLC in ethnopharmacology
C.M. Modi, H.B. Patel and U.D. Patel, Makawana C.N., Khadayata A.V.
Department of Vet. Pharmacology and Toxicology, College of Veterinary Science and
A.H., Junagadh Agricultural University, Junagadh
Introduction
Over the past decades, the use of herbal medicine has increased exponentially (Zlatkis et al., 1977; Sethi, 1996). According to the World Health Organization, developing countries depend essentially on plants for their primary health care needs owing to poverty and lack of access to modern medicine (Neumann and Margot, 2009). The resurgence of herbal medicines has enormously increased international trade; herbal medical databases indicate that the markets in Asia had reached $2.3 billion (Sherma, 2010). Pharmaceutical companies have shown renewed interest in exploring plants as a major source of new lead structures and in developing standardized phytotherapeutic compounds with promising safety, efficacy and quality (Butler, 2008; Gershell and Atkins, 2003).
Furthermore, hazardous side effects such as hypersensitivity reactions, effects from adulterants, and interactions with herbal drugs have been reported, prompting many regulatory agencies to standardize plant-based drugs (Sweedler, 2002). The World Health Organization has developed specific guidelines to initiate national policies on plant-based drugs and to study their safety, efficacy and quality as a prerequisite for global harmonization (Wenlock et al., 2003). The validation of plant-based drugs and the recognition of adulterants among authentic curative herbs are important for both the pharmaceutical industry and public health (Kertesz et al., 2005).
HPTLC-AN OVERVIEW
Technological advances in the isolation, purification and structural elucidation of natural compounds have made it possible to devise appropriate strategies for the quality analysis and standardization of plant-based medicines (Gershell and Atkins, 2003). A variety of sophisticated methods, including spectrophotometry, chromatography, polarography, electrophoresis, and the use of molecular biomarkers in fingerprints, are presently employed in the standardization of plant-based medicines. The term "thin-layer chromatography" was introduced by E. Stahl in 1956. High-performance thin-layer chromatography (HPTLC) is an enhanced form of thin-layer chromatography (TLC) and a state-of-the-art technique for generating and evaluating digital chromatogram images.
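Zone positions in TLC/HPTLC are conventionally expressed as retardation factors (Rf), the ratio of a zone's migration distance to that of the solvent front, and the digital evaluation mentioned above ultimately rests on such values. A minimal sketch (the distances are illustrative, not measured values):

```python
def rf_value(zone_distance_mm: float, solvent_front_mm: float) -> float:
    """Retardation factor: zone migration distance / solvent-front distance."""
    if not 0 <= zone_distance_mm <= solvent_front_mm:
        raise ValueError("zone must lie between the origin and the solvent front")
    return zone_distance_mm / solvent_front_mm

# Example: a zone that migrated 32 mm on a plate with a 70 mm solvent front
print(round(rf_value(32.0, 70.0), 2))  # 0.46
```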
High-performance thin-layer chromatography (HPTLC) fingerprint profiles are used for confirming the identity of constituents, identifying and determining impurities, quantitatively determining active substances, and assessing the transparency and potency of herbal formulations (ICH, 2005). The use of modern apparatus such as video scanners, densitometers, and new chromatographic chambers, more effective elution techniques, high-resolution sorbents with selected particle size or chemically modified surfaces, the possibility of combination with other instrumental methods, and the development of computer programs for method optimization all make HPTLC an important alternative to HPLC or gas chromatography. HPTLC is an ideal TLC technique for evaluating the stability and consistency of polyherbal preparations from different manufacturers (Sudberg et al., 2010). HPTLC fingerprinting is mostly used for evaluating compounds of low or moderate polarity and for controlling the intrinsic quality of herbal drugs (Arup et al., 1993).
Thin-layer chromatography studies are the key identity test in most pharmacopoeial monographs. Pharmacopoeial standards are typically used by industry as a basis for meeting QC requirements and current good manufacturing practices (cGMPs). HPTLC is a robust, simple, rapid, and efficient tool for the quantitative analysis of compounds. It is an analytical technique based on TLC, intended to increase the resolution of the compounds to be separated and to allow their quantitative analysis. The separation can be further improved by repeated development of the plate using a multiple development device. As a consequence, HPTLC offers better resolution and lower limits of detection (LODs). The use of HPTLC is well appreciated and accepted all over the world, and many methods are being established to standardize assay procedures. HPTLC remains one step ahead of other chromatographic tools (Sudberg et al., 2010).
TLC IS USED IF…
- The substances are nonvolatile or of low volatility
- The substances are strongly polar, medium polar, nonpolar or ionic
- A large number of samples must be analyzed simultaneously, cost-effectively, and within a limited period of time
- The samples to be analyzed would damage or destroy the columns of LC (liquid chromatography) or GC (gas chromatography)
- The substances in the material being analyzed cannot be detected by the methods of LC or GC, or only with great difficulty after the chromatography, and all the components of the sample have to be detectable (remain at the start or migrate with the front)
- The components of a mixture of substances have to be detected individually after separation or have to be subjected to various detection methods one after the other (e.g. in drug screening)
APPLICATION OF HPTLC
HPTLC is one of the most widely applied methods; its uses include:
- Pharmaceuticals and drugs
- Clinical chemistry, forensic chemistry and biochemistry
- Cosmetology
- Food analysis
- Environmental analysis
- Analysis of inorganic substances
Its other advantages include simplicity, low cost, parallel analysis of samples, high sample capacity, rapidly obtained results, and the possibility of multiple detection. Le Roux et al. (1992) evaluated an HPTLC technique for the determination of salbutamol serum levels in clinical trials and established it as a suitable method for analyzing serum samples. Many lipids have also been analyzed and studied using HPTLC; 20 different lipid subclasses were separated by HPTLC with reproducible and promising results. HPTLC is now strongly recommended for the analysis of drugs in serum and other tissues (Bernardi and Tamburini, 2009).
DRUG DISCOVERY TASKS COVERED BY HPTLC
Qualitative description of herbal medicines
Generally, the fingerprint of a plant should be specific enough to allow a clear distinction from possible adulterants. The fingerprints of herbal preparations should primarily be comprehensive; sets of fingerprints covering different substance classes of various polarities can be generated. Methods are validated according to the International Conference on Harmonization guidelines for accuracy, precision, linearity, specificity, and robustness.
Identification of raw material and products
The identification of raw material is crucial in order to ensure its authenticity, quality, and safety before it is processed. HPTLC is the preferred tool for the identification of herbal drugs. Identity is ensured with a chromatographic fingerprint, which visualizes the characteristic chemical constituents of a material. When the fingerprint of a sample is compared with that of a reference drug, the number, sequence, position, and color of the zones must be identical or at least similar. To obtain meaningful and reliable results, the chemical nature of the separated zones need not be known. For proper identification of a plant, acceptance criteria must be set; the decision could be based on the presence or absence of characteristic zones. In the test, a sample of the proper species must meet the acceptance criteria and samples of other plant species must fail. HPTLC can give reliable answers rapidly (Ravishankara et al., 2002; Narasimhan et al., 2006). The natural variability of plants is a serious analytical challenge: even when botanically clearly identified, different samples of the same species may show different fingerprints. To solve this problem, it is important to work with representative samples, authentic plants, or, if possible, botanical reference material (BRM).
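The identification logic described above (matching the number and position of sample zones against a reference fingerprint under fixed acceptance criteria) can be sketched as follows; the Rf values and the tolerance are illustrative assumptions, not pharmacopoeial acceptance criteria:

```python
def matches_reference(sample_rf, reference_rf, tol=0.05):
    """True if every reference zone has a sample zone within +/- tol Rf units
    and the sample shows no extra zones (a crude presence/absence criterion)."""
    if len(sample_rf) != len(reference_rf):
        return False
    return all(
        any(abs(s - r) <= tol for s in sample_rf) for r in reference_rf
    )

reference = [0.15, 0.38, 0.62, 0.81]   # hypothetical BRM fingerprint
authentic = [0.14, 0.40, 0.60, 0.83]   # same species, slight natural variation
adulterant = [0.14, 0.40, 0.55, 0.95]  # 0.95 zone has no reference counterpart

print(matches_reference(authentic, reference))   # True
print(matches_reference(adulterant, reference))  # False
```

In practice the decision also weighs zone color and intensity before and after derivatization; position alone is only part of the acceptance criteria.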
A simple and reproducible HPTLC method was successfully applied for the quantitative analysis of diterpenoids in the root bark of Premna integrifolia, in which the diterpenoids 1β,3α,8β-trihydroxy-pimara-15-ene (A), 6α,11,12,16-tetrahydroxy-7-oxo-abieta-8,11,13-triene (B) and 2α,19-dihydroxy-pimara-7,15-diene (C) were used as chemical markers for the standardization of Premna integrifolia plant extracts (Yadav et al., 2011). A simple HPTLC method has also been developed for the simultaneous determination of isoorientin, isovitexin, orientin, and vitexin, both pure and in commercial samples of bamboo-leaf flavonoids; it was found to be a simple, precise, specific, and accurate method that can be used for manufacturing QC of bamboo-leaf flavonoids or for governmental regulatory purposes (Wang et al., 2010).
Detection of adulterants and impurities
HPTLC not only confirms the identity of a material but is also an ideal screening tool for adulteration, and it is highly suitable for the evaluation and monitoring of cultivation, harvesting, and extraction processes and for stability testing. Adulterants and mixtures are typically determined by visually comparing fingerprints of the test sample against fingerprints of reference materials with respect to the sequence, color, and intensity of the zones before and after derivatization. On the basis of established acceptance criteria, the identity test can recognize the presence of adulterants instead of the desired species.
The presence of specific markers for certain adulterants within an otherwise correct fingerprint of a sample can be used to determine an impurity. This is a simple and fast approach, allowing the evaluation of up to 16 samples within a few minutes. If video densitometry or scanning densitometry is employed, an impurity in the sample can be quantified.
Monitoring the batch-to-batch production process
For the industrial production of herbal medicines, several cGMP-related issues must be considered, and the whole production process must be under strict control. The HPTLC fingerprint is an important analytical tool for monitoring an extraction process. It gives quick and reliable answers about the status of the process in progress or about the finished product. The only requirement is to demonstrate that the fingerprint is selective enough to show any change in the composition of the sample. Because of their comprehensive nature and comparability, HPTLC fingerprints can be used to document how efficiently and completely the starting material is transferred into the product and whether the composition of constituents has changed during production.
Another cGMP issue is the batch-to-batch consistency of production. It involves three elements: consistency of the raw material, full control of the production process, and proper definition of the finished product. Because the plant raw material comes with a naturally variable composition, it is clear that the finished product will also show a certain variation. Therefore, batch-to-batch consistency becomes a matter of definition and specification. In any case, final herbal products can be compared to a reference product or to a formerly released batch. A requirement for reliable analytical work is that the HPTLC methodology is standardized and all methods are validated. It is a great advantage to work with a cGMP-compliant documentation system, which generates images reproducibly and archives them for comparison with current images.
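One way to make such batch-to-batch comparisons quantitative is to correlate the densitogram profile of a new batch against that of a formerly released batch; the profiles and the 0.95 threshold below are illustrative assumptions, not a validated specification:

```python
from statistics import fmean

def pearson(x, y):
    """Pearson correlation between two equally sampled densitogram profiles."""
    mx, my = fmean(x), fmean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

# Hypothetical absorbance profiles sampled along the development direction
batch_a = [0.1, 0.3, 0.9, 0.4, 0.2, 0.7, 0.3]  # released reference batch
batch_b = [0.1, 0.3, 0.8, 0.4, 0.2, 0.7, 0.3]  # new batch, slight variation
r = pearson(batch_a, batch_b)
print(r > 0.95)  # True with these data; the threshold itself is an assumption
```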
Quantitative determination
Quantitative determination by HPTLC is generally performed by scanning densitometry; video densitometry, based on images, is an alternative. For quantitation, some points have to be considered. Most fingerprint methods used for identification are suitable for quantification only with some adaptation. Baseline separation of the selected compound is essential for its quantitative evaluation; hence, robust methods are required which are optimized and standardized. Sample preparation plays an important role, as it does for all quantification techniques. When these points are considered, HPTLC quantification yields reproducible data (Ebel, 1996). The detection limit of absorption measurements is about 10-100 ng. Patel et al. developed and validated a simple and rapid HPTLC method for the quantitative determination of olanzapine on silica gel 60F254 layers using methanol-ethyl acetate (8.0 + 2.0, v/v) as the mobile phase; it was found to be the simplest among existing analytical methods (Patel et al., 2010). A sensitive, simple, selective, precise, and accurate HPTLC method for the analysis of paracetamol, diclofenac potassium, and famotidine, both as bulk drugs and in tablet formulation, was developed and validated (Khatal et al., 2010).
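The densitometric quantification described above (calibration standards, a linear fit of peak area against amount applied, and back-calculation of the unknown) can be sketched as follows. The peak areas, amounts, and residual standard deviation are illustrative assumptions; the detection limit uses the ICH Q2(R1) convention LOD = 3.3 * sigma / slope:

```python
def linear_fit(x, y):
    """Least-squares line y = a*x + b for a densitometric calibration curve."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    a = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / sum(
        (xi - mx) ** 2 for xi in x
    )
    return a, my - a * mx

# Hypothetical calibration: amount applied per band (ng) vs. peak area (AU)
amounts = [100, 200, 400, 800]
areas = [1050, 2020, 4080, 8010]
slope, intercept = linear_fit(amounts, areas)

# Back-calculate an unknown from its measured peak area
unknown_area = 3000
unknown_ng = (unknown_area - intercept) / slope

# ICH Q2(R1) detection limit, with sigma taken here as an assumed
# residual standard deviation of the calibration
sigma = 25.0
lod_ng = 3.3 * sigma / slope
print(round(unknown_ng), round(lod_ng, 1))  # 296 8.3
```

In practice the scanner software performs this fit; the sketch only makes the arithmetic behind the reported amounts and the stated 10-100 ng detection limit explicit.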
HPTLC in other fields
In recent years, HPTLC has become a globally accepted practical solution for characterizing small molecules in quality assessment throughout the developing world. HPTLC is used for the purity control of chemicals, pesticides, and steroids, and for water analysis (Weber, 2005). It is also widely used for the analysis of vitamins, water-soluble food dyes, and pesticides in fruits, vegetables, and other foodstuffs (Verbitski et al., 2006). Fuchs et al. (2008) reported the analysis of stem cell lipids by offline HPTLC-MALDI-TOF MS. HPTLC is also useful for detecting chemicals of forensic concern, including drugs of abuse, poisons, adulterants, chemical weapons, and illicit drugs.
REGULATORY ISSUES AND QUALITY CONTROL
Depending on the product derived from a plant and on the purpose for which, and where, it is sold, different regulations may apply, but one of the central elements is always the necessity to properly define the identity of the starting material as well as its consistency with respect to specifications. For some of the most widely used plants, pharmacopoeial monographs are available, which serve as the basis for quality. Aside from the botanical and organoleptic characteristics of the plant, monographs usually describe tests for chemical identification and assays.
As analytical techniques, thin-layer chromatography (TLC) and high-performance thin-layer chromatography (HPTLC) are included in the European Pharmacopoeia (PhEur), the United States Pharmacopoeia (USP), the Pharmacopoeia of the People's Republic of China (PhPRCh) as well as in the American Herbal Pharmacopoeia (AHP) (EP, 2005). In the Quality Standards of Indian Medicinal Plants (QS-IMP) and the Indian Herbal Pharmacopoeia (IHP), too, TLC and HPTLC are recommended for the qualitative and quantitative evaluation of the phytochemical constituents of herbal drugs (QSIMP, 2005; IHP, 2002). A general trend towards harmonization and consolidation of monograph contents and the use of comparable analytical tools can be observed in the pharmacopoeias. The WHO guidelines for the worldwide use of herbal drugs recommend TLC/HPTLC and chromatographic fingerprints for the identification and qualitative determination of impurities of herbal medicines (WHO, 2007). If a plant-derived product is sold in Europe as a Herbal Medicinal Product (HMP), it requires proof of quality, safety, and efficacy for approval. The European Medicines Agency (EMEA) has issued regulations and directives for the quality control of herbal drugs (EMEA, 2007). The herbal drug preparation in its entirety is regarded as the active substance, and chromatographic fingerprint techniques such as TLC/HPTLC are recommended for identity tests, tests for the presence of adulterants, and determining the stability of the HMP.
References:
1. Zlatkis A, Kaiser RE. HPTLC: High Performance Thin-Layer Chromatography. Amsterdam: Elsevier Science and Technology; 1977.
2. Sethi PD. HPTLC: High Performance Thin Layer Chromatography: Quantitative Analysis of Pharmaceutical Formulations. CBS Publishers and Distributors; 1996.
3. Arup U, Ekman S, Lindblom L, Mattsson JE. High performance thin layer chromatography (HPTLC), an improved technique for screening lichen substances. Lichenologist. 1993;25:61-71.
4. Neumann C, Margot P. New perspectives in the use of ink evidence in forensic science: Part I. Development of a quality assurance process for forensic ink analysis by HPTLC. Forensic Sci Int. 2009;185:29-37.
5. Sherma J. Review of HPTLC in drug analysis: 1996-2009. J AOAC Int. 2010;93:754-64.
6. Butler MS. Natural products to drugs: Natural product-derived compounds in clinical trials. Nat Prod Rep. 2008;25:475-516.
7. Gershell LJ, Atkins JH. A brief history of novel drug discovery technologies. Nat Rev Drug Discov. 2003;2:321-7.
8. Sweedler JV. The continued evolution of hyphenated instruments. Anal Bioanal Chem. 2002;373:321-2.
9. Albert K, Krucker M, Glaser T, Schefer A, Lienau A, Zeeb D. Hyphenated techniques. Anal Bioanal Chem. 2002;372:25-6.
10. Wenlock MC, Austin RP, Barton P, Davis AM, Leeson PD. A comparison of physicochemical property profiles of development and marketed oral drugs. J Med Chem. 2003;46:1250-6.
11. Sudberg S, Sudberg EM, Terrazas J, Sudberg S, Patel K, Pineda J, et al. Fingerprint analysis and the application of HPTLC to the determination of identity and quality of botanicals, from an industry perspective. J AOAC Int. 2010;93:1367-75.
12. Le Roux AM, Wium CA, Joubert JR, Van Jaarsveld PP. Evaluation of a high-performance thin-layer chromatographic technique for the determination of salbutamol serum levels in clinical trials. J Chromatogr. 1992;581:306-9.
13. Kertesz V, Ford MJ, Van Berkel GJ. Automation of a surface sampling probe/electrospray mass spectrometry system. Anal Chem. 2005;77:7183-9.
14. Ravishankara MN, et al. Evaluation of antioxidant properties of root bark of Hemidesmus indicus R. Br. (Anantmul). Phytomedicine. 2002;9:153.
15. Narasimhan S, et al. Free radical scavenging potential of Chlorophytum tuberosum Baker. J Ethnopharmacol. 2006;104:423.
16. EG-Leitfaden einer Guten Herstellungspraxis für Arzneimittel und Wirkstoffe. 7th edn. Aulendorf: Editio Cantor Verlag; 2003. p. 61.
17. Ebel S. Quantitative analysis in TLC and HPTLC. J Planar Chromatogr. 1996;9:4.
18. European Pharmacopoeia. 5th edn. Strasbourg: Council of Europe, European Directorate for the Quality of Medicines (EDQM); 2005.
19. American Herbal Pharmacopoeia. Santa Cruz, CA, USA.
20. Quality Standards of Indian Medicinal Plants, Vols. 1-3. New Delhi: Indian Council of Medical Research, Indraprastha Press (CBT); 2005.
21. Indian Herbal Pharmacopoeia. Revised edn. Mumbai: Indian Drug Manufacturers' Association; 2002.
22. World Health Organisation. Quality Control Methods for Medicinal Plant Materials. Geneva; 1998. p. 22. http://www.who.int/medicines/services/expertcommittees/pharmprep/QAS05_131Rev1_QCMethods_Med_PlantMaterialsUpdateSept05.pdf (accessed March 05, 2007).
23. Guideline on quality of herbal medicinal products/traditional herbal medicinal products, CPMP/QWP/2819/00 Rev 1 and EMEA/CVMP/814/00 Rev 1. London: European Medicines Agency (EMEA); 2006. http://www.emea.eu.int/pdfs/human/qwp/281900en.pdf (accessed March 05, 2007).
24. ICH Harmonized Tripartite Guideline: Validation of Analytical Procedures: Text and Methodology Q2(R1). Geneva, Switzerland: International Conference on Harmonization; 2005.
25. Bernardi T, Tamburini E. An HPTLC-AMD method for understanding the metabolic behavior of microorganisms in the presence of mixed carbon sources. The case of Bifidobacterium adolescentis MB 239. JPC-J Planar Chromatogr-Modern TLC. 2009;22:321-5.
26. Patel RB, Patel MR, Bhatt KK, Patel BG. Development and validation of an HPTLC method for determination of olanzapine in formulations. J AOAC Int. 2010;93:811-9.
27. Yadav D, Tiwari N, Gupta MM. Simultaneous quantification of diterpenoids in Premna integrifolia using a validated HPTLC method. J Sep Sci. 2011;34:286-91.
28. Wang J, Tang F, Yue Y, Guo X, Yao X. Development and validation of an HPTLC method for simultaneous quantitation of isoorientin, isovitexin, orientin, and vitexin in bamboo-leaf flavonoids. J AOAC Int. 2010;93:1376-83.
29. Khatal LD, Kamble AY, Mahadik MV, Dhaneshwar SR. Validated HPTLC method for simultaneous quantitation of paracetamol, diclofenac potassium, and famotidine in tablet formulation. J AOAC Int. 2010;93:765-70.
30. Weber W. Luminographic detection of toxicity with Vibrio fischeri (luminescent bacteria). CAMAG Bibliography Service. 2005:94.
31. Verbitski SM, Gourdin GT, Ikenouye LM, McChesney JD. Rapid screening of complex mixtures by thin-layer chromatography-bioluminescence. Am Biotechnol Lab. 2006;24:40-2.
32. Fuchs B, Schiller J, Süss R, Zscharnack M, Bader A, Müller P, et al. Analysis of stem cell lipids by offline HPTLC-MALDI-TOF MS. Anal Bioanal Chem. 2008;392:849-60.
******************************************************************