Medicine

LABORATORY MEDICINE: HISTORY AND MODERN ASPECTS


 

LABORATORY TESTS IN HEMATOLOGY – I

LABORATORY TESTS IN HEMATOLOGY – II

 

Clinical laboratory diagnostics (laboratory medicine) is the medical discipline devoted to obtaining, developing and applying knowledge of the various techniques used to analyse the composition of body fluids and the properties of cells and tissues, and to interpreting the results in relation to health and disease.

It should be stressed that laboratory diagnostics, or laboratory medicine, is both a clinical discipline and a separate medical science. These two facets of laboratory diagnostics are tightly bound, as in other clinical sciences. Laboratory tests are used at various stages of the diagnostic process in all fields of clinical medicine and are, along with imaging, electrophysiological and other procedures, the main source of information on the health status of the patient. It is estimated that laboratory results form the basis of 60-70% of medical decisions. In addition to routine diagnostics in symptomatic patients, laboratory tests are used for screening, treatment monitoring and medical jurisprudence. Thus, laboratory diagnostics, which generates around 10% of all healthcare costs, is crucial for healthcare decision making, contributing to improved outcomes and cost savings.

 

A clinical laboratory (also called medical laboratory) is a facility that provides controlled conditions in which tests are done on clinical specimens in order to acquire information about the health of an individual (or patient) for the purpose of diagnosis, treatment and prevention of disease, or for medical research.

The clinical laboratory consists of four major divisions (or departments).

1.     Medical Microbiology and Parasitology Laboratory: This laboratory deals with the study of human pathogens. Pathogens are biological agents that cause disease in their hosts. They include microorganisms (bacteria, viruses and fungi) and parasites (e.g. intestinal worms, lice and malaria parasites) of medical importance. In bigger health centres or research institutes, the medical microbiology and parasitology laboratory is usually split into sub-units such as bacteriology, parasitology and virology laboratories.

2.     Haematology: This laboratory is involved in the performance of relevant tests (on blood) for the diagnosis of blood diseases (e.g. anaemia, haemoglobinopathies, leukaemia, etc.) and in blood transfusion services, e.g. blood grouping and blood cross-matching.

3.     Clinical Biochemistry (or Clinical Chemistry) Laboratory: This division of the laboratory is concerned with the performance of quantitative and qualitative tests on clinical specimens to investigate the state of various body chemistries. Such clinical specimens include body fluids (e.g. whole blood, plasma, serum, urine, sweat, cerebrospinal fluid) and occasionally faeces, tissue, hair, etc.

4.     Histopathology Laboratory: This is the laboratory where tissues (or cells) are processed for microscopic examination in order to study the manifestations of disease in tissue structure for diagnostic purposes, e.g. cancer diagnosis. In the laboratory, tissue samples are processed onto glass slides from which the effects of disease on the histological architecture of tissues can be examined microscopically, and hence diagnostic inferences are made.

 

Universal Precautionary Measures in the Clinical Laboratory

Garner (1997) defined Universal Basic Precaution as the prevention of transmission of blood-borne pathogens through strict respect of rules concerning care and nursing. Gerberding et al. (1995) also defined universal precaution as the routine use of appropriate barriers and techniques to reduce the likelihood of exposure to blood, other body fluids and tissues that may contain blood-borne pathogens.

Universal basic precautions assume that all clinical specimens contain infectious agents and should therefore be handled as such. This approach eliminates the need to identify patients or specimens infected with Human Immunodeficiency Virus (HIV) or other blood-borne pathogens.

The following are the universal laboratory safety precautions.

1. Universal precautions should apply to blood and all body fluids containing visible blood, semen, vaginal secretions, tissues, cerebrospinal fluid, peritoneal fluid, pericardial fluid, synovial fluid and amniotic fluid.

2. Laboratory workers should use protective barriers appropriate for the laboratory procedure and the type and extent of exposure expected. All persons processing blood should wear gloves and laboratory coats, and these should be removed before leaving the laboratory. Biological safety barriers should be used wherever necessary.

3. Hands should be washed immediately when contaminated with blood or other body fluids, after removing gloves and after completing laboratory activities.

4. Use of needles and syringes should be minimised. They should be used only in situations in which there is no alternative. If used, needles should not be recapped, bent or broken by hand. After use, needles and other sharp instruments should be placed in a 'sharpsafe' puncture-resistant container for disposal.

5. Specimens of blood should be placed in strong, leak-proof containers during transport.

6. Mouth pipetting must not be performed in the laboratory. Mechanical devices should be used.

7. Contaminated materials used in the laboratory should be decontaminated appropriately before reprocessing or disposal.

8. Laboratory work surfaces should be cleaned and decontaminated with an appropriate disinfectant after a blood or body fluid spill and at the end of the day's work.

 

 

History of laboratory medicine

 

Three distinct periods in the history of medicine are associated with three different places and, therefore, different methods of determining diagnosis: from the middle ages to the 18th century, bedside medicine was prevalent; then between 1794 and 1848 came hospital medicine; and from that time forward, laboratory medicine has served as medicine's lodestar. The laboratory's contribution to modern medicine has only recently been recognized by historians as something more than the addition of another resource to medical science, and is now being appreciated as the seat of medicine, where clinicians account for what they observe in their patients.

The first medical diagnoses made by humans were based on what ancient physicians could observe with their eyes and ears, which sometimes also included the examination of human specimens.

The ancient Greeks attributed all disease to disorders of bodily fluids called humors, and during the late medieval period, doctors routinely performed uroscopy. Later, the microscope revealed not only the cellular structure of human tissue, but also the organisms that cause disease. More sophisticated diagnostic tools and techniques—such as the thermometer for measuring temperature and the stethoscope for measuring heart rate—were not in widespread use until the end of the 19th century. The clinical laboratory would not become a standard fixture of medicine until the beginning of the 20th century. This article reviews the history and development of diagnostic methods from ancient to modern times, as well as the evolution of the clinical laboratory from the late 19th century to the present.

Ancient diagnostic methods

In ancient Egypt and Mesopotamia, the earliest physicians made diagnoses and recommended treatments based primarily on observation of clinical symptoms. Palpation and auscultation were also used. Physicians were able to describe dysfunctions of the digestive tract, heart and circulation, the liver and spleen, and menstrual disturbances; unfortunately, this empiric medicine was reserved for royalty and the wealthy. Other less-than-scientific methods of diagnosis used in treating the middle and lower classes included divination through ritual sacrifice to predict the outcome of illness. Usually a sheep would be killed before the statue of a god. Its liver was examined for malformations or peculiarities; the shape of the lobes and the orientation of the common duct were then used to predict the fate of the patient.

Ancient physicians also began the practice of examining patient specimens. The oldest known test on body fluids was done on urine in ancient times (before 400 BC). Urine was poured on the ground and observed to see whether it attracted insects. If it did, patients were diagnosed with boils. The ancient Greeks also saw the value in examining body fluids to predict disease. Around 400 BC, Hippocrates promoted the use of the mind and senses as diagnostic tools, a principle that played a large part in his reputation as the "Father of Medicine." The central Hippocratic doctrine of humoral pathology attributed all disease to disorders of fluids of the body. To obtain a clear picture of disease, Hippocrates advocated a diagnostic protocol that included tasting the patient's urine, listening to the lungs, and observing skin color and other outward appearances. Beyond that, the physician was to "understand the patient as an individual." Hippocrates related the appearance of bubbles on the surface of urine specimens to kidney disease and chronic illness. He also related certain urine sediments and blood and pus in urine to disease.

The first description of hematuria, or the presence of blood in urine, by Rufus of Ephesus surfaced at around AD 50 and was attributed to the failure of the kidneys to function properly in filtering the blood. Later (c. AD 180), Galen (AD 131–201), who is recognized as the founder of experimental physiology, created a system of pathology that combined Hippocrates' humoral theories with the Pythagorean theory, which held that the four elements (earth, air, fire and water) corresponded to various combinations of the physiologic qualities of dry, cold, hot and moist. These combinations of physiologic characteristics corresponded roughly to the four humors of the human body: hot + moist = blood; hot + dry = yellow bile; cold + moist = phlegm; and cold + dry = black bile.

Galen was known for explaining everything in light of his theory and for having an explanation for everything.

He also described diabetes as “diarrhea of urine” and noted the normal relationship between fluid intake and urine volume. His unwavering belief in his own infallibility appealed to complacency and reverence for authority. That dogmatism essentially brought innovation and discovery in European medicine to a standstill for nearly 14 centuries. Anything relating to anatomy, physiology and disease was simply referred back to Galen as the final authority from whom there could be no appeal.

Middle Ages

In medieval Europe, early Christians believed that disease was either punishment for sin or the result of witchcraft or possession. Diagnosis was superfluous. The basic therapy was prayer, penitence, and invocation of saints. Lay medicine based diagnosis on symptoms, examination, pulse, palpation, percussion, and inspection of excreta and sometimes semen. Diagnosis by "water casting" (uroscopy) was practiced, and the urine flask became the emblem of medieval medicine. By AD 900, Isaac Judaeus, a Jewish physician and philosopher, had devised guidelines for the use of urine as a diagnostic aid; and under the Jerusalem Code of 1090, failure to examine the urine exposed a physician to public beatings. Patients carried their urine to physicians in decorative flasks cradled in wicker baskets and, because urine could be shipped, diagnosis at long distance was common. The first book detailing the color, density, quality and sediment found in urine was written around this time, as well. By around AD 1300, uroscopy had become so widespread that it was at the point of near universality in European medicine.

Medieval medicine also included interpretation of dreams in its diagnostic repertoire. Repeated dreams of floods indicated “an excess of humors that required evacuation,” and dreams of flight signified “excessive evaporation of humors.”

Seventeenth century

The medical advances of the 17th century consisted mostly of descriptive works of bodily structure and function that laid the groundwork for diagnostic and therapeutic discoveries that followed. The status of medicine was helped along by the introduction of the scientific society in Italy and by the advent of periodical literature. Considered the most momentous event in medical history since Galen’s time, the discovery of the circulation of blood by William Harvey (1578–1657) marked the beginning of a period of mechanical explanations for a variety of functions and processes, including digestion, metabolism, respiration and pregnancy. The English scientist proved through vivisection, ligation and perfusion that the heart acts as a muscular pump propelling the blood throughout the body in a continuous cycle.

The invention of the microscope opened the door to the invisible world just as Galileo’s telescope had revealed a vast astronomy. The earliest microscopist was a Jesuit priest, Athanasius Kircher (1602–1680) of Fulda (Germany), who was probably the first to use the microscope to investigate the causes of disease. His experiments showed how maggots and other living creatures developed in decaying matter.

Kircher's writings included an observation that the blood of patients with the plague contained "worms"; however, what he thought to be organisms were probably pus cells and red blood corpuscles, because he could not have observed Bacillus pestis with a 32-power microscope. Robert Hooke (1635–1703) later used the microscope to document the existence of "little boxes," or cells, in vegetables and inspired the works of later histologists; but some of the greatest contributions to medical science came from the Italian microscopist Marcello Malpighi (1628–1694). Malpighi, who is described as the founder of histology, served as physician to Pope Innocent XII and was famous for his investigations of the embryology of the chick and the histology and physiology of the glands and viscera. His work in embryology describes the minutiae of the aortic arches, the head fold, the neural groove, and the cerebral and optic vesicles.

Uroscopy was still in widespread use and had gained popularity as a method to diagnose "chlorosis," the supposed ailment of love-sick young women, and sometimes to test for chastity. Other methods of urinalysis had their roots in the 17th century, as well.

The gravimetric analysis of urine was introduced by the Belgian mystic, Jean Baptiste van Helmont (1577–1644). Van Helmont weighed a number of 24-hour specimens, but was unable to draw any valuable conclusions from his measurements. It was not until the late 17th century—when Frederik Dekkers of Leiden, Netherlands, observed in 1694 that urine containing protein would form a precipitate when boiled with acetic acid—that urinalysis became more scientific and more valuable. The best qualitative analysis of urine at the time was pioneered by Thomas Willis (1621–1675), an English physician and proponent of chemistry. He was the first to notice the characteristic sweet taste of diabetic urine, which established the principle for the differential diagnosis of diabetes mellitus and diabetes insipidus.

Experiments with blood transfusion were also getting underway with the help of a physiologist in Cornwall, England, named Richard Lower (1631–1691). Lower was the first to perform direct transfusion of blood from one animal to another. Other medical innovations of the time included the intravenous injection of drugs, transfusion of blood, and the first attempts to use pulse rate and temperature as indicators of health status.

Eighteenth century

The 18th century is regarded as the "Golden Age" of the successful practitioner, as well as the successful quack. Use of phrenology (the study of the shape of the skull to predict mental faculties and character), magnets, and various powders and potions for treatment of illness were a few of the more popular scams. The advancement of medicine during this time was more theoretical than practical. Internal medicine was improved by new textbooks that cataloged and described many new forms of disease, as well as by the introduction of new drugs, such as digitalis and opium. The state of hospitals in the 18th century, however, was alarming by today's standards. Recovery from surgical operations was rare because of septicemia. The concept of antisepsis had not yet been discovered, and hospitals were notorious for filth and disease well into the 19th century.

One notable event, a forerunner to the modern practice of laboratory measurement of prothrombin time, plasma thromboplastin time and other coagulation tests, was the discovery of the cause of coagulation. An English physiologist, William Hewson (1739–1774) of Hexham, Northumberland, England, showed that when the coagulation of the blood is delayed, a coagulable plasma can be separated from the corpuscles and skimmed off the surface. Hewson found that plasma contains an insoluble substance that can be precipitated and removed from plasma at a temperature slightly higher than 50°C. Hewson deduced that coagulation was the formation in the plasma of a substance he called "coagulable lymph," which is now known as fibrinogen. The later discovery that fibrinogen is a plasma protein and that in coagulation it is converted into fibrin attests to the importance of Hewson's work.

The clinical diagnostic methods of percussion, temperature, heart rate and blood pressure measurement were further refined, and there were some remarkable attempts to employ precision instruments in diagnosis. Leopold Auenbrugger (1722–1809) was the first to use percussion of the chest in diagnosis, in 1754 in Vienna. This method involved striking the patient's chest while the patient holds his or her breath. Auenbrugger proposed that the chest of a healthy person sounds like a cloth-covered drum. A student of Auenbrugger's, Jean Nicolas Corvisart, a French physician at La Charité in Paris, pioneered the accurate diagnosis of heart and lung diseases using Auenbrugger's chest-thumping technique. Corvisart's translation of Auenbrugger's treatise on percussion, "New Invention to Detect by Percussion Hidden Diseases in the Chest," popularized the practice of thumping on a patient's chest. The resulting sounds are different when the lungs contain lesions or fluids than in healthy people. This observation was validated by postmortem examination.

James Currie (1756–1805), a Scot, was the first to use cold baths in treatment of typhoid fever; and by monitoring the patient’s temperature using a thermometer, he was able to adjust the temperature and frequency of the baths to treat individual patients. It took another hundred years, however, before thermometry became a recognized feature in clinical diagnosis.

Additional advances in urinalysis occurred with J.W. Tichy's observations of sediments in the urine of febrile patients (1774); Matthew Dobson's proof that the sweetness of the urine and blood serum in diabetes is caused by sugar (1776); and the development of the yeast test for sugar in diabetic urine by Francis Home (1780).

Nineteenth century

1893 T. W. Richards invents the nephelometer; Hermann M. Biggs establishes Diagnostic Laboratory in New York City.

1895 Franz Ziehl and Friedrich Neelsen introduce their modification of the acid-fast stain for tuberculosis; William Roentgen discovers X-rays; the William Pepper Laboratory is established at the Pennsylvania General Hospital.

1896 S. Riva-Rocci invents the sphygmomanometer; C. W. Purdy publishes Practical Urinalysis and Urinary Diagnosis; Ferdinand Widal develops the agglutination test for identification of the typhoid bacillus; in Great Britain, clinical laboratories exist in Edinburgh, Leeds, Glasgow, and London by this date.

1897 The first commercial clinical laboratory established in England, The Clinical Research Association, receives specimens by mail.

Twentieth century

1899 American Society for Microbiology is founded.

1900 F.G. Hopkins discovers tryptophan; Otto Folin becomes the first full-time clinical biochemist (in its modern sense) in the U.S.

1902 The DuBoscq visual colorimeter is first introduced into clinical laboratories.

1903 Ayer Clinical Laboratory is established at Pennsylvania Hospital, designed by Simon Flexner for work with patients.

1904 Christian Bohr discovers the reciprocal relationship between pH and oxygen content of hemoglobin (Bohr effect); M. Beijerinck obtains the first pure culture of the sulfur-oxidizing bacterium Thiobacillus thioparus; the first ultraviolet lamps and the first practical photoelectric cell are invented.

1905 H.J. Bechtold discovers immunodiffusion.

1906 American Hospital Association is formed from the Association of Hospital Superintendents of the U.S. and Canada.

1908 Todd and Sanford publish the first edition of Diagnosis by Laboratory Methods.


 

Venipuncture is in widespread use by 1920.

 

1911 Oskar Heimstadt invents the fluorescence microscope.

1912 American College of Surgeons is chartered in Illinois.

1913 D.D. van Slyke is appointed chemist at Rockefeller Hospital Laboratory; American Association of Immunologists is founded.

1916 K.M.G. Siegbahn develops X-ray spectroscopy. P.A. Kohler develops the colorimeter–nephelometer.

1918 N. Wales and E.J. Copeland develop the electric refrigerator (Kelvinator).

1919 F.W. Aston develops the mass spectrograph.

1920 First clinical laboratory method for serum phosphorus is established; the use of venipuncture for diagnostic testing becomes widespread; Victor Meyers establishes the University of Iowa center for training clinical chemists, primarily for hospital positions; Conference of Public Health Laboratories is founded.

1921 First clinical laboratory method for serum magnesium is introduced; The Denver Society of Clinical Pathologists, precursor of the American Society of Clinical Pathologists, is founded in Denver, CO.

1922 ASCP is founded in St. Louis, MO.

1925 American Type Culture Collection is founded.

1926 Arne Tiselius develops moving boundary electrophoresis of proteins; Theodor Svedberg determines the molecular weight of hemoglobin by ultracentrifugation; ASCP appoints a “Committee on the Registration of Laboratory Technicians” to define and classify medical technicians.


1928 G.N. Papanicolaou first reports the ability to recognize cancer in vaginal smears, thus beginning clinical cytology; F.A. Paneth founds radiochemistry.

1929 Otto Folin introduces the use of the light filter in colorimetry; R. Fahraeus develops the erythrocyte sedimentation rate as an index of severity of disease; M. Knoll and E. Ruska invent the electron microscope; ASCP establishes its Board of Registry for certifying medical technologists; Mayo Clinic has 21 laboratories by this date.

1930 Kay develops the first clinical laboratory method for alkaline phosphatase, thus beginning clinical enzymology; refractometry is first used in clinical labs for the determination of protein in urine; ASCP issues its first medical technologist certification to P.H. Adams of Ft. Wayne, IN; Beckman Instruments is founded.

1932 Cherry and Crandall develop the clinical laboratory method for serum lipase activity; American Society of Clinical Laboratory Technicians, precursor of the American Society for Medical Technology, is founded.

1934 Commercial development of the electron microscope takes place.


1935 Beckman Instruments Co. introduces the first pH meter; ASCP Board of Registry first requires a college degree for medical technologist certification.

1937 First hospital-based blood bank is established at Cook County Hospital, Chicago, IL; ASCP and its Board of Registry officially oppose state licensure of medical technologists.

1938 Somogyi develops 2 major clinical laboratory methods for serum and urine amylase activity; Gutman develops the first assay for acid phosphatase.

1939 Conway and Cook develop the first clinical laboratory method for blood ammonia; American Medical Technologists is founded.

1940 Visual colorimeters begin to be replaced by photoelectric  colorimeters in clinical labs; RCA demonstrates the first commercial electron microscope.

1941 G.N. Papanicolaou and H.F. Traut prove the diagnostic usefulness of vaginal smears in cervical cancer; A.J.P. Martin and R.L.M. Synge separate amino acids and peptides by chromatography.

1943 Penicillin is successfully used in therapy.

1944 William Sunderman applies refractometry of proteins in the clinical lab.

1945 S. Borgstrom develops the whole blood clotting time test; itemized charges for hospital services are begun.

1946 The Vacutainer evacuated serum collection tube is introduced by Becton Dickinson Co.; Arne Tiselius separates proteins by chromatography; College of American Pathologists is founded.

1947 Edwin Land develops the Polaroid camera; American Association of Blood Banks is founded.

1948 American Association for Clinical Chemistry is founded.

1950 R.S. Yalow and S. Berson develop radioimmunoassay; Levey and Jennings adapt the Shewhart QC chart to use in clinical laboratories; Histochemical Society is founded.


1952 M.D. Poulik invents immunoelectrophoresis.

1954 Kuby develops the clinical lab method for serum creatine phosphokinase activity; A. Walsh develops the atomic absorption spectrometer.

1955 Wroblewski and LaDue develop the clinical laboratory method for serum lactate dehydrogenase; Karmen develops the clinical laboratory method for aspartate aminotransferase; Leonard Skeggs develops the concept of "continuous flow dialysis" in connection with treatment of renal disease; Severo Ochoa synthesizes RNA.

1956 Wroblewski and LaDue develop the method for serum alanine aminotransferase activity, called "serum glutamic-pyruvic transaminase," and recognize its greater specificity for liver disease compared with that of aspartate aminotransferase; J. Edwards proposes prenatal screening for genetic disease.

1957 Van Handel and Zilversmit develop a direct chemical method for the determination of triglycerides.

1959 The first clinical laboratory chemical analyzer, the single-channel "Auto-Analyzer," is introduced by Technicon Corp.; Technicon first applies flame photometry to automated methods.

1960 Methods for serum creatine phosphokinase isoenzymes are developed; the first method for gamma-glutamyl transferase in serum is developed; Perkin-Elmer Corp. introduces atomic absorption spectrometry for the determination of calcium and magnesium; the laser is developed; Feichtmeier invents the mechanical pipettor (Auto Dilator).

1961 Becton Dickinson Co. introduces disposable hypodermic syringe and needle.

1962 Siegelman develops a method for glutamic dehydrogenase; IBM introduces disk storage for computers; International Society for Clinical Laboratory Technology is founded.

1965 Scanning electron microscope is developed; the U.S. enacts Medicare and Medicaid (Titles 18 and 19 of the Social Security Amendments).

1966 Medicare/Medicaid officially goes into effect.

1967 G.I. Abelev shows that alpha-fetoprotein is elevated in the serum of patients with testicular teratocarcinoma; MetPath Laboratories is founded; U.S. enacts the Clinical Laboratory Improvement Act (CLIA '67).

1968 The first random-access analyzer is introduced by DuPont (the ACA); the 1% Medicare allowance for unidentified costs is reduced to zero; Canada enacts the Federal Medical Care Act, creating a single-payer system.

 

The use of laboratory tests:

Laboratory investigations are involved in every branch of clinical medicine.

The results of laboratory tests may be of use in:

1.     Diagnosis and in the monitoring of treatment.

2.     Screening for disease or in assessing the prognosis.

3.     Research into the biochemical basis of disease.

4.     Clinical trials of new drugs

Laboratory investigations hold the key for the diagnosis and prognosis of diabetes mellitus, jaundice, myocardial infarction, gout, pancreatitis, rickets, cancers, acid-base imbalance etc. Successful medical practice is unimaginable without the service of clinical laboratory.

In general, laboratory tests can be broadly divided into two groups:

1. In discretionary (or selective) requesting, tests are carried out on the basis of an individual patient's clinical situation. The case for discretionary requesting has been put admirably as a series of questions (Asher, 1954):

1. Why do I request this test?

2. What will I look for in the result?

3. If I find what I am looking for, will it affect my diagnosis?

4. How will this investigation affect my management of the patient?

5. Will this investigation ultimately benefit the patient?

2. In contrast, screening tests are used to search for disease without there being any necessary clinical indication that disease is present.

The situations in which discretionary test requests are undertaken are listed in Table 1.2

 

Table 1.2 Test selection for the purposes of discretionary testing (category: example)

To confirm a diagnosis: plasma [free T4] and [thyroid-stimulating hormone, TSH] in suspected hyperthyroidism.

To aid differential diagnosis: to distinguish between different forms of jaundice.

To refine a diagnosis: use of ACTH measurements to localize the cause of Cushing's syndrome.

To assess the severity of disease: plasma [creatinine] or [urea] in renal disease.

To monitor progress: plasma [glucose] to follow patients with diabetes mellitus.

To detect complications or side effects: ALT measurements in patients treated with hepatotoxic drugs.

To monitor therapy: plasma drug concentration in patients treated with antiepileptic drugs.

 

Screening may take two forms: 1. Well-population screening in which typically a spectrum of tests is carried out on individuals from an apparently healthy population in an attempt to detect presymptomatic or early disease. The value of well-population screening has been called into question and certainly should only be initiated under certain specific circumstances which are listed in Table 1.3.

Table 1.3 Requirements for well-population screening

The disease is common or life-threatening

The tests are sensitive and specific

The tests are readily applied and acceptable to the population to be screened

Clinical, laboratory and other facilities are available for follow-up

Economics of screening have been clarified and the implications accepted

 

2. Case-finding screening programmes perform appropriate tests on a population sample known to be at high risk of a particular disease.

These are inherently more selective and yield a higher proportion of useful results (Table 1.4).

Table 1.4 Examples of tests used in case-finding programmes (programme to detect disease: chemical investigation)

Neonates:

Phenylketonuria (PKU): serum [phenylalanine]

Hypothyroidism: serum [TSH] and/or [thyroxine]

Adolescents and young adults:

Substance abuse: drug screen

Pregnancy:

Diabetes mellitus in the mother: plasma and urine [glucose]

Open neural tube defect (NTD) in the foetus: maternal serum [alpha-fetoprotein]

Industry:

Industrial exposure to lead: blood [lead]

Industrial exposure to pesticides: plasma cholinesterase activity

Malnutrition: plasma [albumin] and/or [pre-albumin]

Thyroid dysfunction: plasma [TSH] and/or [thyroxine]

 

ADVANTAGES OF SCREENING

First, an uncommon or unexpected disease may be found and treated (Table 1.5). Second, the early requesting of a battery of tests might be expected to expedite management of the patient. Most studies have not shown this to be so.

 

 

 

Table 1.5 Advantages of screening in identifying unexpected test results (disease: unexpected abnormal test results)

Hyperparathyroidism: raised plasma calcium

Hypothyroidism: raised plasma TSH and/or a low T4

Diabetes mellitus: high random plasma glucose

Renal tract disease: raised plasma creatinine or urea

Liver disease: increased plasma ALT, AST

DISADVANTAGES OF SCREENING

It is easy to miss significant abnormalities in the 'flood' of data coming from the laboratory, even when the abnormalities are 'flagged' in some way. Most of the abnormalities detected will be of little or no significance, yet may need additional time-consuming and often expensive tests to clarify their importance (or lack of it).

In other instances, to simplify requesting, a wide range of tests is routinely requested on all patients in a particular category, for example, admission screening on all those admitted through the Accident and Emergency (A&E) Department. Mention should also be made of batteries of tests which are generally requested on a discretionary basis but where the test group collectively provides information about an organ system (e.g. tests for liver disease) or a physiological state (e.g. water and electrolyte status). Many laboratories analyse and report these functional or organ-related groups. For example, a 'liver function test' group might consist of plasma bilirubin, alanine aminotransferase (ALT), alkaline phosphatase (ALP), γ-glutamyltransferase (GGT) and albumin measurements.

Clinical biochemical tests comprise over ⅓ of all hospital laboratory investigations.

Core biochemistry: Most biochemistry laboratories provide the "core analyses", commonly requested tests which are of value in many patients, on a frequent basis.

Core biochemical tests:

1. Sodium, potassium, chloride and bicarbonate

2. Urea and creatinine

3. Calcium and phosphate

4. Total protein and albumin

5. Bilirubin and alkaline phosphatase

6. Alanine aminotransferase (ALT) and Aspartate aminotransferase (AST)

7. Glucose

8. Amylase

Specialized tests:

Not every laboratory is equipped to carry out all possible biochemistry requests.

Large departments may act as reference centres where less commonly requested tests are performed.

Specialized tests:

1. Hormones

2. Specific proteins

3. Trace elements

4. Vitamins

5. Drugs

6. Lipids and lipoproteins

7. DNA analyses

The emergency lab:

All clinical biochemistry laboratories provide facilities for urgent tests. An urgent test is designated as one on which the clinician is likely to take immediate action. The main reason for asking for an analysis to be performed on an urgent basis is that immediate treatment depends on the result.

Emergency tests:

1. Urea and electrolytes

2. Blood gases

3. Amylase

4. Glucose

5. Salicylate

6. Paracetamol

7. Calcium

Specimen collection:

The biological fluids employed in the clinical biochemistry laboratory include blood, urine, saliva, sputum, faeces, tissue and cells, cerebrospinal fluid, peritoneal fluid, synovial fluid, pleural fluid and stones.

Among these, blood (directly or in the form of plasma or serum) is frequently used for the investigations in the clinical biochemistry laboratory.

Identification of patients and specimens

The correct patient must be appropriately identified on the specimen and request form, as follows:

1. Patient identification data (PID). This usually comprises name plus unique number.

2. Test request information. This includes relevant clinical details (including any risk of infection hazard), the tests to be performed and where the report is to be sent.

3. Collection of specimens. Specimens must be collected in the correct tube, with the appropriate preservative.

4. Matching of specimens to requests. Each specimen must be easily and unequivocally matched to the corresponding request for investigations.
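To illustrate how these identification requirements might be represented and checked, here is a minimal Python sketch. The record fields and the matching helper are hypothetical, shown only to make the four points above concrete; they are not part of any real laboratory information system described in the text.

```python
from dataclasses import dataclass, field

@dataclass
class TestRequest:
    """Request form details accompanying a specimen (hypothetical structure)."""
    patient_name: str
    patient_number: str       # unique patient identification number (PID)
    clinical_details: str     # relevant clinical details, including any infection hazard
    tests: list = field(default_factory=list)  # tests to be performed
    report_destination: str = ""                # where the report is to be sent

@dataclass
class Specimen:
    """A collected specimen labelled with patient identification data (PID)."""
    patient_name: str
    patient_number: str
    tube_type: str            # the correct tube / preservative for the requested tests

def matches(specimen: Specimen, request: TestRequest) -> bool:
    """Each specimen must be easily and unequivocally matched to its request."""
    return (specimen.patient_name == request.patient_name
            and specimen.patient_number == request.patient_number)

# Example: a label mismatch (e.g. crossed-over addressograph labels) is flagged before analysis.
req = TestRequest("A. Patient", "123456", "suspected hyperthyroidism",
                  ["free T4", "TSH"], "Endocrine clinic")
spec = Specimen("A. Patient", "123456", "serum (plain) tube")
print(matches(spec, req))  # True -> specimen accepted for analysis
```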

 

 

Table 1.1 Some commoner causes of errors arising from use of the laboratory.

 

Error: Crossover of addressograph labels between patients
Consequence: This can lead to two patients each receiving the other's set of results. Where a patient is assigned a completely wrong set of results, it is important to investigate the problem in case there is a second patient with a correspondingly wrong set of results.

Error: Timing error
Consequence: There are many examples where timing is important but not considered. Sending in a blood sample too early after the administration of a drug can lead to misleadingly high values in therapeutic monitoring. Interpretation of some tests (e.g. cortisol) is critically dependent on the time of day when the blood was sampled.

Error: Sample collection tube error
Consequence: For some tests the nature of the collection tube is critical, which is why the Biochemistry Laboratory specifies this detail. For example, using a plasma tube with lithium-heparin as the anticoagulant invalidates the sample for measurement of a therapeutic lithium level. Serum electrophoresis requires a serum sample; otherwise, fibrinogen interferes with the detection of any monoclonal bands. Topping up a biochemistry tube with a haematology (potassium-ethylenediaminetetraacetic acid, EDTA) sample will lead to high potassium and low calcium values in the biochemistry sample.

Error: Sample taken close to the site of an intravenous (IV) infusion
Consequence: The blood sample will be diluted, so all test results will be correspondingly low, with the exception of those tests affected by the composition of the infusion fluid itself. For example, using normal saline as the infusion fluid would lower all test results except sodium and chloride, which are likely to be raised.

Error: Analytical error
Consequence: Although comparatively rare, these do inevitably happen from time to time, and any unexpected result should lead the requesting clinician to discuss the matter further with the laboratory. Transcription errors within the laboratory are increasingly uncommon because results are downloaded electronically to the laboratory computer, which is the source of the printout or of results displayed on the VDU. Most errors generated within the laboratory occur at reception, as a result of mislabelling of samples.

 

 

 

COLLECTION OF BLOOD:

Venous blood is most commonly used for a majority of biochemical investigations. It can be drawn from any prominent vein (usually from a vein on the front of the elbow).

Capillary blood (<0.2 ml) obtained from a finger or thumb, is less frequently employed.

Arterial blood (usually drawn under local anesthesia) is used for blood gas determinations.

 

Precautions for blood collection: use of sterile (preferably disposable) needles and syringes, cleaning of the patient's skin, and blood collection into clean, dry vials/tubes are some of the important precautions.

 

Biochemical investigations can be performed on 4 types of blood specimens – whole blood, plasma, serum and red blood cells. The selection of the specimen depends on the parameter to be estimated.

1. Whole blood (usually mixed with an anticoagulant) is used for the estimation of hemoglobin, carboxyhemoglobin, pH, glucose, urea, non-protein nitrogen, pyruvate, lactate, ammonia, etc. (Note: for glucose determination, plasma has been preferred in recent years.)

2. Plasma, obtained by centrifuging whole blood collected with an anticoagulant, is employed for parameters such as fibrinogen, glucose, bicarbonate, chloride and ascorbic acid.

3. Serum is the supernatant fluid that can be collected after centrifuging clotted blood. It is the most frequently used specimen in the clinical biochemistry laboratory. The parameters estimated in serum include proteins (albumin/globulins), creatinine, bilirubin, cholesterol, uric acid, electrolytes (Na+, K+, Cl-), enzymes (ALT, AST, LDH, CK, ALP, ACP, amylase, lipase) and vitamins.

4. Red blood cells are employed for the determination of abnormal hemoglobins, glucose 6-phosphate dehydrogenase, pyruvate kinase etc.

Collection and preservation of blood specimens

Lack of thought before collecting specimens or carelessness in collection may adversely affect the interpretation or impair the validity of the tests carried out on the specimens. Some factors to consider include the following:

1. Diet. Dietary constituents may alter the concentrations of analytes in blood significantly (e.g. plasma [glucose] and [triglyceride] are affected by carbohydrate and fat-containing meals, respectively).

2. Drugs. Many drugs influence the chemical composition of blood. Such effects of drug treatment, for example antiepileptic drugs, have to be taken into account when interpreting test results. Details of relevant drug treatment must be given when requesting chemical analyses, especially when toxicological investigations are to be performed.

3. Diurnal variation. The concentrations of many substances in blood vary considerably at different times of day (e.g. cortisol). Specimens for these analyses must be collected at the times specified by the laboratory, as there may be no reference ranges relating to their concentrations in blood at other times.

Care when collecting blood specimens

The posture of the patient, the choice of skin-cleansing agent and the selection of a suitable vein (or other source) are the principal factors to consider before proceeding to collect each specimen:

1.  The skin must be clean over the site for collecting the blood specimen. However, it must be remembered that alcohol and methylated spirits can cause haemolysis, and that their use is clearly to be avoided if blood [ethanol] is to be determined.

2.  Limbs into which intravenous infusions are being given must not be selected as the site of venepuncture unless particular care is taken. The needle or cannula must first be thoroughly flushed out with blood to avoid dilution of the specimen with infusion fluid.

3.  Venepuncture technique should be standardised as far as possible to enable closer comparison of successive results on patients.

4. Venous blood specimens should be obtained with minimal stasis. Prolonged stasis can markedly raise the concentrations of plasma proteins and other non-diffusible substances (e.g. protein-bound substances). It is advisable to release the tourniquet before withdrawing the sample of blood.

5. Posture should be standardised if possible. When a patient's posture changes from lying to standing, there may be an increase of as much as 13% in the concentration of plasma proteins or protein-bound constituents, due to redistribution of fluid in the extracellular space.

6.  Haemolysis should be avoided, since it renders specimens unsuitable for plasma K+, magnesium and many protein and enzyme activity measurements.

7. Infection hazard. High-risk specimens require special care in collection, and this danger must be clearly indicated on the request form.


 Vacutainers used for blood collection and storage

Care of blood specimens after collection

Blood specimens should be transported to the laboratory as soon as possible after collection. Special arrangements are needed for some specimens (e.g. for acid-base measurements, or unstable hormones) because of their lack of stability. Most other analytes are stable for at least 3 h in whole blood, or longer if plasma or serum is first separated from the cells. As a rule, whole blood specimens for chemical analysis must not be stored in a refrigerator, since the ionic pumps that maintain electrolyte gradients across the cell membrane are inactive at low temperatures. Conversely, separated serum or plasma is best refrigerated, to minimize chemical changes or bacterial growth.

Several changes occur in whole blood specimens following collection. The commoner and more important changes that occur prior to the separation of plasma or serum from the cells are:

1.  Glucose is converted to lactate: this process is inhibited by fluoride;

2.  Several substances pass through the erythrocyte membrane, or may be added in significant amounts to plasma as a result of red cell destruction insufficient to cause detectable haemolysis. Examples include K+ and lactate dehydrogenase;

3. Loss of CO2 occurs, since the PCO2 of blood is much higher than that of air;

4. Plasma [phosphate] increases due to hydrolysis of organic ester phosphates in the red cells;

5. Labile plasma enzymes lose their activity.

 

ANTICOAGULANTS

Certain biochemical tests require unclotted blood. Serum from coagulated blood is the specimen of choice for many assay systems.

Heparin (which inhibits the conversion of prothrombin to thrombin) is the most widely used anticoagulant for clinical chemical analysis. Heparin is an ideal anticoagulant, since it does not cause any change in blood composition. However, other anticoagulants are sometimes preferred to heparin because of its cost.

Ethylenediaminetetraacetic acid (EDTA) is a chelating agent and is particularly useful for hematological examination because it preserves the cellular components of the blood. It chelates calcium and thereby blocks coagulation. EDTA is employed to collect blood for hematological examinations; it may affect some clinical chemistry tests.

Sodium fluoride is usually used as a preservative for blood glucose; it inhibits the enzyme systems involved in glycolysis. Without an antiglycolytic agent, the blood glucose concentration decreases by about 10 mg/dl per hour and falsely low results may be obtained. Fluoride is also an anticoagulant. It should not be used when enzyme assays are to be performed, or when the test itself involves enzymatic analysis.

Citrate is widely used for coagulation studies.

Oxalate inhibits blood coagulation by forming insoluble complexes with calcium ions. Potassium oxalate may be used at a concentration of 1-2 mg/ml of blood. At concentrations above 3 mg/ml, oxalate may cause hemolysis.

Potassium or sodium oxalate: these compounds precipitate calcium and inhibit blood coagulation. Being more soluble, potassium oxalate (5-10 mg per 5 ml of blood) is preferred.

Potassium oxalate and sodium fluoride: these anticoagulants are employed for collecting blood to estimate glucose. In addition, sodium fluoride inhibits glycolysis and preserves the blood glucose concentration.

Ammonium oxalate and potassium oxalate: a mixture of these two compounds in the ratio 3:2 is used for blood collection to carry out certain hematological tests.

HEMOLYSIS

The rupture or lysis of RBCs releases cellular constituents that interfere with laboratory investigations. Therefore, utmost care should be taken to avoid hemolysis when plasma or serum is used for biochemical tests. Use of dry syringes, needles and containers, and allowing a slow flow of blood into the syringe, are among the important precautions to avoid hemolysis.

PRESERVATION OF BLOOD SPECIMENS

Plasma or serum should be separated within 2 hours of blood collection. It is ideal and advisable to analyse blood, plasma or serum immediately after specimen collection. This, however, may not always be possible. In such cases, the samples (usually plasma/serum) can be stored at 4°C until analysed. For enzyme analysis, the samples are preserved at -20°C.

 

Sampling Errors:

1. Blood sampling technique. Difficulty in obtaining a blood specimen may lead to haemolysis with consequent release of potassium and other red cell constituents. Results for these will be falsely elevated.

2. Prolonged stasis during venepuncture. Plasma water diffuses into the interstitial space and the serum or plasma sample obtained will be concentrated. Proteins and protein-bound components of plasma, such as calcium or thyroxine, will be falsely elevated.

3. Insufficient specimen. Each biochemical analysis requires a certain volume of specimen to enable the test to be carried out.

4. Errors in timing. The biggest source of error in the measurement of any analyte in a 24-hour urine specimen is in the collection of an accurately timed volume of urine.

5. Incorrect specimen container. For many analyses the blood must be collected into a container with anticoagulant and preservative. For example, samples for glucose should be collected into a special container containing fluoride, which inhibits glycolysis; otherwise the time taken to deliver the sample to the laboratory can affect the result.

6. Inappropriate sampling site. Blood samples should not be taken downstream of an intravenous drip. It is not unheard of for the laboratory to receive a blood glucose request on a specimen taken from the same arm into which 5% glucose is being infused.

7. Incorrect specimen storage. A blood sample stored overnight before being sent to the laboratory will show falsely high potassium, phosphate and red cell enzymes such as lactate dehydrogenase, because of leakage from the cells into the extracellular fluid.

Many hormones show a circadian rhythm. For example, ACTH peaks in the early morning and reaches its minimum in the afternoon, whereas growth hormone is highest during the night and lowest in the daytime. Many reference values are age-related; for example, urea and cholesterol levels are higher in geriatric patients. Exercise will increase the levels of transaminases and creatinine. Triglycerides should be measured in the fasting state. Caffeine (coffee and tea) will increase the levels of free fatty acids, glycerol, total lipids and glucose. Smoking will increase the levels of GH, cortisol and triglycerides.

COLLECTION OF URINE:

An early morning fasting specimen is generally the most concentrated specimen. Therefore, this is preferred for microscopic examination and for the detection of proteins, beta chorionic gonadotropin and other metabolites.  

Urine, containing the metabolic waste products of the body in water, is the most important excretory fluid. For biochemical investigations, urine can be collected as a single specimen or over 24 hours. Single specimens of urine, normally collected in the morning, are useful for qualitative tests, e.g. sugar and proteins. Twenty-four hour urine collections (e.g. from 8 AM to 8 AM) are employed for the quantitative estimation of certain urinary constituents, e.g. proteins, hormones, metabolites.

Depending on the test, either a random or a complete timed collection of urine is needed. The timed collection is obtained as follows:

1. Just before the collection period is due to start, the patient empties his/her bladder. This urine must be discarded.

2. Thereafter, from the start (e.g. at 8 AM) to the end of the collection period, all urine passed by the patient must be added to the container. If this contains preservative, the specimen must be mixed gently each time more urine is passed and added to the collection.

3. At the end of the period (e.g. 8 AM the next day, in the case of a 24-h collection), the patient empties his/her bladder. This urine must be included in the collection.

4. The period over which the collection was made must be recorded and written on the specimen container and the request form.

For large volumes, an aliquot (e.g. 25 mL) may be sent to the laboratory, but the complete specimen must first be mixed and its volume recorded on the container and the request form.
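Recording the total volume matters because a timed collection is ultimately reported as an amount excreted per 24 hours, calculated from the concentration measured in the aliquot and the recorded total volume. The short Python sketch below shows this arithmetic; the analyte and the figures used are illustrative only and are not taken from the text.

```python
def excretion_per_24h(concentration_per_litre: float, total_volume_ml: float) -> float:
    """Amount excreted over the timed period = concentration x total volume (in litres)."""
    return concentration_per_litre * (total_volume_ml / 1000.0)

# Illustrative figures: a urine protein of 0.5 g/L measured in the aliquot, with a
# recorded 24-h volume of 1500 mL, corresponds to 0.75 g of protein excreted per 24 h.
print(excretion_per_24h(0.5, 1500))  # 0.75
```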

Urine specimens tend to deteriorate unless the correct preservative is added from the start, or the specimen is refrigerated throughout the collection period. The changes include:

1. destruction of glucose by bacteria;

2. conversion of urea to ammonia by bacteria, with a fall in [H+] and precipitation of phosphates;

3. oxidation of urobilinogen to urobilin and porphobilinogen to porphyrins.

 

Preservatives for urine: preservatives are used (1) to reduce bacterial action; (2) to minimise chemical decomposition; and (3) to decrease atmospheric oxidation of unstable compounds. The most satisfactory form of preservation of a urine specimen is to refrigerate it during collection. Formalin, thymol, chloroform, toluene, concentrated HCl and glacial acetic acid are the commonly used urine preservatives.

For the collection of 24 hr urine samples, preservatives have to be used or else urine undergoes changes due to bacterial action. Hydrochloric acid, toluene, light petroleum, thymol, formalin etc., are among the common preservatives used.

 

Timed Urine Specimen

Usually, a urine sample is collected over a 24-hour period. This minimises the influence of short-term biological variations and diurnal rhythms. Generally, urine is collected from 6 AM to 6 AM the next day. The bladder should be emptied when the collection is started (6 AM), and this urine is discarded. Thereafter all the urine should be collected. The next day, urine is voided at 6 AM and this sample is also added to the collection.

 

 

CEREBROSPINAL FLUID:

CSF is a fluid of the nervous system. It is formed by a process of selective dialysis of plasma by the choroid plexuses of the ventricles of the brain. The total volume of CSF is 100-200 ml.

Collection of CSF: CSF is collected by lumbar puncture of an interspace between the 3rd and 5th lumbar vertebrae, under aseptic conditions and local anaesthesia.

Biochemical investigations on CSF: protein, glucose and chloride estimations are usually performed in the clinical biochemistry laboratory.

 

The interpretation of results:

Most biochemical analyses are quantitative. Many tests measure the amount of the analyte in a small volume of the sample (blood, plasma, serum, urine or some other fluid or tissue). The test results are commonly expressed in molar units. A mole of any compound always contains 6 × 10²³ molecules. Describing how much of an analyte is present in moles therefore indicates how many molecules of the substance are present. Molar units can be converted to mass units: one mole is the molecular weight of the substance in grams. Results are reported as concentrations, usually in terms of the number of moles in one litre (mol/l).

 

Molar units:

Millimole (mmol): 10⁻³ of a mole

Micromole (µmol): 10⁻⁶ of a mole

Nanomole (nmol): 10⁻⁹ of a mole

Picomole (pmol): 10⁻¹² of a mole

Femtomole (fmol): 10⁻¹⁵ of a mole

 

Enzymes are not usually expressed in moles but as enzyme activity in "units". Large molecules such as proteins are reported as grams or milligrams. Blood gas results (PCO2 or PO2) are expressed in kilopascals (kPa), the units in which partial pressures are measured.
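As a worked illustration of the relationship stated above (one mole is the molecular weight of the substance in grams), the minimal Python sketch below converts a plasma glucose concentration between mmol/L and mg/dL. The molecular weight of glucose (about 180 g/mol) is an assumed value, not given in the text.

```python
# Mass concentration = molar concentration x molecular weight.
GLUCOSE_MW = 180.0  # g/mol (assumed, approximate molecular weight of glucose)

def mmol_per_l_to_mg_per_dl(conc_mmol_l: float, mol_weight: float) -> float:
    """mmol/L x (mg per mmol) gives mg/L; divide by 10 to obtain mg/dL."""
    return conc_mmol_l * mol_weight / 10.0

def mg_per_dl_to_mmol_per_l(conc_mg_dl: float, mol_weight: float) -> float:
    """Inverse conversion: mg/dL x 10 gives mg/L; divide by the molecular weight."""
    return conc_mg_dl * 10.0 / mol_weight

# Example: a plasma glucose of 5.0 mmol/L corresponds to about 90 mg/dL.
print(mmol_per_l_to_mg_per_dl(5.0, GLUCOSE_MW))   # 90.0
print(mg_per_dl_to_mmol_per_l(90.0, GLUCOSE_MW))  # 5.0
```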

 

Biological factors affecting the interpretation of results:

1.     Sex of the patient. Reference ranges for some analytes such as serum creatinine are different for men and women.

2.     Age of the patient. There may be different reference ranges for neonates, children, adults and the elderly.

3.     Effect of diet. The sample may be inappropriate if taken when the patient is fasting or after a meal.

4.     Time when sample was taken. There may be variations during the day and night.

5.     Stress and anxiety.

6.     Posture of the patient. Redistribution of fluid may affect the result.

7.     Effects of exercise. Strenuous exercise can release enzymes from tissues.

8.     Medical history. Infection or tissue injury can affect biochemical values independently of the disease process being investigated.

9.     Pregnancy. This alters some reference ranges.

10. Menstrual cycle. Hormone measurement will vary through the menstrual cycle.

11. Drug history.

 

The clinician may well ask the following questions on receiving a biochemistry report:

1. Does the result fit in with what I expected on the basis of the clinical examination and history of the patient?

2. If the result is not what I expected, can I explain the discrepancy?

3. How can the result change my diagnosis or the way I am managing the patient?

4. What should I do next?

 

QUALITY CONTROL:

Quality control in the clinical biochemistry laboratory refers to the reliability of the investigative service. Any error in the laboratory may jeopardize the lives of patients. It is therefore of utmost importance that laboratory errors are identified and rectified.

Quality control comprises four interrelated factors, namely precision, accuracy, specificity and sensitivity.

Precision refers to the reproducibility of the result when the same sample is analysed on different occasions (replicate measurements) by the same person. For instance, the precision is good if the blood glucose level is 78, 80 and 82 mg/dl on replicates.

Accuracy means the closeness of the estimated result to the true value e.g., if true blood urea level is 50 mg/dl, the laboratory reporting 45 mg/dl is more accurate than the one reporting 35 mg/dl.

Specificity refers to the ability of the analytical method to specifically determine a particular parameter e.g., glucose can be specifically estimated by enzymatic glucose oxidase method.

Sensitivity deals with the ability of a particular method to detect small amounts of the measured constituent.

METHODS OF QUALITY CONTROL

1. Internal quality control refers to the analysis of the same stored sample on different days in a laboratory; the results should vary only within a narrow range.

2. External quality control deals with the analysis of a sample received from outside, usually from a national or regional quality control centre. The results obtained are then compared.

 

Accuracy

It is the closeness of a result to the true value. For example, suppose one technician performs a test on a serum which is known to contain 5.0 mmol/L glucose and obtains a result of 4.9 mmol/L, while a second technician does the same test on the same sample and gets 4.5 mmol/L. The value of the first technician is considered more accurate. Values farther from the true value are less accurate than those closer to it.

Precision

This refers to the reproducibility of the result. If one technician performs glucose analysis on the same sample on three different occasions and obtains 4 mmol/L, 3.9 mmol/L and 4.1 mmol/L, then the results have been reproduced very well and the precision is very good. Precision depends on the technique and the reagents, as well as on the technician.

Specificity

Specificity of a reaction denotes that only one substance responds to that particular test. For example, in the glucose oxidase method only glucose molecules are assayed, so it is a very specific method. But if the reducing property of glucose is utilised for the assay (e.g., the Nelson-Somogyi method), then other reducing agents in the blood will interfere in the reaction and the specificity is lowered. Specificity is determined by the method of analysis.

Sensitivity

It indicates whether the method can be used to test a very dilute solution. For example, the biuret method is used for solutions containing a few g of protein/dl, a spectrophotometric method is useful for detecting a few mg of protein/dl, while the ELISA method is employed if the solution contains only micrograms of protein/dl. Thus the ELISA method is the most sensitive. Generally speaking, as sensitivity increases, specificity decreases.
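A rough sketch of how precision and accuracy might be quantified for replicate measurements is shown below; it is not a prescribed laboratory procedure, and the glucose figures simply reuse the illustrative values given in the text above.

```python
# Sketch: precision as a coefficient of variation, accuracy as percentage bias.
from statistics import mean, stdev

def coefficient_of_variation(replicates):
    """Precision: relative spread of replicate results (CV %, lower = more precise)."""
    return stdev(replicates) / mean(replicates) * 100

def bias_percent(measured_mean, true_value):
    """Accuracy: deviation of the measured result from the accepted true value (%)."""
    return (measured_mean - true_value) / true_value * 100

replicates = [4.0, 3.9, 4.1]                              # mmol/L, from the precision example
print(round(coefficient_of_variation(replicates), 1))     # 2.5 %
print(round(bias_percent(4.9, 5.0), 1))                   # -2.0 % (first technician)
```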

Limit of Errors Allowable in Laboratory:

In a laboratory, error may not be totally avoidable, but it should be kept to a minimum. The limits are denoted by the term percentage error. The percentage of allowable error in an assay is given by the formula:

Percentage error = (Difference between maximum and minimum of the normal range) / (Mean of the normal range × 4) × 100

The percentage error therefore varies from test to test. For example, for blood glucose the normal range is 70 to 110 mg/dl. Substituting these values into the formula:

(110 - 70) / (90 × 4) × 100 ≈ 11.1%

For blood urea (normal range 20-40 mg/dl), the percentage error is:

(40 - 20) / (30 × 4) × 100 ≈ 16.7%
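The formula above is easy to mechanise; the short sketch below simply reproduces the two worked examples.

```python
# Sketch of the allowable-error formula:
# (max - min of normal range) / (mean of normal range * 4) * 100

def allowable_error_percent(range_min, range_max):
    mean_of_range = (range_min + range_max) / 2
    return (range_max - range_min) / (mean_of_range * 4) * 100

print(round(allowable_error_percent(70, 110), 1))   # blood glucose: ~11.1 %
print(round(allowable_error_percent(20, 40), 1))    # blood urea:    ~16.7 %
```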

 

 

Biochemical testing outside the laboratory:

The methods for measuring some biological compounds in blood and urine have become so robust and simple to use that measurements can be made away from the laboratory – at the patient's bedside, in the ward, in the home or even in the shopping centre.

Tests performed away from the laboratory:

The most common blood test outside the laboratory is the determination of glucose concentration in a finger-stab sample, at home or in the clinic. Diabetic patients who need to monitor their blood glucose on a regular basis can do so at home or at work using one of many commercially available pocket-sized instruments.

A portable bench analyser may be used to monitor patients' glucose and cholesterol; it is frequently used in many outpatient clinics and in screening centres.

Common tests on blood performed away from the laboratory

 

Analyte               Used when investigating               Category
Blood gases           Acid-base status                      A
Glucose               Diabetes mellitus                     A
Urea                  Renal disease                         A
Creatinine            Renal disease                         A
Bilirubin             Neonatal jaundice                     A
Therapeutic drugs     Compliance or toxicity                A
Salicylate            Detection of poisoning                A
Paracetamol           Detection of poisoning                A
Glucose               Diabetic monitoring                   B
Cholesterol           Coronary heart disease risk           B
Alcohol               Fitness to drive/confusion, coma      C

 

Common tests on urine performed away from the laboratory

 

Analyte                  Used when investigating         Category
Ketones                  Diabetic ketoacidosis           A
Protein                  Renal disease                   A
Red cells/haemoglobin    Renal disease                   A
Bilirubin                Liver disease and jaundice      A
Urobilinogen             Jaundice/haemolysis             A
pH                       Renal tubular acidosis          A
Glucose                  Diabetes mellitus               A
hCG                      Pregnancy test                  B

 

The tests commonly performed away from the laboratory can be categorized as follows:

A.   Tests performed in medical or nursing settings.

They clearly give valuable information and allow the practitioner to reassure the patient or family, or initiate further investigations or treatment.

B.    Tests performed in the home, shopping centre or clinical setting.

They can give valuable information when properly and appropriately used.

C.    Alcohol tests.

 

Manual vs automation in clinical laboratory

 

Automation in the clinical laboratory is a process by which analytical instruments perform many tests with the least involvement of an analyst. The International Union of Pure and Applied Chemistry (IUPAC) defines automation as “the replacement of human manipulative effort and facilities in the performance of a given process by mechanical and instrumental devices that are regulated by feedback of information so that an apparatus is self-monitoring or self-adjusting”.

No currently available clinical instrument fully meets this definition. However, the term ‘automation’ is applicable to individual steps in many analytical processes, and modern instrumentation, with more and more intelligence built into new generations of laboratory analyzers, is steadily approaching the IUPAC definition.

Automated instruments enable laboratories to process a much larger workload without a proportional increase in manpower. Automation in clinical laboratories has evolved from fixed automation, whereby an instrument performs a single repetitive task by itself, to programmable automation, which permits it to perform a variety of different tasks. Intelligent automation has recently been introduced into a few instruments or systems to enable them to self-monitor and respond appropriately to changing conditions. Compared with manual working, automation reduces the variability of results and analytical error by doing away with jobs that are repetitive and monotonous for an individual and that can lead to boredom or a casual attitude. However, the improved reproducibility attained by automation is not necessarily associated with improved accuracy of test results, since accuracy is mainly influenced by the analytical methods used. The significant improvement in the quality of laboratory tests in recent years is due to the combination of well-designed automated instrumentation with good analytical methods and effective quality assurance programs. Automation may initially incur high costs for procurement of the equipment, but it is economical in the long run because of the reduction in the manpower required to perform the tasks.

Automated analyzers are usually mechanized versions of basic manual laboratory techniques and procedures, and several ways have been developed for automating them. When initially introduced, automation mimicked manual test procedures and was applied to the tests requested most often, duplicating all the individual steps of each procedure. Analytical methods that are quicker and have fewer steps, as well as modifications of existing protocols, are now being developed as manufacturers integrate computer hardware and software into analyzers to provide automatic process control and data-processing capabilities.

 

Types of analyzers

 

Semi-auto analyzer: Here, the samples and reagents are mixed and read manually

Batch analyzer: The reagent mixture is prepared and fed automatically. One reagent is stored in the machine at a time, enabling one batch of a specific test to be conducted automatically, e.g. RA 100.

Random Access autoanalyzers: These analyzers can store more than one reagent. Samples are placed in the machine and the computer is programmed to carry out any number of selected tests on each sample e.g. Hitachi 912.


Figure-1: Semi-Autoanalyzer

 

 

 


Figure-2: Autoanalyzer

 

Some important techniques employed in clinical laboratory analysis are as follows:

(a) Optical Techniques

(b) Electrochemical Techniques

(c) Chromatography

(d) Electrophoresis

Optical Techniques

Analytical techniques that make use of the light spectrum, either at a specific wavelength or across the visible spectrum, can be collectively referred to as optical techniques. The major optical techniques used in the clinical laboratory are microscopy and spectrophotometry.


 

Many quantitative biochemical analyses done in the clinical laboratory are based on measurements of radiant energy (light) emitted, transmitted, absorbed, scattered or reflected when the substance being measured interacts with incident light under controlled conditions. Techniques for measuring such radiant energy are termed spectrophotometric techniques.

The specific spectrophotometric technique (or instrument design) depends on whether the interaction between the incident light and the substance being measured results in absorption (or transmission), reflection or scattering of light.

In the colorimetric method (an example of a spectrophotometric technique), light of a specific wavelength is passed through a solution whose concentration is to be determined, and the amount of light absorbed by the solution (the absorbance) is measured. A known standard solution of the substance being measured is treated in the same way and its absorbance measured; the concentration of the test solution is then derived by simple proportion. The specific wavelength of light passed through the solution depends on the colour of the test solution: the complementary colour of the solution is used, e.g.

 

 

Colour of solution          Complementary colour of light
Blue                        Yellow (e.g. 450 nm)
Bluish-green                Red (630 nm)

 

The basic principle of the colorimetric technique is the Beer-Lambert law, which states that when a specific wavelength of light (monochromatic light) passes through a coloured solution, the amount of light absorbed is directly proportional to the concentration of the solution (the intensity of the colour) and to the path length through the solution. The biochemical substance or analyte (e.g. glucose, cholesterol) to be measured in a clinical specimen (body fluid) is allowed to react specifically with chemical agent(s) to form a coloured product in solution. The absorbance of the coloured solution formed is measured using a spectrophotometer (an instrument used to measure the amount of light absorbed or transmitted by substances in solution). Since the absorbance of a substance in solution is directly proportional to its concentration, the concentration of the substance of interest is calculated from its absorbance and the absorbance of a standard solution treated in the same way.
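A minimal sketch of the colorimetric calculation described above is shown below: because absorbance is proportional to concentration (Beer-Lambert law), the unknown concentration is obtained by simple proportion against a standard treated in the same way. The absorbance readings and the standard concentration are illustrative assumptions.

```python
# Sketch of the colorimetric (Beer-Lambert) calculation against a standard.

def test_concentration(abs_test, abs_standard, conc_standard):
    """C_test = (A_test / A_standard) * C_standard."""
    return abs_test / abs_standard * conc_standard

# Hypothetical glucose assay: standard of 5.5 mmol/L read alongside the test sample
print(round(test_concentration(0.42, 0.35, 5.5), 2))   # ~6.6 mmol/L
```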


Picture of a Spectrophotometer


Picture of an Atomic Emission Flame Photometer

 

Electrophoretic Techniques

Electrophoresis is a method of separating mixtures based on the differential rates of movement of charged particles subjected to an electric field at a specific pH. In the clinical laboratory, electrophoresis is typically used for the separation of proteins. It is primarily a qualitative method of analysis, but it can be adapted for quantitative analysis.

Principle of Electrophoretic Separation of Proteins

Proteins in serum vary in their iso-electric points. The iso-electric point of a protein is the pH at which there is no net charge (zero charge) on the protein particles. At a pH alkaline to its iso-electric point, a protein carries a net negative charge and therefore migrates to the anode when a current is applied, whereas at a pH acidic to its iso-electric point it carries a net positive charge and migrates to the cathode.

The iso-electric points of serum proteins vary from 4.7 (albumin) to 7.3 (gamma globulin).

Hence, at a buffered pH of, say, 8.6, each protein fraction will migrate at a different rate when subjected to an electric field.
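The charge rule described above can be summarised in a small sketch: at a buffer pH above its iso-electric point a protein is net negative and moves toward the anode, and below its pI it is net positive and moves toward the cathode. The pI values and the 8.6 buffer pH are those quoted in the text; the function itself is only an illustration.

```python
# Sketch: predicting the direction of electrophoretic migration from the
# iso-electric point (pI) of a protein and the buffer pH.

def migration_direction(isoelectric_point, buffer_ph=8.6):
    if buffer_ph > isoelectric_point:
        return "toward the anode (net negative charge)"
    if buffer_ph < isoelectric_point:
        return "toward the cathode (net positive charge)"
    return "no net migration (pH = pI)"

print(migration_direction(4.7))   # albumin: anode, relatively fast (large net charge)
print(migration_direction(7.3))   # gamma globulin: anode, but more slowly
```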


Electrophoresis can be used to detect or identify an abnormal protein present in plasma as a result of disease. For example, in multiple myeloma an abnormal protein called Bence Jones protein can be detected by electrophoresis. Determination of the haemoglobin genotype of an individual is also done using this technique.

 

Chromatographic Techniques

Chromatography is a method of separating mixtures which utilizes the differential affinities of the substances in the mixture for a mobile phase and a stationary phase, over which the substances to be separated are distributed. A mobile phase may be a gas or a liquid (solvent) in which the mixture is dissolved, while a stationary phase is either a solid or a liquid supported (stationed) on a solid matrix, over which the mobile phase carries the mixture. Substances in the mixture that have greater affinity for the mobile phase are separated first, in the order of their affinities for the mobile phase; constituents that have greater affinity for the stationary phase are separated much later in the process. As the mobile phase carries the mixture over the stationary phase (like an effluent), the separated constituents are collected as different fractions, which can then be identified and quantified. A chromatographic technique is typically named after its mobile and stationary phases (e.g. gas-liquid chromatography) or after its working principle (e.g. ion-exchange chromatography). Others include:

· Thin layer chromatography

· Molecular sieve chromatography

· High performance liquid chromatography

 

Centrifugation

Centrifugation is a process that uses centrifugal force for the separation of mixtures. In the clinical laboratory setting, the major uses of centrifugation are as follows:

(1) Separation of plasma, serum and red cells from whole blood, when a particular fraction of the blood is needed for tests.

(2) Acquisition of urine sediment for microbiological examination

(3) Any laboratory procedure (test) that requires separation of a particular fraction of a suspension.

Particles or cells in a liquid medium (suspension) will eventually settle at the bottom of the container due to gravity; however, the length of time required for such separation may be long. When a suspension is rotated at a certain speed (revolutions per minute), centrifugal force causes the particulates to move away from the axis of rotation and settle at the bottom of the container as a precipitate. The remaining liquid is called the supernate or supernatant. The equipment used for this process is called a centrifuge.
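To make the relationship between rotation speed and sedimenting force concrete, the sketch below uses the commonly quoted formula RCF = 1.118 × 10⁻⁵ × radius (cm) × rpm²; this formula is not stated in the text above, and the radii and speeds are illustrative assumptions rather than specifications of any particular centrifuge.

```python
# Sketch: relative centrifugal force (RCF, in multiples of g) from rotor radius and rpm.

def relative_centrifugal_force(radius_cm, rpm):
    return 1.118e-5 * radius_cm * rpm ** 2

print(round(relative_centrifugal_force(10, 3000)))    # e.g. bench centrifuge: ~1006 x g
print(round(relative_centrifugal_force(8, 12000)))    # e.g. microhaematocrit: ~12879 x g
```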

There are basically two types of centrifuge used in clinical laboratory:

1. Bench centrifuge

2. Microhaematocrit centrifuge


Picture of a Microhaematocrit Centrifuge


Picture of a Bench Centrifuge

 

CLINICAL INTERPRETATION OF LABORATORY TESTS IN HEMATOLOGY – I

CLINICAL INTERPRETATION OF LABORATORY TESTS IN HEMATOLOGY – II

 

Blood constitutes 6 to 8 percent of total body weight. In terms of volume, women have 4.5 to 5.5 L of blood and men 5 to 6 L. In infants and children, blood volume is 50 to 75 mL/kg in girls and 52 to 83 mL/kg in boys. The principal functions of blood are the transport of oxygen, nutrients, and hormones to all tissues and the removal of metabolic wastes to the organs of excretion. Additional functions of blood are (1) regulation of temperature by transfer of heat to the skin for dissipation by radiation and convection, (2) regulation of the pH of body fluids through the buffer systems and facilitation of excretion of acids and bases, and (3) defense against infection by transportation of antibodies and other substances as needed. Blood consists of a fluid portion, called plasma, and a solid portion that includes red blood cells (erythrocytes), white blood cells (leukocytes), and platelets (thrombocytes). Plasma makes up 45 to 60 percent of blood volume and is composed of water (90 percent), amino acids, proteins, carbohydrates, lipids, vitamins, hormones, electrolytes, and cellular wastes.


Hematology is traditionally limited to the study of the cellular elements of the blood, the production of these elements, and the physiological derangements that affect their functions.

 


Hemopoiesis

 

Hemopoiesis is the process of blood cell formation and development.

There are two kinds of hemopoiesis: embryonic and postembryonic. Organs of embryonic hemopoiesis: 1. In the first few weeks of gestation the yolk sac is the main site of hemopoiesis. 2. Liver. 3. Spleen. 4. Lymph nodes. 5. Thymus. 6. Bone marrow – from the 6th month it is the central site of hemopoiesis. Organs of postembryonic hemopoiesis: 1. Bone marrow (vertebrae, ribs, sternum, skull, sacrum and pelvis, proximal ends of the femur). 2. Spleen. 3. Lymph nodes.

All blood cells derive from a common stem cell.


 

 


Under the influence of local and humoral factors, stem cells differentiate into different cell lines. Erythropoiesis and thrombopoiesis proceed independently once the stem cell stage has been passed, whereas monocytopoiesis and granulocytopoiesis are quite closely “related.” Lymphocytopoiesis is the most independent among the remaining cell series. Granulocytes, monocytes, and lymphocytes are collectively called leukocytes (white blood cells), a term that has been retained since the days before staining methods were available, when the only distinction that could be made was between erythrocytes (red blood cells) and the rest. All these cells are eukaryotic, that is, they are made up of a nucleus, sometimes with visible nucleoli, surrounded by cytoplasm, which may include various kinds of organelles, granulations, and vacuoles. Despite the common origin of all the cells, ordinary light microscopy reveals fundamental and characteristic differences in the nuclear chromatin structure in the different cell series and their various stages of maturation.


 

The developing cells in the granulocyte series (myeloblasts and promyelocytes), for example, show a delicate, fine “net-like” (reticular) structure. Careful microscopic examination (using fine focus adjustment to view different depth levels) reveals a detailed nuclear structure that resembles fine or coarse gravel. With progressive stages of nuclear maturation in this series (myelocytes, metamyelocytes, and band or staff cells), the chromatin condenses into bands or streaks, giving the nucleus— which at the same time is adopting a characteristic curved shape—a spotted and striped pattern. Lymphocytes, on the other hand—particularly in their circulating forms—always have large, solid-looking nuclei. Like cross-sections through geological slate, homogeneous, dense chromatin bands alternate with lighter interruptions and fissures. Each of these cell series contains precursors that can divide (blast precursors) and mature or almost mature forms that can no longer divide; the morphological differences between these correspond not to steps in mitosis, but result from continuous “maturation processes” of the cell nucleus and cytoplasm. Once this is understood, it becomes easier not to be too rigid about morphological distinctions between certain cell stages. The blastic precursors usually reside in the hematopoietic organs (bone marrow and lymph nodes). Since, however, a strict blood–bone marrow barrier does not exist (blasts are kept out of the bloodstream essentially only by their limited plasticity, i.e., their inability to cross the diffusion barrier into the bloodstream), it is in principle possible for any cell type to be found in peripheral blood, and when cell production is increased, the statistical frequency with which they cross into the bloodstream will naturally rise as well. Conventionally, cells are sorted left to right from immature to mature, so an increased level of immature cells in the bloodstream causes a “left shift” in the composition of a cell series—although it must be said that only in the precursor stages of granulopoiesis are the cell morphologies sufficiently distinct for this left shift to show up clearly.

 

 

Blood Cell functions.

 


Neutrophil granulocytes with segmented nuclei serve mostly to defend against bacteria. Predominantly outside the vascular system, in “inflamed” tissue, they phagocytose and lyse bacteria. The blood merely transports the granulocytes to their site of action.

The function of eosinophilic granulocytes is defense against parasites; they have a direct cytotoxic action on parasites and their eggs and larvae. They also play a role in the down-regulation of anaphylactic shock reactions and autoimmune responses, thus controlling the influence of basophilic cells.

The main function of basophilic granulocytes and their tissue-bound equivalents (tissue mast cells) is to regulate circulation through the release of substances such as histamine, serotonin, and heparin. These tissue hormones increase vascular permeability at the site of various local antigen activity and thus regulate the influx of the other inflammatory cells.

The main function of monocytes is the defense against bacteria, fungi, viruses, and foreign bodies. Defensive activities take place mostly outside the vessels by phagocytosis. Monocytes also break down endogenous cells (e.g., erythrocytes) at the end of their life cycles, and they are assumed to perform a similar function in defense against tumors. Outside the bloodstream, monocytes develop into histiocytes; macrophages in the endothelium of the body cavities; epithelioid cells; foreign body macrophages (including Langhans’ giant cells); and many other cells.

Lymphocytes are divided into two major basic groups according to function. Thymus-dependent T-lymphocytes, which make up about 70% of lymphocytes, provide local defense against antigens from organic and inorganic foreign bodies in the form of delayed-type hypersensitivity, as classically exemplified by the tuberculin reaction. T-lymphocytes are divided into helper cells and suppressor cells. The small group of NK (natural killer) cells, which have a direct cytotoxic function, is closely related to the T-cell group.

The other group is the bone-marrow-dependent B-lymphocytes or B cells, which make up about 20% of lymphocytes. Through their development into immunoglobulin-secreting plasma cells, B-lymphocytes are responsible for the entire humoral side of defense against viruses, bacteria, and allergens.

Erythrocytes are the oxygen carriers for all oxygen-dependent metabolic reactions in the organism. They are the only blood cells without nuclei, since this allows them to bind and exchange the greatest number of O2 molecules. Their physiological biconcave disk shape with a thick rim provides optimal plasticity.

Thrombocytes form the aggregates that, along with humoral coagulation factors, close up vascular lesions. During the aggregation process, in addition to the mechanical function, thrombocytic granules also release  factors that promote coagulation. Thrombocytes develop from polyploid megakaryocytes in the bone marrow. They are the enucleated, fragmented cytoplasmic portions of these progenitor cells.

COMPLETE BLOOD COUNT

 

A CBC includes (1) enumeration of the cellular elements of the blood, (2) evaluation of RBC indices, and (3) determination of cell morphology by means of stained smears.

INDICATIONS FOR A COMPLETE BLOOD COUNT

Because the CBC provides much information about the overall health of the individual, it is an essential component of a complete physical examination, especially when performed on admission to a health-care facility or before surgery. Other indications for a CBC are as follows:

1.    Suspected hematologic disorder, neoplasm, or immunologic abnormality

2.    History of hereditary hematologic abnormality

3.    Suspected infection (local or systemic, acute or chronic)

4.    Monitoring effects of physical or emotional stress

5.    Monitoring desired responses to drug therapy and undesired reactions to drugs that may cause blood dyscrasias

6.    Monitoring progression of nonhematologic disorders such as chronic obstructive pulmonary disease, malabsorption syndromes, malignancies, and renal disease

Taking Blood Samples

Blood should always be drawn at about the same time of day and after at least eight hours of fasting, since both circadian rhythm and nutritional status can affect the findings. If strictly comparable values are required, there should also be half an hour of bed rest before the sample is drawn, but this is only practicable in a hospital setting. In other settings (e.g., outpatient clinics), bringing portable instruments to the relaxed, seated patient works well.

A sample of capillary blood may be taken when there are no further tests that would require venous access for a larger sample volume. A well perfused fingertip or an earlobe is ideal; in newborns or young infants, the heel is also a good site. If the circulation is poor, the blood flow can be increased by warming the extremity by immersing it in warm water. Without pressure, the puncture area is swabbed several times with 70% alcohol, and the skin is then punctured firmly but gently with a sterile disposable lancet. The first droplet of blood is discarded because it may be contaminated, and the ensuing blood is drawn into the pipette (see below). Care should be taken not to exert pressure on the tissue from which the blood is being drawn, because this too can change the cell composition of the sample.

General blood analysis (normal values)

1.   Erythrocytes (red blood cells): Male 4.0–5.1 × 10¹²/L; Female 3.7–4.7 × 10¹²/L

2.   Hemoglobin: Male 130–160 g/L; Female 120–140 g/L

3.   Hematocrit: Male 40–48 %; Female 36–42 %

4.   Reticulocytes: 0.5–1 %

5.   Platelets: 180–320 × 10⁹/L

6.   ESR (erythrocyte sedimentation rate): Male 1–10 mm/hour; Female 2–15 mm/hour

7.   Leucocytes: 4–9 × 10⁹/L

Neutrophilic band granulocytes – 1–6 %

Neutrophilic segmented granulocytes – 45–72 %

Eosinophilic granulocytes – 0.5–5 %

Basophilic granulocytes – 0–1 %

Monocytes – 3–11 %

Lymphocytes – 19–37 %
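A result is normally interpreted against reference ranges such as those listed above. The small sketch below flags a hemoglobin value against the sex-specific ranges given in this text; the dictionary name and the sample value are illustrative assumptions.

```python
# Sketch: flagging a hemoglobin result against the sex-specific reference ranges above (g/L).

REFERENCE_HEMOGLOBIN = {"male": (130, 160), "female": (120, 140)}   # g/L, from the list above

def flag_hemoglobin(value_g_l, sex):
    low, high = REFERENCE_HEMOGLOBIN[sex]
    if value_g_l < low:
        return "below reference range"
    if value_g_l > high:
        return "above reference range"
    return "within reference range"

print(flag_hemoglobin(112, "female"))   # below reference range
```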

 

Erythrocyte Parameters

The quality of erythrocytes is characterized by:

 

1.    MCV (mean corpuscular volume): Male 80–94 µm³ (fL); Female 81–99 µm³ (fL)

2.    MCH (mean corpuscular hemoglobin): 27–31 pg

3.    MCHC (mean corpuscular hemoglobin concentration): 33–37 % (20.4–22.9 mmol/L)

4.    Red cell distribution width (RDW): 11.5–14.5 %

 

MCV indicates the average volume of each RBC, MCH is the weight of the Hgb in each RBC, and MCHC is the proportion of Hgb contained in each RBC. MCHC is a valuable indicator of Hgb deficiency and of the oxygen-carrying capacity of the individual erythrocyte. A cell of abnormal size, abnormal shape, or both may contain an inadequate proportion of Hgb. RBC indices are used mainly in identifying and classifying types of anemias. Anemias are generally classified according to RBC size and Hgb content. Cell size is indicated by the terms normocytic, microcytic, and macrocytic. Hemoglobin content is indicated by the terms normochromic, hypochromic, and hyperchromic.
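The indices are derived from the measured Hgb, Hct and RBC count. The sketch below uses the standard formulas for this derivation; these formulas are not stated explicitly in this text, so treat them as an assumption, and the sample values are illustrative.

```python
# Sketch: deriving the red cell indices from Hgb (g/dL), Hct (%) and RBC count (10^6/uL).

def rbc_indices(hgb_g_dl, hct_percent, rbc_millions_per_ul):
    mcv = hct_percent * 10 / rbc_millions_per_ul        # mean corpuscular volume, fL
    mch = hgb_g_dl * 10 / rbc_millions_per_ul           # mean corpuscular hemoglobin, pg
    mchc = hgb_g_dl * 100 / hct_percent                 # mean corpuscular Hgb concentration, %
    return mcv, mch, mchc

mcv, mch, mchc = rbc_indices(hgb_g_dl=15.0, hct_percent=45.0, rbc_millions_per_ul=5.0)
print(round(mcv), round(mch), round(mchc, 1))           # 90 fL, 30 pg, 33.3 %
```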

 

ERYTHROCYTE (RBC) COUNT

 

The erythrocyte (RBC) count, a component of the CBC, is the determination of the number of RBCs per cubic millimeter. In international units, this is expressed as the number of RBCs per liter of blood. The test is less significant by itself than it is in computing Hgb, Hct, and RBC indices. Many factors influence the level of circulating erythrocytes. Decreased numbers are seen in disorders involving impaired erythropoiesis, excessive blood cell destruction (e.g., hemolytic anemia), and blood loss, and in chronic inflammatory diseases. A relative decrease also may be seen in situations with increased body fluid in the presence of a normal number of RBCs (e.g., pregnancy). Increases in the RBC count are most commonly seen in polycythemia vera, chronic pulmonary disease with hypoxia and secondary polycythemia, and dehydration with hemoconcentration. Excessive exercise, anxiety, and pain also produce higher RBC counts.

HEMATOCRIT

 

Blood consists of a fluid portion (plasma) and a solid portion that includes RBCs, WBCs, and platelets. More than 99 percent of the total blood cell mass is composed of RBCs. The Hct or packed RBC volume measures the proportion of RBCs in a volume of whole blood and is expressed as a percentage. Several methods can be used to perform the test. In the classic method, anticoagulated venous blood is pipetted into a tube 100 mm long and then centrifuged for 30 minutes so that the plasma and blood cells separate. The volumes of packed RBCs and plasma are read directly from the millimeter marks along the side of the tube. In the micromethod, venous or capillary blood is used to fill a small capillary tube, which is then centrifuged for 4 to 5 minutes. The proportions of plasma and RBCs are determined by means of a calibrated reading device. Both techniques allow visual estimation of the volume of WBCs and platelets. With the newer, automated methods of cell counting, the Hct is calculated indirectly as the product of the RBC count and mean cell volume. Although this method is generally quite accurate, certain clinical situations may cause errors in interpreting the Hct. Abnormalities in RBC size and extremely elevated WBC counts may produce false Hct values. Elevated blood glucose and sodium may produce elevated Hct values because of the resultant swelling of the erythrocyte. Normally, the Hct parallels the RBC count. Thus, factors influencing the RBC count also affect the results of the Hct.

 

HEMOGLOBIN

 

Hemoglobin is the main intracellular protein of the RBC. Its primary function is to transport oxygen to the cells and to remove carbon dioxide from them for excretion by the lungs. The Hgb molecule consists of two main components: heme and globin. Heme is composed of the red pigment porphyrin and iron, which is capable of combining loosely with oxygen. Globin is a protein that consists of nearly 600 amino acids organized into four polypeptide chains. Each chain of globin is associated with a heme group. Each RBC contains approximately 250 million molecules of hemoglobin, with some erythrocytes containing more hemoglobin than others. The oxygen-binding, -carrying, and -releasing capacity of Hgb depends on the ability of the globin chains to shift position normally during the oxygenation–deoxygenation process. Structurally abnormal chains that are unable to shift normally have decreased oxygen-carrying ability. This decreased oxygen transport capacity is characteristic of anemia. Hemoglobin also functions as a buffer in the maintenance of acid–base balance. During transport, carbon dioxide (CO2) reacts with water (H2O) to form carbonic acid (H2CO3). This reaction is speeded by carbonic anhydrase, an enzyme contained in RBCs. The carbonic acid rapidly dissociates to form hydrogen ions (H+) and bicarbonate ions (HCO3-). The hydrogen ions combine with the Hgb molecule, thus preventing a buildup of hydrogen ions in the blood. The bicarbonate ions diffuse into the plasma and play a role in the bicarbonate buffer system. As bicarbonate ions enter the plasma, chloride ions (Cl-) move into the erythrocyte. This “chloride shift” maintains the electrical balance between RBCs and plasma.

Hemoglobin determinations are of greatest use in the evaluation of anemia, because the oxygen-carrying capacity of the blood is directly related to the Hgb level rather than to the number of erythrocytes. To interpret results accurately, the Hgb level must be determined in combination with the Hct level. Normally, Hgb and Hct levels parallel each other and are commonly used together to express the degree of anemia. The combined values are also useful in evaluating situations involving blood loss and related treatment. The Hct level is normally three times the Hgb level. If erythrocytes are abnormal in shape or size or if Hgb manufacture is defective, the relationship between Hgb and Hct is disproportionate.
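The "rule of three" mentioned above (Hct is normally about three times the Hgb in g/dL) can be expressed as a simple consistency check; the tolerance used below is an assumed illustrative value, not a laboratory standard.

```python
# Sketch: flagging a disproportionate Hgb/Hct pair (Hgb in g/dL, Hct in %).

def hgb_hct_disproportionate(hgb_g_dl, hct_percent, tolerance=3.0):
    expected_hct = 3 * hgb_g_dl          # rule of three described in the text
    return abs(hct_percent - expected_hct) > tolerance

print(hgb_hct_disproportionate(15.0, 45.0))   # False (proportionate)
print(hgb_hct_disproportionate(10.0, 38.0))   # True  (disproportionate)
```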

STAINED RED BLOOD CELL EXAMINATION

 

The stained RBC examination (RBC morphology) involves examination of RBCs under a microscope. It is usually performed to compare the actual appearance of the cells with the calculated values for RBC indices. Cells are examined for abnormalities in color, size, shape, and contents. The test is performed by spreading a drop of fresh anticoagulated blood on a glass slide. The addition of stain to the specimen is used to enhance RBC characteristics.

 

Red Blood Cell Abnormalities Seen on Stained Smear

Macrocytosis
Observation: Cell diameter > 8 µm; MCV > 95 µm³
Significance: Megaloblastic anemias; severe liver disease; hypothyroidism

Microcytosis
Observation: Cell diameter < 6 µm; MCV < 80 µm³; MCHC < 27
Significance: Iron-deficiency anemia; thalassemias; anemia of chronic disease

Hypochromia
Observation: Increased zone of central pallor
Significance: Diminished Hgb content

Hyperchromia
Observation: Microcytic, hyperchromic cells
Significance: Increased bone marrow stores of iron; chronic inflammation; defect in ability to use iron for Hgb synthesis

Polychromatophilia
Observation: Presence of red cells not fully hemoglobinized
Significance: Reticulocytosis

Poikilocytosis
Observation: Variability of cell shape
Significance: Sickle cell disease; microangiopathic hemolysis; leukemias; extramedullary hematopoiesis; marrow stress of any cause

Anisocytosis
Observation: Variability of cell size
Significance: Reticulocytosis; transfusing normal blood into a microcytic or macrocytic cell population

Leptocytosis
Observation: Hypochromic cells with a small central zone of Hgb ("target cells")
Significance: Thalassemias; obstructive jaundice

Spherocytosis
Observation: Cells with no central pallor, loss of biconcave shape; MCHC high
Significance: Loss of membrane relative to cell volume; hereditary spherocytosis; accelerated red blood cell destruction by the reticuloendothelial system

Schistocytosis
Observation: Presence of cell fragments in the circulation
Significance: Increased intravascular mechanical trauma; microangiopathic hemolysis

Acanthocytosis
Observation: Irregularly spiculated cell surface
Significance: Irreversibly abnormal membrane lipid content; liver disease; abetalipoproteinemia

Echinocytosis
Observation: Regularly spiculated cell surface
Significance: Reversible abnormalities of membrane lipids; high plasma free fatty acids; bile acid abnormalities; effects of barbiturates, salicylates, and so on

Stomatocytosis
Observation: Elongated, slit-like zone of central pallor
Significance: Hereditary defect in membrane sodium metabolism; severe liver disease

Elliptocytosis
Observation: Oval cells
Significance: Hereditary anomaly, usually harmless

 

Types of Abnormal Red Blood Cell Inclusions and Their Causes

 

Heinz bodies (denatured Hgb)
Causes: Thalassemia; G-6-PD deficiency; hemolytic anemias; methemoglobinemia; splenectomy; drugs: analgesics, antimalarials, antipyretics, nitrofurantoin (Furadantin), nitrofurazone (Furacin), phenylhydrazine, sulfonamides, tolbutamide, vitamin K (large doses)

Basophilic stippling (residual cytoplasmic RNA)
Causes: Anemia caused by liver disease; lead poisoning; thalassemia

Howell-Jolly bodies (fragments of residual DNA)
Causes: Splenectomy; intense or abnormal RBC production resulting from hemolysis or inefficient erythropoiesis

Cabot’s rings (composition unknown)
Causes: Same as for Howell-Jolly bodies

Siderotic granules (iron-containing granules)
Causes: Abnormal iron metabolism; abnormal hemoglobin manufacture

 

 

 

 

OSMOTIC FRAGILITY

The osmotic fragility test determines the ability of the RBC membrane to resist rupturing in a hypotonic saline solution. Normal disk-shaped cells can imbibe water and swell significantly before membrane capacity is exceeded, but spherocytes (RBCs that lack the normal biconcave shape) and cells with damaged membranes burst in saline solutions only slightly less concentrated than normal saline. Conversely, in thalassemia, sickle cell disease, and certain other disorders, the cells can take up more water than normal before rupturing, so their fragility is decreased. The test is performed by exposing RBCs to increasingly dilute saline solutions, and the concentration of the solution at which the cells swell and rupture is noted. Normal erythrocytes rupture in saline solutions of 0.30 to 0.45 percent. RBC rupture in solutions of greater than 0.50 percent saline indicates increased fragility. Lack of rupture in solutions of less than 0.30 percent saline indicates decreased RBC fragility.
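A minimal sketch of how the end-point might be interpreted, using only the saline thresholds quoted above, is shown below; the input is assumed to be the highest saline concentration at which rupture is observed.

```python
# Sketch: interpreting the osmotic fragility end-point against the thresholds in the text.

def interpret_osmotic_fragility(rupture_saline_percent):
    if rupture_saline_percent > 0.50:
        return "increased fragility (e.g. spherocytosis)"
    if rupture_saline_percent < 0.30:
        return "decreased fragility (e.g. thalassemia, sickle cell disease)"
    return "normal fragility (rupture within the 0.30-0.45 % range)"

print(interpret_osmotic_fragility(0.55))   # increased fragility
print(interpret_osmotic_fragility(0.25))   # decreased fragility
```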

 

Causes of Altered Erythrocyte Osmotic Fragility

 

Decreased fragility: Iron-deficiency anemias; hereditary anemias (sickle cell, hemoglobin C, thalassemias); liver diseases; polycythemia vera; splenectomy; obstructive jaundice

Increased fragility: Hereditary spherocytosis; hemolytic anemias; autoimmune anemias; burns; toxins (bacterial, chemical); hypotonic infusions; transfusion with incompatible blood; mechanical trauma to RBCs (prosthetic heart valves, disseminated intravascular clotting, parasites); enzyme deficiencies (pyruvate kinase, G-6-PD)

 

 

 

 

ERYTHROCYTE SEDIMENTATION RATE

The erythrocyte sedimentation rate (ESR or sed rate) measures the rate at which RBCs in anticoagulated blood settle to the bottom of a calibrated tube. In normal blood, relatively little settling occurs because the gravitational pull on the RBCs is almost balanced by the upward force exerted by the plasma. If the plasma is extremely viscous or if cholesterol levels are very high, the upward force may virtually neutralize the downward pull on the RBCs. In contrast, anything that encourages RBCs to aggregate or stick together increases the rate of settling. Inflammatory and necrotic processes, for example, cause an alteration in blood proteins that results in clumping together of RBCs because of surface attraction. These clumps are called rouleaux. If the proportion of globulin to albumin increases or if fibrinogen levels are especially high, rouleaux formation is enhanced and the sed rate increases.

 

Causes of Altered Erythrocyte Sedimentation Rates

 

 

 

Increased rate: Pregnancy (uterine and ectopic); toxemia of pregnancy; collagen disorders (immune disorders of connective tissue); inflammatory disorders; infections; acute myocardial infarction; most malignancies; drugs (oral contraceptives, dextran, penicillamine, methyldopa, procainamide, theophylline, vitamin A); severe anemias; myeloproliferative disorders; renal disease (nephritis); hepatic cirrhosis; thyroid disorders; acute heavy metal poisoning

Decreased rate: Polycythemia vera; congestive heart failure; sickle cell and Hgb C disease; degenerative joint disease; cryoglobulinemia; drug toxicity (salicylates, quinine derivatives, adrenal corticosteroids)

 

 

 

 

 

 

WHITE BLOOD CELL COUNT

The WBC count determines the number of leukocytes per cubic millimeter of whole blood. The counting is performed very rapidly by electronic devices. The WBC count may be performed as part of a CBC, alone, or with a differential WBC count. An elevated WBC count is termed leukocytosis; a decreased count, leukopenia. In addition to the normal physiological variations in WBC count, many pathological problems may result in an abnormal WBC count.
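When a differential count is reported as percentages, the absolute count of each cell type is obtained by multiplying the percentage by the total WBC count. The sketch below illustrates this conversion; the total WBC value and the particular percentages chosen are illustrative (the percentages lie within the normal ranges listed earlier), not patient data from this text.

```python
# Sketch: converting a relative (percentage) differential into absolute counts per litre.

def absolute_counts(wbc_per_litre, differential_percent):
    return {cell: wbc_per_litre * pct / 100 for cell, pct in differential_percent.items()}

differential = {"neutrophils (segmented)": 60, "lymphocytes": 30,
                "monocytes": 6, "eosinophils": 3, "basophils": 1}   # percent of WBC
for cell, count in absolute_counts(6.0e9, differential).items():
    print(f"{cell}: {count:.2e} /L")
```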

 

Causes of Altered White Blood Cell Differential by Cell Type

 

Neutrophils
Increased levels: Stress (allergies, exercise, childbirth, surgery); extremes of temperature; acute hemorrhage or hemolysis; infectious diseases; inflammatory disorders (rheumatic fever, gout, rheumatoid arthritis, drug reactions, vasculitis, myositis); tissue necrosis (burns, crushing injuries, abscesses); malignancies; metabolic disorders (uremia, eclampsia, diabetic ketoacidosis, thyroid crisis, Cushing’s syndrome); drugs (epinephrine, histamine, lithium, heavy metals, heparin, digitalis, ACTH); toxins and venoms (turpentine, benzene); leukemia (myelocytic)
Decreased levels: Bone marrow depression (viruses, toxic chemicals, overwhelming infection, Felty’s syndrome, Gaucher’s disease, myelofibrosis, hypersplenism, pernicious anemia, radiation); anorexia nervosa, starvation, malnutrition; folic acid deficiency; vitamin B12 deficiency; acromegaly; Addison’s disease; thyrotoxicosis; anaphylaxis; disseminated lupus erythematosus; drugs (alcohol, phenylbutazone [Butazolidin], phenacetin, penicillin, chloramphenicol, streptomycin, phenytoin [Dilantin], mephenytoin [Mesantoin], phenacemide [Phenurone], tripelennamine [PBZ], aminophylline, quinine, chlorpromazine, barbiturates, dinitrophenols, sulfonamides, antineoplastics)

Bands
Increased levels: Infections; antineoplastic drugs; any condition that causes neutrophilia
Decreased levels: None, as bands should be absent or present only in small numbers

Basophils
Increased levels: Leukemia; Hodgkin’s disease; polycythemia vera; ulcerative colitis; nephrosis; chronic hypersensitivity states
Decreased levels: None, as the normal value is 0–1%

Eosinophils
Increased levels: Sickle cell disease; asthma; chorea; hypersensitivity reactions; parasitic infestations; autoimmune diseases; Addison’s disease; malignancies; sarcoidosis; chronic inflammatory diseases and dermatoses; leprosy; Hodgkin’s disease; polycythemias; ulcerative colitis; autoallergies; pernicious anemia; splenectomy
Decreased levels: Disseminated lupus erythematosus; acromegaly; elevated steroid levels; stress; infectious mononucleosis; hypersplenism; Cushing’s syndrome; congestive heart failure; hyperplastic anemia; hormones (ACTH, thyroxine, epinephrine)

Monocytes
Increased levels: Infections (bacterial, viral, mycotic, rickettsial, amebic); cirrhosis; collagen diseases; ulcerative colitis; regional enteritis; Gaucher’s disease; Hodgkin’s disease; lymphomas; carcinomas; monocytic leukemia; radiation; polycythemia vera; sarcoidosis; Weil’s disease; systemic lupus erythematosus; hemolytic anemias; thrombocytopenic purpura
Decreased levels: Not characteristic of specific disorders

Lymphocytes
Increased levels: Infections (bacterial, viral); lymphosarcoma; ulcerative colitis; Banti’s disease; Felty’s syndrome; myeloma; lymphomas; Addison’s disease; thyrotoxicosis; malnutrition; rickets; Waldenström’s macroglobulinemia; lymphocytic leukemia
Decreased levels: Immune deficiency diseases; Hodgkin’s disease; rheumatic fever; aplastic anemia; bone marrow failure; Gaucher’s disease; hemolytic disease of the newborn; hypersplenism; thrombocytopenic purpura; transfusion reaction; massive transfusions; pernicious anemia; septicemia; pneumonia; burns; radiation; toxic chemicals (benzene, bismuth, DDT); antineoplastic agents; adrenal corticosteroids (high doses)

 

 
