Medicine

LABORATORY MEDICINE: HISTORY, MODERN ASPECTS


Clinical laboratory diagnostics (laboratory medicine) is the medical discipline devoted to obtaining, exploring and applying knowledge about techniques for analysing the composition of body fluids and the properties of cells and tissues, and to interpreting the results in relation to health and disease.

It should be stressed that laboratory diagnostics, or laboratory medicine, is both a clinical discipline and a separate medical science. These two aspects are tightly bound, as in other clinical sciences. Laboratory tests are used at various stages of the diagnostic process in all fields of clinical medicine and, together with imaging, electrophysiological and other procedures, they are a main source of information on the health status of the patient. It is estimated that laboratory results underlie 60-70% of medical decisions. In addition to routine diagnostics in symptomatic patients, laboratory tests are used for screening, treatment monitoring and medical jurisprudence. Thus, laboratory diagnostics, while generating around 10% of all healthcare costs, is crucial to healthcare decision making, contributing to improved outcomes and cost savings.

 

A clinical laboratory (also called a medical laboratory) is a facility that provides controlled conditions in which tests are done on clinical specimens in order to acquire information about the health of an individual (or patient) for the purposes of diagnosis, treatment and prevention of disease, or for medical research.

The clinical laboratory consists of four major divisions (or departments).

1. Medical Microbiology and Parasitology Laboratory: This laboratory deals with the study of human pathogens. Pathogens are biological agents that cause disease in their hosts. They include microorganisms (bacteria, viruses and fungi) and parasites (e.g. intestinal worms, lice and malaria parasites) of medical importance. In larger health centres or research institutes, the medical microbiology and parasitology laboratory is usually split into sub-units such as bacteriology, parasitology and virology laboratories.

2. Haematology Laboratory: This laboratory performs tests on blood for the diagnosis of blood diseases (e.g. anaemia, haemoglobinopathies, leukaemia) and provides blood transfusion services, e.g. blood grouping and cross-matching.

3. Clinical Biochemistry (or Clinical Chemistry) Laboratory: This division performs quantitative and qualitative tests on clinical specimens to investigate the state of various body chemistries. Such specimens include body fluids (e.g. whole blood, plasma, serum, urine, sweat, cerebrospinal fluid) and occasionally faeces, tissue, hair, etc.

4. Histopathology Laboratory: This is the laboratory where tissues (or cells) are processed for microscopic examination in order to study the effects of disease on tissue structure for diagnostic purposes, e.g. cancer diagnosis. Tissue samples are processed onto glass slides so that the effects of disease on the histological architecture of the tissue can be examined microscopically and diagnostic inferences made.

 

Universal Precautionary Measure in Clinical Laboratory

Garner (1997) defined universal basic precautions as the prevention of transmission of blood-borne pathogens through strict adherence to rules concerning care and nursing. Gerberding et al. (1995) defined universal precautions as the routine use of appropriate barriers and techniques to reduce the likelihood of exposure to blood, other body fluids and tissues that may contain blood-borne pathogens.

Universal basic precautions assume that all clinical specimens contain infectious agents and should therefore be handled as such. This approach eliminates the need to single out patients or specimens known to be infected with Human Immunodeficiency Virus (HIV) or other blood-borne pathogens.

The following are the universal laboratory safety precautions.

1. Universal precautions apply to blood and to all body fluids containing visible blood, as well as to semen, vaginal secretions, tissues, cerebrospinal fluid, peritoneal fluid, pericardial fluid, synovial fluid and amniotic fluid.

2. Laboratory workers should use protective barriers appropriate for the laboratory procedure and for the type and extent of exposure expected. All persons processing blood should wear gloves and laboratory coats, and these should be removed before leaving the laboratory. Biological safety barriers should be used wherever necessary.

3. Hands should be washed immediately when contaminated with blood or other body fluids, after removing gloves and after completing laboratory activities.

4. The use of needles and syringes should be minimised; they should be used only when there is no alternative. Needles should not be recapped, bent or broken by hand. After use, needles and other sharp instruments should be placed in a ‘sharpsafe’ puncture-resistant container for disposal.

5. Specimens of blood should be placed in strong, leak-proof containers during transport.

6. Mouth pipetting must not be performed in the laboratory. Mechanical devices should be used.

7. Contaminated materials used in the laboratory should be decontaminated appropriately before reprocessing or disposal.

8. Laboratory work surfaces should be cleaned and decontaminated with an appropriate disinfectant after a blood or body fluid spill and at the end of the day’s work.

 

History of laboratory medicine

Three distinct periods in the history of medicine are associated with three different places and, therefore, different methods of determining diagnosis: from the Middle Ages to the 18th century, bedside medicine was prevalent; then between 1794 and 1848 came hospital medicine; and from that time forward, laboratory medicine has served as medicine's lodestar. The laboratory's contribution to modern medicine has only recently been recognized by historians as something more than the addition of another resource to medical science, and it is now being appreciated as the seat of medicine, where clinicians account for what they observe in their patients.

The first medical diagnoses made by humans were based on what ancient physicians could observe with their eyes and ears, which sometimes also included the examination of human specimens.

The ancient Greeks attributed all disease to disorders of bodily fluids called humors, and during the late medieval period, doctors routinely performed uroscopy. Later, the microscope revealed not only the cellular structure of human tissue, but also the organisms that cause disease. More sophisticated diagnostic tools and techniques, such as the thermometer for measuring temperature and the stethoscope for measuring heart rate, were not in widespread use until the end of the 19th century. The clinical laboratory would not become a standard fixture of medicine until the beginning of the 20th century. This section reviews the history and development of diagnostic methods from ancient to modern times, as well as the evolution of the clinical laboratory from the late 19th century to the present.

Ancient diagnostic methods

In ancient Egypt and Mesopotamia, the earliest physicians made diagnoses and recommended treatments based primarily on observation of clinical symptoms. Palpation and auscultation were also used. Physicians were able to describe dysfunctions of the digestive tract, heart and circulation, the liver and spleen, and menstrual disturbances; unfortunately, this empiric medicine was reserved for royalty and the wealthy. Other, less-than-scientific methods of diagnosis used in treating the middle and lower classes included divination through ritual sacrifice to predict the outcome of illness. Usually a sheep would be killed before the statue of a god. Its liver was examined for malformations or peculiarities; the shape of the lobes and the orientation of the common duct were then used to predict the fate of the patient.

Ancient physicians also began the practice of examining patient specimens. The oldest known test on body fluids was done on urine in ancient times (before 400 BC). Urine was poured on the ground and observed to see whether it attracted insects; if it did, patients were diagnosed with boils. The ancient Greeks also saw the value of examining body fluids to predict disease. At around 400 BC, Hippocrates promoted the use of the mind and senses as diagnostic tools, a principle that played a large part in his reputation as the “Father of Medicine.” The central Hippocratic doctrine of humoral pathology attributed all disease to disorders of fluids of the body. To obtain a clear picture of disease, Hippocrates advocated a diagnostic protocol that included tasting the patient’s urine, listening to the lungs, and observing skin color and other outward appearances. Beyond that, the physician was to “understand the patient as an individual.” Hippocrates related the appearance of bubbles on the surface of urine specimens to kidney disease and chronic illness. He also related certain urine sediments and blood and pus in urine to disease. The first description of hematuria, the presence of blood in urine, by Rufus of Ephesus surfaced at around AD 50 and was attributed to the failure of the kidneys to function properly in filtering the blood. Later (c. AD 180), Galen (AD 131-201), who is recognized as the founder of experimental physiology, created a system of pathology that combined Hippocrates’ humoral theories with the Pythagorean theory, which held that the four elements (earth, air, fire and water) corresponded to various combinations of the physiologic qualities of dry, cold, hot and moist. These combinations of physiologic characteristics corresponded roughly to the four humors of the human body: hot + moist = blood; hot + dry = yellow bile; cold + moist = phlegm; and cold + dry = black bile.

Galen was known for explaining everything in light of his theory and for having an explanation for everything.

He also described diabetes as “diarrhea of urine” and noted the normal relationship between fluid intake and urine volume. His unwavering belief in his own infallibility appealed to complacency and reverence for authority. That dogmatism essentially brought innovation and discovery in European medicine to a standstill for nearly 14 centuries. Anything relating to anatomy, physiology and disease was simply referred back to Galen as the final authority from whom there could be no appeal.

Middle Ages

In medieval Europe, early Christians believed that disease was either punishment for sin or the result of witchcraft or possession. Diagnosis was superfluous. The basic therapy was prayer, penitence and invocation of saints. Lay medicine based diagnosis on symptoms, examination, pulse, palpation, percussion, and inspection of excreta and sometimes semen. Diagnosis by “water casting” (uroscopy) was practiced, and the urine flask became the emblem of medieval medicine. By AD 900, Isaac Judaeus, a Jewish physician and philosopher, had devised guidelines for the use of urine as a diagnostic aid; and under the Jerusalem Code of 1090, failure to examine the urine exposed a physician to public beatings. Patients carried their urine to physicians in decorative flasks cradled in wicker baskets and, because urine could be shipped, diagnosis at long distance was common. The first book detailing the color, density, quality and sediment found in urine was written around this time as well. By around AD 1300, uroscopy had become so widespread that it was at the point of near universality in European medicine.

Medieval medicine also included interpretation of dreams in its diagnostic repertoire. Repeated dreams of floods indicated “an excess of humors that required evacuation,” and dreams of flight signified “excessive evaporation of humors.”

Seventeenth century

The medical advances of the 17th century consisted mostly of descriptive works of bodily structure and function that laid the groundwork for diagnostic and therapeutic discoveries that followed. The status of medicine was helped along by the introduction of the scientific society in Italy and by the advent of periodical literature. Considered the most momentous event in medical history since Galen’s time, the discovery of the circulation of blood by William Harvey (1578–1657) marked the beginning of a period of mechanical explanations for a variety of functions and processes, including digestion, metabolism, respiration and pregnancy. The English scientist proved through vivisection, ligation and perfusion that the heart acts as a muscular pump propelling the blood throughout the body in a continuous cycle.

The invention of the microscope opened the door to the invisible world just as Galileo’s telescope had revealed a vast astronomy. The earliest microscopist was a Jesuit priest, Athanasius Kircher (1602–1680) of Fulda (Germany), who was probably the first to use the microscope to investigate the causes of disease. His experiments showed how maggots and other living creatures developed in decaying matter.

Kircher’s writings included an observation that the blood of patients with the plague contained “worms”; however, what he thought to be organisms were probably pus cells and red blood corpuscles, because he could not have observed Bacillus pestis with a 32-power microscope. Robert Hooke (1635-1703) later used the microscope to document the existence of “little boxes,” or cells, in vegetables and inspired the work of later histologists; but some of the greatest contributions to medical science came from the Italian microscopist Marcello Malpighi (1628-1694). Malpighi, who is described as the founder of histology, served as physician to Pope Innocent XII and was famous for his investigations of the embryology of the chick and the histology and physiology of the glands and viscera. His work in embryology describes the minutiae of the aortic arches, the head fold, the neural groove, and the cerebral and optic vesicles.

Uroscopy was still in widespread use and had gained popularity as a method of diagnosing “chlorosis” in love-sick young women, and sometimes as a test for chastity. Other methods of urinalysis also had their roots in the 17th century.

The gravimetric analysis of urine was introduced by the Belgian mystic Jean Baptiste van Helmont (1577-1644). Van Helmont weighed a number of 24-hour specimens, but was unable to draw any valuable conclusions from his measurements. It was not until the late 17th century, when Frederik Dekkers of Leiden, Netherlands, observed in 1694 that urine containing protein would form a precipitate when boiled with acetic acid, that urinalysis became more scientific and more valuable. The best qualitative analysis of urine at the time was pioneered by Thomas Willis (1621-1675), an English physician and proponent of chemistry. He was the first to notice the characteristic sweet taste of diabetic urine, which established the principle for the differential diagnosis of diabetes mellitus and diabetes insipidus.

Experiments with blood transfusion were also getting underway with the help of a physiologist in Cornwall, England, named Richard Lower (1631–1691). Lower was the first to perform direct transfusion of blood from one animal to another. Other medical innovations of the time included the intravenous injection of drugs, transfusion of blood, and the first attempts to use pulse rate and temperature as indicators of health status.

18th century

The 18th century is regarded as the “Golden Age” of the successful practitioner, as well as of the successful quack. Use of phrenology (the study of the shape of the skull to predict mental faculties and character), magnets, and various powders and potions for treatment of illness were a few of the more popular scams. The advancement of medicine during this time was more theoretical than practical. Internal medicine was improved by new textbooks that cataloged and described many new forms of disease, as well as by the introduction of new drugs, such as digitalis and opium. The state of hospitals in the 18th century, however, was alarming by today’s standards. Recovery from surgical operations was rare because of septicemia. The concept of antisepsis had not yet been discovered, and hospitals were notorious for filth and disease well into the 19th century. One notable event that was a forerunner of the modern practice of laboratory measurement of prothrombin time, plasma thromboplastin time and other coagulation tests was the discovery of the cause of coagulation. An English physiologist, William Hewson (1739-1774) of Hexham, Northumberland, England, showed that when the coagulation of the blood is delayed, a coagulable plasma can be separated from the corpuscles and skimmed off the surface. Hewson found that plasma contains an insoluble substance that can be precipitated and removed from plasma at a temperature slightly higher than 50°C. Hewson deduced that coagulation was the formation in the plasma of a substance he called “coagulable lymph,” which is now known as fibrinogen. A later discovery that fibrinogen is a plasma protein and that in coagulation it is converted into fibrin attests to the importance of Hewson’s work.

The clinical diagnostic methods of percussion, temperature, heart rate and blood pressure measurement were further refined, and there were some remarkable attempts to employ precision instruments in diagnosis. Leopold Auenbrugger (1722-1809) was the first to use percussion of the chest in diagnosis, in Vienna in 1754. This method involved striking the patient’s chest while the patient holds his or her breath. Auenbrugger proposed that the chest of a healthy person sounds like a cloth-covered drum. A student of Auenbrugger’s, Jean Nicolas Corvisart, a French physician at La Charité in Paris, pioneered the accurate diagnosis of heart and lung diseases using Auenbrugger’s chest-thumping technique. Corvisart’s translation of Auenbrugger’s treatise on percussion, “New Invention to Detect by Percussion Hidden Diseases in the Chest,” popularized the practice of thumping on a patient’s chest. The resulting sounds are different when the lungs contain lesions or fluids than in healthy people. This observation was validated by postmortem examination.

James Currie (1756–1805), a Scot, was the first to use cold baths in treatment of typhoid fever; and by monitoring the patient’s temperature using a thermometer, he was able to adjust the temperature and frequency of the baths to treat individual patients. It took another hundred years, however, before thermometry became a recognized feature in clinical diagnosis.

Additional advances in urinalysis occurred with J.W. Tichy’s observations of sediments in the urine of febrile patients (1774); Matthew Dobson’s proof that the sweetness of the urine and blood serum in diabetes is caused by sugar (1776); and the development of the yeast test for sugar in diabetic urine by Francis Home (1780).

19th century

1893 T. W. Richards invents the nephelometer; Hermann M. Biggs establishes a diagnostic laboratory in New York City.

1895 Franz Ziehl and Friedrich Neelsen introduce their modification of the acid-fast stain for tuberculosis; William Roentgen discovers X-rays; the William Pepper Laboratory is established at the Pennsylvania General Hospital.

1896 S. Riva-Rocci invents the sphygmomanometer; C. W. Purdy publishes Practical Urinalysis and Urinary Diagnosis; Ferdinand Widal develops the agglutination test for identification of the typhoid bacillus; in Great Britain, clinical laboratories exist in Edinburgh, Leeds, Glasgow and London by this date.

1897 The first commercial clinical laboratory established in England, The Clinical Research Association, receives specimens by mail.

20th century

1899 American Society for Microbiology is founded.

1900 F.G. Hopkins discovers tryptophan; Otto Folin becomes the first full-time clinical biochemist (in its modern sense) in the U.S.

1902 The DuBoscq visual colorimeter is first introduced into clinical laboratories.

1903 Ayer Clinical Laboratory is established at Pennsylvania Hospital, designed by Simon Flexner for work with patients.

1904 Christian Bohr discovers the reciprocal relationship between pH and the oxygen content of hemoglobin (the Bohr effect); M. Beijerinck obtains the first pure culture of the sulfur-oxidizing bacterium Thiobacillus thioparus; the first ultraviolet lamps and the first practical photoelectric cell are invented.

1905 H.J. Bechtold discovers immunodiffusion.

1906 American Hospital Association is formed from the Association of Hospital Superintendents of the U.S. and Canada.

1908 Todd and Sanford publish the first edition of Clinical Diagnosis by Laboratory Methods. Venipuncture is in widespread use by 1920.

1911 Oskar Heimstadt invents the fluorescence microscope.

1912 American College of Surgeons is chartered in Illinois.

1913 D.D. van Slyke is appointed chemist at Rockefeller Hospital Laboratory; American Association of Immunologists is founded.

1916 K.M.G. Siegbahn develops X-ray spectroscopy. P.A. Kohler develops the colorimeter–nephelometer.

1918 N. Wales and E.J. Copeland develop the electric refrigerator (Kelvinator).

1919 F.W. Aston develops the mass spectrograph.

1920 First clinical laboratory method for serum phosphorus is established; the use of venipuncture for diagnostic testing becomes widespread; Victor Meyers establishes the University of Iowa center for training clinical chemists, primarily for hospital positions; Conference of Public Health Laboratories is founded.

1921 First clinical laboratory method for serum magnesium is introduced; the Denver Society of Clinical Pathologists, precursor of the American Society of Clinical Pathologists, is founded in Denver, CO.

1922 ASCP is founded in St. Louis, MO.

1925 American Type Culture Collection is founded.

1926 Arne Tiselius develops moving boundary electrophoresis of proteins; Theodor Svedberg determines the molecular weight of hemoglobin by ultracentrifugation; ASCP appoints a “Committee on the Registration of Laboratory Technicians” to define and classify medical technicians.

1928 G.N. Papanicolaou first reports the ability to recognize cancer in vaginal smears, thus beginning clinical cytology; F.A. Paneth founds radiochemistry.

1929 Otto Folin introduces the use of the light filter in colorimetry; R. Fahraeus develops the erythrocyte sedimentation rate as an index of severity of disease; M. Knoll and E. Ruska invent the electron microscope; ASCP establishes its Board of Registry for certifying medical technologists; the Mayo Clinic has 21 laboratories by this date.

1930 Kay develops the first clinical laboratory method for alkaline phosphatase, thus beginning clinical enzymology; refractometry is first used in clinical labs for the determination of protein in urine; ASCP issues its first medical technologist certification to P.H. Adams of Ft. Wayne, IN; Beckman Instruments is founded.

1932 Cherry and Crandall develop the clinical laboratory method for serum lipase activity; American Society of Clinical Laboratory Technicians, precursor of the American Society for Medical Technology, is founded.

1934 Commercial development of the electron microscope takes place.

1935 Beckman Instruments Co. introduces the first pH meter; the ASCP Board of Registry first requires a college degree for medical technologist certification.

1937 The first hospital-based blood bank is established at Cook County Hospital, Chicago, IL; ASCP and its Board of Registry officially oppose state licensure of medical technologists.

1938 Somogyi develops 2 major clinical laboratory methods for serum and urine amylase activity; Gutman develops the first assay for acid phosphatase.

1939 Conway and Cook develop the first clinical laboratory method for blood ammonia; American Medical Technologists is founded.

1940 Visual colorimeters begin to be replaced by photoelectric colorimeters in clinical labs; RCA demonstrates the first commercial electron microscope.

1941 G.N. Papanicolaou and H.F. Traut prove the diagnostic usefulness of vaginal smears in cervical cancer; A.J.P. Martin and R.L.M. Synge separate amino acids and peptides by chromatography.

1943 Penicillin is successfully used in therapy.

1944 William Sunderman applies refractometry of proteins in the clinical lab.

1945 S. Borgstrom develops the whole blood clotting time test; itemized charges for hospital services are begun.

1946 The Vacutainer evacuated serum collection tube is introduced by Becton Dickinson Co.; Arne Tiselius separates proteins by chromatography; College of American Pathologists is founded.

1947 Edwin Land develops the Polaroid camera; American Association of Blood Banks is founded.

1948 American Association of Clinical Chemistry is founded.

1950 R.S. Yalow and S. Berson develop radioimmunoassay; Levey and Jennings adapt the Shewhart QC chart for use in clinical laboratories; the Histochemical Society is founded.

1952 M.D. Poulik invents immunoelectrophoresis.

1954 Kuby develops the clinical lab method for serum creatine phosphokinase activity; A. Walsh develops the atomic absorption spectrometer.

1955 Wroblewski and LaDue develop the clinical laboratory method for serum lactate dehydrogenase; Karmen develops the clinical laboratory method for aspartate aminotransferase; Leonard Skeggs develops the concept of “continuous flow dialysis” in connection with the treatment of renal disease; Severo Ochoa synthesizes RNA.

1956 Wroblewski and LaDue develop the method for serum alanine aminotransferase activity, then called “serum glutamic-pyruvic transaminase,” and recognize its greater specificity for liver disease compared with that of aspartate aminotransferase; J. Edwards proposes prenatal screening for genetic disease.

1957 Van Handel and Zilversmit develop a direct chemical method for the determination of triglycerides.

1959 The first clinical laboratory chemical analyzer, the single-channel “Auto-Analyzer,” is introduced by Technicon Corp.; Technicon first applies flame photometry to automated methods.

1960 Methods for serum creatine phosphokinase isoenzymes are developed; the first method for gamma-glutamyl transferase in serum is developed; Perkin-Elmer Corp. introduces atomic absorption spectrometry for the determination of calcium and magnesium; the laser is developed; Feichtmeier invents the mechanical pipettor (Auto Dilator).

1961 Becton Dickinson Co. introduces disposable hypodermic syringe and needle.

1962 Siegelman develops a method for glutamic dehydrogenase; IBM introduces disk storage for computers; the International Society for Clinical Laboratory Technology is founded.

1965 The scanning electron microscope is developed; the U.S. enacts Medicare and Medicaid (Titles 18 and 19 of the Social Security Amendments).

1966 Medicare/Medicaid officially goes into effect.

1967 G.I. Abelev shows that alpha-fetoprotein is elevated in the serum of patients with testicular teratocarcinoma; MetPath Laboratories is founded; the U.S. enacts the Clinical Laboratory Improvement Act (CLIA ’67).

1968 The first random-access analyzer is introduced by DuPont (the ACA); the 1% Medicare allowance for unidentified costs is reduced to zero; Canada enacts the Federal Medical Care Act, creating a single-payer system.

 

The use of laboratory tests:

Laboratory investigations are involved in every branch of clinical medicine.

The results of laboratory tests may be of use in:

1. Diagnosis and the monitoring of treatment.

2. Screening for disease or assessing the prognosis.

3. Research into the biochemical basis of disease.

4. Clinical trials of new drugs.

Laboratory investigations hold the key for the diagnosis and prognosis of diabetes mellitus, jaundice, myocardial infarction, gout, pancreatitis, rickets, cancers, acid-base imbalance etc. Successful medical practice is unimaginable without the service of clinical laboratory.

In general, laboratory tests can be broadly divided into two groups:

1. Discretionary or selective requesting: tests are carried out on the basis of an individual patient's clinical situation. The case for discretionary requesting has been put admirably as a series of questions (Asher, 1954):

1. Why do I request this test?

2. What will I look for in the result?

3. If I find what I am looking for, will it affect my diagnosis?

4. How will this investigation affect my management of the patient?

5. Will this investigation ultimately benefit the patient?

2. Screening: in contrast, screening tests are used to search for disease without there necessarily being any clinical indication that disease is present.

The situations in which discretionary test requests are undertaken are listed in the table below.

Test selection for the purposes of discretionary testing (category: example)

To confirm a diagnosis: plasma [free T4] and [TSH] (thyroid-stimulating hormone) in suspected hyperthyroidism
To aid differential diagnosis: to distinguish between different forms of jaundice
To refine a diagnosis: use of ACTH to localize the cause of Cushing's syndrome
To assess the severity of disease: plasma [creatinine] or [urea] in renal disease
To monitor progress: plasma [glucose] to follow the progress of patients with diabetes mellitus
To detect complications or side effects: ALT measurements in patients treated with hepatotoxic drugs
To monitor therapy: plasma drug concentrations in patients treated with antiepileptic drugs

 

Screening may take two forms: 1. Well-population screening in which typically a spectrum of tests is carried out on individuals from an apparently healthy population in an attempt to detect presymptomatic or early disease. The value of well-population screening has been called into question and certainly should only be initiated under certain specific circumstances which are listed in Table 1.3.

Table 1.3 Requirements for well-population screening

The disease is common or life-threatening

The tests are sensitive and specific

The tests are readily applied and acceptable to the population to be screened

Clinical, laboratory and other facilities are available for follow-up

Economics of screening have been clarified and the implications accepted

2. Case-finding screening programmes perform appropriate tests on a population sample known to be at high risk of a particular disease.

These are inherently more selective and yield a higher proportion of useful results (Table 1.4).

Table 1.4 Examples of tests used in case-finding programmes.

 

Programmes to detect disease (with the corresponding chemical investigations):

Neonates:
PKU (phenylketonuria): serum [phenylalanine]
Hypothyroidism: serum [TSH] and/or [thyroxine]

Adolescents and young adults:
Substance abuse: drug screen

Pregnancy:
Diabetes mellitus in the mother: plasma and urine [glucose]
Open neural tube defect (NTD) in the foetus: maternal serum [α-fetoprotein]

Industry:
Industrial exposure to lead: blood [lead]
Industrial exposure to pesticides: plasma cholinesterase activity

Malnutrition: plasma [albumin] and/or [pre-albumin]
Thyroid dysfunction: plasma [TSH] and/or [thyroxine]

 

ADVANTAGES OF SCREENING

First, an uncommon or unexpected disease may be found and treated. Second, the early requesting of a battery of tests might be expected to expedite management of the patient; most studies, however, have not shown this to be so.

Advantages of screening in identifying unexpected test results (disease: unexpected abnormal test results)

Hyperparathyroidism: raised plasma calcium
Hypothyroidism: raised plasma TSH and/or a low T4
Diabetes mellitus: high random plasma glucose
Renal tract disease: raised plasma creatinine or urea
Liver disease: increased plasma ALT, AST

DISADVANTAGES OF SCREENING

It is easy to miss significant abnormalities in the 'flood' of data coming from the laboratory, even when the abnormalities are 'flagged' in some way. Most of the abnormalities detected will be of little or no significance, yet may need additional time-consuming and often expensive tests to clarify their importance (or lack of it).

In other instances, to simplify requesting, a wide range of tests is routinely requested on all patients in a particular category, for example, admission screening of all those admitted through the Accident and Emergency (A&E) Department. Mention should also be made of batteries of tests that are generally requested on a discretionary basis but where the test group collectively provides information about an organ system (e.g. tests for liver disease) or a physiological state (e.g. water and electrolyte status). Many laboratories analyse and report these functional or organ-related groups. For example, a 'liver function test' group might consist of plasma bilirubin, alanine aminotransferase (ALT), alkaline phosphatase (ALP), γ-glutamyltransferase (GGT) and albumin measurements.
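To make the idea of an organ-related test group concrete, here is a minimal sketch in Python (illustrative only: the panel contents follow the liver function example above, but the reference intervals, units and function names are assumptions, not laboratory standards). It shows how a panel could be defined and how each result might be flagged against its reference interval.

# Illustrative sketch only: reference intervals and units are assumed values
# for demonstration, not authoritative clinical limits.

LIVER_FUNCTION_PANEL = {
    # analyte: (lower limit, upper limit, unit) - hypothetical adult intervals
    "bilirubin":            (3, 21, "umol/L"),
    "ALT":                  (5, 40, "U/L"),
    "alkaline phosphatase": (30, 130, "U/L"),
    "GGT":                  (5, 55, "U/L"),
    "albumin":              (35, 50, "g/L"),
}

def flag_results(panel, results):
    """Mark each result as LOW, HIGH or within its reference interval."""
    report = {}
    for analyte, (low, high, unit) in panel.items():
        value = results.get(analyte)
        if value is None:
            report[analyte] = "not measured"
        elif value < low:
            report[analyte] = f"{value} {unit} (LOW, reference {low}-{high})"
        elif value > high:
            report[analyte] = f"{value} {unit} (HIGH, reference {low}-{high})"
        else:
            report[analyte] = f"{value} {unit}"
    return report

# Example: a hypothetical set of results reported as one panel
sample = {"bilirubin": 58, "ALT": 65, "alkaline phosphatase": 410,
          "GGT": 220, "albumin": 38}
for analyte, line in flag_results(LIVER_FUNCTION_PANEL, sample).items():
    print(f"{analyte}: {line}")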

Clinical biochemical tests comprise over ⅓ of all hospital laboratory investigations.

Core biochemistry: Most biochemistry laboratories provide the "core analyses", commonly requested tests which are of value in many patients, on a frequent basis.

Core biochemical tests:

1. Sodium, potassium, chloride and bicarbonate

2. Urea and creatinine

3. Calcium and phosphate

4. Total protein and albumin

5. Bilirubin and alkaline phosphatase

6. Alanine aminotransferase (ALT) and Aspartate aminotransferase (AST)

7. Glucose

8. Amylase

Specialized tests:

Not every laboratory is equipped to carry out all possible biochemistry requests.

Large departments may act as reference centres where less commonly requested tests are performed.

Specialized tests:

1. Hormones

2. Specific proteins

3. Trace elements

4. Vitamins

5. Drugs

6. Lipids and lipoproteins

7. DNA analyses

The emergency lab:

All clinical biochemistry laboratories provide facilities for urgent tests. An urgent test is designated as one on which the clinician is likely to take immediate action. The main reason for asking for an analysis to be performed on an urgent basis is that immediate treatment depends on the result.

Emergency tests:

1. Urea and electrolytes

2. Blood gases

3. Amylase

4. Glucose

5. Salicylate

6. Paracetamol

7. Calcium

Specimen collection:

The biological fluids employed in the clinical biochemistry laboratory include blood, urine, saliva, sputum, faeces, tissue and cells, cerebrospinal fluid, peritoneal fluid, synovial fluid, pleural fluid, stones.

Among these, blood (directly or in the form of plasma or serum) is frequently used for the investigations in the clinical biochemistry laboratory.

Identification of patients and specimens

The correct patient must be appropriately identified on the specimen and request form, as follows:

1. Patient identification data (PID). This usually comprises name plus unique number.

2. Test request information. This includes relevant clinical details (including any risk of infection hazard), the tests to be performed and where the report is to be sent.

3. Collection of specimens. Specimens must be collected in the correct tube with the appropriate preservative.

4. Matching of specimens to requests. Each specimen must be easily and unequivocally matched to the corresponding request for investigations.
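As a minimal illustration of point 4, the sketch below (Python; the field names, normalisation rule and matching criteria are assumptions for illustration, not a description of any real laboratory information system) checks that the patient identification data on a specimen label agrees with that on the request form before the specimen is accepted.

# Hypothetical sketch: field names and matching rule are illustrative assumptions.

from dataclasses import dataclass

@dataclass
class PatientID:
    name: str
    unique_number: str   # e.g. a hospital or laboratory number

def normalise(text: str) -> str:
    """Ignore case and extra whitespace when comparing names."""
    return " ".join(text.strip().lower().split())

def specimen_matches_request(specimen: PatientID, request: PatientID) -> bool:
    """Accept a specimen only if both the name and the unique number agree."""
    return (normalise(specimen.name) == normalise(request.name)
            and specimen.unique_number == request.unique_number)

specimen_label = PatientID(name="Jane Doe", unique_number="H123456")
request_form = PatientID(name="Jane  Doe", unique_number="H123456")

if specimen_matches_request(specimen_label, request_form):
    print("Specimen accepted for analysis")
else:
    print("Mismatch: reject the specimen and contact the requesting clinician")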

 

 

Table 1.1 Some commoner causes of errors arising from use of the laboratory (error: consequence).

Crossover of addressograph labels between patients: This can lead to two patients each having the other's set of results. Where a patient is assigned a completely wrong set of results, it is important to investigate the problem in case there is a second patient with a correspondingly wrong set of results.

Timing error: There are many examples where timing is important but not considered. Sending in a blood sample too early after the administration of a drug can lead to misleadingly high values in therapeutic monitoring. Interpretation of some tests (e.g. cortisol) is critically dependent on the time of day when the blood was sampled.

Sample collection tube error: For some tests the nature of the collection tube is critical, which is why the Biochemistry Laboratory specifies this detail. For example, using a plasma tube with lithium-heparin as the anticoagulant invalidates the sample for measurement of a therapeutic lithium level! Serum electrophoresis requires a serum sample; otherwise, fibrinogen interferes with the detection of any monoclonal bands. Topping up a biochemistry tube from a haematology (potassium-ethylenediaminetetraacetic acid, EDTA) sample will lead to high potassium and low calcium values in the biochemistry sample.

Sample taken close to the site of an intravenous (IV) infusion: The blood sample will be diluted, so that all the test results will be correspondingly low, with the exception of those tests affected by the composition of the infusion fluid itself. For example, with normal saline as the infusion fluid, most test results would be lowered, but the sodium and chloride results are likely to be raised (a worked example follows this table).

Analytical error: Although comparatively rare, these errors do inevitably happen from time to time, and any result which is unexpected should lead the requesting clinician to discuss the matter further with the Laboratory. Transcription errors within the Laboratory are increasingly uncommon because results are downloaded electronically to the Laboratory computer, which is the source of the printout or of the results on the VDU. Most errors generated within the Laboratory occur at Reception as a result of mislabelling of samples.
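The dilution pattern described for samples drawn near an intravenous infusion can be illustrated with a simple mixing calculation. The sketch below (Python; the 'true' plasma values and the 20% contamination fraction are hypothetical) assumes the measured concentration is a volume-weighted average of the patient's plasma and the infusion fluid, here taken as normal saline (approximately 154 mmol/L sodium and chloride, containing no potassium, urea or glucose).

# Hypothetical worked example; all plasma values and the contamination
# fraction are assumptions chosen for illustration.

def measured(true_value, infusate_value, contamination_fraction):
    """Volume-weighted mixture of true plasma and infusion fluid."""
    f = contamination_fraction
    return (1 - f) * true_value + f * infusate_value

true_plasma = {"sodium": 140, "chloride": 100, "potassium": 4.0,
               "urea": 5.0, "glucose": 5.0}      # mmol/L
normal_saline = {"sodium": 154, "chloride": 154, "potassium": 0.0,
                 "urea": 0.0, "glucose": 0.0}    # mmol/L

fraction = 0.20   # assume 20% of the sample volume is infusion fluid

for analyte, true_value in true_plasma.items():
    value = measured(true_value, normal_saline[analyte], fraction)
    print(f"{analyte}: true {true_value} -> measured {value:.1f} mmol/L")

# Sodium and chloride are pulled up towards 154 mmol/L while potassium,
# urea and glucose are diluted, mirroring the pattern described in the table.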

 

 

Collection of blood:

Venous blood is most commonly used for a majority of biochemical investigations. It can be drawn from any prominent vein (usually from a vein on the front of the elbow).

Capillary blood (<0.2 ml) obtained from a finger or thumb, is less frequently employed.

Arterial blood (usually drawn under local anesthesia) is used for blood gas determinations.

 

Precautions for blood collection: The use of sterile (preferably disposable) needles and syringes, cleaning of the patient's skin, and collection of blood into clean, dry vials/tubes are some of the important precautions.

 

Biochemical investigations can be performed on 4 types of blood specimens – whole blood, plasma, serum and red blood cells. The selection of the specimen depends on the parameter to be estimated.

1. Whole blood (usually mixed with an anticoagulant) is used for the estimation of hemoglobin, carboxyhemoglobin, pH, glucose, urea, non-protein nitrogen, pyruvate, lactate, ammonia, etc. (Note: in recent years, plasma has been preferred for glucose determination.)

2. Plasma, obtained by centrifuging whole blood collected with an anticoagulant, is employed for parameters such as fibrinogen, glucose, bicarbonate, chloride, ascorbic acid, etc.

3. Serum is the supernatant fluid collected after centrifuging clotted blood. It is the most frequently used specimen in the clinical biochemistry laboratory. The parameters estimated in serum include proteins (albumin/globulins), creatinine, bilirubin, cholesterol, uric acid, electrolytes (Na+, K+, Cl-), enzymes (ALT, AST, LDH, CK, ALP, ACP, amylase, lipase) and vitamins.

4. Red blood cells are employed for the determination of abnormal hemoglobins, glucose 6-phosphate dehydrogenase, pyruvate kinase etc.

Collection and preservation of blood specimens

Lack of thought before collecting specimens or carelessness in collection may adversely affect the interpretation or impair the validity of the tests carried out on the specimens. Some factors to consider include the following:

1. Diet. Dietary constituents may alter the concentrations of analytes in blood significantly (e.g. plasma [glucose] and [triglyceride] are affected by carbohydrate- and fat-containing meals, respectively).

2. Drugs. Many drugs influence the chemical composition of blood. Such effects of drug treatment, for example antiepileptic drugs, have to be taken into account when interpreting test results. Details of relevant drug treatment must be given when requesting chemical analyses, especially when toxicological investigations are to be performed.

3. Diurnal variation. The concentrations of many substances in blood vary considerably at different times of day (e.g. cortisol). Specimens for these analyses must be collected at the times specified by the laboratory, as there may be no reference ranges relating to their concentrations in blood at other times.

Care when collecting blood specimens

The posture of the patient, the choice of skin-cleansing agent and the selection of a suitable vein (or other source) are the principal factors to consider before proceeding to collect each specimen:

1. The skin must be clean over the site for collecting the blood specimen. However, it must be remembered that alcohol and methylated spirits can cause haemolysis, and their use is clearly to be avoided if blood [ethanol] is to be determined.

2.  Limbs into which intravenous infusions are being given must not be selected as the site of venepuncture unless particular care is taken. The needle or cannula must first be thoroughly flushed out with blood to avoid dilution of the specimen with infusion fluid.

3.  Venepuncture technique should be standardised as far as possible to enable closer comparison of successive results on patients.

4. Venous blood specimens should be obtained with minimal stasis. Prolonged stasis can markedly raise the concentrations of plasma proteins and other non-diffusible substances (e.g. protein-bound substances). It is advisable to release the tourniquet before withdrawing the sample of blood.

5. Posture should be standardised if possible. When a patient's posture changes from lying to standing, there may be an increase of as much as 13% in the concentration of plasma proteins or protein-bound constituents, due to redistribution of fluid in the extracellular space (see the worked example after this list).

6. Haemolysis should be avoided, since it renders specimens unsuitable for plasma K+, magnesium and many protein and enzyme activity measurements.

7. Infection hazard. High-risk specimens require special care in collection, and this danger must be clearly indicated on the request form.
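As a worked illustration of point 5 (the starting value is hypothetical): if a recumbent plasma albumin of 40 g/L rises by the 13% mentioned above on standing, the measured value becomes 40 g/L × 1.13 ≈ 45 g/L, a postural shift of about 5 g/L that could easily be mistaken for a real change between successive specimens.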

Care of blood specimens after collection

Blood specimens should be transported to the laboratory as soon as possible after collection. Special arrangements are needed for some specimens (e.g. for acid-base measurements, or for unstable hormones) because of their lack of stability. Most other analytes are stable for at least 3 h in whole blood, or longer if plasma or serum is first separated from the cells. As a rule, whole blood specimens for chemical analysis must not be stored in a refrigerator, since the ionic pumps that maintain electrolyte gradients across the cell membrane are inactive at low temperatures. Conversely, separated serum or plasma is best refrigerated, to minimize chemical changes and bacterial growth.

Several changes occur in whole blood specimens following collection. The commoner and more important changes that occur prior to the separation of plasma or serum from the cells are:

1.  Glucose is converted to lactate: this process is inhibited by fluoride;

2.  Several substances pass through the erythrocyte membrane, or may be added in significant amounts to plasma as a result of red cell destruction insufficient to cause detectable haemolysis. Examples include K+ and lactate dehydrogenase;

3. Loss of CO2 occurs, since the PCO2 of blood is much higher than that of air;

4. Plasma [phosphate] increases due to hydrolysis of organic ester phosphates in the red cells;

5. Labile plasma enzymes lose their activity.

 
