In clinical practice, doctors personally assess patients in order to diagnose, treat, and prevent disease using clinical judgment. “Diagnosis” refers both to the process of attempting to determine or identify a possible disease or disorder (in this sense also termed the (medical) “diagnostic procedure”) and to the opinion reached by that process (also termed the (medical) “diagnostic opinion”).
The doctor-patient relationship is central to the practice of healthcare and essential for the delivery of high-quality health care in the diagnosis and treatment of disease. An interaction typically begins with an examination of the patient’s medical history, or “anamnesis”: the information gained by a physician by asking specific questions, either of the patient or of other people who know the person and can give suitable information (in which case it is sometimes called “heteroanamnesis”), with the aim of obtaining information useful in formulating a diagnosis and providing medical care to the patient. The physician also consults the medical/health record/chart, the systematic documentation of a single patient’s medical history and care across time within one particular health care provider’s jurisdiction.
The second step is a medical interview and a physical/medical/clinical examination (more popularly known as a “check-up” or “medical”), the process by which a doctor investigates the body of a patient for signs of disease. Basic diagnostic medical devices (instruments, apparatuses, implants, in vitro reagents, or similar or related articles that are used to diagnose, prevent, or treat disease or other conditions, and that do not achieve their purpose through chemical action within or on the body, which would make them medicines) are typically used; e.g., the stethoscope, an acoustic medical device for auscultation, or listening to the internal sounds of an animal or human body, and the tongue depressor, a device used in medical practice to depress the tongue to allow examination of the mouth and throat.
After examining for (medical) signs (an objective indication of some medical fact or characteristic that may be detected by a physician during a physical examination, or by a clinical scientist by means of an in vitro examination of a patient) and interviewing for symptoms (a departure from normal function or feeling noticed by a patient, indicating the presence of disease or abnormality), the doctor may order medical tests, a kind of medical procedure performed to detect, diagnose, or monitor diseases, disease processes, and susceptibility, and to determine a course of treatment (e.g. blood tests, laboratory analyses performed on a blood sample usually extracted from a vein in the arm using a needle, or via finger prick); take a biopsy, a medical test commonly performed by a surgeon or an interventional radiologist involving sampling of cells or tissues for examination; or prescribe pharmaceutical drugs/medicine/medication, loosely defined as any chemical substance intended for use in the medical diagnosis, cure, treatment, or prevention of disease, or other therapies.
Differential diagnosis methods (sometimes abbreviated “DDx,” “ddx,” “DD,” “D/Dx,” or “ΔΔ”), systematic diagnostic methods used to identify the presence of an entity where multiple alternatives are possible (the process may be termed the “differential diagnostic procedure,” and the term may also refer to any of the included candidate alternatives, or “candidate conditions”), help rule out conditions based on the information provided. During the encounter, properly informing the patient of all relevant facts is an important part of the relationship and the development of trust. The medical encounter is then documented in the medical record, which is a legal document in many jurisdictions. Follow-ups may be shorter but follow the same general procedure.
The components of the medical interview and encounter are:
The physical/medical/clinical examination (more popularly known as a “check-up” or “medical”) is the process by which a doctor investigates the body of a patient for signs of disease: ‘symptoms’ are what the patient volunteers, while ‘signs’ are what the healthcare provider detects by examination.
The healthcare provider uses the senses of sight, hearing, touch, and sometimes smell (e.g., in infection; in uremia/uraemia, a term used loosely to describe the illness accompanying kidney failure (also called renal failure), in particular the nitrogenous waste products associated with the failure of this organ; and in diabetic ketoacidosis, a potentially life-threatening complication in patients with diabetes mellitus). Taste has been made redundant by the availability of modern lab tests.
Four actions are taught as the basis of physical examination: inspection, which in medicine is the thorough and unhurried visualization of the patient; palpation (feel), used as part of a physical examination in which an object is felt (usually with the hands of a healthcare practitioner) to determine its size, shape, firmness, or location; percussion (tap to determine resonance characteristics), a method of determining the underlying structure, used in clinical examinations to assess the condition of the thorax or abdomen; and auscultation (listen), the term for listening to the internal sounds of the body, usually using a stethoscope. This order may be modified depending on the main focus of the examination (e.g., a joint may be examined by simply “look, feel, move”). Having this set order is an educational tool that encourages practitioners to be systematic in their approach and to refrain from using tools such as the stethoscope (an acoustic medical device for auscultation, or listening to the internal sounds of an animal or human body) before they have fully evaluated the other modalities.
The clinical examination involves study of:
It is likely to focus on areas of interest highlighted in the medical history and may not include everything listed above.
If necessary, results may be obtained from a medical/clinical laboratory, a laboratory where tests are done on clinical specimens in order to get information about the health of a patient as pertaining to the diagnosis, treatment, and prevention of disease; and from imaging studies (“medical imaging”), the techniques and processes used to create images of the human body (or parts and function thereof) for clinical purposes (medical procedures seeking to reveal, diagnose, or examine disease) or for medical science (including the study of normal anatomy and physiology).
The medical decision-making (MDM) process involves analysis and synthesis of all the above data to come up with a list of possible diagnoses (the differential diagnoses; sometimes abbreviated “DDx,” “ddx,” “DD,” “D/Dx,” or “ΔΔ”), a systematic diagnostic method used to identify the presence of an entity where multiple alternatives are possible (the process may be termed the “differential diagnostic procedure,” and the term may also refer to any of the included candidate alternatives, or “candidate conditions”), along with an idea of what needs to be done to obtain a definitive diagnosis that would explain the patient’s problem.
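As an illustration only, the narrowing step of a differential diagnosis can be sketched in code as a filter over candidate conditions: candidates whose expected findings are known to be absent are ruled out, and the remainder are ranked by how many observed findings they explain. The condition names and findings below are entirely hypothetical toy data, not clinical knowledge.

```python
# Hypothetical sketch of differential-diagnosis (DDx) narrowing.
# Each candidate condition lists the findings expected if it is present;
# a candidate is ruled out when one of its expected findings is confirmed absent.

def narrow_ddx(candidates, present, absent):
    """Keep candidates consistent with the observed findings.

    candidates: dict mapping condition name -> set of expected findings
    present:    set of findings observed in the patient
    absent:     set of findings confirmed absent
    Returns condition names ranked by how many observed findings they explain.
    """
    scores = {}
    for condition, expected in candidates.items():
        if expected & absent:  # an expected finding was ruled out
            continue
        scores[condition] = len(expected & present)  # crude support score
    return sorted(scores, key=scores.get, reverse=True)

# Toy example (hypothetical conditions and findings)
candidates = {
    "condition_a": {"fever", "cough"},
    "condition_b": {"fever", "rash"},
    "condition_c": {"headache"},
}
print(narrow_ddx(candidates, present={"fever", "cough"}, absent={"rash"}))
# → ['condition_a', 'condition_c']  (condition_b is ruled out by the absent rash)
```

Real clinical reasoning is probabilistic and iterative rather than a simple set filter, but the rule-out-then-rank structure mirrors the process described above.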
The treatment plan may include ordering additional laboratory tests and studies, starting therapy, referral to a specialist, or watchful observation (a laboratory being a facility that provides controlled conditions in which scientific research, experiments, and measurements may be performed). Follow-up may be advised.
This process is used by primary care providers as well as specialists. It may take only a few minutes if the problem is simple and straightforward. On the other hand, it may take weeks in a patient who has been hospitalized with bizarre symptoms or multi-system problems, with involvement by several specialists.
On subsequent visits, the process may be repeated in an abbreviated manner to obtain any new history, symptoms, physical findings, and lab or imaging results, or specialist consultations.
The post-18th-century period of modernity, typically referring to a historical era roughly defined as a post-traditional or post-medieval period beginning with the Renaissance (ca. 14th-17th centuries) and characterized by a move from feudalism (or agrarianism) toward capitalism, industrialization, secularization, rationalization, and the nation-state and its constituent institutions and forms of surveillance (Barker 2005, 444), brought more groundbreaking researchers from Europe.
From Germany, a federal parliamentary republic in west-central Europe, and Austria, the following doctors made notable contributions: Rudolf Virchow, a German doctor, anthropologist, pathologist, prehistorian, biologist, and politician, known for his advancement of public health; Wilhelm Conrad Röntgen, a German physicist who, on November 8, 1895, produced and detected electromagnetic radiation in a wavelength range today known as X-rays or Röntgen rays, an achievement that earned him the first Nobel Prize in Physics in 1901; Karl Landsteiner, an Austrian biologist and physician; and Otto Loewi, a German-born pharmacologist whose discovery of acetylcholine helped enhance medical therapy.
In the United Kingdom, the following are considered important: Alexander Fleming, a Scottish biologist, pharmacologist, and botanist; Sir Joseph Lister, Bt., a British surgeon and a pioneer of antiseptic surgery, who promoted the idea of sterile surgery while working at the Glasgow Royal Infirmary; Francis Crick, an English molecular biologist, biophysicist, and neuroscientist, most noted for being a co-discoverer of the structure of the DNA molecule in 1953 together with James D. Watson; and Florence Nightingale, a celebrated English social reformer and statistician, and the founder of modern nursing.
From Spain, now a sovereign state and a member of the European Union located in southwestern Europe, on the Iberian Peninsula, came Santiago Ramón y Cajal, a pathologist, histologist, neuroscientist, and Nobel laureate who is considered the father of modern neuroscience, the study of the nervous system.
From New Zealand and Australia came Maurice Wilkins, a New Zealand-born English physicist and molecular biologist, and Nobel laureate whose research contributed to the scientific understanding of phosphorescence, isotope separation, optical microscopy and X-ray diffraction, and to the development of radar; Howard Florey, an Australian pharmacologist and pathologist who shared the Nobel Prize in Physiology or Medicine in 1945 with Sir Ernst Boris Chain and Sir Alexander Fleming for his role in the making of penicillin; and Frank Macfarlane Burnet, usually known as “Macfarlane” or “Mac Burnet,” an Australian virologist best known for his contributions to immunology.
With their respective countries, the following also did significant work:
The United States: William Williams Keen, the first brain surgeon in the United States; William Coley, an American bone surgeon and cancer researcher, pioneer of cancer immunotherapy; and James D. Watson, an American molecular biologist, geneticist, and zoologist, best known as a co-discoverer of the structure of DNA in 1953 with Francis Crick.
Italy: Salvador Luria, an Italian microbiologist.
Switzerland: Alexandre Yersin, a Swiss and French physician and bacteriologist.
Japan: Kitasato Shibasaburō, a Japanese physician and bacteriologist during the prewar period.
France: Jean-Martin Charcot, a French neurologist and professor of anatomical pathology; Claude Bernard, a French physiologist; and Paul Broca, a French physician, surgeon, anatomist, and anthropologist; etc.
And the others: Nikolai Korotkov, a Russian surgeon, a pioneer of 20th-century vascular surgery, and the inventor of the auscultatory technique for blood pressure measurement; Sir William Osler, a Canadian physician; and Harvey Cushing, an American neurosurgeon and a pioneer of brain surgery, and the first to describe Cushing’s syndrome.
As science and technology developed, medicine became more reliant upon medications, or pharmaceutical drugs, which can be loosely defined as chemical substances intended for use in the medical diagnosis, cure, treatment, or prevention of disease. Throughout history, and in Europe right up until the late 18th century, not only were animal and plant products used as medicine, but also human body parts and fluids.
Pharmacology is the branch of medicine and biology concerned with the study of drug action, where a drug can be broadly defined as any man-made, natural, or endogenous (within-cell) molecule which exerts a biochemical and/or physiological effect on the cell, tissue, organ, or organism; it developed from herbalism (“herbal medicine”), the study and use of the medicinal properties of plants. Today, many drugs are still derived from plants: atropine, ephedrine, warfarin, aspirin (also known as acetylsalicylic acid, a salicylate drug often used as an analgesic to relieve minor aches and pains, as an antipyretic to reduce fever, and as an anti-inflammatory medication), digoxin, vinca alkaloids, taxol, hyoscine, etc.
Vaccines were discovered by Edward Jenner, an English physician and scientist from Berkeley, Gloucestershire, who was the pioneer of smallpox vaccine; and Louis Pasteur, a French chemist and microbiologist who was one of the most important founders of medical microbiology.
The first antibiotic was arsphenamine/Salvarsan, also known as “compound 606,” a drug introduced at the beginning of the 1910s as the first effective treatment for syphilis and also used to treat trypanosomiasis. It was discovered in 1908 by Paul Ehrlich, a German physician and scientist who worked in the fields of hematology, immunology, and chemotherapy, after he observed that bacteria took up toxic dyes that human cells did not. The first major class of antibiotics (compounds or substances that kill or slow the growth of bacteria) was the sulfa drugs (“sulfonamides”), the basis of several groups of drugs, derived by German chemists originally from azo dyes (“azo compounds” are compounds bearing the functional group R-N=N-R’, in which R and R’ can be either aryl or alkyl).
Pharmacology has become increasingly sophisticated; modern biotechnology (“biotech”), generally accepted as the use of living systems and organisms to develop or make useful products, allows drugs targeted towards specific physiological processes to be developed, sometimes designed for compatibility with the body to reduce side effects. In medicine, a side effect is an effect, whether therapeutic or adverse, that is secondary to the one intended; although the term is predominantly employed to describe adverse effects, it can also apply to beneficial but unintended consequences of the use of a drug.
Genomics, a discipline in genetics concerned with the study of the genomes of organisms, and knowledge of human genetics, the study of inheritance as it occurs in human beings, are having some influence on medicine. The causative genes (a gene being a molecular unit of heredity of a living organism) of most monogenic genetic disorders (illnesses caused by abnormalities in genes or chromosomes, especially conditions present from before birth) have now been identified. The development of techniques in molecular biology, the branch of biology that deals with the molecular basis of biological activity, and in genetics is influencing medical technology, practice, and decision-making.
Evidence-based medicine (“EBM”), sometimes called evidence-based health care (“EBHC”) to broaden its application to allied healthcare professionals, is defined as “the conscientious, explicit and judicious use of current best evidence in making decisions about the care of individual patients.” It is a contemporary movement to establish the most effective algorithms (in mathematics and computer science, an algorithm is a step-by-step procedure for calculations) of practice (ways of doing things). It works through the use of systematic reviews, literature reviews focused on a research question that try to identify, appraise, select, and synthesize high-quality research evidence relevant to that question, and meta-analysis, which in statistics refers to methods of contrasting and combining results from different studies, in the hope of identifying patterns among study results, sources of disagreement among those results, or other interesting relationships that may come to light in the context of multiple studies.
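To make the combining step concrete: one common way a meta-analysis pools results is fixed-effect inverse-variance weighting, in which more precise studies (smaller variance) get more weight. The following is a minimal sketch with made-up study numbers; it is not any specific Cochrane or EBM method, just the basic statistical idea.

```python
# Minimal fixed-effect (inverse-variance) meta-analysis sketch.
# Each study reports an effect estimate and its variance; the pooled
# estimate is a weighted average with weights w_i = 1 / variance_i.

def fixed_effect_meta(effects, variances):
    """Combine study effect estimates with inverse-variance weights.

    Returns (pooled effect, variance of the pooled effect).
    """
    weights = [1.0 / v for v in variances]
    pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    pooled_var = 1.0 / sum(weights)  # the pooled estimate is more precise
    return pooled, pooled_var

# Three hypothetical studies measuring the same outcome
effects = [0.30, 0.10, 0.25]
variances = [0.04, 0.01, 0.02]

pooled, var = fixed_effect_meta(effects, variances)
print(round(pooled, 4), round(var, 6))  # → 0.1714 0.005714
```

Note how the second study, with the smallest variance, pulls the pooled estimate toward its own value of 0.10; the pooled variance (1/175) is smaller than any single study's, which is the statistical motivation for combining studies at all.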
The movement is facilitated by modern global information science/studies, an interdisciplinary field primarily concerned with the analysis, collection, classification, manipulation, storage, retrieval, and dissemination of information, which allows as much of the available evidence as possible to be collected and analyzed according to standard protocols that are then disseminated to healthcare providers. The Cochrane Collaboration, an independent nonprofit organization consisting of a group of over 28,000 volunteers in more than 100 countries, leads this movement. A 2001 review of 160 Cochrane systematic reviews revealed that, according to two readers, 21.3% of the reviews concluded insufficient evidence, 20% concluded evidence of no effect, and 22.5% concluded positive effect.
Technology can be viewed as an activity that forms or changes culture. Additionally, technology is the application of math, science, and the arts for the benefit of life as it is known. A modern example is the rise of communication technology—the activity of conveying information through the exchange of thoughts, messages, or information, as by speech, visuals, signals, writing, or behavior. It has lessened barriers to human interaction and, as a result, has helped spawn new subcultures; the rise of cyberculture, or the culture that has emerged, or is emerging, from the use of computer networks for communication, entertainment, and business, has, at its basis, the development of the Internet, a global system of interconnected computer networks that use the standard Internet protocol suite (often called TCP/IP, although not all applications use TCP) to serve billions of users worldwide, and the computer, a general-purpose device that can be programmed to carry out a finite set of arithmetic or logical operations. In much of the reflection on ICTs, the term “cyberculture” can clearly be identified as one of the frequently and flexibly used terms lacking an explicit meaning.
Not all technology enhances culture in a creative way; technology can also help facilitate political oppression, the persecution of an individual or group for political reasons, particularly for the purpose of restricting or preventing their ability to take part in the political life of a society, and war tools such as guns. As a cultural activity, technology predates both science, a systematic enterprise that builds and organizes knowledge in the form of testable explanations and predictions about the universe, and engineering, the science, skill, and profession of acquiring and applying scientific, economic, social, and practical knowledge in order to design and build structures, machines, devices, materials, and processes; each of these formalizes some aspects of technological endeavor.
Sir John Bertrand Gurdon (JBG) is a British developmental biologist, a scientist who studies the process by which organisms grow and develop.
JBG is best known for his pioneering research on somatic-cell nuclear transplantation (“SCNT”), which in genetics and developmental biology is a laboratory technique for creating a clone embryo with a donor nucleus, and on cloning, which in biology is the process of producing similar populations of genetically identical individuals, occurring in nature when organisms such as bacteria, insects, or plants reproduce asexually.
In 2009, Gurdon was awarded the Lasker Award, which has been awarded annually since 1946 to living persons who have made major contributions to medical science or who have performed public service on behalf of medicine. In 2012, he shared the Nobel Prize in Physiology or Medicine with Shinya Yamanaka; the prize, administered by the Nobel Foundation, is awarded once a year for outstanding discoveries in the fields of the life sciences and medicine.
Gurdon attended Eton College, usually referred to as “Eton,” a British independent boarding school for pupils aged between 13 and 18 years, where he ranked last out of the 250 boys in his year group at biology and was in the bottom set in every other science subject. A schoolmaster wrote a report stating, “I believe he has ideas about becoming a scientist; on his present showing that is quite ridiculous.” Gurdon later had this report framed; he told a reporter, “When you have problems like an experiment doesn’t work, which often happens, it’s nice to remind yourself that perhaps after all you are not so good at this job and the schoolmaster may have been right.”
Gurdon went to Christ Church, Oxford, one of the largest constituent colleges of the University of Oxford in England, to study classics (sometimes encompassing “Classical Studies” or “Classical Civilization”), the branch of the humanities comprising the languages, literature, philosophy, history, art, archaeology, and other culture of the ancient Mediterranean world (Bronze Age ca. 3000 BC to Late Antiquity ca. AD 300-600), especially Ancient Greece and Ancient Rome during Classical Antiquity (ca. 600 BC - AD 600). However, he switched to zoology, the branch of biology that relates to the animal kingdom, including the structure, embryology, evolution, classification, habits, and distribution of all animals, both living and extinct.
For his D.Phil. (the abbreviation of “Doctor of Philosophy” in English-speaking countries, originally “Dr.Phil.”), a postgraduate academic degree awarded by universities, he studied nuclear transplantation in the frog “Xenopus,” a genus of highly aquatic frogs native to Sub-Saharan Africa, with Michael Fischberg at Oxford.
Following postdoctoral work at the California Institute of Technology (commonly referred to as “Caltech”), a private research university located in Pasadena, California, United States, he returned to England and his early posts were at the Department of Zoology of the University of Oxford (1962-71), a university located in Oxford, England.
Gurdon has spent much of his research career at the University of Cambridge, a public research university located in Cambridge, United Kingdom. First, he worked at the Laboratory of Molecular Biology, or “LMB” (1971-83), a research institute in Cambridge, England, which was at the forefront of the revolution in molecular biology that occurred in the 1950s-60s and has since remained a major medical research laboratory with a much broader focus, funded by the Medical Research Council (MRC), a publicly funded government agency responsible for coordinating and funding medical research in the UK. After that, he worked at the Department of Zoology (1983 to date).
In 1989, he was a founding member of the Wellcome/CRC Institute for Cell Biology and Cancer (later Wellcome/CR UK) in Cambridge, and was its Chair until 2001. He was a member of the Nuffield Council on Bioethics (1991-1995), a UK-based independent charitable body which examines and reports on ethical issues raised by new advances in biological and medical research, and Master of Magdalene College, Cambridge, a constituent college of the University of Cambridge, England, from 1995 to 2002.
In 1958, Gurdon, then at the University of Oxford, a university located in Oxford, England, successfully cloned a frog using intact nuclei (in biology, a nucleus is a membrane-enclosed organelle found in eukaryotic cells) from the somatic cells of a tadpole of “Xenopus,” a genus of highly aquatic frogs native to Sub-Saharan Africa. This was an important extension of the work of Briggs and King in 1952 on transplanting nuclei from embryonic blastula cells (a blastula is a hollow sphere of cells formed during an early stage of embryonic development in animals).
Gurdon’s experiments captured the attention of the scientific community, and the tools and techniques he developed for nuclear transfer are still used today. The term clone (from the ancient Greek word for “twig”) had already been in use since the beginning of the 20th century in reference to plants. In 1963, the British biologist J. B. S. Haldane (known as Jack, though he used his initials in printed works), a British-born geneticist and evolutionary biologist generally credited with a central role in the development of neo-Darwinian thinking (popularized by Richard Dawkins’ 1976 work “The Selfish Gene”), became one of the first to use the word “clone” in reference to animals, in describing Gurdon’s results.
After the US tech giant won a $1 billion patent-infringement case against Samsung, the latter cancelled a lucrative contract with Apple to supply the LCD displays used in devices such as the iPhone, MacBooks, and iPad. The development means that by next year Apple devices will no longer be using Samsung displays.
Apple is already in talks with Taiwan Semiconductor Manufacturing Company (TSMC) to replace Samsung, which has also been manufacturing the CPUs used in the iPad and iPhone. The California-based consumer electronics giant is already looking for other display manufacturers after the cancelled contract with the South Korean company.
Samsung Display supplied the majority of Apple’s LCD displays for the first half of this year, shipping up to 15 million units. Now, Samsung is looking to compensate for the cancelled contracts by supplying Amazon and its sister company, Samsung Mobile.
Technology has affected human society—a group of people related to each other through persistent relations, or a large social grouping sharing the same geographical or virtual territory, subject to the same political authority and dominant cultural expectations—and its surroundings in a number of ways.
In many societies, technology has helped develop more advanced economies; an economy consists of the economic systems of a country or other area: the labor, capital, and land resources, and the manufacturing, production, trade, distribution, and consumption of goods and services of that area. This includes today’s global economy, the increasing economic interdependence of national economies across the world through a rapid increase in the cross-border movement of goods, services, technology, and capital. It has allowed the rise of a leisure class, a concept in the social sciences and political theory centered on models of social stratification in which people are grouped as those who have a lot of “free time,” or time spent away from business, work, and domestic chores.
Many technological processes produce unwanted by-products, known as pollution, the introduction of contaminants into the natural environment that cause adverse change, and deplete natural resources, to the detriment of the Earth (the third planet from the Sun, and the densest and fifth-largest of the eight planets in the Solar System) and its natural environment, encompassing all living and nonliving things occurring naturally on Earth or some region thereof.
Various implementations of technology influence the values of a society (a value being an absolute or relative ethical value, the assumption of which can be the basis for ethical action), and new technology often raises new ethical questions. Examples include the rise of the notion of efficiency in terms of human productivity, a term originally applied only to machines, and the challenge of traditional norms.
Philosophical debates have arisen over the present and future use of technology in society, with disagreements over whether technology improves the human condition, encompassing the unique and supposedly inescapable features of being human, or worsens it.
The following and similar movements criticize the pervasiveness of technology in the modern world, saying that it harms the environment and alienates people: Neo-Luddism, a personal world view opposing many forms of modern technology; and anarcho-primitivism, an anarchist critique of the origins and progress of civilization.
Proponents of ideologies like the following, however, view continued technological progress as beneficial to society and the human condition: transhumanism, abbreviated as “H+” or “h+,” an international intellectual and cultural movement that affirms the possibility and desirability of fundamentally transforming the human condition by developing and making widely available technologies to eliminate aging and to greatly enhance human intellectual, physical, and psychological capacities; and techno-progressivism (or tech-progressivism), a stance of active support for the convergence of technological change and social change.
Indeed, until recently, it was believed that the development of technology was restricted only to human beings, but recent scientific studies indicate that other primates, or mammals of the order “Primates,” which contains prosimians and simians, and certain dolphin communities, marine mammals closely related to whales and porpoises, have developed simple tools and learned to pass their knowledge to other generations.