79. Edward Jenner, (born May 17, 1749, Berkeley, Gloucestershire, England—died January 26, 1823, Berkeley), English surgeon and discoverer of vaccination for smallpox.
Jenner was born at a time when the patterns of British medical practice and education were undergoing gradual change. Slowly the division between the Oxford- or Cambridge-trained physicians and the apothecaries or surgeons—who were much less educated and who acquired their medical knowledge through apprenticeship rather than through academic work—was becoming less sharp, and hospital work was becoming much more important.
Jenner was a country youth, the son of a clergyman. Because Edward was only five when his father died, he was brought up by an older brother, who was also a clergyman. Edward acquired a love of nature that remained with him all his life. He attended grammar school and at the age of 13 was apprenticed to a nearby surgeon. In the following eight years Jenner acquired a sound knowledge of medical and surgical practice. On completing his apprenticeship at the age of 21, he went to London and became the house pupil of John Hunter, who was on the staff of St. George’s Hospital and was one of the most prominent surgeons in London. Even more important, however, he was an anatomist, biologist, and experimentalist of the first rank; not only did he collect biological specimens, but he also concerned himself with problems of physiology and function.
The firm friendship that grew between the two men lasted until Hunter’s death in 1793. From no one else could Jenner have received the stimuli that so confirmed his natural bent—a catholic interest in biological phenomena, disciplined powers of observation, sharpening of critical faculties, and a reliance on experimental investigation. From Hunter, Jenner received the characteristic advice, “Why think [i.e., speculate]—why not try the experiment?”
In addition to his training and experience in biology, Jenner made progress in clinical surgery. After studying in London from 1770 to 1773, he returned to country practice in Berkeley and enjoyed substantial success. He was capable, skillful, and popular. In addition to practicing medicine, he joined two medical groups for the promotion of medical knowledge and wrote occasional medical papers. He played the violin in a musical club, wrote light verse, and, as a naturalist, made many observations, particularly on the nesting habits of the cuckoo and on bird migration. He also collected specimens for Hunter; many of Hunter’s letters to Jenner have been preserved, but Jenner’s letters to Hunter have unfortunately been lost. After one disappointment in love in 1778, Jenner married in 1788.
Smallpox was widespread in the 18th century, and occasional outbreaks of special intensity resulted in a very high death rate. The disease, a leading cause of death at the time, respected no social class, and disfigurement was not uncommon in patients who recovered. The only means of combating smallpox was a primitive form of inoculation known as variolation: intentionally infecting a healthy person with “matter” taken from a patient sick with a mild attack of the disease. The practice, which originated in China and India, was based on two distinct concepts: first, that one attack of smallpox effectively protected against any subsequent attack and, second, that a person deliberately infected with a mild case of the disease would safely acquire such protection. It was, in present-day terminology, an “elective” infection, that is, one given to a person in good health. Unfortunately, the transmitted disease did not always remain mild, and mortality sometimes occurred. Furthermore, the inoculated person could disseminate the disease to others and thus act as a focus of infection.
Jenner had been impressed by the fact that a person who had suffered an attack of cowpox, a relatively harmless disease that could be contracted from cattle, could not take the smallpox; that is, such a person could not become infected, whether by accidental or intentional exposure to smallpox. Pondering this phenomenon, Jenner concluded that cowpox not only protected against smallpox but could be transmitted from one person to another as a deliberate mechanism of protection.
The story of the great breakthrough is well known. In May 1796 Jenner found a young dairymaid, Sarah Nelmes, who had fresh cowpox lesions on her hand. On May 14, using matter from Sarah’s lesions, he inoculated an eight-year-old boy, James Phipps, who had never had smallpox. Phipps became slightly ill over the course of the next nine days but was well on the 10th. On July 1 Jenner inoculated the boy again, this time with smallpox matter. No disease developed; protection was complete. In 1798 Jenner, having added further cases, published privately a slender book entitled An Inquiry into the Causes and Effects of the Variolae Vaccinae.
The reaction to the publication was not immediately favourable. Jenner went to London seeking volunteers for vaccination but, in a stay of three months, was not successful. In London vaccination became popularized through the activities of others, particularly the surgeon Henry Cline, to whom Jenner had given some of the inoculant, and the doctors George Pearson and William Woodville. Difficulties arose, some of them quite unpleasant; Pearson tried to take credit away from Jenner, and Woodville, a physician in a smallpox hospital, contaminated the cowpox matter with smallpox virus. Vaccination rapidly proved its value, however, and Jenner became intensely active promoting it. The procedure spread rapidly to America and the rest of Europe and soon was carried around the world.
Complications were many. Vaccination seemed simple, but the vast number of persons who practiced it did not necessarily follow the procedure that Jenner had recommended, and deliberate or unconscious innovations often impaired the effectiveness. Pure cowpox vaccine was not always easy to obtain, nor was it easy to preserve or transmit. Furthermore, the biological factors that produce immunity were not yet understood; much information had to be gathered and a great many mistakes made before a fully effective procedure could be developed, even on an empirical basis.
Despite errors and occasional chicanery, the death rate from smallpox plunged. Jenner received worldwide recognition and many honours, but he made no attempt to enrich himself through his discovery and actually devoted so much time to the cause of vaccination that his private practice and personal affairs suffered severely. Parliament voted him a sum of £10,000 in 1802 and a further sum of £20,000 in 1806. Jenner not only received honours but also aroused opposition and found himself subjected to attacks and calumnies, despite which he continued his activities on behalf of vaccination. His wife, ill with tuberculosis, died in 1815, and Jenner retired from public life.
It appears to me that if one wants to make progress in mathematics, one should study the masters and not the pupils. - Niels Henrik Abel.
Nothing is better than reading and gaining more and more knowledge - Stephen William Hawking.
I see.
Hi, mathaholic!
80. Hans Christian Ørsted (often rendered Oersted in English; 14 August 1777 – 9 March 1851) was a Danish physicist and chemist who discovered that electric currents create magnetic fields, an important aspect of electromagnetism. He is still known today for Oersted's Law. He shaped post-Kantian philosophy and advances in science throughout the late 19th century.
In 1824, Ørsted founded Selskabet for Naturlærens Udbredelse (SNU), a society to disseminate knowledge of the natural sciences. He was also the founder of predecessor organizations which eventually became the Danish Meteorological Institute and the Danish Patent and Trademark Office. Ørsted was the first modern thinker to explicitly describe and name the thought experiment.
A leader of the so-called Danish Golden Age, Ørsted was a close friend of Hans Christian Andersen and the brother of politician and jurist Anders Sandøe Ørsted, who eventually served as Danish prime minister (1853–54).
The oersted (Oe), the cgs unit of magnetic H-field strength, is named after him.
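Since the post mentions the cgs unit named for Ørsted, a quick numeric sketch may help: the oersted relates to the SI unit of H-field strength (ampere per metre) by 1 Oe = 1000/(4π) A/m ≈ 79.577 A/m. A minimal Python conversion, with an illustrative field value not taken from the text above:

```python
import math

# 1 Oe = 1000 / (4*pi) A/m, from the cgs -> SI definition of H
OE_TO_A_PER_M = 1000.0 / (4.0 * math.pi)  # ~79.577

def oersted_to_si(h_oe: float) -> float:
    """Convert magnetic field strength H from oersted (cgs) to A/m (SI)."""
    return h_oe * OE_TO_A_PER_M

# Earth's surface field is on the order of 0.5 Oe (illustrative value)
print(oersted_to_si(0.5))  # ~39.79 A/m
```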
81. Melvil Dewey, (born Dec. 10, 1851, Adams Center, N.Y., U.S.—died Dec. 26, 1931, Lake Placid, Fla.), American librarian who devised the Dewey Decimal Classification for library cataloging and, probably more than any other individual, was responsible for the development of library science in the United States.
Dewey graduated in 1874 from Amherst College and became acting librarian at that institution. In 1876 he published A Classification and Subject Index for Cataloguing and Arranging the Books and Pamphlets of a Library, in which he outlined what became known as the Dewey Decimal Classification. This system was gradually adopted by libraries throughout the English-speaking world. In 1877 Dewey moved to Boston, where, with R.R. Bowker and Frederick Leypoldt, he founded and edited the Library Journal. He was also one of the founders of the American Library Association. In 1883 he became librarian of Columbia College, New York City, and there set up the School of Library Economy, the first institution for training librarians in the United States. The school was moved to Albany, N.Y., as the State Library School under his direction.
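To make the classification idea concrete, here is a minimal, hypothetical Python sketch of how a Dewey call number encodes a hierarchy by successive decimal refinement. The class labels are only the ten top-level main classes, not the full schedule, and the helper function is an illustration, not library software:

```python
# Top-level Dewey main classes (hundreds digit); divisions and sections
# refine these by the tens digit, units digit, and decimal places.
MAIN_CLASSES = {
    "0": "Computer science, information & general works",
    "1": "Philosophy & psychology",
    "2": "Religion",
    "3": "Social sciences",
    "4": "Language",
    "5": "Science",
    "6": "Technology",
    "7": "Arts & recreation",
    "8": "Literature",
    "9": "History & geography",
}

def dewey_hierarchy(call_number: str):
    """Return the nested prefixes of a Dewey number, most general first.

    Assumes a standard three-digit whole part, e.g.
    "516.35" -> ["500", "510", "516", "516.3", "516.35"].
    """
    whole, _, decimals = call_number.partition(".")
    levels = [whole[0] + "00", whole[:2] + "0", whole]
    for i in range(1, len(decimals) + 1):
        levels.append(whole + "." + decimals[:i])
    # drop duplicates (e.g. "500" -> ["500"]) while preserving order
    return list(dict.fromkeys(levels))

print(MAIN_CLASSES["5"])          # Science
print(dewey_hierarchy("516.35"))  # ['500', '510', '516', '516.3', '516.35']
```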
From 1889 to 1906 he was director of the New York State Library. He also served as secretary of the State University of New York (1889–1900) and as state director of libraries (1904–06). He completely reorganized the New York state library, making it one of the most efficient in the United States, and established the system of traveling libraries and picture collections.
82. Sir Humphry Davy, Baronet, (born Dec. 17, 1778, Penzance, Cornwall, Eng.—died May 29, 1829, Geneva), English chemist who discovered several chemical elements (including sodium and potassium) and compounds, invented the miner’s safety lamp, and became one of the greatest exponents of the scientific method.
Early life.
Davy was the elder son of middle-class parents, who owned an estate in Ludgvan. He was educated at the grammar school in nearby Penzance and, in 1793, at Truro. In 1795, a year after the death of his father, Robert, he was apprenticed to a surgeon and apothecary, and he hoped eventually to qualify in medicine. An exuberant, affectionate, and popular lad, of quick wit and lively imagination, he was fond of composing verses, sketching, making fireworks, fishing, shooting, and collecting minerals. He loved to wander, one pocket filled with fishing tackle and the other with rock specimens; he never lost his intense love of nature and, particularly, of mountain and water scenery.
While still a youth, ingenuous and somewhat impetuous, Davy had plans for a volume of poems, but he began the serious study of science in 1797, and these visions “fled before the voice of truth.” He was befriended by Davies Giddy (later Gilbert; president of the Royal Society, 1827–30), who offered him the use of his library in Tredrea and took him to a chemistry laboratory that was well equipped for that day. There he formed strongly independent views on topics of the moment, such as the nature of heat, light, and electricity and the chemical and physical doctrines of A.-L. Lavoisier. In his small private laboratory, he prepared and inhaled nitrous oxide (laughing gas) in order to test a claim that it was the “principle of contagion,” that is, that it caused diseases.

On Gilbert’s recommendation, he was appointed (1798) chemical superintendent of the Pneumatic Institution, founded at Clifton to inquire into the possible therapeutic uses of various gases. Davy attacked the problem with characteristic enthusiasm, evincing an outstanding talent for experimental inquiry. He investigated the composition of the oxides and acids of nitrogen, as well as ammonia, and persuaded his scientific and literary friends, including Samuel Taylor Coleridge, Robert Southey, and P.M. Roget, to report the effects of inhaling nitrous oxide. He nearly lost his own life inhaling water gas, a mixture of hydrogen and carbon monoxide sometimes used as fuel. The account of his work, published as Researches, Chemical and Philosophical (1800), immediately established his reputation, and he was invited to lecture at the newly founded Royal Institution of Great Britain in London, where he moved in 1801, with the promise of help from the British-American scientist Sir Benjamin Thompson (Count von Rumford), the British naturalist Sir Joseph Banks, and the English chemist and physicist Henry Cavendish in furthering his researches; e.g., on voltaic cells, early forms of electric batteries.
His carefully prepared and rehearsed lectures rapidly became important social functions and added greatly to the prestige of science and the institution. In 1802 he became professor of chemistry. His duties included a special study of tanning: he found catechu, the extract of a tropical plant, as effective as and cheaper than the usual oak extracts, and his published account was long used as a tanner’s guide. In 1803 he was admitted a fellow of the Royal Society and an honorary member of the Dublin Society and delivered the first of an annual series of lectures before the board of agriculture. This led to his Elements of Agricultural Chemistry (1813), the only systematic work available for many years. For his researches on voltaic cells, tanning, and mineral analysis, he received the Copley Medal in 1805. He was elected secretary of the Royal Society in 1807.
Major discoveries.
Davy early concluded that the production of electricity in simple electrolytic cells resulted from chemical action and that chemical combination occurred between substances of opposite charge. He therefore reasoned that electrolysis, the interactions of electric currents with chemical compounds, offered the most likely means of decomposing all substances to their elements. These views were explained in 1806 in his lecture “On Some Chemical Agencies of Electricity,” for which, despite the fact that England and France were at war, he received the Napoleon Prize from the Institut de France (1807). This work led directly to the isolation of sodium and potassium from their compounds (1807) and of the alkaline-earth metals from theirs (1808). He also discovered boron (by heating borax with potassium), hydrogen telluride, and hydrogen phosphide (phosphine). He showed the correct relation of chlorine to hydrochloric acid and the untenability of the earlier name (oxymuriatic acid) for chlorine; this negated Lavoisier’s theory that all acids contained oxygen. He explained the bleaching action of chlorine (through its liberation of oxygen from water) and discovered two of its oxides (1811 and 1815), but his views on the nature of chlorine were disputed. He was not aware that chlorine is a chemical element, and experiments designed to reveal oxygen in chlorine failed.
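Davy’s use of electrolysis here was qualitative; the quantitative rule relating charge passed to mass liberated came later, with Faraday. As a hedged numeric sketch of that later result (m = M·I·t / (n·F), with all input values below purely illustrative):

```python
# Faraday's (later) first law of electrolysis: the mass liberated at an
# electrode is m = M * I * t / (n * F), where M is molar mass (g/mol),
# I current (A), t time (s), n electrons transferred per ion, and F the
# Faraday constant. This quantifies the decomposition Davy exploited.
F = 96485.0  # Faraday constant, C/mol

def mass_liberated_g(molar_mass_g: float, current_a: float,
                     seconds: float, n_electrons: int) -> float:
    """Mass (g) of an element liberated by electrolysis."""
    return molar_mass_g * current_a * seconds / (n_electrons * F)

# Illustrative: potassium (M ~ 39.10 g/mol, n = 1) at 1 A for one hour
print(mass_liberated_g(39.10, 1.0, 3600.0, 1))  # ~1.46 g
```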
In 1810 and 1811 he lectured to large audiences at Dublin (on agricultural chemistry, the elements of chemical philosophy, geology) and received £1,275 in fees, as well as the honorary degree of LL.D., from Trinity College. In 1812 he was knighted by the Prince Regent (April 8), delivered a farewell lecture to members of the Royal Institution (April 9), and married Jane Apreece, a wealthy widow well known in social and literary circles in England and Scotland (April 11). He also published the first part of the Elements of Chemical Philosophy, which contained much of his own work; his plan was too ambitious, however, and nothing further appeared. Its completion, according to a Swedish chemist, J.J. Berzelius, would have “advanced the science of chemistry a full century.”
His last important act at the Royal Institution, of which he remained honorary professor, was to interview the young Michael Faraday, later to become one of England’s great scientists, who became laboratory assistant there in 1813 and accompanied the Davys on a European tour (1813–15). By permission of Napoleon, he travelled through France, meeting many prominent scientists, and was presented to the empress Marie Louise. With the aid of a small portable laboratory and of various institutions in France and Italy, he investigated the substance “X” (later called iodine), whose properties and similarity to chlorine he quickly discovered; further work on various compounds of iodine and chlorine was done before he reached Rome. He also analyzed many specimens of classical pigments and proved that diamond is a form of carbon.
Later years.
Shortly after his return, he studied, for the Society for Preventing Accidents in Coal Mines, the conditions under which mixtures of firedamp and air explode. This led to the invention of the miner’s safety lamp and to subsequent researches on flame, for which he received the Rumford medals (gold and silver) from the Royal Society and, from the northern mine owners, a service of plate (eventually sold to found the Davy Medal). After being created a baronet in 1818, he again went to Italy, inquiring into volcanic action and trying unsuccessfully to find a way of unrolling the papyri found at Herculaneum. In 1820 he became president of the Royal Society, a position he held until 1827. In 1823–25 he was associated with the politician and writer John Wilson Croker in founding the Athenaeum Club, of which he was an original trustee, and with the colonial governor Sir Thomas Stamford Raffles in founding the Zoological Society and in furthering the scheme for zoological gardens in Regent’s Park, London (opened in 1828). During this period, he examined magnetic phenomena caused by electricity and electrochemical methods for preventing saltwater corrosion of copper sheathing on ships by means of iron and zinc plates. Though the protective principles were made clear, considerable fouling occurred, and the method’s failure greatly vexed him. But he was, as he said, “burned out.” His Bakerian lecture for 1826, “On the Relation of Electrical and Chemical Changes,” contained his last known thoughts on electrochemistry and earned him the Royal Society’s Royal Medal.
Davy’s health was by then failing rapidly; in 1827 he departed for Europe and, in the summer, was forced to resign the presidency of the Royal Society, being succeeded by Davies Gilbert. Having to forgo business and field sports, Davy wrote Salmonia: or Days of Fly Fishing (1828), a book on fishing (after the manner of Izaak Walton) that contained engravings from his own drawings. After a last, short visit to England, he returned to Italy, settling at Rome in February 1829—“a ruin amongst ruins.” Though partly paralyzed by a stroke, he spent his last months writing a series of dialogues, published posthumously as Consolations in Travel, or the Last Days of a Philosopher (1830).
83. Alfred Bernhard Nobel, (born October 21, 1833, Stockholm, Sweden—died December 10, 1896, San Remo, Italy), Swedish chemist, engineer, and industrialist, who invented dynamite and other, more powerful explosives and who also founded the Nobel Prizes.
Alfred Bernhard Nobel was the fourth son of Immanuel and Caroline Nobel. Immanuel was an inventor and engineer who had married Caroline Andrietta Ahlsell in 1827. The couple had eight children, of whom only Alfred and three brothers reached adulthood. Alfred was prone to illness as a child, but he enjoyed a close relationship with his mother and displayed a lively intellectual curiosity from an early age. He was interested in explosives, and he learned the fundamentals of engineering from his father. Immanuel, meanwhile, had failed at various business ventures until moving in 1837 to St. Petersburg in Russia, where he prospered as a manufacturer of explosive mines and machine tools. The Nobel family left Stockholm in 1842 to join the father in St. Petersburg. Alfred’s newly prosperous parents were now able to send him to private tutors, and he proved to be an eager pupil. He was a competent chemist by age 16 and was fluent in English, French, German, and Russian, as well as Swedish.
Alfred Nobel left Russia in 1850 to spend a year in Paris studying chemistry and then spent four years in the United States working under the direction of John Ericsson, the builder of the ironclad warship Monitor. Upon his return to St. Petersburg, Nobel worked in his father’s factory, which made military equipment during the Crimean War. After the war ended in 1856, the company had difficulty switching to the peacetime production of steamboat machinery, and it went bankrupt in 1859.
Alfred and his parents returned to Sweden, while his brothers Robert and Ludvig stayed behind in Russia to salvage what was left of the family business. Alfred soon began experimenting with explosives in a small laboratory on his father’s estate. At the time, the only dependable explosive for use in mines was black powder, a form of gunpowder. A recently discovered liquid compound, nitroglycerin, was a much more powerful explosive, but it was so unstable that it could not be handled with any degree of safety. Nevertheless, Nobel in 1862 built a small factory to manufacture nitroglycerin, and at the same time he undertook research in the hope of finding a safe way to control the explosive’s detonation. In 1863 he invented a practical detonator consisting of a wooden plug inserted into a larger charge of nitroglycerin held in a metal container; the explosion of the plug’s small charge of black powder served to detonate the much more powerful charge of liquid nitroglycerin. This detonator marked the beginning of Nobel’s reputation as an inventor as well as the fortune he was to acquire as a maker of explosives. In 1865 Nobel invented an improved detonator called a blasting cap; it consisted of a small metal cap containing a charge of mercury fulminate that could be exploded by either shock or moderate heat. The invention of the blasting cap inaugurated the modern use of high explosives.
Nitroglycerin itself, however, remained difficult to transport and extremely dangerous to handle. So dangerous, in fact, that Nobel’s nitroglycerin factory blew up in 1864, killing his younger brother Emil and several other people. Undaunted by this tragic accident, Nobel built several factories to manufacture nitroglycerin for use in concert with his blasting caps. These factories were as safe as the knowledge of the time allowed, but accidental explosions still occasionally occurred. Nobel’s second important invention was that of dynamite in 1867. By chance, he discovered that nitroglycerin was absorbed to dryness by kieselguhr, a porous siliceous earth, and the resulting mixture was much safer to use and easier to handle than nitroglycerin alone. Nobel named the new product dynamite (from Greek dynamis, “power”) and was granted patents for it in Great Britain (1867) and the United States (1868). Dynamite established Nobel’s fame worldwide and was soon put to use in blasting tunnels, cutting canals, and building railways and roads.
In the 1870s and ’80s Nobel built a network of factories throughout Europe to manufacture dynamite, and he formed a web of corporations to produce and market his explosives. He also continued to experiment in search of better ones, and in 1875 he invented a more powerful form of dynamite, blasting gelatin, which he patented the following year. Again by chance, he had discovered that mixing a solution of nitroglycerin with a fluffy substance known as nitrocellulose results in a tough, plastic material that has a high water resistance and greater blasting power than ordinary dynamites. In 1887 Nobel introduced ballistite, one of the first nitroglycerin smokeless powders and a precursor of cordite. Although Nobel held the patents to dynamite and his other explosives, he was in constant conflict with competitors who stole his processes, a fact that forced him into protracted patent litigation on several occasions.
Nobel’s brothers Ludvig and Robert, in the meantime, had developed newly discovered oilfields near Baku (now in Azerbaijan) along the Caspian Sea and had themselves become immensely wealthy. Alfred’s worldwide interests in explosives, along with his own holdings in his brothers’ companies in Russia, brought him a large fortune. In 1893 he became interested in Sweden’s arms industry, and the following year he bought an ironworks at Bofors, in Värmland, that became the nucleus of the well-known Bofors arms factory. Besides explosives, Nobel made many other inventions, such as artificial silk and leather, and altogether he registered more than 350 patents in various countries.
Nobel’s complex personality puzzled his contemporaries. Although his business interests required him to travel almost constantly, he remained a lonely recluse who was prone to fits of depression. He led a retired and simple life and was a man of ascetic habits, yet he could be a courteous dinner host, a good listener, and a man of incisive wit. He never married, and apparently preferred the joys of inventing to those of romantic attachment. He had an abiding interest in literature and wrote plays, novels, and poems, almost all of which remained unpublished. He had amazing energy and found it difficult to relax after intense bouts of work. Among his contemporaries, he had the reputation of a liberal or even a socialist, but he actually distrusted democracy, opposed suffrage for women, and maintained an attitude of benign paternalism toward his many employees. Though Nobel was essentially a pacifist and hoped that the destructive powers of his inventions would help bring an end to war, his view of mankind and nations was pessimistic.
By 1895 Nobel had developed angina pectoris, and he died of a cerebral hemorrhage at his villa in San Remo, Italy, in 1896. At his death his worldwide business empire consisted of more than 90 factories manufacturing explosives and ammunition. The opening of his will, which he had drawn up in Paris on November 27, 1895, and had deposited in a bank in Stockholm, contained a great surprise for his family, friends, and the general public. He had always been generous in humanitarian and scientific philanthropies, and he left the bulk of his fortune in trust to establish what came to be the most highly regarded of international awards, the Nobel Prizes.
We can only speculate about the reasons for Nobel’s establishment of the prizes that bear his name. He was reticent about himself, and he confided in no one about his decision in the months preceding his death. The most plausible assumption is that a bizarre incident in 1888 may have triggered the train of reflection that culminated in his bequest for the Nobel Prizes. That year Alfred’s brother Ludvig had died while staying in Cannes, France. The French newspapers reported Ludvig’s death but confused him with Alfred, and one paper sported the headline “Le marchand de la mort est mort” (“The merchant of death is dead.”) Perhaps Alfred Nobel established the prizes to avoid precisely the sort of posthumous reputation suggested by this premature obituary. It is certain that the actual awards he instituted reflect his lifelong interest in the fields of physics, chemistry, physiology, and literature. There is also abundant evidence that his friendship with the prominent Austrian pacifist Bertha von Suttner inspired him to establish the prize for peace.
Nobel himself, however, remains a figure of paradoxes and contradictions: a brilliant, lonely man, part pessimist and part idealist, who invented the powerful explosives used in modern warfare but also established the world’s most prestigious prizes for intellectual services rendered to humanity.
84. Valentina Vladimirovna Tereshkova (born 6 March 1937) is a Russian former cosmonaut. She is the first woman to have flown in space, having been selected from more than four hundred applicants and five finalists to pilot Vostok 6 on 16 June 1963. In order to join the Cosmonaut Corps, Tereshkova was honorarily inducted into the Soviet Air Force and thus she also became the first civilian to fly in space.
Before her recruitment as a cosmonaut, Tereshkova was a textile-factory assembly worker and an amateur skydiver. After the dissolution of the first group of female cosmonauts in 1969, she became a prominent member of the Communist Party of the Soviet Union, holding various political offices. She remained politically active following the collapse of the Soviet Union and is still regarded as a hero in post-Soviet Russia.
In 2013, she offered to go on a one-way trip to Mars if the opportunity arose. At the opening ceremony of the 2014 Winter Olympics, she was a carrier of the Olympic flag.
85. Chester F. Carlson, (born Feb. 8, 1906, Seattle, Wash., U.S.—died Sept. 19, 1968, New York, N.Y.), American physicist who was the inventor of xerography, an electrostatic dry-copying process that found applications ranging from office copying to reproducing out-of-print books.
By age 14 Carlson was supporting his invalid parents, yet he managed to earn a college degree from the California Institute of Technology, Pasadena, in 1930. After a short time spent with the Bell Telephone Company, he obtained a position with the patent department of P.R. Mallory Company, a New York electronics firm.
Plagued by the difficulty of getting copies of patent drawings and specifications, Carlson began in 1934 to look for a quick, convenient way to copy line drawings and text. Since numerous large corporations were already working on photographic or chemical copying processes, he turned to electrostatics for a solution to the problem. Four years later he succeeded in making the first xerographic copy.
Carlson obtained the first of many patents for the xerographic process and tried unsuccessfully to interest someone in developing and marketing his invention. More than 20 companies turned him down. Finally, in 1944, he persuaded Battelle Memorial Institute, Columbus, Ohio, a nonprofit industrial research organization, to undertake developmental work. In 1947 a small firm in Rochester, N.Y., the Haloid Company (later the Xerox Corporation), obtained the commercial rights to xerography, and 11 years later Xerox introduced its first office copier. Carlson’s royalty rights and stock in Xerox Corporation made him a multimillionaire.
86. André-Marie Ampère, (born Jan. 22, 1775, Lyon, France—died June 10, 1836, Marseille), French physicist who founded and named the science of electrodynamics, now known as electromagnetism. His name endures in everyday life in the ampere, the unit for measuring electric current.
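The science Ampère founded and named can be illustrated by its simplest textbook result (not stated in the passage itself): the magnetic field a distance r from a long straight wire carrying current I has magnitude B = μ0·I / (2πr). A minimal sketch, with illustrative numbers:

```python
import math

MU_0 = 4.0e-7 * math.pi  # vacuum permeability, T*m/A

def field_of_straight_wire(current_a: float, distance_m: float) -> float:
    """Magnitude of B (tesla) at distance r from a long straight wire,
    B = mu0 * I / (2 * pi * r)."""
    return MU_0 * current_a / (2.0 * math.pi * distance_m)

# Illustrative: 1 A measured 1 cm from the wire
print(field_of_straight_wire(1.0, 0.01))  # ~2e-5 T (about 0.2 gauss)
```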
Early life
Ampère, who was born into a prosperous bourgeois family during the height of the French Enlightenment, personified the scientific culture of his day. His father, Jean-Jacques Ampère, was a successful merchant, and also an admirer of the philosophy of Jean-Jacques Rousseau, whose theories of education, as outlined in his treatise Émile, were the basis of Ampère’s education. Rousseau argued that young boys should avoid formal schooling and pursue instead an “education direct from nature.” Ampère’s father actualized this ideal by allowing his son to educate himself within the walls of his well-stocked library. French Enlightenment masterpieces such as Georges-Louis Leclerc, comte de Buffon’s Histoire naturelle, générale et particulière (begun in 1749) and Denis Diderot and Jean Le Rond d’Alembert’s Encyclopédie (volumes added between 1751 and 1772) thus became Ampère’s schoolmasters. In addition, he used his access to the latest mathematical books to begin teaching himself advanced mathematics at age 12. His mother was a devout woman, so Ampère was also initiated into the Catholic faith along with Enlightenment science. The French Revolution (1787–99) that erupted during his youth was also formative. Ampère’s father was called into public service by the new revolutionary government, becoming a justice of the peace in a small town near Lyon. Yet when the Jacobin faction seized control of the Revolutionary government in 1792, Jean-Jacques Ampère resisted the new political tides, and he was guillotined on Nov. 24, 1793, as part of the Jacobin purges of the period.
While the French Revolution brought these personal traumas, it also created new institutions of science that ultimately became central to André-Marie Ampère’s professional success. He took his first regular job in 1799 as a modestly paid mathematics teacher, which gave him the financial security to marry and father his first child, Jean-Jacques, the next year. (Jean-Jacques Ampère eventually achieved his own fame as a scholar of languages.) Ampère’s maturation corresponded with the transition to the Napoleonic regime in France, and the young father and teacher found new opportunities for success within the technocratic structures favoured by the new French emperor.
In 1802 Ampère was appointed a professor of physics and chemistry at the École Centrale in Bourg-en-Bresse. He used his time in Bourg to research mathematics, producing Considérations sur la théorie mathématique du jeu (1802; “Considerations on the Mathematical Theory of Games”), a treatise on mathematical probability that he sent to the Paris Academy of Sciences in 1803. After the death of his wife in July 1803, Ampère moved to Paris, where he assumed a tutoring post at the new École Polytechnique in 1804. Despite his lack of formal qualifications, Ampère was appointed a professor of mathematics at the school in 1809. In addition to holding positions at this school until 1828, in 1819 and 1820 Ampère offered courses in philosophy and astronomy, respectively, at the University of Paris, and in 1824 he was elected to the prestigious chair in experimental physics at the Collège de France. In 1814 Ampère was invited to join the class of mathematicians in the new Institut Impérial, the umbrella under which the reformed state Academy of Sciences would sit.
Ampère engaged in a diverse array of scientific inquiries during these years leading up to his election to the academy—writing papers and engaging in topics ranging from mathematics and philosophy to chemistry and astronomy. Such breadth was customary among the leading scientific intellectuals of the day.
Founding of electromagnetism
Had Ampère died before 1820, his name and work would likely have been forgotten. In that year, however, Ampère’s friend and eventual eulogist François Arago demonstrated before the members of the French Academy of Sciences the surprising discovery of Danish physicist Hans Christian Ørsted that a magnetic needle is deflected by an adjacent electric current. Ampère was well prepared to throw himself fully into this new line of research.
Ampère immediately set to work developing a mathematical and physical theory to understand the relationship between electricity and magnetism. Extending Ørsted’s experimental work, Ampère showed that two parallel wires carrying electric currents repel or attract each other, depending on whether the currents flow in the same or opposite directions, respectively. He also applied mathematics in generalizing physical laws from these experimental results. Most important was the principle that came to be called Ampère’s law, which states that the mutual action of two lengths of current-carrying wire is proportional to their lengths and to the intensities of their currents. Ampère also applied this same principle to magnetism, showing the harmony between his law and French physicist Charles Augustin de Coulomb’s law of magnetic action. Ampère’s devotion to, and skill with, experimental techniques anchored his science within the emerging fields of experimental physics.
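In modern units (which postdate Ampère), the proportionality he established for two long parallel wires takes the familiar form F/L = μ₀·I₁·I₂/(2πd), attractive when the currents flow the same way. A minimal sketch, using the modern constant rather than anything in Ampère's own notation:

```python
import math

MU_0 = 4 * math.pi * 1e-7  # vacuum permeability, N/A^2 (pre-2019 exact value)

def force_per_length(i1, i2, d):
    """Magnitude of the force per unit length (N/m) between two long,
    parallel, current-carrying wires separated by distance d (m).

    Modern statement of Ampere's force law: F/L = mu0 * I1 * I2 / (2 pi d).
    Currents in the same direction attract; opposite directions repel.
    """
    return MU_0 * i1 * i2 / (2 * math.pi * d)

# Two 1 A currents 1 m apart: 2e-7 N/m, the figure long used
# to define the ampere itself.
print(force_per_length(1.0, 1.0, 1.0))
```

The 2 × 10⁻⁷ N/m result for unit currents at unit separation is the basis on which the 1881 convention's ampere was later formalized.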
Ampère also offered a physical understanding of the electromagnetic relationship, theorizing the existence of an “electrodynamic molecule” (the forerunner of the idea of the electron) that served as the constituent element of electricity and magnetism. Using this physical understanding of electromagnetic motion, Ampère developed a physical account of electromagnetic phenomena that was both empirically demonstrable and mathematically predictive. In 1827 Ampère published his magnum opus, Mémoire sur la théorie mathématique des phénomènes électrodynamiques uniquement déduite de l’expérience (Memoir on the Mathematical Theory of Electrodynamic Phenomena, Uniquely Deduced from Experience), the work that coined the name of his new science, electrodynamics, and became known ever after as its founding treatise. In recognition of his contribution to the making of modern electrical science, an international convention signed in 1881 established the ampere as a standard unit of electrical measurement, along with the coulomb, volt, ohm, and watt, which are named, respectively, after Ampère’s contemporaries Coulomb, Alessandro Volta of Italy, Georg Ohm of Germany, and James Watt of Scotland.
The 1827 publication of Ampère’s synoptic Mémoire brought to a close his feverish work over the previous seven years on the new science of electrodynamics. The text also marked the end of his original scientific work. His health began to fail, and he died while performing a university inspection, decades before his new science was canonized as the foundation stone for the modern science of electromagnetism.
87. Wilhelm Conrad Röntgen, Röntgen also spelled Roentgen (born March 27, 1845, Lennep, Prussia [now Remscheid, Germany]—died February 10, 1923, Munich, Germany), physicist who was a recipient of the first Nobel Prize for Physics, in 1901, for his discovery of X-rays, which heralded the age of modern physics and revolutionized diagnostic medicine.
Röntgen studied at the Polytechnic in Zürich and then was professor of physics at the universities of Strasbourg (1876–79), Giessen (1879–88), Würzburg (1888–1900), and Munich (1900–20). His research also included work on elasticity, capillary action of fluids, specific heats of gases, conduction of heat in crystals, absorption of heat by gases, and piezoelectricity.
In 1895, while experimenting with electric current flow in a partially evacuated glass tube (cathode-ray tube), Röntgen observed that a nearby piece of barium platinocyanide gave off light when the tube was in operation. He theorized that when the cathode rays (electrons) struck the glass wall of the tube, some unknown radiation was formed that traveled across the room, struck the chemical, and caused the fluorescence. Further investigation revealed that paper, wood, and aluminum, among other materials, are transparent to this new form of radiation. He found that it affected photographic plates, and, since it did not noticeably exhibit any properties of light, such as reflection or refraction, he mistakenly thought the rays were unrelated to light. In view of its uncertain nature, he called the phenomenon X-radiation, though it also became known as Röntgen radiation. He took the first X-ray photographs, of the interiors of metal objects and of the bones in his wife’s hand.
88. Sir Joseph Lister, Baronet (born April 5, 1827, Upton, Essex, Eng.—died Feb. 10, 1912, Walmer, Kent), British surgeon and medical scientist who was the founder of antiseptic medicine and a pioneer in preventive medicine. While his method, based on the use of antiseptics, is no longer employed, his principle—that bacteria must never gain entry to an operation wound—remains the basis of surgery to this day. He was made a baronet in 1883 and raised to the peerage in 1897.
Education
Lister was the second son of Joseph Jackson Lister and his wife, Isabella Harris, members of the Society of Friends, or Quakers. J.J. Lister, a wine merchant and amateur physicist and microscopist, was elected a fellow of the Royal Society for his discovery that led to the modern achromatic (non-colour-distorting) microscope.
While both parents took an active part in Lister’s education, his father instructing him in natural history and the use of the microscope, Lister received his formal schooling in two Quaker institutions, which laid far more emphasis upon natural history and science than did other schools. He became interested in comparative anatomy, and, before his 16th birthday, he had decided upon a surgical career.
After taking an arts course at University College, London, he enrolled in the faculty of medical science in October 1848. A brilliant student, he was graduated a bachelor of medicine with honours in 1852; in the same year he became a fellow of the Royal College of Surgeons and house surgeon at University College Hospital. A visit to Edinburgh in the fall of 1853 led to Lister’s appointment as assistant to James Syme, the greatest surgical teacher of his day, and in October 1856 he was appointed surgeon to the Edinburgh Royal Infirmary. In April of that year he had married Syme’s eldest daughter. Lister, a deeply religious man, joined the Scottish Episcopal Church. The marriage, although childless, was a happy one, his wife entering fully into Lister’s professional life.
When three years later the Regius Professorship of Surgery at Glasgow University fell vacant, Lister was elected from seven applicants. In August 1861 he was appointed surgeon to the Glasgow Royal Infirmary, where he was in charge of wards in the new surgical block. The managers hoped that hospital disease (now known as operative sepsis—infection of the blood by disease-producing microorganisms) would be greatly decreased in their new building. The hope proved vain, however. Lister reported that, in his Male Accident Ward, between 45 and 50 percent of his amputation cases died from sepsis between 1861 and 1865.
Work in antisepsis
In this ward Lister began his experiments with antisepsis. Much of his earlier published work had dealt with the mechanism of coagulation of the blood and role of the blood vessels in the first stages of inflammation. Both researches depended upon the microscope and were directly connected with the healing of wounds. Lister had already tried out methods to encourage clean healing and had formed theories to account for the prevalence of sepsis. Discarding the popular concept of miasma—direct infection by bad air—he postulated that sepsis might be caused by a pollen-like dust. There is no evidence that he believed this dust to be living matter, but he had come close to the truth. It is therefore all the more surprising that he became acquainted with the work of the bacteriologist Louis Pasteur only in 1865.
Pasteur had arrived at his theory that microorganisms cause fermentation and disease by experiments on fermentation and putrefaction. Lister’s education and his familiarity with the microscope, the process of fermentation, and the natural phenomena of inflammation and coagulation of the blood impelled him to accept Pasteur’s theory as the full revelation of a half-suspected truth. At the start he believed the germs were carried solely by the air. This incorrect opinion proved useful, for it obliged him to adopt the only feasible method of surgically clean treatment. In his attempt to interpose an antiseptic barrier between the wound and the air, he protected the site of operation from infection by the surgeon’s hands and instruments. He found an effective antiseptic in carbolic acid, which had already been used as a means of cleansing foul-smelling sewers and had been empirically advised as a wound dressing in 1863. Lister first successfully used his new method on Aug. 12, 1865; in March 1867 he published a series of cases. The results were dramatic. Between 1865 and 1869, surgical mortality fell from 45 to 15 percent in his Male Accident Ward.
In 1869, Lister succeeded Syme in the chair of Clinical Surgery at Edinburgh. There followed the seven happiest years of his life when, largely as the result of German experiments with antisepsis during the Franco-German War, his clinics were crowded with visitors and eager students. In 1875 Lister made a triumphal tour of the leading surgical centres in Germany. The next year he visited America but was received with little enthusiasm except in Boston and New York City.
Lister’s work had been largely misunderstood in England and the United States. Opposition was directed against his germ theory rather than against his “carbolic treatment.” The majority of practicing surgeons were unconvinced; while not antagonistic, they awaited clear proof that antisepsis constituted a major advance. Lister was not a spectacular operative surgeon and refused to publish statistics. Edinburgh, despite the ancient fame of its medical school, was regarded as a provincial centre. Lister understood that he must convince London before the usefulness of his work would be generally accepted.
His chance came in 1877, when he was offered the chair of Clinical Surgery at King’s College. On Oct. 26, 1877, Lister, at King’s College Hospital, for the first time performed the then-revolutionary operation of wiring a fractured patella, or kneecap. It entailed the deliberate conversion of a simple fracture, carrying no risk to life, into a compound fracture, which often resulted in generalized infection and death. Lister’s proposal was widely publicized and aroused much opposition. The complete success of the operation, carried out under antiseptic conditions, therefore forced surgical opinion throughout the world to accept that his method had added greatly to the safety of operative surgery.
More fortunate than many pioneers, Lister saw the almost universal acceptance of his principle during his working life. He retired from surgical practice in 1893, after the death of his wife in the previous year. Many honours came to him. Created a baronet in 1883, he was made Baron Lister of Lyme Regis in 1897 and appointed one of the 12 original members of the Order of Merit in 1902. He was a gentle, shy, unassuming man, firm in his purpose because he humbly believed himself to be directed by God. He was uninterested in social success or financial reward. In person he was handsome, with a fine athletic figure, fresh complexion, hazel eyes, and silver hair. For some years before his death, however, he was almost completely blind and deaf. Lister wrote no books but contributed many papers to professional journals. These are contained in The Collected Papers of Joseph, Baron Lister, 2 vol. (1909).
89. William Sturgeon, (born May 22, 1783, Whittington, Lancashire, Eng.—died Dec. 4, 1850, Prestwich, Lancashire), English electrical engineer who devised the first electromagnet capable of supporting more than its own weight. This device led to the invention of the telegraph, the electric motor, and numerous other devices basic to modern technology.
Sturgeon, self-educated in electrical phenomena and natural science, spent much time lecturing and conducting electrical experiments. In 1824 he became lecturer in science at the Royal Military College, Addiscombe, Surrey, and the following year he exhibited his first electromagnet. The 7-ounce (200-gram) magnet was able to support 9 pounds (4 kilograms) of iron using the current from a single cell.
Sturgeon built an electric motor in 1832 and invented the commutator, an integral part of most modern electric motors. In 1836, the year he founded the monthly journal Annals of Electricity, he invented the first suspended coil galvanometer, a device for measuring current. He also improved the voltaic battery and worked on the theory of thermoelectricity. From more than 500 kite observations he established that in serene weather the atmosphere is invariably charged positively with respect to the Earth, becoming more positive with increasing altitude.
90. Daniel Gabriel Fahrenheit, (born May 24, 1686, Danzig [now Gdańsk, Poland]—died September 16, 1736, The Hague, Netherlands), physicist and maker of scientific instruments who devised the thermometric scale that bears his name.
Fahrenheit was the scion of a wealthy merchant family that had come to Danzig from Königsberg in the middle of the seventeenth century. His father, Daniel, married Concordia Schumann, the daughter of a Danzig wholesaler. From this union there were five children, three girls and two boys, of whom Daniel Gabriel was the eldest.
In 1701 Fahrenheit’s parents died suddenly, and his guardian sent him to Amsterdam to learn business. It was there, apparently, that Fahrenheit first became acquainted with, and then fascinated by, the rather specialized and small but rapidly growing business of making scientific instruments. About 1707 he began his years of wandering, during which he acquired the techniques of his trade by observing the practices of other scientists and instrument makers. He traveled throughout Germany, visiting his native city of Danzig as well as Berlin, Halle, Leipzig, and Dresden. He met Olaus Roemer in Copenhagen in 1708, and in 1715 he entered into correspondence with Leibniz about a clock for determining longitude at sea. In 1714 Christian von Wolff published a description of one of Fahrenheit’s early thermometers in the Acta eruditorum. Fahrenheit returned to Amsterdam in 1717 and established himself as a maker of scientific instruments. There he became acquainted with three of the greatest Dutch scientists of his era: W. J. ’sGravesande, Hermann Boerhaave, and Pieter van Musschenbroek. In 1724 he was admitted to the Royal Society, and in the same year he published in the Philosophical Transactions his only scientific writings, five brief articles in Latin. Just before his death in 1736, Fahrenheit took out a patent on a pumping device that he hoped would be useful in draining Dutch polders.
Fahrenheit’s most significant achievement was his development of the standard thermometric scale that bears his name. Nearly a century had passed since the construction of the first primitive thermometers, and although many of the basic problems of thermometry had been solved, no standard thermometric scale had been developed that would allow scientists in different locations to compare temperatures. About 1701 Olaus Roemer had constructed a spirit thermometer based upon two universal fiducial points. The upper fixed point, determined by the temperature of boiling water, was labeled 60°; the lower fixed point, determined by the temperature of melting ice, was set at 7½°. This latter, seemingly arbitrary, number was chosen to allow exactly 1/8 of the entire scale to stand below the freezing point. Since 0° on the Roemer scale approximated the temperature of an ice and salt mixture (which was widely considered to be the coldest possible temperature), all readings on Roemer’s thermometer were assumed to be positive.
Roemer did not publish anything about his thermometer, and its existence was unknown to most of his contemporaries except Fahrenheit, who thought mistakenly that his own thermometric scale was patterned after Roemer’s. In 1708, while visiting Roemer, Fahrenheit watched the Danish astronomer as he graduated several thermometers. These particular instruments were being graduated to a scale of 22½°, or 3/8 of Roemer’s standard scale of 60°. Since most of the scale would then be in the temperate range, it is probable that Roemer was designing them for meteorological purposes. In a letter addressed to Boerhaave, Fahrenheit later described Roemer’s procedure.
The problem with Fahrenheit’s account is that he took Roemer’s “blood-warm” (22½°) to be a primary fiducial point, fixed quite literally at the temperature of the human blood. In fact, 22½° on the Roemer scale is considerably below body temperature (by about 15° on the modern Fahrenheit thermometer). Furthermore, Roemer used boiling water (set at 60°), not blood temperature, as his upper fixed point. The simplest explanation for Fahrenheit’s misunderstanding of the Roemer scale seems to lie in the ambiguity of the term “blood-warm.” It can mean either a tepid heat or the exact temperature of the human blood. Roemer probably intended to convey the former meaning, and Fahrenheit obviously understood the latter one.
When Fahrenheit began producing thermometers of his own, he graduated them after what he believed were Roemer’s methods. The upper fixed point (labeled 22½°) was determined by placing the bulb of the thermometer in the mouth or armpit of a healthy male. The lower fixed point (labeled 7½°) was determined by an ice and water mixture. In addition, Fahrenheit divided each degree into four parts, so that the upper point became 90° and the lower one 30°. Later (in 1717) he moved the upper point to 96° and the lower one to 32° in order to eliminate “inconvenient and awkward fractions.”
In an article on the boiling points of various liquids, Fahrenheit reported that the boiling temperature of water was 212° on his thermometric scale. This figure was actually several degrees higher than it should have been. After Fahrenheit’s death it became standard practice to graduate Fahrenheit thermometers with the boiling point of water (set at 212°) as the upper fixed point. As a result, normal body temperature became 98.6° instead of Fahrenheit’s 96°. This variant of the Fahrenheit scale became standard throughout Holland and Britain. Today it is used for meteorological purposes in most English-speaking countries.
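Each of these recalibrations is a simple linear rescaling. The sketch below reconstructs them for illustration only: the particular linear maps, and the assumption that boiling water stood near 205° on Fahrenheit's own scale (inferred from the 96° → 98.6° shift in body temperature), are our reading of the figures above, not historical formulas.

```python
def roemer_to_early_fahrenheit(r):
    """Fahrenheit's first step: quadruple Roemer-style degrees,
    so ice/water 7.5 -> 30 and "blood-warm" 22.5 -> 90."""
    return 4 * r

def early_to_1717(f):
    """The 1717 revision, read as a linear map sending the fixed
    points 30 -> 32 and 90 -> 96 (a reconstruction for illustration)."""
    return 32 + (f - 30) * (96 - 32) / (90 - 30)

def to_posthumous_scale(f, old_boiling=205.0):
    """Posthumous recalibration: ice stays at 32 but boiling water is
    pinned at exactly 212 rather than Fahrenheit's slightly high figure.
    The 205 value for boiling on his own scale is an inferred assumption."""
    return 32 + (f - 32) * (212 - 32) / (old_boiling - 32)

assert roemer_to_early_fahrenheit(7.5) == 30
assert roemer_to_early_fahrenheit(22.5) == 90
assert early_to_1717(30) == 32 and early_to_1717(90) == 96
print(round(to_posthumous_scale(96), 1))  # 98.6
```

Under these assumptions, Fahrenheit's 96° body temperature lands at the familiar 98.6° once boiling water is fixed at 212°.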
Fahrenheit knew that the boiling temperature of water varied with the atmospheric pressure, and on this principle he constructed a hypsometric thermometer that enabled one to determine the atmospheric pressure directly from a reading of the boiling point of water. He also invented a hydrometer that became a model for subsequent developments.
In the early eighteenth century, it was not at all unusual for a person without formal scientific training to be admitted to the Royal Society. Makers of scientific instruments could be particularly valuable members because they often operated on the farthest frontiers of scientific knowledge, defining universal constants on which to scale their instruments and isolating the variables that affected their operation. In order to make reliable instruments that would be useful to the scientific community as a whole, Fahrenheit was obliged to concern himself with a wide variety of scientific problems: measuring the expansion of glass, assessing the thermometric behavior of mercury and alcohol, describing the effects of atmospheric pressure on the boiling points of liquids, and establishing the densities of various substances. His direct contributions, it is true, were small, but in raising appreciably the level of precision that was obtainable in many scientific observations, Fahrenheit affected profoundly the course of experimental physics in the eighteenth century.
91. Christopher Columbus (between 31 October 1450 and 30 October 1451 – 20 May 1506) was an Italian explorer, navigator, colonizer and citizen of the Republic of Genoa. Under the auspices of the Catholic Monarchs of Spain, he completed four voyages across the Atlantic Ocean. Those voyages, and his efforts to establish permanent settlements on the island of Hispaniola, initiated the Spanish colonization of the New World.
In the context of emerging Western imperialism and economic competition between European kingdoms through the establishment of trade routes and colonies, Columbus's proposal to reach the East Indies by sailing westward eventually received the support of the Spanish Crown, which saw in it a chance to enter the spice trade with Asia through a new westward route. During his first voyage in 1492, instead of arriving at Japan as he had intended, Columbus reached the New World, landing on an island in the Bahamas archipelago that he named "San Salvador". Over the course of three more voyages, Columbus visited the Greater and Lesser Antilles, as well as the Caribbean coast of Venezuela and Central America, claiming all of it for the Crown of Castile.
Though Columbus was not the first European explorer to reach the Americas (having been preceded by the Viking expedition led by Leif Ericson in the 11th century), his voyages led to the first lasting European contact with the Americas, inaugurating a period of European exploration, conquest, and colonization that lasted for several centuries. These voyages had, therefore, an enormous impact on the historical development of the modern Western world. Columbus spearheaded the transatlantic slave trade and has been accused by several historians of initiating the genocide of the Hispaniola natives. Columbus himself saw his accomplishments primarily in the light of spreading the Christian religion.
Never admitting that he had reached a continent previously unknown to Europeans rather than the East Indies he had set out for, Columbus called the inhabitants of the lands he visited indios (Spanish for "Indians"). Columbus's strained relationship with the Spanish crown and its appointed colonial administrators in America led to his arrest and dismissal as governor of the settlements on the island of Hispaniola in 1500 and later to protracted litigation over the benefits which Columbus and his heirs claimed were owed to them by the crown.
92. Fermat's last theorem earns Andrew Wiles the Abel Prize
Mathematician receives coveted award for solving three-century-old problem in number theory.
Andrew Wiles (in 1998) poses next to Fermat's last theorem — the proof of which has won him the Abel prize.
British number theorist Andrew Wiles has received the 2016 Abel Prize for his solution to Fermat’s last theorem — a problem that stumped some of the world’s greatest minds for three and a half centuries. The Norwegian Academy of Science and Letters announced the award — considered by some to be the 'Nobel of mathematics' — on 15 March.
Wiles, who is 62 and now at the University of Oxford, UK, will receive 6 million kroner (US$700,000) for his 1994 proof of the theorem, which states that there cannot be any positive whole numbers x, y and z such that x^n + y^n = z^n, if n is greater than 2.
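The equation itself is easy to probe by brute force for small values. No finite search can substitute for a proof, but a sketch like the following illustrates the statement: for n = 2 solutions (Pythagorean triples) abound, while for n > 2 Wiles's theorem guarantees the search always comes back empty.

```python
def fermat_solutions(exponents, limit):
    """Search for x^n + y^n = z^n with 1 <= x <= y and z <= limit."""
    hits = []
    for n in exponents:
        # Precompute n-th powers so each candidate sum is a dict lookup.
        powers = {z ** n: z for z in range(1, limit + 1)}
        for x in range(1, limit + 1):
            for y in range(x, limit + 1):
                z = powers.get(x ** n + y ** n)
                if z is not None:
                    hits.append((x, y, z, n))
    return hits

# For n = 2, Pythagorean triples turn up immediately...
print(fermat_solutions([2], 15)[:3])  # [(3, 4, 5, 2), (5, 12, 13, 2), (6, 8, 10, 2)]
# ...but for n > 2 no range ever yields a solution.
print(fermat_solutions(range(3, 7), 60))  # []
```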
Soon after receiving the news on the morning of 15 March, Wiles told Nature that the award came to him as a “total surprise”.
That he solved a problem considered too hard by so many — and yet a problem relatively simple to state — has made Wiles arguably “the most celebrated mathematician of the twentieth century”, says Martin Bridson, director of Oxford's Mathematical Institute — which is housed in a building named after Wiles. Although his achievement is now two decades old, he continues to inspire young minds, something that is apparent when school children show up at his public lectures. “They treat him like a rock star,” Bridson says. “They line up to have their photos taken with him.”
Lifelong quest
Wiles's story has become a classic tale of tenacity and resilience. While a faculty member at Princeton University in New Jersey in the 1980s, he embarked on a solitary, seven-year quest to solve the problem, working in his attic without telling anyone except for his wife. He went on to make a historic announcement at a conference in his hometown of Cambridge, UK, in June 1993, only to hear from a colleague two months later that his proof contained a serious mistake. But after another frantic year of work — and with the help of one of his former students, Richard Taylor, who is now at the Institute for Advanced Study in Princeton — he was able to patch up the proof. When the resulting two papers were published in 1995, they made up an entire issue of the Annals of Mathematics.
But after Wiles's original claim had already made front-page news around the world, the pressure on the shy mathematician to save his work almost crippled him. “Doing mathematics in that kind of overexposed way is certainly not my style, and I have no wish to repeat it,” he said in a BBC documentary in 1996, still visibly shaken by the experience. “It’s almost unbelievable that he was able to get something done” at that point, says John Rognes, a mathematician at the University of Oslo and chair of the Abel Committee.
“It was very, very intense,” says Wiles. “Unfortunately as human beings we succeed by trial and error. It’s the people who overcome the setbacks who succeed.”
Wiles first learnt about French mathematician Pierre de Fermat as a child growing up in Cambridge. As he was told, Fermat formulated his eponymous theorem in a handwritten note in the margins of a book in 1637: “I have a truly marvellous demonstration of this proposition which this margin is too narrow to contain,” he wrote (in Latin).
“I think it has a very romantic story,” Wiles says of Fermat's idea. “The kind of story that catches people’s imagination when they’re young and thinking of entering mathematics.”
But although he may have thought he had a proof at the time, only a proof for one special case has survived him, for exponent n = 4. A century later, Leonhard Euler proved it for n = 3, and Sophie Germain's work led to a proof for infinitely many exponents, but still not for all. Experts now tend to concur that the most general form of the statement would have been impossible to crack without mathematical tools that became available only in the twentieth century.
In 1983, German mathematician Gerd Faltings, now at the Max Planck Institute for Mathematics in Bonn, took a huge leap forward by proving that Fermat's statement had, at most, a finite number of solutions, although he could not show that the number should be zero. (In fact, he proved a result viewed by specialists as deeper and more interesting than Fermat's last theorem itself; it demonstrated that a broader class of equations has, at most, a finite number of solutions.)
To narrow it to zero, Wiles took a different approach: he proved the Shimura-Taniyama conjecture, a 1950s proposal that describes how two very different branches of mathematics, called elliptic curves and modular forms, are conceptually equivalent. Others had shown that proof of this equivalence would imply proof of Fermat — and, like Faltings' result, most mathematicians regard this as much more profound than Fermat’s last theorem itself. (The full citation for the Abel Prize states that it was awarded to Wiles “for his stunning proof of Fermat’s Last Theorem by way of the modularity conjecture for semistable elliptic curves, opening a new era in number theory.”)
The link between the Shimura–Taniyama conjecture and Fermat's last theorem was first proposed in 1984 by number theorist Gerhard Frey, now at the University of Duisburg-Essen in Germany. He claimed that any counterexample to Fermat's last theorem would also lead to a counterexample to the Shimura–Taniyama conjecture.
Kenneth Ribet, a mathematician at the University of California, Berkeley, soon proved that Frey was right, and therefore that anyone who proved the more recent conjecture would also bag Fermat's. Still, that did not seem to make the task any easier. “Andrew Wiles is probably one of the few people on Earth who had the audacity to dream that he can actually go and prove this conjecture,” Ribet told the BBC in the 1996 documentary.
Fermat's last theorem is also connected to another deep question in number theory called the abc conjecture, Rognes points out. Mathematician Shinichi Mochizuki of Kyoto University's Research Institute for Mathematical Sciences in Japan claimed to have proved that conjecture in 2012, although his roughly 500-page proof is still being vetted by his peers. Some mathematicians say that Mochizuki's work could provide, as an extra perk, an alternative way of proving Fermat, although Wiles says that he views those hopes with scepticism.
Wiles helped to arrange an Oxford workshop on Mochizuki's work last December, although his research interests are somewhat different. Lately, he has focused his efforts on another major, unsolved conjecture in number theory, which has been listed as one of seven Millennium Prize problems posed by the Clay Mathematics Institute in Oxford, UK. He still works very hard and thinks about mathematics for most of his waking hours, including as he walks to the office in the morning. “He doesn’t want to cycle,” Bridson says. “He thinks it would be a bit dangerous for him to do it while thinking about mathematics.”
It appears to me that if one wants to make progress in mathematics, one should study the masters and not the pupils. - Niels Henrik Abel.
Nothing is better than reading and gaining more and more knowledge - Stephen William Hawking.
93. Me!
"Time not important. Only life important." - The Fifth Element 1997
93. Niels Henrik Abel
Niels Henrik Abel, (born August 5, 1802, island of Finnøy, near Stavanger, Norway—died April 6, 1829, Froland), Norwegian mathematician, a pioneer in the development of several branches of modern mathematics.
Abel’s father was a poor Lutheran minister who moved his family to the parish of Gjerstad, near the town of Risør in southeast Norway, soon after Niels Henrik was born. In 1815 Niels entered the cathedral school in Oslo, where his mathematical talent was recognized in 1817 with the arrival of a new mathematics teacher, Bernt Michael Holmboe, who introduced him to the classics in mathematical literature and proposed original problems for him to solve. Abel studied the mathematical works of the 17th-century Englishman Sir Isaac Newton, the 18th-century Swiss Leonhard Euler, and his contemporaries the Frenchman Joseph-Louis Lagrange and the German Carl Friedrich Gauss in preparation for his own research.
Abel’s father died in 1820, leaving the family in straitened circumstances, but Holmboe contributed and raised funds that enabled Abel to enter the University of Christiania (Oslo) in 1821. Abel obtained a preliminary degree from the university in 1822 and continued his studies independently with further subsidies obtained by Holmboe.
Abel’s first papers, published in 1823, were on functional equations and integrals; he was the first person to formulate and solve an integral equation. His friends urged the Norwegian government to grant him a fellowship for study in Germany and France. In 1824, while waiting for a royal decree to be issued, he published at his own expense his proof of the impossibility of solving algebraically the general equation of the fifth degree, which he hoped would bring him recognition. He sent the pamphlet to Gauss, who dismissed it, failing to recognize that the famous problem had indeed been settled.
Abel spent the winter of 1825–26 with Norwegian friends in Berlin, where he met August Leopold Crelle, civil engineer and self-taught enthusiast of mathematics, who became his close friend and mentor. With Abel’s warm encouragement, Crelle founded the Journal für die reine und angewandte Mathematik (“Journal for Pure and Applied Mathematics”), commonly known as Crelle’s Journal. The first volume (1826) contains papers by Abel, including a more elaborate version of his work on the quintic equation. Other papers dealt with equation theory, calculus, and theoretical mechanics. Later volumes presented Abel’s theory of elliptic functions, which are complex functions (see complex number) that generalize the usual trigonometric functions.
In 1826 Abel went to Paris, then the world centre for mathematics, where he called on the foremost mathematicians and completed a major paper on the theory of integrals of algebraic functions. His central result, known as Abel’s theorem, is the basis for the later theory of Abelian integrals and Abelian functions, a generalization of elliptic function theory to functions of several variables. However, Abel’s visit to Paris was unsuccessful in securing him an appointment, and the memoir he submitted to the French Academy of Sciences was lost.
Abel returned to Norway heavily in debt and suffering from tuberculosis. He subsisted by tutoring, supplemented by a small grant from the University of Christiania and, beginning in 1828, by a temporary teaching position. His poverty and ill health did not decrease his production; he wrote a great number of papers during this period, principally on equation theory and elliptic functions. Among them is the theory of polynomial equations with Abelian groups. He rapidly developed the theory of elliptic functions in competition with the German Carl Gustav Jacobi. By this time Abel’s fame had spread to all mathematical centres, and strong efforts were made to secure a suitable position for him by a group from the French Academy, who addressed King Bernadotte of Norway-Sweden; Crelle also worked to secure a professorship for him in Berlin.
In the fall of 1828 Abel became seriously ill, and his condition deteriorated on a sled trip at Christmastime to visit his fiancée at Froland, where he died. The French Academy published his memoir in 1841.
94. Leonhard Euler, (born April 15, 1707, Basel, Switzerland—died September 18, 1783, St. Petersburg, Russia), Swiss mathematician and physicist, one of the founders of pure mathematics. He not only made decisive and formative contributions to the subjects of geometry, calculus, mechanics, and number theory but also developed methods for solving problems in observational astronomy and demonstrated useful applications of mathematics in technology and public affairs.
Euler’s mathematical ability earned him the esteem of Johann Bernoulli, then one of the foremost mathematicians in Europe, and of his sons Daniel and Nicolas. In 1727 he moved to St. Petersburg, where he became an associate of the St. Petersburg Academy of Sciences and in 1733 succeeded Daniel Bernoulli to the chair of mathematics. By means of his numerous books and memoirs that he submitted to the academy, Euler carried integral calculus to a higher degree of perfection, developed the theory of trigonometric and logarithmic functions, reduced analytical operations to a greater simplicity, and threw new light on nearly all parts of pure mathematics. Overtaxing himself, Euler in 1735 lost the sight of one eye. Then, invited by Frederick the Great in 1741, he became a member of the Berlin Academy, where for 25 years he produced a steady stream of publications, many of which he contributed to the St. Petersburg Academy, which granted him a pension.
In 1748, in his Introductio in analysin infinitorum, he developed the concept of function in mathematical analysis, through which variables are related to each other and in which he advanced the use of infinitesimals and infinite quantities. He did for modern analytic geometry and trigonometry what the Elements of Euclid had done for ancient geometry, and the resulting tendency to render mathematics and physics in arithmetical terms has continued ever since. He is known for familiar results in elementary geometry—for example, the Euler line through the orthocentre (the intersection of the altitudes in a triangle), the circumcentre (the centre of the circumscribed circle of a triangle), and the barycentre (the “centre of gravity,” or centroid) of a triangle. He was responsible for treating trigonometric functions—i.e., the relationship of an angle to two sides of a triangle—as numerical ratios rather than as lengths of geometric lines and for relating them, through the so-called Euler identity e^(iθ) = cos θ + i sin θ, with complex numbers (e.g., 3 + 2√(−1)). He discovered the imaginary logarithms of negative numbers and showed that each complex number has an infinite number of logarithms.
Euler’s textbooks in calculus, Institutiones calculi differentialis in 1755 and Institutiones calculi integralis in 1768–70, have served as prototypes to the present because they contain formulas of differentiation and numerous methods of indefinite integration, many of which he invented himself, for determining the work done by a force and for solving geometric problems, and he made advances in the theory of linear differential equations, which are useful in solving problems in physics. Thus, he enriched mathematics with substantial new concepts and techniques. He introduced many current notations, such as Σ for the sum; the symbol e for the base of natural logarithms; a, b, and c for the sides of a triangle and A, B, and C for the opposite angles; the letter f and parentheses for a function; and i for √(−1). He also popularized the use of the symbol π (devised by British mathematician William Jones) for the ratio of circumference to diameter in a circle.
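The Euler identity mentioned above can be written out as follows (a standard modern statement, not Euler's original notation):

```latex
% Euler's identity relating the complex exponential to the
% trigonometric functions:
e^{i\theta} = \cos\theta + i\sin\theta
% Setting \theta = \pi yields the celebrated special case:
e^{i\pi} + 1 = 0
```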
After Frederick the Great became less cordial toward him, Euler in 1766 accepted the invitation of Catherine II to return to Russia. Soon after his arrival at St. Petersburg, a cataract formed in his remaining good eye, and he spent the last years of his life in total blindness. Despite this tragedy, his productivity continued undiminished, sustained by an uncommon memory and a remarkable facility in mental computations. His interests were broad, and his Lettres à une princesse d’Allemagne in 1768–72 were an admirably clear exposition of the basic principles of mechanics, optics, acoustics, and physical astronomy. Not a classroom teacher, Euler nevertheless had a more pervasive pedagogical influence than any modern mathematician. He had few disciples, but he helped to establish mathematical education in Russia.
Euler devoted considerable attention to developing a more perfect theory of lunar motion, which was particularly troublesome, since it involved the so-called three-body problem—the interactions of Sun, Moon, and Earth. (The problem is still unsolved.) His partial solution, published in 1753, assisted the British Admiralty in calculating lunar tables, of importance then in attempting to determine longitude at sea. One of the feats of his blind years was to perform all the elaborate calculations in his head for his second theory of lunar motion in 1772. Throughout his life Euler was much absorbed by problems dealing with the theory of numbers, which treats of the properties and relationships of integers, or whole numbers (0, ±1, ±2, etc.); in this, his greatest discovery, in 1783, was the law of quadratic reciprocity, which has become an essential part of modern number theory.
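The law of quadratic reciprocity mentioned above is usually stated today with the Legendre symbol, a notation that postdates Euler (this is the modern formulation, not Euler's own):

```latex
% For distinct odd primes p and q, where (a/p) denotes the Legendre
% symbol (+1 if a is a quadratic residue mod p, -1 otherwise):
\left(\frac{p}{q}\right)\left(\frac{q}{p}\right)
  = (-1)^{\frac{p-1}{2}\cdot\frac{q-1}{2}}
% In words: p is a square mod q exactly when q is a square mod p,
% unless p and q are both congruent to 3 mod 4, in which case
% exactly one of the two is a square mod the other.
```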
In his effort to replace synthetic methods by analytic ones, Euler was succeeded by J.-L. Lagrange. But, where Euler had delighted in special concrete cases, Lagrange sought for abstract generality, and, while Euler incautiously manipulated divergent series, Lagrange attempted to establish infinite processes upon a sound basis. Thus it is that Euler and Lagrange together are regarded as the greatest mathematicians of the 18th century, but Euler has never been excelled either in productivity or in the skillful and imaginative use of algorithmic devices (i.e., computational procedures) for solving problems.
Last edited by Jai Ganesh (2016-03-19 16:44:21)
95. William Shakespeare
The English playwright, poet, and actor William Shakespeare (1564-1616) is generally acknowledged to be the greatest of English writers and one of the most extraordinary creators in human history.
The most crucial fact about William Shakespeare's career is that he was a popular dramatist. Born 6 years after Queen Elizabeth I had ascended the throne, contemporary with the high period of the English Renaissance, Shakespeare had the good luck to find in the theater of London a medium just coming into its own and an audience, drawn from a wide range of social classes, eager to reward talents of the sort he possessed. His entire life was committed to the public theater, and he seems to have written nondramatic poetry only when enforced closings of the theater made writing plays impractical. It is equally remarkable that his days in the theater were almost exactly contemporary with the theater's other outstanding achievements—the work, for example, of Christopher Marlowe, Ben Jonson, and John Webster.
Shakespeare was born on or just before April 23, 1564, in the small but then important Warwickshire town of Stratford. His mother, born Mary Arden, was the daughter of a landowner from a neighboring village. His father, John, son of a farmer, was a glove maker and trader in farm produce; he had achieved a position of some eminence in the prosperous market town by the time of his son's birth, holding a number of responsible positions in Stratford's government and serving as mayor in 1569. By 1576, however, John Shakespeare had begun to encounter the financial difficulties which were to plague him until his death in 1601.
Though no personal documents survive from Shakespeare's school years, his literary work shows the mark of the excellent if grueling education offered at the Stratford grammar school (some reminiscences of Stratford school days may have lent amusing touches to scenes in The Merry Wives of Windsor). Like other Elizabethan schoolboys, Shakespeare studied Latin grammar during the early years, then progressed to the study of logic, rhetoric, composition, oration, versification, and the monuments of Roman literature. The work was conducted in Latin and relied heavily on rote memorization and the master's rod. A plausible tradition holds that William had to discontinue his education when about 13 in order to help his father. At 18 he married Ann Hathaway, a Stratford girl, who was to survive him by 7 years. They had three children (Susanna, 1583-1649; Hamnet, 1585-1596; and his twin, Judith, 1585-1662). Shakespeare remained actively involved in Stratford affairs throughout his life, even when living in London, and retired there at the end of his career.
The years between 1585 and 1592, having left no evidence as to Shakespeare's activities, have been the focus of considerable speculation; among other things, conjecture would have him a traveling actor or a country schoolmaster. The earliest surviving notice of his career in London is a jealous attack on the "upstart crow" by Robert Greene, a playwright, professional man of letters, and profligate whose career was at an end in 1592 though he was only 6 years older than Shakespeare. Greene's outcry testifies, both in its passion and in the work it implies Shakespeare had been doing for some time, that the young poet had already established himself in the capital. So does the quality of Shakespeare's first plays: it is hard to believe that even Shakespeare could have shown such mastery without several years of apprenticeship.
Early Career
Shakespeare's first extant play is probably The Comedy of Errors (1590; like most dates for the plays, this is conjectural and may be a year or two off), a brilliant and intricate farce involving two sets of identical twins and based on two already-complicated comedies by the Roman Plautus. Though less fully achieved, his next comedy, The Two Gentlemen of Verona (1591), is more prophetic of Shakespeare's later comedy, for its plot depends on such devices as a faithful girl who educates her fickle lover, romantic woods, a girl dressed as a boy, sudden reformations, music, and happy marriages at the end. The last of the first comedies, Love's Labour's Lost (1593), is romantic again, dealing with the attempt of three young men to withdraw from the world and women for 3 years to study in their king's "little Academe," and their quick surrender to a group of young ladies who come to lodge nearby. If the first of the comedies is most notable for its plotting and the second for its romantic elements, the third is distinguished by its dazzling language and its gallery of comic types. Already Shakespeare had learned to fuse conventional characters with convincing representations of the human life he knew.
Though little read and performed now, Shakespeare's first plays in the popular "chronicle," or history, genre are equally ambitious and impressive. Dealing with the tumultuous events of English history between the death of Henry V in 1422 and the accession of Henry VII in 1485 (which began the period of Tudor stability maintained by Shakespeare's own queen), the three "parts" of Henry VI (1592) and Richard III (1594) are no tentative experiments in the form: rather they constitute a gigantic tetralogy, in which each part is a superb play individually and an integral part of an epic sequence. Nothing so ambitious had ever been attempted in England in a form hitherto marked by slapdash formlessness.
Shakespeare's first tragedy, Titus Andronicus (1593), reveals similar ambition. Though its chamber of horrors— including mutilations and ingenious murders—strikes the modern reader as belonging to a theatrical tradition no longer viable, the play is in fact a brilliant and successful attempt to outdo the efforts of Shakespeare's predecessors in the lurid tradition of the revenge play.
When the theaters were closed because of plague during much of 1593-1594, Shakespeare looked to nondramatic poetry for his support and wrote two narrative masterpieces, the seriocomic Venus and Adonis and the tragic Rape of Lucrece, for a wealthy patron, the Earl of Southampton. Both poems carry the sophisticated techniques of Elizabethan narrative verse to their highest point, drawing on the resources of Renaissance mythological and symbolic traditions.
Shakespeare's most famous poems, probably composed in this period but not published until 1609, and then not by the author, are the 154 sonnets, the supreme English examples of the form. Writing at the end of a brief, frenzied vogue for sequences of sonnets, Shakespeare found in the conventional 14-line lyric with its fixed rhyme scheme a vehicle for inexhaustible technical innovations—for Shakespeare even more than for other poets, the restrictive nature of the sonnet generates a paradoxical freedom of invention that is the life of the form—and for the expression of emotions and ideas ranging from the frivolous to the tragic. Though often suggestive of autobiographical revelation, the sonnets cannot be proved to be any the less fictions than the plays. The identity of their dedicatee, "Mr. W. H.," remains a mystery, as does the question of whether there were real-life counterparts to the famous "dark lady" and the unfaithful friend who are the subject of a number of the poems. But the chief value of these poems is intrinsic: the sonnets alone would have established Shakespeare's preeminence among English poets.
Lord Chamberlain's Men
By 1594 Shakespeare was fully engaged in his career. In that year he became principal writer for the successful Lord Chamberlain's Men—one of the two leading companies of actors; a regular actor in the company; and a "sharer," or partner, in the group of artist-managers who ran the entire operation and were in 1599 to have the Globe Theater built on the south bank of the Thames. The company performed regularly in unroofed but elaborate theaters. Required by law to be set outside the city limits, these theaters were the pride of London, among the first places shown to visiting foreigners, and seated up to 3,000 people. The actors played on a huge platform stage equipped with additional playing levels and surrounded on three sides by the audience; the absence of scenery made possible a flow of scenes comparable to that of the movies, and music, costumes, and ingenious stage machinery created successful illusions under the afternoon sun.
For this company Shakespeare produced a steady outpouring of plays. The comedies include The Taming of the Shrew (1594), fascinating in light of the first comedies since it combines with an Italian-style plot, in which all the action occurs in one day, a more characteristically English and Shakespearean plot, the taming of Kate, in which much more time passes; A Midsummer Night's Dream (1595), in which "rude mechanicals," artisans without imagination, become entangled with fairies and magic potions in the moonlit woods to which young lovers have fled from a tyrannical adult society; The Merchant of Venice (1596), which contributed Shylock and Portia to the English literary tradition; Much Ado about Nothing (1598), with a melodramatic main plot whose heroine is maligned and almost driven to death by a conniving villain and a comic subplot whose Beatrice and Benedick remain the archetypical sparring lovers; The Merry Wives of Windsor (1599), held by tradition to have been written in response to the Queen's request that Shakespeare write another play about Falstaff (who had appeared in Henry IV), this time in love; and in 1600 the pastoral As You Like It, a mature return to the woods and conventions of The Two Gentlemen of Verona and A Midsummer Night's Dream, and Twelfth Night, perhaps the most perfect of the comedies, a romance of identical twins separated at sea, young love, and the antics of Malvolio and Sir Toby Belch.
Shakespeare's only tragedies of the period are among his most familiar plays: Romeo and Juliet (1596), Julius Caesar (1599), and Hamlet (1601). Different from one another as they are, these three plays share some notable features: the setting of intense personal tragedy in a large world vividly populated by what seems like the whole range of humanity; a refusal, shared by most of Shakespeare's contemporaries in the theater, to separate comic situations and techniques from tragic; the constant presence of politics; and—a personal rather than a conventional phenomenon—a tragic structure in which what is best in the protagonist is what does him in when he finds himself in conflict with the world.
Continuing his interest in the chronicle, Shakespeare wrote King John (1596), despite its one strong character a relatively weak play; and the second and greater tetralogy, ranging from Richard II (1595), in which the forceful Bolingbroke, with an ambiguous justice on his side, deposes the weak but poetic king, through the two parts of Henry IV (1597), in which the wonderfully amoral, fat knight Falstaff accompanies Prince Hal, Bolingbroke's son, to Henry V (1599), in which Hal, become king, leads a newly unified England, its civil wars temporarily at an end but sadly deprived of Falstaff and the dissident lowlife who provided so much joy in the earlier plays, to triumph over France. More impressively than the first tetralogy, the second turns history into art. Spanning the poles of comedy and tragedy, alive with a magnificent variety of unforgettable characters, linked to one another as one great play while each is a complete and independent success in its own right—the four plays pose disturbing and unanswerable questions about politics, making one ponder the frequent difference between the man capable of ruling and the man worthy of doing so, the meaning of legitimacy in office, the value of order and stability as against the value of revolutionary change, and the relation of private to public life. The plays are exuberant works of art, but they are not optimistic about man as a political animal, and their unblinkered recognition of the dynamics of history has made them increasingly popular and relevant in our own tormented era.
Three plays of the end of Elizabeth's reign are often grouped as Shakespeare's "problem plays," though no definition of that term is able successfully to differentiate them as an exclusive group. All's Well That Ends Well (1602) is a romantic comedy with qualities that seem bitter to many critics; like other plays of the period, by Shakespeare and by his contemporaries, it presents sexual relations between men and women in a harsh light. Troilus and Cressida (1602), hardest of the plays to classify generically, is a brilliant, sardonic, and disillusioned piece on the Trojan War, unusually philosophical in its language and reminiscent in some ways of Hamlet. The tragicomic Measure for Measure (1604) focuses more on sexual problems than any other play in the canon; Angelo, the puritanical and repressed man of ice who succumbs to violent sexual urges the moment he is put in temporary authority over Vienna during the duke's absence, and Isabella, the victim of his lust, are two of the most interesting characters in Shakespeare, and the bawdy city in which the action occurs suggests a London on which a new mood of modern urban hopelessness is settling.
King's Men
Promptly upon his accession in 1603, King James I, more ardently attracted to theatrical art than his predecessor, bestowed his patronage upon the Lord Chamberlain's Men, so that the flag of the King's Men now flew over the Globe. During his last decade in the theater Shakespeare was to write fewer but perhaps even finer plays. Almost all the greatest tragedies belong to this period. Though they share the qualities of the earlier tragedies, taken as a group they manifest new tendencies. The heroes are dominated by passions that make their moral status increasingly ambiguous, their freedom increasingly circumscribed; similarly the society, even the cosmos, against which they strive suggests less than ever that all can ever be right in the world. As before, what destroys the hero is what is best about him, yet the best in Macbeth or Othello cannot so simply be commended as Romeo's impetuous ardor or Brutus's political idealism (fatuous though it is). The late tragedies are each in its own way dramas of alienation, and their focus, like that of the histories, continues to be felt as intensely relevant to the concerns of modern men.
Othello (1604) is concerned, like other plays of the period, with sexual impurity, with the difference that that impurity is the fantasy of the protagonist about his faithful wife. Iago, the villain who drives Othello to doubt and murder, is the culmination of two distinct traditions, the "Machiavellian" conniver who uses deceit in order to subvert the order of the polity, and the Vice, a schizophrenically tragicomic devil figure from the morality plays going out of fashion as Shakespeare grew up. King Lear (1605), to many Shakespeare's masterpiece, is an agonizing tragic version of a comic play (itself based on mythical early English history), in which an aged king who foolishly deprives his only loving daughter of her heritage in order to leave all to her hypocritical and vicious sisters is hounded to death by a malevolent alliance which at times seems to include nature itself. Transformed from its fairy-tale-like origins, the play involves its characters and audience alike in metaphysical questions that are felt rather than thought.
Macbeth (1606), similarly based on English chronicle material, concentrates on the problems of evil and freedom, convincingly mingles the supernatural with a representation of history, and makes a paradoxically sympathetic hero of a murderer who sins against family and state—a man in some respects worse than the villain of Hamlet.
Dramatizing stories from Plutarch's Parallel Lives, Antony and Cleopatra and Coriolanus (both written in 1607-1608) embody Shakespeare's bitterest images of political life, the former by setting against the call to Roman duty the temptation to liberating sexual passion, the latter by pitting a protagonist who cannot live with hypocrisy against a society built on it. Both of these tragedies present ancient history with a vividness that makes it seem contemporary, though the sensuousness of Antony and Cleopatra, the richness of its detail, the ebullience of its language, and the seductive character of its heroine have made it far more popular than the harsh and austere Coriolanus. One more tragedy, Timon of Athens, similarly based on Plutarch, was written during this period, though its date is obscure. Despite its abundant brilliance, few find it a fully satisfactory play, and some critics have speculated that what we have may be an incomplete draft. The handful of tragedies that Shakespeare wrote between 1604 and 1608 comprises an astonishing series of worlds different from one another, created of language that exceeds anything Shakespeare had done before, some of the most complex and vivid characters in all the plays, and a variety of new structural techniques.
A final group of plays takes a turn in a new direction. Commonly called the "romances," Pericles (1607), Cymbeline (1609), The Winter's Tale (1611), and The Tempest (1611) share their conventions with the tragicomedy that had been growing popular since the early years of the century. Particularly they resemble in some respects plays written by Beaumont and Fletcher for the private theatrical company whose operation the King's Men took over in 1608. While such work in the hands of others, however, tended to reflect the socially and intellectually narrow interests of an elite audience, Shakespeare turned the fashionable mode into a new kind of personal art form. Though less searing than the great tragedies, these plays have a unique power to move and are in the realm of the highest art. Pericles and Cymbeline seem somewhat tentative and experimental, though both are superb plays. The Winter's Tale, however, is one of Shakespeare's best plays. Like a rewriting of Othello in its first acts, it turns miraculously into pastoral comedy in its last. The Tempest is the most popular and perhaps the finest of the group. Prospero, shipwrecked on an island and dominating it with magic which he renounces at the end, may well be intended as an image of Shakespeare himself; in any event, the play is like a retrospective glance over the plays of the 2 previous decades.
After the composition of The Tempest, which many regard as an explicit farewell to art, Shakespeare retired to Stratford, returning to London to compose Henry VIII and The Two Noble Kinsmen in 1613; neither of these plays seems to have fired his imagination. He died in 1616, at the age of 52. His reputation grew quickly, and his work has continued to seem to each generation like its own most precious discovery. His value to his own age is suggested by the fact that two fellow actors performed the virtually unprecedented act in 1623 of gathering his plays together and publishing them in the Folio edition. Without their efforts, since Shakespeare was apparently not interested in publication, many of the plays would not have survived.
It appears to me that if one wants to make progress in mathematics, one should study the masters and not the pupils. - Niels Henrik Abel.
Nothing is better than reading and gaining more and more knowledge - Stephen William Hawking.
96. H. G. Wells
H.G. Wells, in full Herbert George Wells (born Sept. 21, 1866, Bromley, Kent, Eng.—died Aug. 13, 1946, London), English novelist, journalist, sociologist, and historian best known for such science fiction novels as The Time Machine and The War of the Worlds and such comic novels as Tono-Bungay and The History of Mr. Polly.
Early life
Wells was the son of domestic servants turned small shopkeepers. He grew up under the continual threat of poverty, and at age 14, after a very inadequate education supplemented by his inexhaustible love of reading, he was apprenticed to a draper in Windsor. His employer soon dismissed him, and he became assistant to a chemist, then to another draper, and finally, in 1883, an usher at Midhurst Grammar School. At 18 he won a scholarship to study biology at the Normal School (later the Royal College) of Science, in South Kensington, London, where T.H. Huxley was one of his teachers. He graduated from London University in 1888, becoming a science teacher and undergoing a period of ill health and financial worries, the latter aggravated by his marriage, in 1891, to his cousin, Isabel Mary Wells. The marriage was not a success, and in 1894 Wells ran off with Amy Catherine Robbins (d. 1927), a former pupil, who in 1895 became his second wife.
Early writings
Wells’s first published book was a Textbook of Biology (1893). With his first novel, The Time Machine (1895), which was immediately successful, he began a series of science fiction novels that revealed him as a writer of marked originality and an immense fecundity of ideas: The Wonderful Visit (1895), The Island of Doctor Moreau (1896), The Invisible Man (1897), The War of the Worlds (1898), The First Men in the Moon (1901), and The Food of the Gods (1904). He also wrote many short stories, which were collected in The Stolen Bacillus (1895), The Plattner Story (1897), and Tales of Space and Time (1899). For a time he acquired a reputation as a prophet of the future, and indeed, in The War in the Air (1908), he foresaw certain developments in the military use of aircraft. But his imagination flourished at its best not in the manner of the comparatively mechanical anticipations of Jules Verne but in the astronomical fantasies of The First Men in the Moon and The War of the Worlds, from the latter of which the image of the Martian has passed into popular mythology.
Behind his inventiveness lay a passionate concern for man and society, which increasingly broke into the fantasy of his science fiction, often diverting it into satire and sometimes, as in The Food of the Gods, destroying its credibility. Eventually, Wells decided to abandon science fiction for comic novels of lower middle-class life, most notably in Love and Mr. Lewisham (1900), Kipps: The Story of a Simple Soul (1905), and The History of Mr. Polly (1910). In these novels, and in Tono-Bungay (1909), he drew on memories of his own earlier life, and, through the thoughts of inarticulate yet often ambitious heroes, revealed the hopes and frustrations of clerks, shop assistants, and underpaid teachers, who had rarely before been treated in fiction with such sympathetic understanding. In these novels, too, he made his liveliest, most persuasive comment on the problems of Western society that were soon to become his main preoccupation. The sombre vision of a dying world in The Time Machine shows that, in his long-term view of humanity’s prospects, Wells felt much of the pessimism prevalent in the 1890s. In his short-term view, however, his study of biology led him to hope that human society would evolve into higher forms, and with Anticipations (1901), Mankind in the Making (1903), and A Modern Utopia (1905), he took his place in the British public’s mind as a leading preacher of the doctrine of social progress. About this time, too, he became an active socialist, and in 1903 joined the Fabian Society, though he soon began to criticize its methods. The bitter quarrel he precipitated by his unsuccessful attempt to wrest control of the Fabian Society from George Bernard Shaw and Sidney and Beatrice Webb in 1906–07 is retold in his novel The New Machiavelli (1911), in which the Webbs are parodied as the Baileys.
Middle and late works
After about 1906 the pamphleteer and the novelist were in conflict in Wells, and only The History of Mr. Polly and the lighthearted Bealby (1915) can be considered primarily as fiction. His later novels are mainly discussions of social or political themes that show little concern for the novel as a literary form. Wells himself affected not to care about the literary merit of his work, and he rejected the tutelage of the American novelist Henry James, saying, “I would rather be called a journalist than an artist.” Indeed, his novel Boon (1915) included a spiteful parody of James. His next novel, Mr. Britling Sees It Through (1916), though touched by the prejudice and shortsightedness of wartime, gives a brilliant picture of the English people in World War I.
World War I shook Wells’s faith in even short-term human progress, and in subsequent works he modified his conception of social evolution, putting forward the view that man could only progress if he would adapt himself to changing circumstances through knowledge and education. To help bring about this process of adaptation Wells began an ambitious work of popular education, of which the main products were The Outline of History (1920; revised 1931), The Science of Life (1931), cowritten with Julian Huxley and G.P. Wells (his elder son by his second wife), and The Work, Wealth, and Happiness of Mankind (1932). At the same time he continued to publish works of fiction, in which his gifts of narrative and dialogue give way almost entirely to polemics. His sense of humour reappears, however, in the reminiscences of his Experiment in Autobiography (1934).
In 1933 Wells published a novelized version of a film script, The Shape of Things to Come. (Produced by Alexander Korda, the film Things to Come [1936] remains, on account of its special effects, one of the outstanding British films of the 20th century.) Wells’s version reverts to the utopianism of some earlier books, but as a whole his outlook grew steadily less optimistic, and some of his later novels contain much that is bitterly satiric. Fear of a tragic wrong turning in the development of the human race, to which he had early given imaginative expression in the grotesque animal mutations of The Island of Doctor Moreau, dominates the short novels and fables he wrote in the later 1930s. Wells was now ill and aging. With the outbreak of World War II, he lost all confidence in the future, and in Mind at the End of Its Tether (1945) he depicts a bleak vision of a world in which nature has rejected, and is destroying, humankind.
Assessment
In spite of an awareness of possible world catastrophe that underlay much of his earlier work and flared up again in old age, Wells in his lifetime was regarded as the chief literary spokesman of the liberal optimism that preceded World War I. No other writer has caught so vividly the energy of this period, its adventurousness, its feeling of release from the conventions of Victorian thought and propriety. Wells's influence was enormous, both on his own generation and on that which immediately followed it. None of his contemporaries did more to encourage revolt against Christian tenets and accepted codes of behaviour, especially as regards relations between men and women, in which, both in his books and in his personal life, he was a persistent advocate of an almost complete freedom. Though in many ways hasty, ill-tempered, and contradictory, Wells was undeviating and fearless in his efforts for social equality, world peace, and what he considered to be the future good of humanity.
As a creative writer his reputation rests on the early science fiction books and on the comic novels. In his science fiction, he took the ideas and fears that haunted the mind of his age and gave them symbolic expression as brilliantly conceived fantasy made credible by the quiet realism of its setting. In the comic novels, though his psychology lacks subtlety and the construction of his plots is often awkward, he shows a fund of humour and a deep sympathy for ordinary people. Wells’s prose style is always careless and lacks grace, yet he has his own gift of phrase and a true ear for vernacular speech, especially that of the lower middle class of London and southeastern England. His best work has a vigour, vitality, and exuberance unsurpassed, in its way, by that of any other British writer of the early 20th century.
97. Guglielmo Marconi, (born April 25, 1874, Bologna, Italy—died July 20, 1937, Rome), Italian physicist and inventor of a successful wireless telegraph (1896). In 1909 he received the Nobel Prize for Physics, which he shared with German physicist Ferdinand Braun. He later worked on the development of shortwave wireless communication, which constitutes the basis of nearly all modern long-distance radio.
Education and early work
Marconi’s father was Italian and his mother Irish. Educated first in Bologna and later in Florence, Marconi then went to the technical school in Leghorn, where, in studying physics, he had every opportunity for investigating electromagnetic wave technique, following the earlier mathematical work of James Clerk Maxwell and the experiments of Heinrich Hertz, who first produced and transmitted radio waves, and Sir Oliver Lodge, who conducted research on lightning and electricity.
In 1894 Marconi began experimenting at his father’s estate near Bologna, using comparatively crude apparatuses: an induction coil for increasing voltages, with a spark discharger controlled by a Morse key at the sending end and a simple coherer (a device designed to detect radio waves) at the receiver. After preliminary experiments over a short distance, he first improved the coherer; then, by systematic tests, he showed that the range of signaling was increased by using a vertical aerial with a metal plate or cylinder at the top of a pole connected to a similar plate on the ground. The range of signaling was thus increased to about 2.4 km (1.5 miles), enough to convince Marconi of the potentialities of this new system of communication. During this period he also conducted simple experiments with reflectors around the aerial to concentrate the radiated electrical energy into a beam instead of spreading it in all directions.
Receiving little encouragement to continue his experiments in Italy, he went, in 1896, to London, where he was soon assisted by Sir William Preece, the chief engineer of the post office. Marconi filed his first patent in England in June 1896 and, during that and the following year, gave a series of successful demonstrations, in some of which he used balloons and kites to obtain greater height for his aerials. He was able to send signals over distances of up to 6.4 km (4 miles) on the Salisbury Plain and to nearly 14.5 km (9 miles) across the Bristol Channel. These tests, together with Preece’s lectures on them, attracted considerable publicity both in England and abroad, and in June 1897 Marconi went to La Spezia, where a land station was erected and communication was established with Italian warships at distances of up to 19 km (11.8 miles).
There remained much skepticism about the useful application of this means of communication and a lack of interest in its exploitation. But Marconi’s cousin Jameson Davis, a practicing engineer, financed his patent and helped in the formation of the Wireless Telegraph and Signal Company, Ltd. (changed in 1900 to Marconi’s Wireless Telegraph Company, Ltd.). During the first years, the company’s efforts were devoted chiefly to showing the full possibilities of radiotelegraphy. A further step was taken in 1899 when a wireless station was established at South Foreland, England, for communicating with Wimereux in France, a distance of 50 km (31 miles); in the same year, British battleships exchanged messages at 121 km (75 miles).
In September 1899 Marconi equipped two American ships to report to newspapers in New York City the progress of the yacht race for the America’s Cup. The success of this demonstration aroused worldwide excitement and led to the formation of the American Marconi Company. The following year the Marconi International Marine Communication Company, Ltd., was established for the purpose of installing and operating services between ships and land stations. In 1900 also, Marconi filed his now-famous patent No. 7777 for Improvements in Apparatus for Wireless Telegraphy. The patent, based in part on earlier work in wireless telegraphy by Sir Oliver Lodge, enabled several stations to operate on different wavelengths without interference. (In 1943 the U.S. Supreme Court overturned patent No. 7777, indicating that Lodge, Nikola Tesla, and John Stone appeared to have priority in the development of radio-tuning apparatus.)
Major discoveries and innovations
Marconi’s great triumph was, however, yet to come. In spite of the opinion expressed by some distinguished mathematicians that the curvature of the Earth would limit practical communication by means of electric waves to a distance of 161–322 km (100–200 miles), Marconi succeeded in December 1901 in receiving at St. John’s, Newfoundland, signals transmitted across the Atlantic Ocean from Poldhu in Cornwall, England. This achievement created an immense sensation in every part of the civilized world, and, though much remained to be learned about the laws of propagation of radio waves around the Earth and through the atmosphere, it was the starting point of the vast development of radio communications, broadcasting, and navigation services that took place in the next 50 years, in much of which Marconi himself continued to play an important part.
During a voyage on the U.S. liner Philadelphia in 1902, Marconi received messages from distances of 1,125 km (700 miles) by day and 3,200 km (2,000 miles) by night. He thus was the first to discover that, because some radio waves travel by reflection from the upper regions of the atmosphere, transmission conditions are sometimes more favourable at night than during the day. This circumstance is due to the fact that the upward travel of the waves is limited in the daytime by absorption in the lower atmosphere, which becomes ionized—and so electrically conducting—under the influence of sunlight. In 1902 also, Marconi patented the magnetic detector in which the magnetization in a moving band of iron wires is changed by the arrival of a signal causing a click in the telephone receiver connected to it. During the ensuing three years, he also developed and patented the horizontal directional aerial. Both of these devices improved the efficiency of the communication system. In 1910 he received messages at Buenos Aires from Clifden in Ireland over a distance of approximately 9,650 km (6,000 miles), using a wavelength of about 8,000 metres (5 miles). Two years later Marconi introduced further innovations that so improved transmission and reception that important long-distance stations could be established. This increased efficiency allowed Marconi to send the first radio message from England to Australia in September 1918.
In spite of the rapid and widespread developments then taking place in radio and its applications to maritime use, Marconi’s intuition and urge to experiment were by no means exhausted. In 1916, during World War I, he saw the possible advantages of shorter wavelengths that would permit the use of reflectors around the aerial, thus minimizing the interception of transmitted signals by the enemy and also effecting an increase in signal strength. After tests in Italy (20 years after his original experiments with reflectors), Marconi continued the work in Great Britain and, on a wavelength of 15 metres (49 feet), received signals over a range of 30–160 km (20–100 miles). In 1923 the experiments were continued on board his steam yacht Elettra, which had been specially equipped. From a transmitter of 1 kilowatt at Poldhu, Cornwall, signals were received at a distance of 2,250 km (1,400 miles). These signals were much louder than those from Caernarfon, Wales, on a wavelength several hundred times as great and with 100 times the power at the transmitter. Thus began the development of shortwave wireless communication that, with the use of the beam aerial system for concentrating the energy in the desired direction, is the basis of most modern long-distance radio communication. In 1924 the Marconi company obtained a contract from the post office to establish shortwave communication between England and the countries of the British Commonwealth.
A few years later Marconi returned to the study of still shorter waves of about 0.5 metres (1.6 feet). At these very short wavelengths, a parabolic reflector of moderate size gives a considerable increase in power in the desired direction. Experiments conducted off the coast of Italy on the yacht Elettra soon showed that useful ranges of communication could be achieved with low-powered transmitters. In 1932, using very short wavelengths, Marconi installed a radiotelephone system between Vatican City and the pope’s palace at Castel Gandolfo. In later work Marconi once more demonstrated that even radio waves as short as 55 cm (22 inches) are not limited in range to the horizon or to optical distance between transmitter and receiver.
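Some context for the wavelengths quoted in these posts: wavelength and frequency are tied by c = fλ, so Marconi's move from waves thousands of metres long down to 15-metre and then half-metre waves was a move to far higher frequencies. A quick illustrative calculation (the function name is mine, not from the source):

```python
# Relation between wavelength and frequency: c = f * lambda
C = 299_792_458  # speed of light in vacuum, m/s

def frequency_hz(wavelength_m):
    """Return the frequency in hertz for a given wavelength in metres."""
    return C / wavelength_m

# Marconi's ~8,000 m long waves sit near 37 kHz; his 15 m short waves
# near 20 MHz; the ~0.5 m microwaves of the 1930s near 600 MHz.
for wl in (8000, 15, 0.5):
    print(f"{wl} m -> {frequency_hz(wl) / 1e6:.3f} MHz")
```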
Marconi received many honours and several honorary degrees. He was awarded the Nobel Prize for Physics (1909) for the development of wireless telegraphy; sent as plenipotentiary delegate to the peace conference in Paris (1919), in which capacity he signed the peace treaties with Austria and with Bulgaria; created marchese and nominated to the Italian senate (1929); and chosen president of the Royal Italian Academy (1930).
98. Sir Isaac Newton (25 December 1642 – 20 March 1726/27) was an English physicist and mathematician (described in his own day as a "natural philosopher") who is widely recognised as one of the most influential scientists of all time and a key figure in the scientific revolution. His book Philosophiæ Naturalis Principia Mathematica ("Mathematical Principles of Natural Philosophy"), first published in 1687, laid the foundations for classical mechanics. Newton made seminal contributions to optics, and he shares credit with Gottfried Wilhelm Leibniz for the development of calculus.
Newton's Principia formulated the laws of motion and universal gravitation, which dominated scientists' view of the physical universe for the next three centuries. By deriving Kepler's laws of planetary motion from his mathematical description of gravity, and then using the same principles to account for the trajectories of comets, the tides, the precession of the equinoxes, and other phenomena, Newton removed the last doubts about the validity of the heliocentric model of the Solar System. This work also demonstrated that the motion of objects on Earth and of celestial bodies could be described by the same principles. His prediction that Earth should be shaped as an oblate spheroid was later vindicated by the measurements of Maupertuis, La Condamine, and others, which helped convince most Continental European scientists of the superiority of Newtonian mechanics over the earlier system of Descartes.
Newton built the first practical reflecting telescope and developed a theory of colour based on the observation that a prism decomposes white light into the many colours of the visible spectrum. He formulated an empirical law of cooling, studied the speed of sound, and introduced the notion of a Newtonian fluid. In addition to his work on calculus, as a mathematician Newton contributed to the study of power series, generalised the binomial theorem to non-integer exponents, developed a method for approximating the roots of a function, and classified most of the cubic plane curves.
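The root-approximation method mentioned above survives as the Newton (or Newton–Raphson) iteration, x → x − f(x)/f′(x). A minimal sketch of the idea (function and variable names are mine, for illustration only):

```python
def newton_root(f, f_prime, x0, tol=1e-12, max_iter=50):
    """Approximate a root of f by repeating x -> x - f(x)/f'(x)."""
    x = x0
    for _ in range(max_iter):
        step = f(x) / f_prime(x)
        x -= step
        if abs(step) < tol:  # stop once the correction is negligible
            break
    return x

# Example: the square root of 2 as the positive root of x^2 - 2 = 0.
root = newton_root(lambda x: x * x - 2, lambda x: 2 * x, x0=1.5)
print(root)  # ~1.4142135623730951
```

Convergence is quadratic near a simple root, which is why only a handful of iterations are needed here.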
Newton was a fellow of Trinity College and the second Lucasian Professor of Mathematics at the University of Cambridge. He was a devout but unorthodox Christian, and, unusually for a member of the Cambridge faculty of the day, he refused to take holy orders in the Church of England, perhaps because he privately rejected the doctrine of the Trinity. Beyond his work on the mathematical sciences, Newton dedicated much of his time to the study of biblical chronology and alchemy, but most of his work in those areas remained unpublished until long after his death. In his later life, Newton became president of the Royal Society. Newton served the British government as Warden and Master of the Royal Mint.
99. Karl Landsteiner, (born June 14, 1868, Vienna, Austrian Empire [Austria]—died June 26, 1943, New York, N.Y., U.S.), Austrian American immunologist and pathologist who received the 1930 Nobel Prize for Physiology or Medicine for his discovery of the major blood groups and the development of the ABO system of blood typing that has made blood transfusion a routine medical practice.
After receiving his M.D. in 1891 from the University of Vienna, Landsteiner studied organic chemistry with many notable scientists in Europe, including the German chemist Emil Fischer. In 1897 he returned to the University of Vienna, where he pursued his interest in the emerging field of immunology and in 1901 published his discovery of the human ABO blood group system. At that time, although it was known that the mixing of blood from two individuals could result in clumping, or agglutination, of red blood cells, the underlying mechanism of this phenomenon was not understood. Landsteiner discovered the cause of agglutination to be an immunological reaction that occurs when antibodies are produced by the host against donated blood cells. This immune response is elicited because blood from different individuals may vary with respect to certain antigens located on the surface of red blood cells. Landsteiner identified three such antigens, which he labeled A, B, and C (later changed to O). A fourth blood type, later named AB, was identified the following year. He found that if a person with one blood type—A, for example—receives blood from an individual of a different blood type, such as B, the host’s immune system will not recognize the B antigens on the donor blood cells and thus will consider them to be foreign and dangerous, as it would regard an infectious microorganism. To defend the body from this perceived threat, the host’s immune system will produce antibodies against the B antigens, and agglutination will occur as the antibodies bind to the B antigens. Landsteiner’s work made it possible to determine blood type and thus paved the way for blood transfusions to be carried out safely. Landsteiner also discovered other blood factors during his career: the M, N, and P factors, which he identified in 1927 with Philip Levine, and the Rhesus (Rh) system, in 1940 with Alexander Wiener.
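The transfusion logic that follows from Landsteiner's discovery can be modelled as a small compatibility check: a recipient's immune system attacks any A or B antigen its own red cells lack. This is an illustrative sketch of the standard ABO rules, not code from the source:

```python
# Antigens carried on red blood cells for each ABO type.
ANTIGENS = {"O": set(), "A": {"A"}, "B": {"B"}, "AB": {"A", "B"}}

def compatible(donor, recipient):
    """Red cells are compatible if the donor carries no antigen
    that the recipient's immune system would treat as foreign."""
    foreign = ANTIGENS[donor] - ANTIGENS[recipient]
    return not foreign

print(compatible("O", "A"))  # True: O cells carry no A or B antigens
print(compatible("B", "A"))  # False: a type A recipient makes anti-B antibodies
```

The model reproduces the familiar facts that type O is the universal red-cell donor and type AB the universal recipient.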
In addition to his study of human blood groups, Landsteiner made a number of other important contributions to science. He and the Romanian bacteriologist Constantin Levaditi discovered that a microorganism is responsible for poliomyelitis and laid the groundwork for the development of the polio vaccine. Landsteiner also helped identify the microorganisms responsible for syphilis. However, he considered his greatest work to be his investigations into antigen-antibody interactions, which he carried out primarily at Rockefeller Institute (now called Rockefeller University) in New York City (1922–43). In this research Landsteiner used small organic molecules called haptens—which stimulate antibody production only when combined with a larger molecule, such as protein—to demonstrate how small variations in a molecule’s structure can cause great changes in antibody production. Landsteiner summarized his work in The Specificity of Serological Reactions (1936), a classic text that helped establish the field of immunochemistry.
100. Carl Linnaeus (23 May 1707 – 10 January 1778), also known after his ennoblement as Carl von Linné, was a Swedish botanist, physician, and zoologist, who formalised the modern system of naming organisms called binomial nomenclature. He is known by the epithet "father of modern taxonomy". Many of his writings were in Latin, and his name is rendered in Latin as Carolus Linnæus (after 1761 Carolus a Linné).
Linnaeus was born in the countryside of Småland, in southern Sweden. He received most of his higher education at Uppsala University, and began giving lectures in botany there in 1730. He lived abroad between 1735 and 1738, where he studied and also published a first edition of his Systema Naturae in the Netherlands. He then returned to Sweden, where he became professor of medicine and botany at Uppsala. In the 1740s, he was sent on several journeys through Sweden to find and classify plants and animals. In the 1750s and '60s, he continued to collect and classify animals, plants, and minerals, and published several volumes. At the time of his death, he was one of the most acclaimed scientists in Europe.
The Swiss philosopher Jean-Jacques Rousseau sent him the message: "Tell him I know no greater man on earth." The German writer Johann Wolfgang von Goethe wrote: "With the exception of Shakespeare and Spinoza, I know no one among the no longer living who has influenced me more strongly." Swedish author August Strindberg wrote: "Linnaeus was in reality a poet who happened to become a naturalist." Among other compliments, Linnaeus has been called "Princeps botanicorum" (Prince of Botanists), "The Pliny of the North," and "The Second Adam". He is also considered as one of the founders of modern ecology.
In botany, the author abbreviation used to indicate Linnaeus as the authority for species' names is L. In older publications, the abbreviation "Linn." is sometimes found (for instance in: Cheeseman, T.F. (1906) – Manual of the New Zealand Flora). Linnaeus' remains comprise the type specimen for the species Homo sapiens, following the International Code of Zoological Nomenclature, since the sole specimen he is known to have examined when writing the species description was himself.