148. Lars Magnus Ericsson
Lars Magnus Ericsson (5 May 1846 – 17 December 1926) was a Swedish inventor, entrepreneur and founder of telephone equipment manufacturer Ericsson (incorporated as Telefonaktiebolaget LM Ericsson).
Lars Magnus was born in Värmskog, Värmland, and grew up in the small village of Vegerbol, between Karlstad and Arvika. When he was 12, his father died, forcing him to seek work as a miner. He worked until he had saved enough money to leave the village, and moved to Stockholm in 1867. He then worked for six years for the instrument maker Öllers & Co., which mainly produced telegraph equipment. Because of his skills, he was awarded two state scholarships to study instrument making abroad between 1872 and 1875. One of the companies he worked at was Siemens & Halske.
Upon his return to Sweden in 1876, he founded a small mechanical workshop together with his friend Carl Johan Andersson, who had also worked at Öllers & Co. The workshop was a former kitchen of some 13 square meters at Drottninggatan 15, in the most central part of Stockholm. There he entered the telephone business by analyzing Bell and Siemens telephones and building his own instruments in their image. It was not until the company began cooperating with Henrik Tore Cedergren in 1883 that it would start to grow into the Ericsson corporation.
In the year 1900 Lars Magnus retired from Ericsson at the age of 54. He kept his shares in the company until 1905 and then sold them all.
He is said to have been a demanding person who disliked direct publicity about himself and did not wish to be idolized; he was, however, deeply respected by his employees. He was a skeptic, cautious in business, and somewhat opposed to patents, since many of the products he made could not have been built had patent legislation been strictly enforced. When his phones were copied by Norwegian companies he did not object, as his phones had in turn been largely copied from Siemens. He initially did not believe in a mass market for telephones, seeing them as a toy for the leisure class.
After his death in 1926, he was buried at Hågelby gård in Botkyrka. At his explicit request, there is no headstone marking his grave.
It appears to me that if one wants to make progress in mathematics, one should study the masters and not the pupils. - Niels Henrik Abel.
Nothing is better than reading and gaining more and more knowledge - Stephen William Hawking.
149. Florence Nightingale
Florence Nightingale, (12 May 1820 – 13 August 1910) was a celebrated English social reformer and statistician, and the founder of modern nursing.
She came to prominence while serving as a manager of nurses trained by her during the Crimean War, where she organised care for wounded soldiers. She gave nursing a highly favourable reputation and became an icon of Victorian culture, especially in the persona of "The Lady with the Lamp" making rounds of wounded soldiers at night.
Some recent commentators have asserted Nightingale's achievements in the Crimean War were exaggerated by the media at the time, to satisfy the public's need for a hero. Nevertheless, critics agree on the decisive importance of her follow-up achievements in professionalising nursing roles for women. In 1860, Nightingale laid the foundation of professional nursing with the establishment of her nursing school at St Thomas' Hospital in London. It was the first secular nursing school in the world, now part of King's College London. In recognition of her pioneering work in nursing, the Nightingale Pledge taken by new nurses, and the Florence Nightingale Medal, the highest international distinction a nurse can achieve, were named in her honour, and the annual International Nurses Day is celebrated around the world on her birthday. Her social reforms include improving healthcare for all sections of British society, advocating better hunger relief in India, helping to abolish laws that were over-harsh to women, and expanding the acceptable forms of female participation in the workforce.
Nightingale was a prodigious and versatile writer. In her lifetime, much of her published work was concerned with spreading medical knowledge. Some of her tracts were written in simple English so that they could easily be understood by those with poor literary skills. She also helped popularise the graphical presentation of statistical data. Much of her writing, including her extensive work on religion and mysticism, has only been published posthumously.
150. Christian Goldbach
Christian Goldbach, (born March 18, 1690, Königsberg, Prussia [now Kaliningrad, Russia]—died Nov. 20, 1764, Moscow, Russia) German-born mathematician, active for most of his career in Russia, whose contributions to number theory include Goldbach’s conjecture.
In 1725 Goldbach became professor of mathematics and historian of the Imperial Academy at St. Petersburg. Three years later he went to Moscow as tutor to Tsar Peter II, and from 1742 he served as a staff member of the Russian Ministry of Foreign Affairs.
Goldbach first proposed the conjecture that bears his name in a letter to the Swiss mathematician Leonhard Euler in 1742. He claimed that “every number greater than 2 is an aggregate of three prime numbers.” Because mathematicians in Goldbach’s day considered 1 a prime number (prime numbers are now defined as those positive integers greater than 1 that are divisible only by 1 and themselves), Goldbach’s conjecture is usually restated in modern terms as: Every even natural number greater than 2 is equal to the sum of two prime numbers.
The first breakthrough in the effort to prove Goldbach’s conjecture occurred in 1930, when the Soviet mathematician Lev Genrikhovich Shnirelman proved that every natural number can be expressed as the sum of not more than 20 prime numbers. In 1937 the Soviet mathematician Ivan Matveyevich Vinogradov went on to prove that every “sufficiently large” (without stating exactly how large) odd natural number can be expressed as the sum of not more than three prime numbers. The latest refinement came in 1973, when the Chinese mathematician Chen Jingrun proved that every sufficiently large even natural number is the sum of a prime and a product of at most two primes.
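The modern form of the conjecture is easy to check by brute force for small even numbers. A minimal Python sketch (the helper names here are my own, purely illustrative):

```python
def is_prime(n):
    """Trial division, adequate for small n."""
    if n < 2:
        return False
    return all(n % d for d in range(2, int(n ** 0.5) + 1))

def goldbach_pair(n):
    """Return primes (p, q) with p + q == n, or None if none exist."""
    for p in range(2, n // 2 + 1):
        if is_prime(p) and is_prime(n - p):
            return (p, n - p)
    return None

# No counterexample among the even numbers up to 1,000.
assert all(goldbach_pair(n) for n in range(4, 1001, 2))
print(goldbach_pair(100))  # -> (3, 97)
```

Exhaustive searches of this kind have verified the conjecture far beyond this range, but no proof is known.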
Goldbach also made notable contributions to the theory of curves, to infinite series, and to the integration of differential equations.
151. Johann Heinrich Lambert
Johann Heinrich Lambert, (born August 26, 1728, Mülhausen, Alsace—died September 25, 1777, Berlin, Prussia [Germany]) Swiss German mathematician, astronomer, physicist, and philosopher who provided the first rigorous proof that π (the ratio of a circle’s circumference to its diameter) is irrational, meaning that it cannot be expressed as the quotient of two integers.
Lambert, the son of a tailor, was largely self-educated and early in his life began geometric and astronomical investigations by means of instruments he designed and built himself. He worked for a time as a bookkeeper, secretary, and editor. As a private tutor in 1748, he gained access to a good library, which he used for self-improvement until 1759, when he resigned his post to settle in Augsburg. In 1764 he went to Berlin, where he received the patronage of Frederick the Great. His memoir containing the proof that π is irrational was published in 1768. In 1774 at Berlin he became editor of Astronomisches Jahrbuch oder Ephemeriden, an astronomical almanac.
Lambert made the first systematic development of hyperbolic functions. He is also responsible for many innovations in the study of heat and light. The lambert, a measurement of light intensity, was named in his honour. Among his most important works are Photometria (1760; “The Measurement of Light”); Die Theorie der Parallellinien (1766; “The Theory of Parallel Lines”), which contains results later included in non-Euclidean geometry; and Pyrometrie (1779; “The Measurement of Heat”). The Neues Organon (1764; “New Organon”), his principal philosophical work, contains an analysis of a great variety of questions, among them formal logic, probability, and the principles of science. He also corresponded with Immanuel Kant, with whom he shares the honour of being among the first to recognize that spiral nebulae are disk-shaped galaxies like the Milky Way.
152. Richard Attenborough
Richard Attenborough, in full Richard Samuel Attenborough, Baron Attenborough of Richmond-upon-Thames, (born August 29, 1923, Cambridge, England—died August 24, 2014, London) English actor, director, and producer known for his dynamic on-screen presence, nuanced work behind the camera, and charity efforts.
Attenborough—the eldest of three brothers, one of whom was nature documentarian Sir David Attenborough—was raised in Leicester, England, where his father was principal of the local university. In 1942 he graduated from the Royal Academy of Dramatic Art (RADA), and that year he also made his film debut in Noël Coward’s In Which We Serve. He had made his stage debut in 1941.
Following a stint (1943–46) in the Royal Air Force during World War II, part of which was spent with the RAF film unit, Attenborough established himself as a character actor, which included creating the role of Detective Sergeant Trotter in the original West End production of Agatha Christie’s The Mousetrap (1952). He also garnered accolades for his film portrayals of a sociopathic thug in Brighton Rock (1947); a soldier in the comedy Private’s Progress (1956) and its sequel, I’m All Right Jack (1959); and a squadron leader engineering a breakout from a German POW camp in The Great Escape (1963). Attenborough won Golden Globe Awards for best supporting actor for The Sand Pebbles (1966) and for his comedic turn as a circus owner in Doctor Dolittle (1967). After a lengthy acting hiatus, he returned to the screen in American director Steven Spielberg’s Jurassic Park (1993), playing the obliviously hubristic owner of a dinosaur theme park, a role he reprised in the 1997 sequel. He then played Kris Kringle in a 1994 remake of Miracle on 34th Street (1947) and appeared as Sir William Cecil in Elizabeth (1998).
Attenborough was also noted as a director. He made his directorial debut with the musical Oh! What a Lovely War (1969). Gandhi (1982)—his biographical film about Mohandas K. Gandhi—earned eight Academy Awards, including those for best picture and best director. Further directorial efforts included A Bridge Too Far (1977), the antiapartheid film Cry Freedom (1987), the Charlie Chaplin biopic Chaplin (1992), and Shadowlands (1993), a depiction of the relationship between American poet Joy Gresham and English writer C.S. Lewis. He also helmed Closing the Ring (2007), a World War II romance told in flashbacks.
Attenborough served as vice president (1973–95) and president (2002–10) of the British Academy of Film and Television Arts and as president (2003–14) of RADA. He was on the board of the education organization United World Colleges and in 1987 became a goodwill ambassador for UNICEF. He was made a Commander of the Order of the British Empire (CBE) in 1967, was knighted in 1976, and was granted a life peerage in 1993. In 1998 he received the Japan Art Association’s Praemium Imperiale prize for theatre/film.
Last edited by Jai Ganesh (2016-06-16 00:47:52)
One of my favorite actors, first saw him in "Flight of the Phoenix."
In mathematics, you don't understand things. You just get used to them.
If it ain't broke, fix it until it is.
Always satisfy the Prime Directive of getting the right answer above all else.
One of my favorite actors, first saw him in "Flight of the Phoenix."
Happy to learn that!
153. Pierre de Fermat
Pierre de Fermat, (born August 17, 1601, Beaumont-de-Lomagne, France—died January 12, 1665, Castres) French mathematician who is often called the founder of the modern theory of numbers. Together with René Descartes, Fermat was one of the two leading mathematicians of the first half of the 17th century. Independently of Descartes, Fermat discovered the fundamental principle of analytic geometry. His methods for finding tangents to curves and their maximum and minimum points led him to be regarded as the inventor of the differential calculus. Through his correspondence with Blaise Pascal he was a co-founder of the theory of probability.
Life and early work
Little is known of Fermat’s early life and education. He was of Basque origin and received his primary education in a local Franciscan school. He studied law, probably at Toulouse and perhaps also at Bordeaux. Having developed tastes for foreign languages, classical literature, and ancient science and mathematics, Fermat followed the custom of his day in composing conjectural “restorations” of lost works of antiquity. By 1629 he had begun a reconstruction of the long-lost Plane Loci of Apollonius, the Greek geometer of the 3rd century BC. He soon found that the study of loci, or sets of points with certain characteristics, could be facilitated by the application of algebra to geometry through a coordinate system. Meanwhile, Descartes had observed the same basic principle of analytic geometry, that equations in two variable quantities define plane curves. Because Fermat’s Introduction to Loci was published posthumously in 1679, the exploitation of their discovery, initiated in Descartes’s Géométrie of 1637, has since been known as Cartesian geometry.
In 1631 Fermat received the baccalaureate in law from the University of Orléans. He served in the local parliament at Toulouse, becoming councillor in 1634. Sometime before 1638 he became known as Pierre de Fermat, though the authority for this designation is uncertain. In 1638 he was named to the Criminal Court.
Analyses of curves
Fermat’s study of curves and equations prompted him to generalize the equation for the ordinary parabola ay = x^2, and that for the rectangular hyperbola xy = a^2, to the form a^(n−1)y = x^n. The curves determined by this equation are known as the parabolas or hyperbolas of Fermat according as n is positive or negative. He similarly generalized the Archimedean spiral r = aθ. These curves in turn directed him in the middle 1630s to an algorithm, or rule of mathematical procedure, that was equivalent to differentiation. This procedure enabled him to find equations of tangents to curves and to locate maximum, minimum, and inflection points of polynomial curves, which are graphs of linear combinations of powers of the independent variable. During the same years, he found formulas for areas bounded by these curves through a summation process that is equivalent to the formula now used for the same purpose in the integral calculus. It is not known whether or not Fermat noticed that differentiation of x^n, leading to nx^(n−1), is the inverse of integrating x^n. Through ingenious transformations he handled problems involving more general algebraic curves, and he applied his analysis of infinitesimal quantities to a variety of other problems, including the calculation of centres of gravity and finding the lengths of curves. Descartes in the Géométrie had reiterated the widely held view, stemming from Aristotle, that the precise rectification or determination of the length of algebraic curves was impossible; but Fermat was one of several mathematicians who, in the years 1657–59, disproved the dogma. In a paper entitled “De Linearum Curvarum cum Lineis Rectis Comparatione” (“Concerning the Comparison of Curved Lines with Straight Lines”), he showed that the semicubical parabola and certain other algebraic curves were strictly rectifiable. He also solved the related problem of finding the surface area of a segment of a paraboloid of revolution. This paper appeared in a supplement to the Veterum Geometria Promota, issued by the mathematician Antoine de La Loubère in 1660. It was Fermat’s only mathematical work published in his lifetime.
Disagreement with other Cartesian views
Fermat differed also with Cartesian views concerning the law of refraction (the sines of the angles of incidence and refraction of light passing through media of different densities are in a constant ratio), published by Descartes in 1637 in La Dioptrique; like La Géométrie, it was an appendix to his celebrated Discours de la méthode. Descartes had sought to justify the sine law through a premise that light travels more rapidly in the denser of the two media involved in the refraction. Twenty years later Fermat noted that this appeared to be in conflict with the view espoused by Aristotelians that nature always chooses the shortest path. Applying his method of maxima and minima and making the assumption that light travels less rapidly in the denser medium, Fermat showed that the law of refraction is consonant with his “principle of least time.” His argument concerning the speed of light was found later to be in agreement with the wave theory of the 17th-century Dutch scientist Christiaan Huygens, and in 1849 it was verified experimentally by A.-H.-L. Fizeau.
Through the mathematician and theologian Marin Mersenne, who, as a friend of Descartes, often acted as an intermediary with other scholars, Fermat in 1638 maintained a controversy with Descartes on the validity of their respective methods for tangents to curves. Fermat’s views were fully justified some 30 years later in the calculus of Sir Isaac Newton. Recognition of the significance of Fermat’s work in analysis was tardy, in part because he adhered to the system of mathematical symbols devised by François Viète, notations that Descartes’s Géométrie had rendered largely obsolete. The handicap imposed by the awkward notations operated less severely in Fermat’s favourite field of study, the theory of numbers; but here, unfortunately, he found no correspondent to share his enthusiasm. In 1654 he had enjoyed an exchange of letters with his fellow mathematician Blaise Pascal on problems in probability concerning games of chance, the results of which were extended and published by Huygens in his De Ratiociniis in Ludo Aleae (1657).
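Fermat's tangent rule, and the inverse relation between his tangent and area procedures, can be illustrated numerically. In the sketch below (the function names are mine), a Fermat-style difference quotient for y = x^n approaches the modern derivative nx^(n−1), and a crude summation of nx^(n−1) recovers x^n:

```python
def fermat_slope(n, a, e=1e-6):
    """Fermat-style difference quotient ((a + e)^n - a^n) / e;
    as e shrinks it approaches the modern derivative n * a**(n - 1)."""
    return ((a + e) ** n - a ** n) / e

# Tangent slope of the Fermat parabola y = x^3 at x = 2: n * a^(n-1) = 12.
assert abs(fermat_slope(3, 2.0) - 12.0) < 1e-3

# Inverse relation: summing 3x^2 (a crude quadrature) recovers x^3 on [0, 2].
steps = 100_000
h = 2.0 / steps
area = sum(3 * (i * h) ** 2 * h for i in range(steps))
assert abs(area - 2.0 ** 3) < 1e-3
```

This is, of course, a modern numerical restatement; Fermat worked with his "adequality" of algebraic expressions, not with limits.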
Work on theory of numbers
Fermat vainly sought to persuade Pascal to join him in research in number theory. Inspired by an edition in 1621 of the Arithmetic of Diophantus, the Greek mathematician of the 3rd century AD, Fermat had discovered new results in the so-called higher arithmetic, many of which concerned properties of prime numbers (those positive integers that have no factors other than 1 and themselves). One of the most elegant of these had been the theorem that every prime of the form 4n + 1 is uniquely expressible as the sum of two squares. A more important result, now known as Fermat’s lesser theorem, asserts that if p is a prime number and if a is any positive integer, then a^p − a is divisible by p. Fermat seldom gave demonstrations of his results, and in this case proofs were provided by Gottfried Leibniz, the 17th-century German mathematician and philosopher, and Leonhard Euler, the 18th-century Swiss mathematician. For occasional demonstrations of his theorems Fermat used a device that he called his method of “infinite descent,” an inverted form of reasoning by recurrence or mathematical induction. One unproved conjecture by Fermat turned out to be false. In 1640, in letters to mathematicians and to other knowledgeable thinkers of the day, including Blaise Pascal, he announced his belief that numbers of the form 2^(2^n) + 1, known since as “numbers of Fermat,” are necessarily prime; but a century later Euler showed that 2^(2^5) + 1 has 641 as a factor. It is not known if there are any primes among the Fermat numbers for n > 5. Carl Friedrich Gauss in 1796 in Germany found an unexpected application for Fermat numbers when he showed that a regular polygon of N sides is constructible in a Euclidean sense if N is a prime Fermat number or a product of distinct Fermat primes. By far the best known of Fermat’s many theorems is a problem known as his “great,” or “last,” theorem. This appeared in the margin of his copy of Diophantus’ Arithmetica and states that the equation x^n + y^n = z^n, where x, y, z, and n are positive integers, has no solution if n is greater than 2. This theorem remained unsolved until the late 20th century.
Fermat was the most productive mathematician of his day. But his influence was circumscribed by his reluctance to publish.
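Both the lesser theorem and Euler's counterexample to the Fermat-number conjecture are easy to confirm with modern integer arithmetic; a brief sketch:

```python
# Fermat's lesser theorem: a^p - a is divisible by p for prime p.
for p in (2, 3, 5, 7, 11, 13):
    for a in range(1, 50):
        assert (a ** p - a) % p == 0

# The first five Fermat numbers 2^(2^n) + 1 are indeed prime ...
fermat_numbers = [2 ** (2 ** n) + 1 for n in range(5)]
assert fermat_numbers == [3, 5, 17, 257, 65537]

# ... but, as Euler showed, the sixth is divisible by 641.
f5 = 2 ** (2 ** 5) + 1
print(f5, "=", 641, "*", f5 // 641)  # 4294967297 = 641 * 6700417
assert f5 % 641 == 0
```

The converse of the lesser theorem fails (composite "pseudoprimes" such as 341 pass the test for a = 2), which is why modern primality tests build on it rather than use it directly.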
Last edited by Jai Ganesh (2016-06-18 23:05:44)
154. Sir Tim Berners-Lee
Sir Tim Berners-Lee, (born June 8, 1955, London, England) British computer scientist, generally credited as the inventor of the World Wide Web. In 2004 he was awarded a knighthood by Queen Elizabeth II of the United Kingdom and the inaugural Millennium Technology Prize (€1 million) by the Finnish Technology Award Foundation.
Computing came naturally to Berners-Lee, as both of his parents worked on the Ferranti Mark I, the first commercial computer. After graduating in 1976 from the University of Oxford, Berners-Lee designed computer software for two years at Plessey Telecommunications Ltd., located in Poole, Dorset, England. Following this, he had several positions in the computer industry, including a stint from June to December 1980 as a software engineering consultant at CERN, the European particle physics laboratory in Geneva.
While at CERN, Berners-Lee developed a program for himself, called Enquire, that could store information in files that contained connections (“links”) both within and among separate files—a technique that became known as hypertext. After leaving CERN, Berners-Lee worked for Image Computer Systems Ltd., located in Ferndown, Dorset, where he designed a variety of computer systems. In 1984 he returned to CERN to work on the design of the laboratory’s computer network, developing procedures that allowed diverse computers to communicate with one another and researchers to control remote machines. In 1989 Berners-Lee drew up a proposal for creating a global hypertext document system that would make use of the Internet. His goal was to provide researchers with the ability to share their results, techniques, and practices without having to exchange e-mail constantly. Instead, researchers would place such information “online,” where their peers could immediately retrieve it anytime, day or night. Berners-Lee wrote the software for the first Web server (the central repository for the files to be shared) and the first Web client, or “browser” (the program to access and display files retrieved from the server), between October 1990 and the summer of 1991. The first “killer application” of the Web at CERN was the laboratory’s telephone directory—a mundane beginning for one of the technological wonders of the computer age.
From 1991 to 1993 Berners-Lee evangelized the Web. In 1994 in the United States he established the World Wide Web (W3) Consortium at the Massachusetts Institute of Technology’s Laboratory for Computer Science. The consortium, in consultation with others, lends oversight to the Web and the development of standards. In 1999 Berners-Lee became the first holder of the 3Com Founders chair at the Laboratory for Computer Science. His numerous other honours include the National Academy of Engineering’s prestigious Charles Stark Draper Prize (2007). Berners-Lee is the author, along with Mark Fischetti, of Weaving the Web: The Original Design and Ultimate Destiny of the World Wide Web (2000).
Last edited by Jai Ganesh (2016-06-20 00:16:10)
155. Dale Carnegie
Dale Carnegie, original name Dale Carnegey (born November 24, 1888, Maryville, Missouri, U.S.—died November 1, 1955, Forest Hills, New York) American lecturer, author, and pioneer in the field of public speaking and the psychology of the successful personality.
Carnegie was born into poverty on a farm in Missouri. In high school and college he was active in debating clubs. After graduating he was a salesman in Nebraska and an actor in New York City and finally taught public speaking at the YMCA. His classes became extremely successful, and Carnegie began lecturing to packed houses. To standardize his teaching methods he began publishing pamphlets, which he collected into book form as 'Public Speaking: A Practical Course for Business Men' (1926; also published as Public Speaking and Influencing Men in Business). At this time he also served as manager for a lecture tour with Lowell Thomas and compiled 'Little Known Facts About Well Known People' (1934).
Carnegie became an instant success with the hugely popular 'How to Win Friends and Influence People' (1936). Like most of his books, it revealed little that was unknown about human psychology but stressed that an individual’s attitude is crucial. He taught that anyone could benefit from a handicap if it were advantageously presented. Carnegie capitalized on the American longing for success by selling advice that helped readers feel, and perhaps become, successful. Other books include 'How to Stop Worrying and Start Living' (1948), which is primarily a collection of commonsense tricks to prevent stress.
156. Arthur Holly Compton
Arthur Holly Compton, (born Sept. 10, 1892, Wooster, Ohio, U.S.—died March 15, 1962, Berkeley, Calif.) American physicist and joint winner, with C.T.R. Wilson of England, of the Nobel Prize for Physics in 1927 for his discovery and explanation of the change in the wavelength of X rays when they collide with electrons in metals. This so-called Compton effect is caused by the transfer of energy from a photon to an electron. Its discovery in 1922 confirmed the dual nature of electromagnetic radiation as both a wave and a particle.
Compton, a younger brother of the physicist Karl T. Compton, received his doctorate from Princeton University in 1916 and became head of the department of physics at Washington University, St. Louis, in 1920. Compton’s Nobel Prize–winning research focused on the strange phenomena that occur when beams of short-wavelength X rays are aimed at elements of low atomic weight. He discovered that some of the X rays scattered by the elements are of longer wavelength than they were before being scattered. This result is contrary to the laws of classical physics, which could not explain why the scattering of a wave should increase its wavelength. Compton initially theorized that the size and shape of electrons in the target atoms could account for the change in the X rays’ wavelength. In 1922, however, he concluded that Einstein’s quantum theory, which argued that light consists of particles rather than waves, offered a better explanation of the effect. In his new model, Compton interpreted X rays as consisting of particles, or “photons,” as he called them. He argued that an X-ray photon can collide with an electron of a carbon atom; when this happens, the photon transfers some of its energy to the electron and then continues on with diminished energy and a longer wavelength than it had before. Compton’s interpretation provided the first widely accepted experimental evidence that electromagnetic radiation can exhibit both particle and wave behaviour, and thus helped to establish the legitimacy of the still-radical quantum theory.
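The size of the shift described above follows from the standard Compton formula Δλ = (h / m_e c)(1 − cos θ), which is not quoted in the text but is the quantitative form of the effect. A small sketch using CODATA constants:

```python
import math

H = 6.62607015e-34      # Planck constant, J*s
M_E = 9.1093837015e-31  # electron rest mass, kg
C = 2.99792458e8        # speed of light, m/s

def compton_shift(theta_deg):
    """Increase in photon wavelength (metres) after scattering
    through theta degrees off an effectively free electron."""
    return (H / (M_E * C)) * (1 - math.cos(math.radians(theta_deg)))

# At 90 degrees the shift equals the electron's Compton wavelength,
# about 2.43 pm: negligible for visible light, but a measurable
# fraction of a short X-ray wavelength, which is why Compton saw it.
assert abs(compton_shift(90.0) - 2.426e-12) < 1e-14
assert compton_shift(0.0) == 0.0  # forward scattering: no shift
```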
From 1923 to 1945 Compton was a professor of physics at the University of Chicago. In 1941 he was chairman of the committee of the National Academy of Sciences that studied the military potential of atomic energy. In this capacity he was instrumental, with the physicist Ernest O. Lawrence, in initiating the Manhattan Project, which created the first atomic bomb. From 1942 to 1945 he was director of the Metallurgical Laboratory at the University of Chicago, which developed the first self-sustaining atomic chain reaction and paved the way for controlled release of nuclear energy. He became chancellor of Washington University in 1945 and was professor of natural history there from 1953 until 1961.
157. Alicia Silverstone
Alicia Silverstone (born October 4, 1976) is an American actress, producer, author, and activist. Silverstone made her film debut in The Crush, earning the 1994 MTV Movie Award for Best Breakthrough Performance, and gained further prominence as a teen idol when she appeared in three music videos for the band Aerosmith. She starred in the 1995 hit Clueless (which earned her a multimillion-dollar deal with Columbia) and in the big-budget 1997 film Batman & Robin, in which she played Batgirl. She has continued to act in film and television and on stage. For her role in the short-lived comedy-drama Miss Match, Silverstone received a Golden Globe nomination for Best Actress – Television Series Musical or Comedy. A vegan, Silverstone endorsed PETA activities and published a book titled The Kind Diet.
Early life
Silverstone was born in San Francisco, California, the daughter of Deirdre "Didi" (née Radford), a Scottish former Pan Am flight attendant, and Monty Silverstone, an English real estate agent. She grew up in Hillsborough, California. Her father was born to a Jewish family and her mother converted to Conservative Judaism before marriage; Silverstone had a Bat Mitzvah ceremony. Silverstone began modeling when she was six years old, and was subsequently cast in television commercials, the first being for Domino's Pizza. She attended Crocker Middle School and then San Mateo High School.
Career
Silverstone has won several awards for her film performances. She received multiple MTV Movie Awards and a Young Artist Award for The Crush. For Clueless, she again received multiple MTV Movie Awards and a Young Artist Award, plus a Blockbuster Entertainment Award, a Kids' Choice Award, a National Board of Review award, and an American Comedy Award.
1990s
Her first credited acting role was in The Wonder Years, starring Fred Savage, in the episode entitled "Road Test", as Savage's character's high school "dream girl". Silverstone then won a leading part in the 1993 film The Crush, playing a teenage girl who sets out to ruin an older man after he spurns her affections; she won two awards at the 1994 MTV Movie Awards for the role—Best Breakthrough Performance and Best Villain. Silverstone became legally emancipated at the age of 15 in order to work the hours required for the shooting schedule of the film. Silverstone made several television movies early in her career, including Torch Song, Cool and the Crazy, and Scattered Dreams. In addition, Silverstone was almost chosen to play the lead role of Angela Chase in the critically acclaimed cult television series My So-Called Life before production of the pilot episode began. The role eventually went to Claire Danes.
After seeing her in The Crush, Marty Callner decided Silverstone would be perfect for a role in a music video he was directing for the band Aerosmith, called "Cryin'"; she was subsequently cast in two more videos, "Amazing" and "Crazy." These were hugely successful for both the band and Silverstone, making her a household name (and also gaining her the nickname, "the Aerosmith chick"). After seeing Silverstone in the three videos, filmmaker Amy Heckerling decided to cast her in Clueless.
Clueless became a hit and critical darling during the summer of 1995. As a result, she signed a deal with Columbia-TriStar valued between $8 and $10 million. As part of the package, she got a three-year first-look deal for her own production company, First Kiss Productions. Silverstone also won "Best Female Performance" and "Most Desirable Female" at the 1996 MTV Movie Awards for her performance in the film. In the same year, Silverstone starred in the erotic thriller The Babysitter, the film adaptation of Dean Koontz's novel Hideaway, and the French drama about Americans, New World.
Silverstone's next role was as Batgirl in Batman & Robin; while the film was not a critical success, it grossed $238,207,122 worldwide. Silverstone's turn as Batgirl was not well received and won her a Razzie Award for Worst Supporting Actress, though she also won a Blimp Award at the Kids' Choice Awards for the role. Also released in 1997 was Excess Baggage, the first movie from Silverstone's production company, First Kiss Productions. Silverstone then starred in the Saturn Award-nominated romantic comedy Blast from the Past, which also stars Brendan Fraser, Christopher Walken and Sissy Spacek.
2000s
In 2000, Silverstone appeared in Kenneth Branagh's film adaptation of William Shakespeare's Love's Labour's Lost, in which she was required to sing and dance. In 2001, Silverstone provided the voice of Sharon Spitz, the lead character in the Canadian animated television series Braceface. During this time, she also appeared in the films Global Heresy and Scorched. In 2002, she made her Broadway debut alongside Kathleen Turner and Jason Biggs in The Graduate. After removing herself from the public eye for a few years, she resurfaced in the short-lived 2003 NBC television series Miss Match, which was canceled after 11 episodes. Silverstone later acknowledged that she hates the trappings of fame, saying, "Fame is not anything I wish on anyone. You start acting because you love it. Then success arrives, and suddenly you're on show".
Alicia Silverstone in 2005
After the cancellation of Miss Match in 2003, Silverstone did a pilot with Fox called Queen B, in which she played a former high school prom queen named Beatrice (Bea) who has discovered that the real world is nothing like high school. It was not picked up for production. In 2005, she co-starred with Queen Latifah in Beauty Shop, a spin-off of the Barbershop films, as one of the stylists in the beauty shop. She also played a villainous reporter alongside Sarah Michelle Gellar and Freddie Prinze Jr. in Scooby-Doo 2: Monsters Unleashed (2004), which did well financially, and appeared in the direct-to-video film Silence Becomes You.
In 2006, Silverstone starred in an ABC pilot called Pink Collar, in which her character worked in a law firm. Like Queen B, this pilot was not picked up to series. That year, she also starred alongside Alex Pettyfer, Ewan McGregor and Mickey Rourke in the film Stormbreaker, and appeared in the Hallmark Hall of Fame made-for-TV movie Candles on Bay Street, based on the book by Cathie Pelletier. Silverstone continued her theatre work, next appearing in David Mamet's Boston Marriage and Speed-the-Plow. In 2008, she filmed another ABC pilot alongside Megan Mullally called Bad Mother's Handbook and made a cameo appearance in the comedy film Tropic Thunder.
In early 2009, Silverstone starred in the world premiere of Donald Margulies's Time Stands Still at the Geffen Playhouse in Los Angeles. The play focuses on a longtime couple and journalistic team who return to New York from an extended stint in the war-torn Middle East. She also starred in the music video for Rob Thomas's 2009 single "Her Diamonds".
Silverstone filmed a small segment in Elektra Luxx, a sequel to Women in Trouble. Director Sebastian Gutierrez cut her segment but may use it in a third installment, tentatively titled Women in Ecstasy.
2010s
In 2010, she reprised her role in Time Stands Still alongside Laura Linney in the New York production of the play on Broadway, which premiered on January 28, 2010, directed by Daniel Sullivan, who described Silverstone as "a breath of fresh air." The play received good reviews with The New York Times praising Silverstone, saying she "brings warmth, actorly intelligence and delicate humour."
Silverstone next starred in the teen romance The Art of Getting By, which premiered at the 2011 Sundance Film Festival, and appeared in four episodes of Suburgatory, reuniting with her Clueless castmate Jeremy Sisto.
Her next role was in Butter as the adoptive mother of a 12-year-old African American girl who enters a local butter sculpture competition in a small Iowa town. Rob Corddry, who plays her husband, invited her to appear in an episode of his show Childrens Hospital. She also reunited with Clueless director Amy Heckerling in Vamps, playing one of two vampires who fall in love and face a choice that could jeopardise their immortality. She was offered the role after Heckerling came to see her in Time Stands Still.
Silverstone later returned to Broadway in the 2012 New York production of The Performers and starred in Angels in Stardust. In 2013, she shot the TV pilot HR, which was not picked up. In 2015, she starred in the New York production of Of Good Stock.
She is set to appear in four upcoming movies: Who Gets the Dog?, Catfight, King Cobra and Tribes of Palos Verdes. In 2011 she starred in the film adaptation of Marie Phillips's novel Gods Behaving Badly; however, as of 2016, it remains unreleased.
Personal life
Silverstone has two older siblings, a half-sister from her father's previous marriage named Kezi Silverstone and a brother named David Silverstone. She married her longtime boyfriend, rock musician Christopher Jarecki, in a beachfront ceremony at Lake Tahoe on June 11, 2005. After meeting outside a movie theater in 1997, the couple dated for eight years prior to their marriage. They got engaged about a year before their marriage, and Jarecki presented Silverstone with an engagement ring that had belonged to his grandmother. They live in an eco-friendly Los Angeles house, complete with solar panels and an organic vegetable garden. Silverstone bought the house, shared with a "menagerie of rescued dogs", in 1996.
In 2009, Silverstone released The Kind Diet, a guide to vegan nutrition, and launched its associated website, The Kind Life. The Kind Diet topped the Hardcover Advice & Misc. category of The New York Times Best Seller list. In 2014, her follow-up book The Kind Mama was published. She plans to write a third, The Kind Diet Cookbook.
Political beliefs
Silverstone is noted for being an animal rights and environmental activist. She became a vegan in 1998 after attending an animal rights meeting, saying "I realized that I was the problem … I was an animal lover who was eating animals." She has stated she struggled with childhood vegetarianism, stating "at eight years old it's hard to stick to your guns – and so through the years I was always starting and stopping trying to be a vegetarian."
Last edited by Jai Ganesh (2016-06-27 23:58:07)
It appears to me that if one wants to make progress in mathematics, one should study the masters and not the pupils. - Niels Henrik Abel.
Nothing is better than reading and gaining more and more knowledge - Stephen William Hawking.
158. Jack Dorsey
Jack Dorsey (born November 19, 1976, St. Louis, Missouri, U.S.) is an American Web developer and entrepreneur who, with Evan Williams and Christopher Stone, cofounded (2006) the online microblogging service Twitter.
As a teenager, Dorsey created taxi-dispatching software that was adopted by taxicab companies. He attended New York University before moving (1999) to San Francisco, where he set up a company that used the Internet to handle the dispatching of couriers and emergency vehicles as well as taxis. In 2000 he first considered using text and instant messaging (based on the principles of dispatch software) as a way of keeping in touch with friends. Six years later he approached Williams and Stone with his idea; together they developed a prototype of what would become the Twitter platform. Dorsey posted the first Twitter message on March 21, 2006. The service, which allowed users to send messages no longer than 140 characters, quickly became a popular social networking hub as well as a mainstream form of communication. Dorsey served as CEO until October 2008, when he became chairman of the board. In that capacity, he was involved with Twitter’s initial public offering (2013), which raised $1.8 billion.
In 2009 Dorsey cofounded and became CEO of Square, a mobile-payments venture that offered devices and software to facilitate credit-card transactions. It launched in 2010 and by 2012 had more than two million users. Square initially was available only in North America, but it expanded to overseas markets in 2013, when its services became available in Japan. That year Dorsey also became a member of the Walt Disney Company's board of directors. In October 2015 he once again became CEO of Twitter while also remaining CEO of Square.
159. Adam Smith
Adam Smith (baptized June 5, 1723, Kirkcaldy, Fife, Scotland—died July 17, 1790, Edinburgh) was a Scottish social philosopher and political economist. After two centuries, Adam Smith remains a towering figure in the history of economic thought. Known primarily for a single work—An Inquiry into the Nature and Causes of the Wealth of Nations (1776), the first comprehensive system of political economy—Smith is more properly regarded as a social philosopher whose economic writings constitute only the capstone to an overarching view of political and social evolution. If his masterwork is viewed in relation to his earlier lectures on moral philosophy and government, as well as to allusions in The Theory of Moral Sentiments (1759) to a work he hoped to write on “the general principles of law and government, and of the different revolutions they have undergone in the different ages and periods of society,” then The Wealth of Nations may be seen not merely as a treatise on economics but also as a partial exposition of a much larger scheme of historical evolution.
Early life
Much more is known about Adam Smith’s thought than about his life. He was the son by second marriage of Adam Smith, comptroller of customs at Kirkcaldy, a small (population 1,500) but thriving fishing village near Edinburgh, and Margaret Douglas, daughter of a substantial landowner. Of Smith’s childhood nothing is known other than that he received his elementary schooling in Kirkcaldy and that at the age of four years he was said to have been carried off by gypsies. Pursuit was mounted, and young Adam was abandoned by his captors. “He would have made, I fear, a poor gypsy,” commented his principal biographer.
At the age of 14, in 1737, Smith entered the University of Glasgow, already remarkable as a centre of what was to become known as the Scottish Enlightenment. There he was deeply influenced by Francis Hutcheson, a famous professor of moral philosophy from whose economic and philosophical views he was later to diverge but whose magnetic character seems to have been a main shaping force in Smith’s development. Graduating in 1740, Smith won a scholarship (the Snell Exhibition) and traveled on horseback to Oxford, where he stayed at Balliol College. Compared with the stimulating atmosphere of Glasgow, Oxford was an educational desert. His years there were spent largely in self-education, from which Smith obtained a firm grasp of both classical and contemporary philosophy.
Returning to his home after an absence of six years, Smith cast about for suitable employment. The connections of his mother’s family, together with the support of the jurist and philosopher Henry Home, Lord Kames, resulted in an opportunity to give a series of public lectures in Edinburgh—a form of education then much in vogue in the prevailing spirit of “improvement.” The lectures, which ranged over a wide variety of subjects from rhetoric to history and economics, made a deep impression on some of Smith’s notable contemporaries. They also had a marked influence on Smith’s own career, for in 1751, at the age of 27, he was appointed professor of logic at Glasgow, from which post he transferred in 1752 to the more remunerative professorship of moral philosophy, a subject that embraced the related fields of natural theology, ethics, jurisprudence, and political economy.
Glasgow
Smith then entered upon a period of extraordinary creativity, combined with a social and intellectual life that he afterward described as “by far the happiest, and most honourable period of my life.” During the week he lectured daily from 7:30 to 8:30 am and again thrice weekly from 11 am to noon, to classes of up to 90 students, aged 14 to 16. (Although his lectures were presented in English rather than in Latin, following the precedent of Hutcheson, the level of sophistication for so young an audience strikes one today as extraordinarily demanding.) Afternoons were occupied with university affairs in which Smith played an active role, being elected dean of faculty in 1758; his evenings were spent in the stimulating company of Glasgow society.
Among his wide circle of acquaintances were not only members of the aristocracy, many connected with the government, but also a range of intellectual and scientific figures that included Joseph Black, a pioneer in the field of chemistry; James Watt, later of steam-engine fame; Robert Foulis, a distinguished printer and publisher and subsequent founder of the first British Academy of Design; and, not least, the philosopher David Hume, a lifelong friend whom Smith had met in Edinburgh. Smith was also introduced during these years to the company of the great merchants who were carrying on the colonial trade that had opened to Scotland following its union with England in 1707. One of them, Andrew Cochrane, had been a provost of Glasgow and had founded the famous Political Economy Club. From Cochrane and his fellow merchants Smith undoubtedly acquired the detailed information concerning trade and business that was to give such a sense of the real world to The Wealth of Nations.
The Theory of Moral Sentiments
In 1759 Smith published his first work, The Theory of Moral Sentiments. Didactic, exhortative, and analytic by turns, it lays the psychological foundation on which The Wealth of Nations was later to be built. In it Smith described the principles of “human nature,” which, together with Hume and the other leading philosophers of his time, he took as a universal and unchanging datum from which social institutions, as well as social behaviour, could be deduced.
One question in particular interested Smith in The Theory of Moral Sentiments. This was a problem that had attracted Smith’s teacher Hutcheson and a number of Scottish philosophers before him. The question was the source of the ability to form moral judgments, including judgments on one’s own behaviour, in the face of the seemingly overriding passions for self-preservation and self-interest. Smith’s answer, at considerable length, is the presence within each of us of an “inner man” who plays the role of the “impartial spectator,” approving or condemning our own and others’ actions with a voice impossible to disregard. (The theory may sound less naive if the question is reformulated to ask how instinctual drives are socialized through the superego.)
The thesis of the impartial spectator, however, conceals a more important aspect of the book. Smith saw humans as creatures driven by passions and at the same time self-regulated by their ability to reason and—no less important—by their capacity for sympathy. This duality serves both to pit individuals against one another and to provide them with the rational and moral faculties to create institutions by which the internecine struggle can be mitigated and even turned to the common good. He wrote in his Moral Sentiments the famous observation that he was to repeat later in The Wealth of Nations: that self-seeking men are often “led by an invisible hand…without knowing it, without intending it, [to] advance the interest of the society.”
It should be noted that scholars have long debated whether Moral Sentiments complemented or was in conflict with The Wealth of Nations. At one level there is a seeming clash between the theme of social morality contained in the first and the largely amoral explication of the economic system in the second. On the other hand, the first book can also be seen as an explanation of the manner in which individuals are socialized to become the market-oriented and class-bound actors that set the economic system into motion.
Travels on the Continent
The Theory quickly brought Smith wide esteem and in particular attracted the attention of Charles Townshend, himself something of an amateur economist, a considerable wit, and somewhat less of a statesman, whose fate it was to be the chancellor of the Exchequer responsible for the measures of taxation that ultimately provoked the American Revolution. Townshend had recently married and was searching for a tutor for his stepson and ward, the young duke of Buccleuch. Influenced by the strong recommendations of Hume and his own admiration for The Theory of Moral Sentiments, he approached Smith to take the charge.
The terms of employment were lucrative (an annual salary of £300 plus traveling expenses and a pension of £300 a year thereafter), considerably more than Smith had earned as a professor. Accordingly, Smith resigned his Glasgow post in 1763 and set off for France the next year as the tutor of the young duke. They stayed mainly in Toulouse, where Smith began working on a book (eventually to be The Wealth of Nations) as an antidote to the excruciating boredom of the provinces. After 18 months of ennui he was rewarded with a two-month sojourn in Geneva, where he met Voltaire, for whom he had the profoundest respect, thence to Paris, where Hume, then secretary to the British embassy, introduced Smith to the great literary salons of the French Enlightenment. There he met a group of social reformers and theorists headed by François Quesnay, who called themselves les économistes but are known in history as the physiocrats. There is some controversy as to the precise degree of influence the physiocrats exerted on Smith, but it is known that he thought sufficiently well of Quesnay to have considered dedicating The Wealth of Nations to him, had not the French economist died before publication.
The stay in Paris was cut short by a shocking event. The younger brother of the duke of Buccleuch, who had joined them in Toulouse, took ill and perished despite Smith’s frantic ministrations. Smith and his charge immediately returned to London. Smith worked in London until the spring of 1767 with Lord Townshend, a period during which he was elected a fellow of the Royal Society and broadened still further his intellectual circle to include Edmund Burke, Samuel Johnson, Edward Gibbon, and perhaps Benjamin Franklin. Late that year he returned to Kirkcaldy, where the next six years were spent dictating and reworking The Wealth of Nations, followed by another stay of three years in London, where the work was finally completed and published in 1776.
The Wealth of Nations
Despite its renown as the first great work in political economy, The Wealth of Nations is in fact a continuation of the philosophical theme begun in The Theory of Moral Sentiments. The ultimate problem to which Smith addresses himself is how the inner struggle between the passions and the “impartial spectator”—explicated in Moral Sentiments in terms of the single individual—works its effects in the larger arena of history itself, both in the long-run evolution of society and in terms of the immediate characteristics of the stage of history typical of Smith’s own day.
The answer to this problem enters in Book V, in which Smith outlines the four main stages of organization through which society is impelled, unless blocked by wars, deficiencies of resources, or bad policies of government: the original “rude” state of hunters, a second stage of nomadic agriculture, a third stage of feudal, or manorial, “farming,” and a fourth and final stage of commercial interdependence.
It should be noted that each of these stages is accompanied by institutions suited to its needs. For example, in the age of the huntsman, “there is scarce any property…; so there is seldom any established magistrate or any regular administration of justice.” With the advent of flocks there emerges a more complex form of social organization, comprising not only “formidable” armies but the central institution of private property with its indispensable buttress of law and order as well. It is the very essence of Smith’s thought that he recognized this institution, whose social usefulness he never doubted, as an instrument for the protection of privilege, rather than one to be justified in terms of natural law: “Civil government,” he wrote, “so far as it is instituted for the security of property, is in reality instituted for the defence of the rich against the poor, or of those who have some property against those who have none at all.” Finally, Smith describes the evolution through feudalism into a stage of society requiring new institutions, such as market-determined rather than guild-determined wages and free rather than government-constrained enterprise. This later became known as laissez-faire capitalism; Smith called it the system of perfect liberty.
There is an obvious resemblance between this succession of changes in the material basis of production, each bringing its requisite alterations in the superstructure of laws and civil institutions, and the Marxian conception of history. Though the resemblance is indeed remarkable, there is also a crucial difference: in the Marxian scheme the engine of evolution is ultimately the struggle between contending classes, whereas in Smith’s philosophical history the primal moving agency is “human nature” driven by the desire for self-betterment and guided (or misguided) by the faculties of reason.
Society and the “invisible hand”
The theory of historical evolution, although it is perhaps the binding conception of The Wealth of Nations, is subordinated within the work itself to a detailed description of how the “invisible hand” actually operates within the commercial, or final, stage of society. This becomes the focus of Books I and II, in which Smith undertakes to elucidate two questions. The first is how a system of perfect liberty, operating under the drives and constraints of human nature and intelligently designed institutions, will give rise to an orderly society. The question, which had already been considerably elucidated by earlier writers, required both an explanation of the underlying orderliness in the pricing of individual commodities and an explanation of the “laws” that regulated the division of the entire “wealth” of the nation (which Smith saw as its annual production of goods and services) among the three great claimant classes—labourers, landlords, and manufacturers.
This orderliness, as would be expected, was produced by the interaction of the two aspects of human nature, its response to its passions and its susceptibility to reason and sympathy. But whereas The Theory of Moral Sentiments had relied mainly on the presence of the “inner man” to provide the necessary restraints to private action, in The Wealth of Nations one finds an institutional mechanism that acts to reconcile the disruptive possibilities inherent in a blind obedience to the passions alone. This protective mechanism is competition, an arrangement by which the passionate desire for bettering one’s condition—“a desire that comes with us from the womb, and never leaves us until we go into the grave”—is turned into a socially beneficial agency by pitting one person’s drive for self-betterment against another’s.
It is in the unintended outcome of this competitive struggle for self-betterment that the invisible hand regulating the economy shows itself, for Smith explains how mutual vying forces the prices of commodities down to their “natural” levels, which correspond to their costs of production. Moreover, by inducing labour and capital to move from less to more profitable occupations or areas, the competitive mechanism constantly restores prices to these “natural” levels despite short-run aberrations. Finally, by explaining that wages and rents and profits (the constituent parts of the costs of production) are themselves subject to this same discipline of self-interest and competition, Smith not only provided an ultimate rationale for these “natural” prices but also revealed an underlying orderliness in the distribution of income itself among workers, whose recompense was their wages; landlords, whose income was their rents; and manufacturers, whose reward was their profits.
Economic growth
Smith’s analysis of the market as a self-correcting mechanism was impressive. But his purpose was more ambitious than to demonstrate the self-adjusting properties of the system. Rather, it was to show that, under the impetus of the acquisitive drive, the annual flow of national wealth could be seen to grow steadily.
Smith’s explanation of economic growth, although not neatly assembled in one part of The Wealth of Nations, is quite clear. The core of it lies in his emphasis on the division of labour (itself an outgrowth of the “natural” propensity to trade) as the source of society’s capacity to increase its productivity. The Wealth of Nations opens with a famous passage describing a pin factory in which 10 persons, by specializing in various tasks, turn out 48,000 pins a day, compared with the few pins, perhaps only 1, that each could have produced alone. But this all-important division of labour does not take place unaided. It can occur only after the prior accumulation of capital (or stock, as Smith calls it), which is used to pay the additional workers and to buy tools and machines.
The drive for accumulation, however, brings problems. The manufacturer who accumulates stock needs more labourers (since labour-saving technology has no place in Smith’s scheme), and, in attempting to hire them, he bids up their wages above their “natural” price. Consequently, his profits begin to fall, and the process of accumulation is in danger of ceasing. But now there enters an ingenious mechanism for continuing the advance: in bidding up the price of labour, the manufacturer inadvertently sets into motion a process that increases the supply of labour, for “the demand for men, like that for any other commodity, necessarily regulates the production of men.” Specifically, Smith had in mind the effect of higher wages in lessening child mortality. Under the influence of a larger labour supply, the wage rise is moderated and profits are maintained; the new supply of labourers offers a continuing opportunity for the manufacturer to introduce a further division of labour and thereby add to the system’s growth.
Here then was a “machine” for growth—a machine that operated with all the reliability of the Newtonian system with which Smith was quite familiar. Unlike the Newtonian system, however, Smith’s growth machine did not depend for its operation on the laws of nature alone. Human nature drove it, and human nature was a complex rather than a simple force. Thus, the wealth of nations would grow only if individuals, through their governments, did not inhibit this growth by catering to the pleas for special privilege that would prevent the competitive system from exerting its benign effect. Consequently, much of The Wealth of Nations, especially Book IV, is a polemic against the restrictive measures of the “mercantile system” that favoured monopolies at home and abroad. Smith’s system of “natural liberty,” he is careful to point out, accords with the best interests of all but will not be put into practice if government is entrusted to, or heeds, “the mean rapacity, the monopolizing spirit of merchants and manufacturers, who neither are, nor ought to be, the rulers of mankind.”
The Wealth of Nations is therefore far from the ideological tract it is often supposed to be. Although Smith preached laissez-faire (with important exceptions), his argument was directed as much against monopoly as against government; and although he extolled the social results of the acquisitive process, he almost invariably treated the manners and maneuvers of businessmen with contempt. Nor did he see the commercial system itself as wholly admirable. He wrote with discernment about the intellectual degradation of the worker in a society in which the division of labour has proceeded very far; by comparison with the alert intelligence of the husbandman, the specialized worker “generally becomes as stupid and ignorant as it is possible for a human being to become.”
In all of this, it is notable that Smith was writing in an age of preindustrial capitalism. He seems to have had no real presentiment of the gathering Industrial Revolution, harbingers of which were visible in the great ironworks only a few miles from Edinburgh. He had nothing to say about large-scale industrial enterprise, and the few remarks in The Wealth of Nations concerning the future of joint-stock companies (corporations) are disparaging. Finally, one should bear in mind that, if growth is the great theme of The Wealth of Nations, it is not unending growth. Here and there in the treatise are glimpses of a secularly declining rate of profit; and Smith mentions as well the prospect that when the system eventually accumulates its “full complement of riches”—all the pin factories, so to speak, whose output could be absorbed—economic decline would begin, ending in an impoverished stagnation.
The Wealth of Nations was received with admiration by Smith’s wide circle of friends and admirers, although it was by no means an immediate popular success. The work finished, Smith went into semiretirement. The year following its publication he was appointed commissioner both of customs and of salt duties for Scotland, posts that brought him £600 a year. He thereupon informed his former charge that he no longer required his pension, to which Buccleuch replied that his sense of honour would never allow him to stop paying it. Smith was therefore quite well off in the final years of his life, which were spent mainly in Edinburgh with occasional trips to London or Glasgow (which appointed him a rector of the university). The years passed quietly, with several revisions of both major books but with no further publications. He died at the age of 67, full of honours and recognition, and was buried in the churchyard at Canongate with a simple monument stating that Adam Smith, author of The Wealth of Nations, lay there.
Assessment
Beyond the few facts of his life, which can be embroidered only in detail, exasperatingly little is known about the man. Smith never married, and almost nothing is known of his personal side. Moreover, it was the custom of his time to destroy rather than to preserve the private files of illustrious men, with the unhappy result that much of Smith’s unfinished work, as well as his personal papers, was destroyed (some as late as 1942). Only one portrait of Smith survives, a profile medallion by James Tassie; it gives a glimpse of the older man with his somewhat heavy-lidded eyes, aquiline nose, and a hint of a protrusive lower lip. “I am a beau in nothing but my books,” Smith once told a friend to whom he was showing his library of some 3,000 volumes.
From various accounts, he was also a man of many peculiarities, which included a stumbling manner of speech (until he had warmed to his subject), a gait described as “vermicular,” and above all an extraordinary and even comic absence of mind. On the other hand, contemporaries wrote of a smile of “inexpressible benignity” and of his political tact and dispatch in managing the sometimes acerbic business of the Glasgow faculty.
Certainly, he enjoyed a high measure of contemporary fame; even in his early days at Glasgow his reputation attracted students from nations as distant as Russia, and his later years were crowned not only with expressions of admiration from many European thinkers but by a growing recognition among British governing circles that his work provided a rationale of inestimable importance for practical economic policy.
Over the years, Smith’s lustre as a social philosopher has escaped much of the weathering that has affected the reputations of other first-rate political economists. Although he was writing for his generation, the breadth of his knowledge, the cutting edge of his generalizations, and the boldness of his vision have never ceased to attract the admiration of all social scientists, economists in particular. Couched in the spacious, cadenced prose of his period, rich in imagery and crowded with life, The Wealth of Nations projects a sanguine but never sentimental image of society. Never so finely analytic as David Ricardo nor so stern and profound as Karl Marx, Smith is the very epitome of the Enlightenment: hopeful but realistic, speculative but practical, always respectful of the classical past but ultimately dedicated to the great discovery of his age—progress.
It appears to me that if one wants to make progress in mathematics, one should study the masters and not the pupils. - Niels Henrik Abel.
Nothing is better than reading and gaining more and more knowledge - Stephen William Hawking.
Offline
160. Cecil B. DeMille
Cecil B. DeMille, in full Cecil Blount DeMille (born August 12, 1881, Ashfield, Massachusetts, U.S.—died January 21, 1959, Hollywood, Los Angeles, California), American motion-picture producer-director whose use of spectacle attracted vast audiences and made him a dominant figure in Hollywood for almost five decades.
Long before he made his first sound picture, DeMille had become a cinema legend for his efforts in the development of silent movies from shorts to feature-length productions and in helping to establish Hollywood as the new centre of the filmmaking industry. Unlike such other great directors of the silents as D.W. Griffith and Mack Sennett, DeMille easily made the transition to sound pictures, continuing to be productive—and profitable—well into the 1950s.
Early life and silent films: The Squaw Man to The Godless Girl
DeMille was the son of the cleric and playwright Henry Churchill DeMille. He was raised by his mother after his father died when he was 12, and he was later sent to the Pennsylvania Military College. He enrolled in New York’s American Academy of Dramatic Arts in 1898, and after graduating he debuted as an actor in 1900. He was soon collaborating with his brother, playwright William Churchill DeMille.
DeMille’s theatrical career was marked by long strings of failures, and he was better known for being William’s brother than for any of his own performances or plays. Seeking a change, in 1913 he joined his friend and collaborator producer Jesse Lasky, businessman (and Lasky’s brother-in-law) Samuel Goldfish (later Goldwyn), and attorney Arthur Friend in forming the Jesse L. Lasky Feature Play Company. DeMille was director-general in the new film company. His first film was a western, The Squaw Man (1914), about the love between an English nobleman and the Indian woman who dies for him. It was one of the first full-length feature films produced in Hollywood. The film was an instant success, assuring the future of the Lasky Company. Five more features emerged in 1914 under DeMille’s direction, including The Virginian; he had another 12 to his credit in 1915, including Carmen (the first of six films he made starring popular opera singer Geraldine Farrar) and The Girl of the Golden West.
The Cheat (1915) and The Golden Chance (1915) were shot simultaneously by DeMille. In The Cheat, a spendthrift socialite (Fannie Ward) turns to a Japanese businessman (Sessue Hayakawa) to recoup the charity money she has embezzled. In The Golden Chance, a poor seamstress (Cleo Ridgely) is given the opportunity to play the part of a rich woman. Both films were noted for their expressive use of lighting, with much of the screen in shadow.
The Lasky Company merged with Adolph Zukor’s Famous Players in 1916 to form Famous Players–Lasky (later Paramount Pictures). There DeMille made his first historical epic, Joan the Woman (1916), with Farrar playing Joan of Arc, and a remake of The Squaw Man (1918).
DeMille’s ability to give the public what it wanted soon made him a “name” director in the days when directors were virtually unknown. He made comedies and melodramas about married life that reflected the postwar freedom from moral restraint, beginning with Old Wives for New (1918). These films also made a star of Gloria Swanson, who made six films with DeMille, beginning with Don’t Change Your Husband (1919), and featured the lavish costumes and opulent sets that marked his later epics.
DeMille next produced his first biblical epics, which featured spectacular crowd scenes and sets. The Ten Commandments (1923) has two stories, the first being that of the Exodus and the second being about a conflict in modern times between two brothers, one who is a Christian and the other who rejects religion. Despite the commercial success of The Ten Commandments, budget overruns on it and other films strained DeMille’s relations with Zukor and Paramount. He left Paramount in 1925 and formed his own production company, Cecil B. DeMille Pictures, where he made four movies. The most commercially successful was The King of Kings (1927), a life of Christ that was one of the most popular films of the silent era. The company’s last film and his last silent film, The Godless Girl (1929), was about atheism sweeping through a high school and was also an indictment of the harsh conditions in juvenile reform schools.
Talking pictures: Dynamite to Union Pacific
DeMille joined Metro-Goldwyn-Mayer (MGM) in 1928. In Dynamite (1929), his first talking picture, a frivolous society girl marries a poor death-row inmate to retain her inherited fortune, but her plans for a brief marriage are upset when he is proved innocent. Madame Satan (1930) boasted a typically extravagant DeMille finale: a costume party held on a zeppelin over New York is struck by a bolt of lightning, necessitating a mass exit via parachutes. However, the box-office receipts were weak, and they did not improve much for his third version of The Squaw Man (1931).
MGM and DeMille let their disappointing association dissolve, and he approached Paramount with an epic about the persecution of Christians under the dissolute emperor Nero, The Sign of the Cross (1932), for which he was willing to pay half of the $650,000 budget. The combination of lurid debauchery with religious uplift was enormously successful. The film grossed $2.9 million, and he remained at Paramount for the rest of his career.
This Day and Age (1933) was an original turn on the gangster saga, with a killer dealt justice for his crimes by a group of intrepid high-school vigilantes. Four Frightened People (1934) was also atypical for DeMille—a survival story in which four Americans (Claudette Colbert, Herbert Marshall, Mary Boland, and William Gargan) flee a plague outbreak on their ship only to try to survive the rigours of the Malayan jungle (filmed on location in Hawaii).
With Cleopatra (1934) DeMille returned to the historical spectacular with which he would forever after be associated. Here Cleopatra (Colbert) exercises her wiles on Marc Antony (Henry Wilcoxon) and Julius Caesar (Warren William). The Crusades (1935) was another lavish spectacle, with Loretta Young as Berengaria of Navarre and Wilcoxon as Richard the Lionheart, but it was a box-office disappointment.
DeMille turned to American history for his next films. In The Plainsman (1936), Gary Cooper and Jean Arthur starred as the romantically involved Wild Bill Hickok and Calamity Jane. It was DeMille’s biggest box-office success since returning to Paramount. The Buccaneer (1938) was about privateer Jean Lafitte (Fredric March) and the Battle of New Orleans. Union Pacific (1939) was an account of the building of the transcontinental railroad and starred Joel McCrea and Barbara Stanwyck.
Films of the 1940s and 1950s: North West Mounted Police to The Ten Commandments
North West Mounted Police (1940) was DeMille’s first colour film. Gary Cooper played a Texas Ranger who travels to Canada to hunt a fugitive, and it was Paramount’s biggest hit of 1940. Reap the Wild Wind (1942) was another smash; John Wayne and Raymond Massey starred as competing salvagers in the Florida Keys (circa 1840) who battle storms, shipwrecks, and a giant squid.
In The Story of Dr. Wassell (1944) a navy doctor (Cooper) saves nine wounded men during World War II by sneaking them past the Japanese to the safety of Australia. DeMille invited Cooper back for Unconquered (1947) to play a militia captain during the French and Indian War who rescues a convict (Paulette Goddard) from indentured servitude while readying for the attack of the Seneca nation on Fort Pitt. The $4 million epic incurred a huge loss for Paramount.
DeMille rebounded with Samson and Delilah (1949), a profitable epic whose $11 million gross ignited a mania in Hollywood for biblical films. After appearing as himself with his former protégée Gloria Swanson in the memorable finale to Billy Wilder’s Sunset Boulevard, he made The Greatest Show on Earth (1952), a salute to the circus starring Charlton Heston and James Stewart. It received the Academy Award for best picture, and DeMille received his only Oscar nomination for best director.
DeMille’s final movie, The Ten Commandments (1956), was a remake of his 1923 film but without the modern-day story. Heston starred (in his best-known role) as Moses and Yul Brynner as his foe the Pharaoh Ramses. The vast scale of The Ten Commandments (particularly in the scenes of the Israelites leaving Egypt and the parting of the Red Sea), the Oscar-winning special effects, and the larger-than-life performances have made it the film for which DeMille is best remembered.
DeMille’s Autobiography was published in 1959. It acknowledged the strong and assertive personality for which he was known: he was the first director to use a megaphone on the set and the first to install a loudspeaker system for issuing orders. Apart from his film work, from 1936 to 1945 he appeared on radio in Lux Radio Theatre, a popular weekly series of adaptations of recent motion pictures. He was also noted for his right-wing political views and strenuous opposition to labour unions.
Although critics often dismissed DeMille’s films as devoid of artistic merit, he was conspicuously successful in a genre—the epic—that he made distinctively his own. His honours included a special Academy Award (1949) for “brilliant showmanship” and the Irving G. Thalberg Award (1952).
161. Bhāskara II, also called Bhāskarācārya or Bhaskara the Learned (born 1114, Biddur, India—died c. 1185, probably Ujjain), the leading mathematician of the 12th century, who wrote the first work with full and systematic use of the decimal number system.
Bhāskara II was the lineal successor of the noted Indian mathematician Brahmagupta (598–c. 665) as head of an astronomical observatory at Ujjain, the leading mathematical centre of ancient India. The II has been attached to his name to distinguish him from the 7th-century astronomer of the same name.
In Bhāskara II’s mathematical works (written in verse, like nearly all Indian mathematical classics), particularly Līlāvatī (“The Beautiful”) and Bījagaṇita (“Seed Counting”), he not only used the decimal system but also compiled problems from Brahmagupta and others. He filled many of the gaps in Brahmagupta’s work, especially in obtaining a general solution to the Pell equation x² − ny² = 1 and in giving many particular solutions (e.g., x² − 61y² = 1, which has the solution x = 1,766,319,049 and y = 226,153,980; the French mathematician Pierre de Fermat proposed this same problem as a challenge to his friend Frenicle de Bessy five centuries later, in 1657). Bhāskara II anticipated the modern convention of signs (minus by minus makes plus, minus by plus makes minus) and evidently was the first to gain some understanding of the meaning of division by zero, for he specifically stated that the value of 3/0 is an infinite quantity, though his understanding seems to have been limited, for he also stated wrongly that a/0 × 0 = a. Bhāskara II used letters to represent unknown quantities, much as in modern algebra, and solved indeterminate equations of the 1st and 2nd degrees. He reduced quadratic equations to a single type and solved them, and he investigated regular polygons up to those having 384 sides, thus obtaining a good approximate value of π = 3.141666.

In other works, notably Siddhāntaśiromaṇi (“Head Jewel of Accuracy”) and Karaṇakutūhala (“Calculation of Astronomical Wonders”), he wrote on his astronomical observations of planetary positions, conjunctions, eclipses, cosmography, geography, and the mathematical techniques and astronomical equipment used in these studies. Bhāskara II was also a noted astrologer, and, according to a legend first recorded in a 16th-century Persian translation, he named his first work, Līlāvatī, after his daughter in order to console her. He had tried to determine the best time for Līlāvatī’s marriage by using a water clock consisting of a cup with a small hole in the bottom floating in a larger vessel. The cup would sink at the beginning of the correct hour, but Līlāvatī looked into the water clock and a pearl fell from her clothing, plugging the hole. The cup never sank, depriving her of her only chance for marriage and happiness.
It is unknown how true this legend is, but some problems in Līlāvatī are addressed to women, using such feminine vocatives as “dear one” or “beautiful one.”
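The Pell-equation example above is easy to check by machine. The sketch below is not Bhāskara II’s chakravala method but a naive brute-force search for the fundamental (smallest) solution, together with a direct verification of the n = 61 values quoted above; the function name is my own invention for illustration.

```python
# Brute-force search for the fundamental solution of Pell's equation
# x^2 - n*y^2 = 1 (n a non-square positive integer), plus a direct check
# of the enormous n = 61 solution attributed to Bhaskara II.
from math import isqrt

def pell_fundamental(n, limit=10**7):
    """Return the (x, y) with smallest y >= 1 satisfying x^2 - n*y^2 = 1."""
    for y in range(1, limit):
        x2 = n * y * y + 1
        x = isqrt(x2)
        if x * x == x2:          # x2 is a perfect square
            return x, y
    raise ValueError("no solution found below limit")

# Small cases have small fundamental solutions...
print(pell_fundamental(2))       # (3, 2): 3^2 - 2*2^2 = 1
print(pell_fundamental(3))       # (2, 1)

# ...but n = 61 has an enormous one (far beyond any naive search of
# Fermat's day), which is why it made a good challenge problem.
x, y = 1766319049, 226153980
print(x * x - 61 * y * y)        # 1
```

The brute-force search is hopeless for n = 61 precisely because the fundamental y has nine digits; the chakravala method reaches it in a handful of steps, which is what makes Bhāskara II’s achievement remarkable.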
162. Confucius, Pinyin romanization Kongfuzi, or Kongzi, Wade-Giles K’ung-fu-tzu, or K’ung-tzu, original name Kongqiu, literary name Zhongni (born 551 bce, Qufu, state of Lu [now in Shandong province, China]—died 479 bce, Lu), China’s most famous teacher, philosopher, and political theorist, whose ideas have influenced the civilization of East Asia.
Confucius’s life, in contrast to his tremendous importance, seems starkly undramatic, or, as a Chinese expression has it, it seems “plain and real.” The plainness and reality of Confucius’s life, however, underlines that his humanity was not revealed truth but an expression of self-cultivation, of the ability of human effort to shape its own destiny. The faith in the possibility of ordinary human beings to become awe-inspiring sages and worthies is deeply rooted in the Confucian heritage, and the insistence that human beings are teachable, improvable, and perfectible through personal and communal endeavour is typically Confucian.
Although the facts about Confucius’s life are scanty, they do establish a precise time frame and historical context. Confucius was born in the 22nd year of the reign of Duke Xiang of Lu (551 bce). The traditional claim that he was born on the 27th day of the eighth lunar month has been questioned by historians, but September 28 is still widely observed in East Asia as Confucius’s birthday. It is an official holiday, “Teachers’ Day,” in Taiwan.
Confucius was born in Qufu in the small feudal state of Lu in what is now Shandong province, which was noted for its preservation of the traditions of ritual and music of the Zhou civilization. His family name was Kong and his personal name Qiu, but he is referred to as either Kongzi or Kongfuzi (Master Kong) throughout Chinese history. The adjectival “Confucian,” derived from the Latinized Confucius, is not a meaningful term in Chinese, nor is the term Confucianism, which was coined in Europe as recently as the 18th century.
Confucius’s ancestors were probably members of the aristocracy who had become virtually poverty-stricken commoners by the time of his birth. His father died when Confucius was only three years old. Instructed first by his mother, Confucius then distinguished himself as an indefatigable learner in his teens. He recalled toward the end of his life that at age 15 his heart was set upon learning. A historical account notes that, even though he was already known as an informed young scholar, he felt it appropriate to inquire about everything while visiting the Grand Temple.
Confucius had served in minor government posts managing stables and keeping books for granaries before he married a woman of similar background when he was 19. It is not known who Confucius’s teachers were, but he made a conscientious effort to find the right masters to teach him, among other things, ritual and music. His mastery of the six arts—ritual, music, archery, charioteering, calligraphy, and arithmetic—and his familiarity with the classical traditions, notably poetry and history, enabled him to start a brilliant teaching career in his 30s.
Confucius is known as the first teacher in China who wanted to make education broadly available and who was instrumental in establishing the art of teaching as a vocation, indeed as a way of life. Before Confucius, aristocratic families had hired tutors to educate their sons in specific arts, and government officials had instructed their subordinates in the necessary techniques, but he was the first person to devote his whole life to learning and teaching for the purpose of transforming and improving society. He believed that all human beings could benefit from self-cultivation. He inaugurated a humanities program for potential leaders, opened the doors of education to all, and defined learning not merely as the acquisition of knowledge but also as character building.
For Confucius the primary function of education was to provide the proper way of training exemplary persons (junzi), a process that involved constant self-improvement and continuous social interaction. Although he emphatically noted that learning was “for the sake of the self” (the end of which was self-knowledge and self-realization), he found public service integral to true education. Confucius confronted learned hermits who challenged the validity of his desire to serve the world; he resisted the temptation to “herd with birds and animals,” to live apart from the human community, and opted to try to transform the world from within. For decades Confucius tried to be actively involved in politics, wishing to put his humanist ideas into practice through governmental channels.
In his late 40s and early 50s Confucius served first as a magistrate, then as an assistant minister of public works, and eventually as minister of justice in the state of Lu. It is likely that he accompanied the king of Lu as his chief minister on one of the diplomatic missions. Confucius’s political career was, however, short-lived. His loyalty to the king alienated him from the power holders of the time, the large Ji families, and his moral rectitude did not sit well with the king’s inner circle, who enraptured the king with sensuous delight. At 56, when he realized that his superiors were uninterested in his policies, Confucius left the country in an attempt to find another feudal state to which he could render his service. Despite his political frustration he was accompanied by an expanding circle of students during this self-imposed exile of almost 12 years. His reputation as a man of vision and mission spread. A guardian of a border post once characterized him as the “wooden tongue for a bell” of the age, sounding heaven’s prophetic note to awaken the people (Analects, 3:24). Indeed, Confucius was perceived as the heroic conscience who knew realistically that he might not succeed but, fired by a righteous passion, continuously did the best he could. At the age of 67 he returned home to teach and to preserve his cherished classical traditions by writing and editing. He died in 479 bce, at the age of 73. According to the Records of the Historian, 72 of his students mastered the “six arts,” and those who claimed to be his followers numbered 3,000.
163. Dr. Harvey Cushing earned a worldwide reputation as the father of modern neurological surgery, bringing bold and novel surgical innovations to medicine. Descended from a long line of medical practitioners, Cushing was always expected to enter a similar field. Many of the tools, techniques and procedures used in the operating theater today were developed by Harvey Cushing in the early 20th century. He defied medical tradition by operating on the most important functioning system in the human body: the central nervous system. He identified the deadly Cushing’s disease and was recognized for his contributions to surgery and science. The Harvey Cushing Society, a first-of-its-kind neurosurgical association, was set up in honor of the prominent neurosurgeon. With his expertise, innovations and discoveries, Harvey Cushing made neurology and neurosurgery one of the most important divisions of medicine all over the world.
Childhood And Early Life
Harvey Williams Cushing was born on April 8, 1869, the youngest of ten children of Bessie Williams and Kirke Cushing. The Cushing family included a long line of accomplished medical practitioners. Harvey’s father, Kirke Cushing, was a physician himself, which is probably how Harvey developed an interest in and eagerness for the subject. In 1891, Harvey graduated with a Bachelor of Arts degree from Yale University, one of the Ivy League institutions. He then studied medicine at Harvard Medical School and earned his M.D. degree in 1895.
Career
After Cushing graduated from Harvard, he took up an internship at the Massachusetts General Hospital in the hope of studying under leading figures in the field. Once he completed his internship, he took a post as surgeon in residence at the Johns Hopkins Hospital in Baltimore, under the guidance of the eminent William Stewart Halsted. After this stint at Johns Hopkins, he studied cerebral surgery under Emil Theodor Kocher at Bern and under Charles Sherrington at Liverpool. This deepened his interest in neurology, and Harvey Cushing began practicing privately in the city of Baltimore.
Around the age of 32, he was made associate professor of surgery at the Johns Hopkins Hospital. He gradually came to oversee almost all surgical cases at the hospital, particularly those pertaining to the central nervous system. At this time he started writing essays and theses on the spine and also contributed to the department of bacteriology. During his tenure, he made important contributions relating to intracerebral pressure and the localization of cerebral functions, theories developed with the help of Emil Theodor Kocher and Charles Sherrington. While he practiced in Baltimore, he also developed methods of conducting surgeries and other intricate procedures under local anesthesia. His use of local anesthesia rose to prominence after he wrote a dissertation on its use in hernia operations. The thesis was well received, and Cushing’s reputation slowly spread to other parts of the world.
Apart from neurosurgery, Cushing studied blood pressure, establishing its importance during surgery and how it could be regulated; his methods remain in practice today. Cushing was appointed chief surgeon at the Peter Bent Brigham Hospital in Boston. His surgical procedures there, performed under anesthesia, proved successful, and in 1912 he also became professor of surgery at Harvard Medical School.
Achievements
In 1913, Cushing was awarded a prestigious Fellowship of the Royal College of Surgeons, and he was elected a Fellow of the American Academy of Arts and Sciences the following year. It was during this period that he described Cushing’s syndrome, tracing it to a tumor of the pituitary gland. From 1917 to 1919, during World War I, Cushing served as director of a U.S. base hospital attached to the British Expeditionary Force, and he was given the rank of colonel in the U.S. Army Medical Corps. His career was advancing at a remarkable pace, and Cushing continued contributing heavily to the field of neurology with his research and theses. Late in his career, Cushing was awarded the Pulitzer Prize in the Biography or Autobiography category for his life of Sir William Osler. He was also elected a Fellow of the Royal Society of London.
Later Years
Towards the end of his fast-paced career, Harvey Cushing retired from Harvard and worked briefly at the Yale University School of Medicine. He wrote on the polyglandular syndromes and earned the title of “father of modern neurosurgery.” He emphasized the need for the sphygmomanometer to measure blood pressure during surgery. This was one of his greatest contributions, ranking alongside his making neurosurgery a realistic procedure and his discovery of Cushing’s disease.
Personal Life
Harvey Cushing married Katherine Stone Crowell, his childhood sweetheart, on June 10, 1902, and they had five children in the following years. Cushing was said to have been very interested in fishing, an interest that grew after his teachers took him on a high-school expedition to the Great Lakes. Ever since, Cushing made fishing a hobby and was reportedly seen fishing on countless occasions.
Death And Legacy
Ironically for a neurosurgeon, he succumbed to a brain disorder, dying of complications associated with a cyst in the third ventricle of the brain. Cushing died on October 7, 1939, at the age of 70, leaving behind a rich legacy for succeeding generations. By the early 20th century, he had already developed pioneering surgical techniques for the brain and had made full use of state-of-the-art technology such as X-rays and electrical equipment to understand complex structures of the brain such as the cortex. He stressed the importance of anesthesia, which became a staple practice in medical institutions all over the world. His famous publication “The Basophil Adenomas of the Pituitary Body and Their Clinical Manifestations” described one of his most important discoveries relating to the pituitary gland. Apart from winning countless awards during his lifetime, he was honored in 1988, when the United States Postal Service commemorated his medical contributions with a 45-cent stamp in its “Great Americans” series. Medical tools developed by Cushing were also given his name, such as the Cushing forceps, and a medical library at Yale University is dedicated to Harvey Cushing. His strong dedication and passion for neurosurgery brought him to the forefront and made him a legendary surgeon of his time.
164. Cesar Chavez (born César Estrada Chávez; March 31, 1927 – April 23, 1993) was an American labor leader and civil rights activist who, with Dolores Huerta, co-founded the National Farm Workers Association (later the United Farm Workers union, UFW) in 1962. Originally a Mexican American farm worker, Chavez became the best known Latino American civil rights activist, and was strongly promoted by the American labor movement, which was eager to enroll Hispanic members. His public-relations approach to unionism and aggressive but nonviolent tactics made the farm workers' struggle a moral cause with nationwide support. By the late 1970s, his tactics had forced growers to recognize the UFW as the bargaining agent for 50,000 field workers in California and Florida. However, by the mid-1980s membership in the UFW had dwindled to around 15,000.
During his lifetime, Colegio Cesar Chavez was one of the few institutions named in his honor, but after his death he became a major historical icon for the Latino community, with many schools, streets, and parks being named after him. He has since become an icon for organized labor and leftist politics, symbolizing support for workers and for Hispanic empowerment based on grass roots organizing. He is also famous for popularizing the slogan "Sí, se puede" (Spanish for "Yes, one can" or, roughly, "Yes, it can be done"), which was adopted as the 2008 campaign slogan of Barack Obama. His supporters say his work led to numerous improvements for union laborers. Although the UFW faltered a few years after Chavez died in 1993, he became an iconic "folk saint" in the pantheon of Mexican Americans. His birthday, March 31, has become Cesar Chavez Day, a state holiday in California, Colorado, and Texas.
165. Martin Heinrich Klaproth
(b. Wernigerode, Germany, 1 December 1743; d. Berlin, Germany, 1 January 1817)
“Suffer and hope”—with these words Klaproth in 1765 captured the essence of his youth. The third son of Johann Julius Klaproth, a poor but respected tailor with pietistic leanings, he had been intended for the clergy. Shortly after his fifteenth birthday, however, an unpleasant incident apparently forced him to drop out of Wernigerode’s Latin school. Deciding to take up pharmacy, probably because of its connection with the natural sciences, Klaproth became an apprentice in a Quedlinburg apothecary shop in 1759. His master worked him hard, giving him little, if any, theoretical training and less spare time. In 1766, two years after becoming a journeyman, he moved, in the same capacity, to Hannover. There, at last, he had the opportunity to begin transcending pharmacy. Choosing chemistry, he read the texts of J. F. Cartheuser and J. R. Spielmann and conducted many minor investigations. After two years in Hannover, followed by two and a half years in Berlin and a few months in Danzig, Klaproth settled at Berlin in 1771. During his first decade there he supported himself by managing the apothecary shop of a deceased friend, the minor chemist Valentin Rose the elder. In 1780 he finally gained self-sufficiency—a fortunate marriage to A. S. Marggraf’s wealthy niece enabled him to purchase his own shop.
In the meantime Klaproth had continued his pursuit of chemistry, studying not only by himself but also, it seems, with Marggraf. He ventured into print for the first time in 1776 when a friend persuaded him to contribute a chapter on the chemical properties of copal to a book on the natural history of that resinous substance. By 1780 he felt sufficiently knowledgeable to request permission to give private lectures on chemistry under the auspices of Berlin’s Medical-Surgical College. The college’s professors, who were eager to avoid such competition for student fees, blocked his request. In 1782, after publishing several articles on chemical topics and securing the backing of influential Masonic brothers, Klaproth was in a stronger position. That year he was named to the second seat for pharmacy on Prussia’s highest medical board and soon afterward was granted permission to lecture on chemistry. Thus, at the relatively advanced age of thirty-nine, he embarked on his administrative and teaching career.
Over the years Klaproth moved up in the Prussian medical bureaucracy from assessor (1782-1797) to councillor (1797-1799), to high councillor (1799-1817). Meanwhile he secured teaching posts, serving as private lecturer at the Medical-Surgical College (1782-1810); teacher of chemistry at the Mining School (1784-1817); professor of chemistry at G. F. von Tempelhoff’s Artillery School and its successors, the Royal Artillery Academy and the General War School (1787-1812?); and full professor of chemistry in the University of Berlin’s Philosophical Faculty (1810-1817). In 1800 Klaproth was appointed to succeed F. K. Achard as the Berlin Academy’s representative for chemistry. No longer needing his apothecary shop, he sold it at a handsome profit and moved into the academy’s new laboratory-residence complex in 1803. Here Klaproth, the tailor’s son who once could only “suffer and hope,” worked until his death from a stroke on New Year’s Day 1817.
Although his wealthy wife and influential friends had helped Klaproth launch his career, it was his accomplishments as a chemist that propelled his subsequent rise. His most important work was in analytical chemistry. Indeed, he was the leading analytical chemist in Europe from the late 1780’s, when he established himself as Bergman’s intellectual successor, until the early 1800’s, when Berzelius gradually took his place. Working with minerals from all parts of the globe, Klaproth discovered or codiscovered zirconium (1789), uranium (1789), titanium (1792), strontium (1793), chromium (1797), mellitic acid (1799), and cerium (1803) and confirmed prior discoveries of tellurium (1798) and beryllium (1798). More consequential than these specific results were Klaproth’s new techniques. For instance, he found that many particularly insoluble minerals could be dissolved if they were first ground to a fine powder and then fused with a carbonate. With his student Valentin Rose the younger he introduced the use of barium nitrate in the decomposition of silicates. He constantly drew attention to the necessity of either avoiding or making allowances for contamination from apparatus and reagents. Most significant, he broke with the tradition of ignoring “small” losses and gains in weight in analytical work. Instead, he used discrepancies of more than a few percentage points as a means of detecting faulty and incomplete analyses. Once satisfied with his procedure for analyzing a mineral, he reported his final results—including the remaining discrepancy. This practice became a convention with the next generation of analysts.
Besides his influence as an analyst, Klaproth played a role of some consequence in the German acceptance of Lavoisier’s theory. In the spring of 1792, after studying his friend S. F. Hermbstädt’s manuscript translation of Lavoisier’s Traité and repeating some of its main experiments, he announced his tentative support for the antiphlogistic system. During the ensuing year he often joined with Hermbstädt in repeating the reduction of mercuric oxide before skeptical and important witnesses. By the summer of 1793 they had discredited F. A. C. Gren and other phlogistonists who denied the accuracy of Lavoisier’s account of the experiment, thereby preparing the way for the success of the antiphlogistic revolution in Germany. In the remaining decades of his life, however, Klaproth avoided taking an active part in the theoretical development of chemistry.
Klaproth’s aversion to theory in no way dampened international enthusiasm for his work. Among the numerous honors that he received were membership in the Royal Society of London (1795) and, far more important, membership as one of six foreign associates in the Institut de France (1804).
166. Gustav Robert Kirchhoff
Born : March 12, 1824, Königsberg, East Prussia
Died : October 17, 1887, Berlin, Germany
Known for : Kirchhoff's circuit laws; Kirchhoff's law of thermal radiation
Notable prizes : Rumford Medal
Gustav Robert Kirchhoff (March 12, 1824 – October 17, 1887) was a German physicist who contributed to the fundamental understanding of electrical circuits, spectroscopy, and the emission and absorption of radiation. His discoveries helped set the stage for the advent of quantum mechanics.
Biography : Birth and early life
Gustav Kirchhoff was born in Königsberg, East Prussia, the son of Friedrich Kirchhoff, a lawyer, and Johanna Henriette Wittke. He attended the Albertus University of Königsberg, where he was taught by the physicist Franz Ernst Neumann. Influenced by Neumann's approach to electricity and magnetism, he made his first contribution to physics while still a student. By applying the laws of conservation of charge and conservation of energy to electrical circuits, he established what are now called Kirchhoff's laws for circuits. By applying these laws, electrical engineers can determine the current flow and voltage in various branches of complex circuitry.
Professional life
Kirchhoff graduated in 1847, and in the same year married Clara Richelot, the daughter of his mathematics professor, Friedrich Julius Richelot. The couple moved to Berlin, where Kirchhoff was awarded his teaching credentials, and remained there until 1850, when he was given a professorship at Breslau.
Spectroscopy
In 1851, Kirchhoff met Robert Wilhelm Bunsen, who remained only briefly in Breslau before accepting a position at Heidelberg in 1851. Kirchhoff moved to Heidelberg in 1854 and began a fruitful collaboration with Bunsen that resulted in the establishment of the field of spectroscopy, involving analysis of the composition of chemical compounds through the spectra they produce.
Intrigued by the different colors produced when various substances were heated in a flame, Bunsen wanted to use the colors to identify chemical elements and compounds. Broadening the concept, Kirchhoff suggested that Bunsen not only pay attention to the immediately visible colors but also study the spectra of color components produced by passing the light from each substance through a prism. Thus was the field of spectroscopy initiated.
In 1859, Kirchhoff noted that dark lines found in the Sun's spectrum were further darkened when the sunlight passed through a sodium compound heated by a Bunsen burner. From this, he concluded that the original dark lines, called Fraunhofer lines after the scientist who discovered them, result from sodium in the Sun's atmosphere. This opened up a new technique for analyzing the chemical composition of stars.
That same year, Kirchhoff researched the manner in which radiation is emitted and absorbed by various substances, and formulated what is now known as Kirchhoff's Law of Thermal Radiation: in a state of thermal equilibrium, the radiation emitted by a body is equal to the radiation absorbed by the body. By 1860, Bunsen and Kirchhoff were able to assign distinct spectral characteristics to a number of metals. Together they discovered caesium (1860) and rubidium (1861) while studying the chemical composition of the Sun via its spectral signature.
In 1862, Kirchhoff introduced the concept of a "black body," a body that is both a perfect emitter and absorber of heat radiation. That same year, Kirchhoff was awarded the Rumford Medal for his work on spectral analysis. Later research on black-body radiation was pivotal in the development of the quantum theories that emerged at the beginning of the twentieth century.
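Kirchhoff's black-body concept can be made concrete numerically. The closed-form radiation law below (Planck's law, 1900) and the Wien displacement law came after Kirchhoff's time, so this is an illustrative sketch of the physics his concept led to, not his own mathematics; the 5778 K figure is the Sun's approximate surface temperature.

```python
import math

# Physical constants (SI units, CODATA values)
H = 6.62607015e-34   # Planck constant, J*s
C = 2.99792458e8     # speed of light, m/s
K_B = 1.380649e-23   # Boltzmann constant, J/K

def planck_radiance(wavelength_m, temp_k):
    """Spectral radiance of an ideal black body (Planck's law, 1900)."""
    a = 2.0 * H * C**2 / wavelength_m**5
    b = H * C / (wavelength_m * K_B * temp_k)
    return a / (math.exp(b) - 1.0)

def wien_peak_m(temp_k):
    """Wavelength of maximum emission (Wien displacement law)."""
    return 2.897771955e-3 / temp_k  # meters

# A black body at the Sun's surface temperature peaks near 500 nm,
# in the green part of the visible spectrum.
peak_nm = wien_peak_m(5778) * 1e9
```

The peak near 500 nm is consistent with the Sun's spectrum that Kirchhoff and Bunsen analyzed, and the full Planck curve is the result that Kirchhoff's black-body idealization made it possible to even ask for.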
Later years
In 1869, Kirchhoff's first wife died, and in 1872 he married Luise Brommel, the superintendent of a medical facility. In 1875, he returned to Berlin to accept a chair in theoretical physics. While there, he came into contact with Max Planck, but disputed Planck's thermodynamic formulations. Planck would later promulgate the energy laws that ushered in the age of quantum mechanics. Kirchhoff continued his research until poor health forced him to retire in 1886. He died in 1887, and was buried at the Saint Matthäus Kirchhof Cemetery in Schöneberg, Berlin.
Details of scientific work :
Circuit laws
Kirchhoff's circuit laws (or circuit rules) are a pair of laws that deal with the conservation of charge and energy in electrical circuits, and were first described in 1845 by Kirchhoff. Widely used in electrical engineering, they are also called Kirchhoff's rules or simply Kirchhoff's laws.
Kirchhoff's Current Law (KCL)
The current law is also called Kirchhoff's first law, Kirchhoff's point rule, Kirchhoff's junction rule, and Kirchhoff's first rule. Based on the principle of conservation of electric charge, it may be stated as:
At any point in an electrical circuit where charge density is not changing in time, the sum of currents flowing toward that point is equal to the sum of currents flowing away from that point.
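The current law stated above can be sketched as a simple numeric check. The current values here are invented purely for illustration:

```python
# KCL at a single node: taking currents flowing INTO the node as positive
# and currents flowing OUT as negative, the signed sum must be zero.
currents_in = [2.0, 3.0]    # amperes entering the node (illustrative values)
currents_out = [1.5, 3.5]   # amperes leaving the node

net = sum(currents_in) - sum(currents_out)  # should be zero

def kcl_holds(inflows, outflows, tol=1e-9):
    """Return True if the node satisfies Kirchhoff's current law."""
    return abs(sum(inflows) - sum(outflows)) < tol
```

A node where the sums disagree (say 1 A in, 2 A out) would violate conservation of charge, which is exactly what the law forbids.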
Kirchhoff's Voltage Law (KVL)
The voltage law is also called Kirchhoff's second law, Kirchhoff's loop rule, and Kirchhoff's second rule. Based on the principle of conservation of energy, it may be stated as:
The directed sum of the electrical potential differences around a circuit must be zero.
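The two laws together let engineers solve complete circuits. As a minimal sketch, the mesh-analysis example below applies KVL to a hypothetical two-loop resistive circuit (the source voltage and resistor values are invented for illustration):

```python
# Mesh (loop) analysis of a two-loop resistive circuit, illustrating KVL.
# Illustrative circuit: a 12 V source drives loop 1 through R1 = 2 ohm;
# R3 = 3 ohm is shared between the loops; loop 2 contains R2 = 4 ohm.
#
# KVL around each loop, with mesh currents i1 and i2, gives:
#   (R1 + R3) * i1 - R3 * i2       = V
#   -R3 * i1      + (R2 + R3) * i2 = 0
V, R1, R2, R3 = 12.0, 2.0, 4.0, 3.0

a11, a12, b1 = R1 + R3, -R3, V
a21, a22, b2 = -R3, R2 + R3, 0.0

# Solve the 2x2 linear system by Cramer's rule.
det = a11 * a22 - a12 * a21
i1 = (b1 * a22 - a12 * b2) / det
i2 = (a11 * b2 - b1 * a21) / det

# Check KVL: the directed sum of potential differences around loop 1
# (source minus the drops across R1 and R3) must be zero.
loop1_sum = V - R1 * i1 - R3 * (i1 - i2)
```

The same zero-sum check holds around loop 2, confirming that the computed mesh currents satisfy both of Kirchhoff's laws simultaneously.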
Spectroscopy research
Kirchhoff contributed greatly to the field of spectroscopy by formalizing three laws that describe the spectral composition of light emitted by incandescent objects, building substantially on the discoveries of David Alter and Anders Jonas Ångström.
Kirchhoff's Three Laws of Spectroscopy:
A hot solid object produces light with a continuous spectrum.
A hot tenuous gas produces light with spectral lines at discrete wavelengths (or specific colors), which depend on the energy levels of the atoms in the gas.
A hot solid object surrounded by a cool tenuous gas (that is, cooler than the hot object) produces light that on passing through the surrounding gas yields an almost continuous spectrum which has gaps at discrete wavelengths depending on the energy levels of the atoms in the gas.
The existence of these discrete lines was later explained by the Bohr model, which helped lead to the development of quantum mechanics.
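The Bohr-model explanation of discrete lines can be illustrated with the standard Rydberg formula for hydrogen, 1/λ = R(1/n_f² − 1/n_i²). This is a textbook result shown here only to demonstrate how fixed energy levels produce lines at specific wavelengths, as Kirchhoff's second law of spectroscopy describes:

```python
# Discrete spectral lines of hydrogen from the Rydberg formula,
# for transitions down to n_final = 2 (the visible Balmer series).
R_INF = 1.0973731568e7  # Rydberg constant, m^-1

def balmer_wavelength_nm(n_initial):
    """Vacuum wavelength (nm) of the hydrogen transition n_initial -> 2."""
    inv_lambda = R_INF * (1.0 / 2**2 - 1.0 / n_initial**2)
    return 1e9 / inv_lambda

# The first Balmer line (n = 3 -> 2) is the red H-alpha line near 656 nm,
# one of the strongest Fraunhofer lines in the solar spectrum.
h_alpha = balmer_wavelength_nm(3)
```

Each allowed value of n_initial yields one discrete line; a hot hydrogen gas emits exactly these wavelengths, while a cooler gas in front of a continuous source absorbs at them, which is Kirchhoff's third law in action.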
Hi Ganesh,
It's an excellent article, which I discovered just after you posted the above message, when it came to the top. I would suggest you make this article "Sticky" here so that it stays locked at the top, and any new member (and any old one too, if they have not seen it) can come and see this interesting information.
Practice makes a man perfect.
There is no substitute to hard work
All of us do not have equal talents but everybody has equal opportunities to build their talents.-APJ Abdul Kalam
Hi iamaditya,
Thanks for the views and comments. Regarding making it 'Sticky', MathsIsFun decides.
167. Thomas Kuhn
Thomas S. Kuhn, in full Thomas Samuel Kuhn (born July 18, 1922, Cincinnati, Ohio, U.S. - died June 17, 1996, Cambridge, Mass.), American historian of science noted for The Structure of Scientific Revolutions (1962), one of the most influential works of history and philosophy written in the 20th century.
Kuhn earned bachelor’s (1943) and master’s (1946) degrees in physics at Harvard University but obtained his Ph.D. (1949) there in the history of science. He taught the history or philosophy of science at Harvard (1951–56), the University of California at Berkeley (1956–64), Princeton University (1964–79), and the Massachusetts Institute of Technology (1979–91).
In his first book, The Copernican Revolution (1957), Kuhn studied the development of the heliocentric theory of the solar system during the Renaissance. In his landmark second book, The Structure of Scientific Revolutions, he argued that scientific research and thought are defined by “paradigms,” or conceptual world-views, that consist of formal theories, classic experiments, and trusted methods. Scientists typically accept a prevailing paradigm and try to extend its scope by refining theories, explaining puzzling data, and establishing more precise measures of standards and phenomena. Eventually, however, their efforts may generate insoluble theoretical problems or experimental anomalies that expose a paradigm’s inadequacies or contradict it altogether. This accumulation of difficulties triggers a crisis that can only be resolved by an intellectual revolution that replaces an old paradigm with a new one. The overthrow of Ptolemaic cosmology by Copernican heliocentrism, and the displacement of Newtonian mechanics by quantum physics and general relativity, are both examples of major paradigm shifts.
Kuhn questioned the traditional conception of scientific progress as a gradual, cumulative acquisition of knowledge based on rationally chosen experimental frameworks. Instead, he argued that the paradigm determines the kinds of experiments scientists perform, the types of questions they ask, and the problems they consider important. A shift in the paradigm alters the fundamental concepts underlying research and inspires new standards of evidence, new research techniques, and new pathways of theory and experiment that are radically incommensurate with the old ones.
Kuhn’s book revolutionized the history and philosophy of science, and his concept of paradigm shifts was extended to such disciplines as political science, economics, sociology, and even to business management. Kuhn’s later works were a collection of essays, The Essential Tension (1977), and the technical study Black-Body Theory and the Quantum Discontinuity (1978).
Can't moderators make an article a "sticky" one? Well, if not, then I'm sure bobbym can do it.
P.S. - Again, an excellent article. Is this your hobby, ganesh, finding good information and writing about it? If yes, then keep it up!
Another doubt: there is a section in MIFF named "Ganesh's Puzzles". I thought you created it, but now I think MIF did. Am I right? If yes, then ask him to create another section where you can store these articles under various topics such as singers, dancers, musicians, scientists, and so on. Eventually you will set up another Wikipedia here!