1424) Hurdling
Summary
Hurdling is the act of jumping over an obstacle at high speed or in a sprint. In the early 19th century, hurdlers ran at and jumped over each hurdle in turn, landing on both feet and checking their forward motion. Today, the dominant step patterns are the 3-step for high hurdles, 7-step for low hurdles, and 15-step for intermediate hurdles. Hurdling is a highly specialized form of obstacle racing and is part of the sport of athletics. In hurdling events, barriers known as hurdles are set at precisely measured heights and distances. Each athlete must pass over the hurdles; passing under or intentionally knocking over a hurdle results in disqualification.
Accidental knocking over of hurdles is not cause for disqualification, but the hurdles are weighted to make doing so disadvantageous. In 1902 the Spalding equipment company sold the Foster Patent Safety Hurdle, a wood hurdle. In 1923 some of the wood hurdles weighed 16 lb (7.3 kg) each. Hurdle design improved in 1935 with the development of the L-shaped hurdle: with this shape, an athlete who hit the hurdle would tip it down and out of the way, clearing the athlete's path. The most prominent hurdles events are the 110 meters hurdles for men, the 100 meters hurdles for women, and the 400 meters hurdles (both sexes) – these three distances are all contested at the Summer Olympics and the World Athletics Championships. The two shorter distances take place on the straight of a running track, while the 400 m version covers one whole lap of a standard oval track. Events over shorter distances are also commonly held at indoor track and field events, ranging from 50 meters hurdles upwards. Women historically competed in the 80 meters hurdles at the Olympics in the mid-20th century. Hurdles races are also part of combined events contests, including the decathlon and heptathlon.
In track races, hurdles are normally 68–107 cm (27–42 in) in height, depending on the age and gender of the hurdler. Events from 50 to 110 meters are technically known as high hurdles races, while longer competitions are low hurdles races. The track hurdles events are forms of sprinting competitions, although the 400 m version is less anaerobic in nature and demands athletic qualities similar to the 800 meters flat race.
A hurdling technique can also be found in the steeplechase, although in this event athletes are also permitted to step on the barrier to clear it. Similarly, in cross country running athletes may hurdle over various natural obstacles, such as logs, mounds of earth, and small streams – this represents the sporting origin of the modern events. Horse racing has its own variant of hurdle racing, with similar principles.
Details
Hurdling is a sport in athletics (track and field) in which a runner races over a series of obstacles called hurdles, which are set a fixed distance apart. Runners must remain in assigned lanes throughout a race, and, although they may knock hurdles down while running over them, a runner who trails a foot or leg alongside a hurdle or knocks one down with a hand is disqualified. The first hurdler to complete the course is the winner.
Hurdling probably originated in England in the early 19th century, where such races were held at Eton College about 1837. In those days hurdlers merely ran at and jumped over each hurdle in turn, landing on both feet and checking their forward motion. Experimentation with numbers of steps between hurdles led to a conventional step pattern for hurdlers—3 steps between each high hurdle, 7 between each low hurdle, and usually 15 between each intermediate hurdle. Further refinements were made by A.C.M. Croome of Oxford University about 1885, when he went over the hurdle with one leg extended straight ahead at the same time giving a forward lunge of the trunk, the basis of modern hurdling technique.
A major improvement in hurdle design was the invention in 1935 of the L-shaped hurdle, replacing the heavier, inverted-T design. In the L-shaped design and its refinement, the curved-L, or rocker hurdle, the base-leg of the L points toward the approaching hurdler. When upset, the hurdle tips down, out of the athlete’s path, instead of tipping up and over as did the inverted-T design.
Modern hurdlers use a sprinting style between hurdles and a double-arm forward thrust and exaggerated forward lean while clearing the hurdle. They then bring the trailing leg through at nearly a right angle to the body, which enables them to continue forward without breaking stride after clearing the hurdle.
Under rules of the International Association of Athletics Federations (IAAF), the world governing body of track-and-field athletics, the standard hurdling distances for men are 110, 200, and 400 metres (120, 220, and 440 yards, respectively). Men’s Olympic distances are 110 metres and 400 metres; the 200-metre race was held only at the 1900 and 1904 Games. The 110-metre race includes 10 high hurdles (1.067 metres [42 inches] high), spaced 9.14 metres (10 yards) apart. The 400-metre race is over 10 intermediate hurdles (91.4 cm [36 inches] high) spaced 35 metres (38.3 yards) apart. The 200-metre race, run occasionally, has 10 low hurdles (76.2 cm [30 inches] high) spaced 18.29 metres (20 yards) apart. Distances and specifications vary somewhat for indoor and scholastic events.
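The 110-metre layout implied by the spacing above can be checked with a little arithmetic. A minimal sketch in Python, assuming the standard 13.72 m run-in from the start line to the first hurdle (a figure not stated in the text above):

```python
# Sketch: positions of the 10 hurdles in the men's 110 m hurdles.
# The 9.14 m (10-yard) spacing and 10-hurdle count come from the text;
# the 13.72 m run-in to the first hurdle is an assumed standard value.
RUN_IN = 13.72        # metres from start line to first hurdle (assumed)
SPACING = 9.14        # metres between hurdles (10 yards)
NUM_HURDLES = 10

positions = [round(RUN_IN + i * SPACING, 2) for i in range(NUM_HURDLES)]
run_out = round(110 - positions[-1], 2)  # last hurdle to the finish line

print(positions[0], positions[-1], run_out)  # 13.72 95.98 14.02
```

The run-in, the nine 9.14 m gaps, and the run-out sum to exactly 110 m, which is one way to see why the imperial 10-yard spacing survived metrication.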
The women’s international distance formerly was 80 metres over 8 hurdles 76.2 cm high. In 1966 the IAAF approved two new hurdle races for women: 100 metres over 10 hurdles 84 cm (33.1 inches) high, to replace the 80-metre event in the 1972 Olympics; and 200 metres (supplanted in 1976 by 400 metres) over 10 hurdles 76.2 cm high.
It appears to me that if one wants to make progress in mathematics, one should study the masters and not the pupils. - Niels Henrik Abel.
Nothing is better than reading and gaining more and more knowledge - Stephen William Hawking.
Offline
1425. Hammer Throw
Summary
The hammer throw is one of the four throwing events in regular track and field competitions, along with the discus throw, shot put and javelin.
The "hammer" used in this sport is not like any of the tools also called by that name. It consists of a metal ball attached by a steel wire to a grip. The size of the ball varies between men's and women's competitions.
Competition
The men's hammer weighs 16 pounds (7.26 kg) and measures 3 ft 11¾ in (121.3 cm) in length, while the women's hammer weighs 4 kg (8.82 lb) and measures 3 ft 11 in (119.4 cm). Like the other throwing events, the competition is decided by who can throw the implement the farthest.
Although commonly thought of as a strength event, technical advancements in the last 30 years have developed hammer throw competition to a point where more focus is on speed in order to gain maximum distance.
The throwing motion starts with the thrower swinging the hammer back-and-forth about two times to generate momentum. The thrower then makes three, four or (rarely) five full rotations using a complex heel-toe foot movement, spinning the hammer in a circular path and increasing its angular velocity with each rotation. Rather than spinning the hammer horizontally, it is instead spun in a plane that angles up towards the direction in which it will be launched. The thrower releases the hammer as its velocity is upward and toward the target.
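As a rough illustration of why building up velocity matters so much, the ideal projectile range formula gives a ballpark release speed for a world-class throw. This is only a sketch: it assumes a 45-degree release angle and ignores air resistance and release height.

```python
import math

# Rough estimate of release speed from throw distance, using the ideal
# projectile range formula R = v^2 * sin(2*theta) / g, solved for v.
# Assumptions: 45-degree release angle, no air resistance, release at
# ground level — so this is a ballpark figure, not an exact value.
def release_speed(distance_m, angle_deg=45.0, g=9.81):
    return math.sqrt(distance_m * g / math.sin(2 * math.radians(angle_deg)))

v = release_speed(86.74)  # distance of Sedykh's 1986 world record
print(f"{v:.1f} m/s")     # roughly 29 m/s, i.e. over 100 km/h
```

Even with these simplifications, the result shows the hammer leaves the thrower's hands at highway speed, which is why the rotational technique emphasises accelerating the ball through every turn.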
Throws are made from a throwing circle. The thrower is not allowed to step outside the throwing circle before the hammer has landed and may only enter and exit from the rear of the throwing circle. The hammer must land within a 34.92° throwing sector that is centered on the throwing circle. The sector angle was chosen because it provides a sector whose bounds are easy to measure and lay out on a field (10 metres out from the center of the ring, 6 metres across). A violation of the rules results in a foul and the throw not being counted.
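The stated geometry can be verified directly: a sector that is 6 metres across at 10 metres from the centre has a half-angle whose sine is 3/10.

```python
import math

# Check the sector geometry described above: at 10 m from the centre of
# the circle, the sector is 6 m across, so the half-width (3 m) over the
# radius (10 m) is the sine of the half-angle.
half_angle = math.asin(3 / 10)              # half-angle in radians
sector_angle = math.degrees(2 * half_angle)
print(f"{sector_angle:.2f} degrees")        # ≈ 34.92
```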
As of 2015 the men's hammer world record is held by Yuriy Sedykh, who threw 86.74 m (284 ft 6¾ in) at the 1986 European Athletics Championships in Stuttgart, West Germany on 30 August. The world record for the women's hammer is held by Anita Włodarczyk, who threw 82.98 m (272 ft 2¾ in) during the Kamila Skolimowska Memorial on 28 August 2016.
Details
Hammer throw is a sport in athletics (track and field) in which a hammer is hurled for distance, using two hands within a throwing circle.
The sport developed centuries ago in the British Isles. Legends trace it to the Tailteann Games held in Ireland about 2000 BCE, when the Celtic hero Cú Chulainn gripped a chariot wheel by its axle, whirled it around his head, and threw it farther than did any other individual. Wheel hurling was later replaced by throwing a boulder attached to the end of a wooden handle. Forms of hammer throwing were practiced among the ancient Teutonic tribes at religious festivals honouring the god Thor, and sledgehammer throwing was practiced in 15th- and 16th-century Scotland and England.
Since 1866 the hammer throw has been a regular part of track-and-field competitions in Ireland, Scotland, and England. The English standardized the event in 1875 by establishing the weight of the hammer at 7.2 kg (16 pounds) and its length at 1,067.5 mm (later changed to a maximum 1,175 mm [46.3 inches]) and by requiring that it be thrown from a circle 2.135 metres (7 feet) in diameter.
The men’s event has been included in the Olympic Games since 1900; the women’s hammer throw made its Olympic debut in 2000. Early hammers had forged-iron heads and wooden handles, but the International Association of Athletics Federations (IAAF) now requires use of a wire-handled spherical weight. The ball is of solid iron or other metal not softer than brass or is a shell of such metal filled with lead or other material. The handle is spring steel wire, with one end attached to the ball by a plain or ball-bearing swivel and the other to a rigid two-hand grip by a loop. The throwing circle is protected by a C-shaped cage for the safety of officials and onlookers.
In the modern hammer throw technique, a thrower makes three full, quick turns of the body before flinging the weight. Strength, balance, and proper timing are essential. The throw is a failure if the athlete steps on or outside the circle or if the hammer lands outside a 40° sector marked on the field from the centre of the circle.
How it works
In another of the throws events, athletes throw a metal ball (16 lb/7.26 kg for men, 4 kg/8.8 lb for women) that is attached to a grip by a steel wire no longer than 1.22 m, while remaining inside a seven-foot (2.135 m) diameter circle.
In order for the throw to be measured, the ball must land inside a marked 35-degree sector and the athlete must not leave the circle before it has landed, and then only from the rear half of the circle.
The thrower usually makes three or four spins before releasing the ball. Athletes will commonly throw six times per competition. In the event of a tie, the winner will be the athlete with the next-best effort.
Quality hammer throwers require speed, strength, explosive power and co-ordination. At major championships the format is typically a qualification session followed by a final.
History
Legend traces the concept of the hammer throw to approximately 2000 BC and the Tailteann Games in Tara, Ireland, where the Celtic warrior Cú Chulainn gripped a chariot wheel by its axle, whirled it around his head and threw it a huge distance.
The wheel was later replaced by a boulder attached to a wooden handle and the use of a sledgehammer is considered to have originated in England and Scotland during the Middle Ages. A 16th century drawing shows the English king Henry VIII throwing a blacksmith’s hammer.
The hammer was first contested by men at the 1900 Olympic Games in Paris. Women first contested the hammer at the Olympics some 100 years later at the Sydney Games. Hungary has a strong tradition in the men’s hammer and has won gold medals at the 1948, 1952, 1968, 1996 and 2012 Games. Poland has been the dominant force in women’s hammer, winning Olympic titles in 2000, 2012 and 2016.
1426) Shot put
Summary
The shot put is a track and field event involving "putting" (pushing rather than throwing) a heavy spherical ball—the shot—as far as possible. The shot put competition for men has been a part of the modern Olympics since their revival in 1896, and women's competition began in 1948.
Details
Shot put is a sport in athletics (track and field) in which a spherical weight is thrown, or put, from the shoulder for distance. It derives from the ancient sport of putting the stone.
The first to use a shot (cannon ball) instead of a stone competitively were British military sports groups. Although the weight varied in early events from 3.63 to 10.9 kg (8 to 24 pounds), a standard, regulation-weight 7.26-kg (16-pound) shot was adopted for men in the first modern Olympic Games (1896) and in international competition. The event was added to the women’s Olympic program in 1948. The weight of the shot used for women’s competition is 4 kg (8.8 pounds); lighter weights are also used in school, collegiate, and veteran competitions.
The shot generally is made of solid iron or brass, although any metal not softer than brass may be used. It is put from a circle 2.135 metres (7 feet) in diameter into a 40° sector as measured from the centre of the circle. The circle has a stop board 10 cm (4 inches) high at its front; if the competitor steps on or out of the circle, the throw is invalidated. The shot is put with one hand and must be held near the chin to start. It may not drop below or behind shoulder level at any time.
Constant improvements in technique have more than doubled record distances. The International Association of Athletics Federations (IAAF) recognizes the first official world record as 9.44 metres (31 feet) by J.M. Mann of the United States in 1876. It had long been conventional to start from a position facing at a right angle to the direction of the put. In the 1950s, however, American Parry O’Brien developed a style of beginning from a position facing backward. Thus he brought the shot around 180°, rather than the usual 90°, and found that the longer he pushed the shot, the farther it would travel. By 1956 O’Brien had doubled Mann’s record with a put of 19.06 metres (62.5 feet), and with this success, his style was almost universally imitated. By 1965 American Randy Matson had pushed the record beyond 21 metres (68 feet); later athletes extended the world mark to more than 23 metres (75 feet), many using a technique in which the putter spins with the shot for more than 360°.
Additional Information
How it works
One of the four traditional throws events in track and field. The shot, a metal ball (7.26kg/16lb for men, 4kg/8.8lb for women), is put – not thrown – with one hand. The aim is to put it as far as possible from a seven-foot diameter (2.135m) circle that has a curved 10-centimetre high toe-board at the front.
In order for the put to be measured, the shot must not drop below the line of the athlete’s shoulders at any stage of the put and must land inside a designated 35-degree sector. The athlete, meanwhile, must not touch the top of the toe-board during their put or leave the circle before the ball has landed, and then only from the rear half of the circle. The results order is determined by distance.
Athletes will commonly throw six times per competition. In the event of a tie, the winner will be the athlete with the next-best effort. A shot putter requires strength, speed, balance and explosive power. At major championships the format is typically a qualification session followed by a final.
History
The Ancient Greeks threw stones as a sport and soldiers are recorded as throwing cannon balls in the Middle Ages, but a version of the modern form of the discipline can be traced to the Highland Games in Scotland during the 19th century where competitors threw a rounded cube, stone or metal weight from behind a line.
The men’s shot put has been part of every modern Olympics since 1896, but women putters had to wait until 1948 before they could compete at the Games.
The US are the most successful shot nation in Olympic history and grabbed gold at every men’s shot competition from 1896 through to 1968 - except two. Poland’s Tomasz Majewski became the third man in Olympic shot history to win back-to-back titles, achieving the feat in 2008 and 2012. Valerie Adams is one of the leading women’s performers in shot history. The New Zealander claimed successive Olympic titles in 2008 and 2012 and snared a silver medal at the 2016 Rio Games.
1427) Pole vault
Summary
Pole vaulting, also known as pole jumping, is a track and field event in which an athlete uses a long and flexible pole, usually made from fiberglass or carbon fiber, as an aid to jump over a bar. Pole jumping competitions were known to the Mycenaean Greeks, Minoan Greeks and Celts. It has been a full medal event at the Olympic Games since 1896 for men and since 2000 for women.
It is typically classified as one of the four major jumping events in athletics, alongside the high jump, long jump and triple jump. It is unusual among track and field sports in that it requires a significant amount of specialised equipment in order to participate, even at a basic level. A number of elite pole vaulters have had backgrounds in gymnastics, including world record breakers Yelena Isinbayeva and Brian Sternberg, reflecting the similar physical attributes required for the sports. Running speed, however, may be the most dominant factor. Physical attributes such as speed, agility and strength are essential to pole vaulting effectively, but technical skill is an equally if not more important element. The object of pole vaulting is to clear a bar or crossbar supported upon two uprights (standards) without knocking it down.
Details
Pole vault is a sport in athletics (track and field) in which an athlete jumps over an obstacle with the aid of a pole. Originally a practical means of clearing objects, such as ditches, brooks, and fences, pole-vaulting for height became a competitive sport in the mid-19th century. An Olympic event for men since the first modern Games in 1896, a pole-vault event for women was added for the 2000 Olympics in Sydney, Australia.
In competition, each vaulter is given three chances to clear a specified height. A bar rests on two uprights so that it will fall easily if touched. It is raised progressively until a winner emerges by process of elimination. Ties are broken by a “count back” based on fewest failures at the final height, fewest failures in the whole contest, or fewest attempts throughout the contest. The pole may be of any material: bamboo poles, introduced in 1904, quickly became more popular than heavier wooden poles; glass fibre became the most effective and popular by the early 1960s. The poles may be of any length or diameter.
A slideway, or box, is sunk into the ground with its back placed directly below the crossbar (see illustration). The vaulter thrusts the pole into this box upon leaving the ground. A pit at least 5 metres (16.4 feet) square and filled with soft, cushioning material is provided behind the crossbar for the landing.
Requirements of the athlete include a high degree of coordination, timing, speed, and gymnastic ability. The modern vaulter makes a run of 40 metres (131.2 feet) while carrying the pole and approaches the takeoff with great speed. As the stride before the spring is completed, the vaulter performs the shift, which consists in advancing the pole toward the slideway and at the same time allowing the lower hand to slip up the pole until it reaches the upper hand, then raising both hands as high above the head as possible before leaving the ground. The vaulter is thus able to exert the full pulling power of both arms to raise the body and help swing up the legs.
The vaulter plants the pole firmly in the box, and, running off the ground (rather than jumping), the vaulter’s body is left hanging by the hands as long as possible; the quick, catapulting action of the glass-fibre pole makes timing especially important. The legs swing upward and to the side of the pole, and then shoot high above the crossbar. The body twists to face downward. The vaulter’s body travels across the crossbar by “carry”—the forward speed acquired from the run.
Additional Information
How it works
A field event and one of the two vertical jumps (the other being high jump), the pole vault is the sport's high-adrenaline discipline.
Competitors vault over a 4.5-metre long horizontal bar by sprinting along a runway and jamming a pole (usually made out of carbon fibre or fibreglass) against a ‘stop board’ at the back of a recessed metal ‘box’ sited centrally at the base of the uprights. They seek to clear the greatest height without knocking the bar to the ground.
All competitors have three attempts per height, although they can elect to ‘pass’, i.e. advance to a greater height despite not having cleared the current one. Three consecutive failures at the same height, or combination of heights, cause a competitor’s elimination.
If competitors are tied on the same height, the winner will have had the fewest failures at that height. If competitors are still tied, the winner will have had the fewest failures across the entire competition. Thereafter, a jump-off will decide the winner. Each jumper has one attempt and the bar is lowered and raised until one jumper succeeds at one height.
The event demands speed, power, strength, agility and gymnastic skills. At major championships the format is usually a qualification competition followed by a final.
History
Pole vaulting, originally for distance, dates back to at least the 16th century and there is also evidence it was even practised in Ancient Greece. The origins of modern vaulting can be traced back to Germany in the 1850s, when the sport was adopted by a gymnastic association, and in the Lake District region of England, where contests were held with ash or hickory poles with iron spikes in the end.
The first recorded use of bamboo poles was in 1857. The top vaulters started using steel poles in the 1940s. Flexible fibreglass and later carbon fibre poles started to be widely used in the late 1950s.
Men’s pole vault has featured at every modern Olympic Games with the US winning every Olympic title from 1896 to 1968 (if we discount the 1906 intercalated Games). Bob Richards (1952 and 1956) is the only man in history to win two Olympic pole vault titles. Women only made their Olympic pole vault debut in 2000 when American Stacy Dragila struck gold.
1428) High jump
Summary
The high jump is a track and field event in which competitors must jump unaided over a horizontal bar placed at measured heights without dislodging it. In its modern, most-practiced format, a bar is placed between two standards with a crash mat for landing. Since ancient times, competitors have introduced increasingly effective techniques to arrive at the current form, and the current universally preferred method is the Fosbury Flop, in which athletes run towards the bar and leap head first with their back to the bar.
The discipline is, alongside the pole vault, one of two vertical clearance events in the Olympic athletics program. It is contested at the World Championships in Athletics and the World Athletics Indoor Championships, and is a common occurrence at track and field meets. The high jump was among the first events deemed acceptable for women, having been held at the 1928 Olympic Games.
Javier Sotomayor (Cuba) is the current men's record holder with a jump of 2.45 m (8 ft ¼ in) set in 1993 – the longest-standing record in the history of the men's high jump. Stefka Kostadinova (Bulgaria) has held the women's world record at 2.09 m (6 ft 10¼ in) since 1987, also the longest-held record in the event.
Details
High jump is a sport in athletics (track and field) in which the athlete takes a running jump to attain height. The sport’s venue (see illustration) includes a level, semicircular runway allowing an approach run of at least 15 metres (49.21 feet) from any angle within its 180° arc. Two rigid vertical uprights support a light horizontal crossbar in such a manner that it will fall if touched by a contestant trying to jump over it. The jumper lands in a pit beyond the bar that is at least 5 by 3 metres (16.4 feet by 9.8 feet) in size and filled with cushioning material. The standing high jump was last an event in the 1912 Olympics. The running high jump, an Olympic event for men since 1896, was included in the first women’s Olympic athletics program in 1928.
The only formal requirement of the high jumper is that the takeoff of the jump be from one foot. Many styles have evolved, including the now little-used scissors, or Eastern, method, in which the jumper clears the bar in a nearly upright position; the Western roll and straddle, with the jumper’s body face-down and parallel to the bar at the height of the jump; and a more recent backward-twisting, diving style often termed the Fosbury flop, after its first prominent exponent, the 1968 American Olympic champion Dick Fosbury.
In competition the bar is raised progressively as contestants succeed in clearing it. Entrants may begin jumping at any height above a required minimum. Knocking the bar off its supports constitutes a failed attempt, and three failures at a given height disqualify the contestant from the competition. Each jumper’s best leap is credited in the final standings. In the case of ties, the winner is the one with the fewest misses at the final height, or in the whole competition, or with the fewest total jumps in the competition.
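The countback rules above amount to a simple lexicographic comparison: best height first, then misses at that height, then misses overall. A minimal sketch, with the athlete names and numbers invented purely for illustration:

```python
# Sketch of the high-jump "countback" tie-break described above.
# Each athlete is summarised as (best height cleared, misses at that
# final height, total misses in the competition). The entries below
# are invented for illustration, not real results.
def ranking_key(result):
    best, misses_at_best, total_misses = result
    # Higher best height wins; then fewer misses at that height;
    # then fewer misses across the whole competition.
    return (-best, misses_at_best, total_misses)

results = {
    "Athlete A": (2.30, 1, 3),
    "Athlete B": (2.30, 0, 4),
    "Athlete C": (2.27, 0, 0),
}

standings = sorted(results, key=lambda name: ranking_key(results[name]))
print(standings)  # ['Athlete B', 'Athlete A', 'Athlete C']
```

Note that B beats A despite more total misses, because misses at the final height are compared first; only if those were equal would the competition-wide count decide.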
Additional Information
How it works
One of two field events also referred to as vertical jumps, competitors in the high jump take off (unaided) from one foot over a four-metre long horizontal bar. They seek to clear the greatest height without knocking the bar to the ground. Athletes land on a crash mat.
All competitors have three attempts per height, although they can elect to ‘pass’, i.e. advance to a greater height despite not having cleared the current one. Three consecutive failures at the same height, or combination of heights, leads to elimination.
If competitors are tied on the same height, the winner is the one with fewest failures at that height. If competitors are still tied, the winner will have had the fewest failures across the entire competition. Thereafter, a jump-off will decide the winner. The jump-off will start at the next greater height. Each jumper has one attempt and the bar is lowered and raised until one jumper succeeds at one height.
The event demands speed, explosive power and agility among other qualities. At major championships the format is usually a qualification competition followed by a final.
History
High jump contests were popular in Scotland in the early 19th century, and the event was incorporated into the first modern Olympics Games in 1896 for men. Women made their Olympic high jump debut in 1928.
Of the field events, the high jump has perhaps undergone the most radical changes of technique. The Eastern Cut-off, Western Roll and Straddle are methods that have been previously used by the world’s elite. However, the Fosbury Flop, which involves going over with the jumper's back to the bar, popularised by the 1968 Olympic champion Dick Fosbury, is now almost exclusively the technique adopted by all the top high jumpers.
Javier Sotomayor won Olympic gold at the 1992 Barcelona Olympics. The Cuban great set the current high jump world record of 2.45m in 1993 and is the first man in history to jump over 8ft. One of the greatest women’s high jumpers in history is Iolanda Balas. The Romanian great won back-to-back Olympic titles in 1960 and 1964 and went 11 years unbeaten in her event.
1429) Long Jump
Summary
Long jump, also called broad jump, is a sport in athletics (track-and-field) consisting of a horizontal jump for distance. It was formerly performed from both standing and running starts, as separate events, but the standing long jump is no longer included in major competitions. It was discontinued from the Olympic Games after 1912. The running long jump was an event in the Olympic Games of 708 BCE and in the modern Games from 1896.
The standard venue for the long jump includes a runway at least 40 metres (131 feet) in length with no outer limit, a takeoff board planted level with the surface at least 1 metre (3.3 feet) from the end of the runway, and a sand-filled landing area at least 2.75 metres (9 feet) and no more than 3 metres (9.8 feet) wide.
The jumper usually begins his approach run about 30 metres (100 feet) from the takeoff board and accelerates to reach maximum speed at takeoff while gauging his stride to arrive with one foot on and as near as possible to the edge of the board. If a contestant steps beyond the edge (scratch line), his jump is disallowed; if he leaps from too far behind the line, he loses valuable distance.
The most commonly used techniques in flight are the tuck, in which the knees are brought up toward the chest, and the hitch kick, which is in effect a continuation of the run in the air. The legs are brought together for landing, and, since the length of the jump is measured from the edge of the takeoff board to the nearest mark in the landing area surface made by any part of the body, the jumper attempts to fall forward.
In international competition the eight contestants who make the longest jumps in three preliminary attempts qualify to make three final attempts. The winner is the one who makes the single longest jump over the course of the preliminary and final rounds. In 1935 Jesse Owens of the United States set a record of 8.13 metres (26.6 feet) that was not broken until 1960. Similarly, American Bob Beamon held the long jump record of 8.90 metres (29.2 feet) from 1968 until 1991, when it was broken by American Mike Powell, who leapt 8.95 metres (29.4 feet). Beginning in 1948, the women’s long jump has been an Olympic event.
Details
The long jump is a track and field event in which athletes combine speed, strength and agility in an attempt to leap as far as possible from a takeoff point. Along with the triple jump, the two events that measure jumping for distance as a group are referred to as the "horizontal jumps". This event has a history in the ancient Olympic Games and has been a modern Olympic event for men since the first Olympics in 1896 and for women since 1948.
Rules
At the elite level, competitors run down a runway (usually coated with the same rubberized surface as running tracks, crumb rubber or vulcanized rubber, known generally as an all-weather track) and jump as far as they can from a wooden or synthetic board, 20 cm (8 in) wide, that is built flush with the runway, into a pit filled with soft, damp sand. If the competitor starts the leap with any part of the foot past the foul line, the jump is declared a foul and no distance is recorded. A layer of plasticine is placed immediately beyond the board to help detect this; an official (similar to a referee) also watches the jump and makes the final determination.

The competitor may initiate the jump from any point behind the foul line; however, the distance is always measured perpendicular from the foul line to the nearest break in the sand caused by any part of the body or uniform, so it is in the competitor's best interest to take off as close to the foul line as possible. Competitors are allowed to place two marks along the side of the runway to help them time their approach accurately.

At smaller meets and facilities, the plasticine may be absent, the runway may be a different surface, or jumpers may initiate their jump from a painted or taped mark on the runway; the number of attempts may also be limited to four or three.
Each competitor has a set number of attempts: normally three trials, with three additional jumps awarded to the best eight or nine competitors (depending on the number of lanes on the track at that facility, so that the event is comparable to the track events). All valid attempts are recorded, but only the best mark counts towards the results. The competitor with the longest valid jump, from either the trial or final rounds, is declared the winner at the end of competition. In the event of an exact tie, the next-best jumps of the tied competitors are compared to determine place. In a large, multi-day elite competition (such as the Olympics or World Championships), a qualification round is held to select at least 12 finalists; ties and automatic qualifying distances are potential factors. In the final, a set of trial-round jumps is held, with the best eight performers advancing to the final rounds.
For record purposes, the maximum accepted wind assistance is two metres per second (m/s) (4.5 mph).
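The ranking procedure described above (rank by best valid mark, break an exact tie on the next-best marks) and the wind-assistance limit can be sketched in a few lines. This is an illustrative sketch only, not an official scoring implementation; the function names and the sample marks are invented:

```python
RECORD_WIND_LIMIT = 2.0  # m/s tailwind; roughly 4.5 mph

def record_legal(wind_mps):
    """A mark is record-eligible only with a tailwind of at most 2.0 m/s."""
    return wind_mps <= RECORD_WIND_LIMIT

def ranking_key(marks):
    """Sort key for one competitor: best valid mark first, then the
    next-best marks in order, which is what breaks an exact tie."""
    return sorted(marks, reverse=True)

def rank_competitors(results):
    """results maps competitor -> list of valid marks in metres (fouls excluded).
    Returns competitor names ordered from first place to last."""
    return sorted(results, key=lambda name: ranking_key(results[name]), reverse=True)

# Hypothetical marks: A and B share the same best jump (8.10 m),
# so the tie is broken on their second-best marks (7.95 m vs 7.90 m).
results = {
    "A": [7.90, 8.10, 7.85],
    "B": [8.10, 7.95, 7.70],
    "C": [8.05, 7.60, 7.88],
}
print(rank_competitors(results))  # ['B', 'A', 'C']
```

Python's lexicographic comparison of the sorted mark lists reproduces the "compare the next-best jumps" rule without special-casing the tie.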
History
The long jump is the only known jumping event of ancient Greece's original Olympics' pentathlon events. All events that occurred at the Olympic Games were initially supposed to act as a form of training for warfare. The long jump emerged probably because it mirrored the crossing of obstacles such as streams and ravines. After investigating the surviving depictions of the ancient event it is believed that unlike the modern event, athletes were only allowed a short running start. The athletes carried a weight in each hand, which were called halteres (between 1 and 4.5 kg). These weights were swung forward as the athlete jumped in order to increase momentum. It was commonly believed that the jumper would throw the weights behind him in midair to increase his forward momentum; however, halteres were held throughout the duration of the jump. Swinging them down and back at the end of the jump would change the athlete's center of gravity and allow the athlete to stretch his legs outward, increasing his distance. The jump itself was made from the bater ("that which is trod upon"). It was most likely a simple board placed on the stadium track which was removed after the event. The jumpers would land in what was called a skamma ("dug-up" area). The idea that this was a pit full of sand is wrong. Sand in the jumping pit is a modern invention. The skamma was simply a temporary area dug up for that occasion and not something that remained over time.
The long jump was considered one of the most difficult of the events held at the Games, since a great deal of skill was required. Music was often played during the jump, and Philostratus says that pipes at times would accompany the jump so as to provide a rhythm for the complex movements of the halteres by the athlete. Philostratus is quoted as saying, "The rules regard jumping as the most difficult of the competitions, and they allow the jumper to be given advantages in rhythm by the use of the flute, and in weight by the use of the halter." Most notable in the ancient sport was a man called Chionis, who at the 656 BC Olympics recorded a jump of 7.05 m (23 ft 1½ in).
Modern scholars have debated the nature of the ancient long jump. Some have attempted to reconstruct it as a triple jump, but the surviving images provide the only evidence for the action, and the prevailing view is that it was much like today's long jump. The main argument for a triple jump is a source claiming that a man named Phayllos once made a jump of fifty-five ancient feet.
The long jump has been part of modern Olympic competition since the inception of the Games in 1896. In 1914, Dr. Harry Eaton Stewart recommended the "running broad jump" as a standardized track and field event for women. However, it was not until 1948 that the women's long jump was added to the Olympic athletics programme.
Additional Information
How it works
Known as one of track and field’s two horizontal jumps, competitors sprint along a runway and jump as far as possible into a sandpit from a wooden take-off board. The distance travelled, from the edge of the board to the nearest indentation in the sand, is then measured.
A foul is committed – and the jump is not measured – if an athlete steps beyond the board.
History
The origins of the long jump can be traced to the Olympics in Ancient Greece, when athletes carried weights in each hand. These were swung forward on take-off and released in the middle of the jump in a bid to increase momentum.
The long jump, as we know it today, has been part of the Olympics since the first Games in 1896. The men’s event has seen some long-standing world records by US jumpers. Jesse Owens jumped 8.13m in 1935, a distance that was not exceeded until 1960. Bob Beamon flew out to a world record 8.90m in the rarefied air of Mexico City at the 1968 Olympic Games, a mark that stood until Mike Powell surpassed it with a leap of 8.95m at the 1991 World Championships.
A winner of four successive Olympic titles from 1984 to 1996, Carl Lewis is regarded as the greatest male long jumper in history. The inaugural women’s Olympic long jump took place in 1948, and athletes from five different regions have struck gold in the event: Europe, North America, South America, Africa and Oceania.
It appears to me that if one wants to make progress in mathematics, one should study the masters and not the pupils. - Niels Henrik Abel.
Nothing is better than reading and gaining more and more knowledge - Stephen William Hawking.
Offline
1430) Triple Jump
Summary
The triple jump, sometimes referred to as the hop, step and jump or the hop, skip and jump, is a track and field event, similar to the long jump. As a group, the two events are referred to as the "horizontal jumps". The competitor runs down the track and performs a hop, a bound and then a jump into the sand pit. The triple jump was inspired by the ancient Olympic Games and has been a modern Olympics event since the Games' inception in 1896.
According to World Athletics rules, "the hop shall be made so that an athlete lands first on the same foot as that from which he has taken off; in the step he shall land on the other foot, from which, subsequently, the jump is performed."
The current male world record holder is Jonathan Edwards of the United Kingdom, with a jump of 18.29 m (60 ft 0 in). The current female world record holder is Yulimar Rojas of Venezuela, with a jump of 15.74 m (51 ft 7½ in).
Details
Triple jump, also called hop, step, and jump, is an event in athletics (track and field) in which an athlete makes a horizontal jump for distance incorporating three distinct, continuous movements—a hop, in which the athlete takes off and lands on the same foot; a step, landing on the other foot; and a jump, landing in any manner, usually with both feet together. If a jumper touches ground with a wrong leg, the jump is disallowed. Other rules are similar to those of the long jump.
The origins of the triple jump are obscure, but it may be related to the ancient children’s game hopscotch. It has been a modern Olympic event since the first Games in 1896; at those Games two hops were used, but one hop was used at the Olympics thereafter. (The standing triple jump was contested only in the 1900 and 1904 Olympics.)
Equipment needed for the triple jump includes a runway and a takeoff board identical to those used in the long jump, except that the board is at least 13 metres (42.7 feet) from the landing area for men and 11 metres (36 feet) for women.
Additional Information
How it works
One of the two horizontal jump events on the track and field programme, competitors sprint along a runway before taking off from a wooden board. The take-off foot absorbs the first landing, the hop. The next phase, the step, is finished on the opposite foot and then followed by a jump into a sandpit. The distance travelled, from the edge of the board to the nearest indentation in the sand, is then measured.
A foul is committed – and the jump is not measured – if an athlete steps beyond the board. The order of the field is determined by distance jumped.
Most championship competitions involve six jumps per competitor, although those with the shorter marks are often eliminated after three jumps. If competitors are tied, the athlete with the next-best distance is declared the winner.
The event requires speed, explosive power, strength and flexibility. At major championships the format is usually a qualification session followed by a final.
History
At the inaugural modern Olympic Games in 1896, the event consisted of two hops and a jump, but the format of a hop, a skip and a jump (hence its alternative name, which was still in common usage until recently) was standardised in 1908.
Viktor Saneyev of the Soviet Union won a hat-trick of Olympic men’s triple jump titles from 1968 to 1976. Christian Taylor of the US won back-to-back Olympic titles in 2012 and 2016. When Great Britain’s Jonathan Edwards set the world record of 18.29m to win gold at the 1995 IAAF World Championships, he jumped a distance in excess of the width of a football penalty box.
The first Olympic women’s triple jump competition took place in 1996. In 2004 Francoise Mbango struck gold to become the first female athlete from Cameroon to win an Olympic medal. Four years later at the Beijing Games she successfully retained her title.
1431) Kalambo Falls
Summary
The Kalambo Falls on the Kalambo River is a 772-foot (235 m) single-drop waterfall on the border of Zambia and Rukwa Region, Tanzania at the southeast end of Lake Tanganyika. The falls are some of the tallest uninterrupted falls in Africa (after South Africa's Tugela Falls, Ethiopia's Jin Bahir Falls and others). Downstream of the falls is the Kalambo Gorge, which has a width of about 1 km and a depth of up to 300 m, running for about 5 km before opening out into the Lake Tanganyika rift valley. The Kalambo waterfall is the tallest waterfall in both Tanzania and Zambia. The falls and the surrounding area were first mapped in 1928 by an expedition led by Enid Gordon-Gallien. Initially the height of the falls was assumed to exceed 300 m, but measurements in the 1920s gave a more modest result, above 200 m. A later measurement, in 1956, gave 221 m. Several more measurements have since been made, each with slightly different results. The width of the falls is 3.6–18 m.
Kalambo Falls is also considered one of the most important archaeological sites in Africa, with occupation spanning over 250,000 years.
Details
Kalambo Falls is said to be Africa's second tallest free-leaping or single-drop waterfall (second to one of the tiers of Tugela Falls in South Africa) at 221m.
As a matter of fact, the Kalambo River defines the Tanzania-Zambia border all the way into the vast Lake Tanganyika, which itself is shared by a foursome of countries (i.e. Democratic Republic of Congo, Burundi, Zambia, and Tanzania).
The waterfall is in high flow in the May/June timeframe, though this depends on how much rainfall the region gets during its rainy season from January through April. The flow diminishes as the year progresses; according to locals, by around October or November the falls may no longer look impressive.
Though few visitors realise it, the Kalambo Falls are also one of the most important archaeological sites in southern Africa. Just above the falls, by the side of the river, is a site that appears to have been occupied throughout much of the Stone Age and early Iron Age. The earliest tools and other remains discovered there may be over 300,000 years old, including evidence for the use of fire.
For years Kalambo provided the earliest evidence of fire in sub-Saharan Africa – charred logs, ash and charcoal have been discovered amongst the lowest levels of remains. This was a tremendously important step for Stone-Age man as it enabled him to keep warm and cook food, as well as use fire to scare off aggressive animals. Burning areas of grass may even have helped him to hunt. However, more recent excavations of older sites in Africa have discovered evidence of the use of fire before the time when we believe that this site at Kalambo was occupied.
1432) Coma
Summary
A coma is a deep state of prolonged unconsciousness in which a person cannot be awakened, fails to respond normally to painful stimuli, light, or sound, lacks a normal wake-sleep cycle and does not initiate voluntary actions. Coma patients exhibit a complete absence of wakefulness and are unable to consciously feel, speak or move. Comas can result from natural causes or can be medically induced.
Clinically, a coma can be defined as the inability consistently to follow a one-step command. It can also be defined as a score of ≤ 8 on the Glasgow Coma Scale (GCS) lasting ≥ 6 hours. For a patient to maintain consciousness, the components of wakefulness and awareness must be maintained. Wakefulness describes the quantitative degree of consciousness, whereas awareness relates to the qualitative aspects of the functions mediated by the cortex, including cognitive abilities such as attention, sensory perception, explicit memory, language, the execution of tasks, temporal and spatial orientation and reality judgment. From a neurological perspective, consciousness is maintained by the activation of the cerebral cortex—the gray matter that forms the outer layer of the brain—and by the reticular activating system (RAS), a structure located within the brainstem.
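The GCS threshold mentioned above can be made concrete. The scale sums three component scores, eye opening (1 to 4), verbal response (1 to 5) and motor response (1 to 6), for a total between 3 and 15. A minimal sketch of that arithmetic follows; the function names are illustrative, and the full clinical definition additionally requires the low score to persist for at least 6 hours:

```python
def gcs_total(eye, verbal, motor):
    """Glasgow Coma Scale: eye opening 1-4, verbal response 1-5,
    motor response 1-6; the total ranges from 3 (deep coma) to 15 (alert)."""
    assert 1 <= eye <= 4 and 1 <= verbal <= 5 and 1 <= motor <= 6
    return eye + verbal + motor

def in_coma_range(eye, verbal, motor):
    """A total of 8 or less is the conventional coma threshold;
    clinically the score must also be sustained (>= 6 hours)."""
    return gcs_total(eye, verbal, motor) <= 8

print(gcs_total(4, 5, 6))      # 15: fully alert
print(in_coma_range(1, 2, 4))  # True: total of 7 is within the coma range
```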
Details
Coma is a state of unconsciousness, characterized by loss of reaction to external stimuli and absence of spontaneous nervous activity, usually associated with injury to the cerebrum. Coma may accompany a number of metabolic disorders or physical injuries to the brain from disease or trauma.
Different patterns of coma depend on the origin of the injury. Concussions may cause losses of consciousness of short duration; in contrast, lack of oxygen (anoxia) may result in a coma that lasts for several weeks and is often fatal. Stroke, a rupture or blockage of vessels supplying blood to the brain, can cause sudden loss of consciousness in some patients, while comas caused by metabolic abnormalities or cerebral tumours are characterized by a more gradual onset, with stages of lethargy and stupor before true coma. Metabolic comas are also more likely to have associated brain seizures and usually leave pupillary light reflexes intact, whereas comas with physical causes usually eradicate this reflex.
Common causes of metabolic coma include diabetes, excessive consumption of alcohol, and barbiturate poisoning. In diabetes, low insulin levels allow the buildup of ketones, breakdown products of fat tissue that destroy the osmotic balance in the brain, damaging brain cells. Ingestion of large quantities of alcohol over a short period can cause a coma that may be treated by gastric lavage (stomach pump) in its early stages; alcohol combined with barbiturates is a common cause of coma in suicide attempts. Large doses of barbiturates alone will also produce coma by suppressing cerebral blood flow, thus causing anoxia. Gastric lavage soon after the drug is ingested may remove a sufficient amount of the barbiturate to allow recovery.
For most metabolic comas, the first step in treatment is to protect the brain cells and attempt to eliminate the cause of coma. Assisted ventilation is often necessary. In some psychiatric conditions, such as catatonic schizophrenia, a comalike state may also occur. Electroencephalography (EEG) can be used to detect signs of consciousness in patients who are unresponsive; research suggests that EEG recordings potentially can be used to predict whether a patient will emerge from coma.
1433) Arachnophobia
Summary
Arachnophobia is a specific phobia brought about by the irrational fear of spiders and other arachnids such as scorpions.
Signs and symptoms
People with arachnophobia tend to feel uneasy in any area they believe could harbour spiders or that has visible signs of their presence, such as webs. If an arachnophobe sees a spider, they may not enter the general vicinity until they have overcome the panic attack that is often associated with their phobia. Some people scream, cry, have emotional outbursts, experience trouble breathing, sweat and experience increased heart rates when they come in contact with an area near spiders or their webs. In some extreme cases, even a picture, a toy, or a realistic drawing of a spider can trigger intense fear.
Details
Arachnophobia, or the fear of spiders, is the oldest and most common phobia in Western culture. The word arachnophobia is derived from the Greek word ‘arachne’, meaning spider. The response to spiders shown by an arachnophobic individual may seem irrational to others, and often to the sufferer himself.
Causes of Arachnophobia
Scientists who have studied this fear of spiders explain it as a result of evolutionary selection. On this view, Arachnophobia is an evolutionary response, since spiders, especially the venomous ones, have long been associated with infection and disease. Hence, the fear of spiders triggers a “disgust response” in many individuals.
A study of 261 adults conducted in the UK showed that nearly 32% of the women and 18% of the men in the group felt anxious, nervous or extremely frightened when confronted with a spider (real or in images).
The exact causes of Arachnophobia are different for different people:
* For some people, it is a learned response: they learn to fear spiders after having seen others being fearful.
* An imbalance in brain chemicals may be linked with Arachnophobia.
* The fear of spiders can be a family or cultural trait: many people in Africa are known to fear large spiders whereas in South Africa, where they eat spiders, most are unafraid.
* Traumatic experience in the past involving spiders is another reason for Arachnophobia.
Symptoms of Arachnophobia
Initial symptoms of arachnophobia or the fear of spiders may appear in one’s childhood or adolescence. However, following a traumatic episode, some or all of the following symptoms may be present at all ages when the sufferer is confronted with the object of the phobia, in this case, a spider.
1) Rapid heart rate
2) Dizziness
3) Hot or cold flashes
4) Feeling of going crazy and losing control
5) Chest pain
6) Feeling of choking
7) Inability to distinguish between reality and unreality
8) Trembling and sweating
9) Thoughts of death
10) Nausea or other gastrointestinal distress
In some arachnophobic individuals, these symptoms may be triggered merely by anticipating contact with a spider. Even the sight or mention of cobwebs can trigger such a response.
Treatment for fear of spiders
True sufferers of the fear of spiders have such an extreme aversion to these creatures that their daily life may be adversely impacted. These individuals show an active need to avoid areas where spiders may be present. A combination of therapy, counseling and medication is typically used to treat this fear.
It is important to pursue the chosen treatment actively for it to be effective. Medicines like benzodiazepines are helpful in reducing the intensity of the reactions in the presence of spiders, but they must be used sparingly and under medical supervision. Relaxation techniques like meditation and positive affirmations also form an essential part of the therapy.
One of the more modern methods of treating Arachnophobia is systematic desensitization, a method that has been used to treat many different phobias. The goal of gradual desensitization is to slowly eliminate the fear and help the individual cope with it. An application called Phobia Free is known to use this gradual-exposure technique to help people overcome their fear of spiders; it is available for smartphones and tablets. The app, which has been reviewed and approved by the UK's NHS, uses game-play and relaxation methods to help one confront spiders (or other objects of fear).
If you or someone you know is severely impacted by the fear of spiders then it is essential to seek help in order to lead a more relaxed life. It is possible to eliminate Arachnophobia but the first step is to ask for help in order to learn to cope and eliminate the fear completely.
1434) Tuberculosis
Summary
Tuberculosis (TB) is an infectious disease usually caused by Mycobacterium tuberculosis (MTB) bacteria. Tuberculosis generally affects the lungs, but it can also affect other parts of the body. Most infections show no symptoms, in which case it is known as latent tuberculosis. Around 10% of latent infections progress to active disease which, if left untreated, kills about half of those affected. Typical symptoms of active TB are chronic cough with blood-containing mucus, fever, night sweats, and weight loss. It was historically referred to as consumption due to the weight loss associated with the disease. Infection of other organs can cause a wide range of symptoms.
Tuberculosis is spread from one person to the next through the air when people who have active TB in their lungs cough, spit, speak, or sneeze. People with latent TB do not spread the disease. Active infection occurs more often in people with HIV/AIDS and in those who smoke. Diagnosis of active TB is based on chest X-rays, as well as microscopic examination and culture of body fluids. Diagnosis of latent TB relies on the tuberculin skin test (TST) or blood tests.
Prevention of TB involves screening those at high risk, early detection and treatment of cases, and vaccination with the bacillus Calmette-Guérin (BCG) vaccine. Those at high risk include household, workplace, and social contacts of people with active TB. Treatment requires the use of multiple antibiotics over a long period of time. Antibiotic resistance is a growing problem with increasing rates of multiple drug-resistant tuberculosis (MDR-TB).
Details
Tuberculosis (TB) is an infectious disease caused by the tubercle bacillus, Mycobacterium tuberculosis. In most forms of the disease, the bacillus spreads slowly and widely in the lungs, causing the formation of hard nodules (tubercles) or large cheeselike masses that break down the respiratory tissues and form cavities in the lungs. Blood vessels also can be eroded by the advancing disease, causing the infected person to cough up bright red blood.
During the 18th and 19th centuries, tuberculosis reached near-epidemic proportions in the rapidly urbanizing and industrializing societies of Europe and North America. Indeed, “consumption,” as it was then known, was the leading cause of death for all age groups in the Western world from that period until the early 20th century, at which time improved health and hygiene brought about a steady decline in its mortality rates. Since the 1940s, antibiotic drugs have reduced the span of treatment to months instead of years, and drug therapy has done away with the old TB sanatoriums where patients at one time were nursed for years while the defensive properties of their bodies dealt with the disease.
Today, in less-developed countries where population is dense and hygienic standards poor, tuberculosis remains a major fatal disease. The prevalence of the disease has increased in association with the HIV/AIDS epidemic; an estimated one out of every four deaths from tuberculosis involves an individual coinfected with HIV. In addition, the successful elimination of tuberculosis as a major threat to public health in the world has been complicated by the rise of new strains of the tubercle bacillus that are resistant to conventional antibiotics. Infections with these strains are often difficult to treat and require the use of combination drug therapies, sometimes involving the use of five different agents.
The course of tuberculosis
The tubercle bacillus is a small, rod-shaped bacterium that is extremely hardy; it can survive for months in a state of dryness and can also resist the action of mild disinfectants. Infection spreads primarily by the respiratory route directly from an infected person who discharges live bacilli into the air. Minute droplets ejected by sneezing, coughing, and even talking can contain hundreds of tubercle bacilli that may be inhaled by a healthy person. Once inhaled, the bacilli become trapped in the tissues of the body, are surrounded by immune cells, and finally are sealed up in hard, nodular tubercles. A tubercle usually consists of a centre of dead cells and tissues, cheeselike (caseous) in appearance, in which can be found many bacilli. This centre is surrounded by radially arranged phagocytic (scavenger) cells and a periphery containing connective tissue cells. The tubercle thus forms as a result of the body’s defensive reaction to the bacilli. Individual tubercles are microscopic in size, but most of the visible manifestations of tuberculosis, from barely visible nodules to large tuberculous masses, are conglomerations of tubercles.
In otherwise healthy children and adults, the primary infection often heals without causing symptoms. The bacilli are quickly sequestered in the tissues, and the infected person acquires a lifelong immunity to the disease. A skin test taken at any later time may reveal the earlier infection and the immunity, and a small scar in the lung may be visible by X-ray. In this condition, sometimes called latent tuberculosis, the affected person is not contagious. In some cases, however, sometimes after periods of time that can reach 40 years or more, the original tubercles break down, releasing viable bacilli into the bloodstream. From the blood the bacilli create new tissue infections elsewhere in the body, most commonly in the upper portion of one or both lungs. This causes a condition known as pulmonary tuberculosis, a highly infectious stage of the disease. In some cases the infection may break into the pleural space between the lung and the chest wall, causing a pleural effusion, or collection of fluid outside the lung. Particularly among infants, the elderly, and immunocompromised adults (organ transplant recipients or AIDS patients, for example), the primary infection may spread through the body, causing miliary tuberculosis, a highly fatal form if not adequately treated. In fact, once the bacilli enter the bloodstream, they can travel to almost any organ of the body, including the lymph nodes, bones and joints, skin, intestines, genital organs, kidneys, and bladder. An infection of the meninges that cover the brain causes tuberculous meningitis; before the advent of specific drugs, this disease was always fatal, though most affected people now recover.
The onset of pulmonary tuberculosis is usually insidious, with lack of energy, weight loss, and persistent cough. These symptoms do not subside, and the general health of the patient deteriorates. Eventually, the cough increases, the patient may have chest pain from pleurisy, and there may be blood in the sputum, an alarming symptom. Fever develops, usually with drenching night sweats. In the lung, the lesion consists of a collection of dead cells in which tubercle bacilli may be seen. This lesion may erode a neighbouring bronchus or blood vessel, causing the patient to cough up blood (hemoptysis). Tubercular lesions may spread extensively in the lung, causing large areas of destruction, cavities, and scarring. The amount of lung tissue available for the exchange of gases in respiration decreases, and if untreated the patient will die from failure of ventilation and general toxemia and exhaustion.
Diagnosis and treatment
The diagnosis of pulmonary tuberculosis depends on finding tubercle bacilli in the sputum, in the urine, in gastric washings, or in the cerebrospinal fluid. The primary method used to confirm the presence of bacilli is a sputum smear, in which a sputum specimen is smeared onto a slide, stained with a compound that penetrates the organism’s cell wall, and examined under a microscope. If bacilli are present, the sputum specimen is cultured on a special medium to determine whether the bacilli are M. tuberculosis. An X-ray of the lungs may show typical shadows caused by tubercular nodules or lesions. The prevention of tuberculosis depends on good hygienic and nutritional conditions and on the identification of infected patients and their early treatment. A vaccine, known as BCG vaccine, is composed of specially weakened tubercle bacilli. Injected into the skin, it causes a local reaction, which confers some immunity to infection by M. tuberculosis for several years. It has been widely used in some countries with success; its use in young children in particular has helped to control infection in the developing world. The main hope of ultimate control, however, lies in preventing exposure to infection, and this means treating infectious patients quickly, possibly in isolation until they are noninfectious. In many developed countries, individuals at risk for tuberculosis, such as health care workers, are regularly given a skin test to show whether they have had a primary infection with the bacillus.
Today, the treatment of tuberculosis consists of drug therapy and methods to prevent the spread of infectious bacilli. Historically, treatment of tuberculosis consisted of long periods, often years, of bed rest and surgical removal of useless lung tissue. In the 1940s and ’50s several antimicrobial drugs were discovered that revolutionized the treatment of patients with tuberculosis. As a result, with early drug treatment, surgery is rarely needed. The most commonly used antituberculosis drugs are isoniazid and rifampicin (rifampin). These drugs are often used in various combinations with other agents, such as ethambutol, pyrazinamide, or rifapentine, in order to avoid the development of drug-resistant bacilli. Patients with strongly suspected or confirmed tuberculosis undergo an initial treatment period that lasts two months and consists of combination therapy with isoniazid, rifampicin, ethambutol, and pyrazinamide. These drugs may be given daily or two times per week. The patient is usually made noninfectious quite quickly, but complete cure requires continuous treatment for another four to nine months. The length of the continuous treatment period depends on the results of chest X-rays and sputum smears taken at the end of the two-month period of initial therapy. Continuous treatment may consist of once daily or twice weekly doses of isoniazid and rifampicin or isoniazid and rifapentine.
If a patient does not continue treatment for the required time or is treated with only one drug, bacilli will become resistant and multiply, making the patient sick again. If subsequent treatment is also incomplete, the surviving bacilli will become resistant to several drugs. Multidrug-resistant tuberculosis (MDR TB) is a form of the disease in which bacilli have become resistant to isoniazid and rifampicin. MDR TB is treatable but is extremely difficult to cure, typically requiring two years of treatment with agents known to have more severe side effects than isoniazid or rifampicin. Extensively drug-resistant tuberculosis (XDR TB) is a rare form of MDR TB. XDR TB is characterized by resistance to not only isoniazid and rifampin but also a group of bactericidal drugs known as fluoroquinolones and at least one aminoglycoside antibiotic, such as kanamycin, amikacin, or capreomycin. Aggressive treatment using five different drugs, which are selected based on the drug sensitivity of the specific strain of bacilli in a patient, has been shown to be effective in reducing mortality in roughly 50 percent of XDR TB patients. In addition, aggressive treatment can help prevent the spread of strains of XDR TB bacilli.
In 1995, in part to prevent the development and spread of MDR TB, the World Health Organization began encouraging countries to implement a compliance program called directly observed therapy (DOT). Instead of taking daily medication on their own, patients are directly observed by a clinician or responsible family member while taking larger doses twice a week. Although some patients consider DOT invasive, it has proved successful in controlling tuberculosis.
Despite stringent control efforts, however, drug-resistant tuberculosis remained a serious threat in the early 21st century. In 2009, for example, researchers reported the emergence of extremely drug-resistant tuberculosis (XXDR-TB), also known as totally drug-resistant tuberculosis (TDR-TB), in a small subset of Iranian patients. This form of the disease, which has also been detected in Italy (in 2003) and India (in 2011), is resistant to all first- and second-line antituberculosis drugs.
At the same time, development of a vaccine to prevent active disease from emerging in persons already infected with the tuberculosis bacterium was underway. In 2019 the results of a preliminary trial indicated that the vaccine could prevent pulmonary disease in more than half of infected individuals.
Other mycobacterial infections
The above discussion of tuberculosis relates to the disease caused by M. tuberculosis. Another species, M. bovis, is the cause of bovine tuberculosis. M. bovis is transmitted among cattle and some wild animals through the respiratory route, and it is also excreted in milk. If the milk is ingested raw, M. bovis readily infects humans. The bovine bacillus may be caught in the tonsils and may spread from there to the lymph nodes of the neck, where it causes caseation of the node tissue (a condition formerly known as scrofula). The node swells under the skin of the neck, finally eroding through the skin as a chronic discharging ulcer. From the gastrointestinal tract, M. bovis may spread into the bloodstream and reach any part of the body. It shows, however, a great preference for bones and joints, where it causes destruction of tissue and eventually gross deformity. Tuberculosis of the spine, or Pott disease, is characterized by softening and collapse of the vertebrae, often resulting in a hunchback deformity. Pasteurization of milk kills tubercle bacilli, and this, along with the systematic identification and destruction of infected cattle, has led to the disappearance of bovine tuberculosis in humans in many countries.
The AIDS epidemic has given prominence to a group of infectious agents known variously as nontuberculous mycobacteria, atypical mycobacteria, and mycobacteria other than tuberculosis (MOTT). This group includes such Mycobacterium species as M. avium (or M. avium-intracellulare), M. kansasii, M. marinum, and M. ulcerans. These bacilli have long been known to infect animals and humans, but they cause dangerous illnesses of the lungs, lymph nodes, and other organs only in people whose immune systems have been weakened. Among AIDS patients, atypical mycobacterial illnesses are common complications of HIV infection. Treatment is attempted with various drugs, but the prognosis is usually poor owing to the AIDS patient’s overall condition.
Tuberculosis through history
Evidence that M. tuberculosis and humans have long coexisted comes primarily from studies of bone samples collected from a Neolithic human settlement in the eastern Mediterranean. Genetic evidence gathered from these studies indicates that roughly 9,000 years ago there existed a strain of M. tuberculosis similar to strains present in the 21st century. Evidence of mycobacterial infection has also been found in the mummified remains of ancient Egyptians, and references to phthisis, or “wasting,” occur in the writings of the Greek physician Hippocrates. In the medical writings of Europe through the Middle Ages and well into the industrial age, tuberculosis was referred to as phthisis, the “white plague,” or consumption—all in reference to the progressive wasting of the victim’s health and vitality as the disease took its inexorable course. The cause was assumed to be mainly constitutional, either the result of an inherited disposition or of unhealthy or dissolute living. In the first edition of Encyclopædia Britannica (1768), it was reported that a tendency to develop “consumption of the lungs” could tragically be expected in people who were fine, delicate, and precocious:
This is known from a view of the tender and fine vessels, and of the slender make of the whole body, a long neck, a flat and narrow thorax, depressed scapulæ; the blood of a bright red, thin, sharp, and hot; the skin transparent, very white and fair, with a blooming red in the cheeks; the wit quick, subtle, and early ripe with regard to the age, and a merry chearful disposition.
Based on the stage of the disease, treatments included regular bloodletting, administration of expectorants and purgatives, healthful diet, exercise such as vigorous horseback riding, and, in the grim final stages, opiates. The view that tuberculosis might be a contagious disease also had its adherents, but it was not until 1865 that Jean Antoine Villemin, an army doctor in Paris, showed that it could be transmitted from tuberculous animals to healthy animals by inoculation. The actual infectious agent, the tubercle bacillus, was discovered and identified in 1882 by the German physician Robert Koch. By that time the cultural status of the disease was assured. As summarized by Dr. J.O. Affleck of the University of Edinburgh, Scotland, in the ninth edition of Encyclopædia Britannica (1885), “Few diseases possess such sad interest for humanity as consumption, both on account of its widespread prevalence and its destructive effects, particularly among the young.” Causing as much as one-quarter of all deaths in Europe, arising with particular frequency among young adults between the ages of 18 and 35, and bringing on a lingering, melancholy decline characterized by loss of body weight, skin pallor, and sunken yet luminous eyes, tuberculosis was enshrined in literature as the “captain of death,” the slow killer of youth, promise, and genius. Prominent artists who died of consumption in the 19th century included the English poet John Keats, the Polish composer Frédéric Chopin, and all of the Brontë sisters (Charlotte, Emily, and Anne); in the early 20th century they were followed by the Russian playwright Anton Chekhov, the Italian painter Amedeo Modigliani, and the German writer Franz Kafka. Without a clear understanding of the bacterium that caused the disease, little could be done for its victims except to isolate them in sanitariums, where cleanliness and fresh air were thought to help the body’s natural defenses to stop or at least slow the progress of the disease.
Preventive inoculation against tuberculosis, in which live but attenuated tubercle bacilli are used as a vaccine, was introduced in France in 1921 by bacteriologists Albert Calmette and Camille Guérin. The strain designated BCG (Bacillus Calmette-Guérin), of bovine origin, became attenuated while growing on culture media containing bile. After its introduction by Calmette, large numbers of children were vaccinated in France, elsewhere in Europe, and in South America; after 1930 the vaccine was used on an extensive scale. In 1943–44 the Ukrainian-born microbiologist Selman A. Waksman and his associates, working at Rutgers University, New Jersey, U.S., discovered the potent antimicrobial agent streptomycin in the growth medium of the soil microorganism Streptomyces griseus. In 1944–45 veterinarian W.H. Feldman and physician H.C. Hinshaw, working at the Mayo Clinic in Minnesota, demonstrated its specific effect in inhibiting tuberculosis in both animals and people. Wide clinical use of streptomycin promptly followed, eventually in combination with other drugs to attack resistant bacilli.
In 1952 a great advance was made with the successful testing of isoniazid in the United States and Germany. Isoniazid is the most important drug in the history of chemotherapy for tuberculosis; other drugs were brought out in following years, pyrazinamide in 1954, ethambutol in 1962, and rifampicin in 1963. By this time the industrialized countries were already seeing the health benefits of economic improvement, better sanitation, more widespread education, and particularly the establishment of public health practice, including specific measures for tuberculosis control. The rate of deaths from tuberculosis in England and Wales dropped from 190 per 100,000 population in 1900 to 7 per 100,000 in the early 1960s. In the United States during the same time period, it dropped from 194 per 100,000 to approximately 6 per 100,000. In the popular mind, tuberculosis was then a disease of the past, of the indigent, and of the Third World.
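The scale of this decline can be checked with a quick calculation. The sketch below is illustrative only; it simply restates the per-100,000 figures quoted above:

```python
# TB death rates per 100,000 population, using the figures quoted above
uk_1900, uk_1960s = 190, 7    # England and Wales, 1900 vs. early 1960s
us_1900, us_1960s = 194, 6    # United States, same period

def percent_decline(old, new):
    """Relative decline from old to new, as a percentage."""
    return (old - new) / old * 100

print(f"England and Wales: {percent_decline(uk_1900, uk_1960s):.1f}% decline")
print(f"United States: {percent_decline(us_1900, us_1960s):.1f}% decline")
```

Both countries saw their tuberculosis death rates fall by roughly 96 to 97 percent over those six decades.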
However, in the mid-1980s the number of deaths caused by tuberculosis began to rise again in developed countries. The disease’s resurgence was attributed in part to complacent health care systems, increased immigration of people from regions where tuberculosis was prevalent, and the spread of HIV. In addition, throughout the 1990s the number of cases of tuberculosis increased in Africa. Global programs such as the Stop TB Partnership, which was established in 2000, have worked to increase awareness of tuberculosis and to make new and existing treatments available to people living in developing countries most affected by the disease. In the early 2000s, as a result of the rapid implementation of global efforts to combat the disease, the epidemic in Africa slowed and incidence rates stabilized. Despite a leveling off of per capita incidence of tuberculosis, the global number of new cases continued to rise, due to population growth, especially in regions of Africa, Southeast Asia, and the eastern Mediterranean. The mortality rate from tuberculosis remains between 1.6 million and 2 million deaths per year.
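The point about stable per capita incidence but rising case counts is simple arithmetic: new cases equal the incidence rate times the population, so a constant rate applied to a growing population still produces more cases each year. A minimal sketch, using made-up illustrative numbers rather than actual WHO figures:

```python
# Illustrative only: a constant per-capita incidence rate applied to a
# growing population still yields more new cases every year.
rate_per_100k = 150          # hypothetical, stable incidence rate
population = 800_000_000     # hypothetical regional population
growth = 1.02                # hypothetical 2% annual population growth

for year in range(3):
    new_cases = rate_per_100k / 100_000 * population
    print(f"year {year}: {new_cases:,.0f} new cases")
    population *= growth
```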
It appears to me that if one wants to make progress in mathematics, one should study the masters and not the pupils. - Niels Henrik Abel.
Nothing is better than reading and gaining more and more knowledge - Stephen William Hawking.
Offline
1435) Hernia
Summary
A hernia is the abnormal exit of tissue or an organ, such as the bowel, through the wall of the cavity in which it normally resides. Various types of hernias can occur, most commonly involving the abdomen, and specifically the groin. Groin hernias are most commonly of the inguinal type but may also be femoral. Other types of hernias include hiatus, incisional, and umbilical hernias. Symptoms are present in about 66% of people with groin hernias. These may include pain or discomfort in the lower abdomen, especially with coughing, exercise, or urinating or defecating. Often, the discomfort worsens throughout the day and improves when lying down. A bulge may appear at the site of the hernia and become larger when bending down. Groin hernias occur more often on the right side than the left. The main concern is bowel strangulation, in which the blood supply to part of the bowel is blocked. This usually produces severe pain and tenderness in the area. Hiatus, or hiatal, hernias often result in heartburn but may also cause chest pain or pain while eating.
Risk factors for the development of a hernia include smoking, chronic obstructive pulmonary disease, obesity, pregnancy, peritoneal dialysis, collagen vascular disease, and previous open appendectomy, among others. Predisposition to hernias is genetic, and they occur more often in certain families. Deleterious mutations causing predisposition to hernias seem to have dominant inheritance (especially in men). It is unclear whether groin hernias are associated with heavy lifting. Hernias can often be diagnosed based on signs and symptoms. Occasionally, medical imaging is used to confirm the diagnosis or rule out other possible causes. Hiatus hernias are often diagnosed by endoscopy.
Groin hernias that do not cause symptoms in males do not need to be repaired. Repair, however, is generally recommended in women due to the higher rate of femoral hernias, which have more complications. If strangulation occurs, immediate surgery is required. Repair may be done by open surgery or laparoscopic surgery. Open surgery has the benefit of possibly being done under local anesthesia rather than general anesthesia. Laparoscopic surgery generally has less pain following the procedure. A hiatus hernia may be treated with lifestyle changes such as raising the head of the bed, weight loss and adjusting eating habits. The medications H2 blockers or proton pump inhibitors may help. If the symptoms do not improve with medications, a surgery known as laparoscopic Nissen fundoplication may be an option.
About 27% of males and 3% of females develop a groin hernia at some point in their lives. Inguinal, femoral and abdominal hernias were present in 18.5 million people and resulted in 59,800 deaths in 2015. Groin hernias occur most often before the age of 1 and after the age of 50. It is not known how commonly hiatus hernias occur, with estimates in North America varying from 10% to 80%. The first known description of a hernia dates back to at least 1550 BC, in the Ebers Papyrus from Egypt.
Details
Hernia is a protrusion of an organ or tissue from its normal cavity. The protrusion may extend outside the body or between cavities within the body, as when loops of intestine escape from the abdominal cavity into the chest through a defect in the diaphragm, the muscular partition between the two cavities. The term is usually applied, however, to an external herniation of tissue through the abdominal wall.
An abdominal hernia, or rupture, may occur at any weak point in the abdominal wall. The common sites are the groin (inguinal), the upper part of the thigh (femoral), and the navel (umbilical). In inguinal hernia, the protruding tissue descends along the canal that holds the spermatic cord in the male and the round ligament in the female. If such a hernia occurs bilaterally, it is called a double hernia. A femoral hernia lies on the inner side of the large femoral blood vessels of the thigh. An umbilical hernia protrudes through the navel.
A hernia may be present at birth as the result of defective development of the abdominal wall, or it may occur later in life as the result of an injury. An acquired hernia usually is caused by overexertion, as in lifting a heavy weight, jumping off a high wall, or violent coughing. Men develop hernias more frequently than do women because of their greater physical exertions and because the canal for the spermatic cord leading through the abdominal wall is wider than the canal for the round ligament. A special type of acquired hernia is the incisional hernia, which occurs at an incision after surgery.
The hernia may be classified as reducible, irreducible, or strangulated. A reducible hernia is one in which the contents can be pushed back into the abdomen and often may be held in place by a truss, a pad of heavy material that is placed over the herniated area. A truss is usually a temporary expedient and is seldom used as a substitute for surgical care. A reducible hernia may increase in size or may form adhesions to other organs or structures, becoming irreducible. A strangulated hernia is one in which the circulation of blood through the hernia is impeded by pinching at the narrowest part of the passage; congestion is followed by inflammation, infection, and gangrene. The tighter the constriction, the more rapidly these events take place; unrelieved strangulation may be fatal. Surgery is often necessary for the permanent relief of reducible hernia, and it is the only safe treatment for more advanced forms.
1436) Necrosis
Summary
Necrosis (from Ancient Greek 'death') is a form of cell injury which results in the premature death of cells in living tissue by autolysis. Necrosis is caused by factors external to the cell or tissue, such as infection, or trauma which result in the unregulated digestion of cell components. In contrast, apoptosis is a naturally occurring programmed and targeted cause of cellular death. While apoptosis often provides beneficial effects to the organism, necrosis is almost always detrimental and can be fatal.
Cellular death due to necrosis does not follow the apoptotic signal transduction pathway; instead, various receptors are activated, resulting in the loss of cell membrane integrity and an uncontrolled release of the products of cell death into the extracellular space. This initiates an inflammatory response in the surrounding tissue, which attracts leukocytes and nearby phagocytes that eliminate the dead cells by phagocytosis. However, the microbe-damaging substances released by leukocytes also create collateral damage in surrounding tissues, and this excess collateral damage inhibits the healing process. Thus, untreated necrosis results in a build-up of decomposing dead tissue and cell debris at or near the site of cell death. A classic example is gangrene. For this reason, it is often necessary to remove necrotic tissue surgically, a procedure known as debridement.
Details
Necrosis is the death of a circumscribed area of plant or animal tissue as a result of disease or injury. Necrosis is a form of premature tissue death, as opposed to the spontaneous natural death or wearing out of tissue, which is known as necrobiosis. Necrosis is further distinguished from apoptosis, or programmed cell death, which is internally regulated by cells, plays a critical role in embryonic development, and serves as a protective mechanism against disease and other factors.
Necrosis may follow a wide variety of injuries, both physical and biological in nature. Examples of physical injuries include cuts, burns, bruises, oxygen deprivation (anoxia), and hyperthermia. Biological injuries can include immunological attack and the effects of disease-causing agents. Notable conditions involving necrotic tissue death include avascular necrosis and gangrene, which result from a lack of blood supply to the affected area; necrotizing fasciitis, which is caused by a rapidly spreading bacterial infection; and loxoscelism, in which venom in a bite from a recluse spider (Loxosceles) produces a gangrenous wound. Such injuries and diseases inhibit crucial intracellular metabolic processes, in which intracellular enzymes become activated upon injury and destroy damaged cells. Lesions caused by necrosis often are of diagnostic value.
Early cellular signs of necrosis include swelling of the mitochondria, a process that impairs intracellular oxidative metabolism. Later, localized densities appear, with condensation of genetic material. Cytoplasmic organelles are disrupted, and affected cells separate from neighbouring cells. The dissolution of lysosomes, which normally house hydrolytic enzymes, leads to intracellular acidosis. The nucleus swells and darkens (pyknosis) and eventually ruptures (karyolysis). The outer membrane of the cell also ruptures, resulting in a loss of ion-pumping capacity and a rapid flow of sodium and calcium ions into the intracellular environment, resulting in osmotic shock (a sudden shift in intracellular and extracellular solute concentrations).
1437) Ringworm
Summary
Dermatophytosis, also known as ringworm, is a fungal infection of the skin. Typically it results in a red, itchy, scaly, circular rash. Hair loss may occur in the area affected. Symptoms begin four to fourteen days after exposure. Multiple areas can be affected at a given time.
About 40 types of fungus can cause ringworm. They are typically of the Trichophyton, Microsporum, or Epidermophyton type. Risk factors include using public showers, contact sports such as wrestling, excessive sweating, contact with animals, obesity, and poor immune function. Ringworm can spread from other animals or between people. Diagnosis is often based on the appearance and symptoms. It may be confirmed by either culturing or looking at a skin scraping under a microscope.
Prevention is by keeping the skin dry, not walking barefoot in public, and not sharing personal items. Treatment is typically with antifungal creams such as clotrimazole or miconazole. If the scalp is involved, antifungals by mouth such as fluconazole may be needed.
Globally, up to 20% of the population may be infected by ringworm at any given time. Infections of the groin are more common in males, while infections of the scalp and body occur equally in both sexes. Infections of the scalp are most common in children while infections of the groin are most common in the elderly. Descriptions of ringworm date back to ancient history.
Details
Ringworm, also called tinea or dermatophytosis, refers to superficial skin lesions caused by a highly specialized group of fungi called dermatophytes, which live and multiply on the surface of the skin and feed on keratin, the horny protein constituting the major part of the outermost layer of the skin and of the hair and nails. The fungi produce responses in the skin that vary from slight scaling to blistering and marked disruption of the keratin layer. The lesions are usually round or ring-shaped and can be either dry and scaly or moist and covered with vesicles (blisters), depending on the body area and the type of fungus involved.
Ringworm is also known as tinea, both names referring to the round shape of most of the lesions, similar to the larva of the clothes moth, genus Tinea. In specifying the condition, tinea is usually followed by a modifying term indicating the body area or characteristics of the lesions. Thus, ringworm of the scalp, beard, and nails is also referred to as tinea capitis, tinea barbae or tinea sycosis, and tinea unguium (also called onychomycosis), respectively; ringworm of the body, groin, hands, and feet are referred to as tinea corporis, tinea cruris (also called jock itch), tinea manuum, and tinea pedis, respectively. Tinea pedis is commonly referred to as athlete’s foot, which may be of either the dry or inflammatory type. In the latter type, the infection may lie dormant much of the time and undergo occasional acute exacerbations, with the development of vesicles (blisters) affecting chiefly the skin folds between the toes. The dry type is a chronic process marked by slight redness of the skin and dry scaling that may involve the sole and sides of the foot as well as the toenails, which become thick and brittle.
Other varieties of ringworm are characterized by specific skin lesions. For example, tinea imbricata (Latin: “overlapping like tiles”) is so called because lesions consist of concentric rings of overlapping scales. Tinea imbricata occurs exclusively in Central America, Southeast Asia, India, and Polynesia. Favus, also known as crusted, or honeycomb, ringworm, occurs on the scalp and is characterized by the formation of yellow cup-shaped crusts that enlarge to form honeycomb-like masses. Black dot ringworm, also a ringworm of the scalp, derives its distinctive appearance and name from the breaking of the hairs at the scalp surface. Except for ringworm of the scalp, which tends to be highly contagious, the contraction of ringworm depends to a large extent on individual susceptibility and predisposing factors, such as excessive perspiration.
Diagnosis of ringworm is made by observation and by microscopic examination. Treatment with topical or oral antifungal agents may be effective. Limited exposure to ultraviolet radiation may also be helpful.
1438) Osteomyelitis
Summary
Osteomyelitis (OM) is an infection of bone. Symptoms may include pain in a specific bone with overlying redness, fever, and weakness. The long bones of the arms and legs (e.g., the femur and humerus) are most commonly involved in children, while the feet, spine, and hips are most commonly involved in adults.
The cause is usually a bacterial infection, but rarely it can be a fungal infection. It may occur by spread from the blood or from surrounding tissue. Risks for developing osteomyelitis include diabetes, intravenous drug use, prior removal of the spleen, and trauma to the area. Diagnosis is typically suspected based on symptoms and basic laboratory tests such as C-reactive protein (CRP) and erythrocyte sedimentation rate (ESR), because plain radiographs are unremarkable in the first few days following acute infection. Diagnosis is further confirmed by blood tests, medical imaging, or bone biopsy.
Treatment of bacterial osteomyelitis often involves both antimicrobials and surgery. In people with poor blood flow, amputation may be required. Treatment of the relatively rare fungal osteomyelitis, such as mycetoma infections, entails the use of antifungal medications. In contrast to bacterial osteomyelitis, amputation or large bony resection is more common in neglected fungal osteomyelitis, namely mycetoma, in which infections of the foot account for the majority of cases. Treatment outcomes of bacterial osteomyelitis are generally good when the condition has been present only a short time. About 2.4 per 100,000 people are affected each year. The young and the old are most commonly affected, and males more commonly than females. The condition was described at least as early as the 300s BC by Hippocrates. Prior to the availability of antibiotics, the risk of death was significant.
Details
Osteomyelitis is an infection of bone tissue. The condition is most commonly caused by the infectious organism Staphylococcus aureus, which reaches the bone via the bloodstream or by extension from a local injury; inflammation follows with destruction of the cancellous (porous) bone and bone marrow, loss of blood supply, and bone death. Living bone grows around the infected area and walls in the dead tissue, forming an involucrum, the contents of which are gradually resorbed as the lesion is repaired.
Symptoms of osteomyelitis include fever, chills, and bone pain; later, swelling and redness may develop around the area of infection. Diagnosis is confirmed by radionuclide bone scans. Biopsy or bone aspiration may be used to determine the precise cause of the infection.
Treatment of osteomyelitis typically requires the long-term administration of intravenous antibiotics; some patients also require surgery to remove dead bone tissue. If the disease is not treated appropriately, acute osteomyelitis can progress to a chronic disease. In chronic osteomyelitis, infection remains active, and periodic drainage to the surface via sinus tracts may occur. Bone damage may be extensive, leading to increased susceptibility to fractures, stunted growth in children, and, in severe cases, amputation of the affected limb.
Osteomyelitis may occur as a complication of many diseases, such as typhoid, syphilis, tuberculosis, or sickle cell anemia. In middle-aged individuals, spinal osteomyelitis may be associated with urinary bladder infection. Intravenous drug use may also cause osteomyelitis.
Overview
Osteomyelitis is an infection in a bone. Infections can reach a bone by traveling through the bloodstream or spreading from nearby tissue. Infections can also begin in the bone itself if an injury exposes the bone to germs.
Smokers and people with chronic health conditions, such as diabetes or kidney failure, are more at risk of developing osteomyelitis. People who have diabetes may develop osteomyelitis in their feet if they have foot ulcers.
Although once considered incurable, osteomyelitis can now be successfully treated. Most people need surgery to remove areas of the bone that have died. After surgery, strong intravenous antibiotics are typically needed.
Symptoms
Signs and symptoms of osteomyelitis include:
* Fever
* Swelling, warmth and redness over the area of the infection
* Pain in the area of the infection
* Fatigue
Sometimes osteomyelitis causes no signs and symptoms or the signs and symptoms are hard to distinguish from other problems. This may be especially true for infants, older adults and people whose immune systems are compromised.
When to see a doctor
See your doctor if you experience worsening bone pain along with fever. If you're at risk of infection because of a medical condition or recent surgery or injury, see your doctor right away if you notice signs and symptoms of an infection.
Causes
Most cases of osteomyelitis are caused by staphylococcus bacteria, types of germs commonly found on the skin or in the nose of even healthy individuals.
Germs can enter a bone in a variety of ways, including:
* The bloodstream. Germs in other parts of your body — for example, in the lungs from pneumonia or in the bladder from a urinary tract infection — can travel through your bloodstream to a weakened spot in a bone.
* Injuries. Severe puncture wounds can carry germs deep inside your body. If such an injury becomes infected, the germs can spread into a nearby bone. Germs can also enter the body if you have broken a bone so severely that part of it is sticking out through your skin.
* Surgery. Direct contamination with germs can occur during surgeries to replace joints or repair fractures.
Risk factors
Your bones are normally resistant to infection, but this protection lessens as you get older. Other factors that can make your bones more vulnerable to osteomyelitis include the following.
Injuries and orthopedic surgery
A severe bone fracture or a deep puncture wound gives bacteria a route to enter your bone or nearby tissue. A deep puncture wound, such as an animal bite or a nail piercing through a shoe, can also provide a pathway for infection.
Surgery to repair broken bones or replace worn joints also can accidentally open a path for germs to enter a bone. Implanted orthopedic hardware is a risk factor for infection.
Circulation disorders
When blood vessels are damaged or blocked, your body has trouble distributing the infection-fighting cells needed to keep a small infection from growing larger. What begins as a small cut can progress to a deep ulcer that may expose deep tissue and bone to infection.
Diseases that impair blood circulation include:
* Poorly controlled diabetes
* Peripheral artery disease, often related to smoking
* Sickle cell disease
Problems requiring intravenous lines or catheters
There are a number of conditions that require the use of medical tubing to connect the outside world with your internal organs. However, this tubing can also serve as a way for germs to get into your body, increasing your risk of an infection in general, which can lead to osteomyelitis.
Examples of when this type of tubing might be used include:
* Dialysis machine tubing
* Urinary catheters
* Long-term intravenous tubing, sometimes called central lines
Conditions that impair the immune system
If your immune system is affected by a medical condition or medication, you have a greater risk of osteomyelitis. Factors that may suppress your immune system include:
* Cancer treatment
* Poorly controlled diabetes
* Needing to take corticosteroids or drugs called tumor necrosis factor inhibitors
Illicit drugs
People who inject illegal drugs are more likely to develop osteomyelitis because they may use nonsterile needles and are less likely to sterilize their skin before injections.
Complications
Osteomyelitis complications may include:
* Bone death (osteonecrosis). An infection in your bone can impede blood circulation within the bone, leading to bone death. Areas where bone has died need to be surgically removed for antibiotics to be effective.
* Septic arthritis. Sometimes, infection within bones can spread into a nearby joint.
* Impaired growth. Normal growth in bones or joints in children may be affected if osteomyelitis occurs in the softer areas, called growth plates, at either end of the long bones of the arms and legs.
* Skin cancer. If your osteomyelitis has resulted in an open sore that is draining pus, the surrounding skin is at higher risk of developing squamous cell cancer.
Prevention
If you've been told that you have an increased risk of infection, talk to your doctor about ways to prevent infections from occurring. Reducing your risk of infection will also reduce your risk of developing osteomyelitis.
In general, take precautions to avoid cuts, scrapes and animal scratches or bites, which give germs easy access to your body. If you or your child has a minor injury, clean the area immediately and apply a clean bandage. Check wounds frequently for signs of infection.
It appears to me that if one wants to make progress in mathematics, one should study the masters and not the pupils. - Niels Henrik Abel.
Nothing is better than reading and gaining more and more knowledge - Stephen William Hawking.
Offline
1439) Glossophobia
Summary
Glossophobia or speech anxiety is the fear of public speaking. The word glossophobia derives from the Greek glossa, meaning tongue, and phobos, fear or dread. Speech is the action of speaking out loud and anxiety is the feeling of worry, tension or concern that manifests when thinking that negative things will occur, and physically by increased blood pressure, sweating and shaking. Thus, speech anxiety is the feeling of worry and physical reactions caused when speaking to others, especially to larger groups.
The causes of glossophobia are uncertain, but explanations include communibiology and the illusion of transparency. Further explanations range from nervousness produced by a lack of preparation to social anxiety disorder (SAD), one of the most common psychiatric disorders.
Its symptoms include one or more of physiological changes, mental disruptions, and detrimental speech performance.
There are several ways to overcome glossophobia, which include preparation and rehearsing, deconstructing beliefs, engaging in positive self-talk, visualizing optimal performance, practicing mindfulness, breathing exercises, creating an anxiety hierarchy, using virtual reality, computerized coaches and medications such as beta-blockers.
Details
The fear of public speaking, or stage fright, is also termed glossophobia. Many people experience some degree of stage anxiety before speaking or performing at events; however, most manage to get over it, no matter how unpleasant they may find it.
In cases of extreme glossophobia, however, individuals simply freeze before their audience. They might be unable to speak. They might find that their mouth dries up, or they start sweating, shaking or experiencing palpitations.
Needless to say, glossophobia can lead to highly embarrassing situations. People with a fear of public speaking try to avoid these situations completely. Businesspeople might experience professional setbacks owing to their inability to make presentations. Family members or friends asked to make speeches at weddings, birthdays, etc., may refuse owing to their glossophobia. Glossophobia can also come on suddenly in experienced actors and musicians, who might start to find concerts difficult. Pilots and cabin crew might refrain from making announcements.
Symptoms of the fear of public speaking
Common signs and symptoms of the fear of public speaking include anxiety or nervousness before the event that involves speaking or performing before a small/large group of people. Physical symptoms of glossophobia include:
* Panic attacks characterized by sweating or trembling
* Dry mouth
* Nausea and vomiting in the extreme cases
* Stiffness in the neck and back muscles
* Tense and weak or quivering voice
Apart from these physical symptoms, other signs such as increased blood pressure and heart rate might also manifest.
Causes of Glossophobia
The exact cause of glossophobia is unknown, but it is likely that traumatic events in one's past, as a child or even as an adult, might have led to this fear of public speaking. Often the individual coping with this phobia avoids speaking in public for so long that what begins as normal anxiety turns into full-blown glossophobia.
Most individuals who suffer from the fear of public speaking also have low self-esteem, expect perfection in everything they do, seek constant approval, or expect failure.
Treatment of Glossophobia
There are many herbal and homeopathic remedies that can help in calming the anxiety experienced before events that involve public speaking. Aconitum napellus, Gelsemium, etc., may be recommended by homeopaths based on the exact history and symptoms as well as the individual's nature and temperament. Herbal remedies like lemon balm, lavender and passion flower can also help soothe the nerves and calm one before a public speaking event.
In traditional or orthodox treatment for glossophobia, beta blockers may be prescribed for soothing anxiety, controlling shaking or trembling, and lowering heart rate. There are several restrictions on taking such medicines: in particular, one must speak to a doctor about these medications when suffering from diabetes, depression or heart disease.
Many public speaking courses, associations and clubs are dedicated to helping individuals alleviate their fear of public speaking. Talk therapy, cognitive behavioral therapy and counseling can also help individuals overcome their glossophobia.
Alternative or complementary remedies like hypnosis, positive visualization, meditation and even acupuncture can help one address the root of the problem and overcome the fear of speaking publicly.
1440) Atychiphobia
Summary
Fear of negative evaluation (FNE), also atychiphobia, is a psychological construct reflecting "apprehension about others' evaluations, distress over negative evaluations by others, and the expectation that others would evaluate one negatively". The construct and a psychological test to measure it were defined by David Watson and Ronald Friend in 1969. FNE is related to specific personality dimensions, such as anxiousness, submissiveness, and social avoidance. People who score high on the FNE scale are highly concerned with seeking social approval or avoiding disapproval by others, and may tend to avoid situations where they have to undergo evaluations. High FNE subjects are also more responsive to situational factors. This has been associated with conformity, pro-social behavior, and social anxiety.
Details
Fear of Failure Phobia – Atychiphobia or Kakorrhaphiophobia
A normal amount of doubt regarding success in certain projects, relationships or examinations is usually present in most people. However, when the fear of failure takes on an extreme form, it is termed atychiphobia. Atychiphobia is also known by several other names, like kakorrhaphiophobia, which also covers the fear of rejection.
Individuals coping with Atychiphobia mainly fear failure because they lack confidence in their abilities. Some experience extreme fear of failure because of the ridicule one might face owing to the failure. Likewise, some suffer from Atychiphobia due to the fear of risk taking. Individuals coping with Atychiphobia often have rigid or unrealistic expectations and/or excessive standards of behavior.
Causes of the fear of failure
Atychiphobia is often linked to traumatic or embarrassing events in one's past. Strict or overly demanding parents, or demeaning siblings or friends, can also lead a child to develop an extreme fear of failure. Minor failures in one's childhood can cause embarrassment or ridicule, which leads to negative thoughts when undertaking other challenges. The fear of failure continues to grow and accumulates as one matures. Add to this the fact that our cultures and societies impose expectations regarding looks, relationships and education, and in general preset definitions and norms of failure and success.
Persons with the fear of failure often give up trying unless they have been guaranteed or assured of perfection in certain tasks.
Symptoms of Atychiphobia or the fear of failure
Atychiphobia can severely affect the quality of life of the person suffering from it. One might even go to great lengths to avoid things that are unlikely to have a favorable ending.
Many coping with the fear of failure give up trying completely, especially where relationships, education or job-related projects are concerned. They believe that the outcome of most of these projects would be imperfect, not realizing that perfection is merely an illusion. Their atychiphobia causes these individuals to quit their jobs and end relationships to avoid the failure therein. The fear of failure can also lead a person to sabotage his/her own life: faking illnesses, making constant excuses and telling blatant lies. This is known to lead to demotions, unemployment, negative reviews and divorces.
Apart from these signs, persons with Atychiphobia also experience several physical symptoms. The worrying thoughts regarding a given task at hand can lead to sleepless nights, tension headaches, muscle pain etc. Instead of focusing on the task at hand, the individual with Atychiphobia spends all his energy worrying about failure. Physical symptoms of Atychiphobia include gastrointestinal distress, headaches, sweating, anxiety and panic attacks, twitching, trembling, and irritability etc.
Treatment of Atychiphobia
Atychiphobia affects both men and women. Medication and drugs are usually the last line of treatment for this kind of phobia, because drugs merely mask the symptoms and do not tackle the problem at its root. Talk therapy, counseling, etc., can help the patient open up about his fears and come up with effective solutions for coping with the stress experienced on being given a task.
The most effective treatment for the fear of failure is self-motivation. Experts recommend breaking a task into smaller, manageable pieces and progressing gradually. This can help patients realize that failure does not mean the end of life; rather, it is a crucial part of the growth process.
If an irrational fear of failure is affecting you or someone you know and is standing in the way of success, then it is time to regain control over life. There are many different treatment options available today, and with their help one can definitely overcome atychiphobia.
1441) Scandinavian literature
Summary
Scandinavian literature or Nordic literature is the literature in the languages of the Nordic countries of Northern Europe. The Nordic countries include Denmark, Finland, Iceland, Norway (including Svalbard), Sweden, and Scandinavia's associated autonomous territories (Åland, Faroe Islands and Greenland). The majority of these nations and regions use North Germanic languages. Although the majority of Finns speak Uralic languages, Finnish history and literature are clearly interrelated with those of both Sweden and Norway, which have shared control of various areas and which have substantial Sami populations and influences.
These peoples have produced an important and influential literature. Henrik Ibsen, a Norwegian playwright, was largely responsible for the popularity of modern realistic drama in Europe, with plays like The Wild Duck and A Doll's House. The Nobel Prize in Literature, itself a Scandinavian award, has been awarded to Selma Lagerlöf, Verner von Heidenstam, Karl Adolph Gjellerup, Henrik Pontoppidan, Knut Hamsun, Sigrid Undset, Erik Axel Karlfeldt, Frans Eemil Sillanpää, Johannes Vilhelm Jensen, Pär Lagerkvist, Halldór Laxness, Nelly Sachs, Eyvind Johnson, Harry Martinson, and Tomas Tranströmer.
Details
Scandinavian literature, also called Nordic literature, is the body of works, both oral and written, produced within Scandinavia in the North Germanic group of languages, in the Finnish language, and, during the Middle Ages, in the Latin language.
Scandinavian literature traditionally consists of works in modern Swedish, Norwegian, Icelandic, Danish, and Faroese, all members of the North Germanic group of languages. The literary works written in these languages show deep-seated common linguistic ties. The Finnish language is unrelated to the North Germanic languages; it belongs instead to the Baltic-Finnic branch of the Finno-Ugric language family and is most closely related to Estonian and Karelian. Because Sweden ruled Finland for more than six centuries, Finnish literature, despite its linguistic differences, became closely intertwined with Swedish literature.
The term Scandinavia traditionally designates the two countries of the Scandinavian Peninsula—Norway and Sweden—and Denmark. Finland and Iceland are frequently called Scandinavian countries on geographic, political, and cultural grounds. The term Nordic is often used today to refer collectively to the Åland Islands, Denmark, Finland, the Faroe Islands, Greenland, Iceland, Norway, and Sweden.
Although the Scandinavian literatures exhibit similarities stemming from close cultural ties, they manifest differences reflective of distinct national institutions and historical and geographic conditions. They are therefore discussed separately under Danish literature, Faroese literature, Icelandic literature, Norwegian literature, and Swedish literature. Works written in Finland in the Swedish language (Finland-Swedish literature) and in the Finnish language are discussed under Finnish literature.
1442) Color blindness
Summary
Color blindness (color vision deficiency) is the decreased ability to see color or differences in color. It can impair tasks such as selecting ripe fruit, choosing clothing, and reading traffic lights. Color blindness may make some academic activities more difficult. However, issues are generally minor, and the colorblind automatically develop adaptations and coping mechanisms. People with total color blindness (achromatopsia) may also be uncomfortable in bright environments and have decreased visual acuity.
The most common cause of color blindness is an inherited problem or variation in the functionality of one or more of the three classes of cone cells in the retina, which mediate color vision. Males are more likely to be color blind than females, because the genes responsible for the most common forms of color blindness are on the X chromosome. Non-color-blind females can carry genes for color blindness and pass them on to their children. Color blindness can also result from physical or chemical damage to the eye, the optic nerve, or parts of the brain. Screening for color blindness is typically done with the Ishihara color test.
There is no cure for color blindness. Diagnosis may allow an individual, or their parents/teachers to actively accommodate the condition. Special lenses such as EnChroma glasses or X-chrom contact lenses may help people with red–green color blindness at some color tasks, but they do not grant the wearer "normal color vision". Mobile apps can help people identify colors.
Red–green color blindness is the most common form, followed by blue–yellow color blindness and total color blindness. Red–green color blindness affects up to 1 in 12 males (8%) and 1 in 200 females (0.5%). The ability to see color also decreases in old age. In certain countries, color blindness may make people ineligible for certain jobs, such as those of aircraft pilots, train drivers, crane operators, and people in the armed forces. The effect of color blindness on artistic ability is controversial, but a number of famous artists are believed to have been color blind.
Details
Colour blindness is the inability to distinguish one or more of the three colours red, green, and blue. Most people with colour vision problems have a weak colour-sensing system rather than a frank loss of colour sensation. In the retina (the light-sensitive layer of tissue that lines the back and sides of the eyeball), humans have three types of cones (the visual cells that function in the perception of colour). One type absorbs light best in wavelengths of blue-violet and another in the wavelengths of green. The third type is most sensitive to longer wavelengths—more sensitive to red. Normal colour vision, when all three cone types are functioning correctly, is known as trichromacy (or trichromatism).
Types of colour blindness
There are several different types of colour blindness, which may be subdivided generally into dichromacy (dichromatism), when only two cone types are functional, and monochromacy (monochromatism), when none or only one type of cone receptor is functional. Dichromatic individuals are ordinarily unable to distinguish between red and green. Blindness to red is known as protanopia, a state in which the red cones are absent, leaving only the cones that absorb blue and green light. Blindness to green is known as deuteranopia, wherein green cones are lacking and blue and red cones are functional. Some persons experience anomalous trichromatic conditions, which involve only minor reductions or weaknesses in colour sensitivity. In protanomaly, for example, sensitivity to red is reduced as a result of abnormalities in the red cone photopigment. In deuteranomaly, in which sensitivity to green is reduced, the green cones are functionally limited. Two forms of blue-yellow colour blindness are known: tritanopia (blindness to blue, usually with the inability to distinguish between blue and yellow), which occurs when blue cones are absent; and tritanomaly (reduced sensitivity to blue), which arises from the abnormal function of blue cones.
Monochromacy, or complete colour blindness, is sometimes accompanied by deficiencies in visual acuity. Such conditions are rare and include achromatopsia (or rod monochromacy; the complete absence of functional cone photopigments) and cone monochromacy (when two of the three cone types are nonfunctional).
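The taxonomy above can be summarized in a small lookup table. The following Python sketch is an illustrative summary aid only, not a clinical reference; the grouping labels and field names are my own:

```python
# Illustrative lookup of the colour-vision deficiencies described above.
COLOR_VISION_TYPES = {
    # dichromacy: one cone class absent
    "protanopia":        {"cone": "red",   "defect": "absent",  "group": "red-green"},
    "deuteranopia":      {"cone": "green", "defect": "absent",  "group": "red-green"},
    "tritanopia":        {"cone": "blue",  "defect": "absent",  "group": "blue-yellow"},
    # anomalous conditions: a cone class present but weakened
    "protanomaly":       {"cone": "red",   "defect": "reduced", "group": "red-green"},
    "deuteranomaly":     {"cone": "green", "defect": "reduced", "group": "red-green"},
    "tritanomaly":       {"cone": "blue",  "defect": "reduced", "group": "blue-yellow"},
    # monochromacy: at most one functional cone class
    "achromatopsia":     {"cone": None,          "defect": "absent", "group": "monochromacy"},
    "cone monochromacy": {"cone": "one of three", "defect": "absent", "group": "monochromacy"},
}

def in_group(group):
    """List the conditions the text places in a given group."""
    return sorted(n for n, t in COLOR_VISION_TYPES.items() if t["group"] == group)

print(in_group("red-green"))
# → ['deuteranomaly', 'deuteranopia', 'protanomaly', 'protanopia']
```

The "-opia" entries are the true dichromacies or monochromacies (a cone class missing), while the "-omaly" entries mark reduced sensitivity, mirroring the distinction drawn in the text.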
Inherited and acquired colour blindness
Hereditary red-green colour blindness occurs mainly in males and Caucasian persons, with about 8 percent of men and 0.5 percent of women of European ancestry inheriting the conditions. Its predominance in males is due to the fact that red-green colour blindness is a gender-linked recessive characteristic, carried on the X chromosome. Hence, the trait for red-green colour blindness is passed from mother to son, from mother to daughter, or from mother and father to daughter. A son who inherits the trait from a carrier mother will be red-green colour blind (males inherit only one X chromosome, directly from the mother). A daughter who inherits the trait from a carrier mother (with a normal father) will have normal colour vision but be a carrier of the trait. A daughter who inherits the trait from both her mother and her father will be red-green colour blind.
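The inheritance pattern just described can be verified by enumerating the possible offspring genotypes. A minimal Python sketch follows; the allele notation ('X' for a normal X chromosome, 'x' for one carrying the trait) and function names are my own illustration:

```python
from itertools import product

def offspring(mother, father):
    """Enumerate the four equally likely children of a cross.
    mother is e.g. 'Xx' (carrier); father is e.g. 'XY' (unaffected) or 'xY'."""
    kids = []
    for m, f in product(mother, father):
        if f == "Y":
            kids.append(("son", m + "Y"))
        else:
            # sort so 'xX' and 'Xx' are written the same way
            kids.append(("daughter", "".join(sorted(m + f))))
    return kids

def affected(genotype):
    """Red-green colour blindness expresses only when no normal X allele is present."""
    return "X" not in genotype.replace("Y", "")

# Carrier mother x unaffected father: half the sons ('xY') are colour blind,
# while daughters are either unaffected ('XX') or carriers ('Xx').
kids = offspring("Xx", "XY")
```

Running the cross with an affected father ('xY') instead shows how a daughter can inherit the trait from both parents and be colour blind ('xx'), matching the cases listed above.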
Blue-yellow colour blindness, by contrast, is an autosomal dominant disorder and therefore is not gender-linked and requires only one copy of the defective gene from either parent to be expressed. Achromatopsia is an autosomal recessive disorder, occurring only when two copies of the defective gene (one from each parent) have been inherited. Persons who inherit colour blindness may show symptoms at birth (congenital colour blindness), or they may become symptomatic later, in childhood or adulthood.
Acquired colour blindness is usually of the blue-yellow type and ranges from mild to severe. Often it is associated with chronic disease, such as macular degeneration, glaucoma, diabetes mellitus, retinitis pigmentosa, or Alzheimer disease. Certain drugs and chemicals can also cause acquired colour blindness.
1443) Ptosis
Summary
Ptosis, also known as blepharoptosis, is a drooping or falling of the upper eyelid. The drooping may be worse after being awake longer when the individual's muscles are tired. This condition is sometimes called "lazy eye", but that term normally refers to the condition amblyopia. If severe enough and left untreated, the drooping eyelid can cause other conditions, such as amblyopia or astigmatism. This is why it is especially important for this disorder to be treated in children at a young age, before it can interfere with vision development.
Details
Ptosis, also called blepharoptosis, is drooping of the upper eyelid. The condition may be congenital or acquired and can cause significant obscuration of vision. In congenital ptosis the muscle that elevates the lid, called the levator palpebrae superioris, is usually absent or imperfectly developed. If severe and not corrected in a timely manner, congenital ptosis can lead to amblyopia and permanent vision loss. Congenital palsy of the third (oculomotor) cranial nerve (which normally stimulates elevation of the upper lid) is a more rare cause of congenital ptosis.
Acquired ptosis has many potential causes, but it is usually due to age-related stretching or displacement of the fibres connecting the levator palpebrae superioris muscle to structures within the upper eyelid. It can also result from muscular diseases (such as muscular dystrophy or myasthenia gravis) or damage to the oculomotor nerve from diabetes, hypertension, atherosclerosis, trauma, or direct compression. In a disorder called Horner syndrome, a slight ptosis occurs in association with a smaller pupil and decreased sweat production on the affected side.
Treatment of persistent blepharoptosis is usually surgical. Depending on the circumstances surrounding the onset of the ptosis, testing may be required to investigate possible underlying causes.
It appears to me that if one wants to make progress in mathematics, one should study the masters and not the pupils. - Niels Henrik Abel.
Nothing is better than reading and gaining more and more knowledge - Stephen William Hawking.
Offline
1444) Etymology
Summary
Etymology is the study of the history of the form of words and, by extension, the origin and evolution of their semantic meaning across time. It is a subfield of historical linguistics, and draws upon comparative semantics, morphology, semiotics, and phonetics.
For languages with a long written history, etymologists make use of texts, and texts about the language, to gather knowledge about how words were used during earlier periods, how they developed in meaning and form, or when and how they entered the language. Etymologists also apply the methods of comparative linguistics to reconstruct information about forms that are too old for any direct information to be available. By analyzing related languages with a technique known as the comparative method, linguists can make inferences about their shared parent language and its vocabulary. In this way, word roots in European languages, for example, can be traced all the way back to the origin of the Indo-European language family.
Even though etymological research originated from the philological tradition, much current etymological research is done on language families where little or no early documentation is available, such as Uralic and Austronesian.
Details
Etymology is the history of a word or word element, including its origins and derivation. Although the etymologizing of proper names appears in the Old Testament and Plato dealt with etymology in his dialogue Cratylus, lack of knowledge of other languages and of the historical developments that languages undergo prevented ancient writers from arriving at the proper etymologies of words.
Modern scientific etymological study is based on the methods and findings of historical and comparative linguistics, the basic principles of which were established by linguists during the 19th century. The general principles involved in present-day etymology are:
1. The earliest form of a word, or word element, must be ascertained, as well as all parallel and related forms.
2. Every sound of a given word, or word element, must be compared with the corresponding sound in the form (often called its etymon) from which it is derived.
3. Any deviation in the previously established phonetic correspondences for the language of which the word is a part must be plausibly and rationally explained.
4. Any shift in meaning that has occurred in the historical transmission of the word must also be explained.
5. Words that present nonnative sounds, or combinations of sounds, that appear isolated in the language, or that demonstrate marked deviation from the usual phonetic correspondences, are probably borrowed rather than inherited, and the language of origin must be determined.
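Principles 2 and 5 above can be illustrated with a toy sound-correspondence check. The table below is a small fragment of a well-known correspondence set (Grimm's law as reflected in Latin versus English initial consonants), and the word pairs are stock textbook examples; both are simplifications chosen by me for illustration only:

```python
# Toy fragment of Latin -> English initial-consonant correspondences
# (Grimm's law reflexes). Real etymological work compares every sound,
# not just the first, and uses far richer data.
CORRESPONDENCES = {
    "p": "f",   # pater ~ father, pes ~ foot
    "t": "th",  # tres ~ three
    "c": "h",   # cornu ~ horn (Latin spells /k/ as 'c')
    "k": "h",
}

def initial_matches(latin, english):
    """Does the Latin word's initial consonant map to the expected English
    reflex under the (fragmentary) correspondence table? A mismatch suggests
    the English word is borrowed rather than inherited (principle 5)."""
    expected = CORRESPONDENCES.get(latin[0])
    return expected is not None and english.startswith(expected)

print(initial_matches("pater", "father"))   # True: regular correspondence
print(initial_matches("tres", "three"))     # True
print(initial_matches("pater", "pattern"))  # False: 'pattern' was borrowed
```

The point of the sketch is the method, not the data: inherited words show regular sound correspondences with their etymons, while borrowings (like "pattern", taken from French/Latin after the sound shift) do not.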
Definition
a) the history of a linguistic form (such as a word) shown by tracing its development since its earliest recorded occurrence in the language where it is found, by tracing its transmission from one language to another, by analyzing it into its component parts, by identifying its cognates in other languages, or by tracing it and its cognates to a common ancestral form in an ancestral language
b) a branch of linguistics concerned with etymologies.
1445) Entomology
Summary
Entomology (from Ancient Greek (entomon) 'insect', and (-logia) 'study of') is the scientific study of insects, a branch of zoology. In the past, the term "insect" was less specific, and the definition of entomology historically also included the study of animals in other arthropod groups, such as arachnids, myriapods, and crustaceans. This wider meaning may still be encountered in informal use.
Like several of the other fields that are categorized within zoology, entomology is a taxon-based category; any form of scientific study in which there is a focus on insect-related inquiries is, by definition, entomology. Entomology therefore overlaps with a cross-section of topics as diverse as molecular genetics, behavior, neuroscience, biomechanics, biochemistry, systematics, physiology, developmental biology, ecology, morphology, and paleontology.
Over 1.3 million insect species have been described, more than two-thirds of all known species. Some insect species date back to around 400 million years ago. They have many kinds of interactions with humans and other forms of life on Earth.
Details
Entomology is a branch of zoology dealing with the scientific study of insects. The Greek word entomon, meaning “notched,” refers to the segmented body plan of the insect. The zoological categories of genetics, taxonomy, morphology, physiology, behaviour, and ecology are included in this field of study. Also included are the applied aspects of economic entomology, which encompasses the harmful and beneficial impact of insects on humans and their activities. Entomology also plays an important role in studies of biodiversity and assessment of environmental quality.
Throughout history the study of insects has intrigued great scientific minds. In the 4th century BCE, the Greek philosopher and scientist Aristotle provided descriptions of insect anatomy, establishing the groundwork for modern entomology. Pliny the Elder added to Aristotle’s list of species. The Italian naturalist Ulisse Aldrovandi published a major treatise, De Animalibus Insectis (“Of Insect Animals”) in 1602. With the aid of the newly developed microscope, the Dutch naturalist Jan Swammerdam was able to observe the minute structures of many insect species. Modern insect classification began in the 18th century. The French biologist René-Antoine Ferchault de Réaumur published the first of six volumes of Mémoires pour servir à l’histoire des insectes (“Memoirs Serving as a History of Insects”) in 1734. Carolus Linnaeus, in Systema Naturae (10th ed., 1758), applied his system of binomial nomenclature to organize the classification of insect species. Entomology emerged as a distinct field of study in the early 19th century, with the publication of such works as the eight-volume British Entomology (1824–39), by John Curtis, and the founding of entomological societies in Paris and London.
The body of knowledge gleaned from the study of insects has enabled modern economic entomologists to develop a wide range of methods for controlling insect pests. Some insects are perceived as threats to humans, both as agents of crop destruction and as disseminators of disease. Methods of integrating pest management, which combine chemical, biological, cultural, and sanitation strategies, have been devised to control the damage done by insects to agricultural products. Benefits from insect studies include improvements in pest-management practices and advances in genetics research. Studies using the vinegar fly (Drosophila melanogaster) have established the foundation and techniques used in virtually all aspects of genetics research conducted today. Insects also have been used in biochemical, developmental, behavioral, environmental, and ecological studies. The many functions that insects perform in the ecosystem, such as the pest control that dragonflies and mantises provide as predators of other insects or the decomposition of organic matter that scavenger insects accelerate, have been elucidated by entomological study. Insects that inhabit streams and other freshwater habitats such as mayflies, caddisflies, and stoneflies are used as biotic indicators of water quality. Insects are also used by forensic entomologists in a wide variety of legal situations that include both civil and criminal cases.
1446) Petrology
Summary
Petrology is the study of rocks - igneous, metamorphic, and sedimentary - and the processes that form and transform them. Mineralogy is the study of the chemistry, crystal structure and physical properties of the mineral constituents of rocks.
Petrology is a scientific study of rocks that deals with their composition, texture, and structure; their occurrence and distribution; and their origin in relation to physicochemical conditions and geologic processes. It is concerned with all three major types of rocks—igneous, metamorphic, and sedimentary. Petrology includes the subdisciplines of experimental petrology and petrography. Experimental petrology involves the laboratory synthesis of rocks for the purpose of ascertaining the physical and chemical conditions under which rock formation occurs. Petrography is the study of rocks in thin section by means of a petrographic microscope (i.e., an instrument that employs polarized light that vibrates in a single plane). Petrography is primarily concerned with the systematic classification and precise description of rocks.
Petrology relies heavily on the principles and methods of mineralogy because most rocks consist of minerals and are formed under the same conditions. Also essential to petrological research is the careful mapping and sampling of rock units, which provide data on regional gradations of rock types and on associations unavailable by other means.
Details
Petrology (from Ancient Greek pétros 'rock' and -logía 'study of') is the branch of geology that studies rocks and the conditions under which they form. Petrology has three subdivisions: igneous, metamorphic, and sedimentary petrology. Igneous and metamorphic petrology are commonly taught together because both make heavy use of chemistry, chemical methods, and phase diagrams. Sedimentary petrology, on the other hand, is commonly taught together with stratigraphy because it deals with the processes that form sedimentary rock.
Background
Lithology was once approximately synonymous with petrography, but in current usage, lithology focuses on macroscopic hand-sample or outcrop-scale description of rocks while petrography is the speciality that deals with microscopic details.
In the petroleum industry, lithology, or more specifically mud logging, is the graphic representation of geological formations being drilled through and drawn on a log called a mud log. As the cuttings are circulated out of the borehole, they are sampled, examined (typically under a 10× microscope) and tested chemically when needed.
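The mud log described above can be pictured as a column that pairs drilled depth intervals with the lithology identified from cuttings. Below is a minimal sketch of that idea in Python; the depth intervals, rock names, and symbols are entirely made up for illustration and do not come from any real well data.

```python
# Toy ASCII mud log: map drilled depth intervals to lithology symbols.
# The intervals and symbols are illustrative, not real well data.
INTERVALS = [
    (0, 120, "sandstone", "."),
    (120, 300, "shale", "-"),
    (300, 380, "limestone", "#"),
]

def render_mud_log(intervals, step=40):
    """Return one text line per sampled depth, like a simplified mud-log column."""
    lines = []
    for top, base, lith, symbol in intervals:
        for depth in range(top, base, step):
            lines.append(f"{depth:>5} m |{symbol * 10}| {lith}")
    return lines

for line in render_mud_log(INTERVALS):
    print(line)
```

A real mud log also records drilling rate, gas readings, and chemical test results alongside the lithology column; this sketch shows only the depth-versus-rock-type idea.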
Methodology
Petrology utilizes the fields of mineralogy, petrography, optical mineralogy, and chemical analysis to describe the composition and texture of rocks. Petrologists also include the principles of geochemistry and geophysics through the study of geochemical trends and cycles and the use of thermodynamic data and experiments in order to better understand the origins of rocks.
Branches
There are three branches of petrology, corresponding to the three types of rocks: igneous, metamorphic, and sedimentary, and another dealing with experimental techniques:
* Igneous petrology focuses on the composition and texture of igneous rocks (rocks such as granite or basalt which have crystallized from molten rock or magma). Igneous rocks include volcanic and plutonic rocks.
* Sedimentary petrology focuses on the composition and texture of sedimentary rocks (rocks such as sandstone, shale, or limestone which consist of pieces or particles derived from other rocks or biological or chemical deposits, and are usually bound together in a matrix of finer material).
* Metamorphic petrology focuses on the composition and texture of metamorphic rocks (rocks such as slate, marble, gneiss, or schist which started out as sedimentary or igneous rocks but which have undergone chemical, mineralogical or textural changes due to extremes of pressure, temperature or both).
* Experimental petrology employs high-pressure, high-temperature apparatus to investigate the geochemistry and phase relations of natural or synthetic materials at elevated pressures and temperatures. Experiments are particularly useful for investigating rocks of the lower crust and upper mantle that rarely survive the journey to the surface in pristine condition. They are also one of the prime sources of information about completely inaccessible rocks such as those in the Earth's lower mantle and in the mantles of the other terrestrial planets and the Moon. The work of experimental petrologists has laid a foundation on which modern understanding of igneous and metamorphic processes has been built.
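The branch structure above amounts to a simple classification: each rock type belongs to the branch that studies it. A toy lookup, using only the example rocks named in the list above, might read:

```python
# Toy lookup pairing the example rocks named above with the petrology
# branch that studies them; purely illustrative.
BRANCH_OF = {
    "granite": "igneous",
    "basalt": "igneous",
    "sandstone": "sedimentary",
    "shale": "sedimentary",
    "limestone": "sedimentary",
    "slate": "metamorphic",
    "marble": "metamorphic",
    "gneiss": "metamorphic",
    "schist": "metamorphic",
}

def branch_for(rock: str) -> str:
    """Return the petrology branch for a rock name, or 'unknown' if unlisted."""
    return BRANCH_OF.get(rock.lower(), "unknown")

print(branch_for("Basalt"))   # igneous
print(branch_for("gneiss"))   # metamorphic
```

Experimental petrology cuts across this scheme, since it synthesizes rocks of any type in the laboratory rather than studying one class in the field.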
1447) Somnambulism
Summary
Sleepwalking, also known as somnambulism or noctambulism, is a phenomenon of combined sleep and wakefulness. It is classified as a sleep disorder belonging to the parasomnia family. It occurs during the slow-wave stage of sleep, in a state of low consciousness, with performance of activities that are usually performed during a state of full consciousness. These activities can be as benign as talking, sitting up in bed, walking to a bathroom, consuming food, and cleaning, or as hazardous as cooking, driving a motor vehicle, making violent gestures, or grabbing at hallucinated objects.
Although sleepwalking cases generally consist of simple, repeated behaviors, there are occasionally reports of people performing complex behaviors while asleep, although their legitimacy is often disputed. Sleepwalkers often have little or no memory of the incident, as their consciousness has altered into a state in which memories are difficult to recall. Although their eyes are open, their expression is dim and glazed over. This may last from 30 seconds to 30 minutes.
Sleepwalking occurs during slow-wave sleep (N3) of non-rapid eye movement sleep (NREM sleep) cycles. It typically occurs within the first third of the night when slow-wave sleep is most prominent. Usually, it will occur once in a night, if at all.
Details
Sleepwalking, also called somnambulism, is a behavioral disorder of sleep in which a person sits up and performs various motor actions, such as standing, walking about, talking, eating, screaming, dressing, going to the bathroom, or even leaving the house. The episode usually ends with the sleepwalker’s returning to sleep, with no subsequent memory of the episode.
Sleepwalking is most common in children, though it may also appear in adolescents and young adults. It occurs only during deep sleep, when dreams are basically absent. Sleepwalking becomes dangerous only when the possibility exists of the sleepwalker’s accidentally injuring himself or herself. Sleepwalking may also occur in persons with post-traumatic stress disorder, whose nightmares terminate in shouting, struggling, or jumping out of bed. These episodes often result in awakening.
Although the causes of sleepwalking are not fully understood, the disorder in many instances appears to run in families and hence may be associated with genetic factors. A study of a family that had been affected by sleepwalking over multiple generations traced the condition to a region of chromosome 20 and revealed that persons carrying the sleepwalking version of this chromosome had a 50 percent chance of transmitting the disorder to their children. The identification of specific genes that contribute to sleepwalking could facilitate diagnosis and treatment of the disorder.
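The 50 percent transmission figure reported in that study supports a quick calculation. Assuming each child of a carrier independently has a 0.5 chance of inheriting the chromosome-20 region (a simplification of real inheritance), the chance that at least one of n children inherits it follows from the complement rule:

```python
# Chance that at least one of n children inherits a trait, given each
# child independently has probability p (0.5 per the study's figure).
# This is a textbook complement-rule calculation, not a clinical model.
def p_at_least_one_carrier(n_children: int, p: float = 0.5) -> float:
    return 1 - (1 - p) ** n_children

for n in (1, 2, 3):
    print(n, p_at_least_one_carrier(n))  # 0.5, 0.75, 0.875
```

So with three children the probability that at least one carries the region is 87.5 percent, even though each individual child's chance stays at 50 percent.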
1448) Anthropology
Summary
Anthropology is the scientific study of humanity, concerned with human behavior, human biology, cultures, societies, and linguistics, in both the present and past, including past human species. Social anthropology studies patterns of behaviour, while cultural anthropology studies cultural meaning, including norms and values. The portmanteau term sociocultural anthropology is commonly used today. Linguistic anthropology studies how language influences social life. Biological or physical anthropology studies the biological development of humans.
Archaeological anthropology, often termed the 'anthropology of the past', studies human activity through investigation of physical evidence. It is considered a branch of anthropology in North America and Asia, while in Europe archaeology is viewed as a discipline in its own right or grouped under other related disciplines, such as history.
Details
Anthropology is the study of humanity through the application of biology, cultural studies, archaeology, linguistics, and other social sciences.
In the most general sense, anthropology is the study of humanity. More specifically, anthropologists study human groups and culture, with a focus on understanding what it means to be human. Toward this goal, anthropologists explore aspects of human biology, evolutionary biology, linguistics, cultural studies, history, economics, and other social sciences.
Anthropology emerged out of the New Imperialism of nineteenth-century Europe. During this time, European explorers came into contact with diverse groups and societies in the Americas and Asia. In the twentieth century, anthropology became increasingly specialized and professionalized as a social science.
Modern anthropology is often divided into four distinct subdisciplines: biological anthropology, cultural anthropology, linguistic anthropology, and archaeology. The four disciplines can be generally characterized as follows: biological anthropology (also known as physical anthropology) is the study of human-environmental adaptation; cultural anthropology is the study of how people develop and use culture as a tool; linguistic anthropology is the study of how people communicate and formulate language; and archaeology is the study of the past through material left behind (also known as artifacts).
While different types of anthropologists conduct different research, they all rely heavily on fieldwork. For archaeologists, this fieldwork involves the excavation of sites where ancient societies once lived. For cultural anthropologists, fieldwork commonly consists of interacting with modern social groups in order to better understand them or their distant ancestors. Anthropologists from different fields also commonly collaborate using their different skills to create a more comprehensive understanding of a particular group.