Posted on 07/10/2004 7:06:11 AM PDT by jalisco555
Charles Darwin's theory of natural selection struck a body-blow to human hubris. We were not, after all, an elevated species, untainted by the vagaries of nature. Instead, we had obtained our exalted powers in the same manner as all other living things - through fortuitous evolutionary adaptations to a natural world characterised by what Darwin called "blind, pitiless indifference".
Natural selection works on us because millions of random mutations occur in our genetic blueprint between one generation and the next. Suppose one of those gives rise to a trait that enhances your capacity to survive some environmental hazard; you live in the tropics, say, and a genetic mutation means that you are born with slightly darker skin than your parents. In that case, you will have a slightly better chance than your paler peers of coping with intense sunlight, and hence surviving to have babies of your own. It is through this incremental matching of mutations with environment that people from the tropics have browner skin than those in cooler climates.
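To make the logic concrete, here is a toy calculation (mine, not the article's) of how even a 1 per cent survival advantage can carry a rare variant through a population over many generations; the starting frequency and the size of the advantage are illustrative assumptions.

```python
# Illustrative sketch only: deterministic spread of a variant (e.g. slightly
# darker skin in the tropics) whose carriers have a 1% survival advantage.
def selection_trajectory(start_freq=0.01, advantage=0.01, generations=1000):
    """Track the frequency of a variant with relative fitness (1 + advantage)."""
    p = start_freq
    for _ in range(generations):
        mean_fitness = p * (1 + advantage) + (1 - p)
        p = p * (1 + advantage) / mean_fitness  # standard one-locus update
    return p

# Starting at 1% of the population, the variant ends up in the vast majority.
print(f"frequency after 1,000 generations: {selection_trajectory():.1%}")  # roughly 99%
```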
But evolution has equipped us with inventive minds that let us mould the environment to our own specifications. We can eliminate the hazards and leave evolution nothing to work on. As we daub ourselves with sunblock creams, there is no longer a selective advantage to brown skin; and thanks to modern medicine, asthma, which used to be a killer, has become a mild inconvenience.
It is easy to argue that many, perhaps most, of us would not have survived to pass on our genes without the benefits of modern technology. Now that we have eliminated many of the worst infectious diseases from our cities, some even say that we are no longer subject to the destiny of natural selection. For 21st-century human beings, could evolution have come to a full stop?
Steve Jones, a geneticist from University College London, is one of the leading proponents of this notion. He calls natural selection a "two-stage exam." First, he says, you need to survive long enough to pass through puberty and become fertile. Second, you need to produce some children who will inherit your genes.
Even in the relatively recent past, both these exams could be hard to pass. Thanks to malnutrition, poor sanitary conditions and widespread disease, about half the babies born in 1850s London died before they reached puberty, and family sizes varied considerably. But in modern Britain, both those factors have changed. Since most people now pass the tests set by evolution with the same grade, says Jones, nature has lost its power to select.
Not everybody agrees that evolution is so easily conquered. Many biologists insist that natural selection is still forcing the pace of human evolution. And in 1999, Christopher Wills from the University of California at San Diego wrote a book entitled Children of Prometheus in which he asserted that, far from stopping human evolution in its tracks, cultural changes are forcing it to accelerate. If he is right, humans just a few thousand years from now could be unrecognisably different from those of today, and some of us could even evolve into separate species.
Can culture drive evolution?
Scientists on both sides agree in principle that cultural changes can affect human evolution. After all, such changes are just another way of altering the environment, and natural selection can respond to any environmental change provided that it lasts long enough. The earliest, best documented and most intriguing cultural change is something that anthropologist Kristen Hawkes from the University of Utah calls the "grandmother effect."
Like others before her, Hawkes was baffled by the age structure of modern human societies. When a woman's childbearing years are over, natural selection should lose interest in her; she has no further chance to pass on her genes so there should be no evolutionary benefit to prolonging her life. But women survive long after they have lost fertility.
It is tempting to attribute this to modern medicine. Life expectancies in the developed world have only recently soared to their current heights. In 19th-century France, for instance, female life expectancy was just 39 years, yet by the late 20th century it had almost doubled. However, Hawkes points out that very high rates of child mortality skewed 19th-century life expectancies. Though many girls died before they reached adulthood, those women who survived to the age of 45 lived on for an average of more than 20 more years - even without the benefits of medicine.
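The arithmetic behind that point is worth spelling out. With made-up but plausible round numbers (these are not the article's figures), heavy child mortality drags the average "life expectancy at birth" far below the lifespan of the adults who actually survive:

```python
# Hypothetical 19th-century-style cohort (illustrative numbers only).
cohort = [
    (0.35, 5),   # 35% die in childhood, around age 5 on average
    (0.10, 25),  # 10% die as young adults
    (0.55, 67),  # 55% reach 45 and live roughly 20+ more years
]
life_expectancy_at_birth = sum(share * age_at_death for share, age_at_death in cohort)
print(f"life expectancy at birth: {life_expectancy_at_birth:.0f} years")  # about 41
```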
What's more, the same strange pattern shows up in societies around the world. However primitive or advanced the culture, about a third of adult women are beyond their childbearing years. Among all the other primates, loss of fertility is quickly followed by loss of life. Why should humans be different?
Hawkes found clues in a modern hunter-gatherer group, the Hadza in northern Tanzania. She noticed that women past childbearing age were very productive when foraging for food, and that they donated their spare supplies to their grandchildren.
This, Hawkes believes, is evidence for one of the oldest instances of a social change that helped drive human evolution. Before about 1.8m years ago, our hominid ancestors were australopithecines, small hairy creatures that walked on two legs, but in most other respects - for instance body and brain size - were more like modern chimpanzees than humans. Then, abruptly, a new and much larger species - Homo erectus - emerged, and Hawkes thinks that this was when grandmothers started to come into their own.
Clues from fossilised skulls, pelvises and long bones of Homo erectus suggest that it took around five years longer to reach maturity than its australopithecine antecedents. Across the animal kingdom, the older you are at maturity, the longer you tend to live. So, says Hawkes, the late maturity of Homo erectus shows that it must have lived considerably longer than the australopithecines.
Behaviour, of course, does not leave fossil evidence, but we do know that Homo erectus appeared at a time when the climate was growing steadily colder and drier. Plants tend to respond to such conditions by investing in deeply buried roots or coating themselves with hard shells, both of which would have made them less accessible as food. Hawkes realised that infant chimps, and by analogy australopithecines, can feed themselves relatively easily on soft fruits. But to dig up deep tubers or break open hard shells would require the help of an adult.
If this climate shift forced early hominids to start eating more difficult food, there would be a tremendous advantage to having a grandmother still alive. While the mother was turning her attention to the next baby in line, a weaned infant could be fed by the older female. For this to work, the grandmother would have to be barren - otherwise, suckling her own children would take precedence over providing food for the weanlings. In other words, a cultural shift in which older females provided food for their grandchildren facilitated a genetic shift to greater longevity after menopause. And while australopithecines, like chimps, were apparently restricted to environments where their food was easy to obtain, Hawkes thinks that the help of their grandmothers is what allowed Homo erectus to spread out of Africa and colonise most of the world.
Other genetic shifts in our archaic ancestors may also have been triggered by cultural effects. For instance, learning to control fire enabled our ancestors to migrate to colder habitats and to detoxify food through cooking. Both developments could have provided new opportunities for natural selection to operate. But there are no really convincing cases until much more recently in human history, with the onset of agriculture. Agricultural influences, rather than affecting the whole species, tend to be more localised. For instance, most northern Europeans have genes which allow them to digest milk, presumably selected for when they began herding cattle around 10,000 years ago. Yet in parts of the world that have no history of dairy herds, most adults are lactose-intolerant.
More dramatic is the effect that the development of agriculture seems to have had on encouraging the spread of infectious diseases, such as malaria. The mosquito that transmits the malarial parasite does not thrive in virgin rainforest. It needs disturbance and the general mêlée that comes with human habitations. Thus malaria was probably not particularly dangerous until a few thousand years ago, when humans living in the tropics began to tear down the forests and build themselves villages, unwittingly providing the mosquito with its perfect habitat, while at the same time clustering together in groups large enough for the disease to spread quickly.
Malaria, especially the cerebral form, is so deadly that even a tiny genetic improvement in your capacity to withstand the disease considerably boosts the chances of passing on your genes. Sure enough, wherever malaria was endemic - from the Mediterranean throughout Africa and Asia - humans evolved genetic strategies to combat the disease.
The best of these seem to involve modifying red blood cells to stop the parasite from entering. But it is a testament to the severity of malaria that the cure comes with a penalty of its own. Inherit a single protective gene from one of your parents, and your modified red blood cells will help repel the parasite. But get a double dose - one from each parent - and you fall prey to deadly forms of anaemia. This is why sickle cell anaemia is so prevalent in Africa, and why another inherited anaemia, thalassaemia, shows up across the tropics.
Genetic analysis confirms that the evolutionary responses to malaria seem to have appeared in the human genome in the past 5,000 years or so, which is the right timescale for agricultural practices to be the culprit.
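The trade-off described above is the textbook case of heterozygote advantage, and it has a simple quantitative consequence: the protective allele settles at an intermediate frequency set by the two penalties. A minimal sketch follows; the fitness costs used are illustrative assumptions, not measured values.

```python
# Balancing selection for the sickle-cell allele (S). Relative fitnesses:
#   AA (no protection): 1 - cost_malaria
#   AS (carrier):       1
#   SS (anaemia):       1 - cost_anaemia
# At equilibrium the S allele frequency is cost_malaria / (cost_malaria + cost_anaemia).
def sickle_equilibrium(cost_malaria, cost_anaemia):
    return cost_malaria / (cost_malaria + cost_anaemia)

# Illustrative costs: malaria trims AA fitness by 15%, sickle-cell anaemia trims SS by 80%.
print(f"predicted S allele frequency: {sickle_equilibrium(0.15, 0.80):.0%}")  # about 16%
```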
Is the rich world now beyond evolution?
Historical evidence throws up many instances of how developments in human society and culture promoted evolution rather than halting it. But does that pattern survive into the modern world? Certainly, people in developing countries are still at great risk from infectious diseases. Malaria kills some 2m people every year. Worse still is the HIV pandemic. The virus would probably have stayed harmlessly in whatever animal was harbouring it if we hadn't started cutting highways through the rainforest and thus given it the opportunity to leap across into humans. In some cases, transmission of HIV was facilitated by modern medicine - during blood transfusions, or vaccinations - and by drug users sharing needles. Unless a cure appears soon, HIV - like malaria - will have its inevitable evolutionary effect, forcing natural selection to pick out those individuals who are slightly more resistant than the rest. For third world communities, at least, evolution is definitely not over.
Developed countries, on the other hand, have wiped out most of the worst killers. Children no longer die from cholera or diphtheria, let alone from malnutrition. In Britain, thanks to our social safety net and advanced medical practices, if a baby reaches the age of six months it has almost a 100 per cent chance of surviving to adulthood. That is why Steve Jones insists that in westernised countries, evolution is effectively over. Our hairstyles and dress sense may change, but our genes, he says, are going nowhere.
Other biologists refuse to accept that something as powerful as natural selection can be overcome by mere human intervention. "As long as some people have more babies than others and die earlier or later you'll have subtle sorts of selection taking place," says Robert Foley, an evolutionary biologist from Cambridge University. "In the 18th century it might have been cholera or measles, and in highly developed worlds it might be the ability to dispose of fat. But it'll always be there."
Take the much quoted fact that in America poor families tend to have more children than rich ones. If every family had the same number of children - whether one or 15 - natural selection would have no power. But if some families have even one or two more children than others, that will surely have its effect. According to Jones, this sort of argument doesn't hold because the difference in family size has nothing to do with genetics. He cites the fact that Palestinian Arab couples living in the territories have an average of more than six children, whereas in Italy the birth rate is little more than one child per family. But the reason for the large Palestinian families is partly political and partly improved healthcare, and the Italian families are small through cultural changes. Genes are not the driving force in either case. Unless the differences in family size come from a genetically inspired trait, they cannot drive Darwinian evolution.
The same goes for Africa. True, African families have many more children than those in western societies, but once again the incentive is cultural, not genetic. Demographers have found that when women in poorer countries obtain the means to control their fertility, family sizes tend to fall abruptly. Where is the genetic selection in that?
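Jones's point - that differences in family size only feed evolution when they track a heritable trait - can be illustrated with a toy simulation (my construction, not anything from the article). When family size is assigned independently of genotype, an allele's frequency merely wanders; tie family size to the allele, even weakly, and its frequency climbs steadily.

```python
import random

def allele_frequency_after(generations=50, pop_size=2000, start_freq=0.5,
                           genetically_linked=False):
    """Toy haploid model: each generation, parents are sampled in proportion
    to a 'family size' weight. If family size varies for cultural reasons
    unrelated to genotype, the allele frequency only wanders by chance; if
    carriers of allele 1 average slightly larger families, it climbs."""
    pop = [1 if random.random() < start_freq else 0 for _ in range(pop_size)]
    for _ in range(generations):
        if genetically_linked:
            weights = [1.1 if allele == 1 else 1.0 for allele in pop]   # carriers: +10% children
        else:
            weights = [random.choice([0.5, 1.5]) for _ in pop]          # big or small family, gene-blind
        pop = random.choices(pop, weights=weights, k=pop_size)
    return sum(pop) / len(pop)

random.seed(1)
print("family size unlinked to genes:", allele_frequency_after())                          # stays near 0.5
print("family size linked to genes:  ", allele_frequency_after(genetically_linked=True))   # climbs towards 1.0
```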
However, there are other ways in which western culture could be changing our environment enough to give natural selection a foothold. David Goldstein, a colleague of Steve Jones's at University College London, is an expert in pharmacogenomics - the science that studies the way that individual variations in our genetic makeup can affect our response to drugs. He points out that adverse reactions to medicines cause more than 100,000 deaths in America every year. Though only some of these are related to genetics, the very drugs that we administer to prevent illness could be providing a new set of selection pressures. Thalidomide is a case in point. Only the babies with the wrong genetic susceptibility were affected by the drug. Modern medicine is shining a spotlight on genetic variations that would otherwise have stayed hidden.
For reasons such as these, the geneticist Alan Templeton from Washington University in St Louis is convinced that evolution is not yet over, though he thinks it is meaningless to ask how fast human evolution is moving, since some traits change rapidly while others hold their ground. For one thing, he says, the diseases that we have combated are not standing still. "Disease organisms are evolving," he says, "and that's having a big selective impact." Our widespread use of antibiotics is leading to super-resistant organisms. And the malarial parasite is evolving strategies to evade the prophylactics we take. As long as diseases keep evolutionary pace with modern treatments, they will still have the capacity to affect our evolutionary development.
Then there is the effect that a western high-fat diet and sedentary lifestyle is having on people's weight. Obesity is particularly severe in American Indians and Pacific islanders, who possess genes that were once useful in helping guard against sudden variations in food supply. The same traits that protected them from starvation in the past are causing ballooning weight and an epidemic of diabetes, now that scarce plantains have been replaced by plentiful hamburgers.
Templeton studies a particular gene related to coronary artery disease, and sees clear evidence that it has been evolving throughout human prehistory. Genetics is too blunt a tool to identify more recent evolutionary changes, but Templeton sees no reason that the driving forces should have weakened. "All these factors tell me that natural selection is just as strong as ever, and that humans are still evolving and will continue to evolve," he says.
Will we become too unfit to survive?
These are powerful arguments, and Jones admits that cultural changes to our environment do seem to be occurring in ways that evolution might manage to seize on. However, he has a counterargument: it has not happened yet. For now, there is no evidence that any of these genetic factors is causing significant numbers of people to die before they can reproduce, or making any difference to the size of their families. Obesity, diabetes, heart disease - all the new epidemics inspired by western culture may well have some genetic component that affects survivability, but at least in their current form the problems they cause tend to appear later in life. "Some people say that the coming generation will have a shorter lifespan than the previous one on obesity grounds," says Jones. "But if people die at 66 rather than 76, evolution won't notice." They will still have had plenty of opportunities to pass on their genetic material, whether "defective" or not.
Alexey Kondrashov from the National Institutes of Health, near Washington DC, thinks Jones is right. He agrees that medicine has effectively shut down natural selection, and this troubles him. "This will be a situation that's unknown in the history of life - and nothing good can come of it," he says. "We will all go straight to hell."
The reason is that without natural selection to weed out problematic mutations, humans won't just stop evolving - we will also start accumulating defective genes. Each of us, says Kondrashov, has around a thousand mutations which we would be better off without. Without natural selection, future generations will have many more. Kondrashov is quick to point out that he is not talking about eugenics, or traits that are of dubious disadvantage. Rather, he is worried that we will build up traits that are clearly a problem - such as mutations that inhibit our ability to process cholesterol in the blood.
A few years ago, Kondrashov even tested his idea using fruit flies, insects beloved of geneticists for their rapid generation time. First he took 100 pairs of flies, each in their own jar, and allowed them to mate and reproduce. Then he selected at random a male and female fly from each brood, shuffled them into new pairs, allowed them to mate and again randomly chose two offspring - one male and one female. The idea was to mimic the process in modern middle America, with neatly isolated couples all producing precisely two children over whom natural selection had no control. (He called it the "middle-class neighbourhood" experiment, and the different populations were labelled MCN1, MCN2 and so on.) After 30 generations, Kondrashov tested the offspring against "control" flies which had been raised more normally, competing with siblings for the available resources so that natural selection could weed out the weaker ones. Sure enough, the middle-class flies were only half as fit as the ones that had experienced natural selection. A build-up of defective mutations had rendered them less able to compete and survive.
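A minimal simulation sketch of the logic behind that result - not Kondrashov's actual protocol - is below. It assumes, purely for illustration, that each offspring picks up on average one new deleterious mutation and that each mutation trims fitness by about 10 per cent; in the "middle-class neighbourhood" population every individual contributes equally to the next generation, while in the control population parents are drawn in proportion to fitness.

```python
import numpy as np

rng = np.random.default_rng(0)
MUTATION_RATE = 1.0      # assumed: mean new deleterious mutations per offspring
COST_PER_MUTATION = 0.1  # assumed: each mutation multiplies fitness by 0.9
POP_SIZE = 200
GENERATIONS = 30

def next_generation(loads, with_selection):
    """loads: per-individual counts of deleterious mutations."""
    if with_selection:
        # Control: parents drawn in proportion to fitness (0.9 ** mutations).
        fitness = (1 - COST_PER_MUTATION) ** loads
        parents = loads[rng.choice(POP_SIZE, size=POP_SIZE, p=fitness / fitness.sum())]
    else:
        # "Middle-class neighbourhood": everyone contributes equally,
        # however many mutations they carry.
        parents = loads
    return parents + rng.poisson(MUTATION_RATE, size=POP_SIZE)

mcn = np.zeros(POP_SIZE, dtype=int)
control = np.zeros(POP_SIZE, dtype=int)
for _ in range(GENERATIONS):
    mcn = next_generation(mcn, with_selection=False)
    control = next_generation(control, with_selection=True)

print("mean mutation load, MCN flies:    ", mcn.mean())      # ~30: one new mutation per generation, none removed
print("mean mutation load, control flies:", control.mean())  # much lower: selection weeds out loaded individuals
```

The real experiment measured competitive fitness rather than a raw mutation count, but the mechanism - deleterious mutations piling up once selection is switched off - is the same.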
So is it true - as pessimistic biologists have long been warning - that modern medicine is weakening our gene pool, and making us unfit for the rigours of any future, less cushioned society? Well, perhaps. But it is also allowing our genes to maintain their variety. "Protecting against common diseases doesn't uniformly preserve bad mutations - it preserves genetic diversity," says Goldstein. "It maintains a system that makes us asthmatic once in a while and brilliant once in a while and just makes us different." And there is also the point that, by changing our environment, we have simply made these "defective" mutations irrelevant. For early humans, a genetic propensity for poor eyesight would have been disastrous - try gathering food or dodging predators when everything is a blur. Today, poor eyesight may be a nuisance, but thanks to modern technology it is no longer remotely hazardous. Jones admits that dependence on such technology might be dangerous if it failed for some reason in the future. Still, he says, "I'd rather go to the NHS for spectacles than live in a dark Darwinian world where you have to starve if you can't see."
On balance, then, it seems that Jones is right that in the relatively narrow confines of western societies, humans are no longer evolving. But that is not to say that we will never evolve again. "We're on the edge of a cliff," says Jones, "on the simple grounds that we're far more abundant in number than we ought to be." However good our healthcare, a sudden outbreak of a new infection could run wild. Whenever we carve our way into a pristine wilderness, we risk unearthing diseases that were better left alone. We have more direct contact with more species of wild animal than at any point in the past, and we are living in huge conurbations and travelling the world in ways that would make any new disease scarily easy to spread.
Take the horrifying Ebola virus which, like HIV, leapt out of the scarred African rainforest into new human hosts. It is just good luck that this particular organism is so swiftly fatal to humans that so far it has killed off its hosts before they could enable it to spread. But for other viruses, it might be different. That, says Sarah Tishkoff from the University of Maryland, is why we are right to be worried about both emerging diseases and the threat of bioterrorism. "Look at how rapidly Aids has spread in just a few decades," she says. "There are infectious diseases out there that could wipe out entire populations. They could be released and all of us could be affected. None of us are immune." If this happened, the power of modern medicine would be neutralised, and natural selection would come back into its savage own.
That is a gloomy prospect, but perhaps we could still count on our human ingenuity to protect us from natural selection even if we were threatened by some new pandemic. "What's special about human beings is that we learn from one another," says Svante Pääbo, a geneticist from the Max Planck Institute in Leipzig. "When we invent something new like cars, we don't wait for evolution to make us better drivers or learn how to cross the street. We have driving schools and we teach our children to look both ways before crossing."
Steve Jones agrees. "The answer to the threat of infectious diseases is not to wait 10,000 years for genetic change. The answer is human ingenuity and forward planning. When it comes to a disease like Aids we can do something that chimps can't. We can use condoms."
The author is a science writer. Her book "Snowball Earth" is published by Bloomsbury.
Think about the deaths in WWII. Don't think in terms of which side they were on - whether American, British, German, Jewish, Russian, Japanese, or Chinese - think instead of what percentage of the alpha males were destroyed prior to having a chance to pass on their genes. Add to that WWI, Korea, Vietnam, and every other war. Aggressive males have been killed off prior to reproduction in great numbers for thousands of years. This is but one aspect of our current evolution. There are many more.
And then there are the DUhmmies....or maybe this just means they're on the path to extinction.
Yeah ... we're "still evolving".
Sometime soon we will evolve into lizard-like creatures.
LOL!
We're not "evolving" and netiher is anything else.
It's not necessary for a male to reproduce directly. If he gives his life so that his tribe can survive, he will pass on his desirable genetics indirectly. Many brave soldiers who attacked knowing they faced certain death passed on their genes by proxy when their tribe continued on. That's why some of us have no problem sacrificing our lives during war. It's been bred into us indirectly. Tribes with a percentage of heroes do really well against tribes with a disposition to surrender and appease. It's why America is a superpower and France is an embarrassment.
Genetics is only raw material; at best it is only half of who we are. The other half is our culture and environment. That lives on even if our best and brightest die on the battlefield.
Once again, you sure showed us with your stunning, detailed and fully scientific refutation.
"Once again, you sure showed us with your stunning, detailed and fully scientific refutation."
No sense of humor. Tsk, tsk.
What's even more humorous is that I have never seen you display "stunning, detailed ... scientific" evidence to support evolution.
Maybe this will cheer you up. Maybe you're evolving into a genius. There. Better now?
Nighty night.
The only folk not evolving are the severely uptight snake-handling creationists that have condemned all them scientist types to hell. Oh well, the gene pool don't need them.
Where have I heard this before??? Hmmm Oh Yes!!!
When der Fuehrer says, "Ve ist der master race"
Ve HEIL! (phhht!) HEIL! (phhht!) Right in der Fuehrer's face...
...Are ve not the supermen
Aryan pure supermen
Ja ve ist der supermen
Super-duper supermen...
Ah, but that doesn't count because you didn't demonstrate when it was a pimple.
OK everybody, who volunteers to step off? I say that whoever complains about overpopulation needs to slit his own throat first and then go on complaining.
The page gives a sequence which starts with an eye spot and ends with a fish eye. There are eight other kinds of eyes (such as a fly's compound eye), but this is the sequence leading towards human eyes.
The sequence meets the criteria stated earlier: each step is a small, plausible change, and each step gives better vision than the one before.
The diagram below shows a simple eye spot. Let us assume it is in the skin of a multicellular creature. It has a dark backing, because that makes vision a bit more directional.
Next, an inward dimple happens under the eye spot. The eye spot begins to be on the surface of a shallow pit or depression. This increases the visual acuity, and also protects the eye spot from damage.
The dimpling continues until the depth of the pit is about equal to its width. This is now much like the eye of a planarian (flatworm).
Next, the rim of the pit begins to constrict. In camera terms, the eye begins to have an "aperture".
At some point - perhaps now, or perhaps later - the pit fills with a clear jelly. This may be a small mutation, or it could just be that the creature is covered by a slime layer anyway. In either case, the jelly or slime helps to hold the shape of the pit, and helps to protect the light sensitive cells from chemical damage. And, the jelly keeps mud out.
The aperture continues to decrease. Visual acuity increases until the aperture gets so small that it begins to shut out too much light. There will come a point when the aperture is the perfect size. A bigger aperture gives worse eyesight, and a smaller one gives worse eyesight. (The exact size that is "perfect" depends on how bright the lighting is.)
This is now much like the eye of a nautilus.
The eye above is a perfect "pinhole camera". It can only be improved by adding a lens.
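The "perfect size" claim for the pinhole stage can be made concrete with a crude optics estimate (my addition, not part of the quoted page): a wider hole blurs the image geometrically, a narrower one blurs it by diffraction, and the sum of the two is smallest at one particular aperture. The wavelength and pit depth below are illustrative assumptions.

```python
import math

WAVELENGTH = 550e-9  # assumed: green light, in metres
PIT_DEPTH = 0.01     # assumed: 1 cm from aperture to the light-sensitive layer

def total_blur(aperture):
    """Crude pinhole model: geometric blur (~aperture width) plus diffraction
    blur (~Airy-disc diameter, 2.44 * wavelength * depth / aperture)."""
    return aperture + 2.44 * WAVELENGTH * PIT_DEPTH / aperture

# Minimising the sum gives the optimum analytically: sqrt(2.44 * wavelength * depth).
best = math.sqrt(2.44 * WAVELENGTH * PIT_DEPTH)
print(f"best aperture: about {best * 1000:.2f} mm")                     # ~0.12 mm
print(total_blur(best / 2) > total_blur(best) < total_blur(best * 2))   # True: wider or narrower is worse
```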
To get a lens, one mutation is needed. The pit must be roofed over with a transparent layer. This mutation is not that strange. First, it could have happened at any time before this stage. (The original eye spot might have been covered.) Second, the transparent layer is useful, to keep a lensless eye from damage. And third, transparent materials are not hard to come by. (The human cornea is made from a protein which is also used elsewhere in the human body.)
So, the next step is the transparent layer becoming a little thicker in the center. Suddenly it isn't just a layer. It is a lens.
Now that the eye has a lens, the aperture is in the wrong place. The eye will be more acute if the lens moves inward, towards the center of curvature of the light-sensitive surface.
The lens continues to move inward. As it moves, the laws of optics say that a thicker and thicker lens is valuable.
Also, the refractive index of the center of the lens changes. This is possible because the lens is made from a mixture of proteins. The ratio of the proteins can be different in different places, so the lens material is not optically uniform. It is common for a biological lens to have a higher refractive index at the center than at the edges. This "graded index" is a very valuable property.
And we're done. This is a fish eye, complete with a spherical graded-index lens, placed at the exact center of the light-sensitive layer. The optical quality is excellent, being "aberration-free" over a 180 degree field of view.
These diagrams, and the analysis about them, are taken from
Nilsson, D.-E. and Pelger, S., "A Pessimistic Estimate of the Time Required for an Eye to Evolve", Proceedings of the Royal Society of London B, 256 (1994), pp. 53-58.
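For context, the paper's headline estimate can be reconstructed in a few lines. The figures below - roughly 1,829 successive 1 per cent changes in eye geometry, heritability 0.5, selection intensity 0.01 and a coefficient of variation of 0.01 - are quoted from memory of Nilsson and Pelger's paper and should be treated as approximate.

```python
import math

# Approximate reconstruction of Nilsson & Pelger's calculation (parameter
# values are from memory; treat the result as an order-of-magnitude figure).
steps_of_one_percent = 1829
total_change = 1.01 ** steps_of_one_percent  # overall morphological change, roughly 8e7-fold

heritability = 0.5
selection_intensity = 0.01
coefficient_of_variation = 0.01

# Standard response to selection: the trait changes by this factor each generation.
per_generation = 1 + heritability * selection_intensity * coefficient_of_variation
generations = math.log(total_change) / math.log(per_generation)
print(f"about {generations:,.0f} generations")  # a few hundred thousand
```

At roughly one generation per year for small aquatic animals, that comes to well under half a million years, which is the point of the paper's "pessimistic" framing.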
Placemarker
If you find a source, I wear a 42 (Plain old "large" where I buy T-shirts) and I'll even pay for yours.
By determining whether or not the scientists proposing and carrying out new procedures have adequately considered what could go wrong and taken adequate precautions to control the downside of their experiments.
From what I've seen, they haven't. They grow experimental genetically engineered crops not in a contained lab, but out in the open where cross-pollination with other plants can occur.
I would probably support the one attempt to use a genetically engineered virus to correct a deadly genetic defect. But they applied the virus and sent the kids home. Since the defect they were trying to correct was deadly anyway, I think that was a worthwhile try. But how certain were they that the virus couldn't have affected other people? I'm not sure how long they kept them in isolation, but apparently it wasn't long enough for any ill effects of the virus to manifest. I hope that before they released them to go home, they were VERY certain they weren't contagious. One bad mistake with a virus and you could really diminish the human race.
SHAZAM!!!
Interesting choice of words. Shazam!!! Well chosen, for that would be magic if it were true.
A = Side of worm.
B = Pimple on worm.
C = 2 Pimples on worm.
D = 2 popped pimples.
E = Acne problem (Pimples showing some fight).
F = Dreaded dual whiteheads.
G = Irritated dual whiteheads.
Evolutionary Clearasil needed!
Totally honest placemarker. 100?