Early one evening late in my second trimester of pregnancy, I was standing in the dairy aisle of the grocery store, with one hand on my back and the other over the kicking baby in my distended belly. A young man approached me, initiated a conversation about the World Cup, and, casually, asked me if I’d like to watch the game with him that weekend. “You’re pretty!” he whispered. I was shocked.
I wasn’t putting out a sexy vibe. (Not at all.) I had assumed that any male attention I received in late pregnancy, including my husband’s, would be friendly, not sexual. Why would a man who is not the expectant father think pregnancy is sexy? But then other women told me similar stories about getting hit on in the third trimester. So I decided to look into it, and it turns out that a study on sexual attraction to pregnancy has recently come out.
A team of Swedish and Italian doctors, led by Emmanuele Jannini and Magnus Enquist, recruited nearly 2,200 men who had joined online fetish groups such as alt.sex.fetish and alt.sex.fetish.breastmilk. A questionnaire asked the respondents about their preferences for pregnant and lactating women. It also asked for the sex and age of each sibling, and whether each was a full sibling or not (a half-sibling or adopted child). Most respondents reported both a pregnancy and a lactation preference. The average age at which respondents became aware of their preference was about 18.
What Jannini and Enquist and their colleagues were searching for was evidence that there was something special about the upbringing of men who are sexually aroused by pregnancy. They knew that a specific stimulus early in life can elicit sexual behavior when an animal reaches sexual maturity. For instance, goats that are raised by sheep are sexually aroused only by sheep. This is called sexual imprinting.
Is it possible that boys who are raised by women who are pregnant for much of their childhoods grow up unusually attracted to pregnant women?
It turns out, what’s good for the goat is good for the guy. The more exposed a man was to his mother being pregnant and breastfeeding when he was between 1.5 and 5 years old, the more likely he is, as an adult, to be sexually attracted to pregnant and breastfeeding women.
A younger sibling is the key to early exposure. The respondents who eroticized pregnancy and breastfeeding had significantly more younger siblings than expected by chance. Respondents with one sibling were older than their sister or brother in 66 percent of cases. Interestingly, siblings born of a different mother did not appear to be related to respondents’ sexual preferences. Only a boy’s own pregnant mother seemed to leave a sexual imprint.
The researchers are careful to point out that Freud’s “oedipal phase,” from about age 3 to 5 or 6, only partially overlaps with the sensitive period suggested by this study’s data. Sexual imprinting is different: it’s driven not by sexual desire but by learning. During a sensitive phase of development, the individual learns what’s “normal” and later seeks sexual partners who resemble his (or her) own parents.
What does this mean for women who are pregnant or plan to be pregnant? It means you may be able to predict how attracted your partner will be to you in late pregnancy. Does he have a sibling born within five years after him? If so, he’s likelier to be turned on by your pregnant self.
As for the guy I met in the dairy aisle, I’d wager he had a younger brother or sister. I’d bet more on getting this right than the winner of the next World Cup.
*If you like this blog, click here for previous posts and here to read a description of my most recent book, Do Gentlemen Really Prefer Blondes?, on the science behind love, sex, and attraction. If you wish, check out my forthcoming book, available October 11, Do Chocolate Lovers Have Sweeter Babies?: The Surprising Science of Pregnancy.
Popular-science writer Pincott (Do Gentlemen Really Prefer Blondes?, 2008, etc.) provides a lively, accessible romp through the science of pregnancy.
Known for her previous research on love and sexual attraction, the author makes a natural transition in her latest. Delving into the science of pregnancy, parenthood and fetal development, she presents her findings with wit, personal anecdotes and playful humor. Eschewing predictable “avoid the shellfish” advice, Pincott provides a science-based trivia collection, drawing from studies in evolutionary psychology, biology, neuroscience, social science, epigenetics and more. She explores topics such as how a woman’s activities might influence her unborn baby’s personality, how pregnancy and motherhood can change the behavior of mothers and fathers, what factors might influence a baby’s gender and why the first hour after a baby’s birth means so much for mother-newborn bonding. Inspired by questions from her own first pregnancy, the author also digs up the answers to common inquiries such as “what does baby’s birth season predict?”; “what can Mozart really do?”; and “will what we eat now influence baby’s tastes later?” Despite the bombardment of information, Pincott presents her research as fun things to contemplate rather than additional things to worry about, so nervous expectant parents can thoroughly enjoy the book.
A fascinating supplement to the typical maternity guide.
For the nine-plus months of pregnancy, I dutifully downed fish oil pills. I had heard all about the virtues of essential fatty acids (especially DHA, docosahexaenoic acid), known collectively as omega-3s, which are found in fish such as salmon and sardines. These fats are involved in the development of new neurons and help form the cell walls — the structural support — of nerve cells. If the healthy brain is like a sponge, then the brain deprived of omega-3 is like a puddle.
In 2007, an enormous study funded by the National Institutes of Health looked at the link between children’s scores on aptitude tests (at ages 6 months to 8 years) and their mothers’ prenatal consumption of fish. The nearly 12,000 expectant women who participated were asked to record how much whole fish they ate, not fish oil supplements. It turned out that the kids whose moms ate fish more than twice weekly during pregnancy were significantly less likely to have low scores on cognitive tests. Low maternal seafood intake (two or fewer servings weekly) was also associated with increased risk of suboptimal outcomes in prosocial behavior, fine motor skills, communication, and social development. This was a huge deal.
Naturally, this study — and smaller studies like it involving whole-fish consumption — inspired millions of pregnant women to focus on fish oil.
Problem is, not many of us want to or can afford to eat fish every day. Fears of mercury and PCB contamination are valid (many varieties of fish, such as tuna, have high levels that are toxic to fetuses). It’s not much of a stretch to say that fish oil pills are a better way to get your daily DHA.
But here’s the interesting part. Everyone has assumed that when it comes to omega-3 fatty acids like DHA, the source — whole fish or fish oil pills — shouldn’t matter. Seems reasonable, but is it?
A few very recent fish oil studies cast doubt:
Results of fish oil pill supplementation range from neutral to negative…
• A review of six clinical trials (1,280 women in total) of fish oil pill supplementation during breastfeeding found no significant difference in children’s neurodevelopment: not in language development, intelligence or problem-solving ability, psychomotor development, motor development, or visual acuity. Only weak evidence, from a single study, favored supplementation for language development at 12 to 24 months and for child attention at five years.
• At the Women’s and Children’s Hospital in Adelaide, Australia, researchers tracked the children of 2,400 women who took DHA-rich fish oil pills in the last trimester of pregnancy. Compared with vegetable oil capsules, the fish oil capsules did not result in improved cognitive or language development in the offspring during early childhood.
Other fish oil pill studies found disturbingly negative results:
• At the Universities of Copenhagen and Chapel Hill, researchers followed 120 Danish women who nursed their babies for four months after birth and took fish oil supplements (or olive oil pills). The children were tested at intervals up to seven years. The higher the early intake, the lower the child scored on speed of information processing, inhibitory control, and working memory tests. Boys whose mothers consumed fish oil had lower prosocial scores relative to the olive oil group.
Meanwhile, these recent studies strengthened the evidence that eating fish is brain-boosting:
• In a study that took place in the Arctic, 154 eleven-year-old Inuit children took standardized tests for memory and verbal learning. Their scores were compared with the levels of DHA present in their cord blood at birth. Children who had higher cord plasma concentrations of DHA at birth achieved significantly higher scores on tests related to recognition memory processing. The source of DHA in their mothers’ diets was fish and marine mammals. Intriguingly, the connection with higher test scores remained intact regardless of seafood-contaminant (PCB and mercury) amounts.
• In a UK study of 217 nine-year-olds, children whose mothers had eaten oily fish in early pregnancy had a reduced risk of hyperactivity, and children whose mothers had eaten fish (whether oily or non-oily) in late pregnancy had a verbal IQ that was 7.55 points higher than that of children whose mothers did not eat fish.
This is what I’d love to see: large studies that compare pregnant and nursing fish-eaters with pill-poppers. Few researchers have tackled this, in part because we assume DHA works the same no matter how we get it, and because DHA from sources other than pills is difficult to measure or isolate. Interestingly, a study at the Norwegian Institute of Public Health compared the height, weight, and head circumference of newborns whose mothers’ main source of DHA was fish with those whose mothers relied on pills. Fish-eaters generally gave birth to larger babies, while fish-oil-pill-poppers had newborns with smaller head circumferences.
Is it possible that fish consumption boosts IQ, but fish oil pills do not?
It’s dumbfounding, the difference in results between whole fish and fish oil. The researchers who found negative results of supplementation on nursing infants speculated about what goes wrong. It may be that early intervention with fish oil pills results in an “environmental mismatch” between prenatal and postnatal life (e.g., a fetus “programmed” in the womb to live in an environment without abundant DHA is thrown off when inundated with these fats later on).
Another theory is that the timing in these recent fish oil pill studies is off. The critical period in which fish oil may influence brain growth may be in the first trimester of pregnancy or toward the end of the first year of life — not during the time periods in which women in these studies were taking fish oil pills. It may be that DHA has a “sweet spot” — an optimum level below and above which may be detrimental to the developing brain. Indeed, when researchers look at fish oil pill supplementation and DHA-deficient premature infants, the results are much rosier.
There’s another compelling explanation of why fish oil pills don’t yield the desired results: DHA doesn’t do its magic alone. Nutrients and proteins in fish and seafood, other than DHA, may be brain-boosters — or at least help us (and our fetuses or babies) to absorb or metabolize DHA better. All the fish oil in the sea can’t compensate for a bad diet.
In the US, a federal advisory recommends that pregnant women eat no more than two servings of fish weekly. This advice may be misguided, given that fish such as salmon and sardines are high in DHA but low in mercury. Pop fish oil pills instead; they’re just as good — that’s been the message. But these recent studies point to a different truth.
Thus the case for fish, the whole fish, and nothing but the fish.
Food for thought.
Forgive me, I believe my one-year-old is the cutest baby ever. Yes, yes, mothers are biased about their own children. As I detail in my new book, certain reward circuits “light up” in parental brains only when looking at their own offspring. But objectively — objectively! — my daughter is adorable.
The little one has “Gerber baby” features: a bulbous forehead, big eyes, luscious cheeks and thighs (and curls). Babies with these qualities are rated as cuter than those with sunken foreheads, small eyes, and large or long chins. Adults smile and gaze longer at them. Attractive infants are perceived to be more sociable, easier to care for, and more competent than their homely peers. They inhibit aggression in adult men. They receive more nurture.
Our baby thrills to the attention, and my husband and I have started to worry that being cute might not lead to anything good. I have a theory that ugly ducklings and tomboys grow up to have richer inner lives. I don’t want a princess.
We want to know: Do the cutest babies turn out to be the most attractive adults?
Conveniently, a recent study by psychologists Gordon Gallup Jr., Marissa Hamilton, and their colleagues addresses this very question. (I love these whimsical studies; they’re motivated by genuine curiosity.) The presumption has been that physical attractiveness remains stable over time. This has been proven from childhood onward: attractive ten-year-olds are likelier to be attractive adults. (Another study found that adult attractiveness can be predicted as early as age five.) But until now no study had tracked attractiveness from infancy.
It’s interesting, how the psychologists went about it. They sifted through high school yearbooks and found forty graduating seniors who featured photos of themselves as infants. Then they asked several hundred college students to rate the individuals — in infancy and in adulthood — for attractiveness.
There was no correlation between attractiveness in infancy and (young) adulthood. Some ugly ducklings turned into swans; some baby swans became ugly ducks. Some gawky, awkward babies remained that way into their senior year of high school. And some beautiful babies kept their glow through the years. This was true of males and females alike. Cuteness — or homeliness — in infancy does not predict future attractiveness.
The study included an interesting side finding: While the raters were likely to agree about which infants were attractive, they often disagreed about which eighteen-year-olds made the cut. Why? The gold standard of baby beauty — the forehead, the eyes, the thighs — is universal. These preferences are hard-wired in us to elicit care and protection, while the perception of adult beauty is tempered by culture.
Cute babies are universal positives. In this light, it’s OK that mine gets attention now. The future will be much less predictable.
Not long ago, a handful of scientists at the University of California at Irvine were curious about why some people live longer than others — even within groups that have similar ethnic and educational backgrounds, similar demographic and disease risk profiles, and exposure to similar stressors in life. At heart, they knew the question is impossible to answer. People are complex. The effects of life events on our genes — what we eat, what we breathe, who we love and how well we’re loved, and so on — are impossible to isolate.
But the scientists had a hunch that some of us had a bad start —beginning in the womb — because our mothers were highly stressed during pregnancy. There’s an avalanche of evidence that women who are under extreme duress in pregnancy have kids who have shorter attention spans, lower IQ, memory deficiencies, and health problems.
Could prenatal stress also set a baby’s life expectancy clock to tick faster?
One way to find out is to look at the genes of people whose mothers were extremely stressed during pregnancy. In each of our cells are DNA-protein complexes called telomeres, which cap the end of chromosomes. Telomeres are like the plastic bit at the end of a shoelace to keep it from unraveling. Each time a cell divides, they become a little shorter. This makes telomeres something of a longevity marker. People with long tips at the end of their DNA strands tend to live longer than people who have short tips. It doesn’t matter how long your shoelace is; what counts is the integrity of the cap.
In the UCI study, researchers recruited volunteers in their twenties. Some were selected because their mothers experienced a horrid event during pregnancy. The scientists weren’t looking for the normal pregnancy stressors — work-life balance, weight gain, fretting about the baby’s health, and so on. They meant extreme stressors: a sudden divorce, a death in the family, a natural disaster, and physical or emotional abuse.
What they found is disturbing.
Compared to the control group (whose moms had relatively stress-free pregnancies), people exposed to their moms’ extreme prenatal stress had significantly shorter telomeres. By our mid-twenties, most of us lose about 60 base pairs of telomere length annually. Not so for people who were exposed to extreme prenatal stress — they lose drastically more telomere length each year. The men had 178 fewer base pairs on average (equivalent to 3.5 additional years of aging). Women had a shocking 295 base-pair deficit (5 years of accelerated aging). It seems that a mother’s prenatal stress hits her daughter harder than her son.
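The base-pair-to-years conversion is simple division. Here is a rough back-of-envelope sketch in Python, using only the figures quoted in this post (the study’s own conversion evidently assumes a slightly different attrition rate, since 178 divided by 60 comes out nearer 3 than 3.5 years):

```python
# Telomere attrition arithmetic, using the figures quoted above.
ANNUAL_ATTRITION_BP = 60  # typical telomere loss per year in young adults

def deficit_to_years(deficit_bp, attrition_bp=ANNUAL_ATTRITION_BP):
    """Express a telomere base-pair deficit as equivalent years of aging."""
    return deficit_bp / attrition_bp

men_extra_years = deficit_to_years(178)    # men's average deficit
women_extra_years = deficit_to_years(295)  # women's average deficit
```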
How does this happen? During pregnancy, stress may alter blood flow, oxygen, and glucose metabolism between mother and baby. High levels of the stress hormone cortisol from the mother flood the placental barrier. Excess cortisol may also slow the production of telomerase, an enzyme that acts as a repair kit for telomeres. Telomerase adds telomeric DNA to shortened telomeres. It regenerates our cells and tissues. Like a fountain of youth, telomerase gives back what time takes away.
So what if you’re on a telomerase-less trajectory?
Here’s the big relief: Your clock doesn’t have to keep ticking so quickly, even if it has been set that way before birth. There’s strong evidence that lifestyle changes can amp up telomerase production. One study found that stress management, counseling, and a healthy diet are associated with higher telomerase activity. Another found that meditation turns up the telomerase dial.
In the research community there’s much interest in the idea that, by maintaining our telomeres, gene therapy might someday reverse or prevent aging if started early enough. Is it possible? As a measure to conceal the abuses of youth, teens could freebase on telomerase.
Oh, the ways to stress out Mom.
• Scientists found that men whose ring fingers are longer than their index fingers are likelier to have longer-than-average penises, at least among Korean men whose flaccid genitals were stretched under anesthesia.
• Studying the files of women who were raped in 1999-2006, French researchers discovered fewer incidences of living sperm than in rape victims of previous generations, which supports the theory that sperm quality is declining.
• Women are likelier to get pregnant if they ovulate from their right-side ovary (visible by ultrasound), especially after two consecutive left-side cycles, inspiring women undergoing fertility treatment to hope for an L-L-R pattern.
• Among women whose fetuses inexplicably died in the third trimester, 64 percent (392/614) had a premonition before their doctors told them. They described a feeling of discomfort, a strange unease, a subconscious understanding that the baby would die. Many described dreaming of dead relatives and of death on the night the baby probably died.
• A recent fMRI study reported that women who had given birth vaginally exhibited greater activation, in response to their baby’s cries, in brain regions involved in the regulation of empathy, arousal, motivation, and reward circuits than women who had not.
• Women who snore loudly and frequently were at high risk for low-birth-weight (relative risk = 2.6 [95% confidence interval = 1.2-5.4]) and fetal-growth-restricted neonates.
• The success of an IVF transfer may in part be predicted by how much glucose medium an embryo “eats” on days 4 and 5. On day 4, female embryos consume significantly more sugar than males.
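The bracketed figures in the snoring result (relative risk = 2.6, 95% confidence interval 1.2-5.4) can be unpacked with a little arithmetic. Below is a minimal sketch of the standard Katz log-scale method for computing such an interval; the 2x2 counts are hypothetical, chosen only so that the relative risk works out to 2.6:

```python
import math

def relative_risk_ci(exposed_events, exposed_total, control_events, control_total):
    """Relative risk with a 95% CI via the standard error of log(RR)."""
    rr = (exposed_events / exposed_total) / (control_events / control_total)
    se = math.sqrt(1 / exposed_events - 1 / exposed_total
                   + 1 / control_events - 1 / control_total)
    lo = math.exp(math.log(rr) - 1.96 * se)
    hi = math.exp(math.log(rr) + 1.96 * se)
    return rr, lo, hi

# Hypothetical counts: 13 of 100 snorers vs. 10 of 200 non-snorers.
rr, lo, hi = relative_risk_ci(13, 100, 10, 200)
```

A confidence interval whose lower bound stays above 1.0, as in the study’s 1.2, is what makes an elevated risk statistically significant.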
Is it any coincidence that the most laidback people I’ve ever met hail from Brazil, land of fish and coconuts?
The mellowness of Brazilians came to mind when I read a study on prenatal stress to be published next month in the International Journal of Neurodevelopmental Medicine. The researchers, including lead author Carlos Galduróz, are biologists at Universidade Federal de São Paulo (in Brazil).
It’s been long known that significant prenatal stress — characterized by a blitz of the stress hormone cortisol — harms a fetus. Prenatal stress results in an increased risk of premature birth and low birthweight. In humans, it’s linked with anxiety, attention deficit disorder, impaired memory, low test scores in childhood, and depressive behavior in adulthood. Rats whose mothers are exposed to extreme stressors are likelier to have impaired motor skills and are slower to learn.
Intriguingly, there’s evidence that the mother’s diet might offset some of these disadvantages. A baby whose stressed-out mom ate “special” foods during pregnancy and lactation may fare better than one whose equally stressed-out mom ate a normal diet.
Galduróz and his colleagues were curious to know if the composition of fat in a prenatal diet might make the difference. So, during the equivalent of the second and third trimesters, they subjected some of the rats in their study to extreme stress — restraint and bright lights for forty-five minutes, three times daily. Some of these pregnant rats were fed a diet high in omega-3 fatty acids, the kind found in salmon, sardines, and other fish. Others were fed a diet high in saturated fatty acid from coconut milk. A third group ate normal rat chow.
As expected, babies of stressed-out moms had lower birth weights. The surprise came three weeks later: Babies whose moms ate fish oil or coconut fat diets during pregnancy and lactation gained weight quickly. So quickly, in fact, that they became the same weight as the babies whose moms weren’t stressed during pregnancy. In other words, fish and coconut fats reversed the impact of low birthweight, a potentially dangerous effect of prenatal stress.
That’s not all.
Babies exposed to prenatal stress were more active (restless) than other pups if their moms were on a regular or coconut-oil diet. Interestingly, if a stressed-out mother was on a fish oil diet, her pups were no more restless than pups of non-stressed moms.
In an earlier study by the same authors, adult rats whose moms ate a coconut fat or fish oil-based diet released fewer stress hormones (a reduced corticosteroid response) than rats whose moms ate a normal diet.
Many studies have shown that fish oil’s omega-3s modulate mood by reducing the stress response. This has been shown in rat studies, and also in many (but not all) human studies. Is it possible that when a mother consumes food containing omega-3s, her babies are less agitated? Are they happier? Of course, rodents express anxiety, neuroticism, and depression differently from human babies. But the healing effect of nutrients is fascinating. Do stressed-out moms on fish-and-coconut diets have happier, healthier babies than their equally stressed peers who don’t eat as well?
For the real possibility that fish and coconut oil have prenatal physical and psychological perks, I link to a favorite recipe here. It’s for moqueca, a stew made of fish and coconut fats, from Bahia, the Coconut Coast of Brazil.
When I was in the second trimester of pregnancy, my husband and I bought a new king-sized mattress. Like all cotton mattresses sold in the U.S., ours had been treated with a flame retardant containing polybrominated diphenyl ethers (PBDEs) and/or organohalogen compounds (OHCs). Flame retardants are also in pillows, car and airplane seats, drapes, rugs, and insulation. They’re in electronic equipment, like TVs, and in the dust on top of TVs. They’re in air and soil and breast milk. Almost all humans have flame retardants flowing through their veins.
Around the same time I got my new mattress (on which I tossed and turned in third trimester), two surprising studies were published on the effects of flame retardants on fetuses and young children.
A group of researchers at the University of Groningen in the Netherlands recruited nearly 70 pregnant women in the third trimester, taking samples of their blood and measuring them for PBDEs and OHCs. Five years later, the children were given standardized developmental tests for motor skills (balance and coordination), cognition (intelligence, spatial skills, control, verbal memory, and attention), and behavior.
The result: PBDEs were correlated with worse performance on fine motor tasks and a shortened attention span. Strikingly, they were also linked with better coordination and visual perception, as well as better (more placid?) behavior. OHCs, meanwhile, were correlated with worse fine motor skills. Oddly, these kids had better visual perception.
Researchers at Columbia University tested for PBDEs in the cord blood of nearly 400 women who delivered their babies at a New York City hospital. These children were given mental and motor development tests in infancy and, later, at four-to-six years. These tests measure memory, problem solving, habituation, language, mathematical concept formation, and object constancy. They also assess ability to manipulate hands and fingers and control and coordinate their movements.
The result: At both age intervals, children who had higher cord blood concentrations of PBDEs scored significantly lower on tests of mental (lower IQ) and motor development. This was particularly evident at age two for motor skills and age four for IQ (nearly 8 points lower for certain PBDEs).
Are flame retardants slowing us down? Correlation is not causation, but there’s a real risk that they do — and researchers have some ideas about how these chemicals have a toxic effect on the brain. OHCs, for instance, have been found to decrease a fetus’s production of thyroid hormone by interfering with thyroid receptors. This leads to an increase in thyroid stimulating hormone (TSH). Brain development in the fetus relies on the precise timing and quantity of thyroid hormone; too much or too little causes developmental delays. High prenatal exposure to TSH is associated with lower IQs – 4 points less on average. During critical developmental periods, PBDEs and OHCs may also have a toxic effect on neurons in the hippocampus, the memory region of the brain, by reducing the number of neurotransmitter receptors.
Infants and toddlers have what researchers call a high “body burden” of flame retardants. Household dust, which floor-playing infants and toddlers encounter constantly, accounts for 80-93 percent of postnatal PBDE exposure, followed by breast milk (however, the benefits of nursing appear to outweigh this drawback; breastfed babies score higher on neurodevelopmental tests).
A disturbing fact is that American kids have levels of PBDEs that are 10 to 1,000 times higher than their peers in Europe or Asia. We produce 1.2 billion pounds of the stuff annually. (Interestingly, the Dutch study, whose subjects had lower levels of prenatal exposure, found no IQ deficit while the U.S. study did.) Consider our nation’s problems: attention deficit disorder, placidity, lower standardized test scores in reading and math.
Are flame retardants making kids dimmer?
The question fires up the imagination. Should pregnant women be advised to avoid, say, dusting and buying new mattresses in the same way we avoid emptying the litter box (to avoid toxoplasmosis)? Are the perceived gains in visual perception real, and, if so, why, and do they come at the expense of other abilities? Are urban kids at a higher risk than average? Are there naturally flame-retardant materials that we can use in lieu of chemicals? More research, especially on American kids, is warranted.
After all, the nightmare scenarios can keep an expectant mom up all night, tossing and turning on her nonflammable mattress.
* If you wish, check out my forthcoming book, Do Chocolate Lovers Have Sweeter Babies: The Surprising Science of Pregnancy.
It’s bad enough that Congressman Anthony Weiner had been taking photos of his naked self and sending them to women who weren’t his wife. It’s worse when we learn that his wife is three months pregnant.
Aha, that’s it! some cynics claim. Now that Weiner’s oats are sowed, he’s exploring new (and, if the twittering teen rumor is real, very green) pastures. It’s only natural.
But is it? Are men really more likely to cheat when their wives are pregnant?
Turns out, the answer is that it depends on the man.
Reviewing the studies of pregnancy and sex, it seems there are three categories of expectant fathers.
- Type Z cheats or wants to cheat (the Weiners).
- Type Y desires his pregnant wife more than ever.
- And then there’s Type X — a man who has a decreased sex drive and a lower risk of cheating on his wife.
The bad news is that at least one study found that, yes, the risk that a given man will cheat on his wife increases during pregnancy, even if he is otherwise satisfied in his marriage. His reasons? He may feel ambivalent about the pregnancy or the changes that go with it. His partner, especially in her first and third trimesters, may not feel like having sex. Her sex drive may diminish. She may think her body is unattractive.
(Incidentally, bodily dissatisfaction happens to be the number one reason why most women have less sex during pregnancy. Most of us think pregnancy is a turn-off for men. That’s a misconception.)
But here’s the good news for pregnant women. Fact is, many men — the majority as found in this study — desire their pregnant partner even more over the course of the pregnancy, even if they aren’t having as much sex as before. They find her as physically attractive as she was prepregnancy, if not more so. These are usually the Type Y guys. Another study found that, while couples had sex less frequently in third trimester, men changed their sexual behavior only if they were older or worried about the safety of the fetus. (Note: Sex does not raise the risk of miscarriage in pregnancies that are not high risk.) Otherwise, men desire sex with their wives just as much.
From an evolutionary perspective, this makes some sense. Women benefited from having their mates around to help support them through pregnancy and childrearing. Sex helps men stick around.
The Type X expectant father – the one with a low sex drive and a lower risk of infidelity – may overlap with Type Ys. These are men who, at some point over the nine months, are afflicted with pregnancy symptoms: nausea, weight gain, mood swings, fatigue, even vomiting. Hormones are the culprit. These men have higher levels of prolactin, a hormone associated with sluggishness, weight gain, and bonding and parental behaviors. Their testosterone levels plummet, making them less combative and sexually aggressive.
There’s an upside to Type Xs. It turns out that these faithful, fattening men display the most fatherly behavior when the baby arrives. As new dads, they’re more likely to hear and respond to their infant’s cries. They’re more compassionate and tolerant. They make better fathers.
One might speculate that Weiner’s Type-Z behavior while his wife is pregnant doesn’t bode well for his fathering instincts. It’s clear that if any hormone is raging in the man, it’s testosterone — not prolactin. He is probably not sharing his wife’s morning sickness and taking turns with her over the toilet.
There’s no crime in what Weiner has done; he’s just another politician more interested in power than paternity. But he is making us a little nauseous.
*If you like this blog, click here for previous posts and here to read a description of my most recent book, Do Gentlemen Really Prefer Blondes?, on the science behind love, sex, and attraction.
A season ago, when my daughter reached the six-month mark, her pediatrician told us to introduce her to a new food every few days and see what she likes. It wasn’t time to wean her, but soon it would be, and supplementation should help the transition. So I lovingly shopped for organic fruits and vegetables: apples, bananas, avocados, peas, and so on. I presented them passively — as items for her to experiment with on her placemat — and actively, by making mmmms, playing airplane, and swallowing the goop and showing her my tongue.
Three months later, we’ve made astonishingly little progress on the solids front. At best, the infant deigns to nibble delicately on peas and lentils. She’ll squish the bits of mango and avocado on her plate and drop them on the floor. She’ll taste a food then whip her head to the other side and bat away the spoon. She wrinkles her nose.
All she really wants to do is nurse. Baby loves to nurse. She cries and cries in the wee hours of the morning because she wants to nurse. She is tall and heavy for her age.
Who’s to blame (at least in part) for her unweanable stubbornness?
It’s not only convenient to blame the father for babies who won’t give up nursing. It’s scientific. There’s evidence.
Here’s how it works, according to a new study by Bernard Crespi, an evolutionary biologist at Simon Fraser University. How much and how long a baby nurses depends in part on her genes. The genes she inherits from her father have an ulterior motive. Paternal genes want the baby to extract as much as possible from the mother.
Paternal genes are thought to influence:
- suckling strength (so the baby extracts as much milk as she can)
- tongue size (a larger tongue is a better suction pump)
- crying (for maternal attention and food)
- appetite and speed of eating
- duration of breastfeeding before weaning
- night-time suckling (results in suppression of periods, which helps delay future pregnancies/siblings)
The genes that influence these behaviors are active only when they come from the dad. This is called genomic imprinting — when only the genes from one parent are expressed. Dad’s genes strongly affect the intensity of infant behavior. Only a tiny percentage of human genes are imprinted.
Dad’s genes are greedy for a good reason. From a biological perspective he has nothing to lose by making sure this particular offspring who carries his genes demands a lot of her mom — including suckling often, crying a lot, and taking a long time to wean. This behavior may be essential to a child’s survival in a setting in which resources are limited. “Weaning” genes have been shaped this way under evolutionary pressure in a premonogamous era.
Mom’s genes, meanwhile, are more moderate. They want the child to survive but dial back the feed controls. They’d prefer for a baby to self-feed and start solids sooner. Mom’s genes push moderation to save resources (time and energy) for her other (or future) offspring. When paternal genes are disabled and maternal genes are active, babies have Prader-Willi syndrome, a condition that manifests as inability to latch and suckle effectively, placidity, and lack of crying or other solicitation for food. These infants wean early because they never really nurse. They fail to thrive.
Demanding, unweanable infants come from dads. At a minimum, paternal genes play a real role in their aggressive eating, crying, and nursing behaviors.
Now that they’re outed, perhaps guilty fathers should be the ones to work the night shift and scrape food off the floor?
A few weeks ago, Israeli neuroscientists Shani Gelstein and Noam Sobel published a study about mind-control properties in human tears. The gist of the research, which enjoyed much media attention, is that women’s tears contain a chemical signal that reduces sexual desire in men. Tears were collected from the cheeks of emotionally distraught women watching sad films and wiped on the upper lips of male volunteers. Compared to men who whiffed a salt solution control, the tear-sniffers not only had a reduced sex drive but also lower testosterone levels and reduced activity in brain areas associated with sexual arousal.
A leading explanation is that chemicals in tears generally reduce male aggression, making them more sympathetic.
How does this work? One theory is that one or more of the hundreds of chemicals in tears has “mind-control” properties, triggering specific predictable behaviors in others. (Here and here I write about how this happens in sweat, too.) One candidate is prolactin, a hormone associated with bonding. When inhaled in a person’s tears, prolactin may affect the sniffer’s hypothalamus, the part of the brain that produces hormones which in turn affect behavior.
Baby tears have not yet been the subject of a study (hopefully soon). But it’s not far-fetched to think that if there are chemicals in the tears of women that affect men, there are also chemical triggers in the tears of babies that affect their caregivers or anyone else who comes into contact with them. These tears may trigger care-giving instincts and reduce aggression toward the screaming infant.
I wonder: Infant abuse is relatively uncommon given how irritating a screaming baby can be. Are the people guilty of this crime more likely to be anosmic (lacking a sense of smell) or to have another form of brain damage that would prevent them from inhaling aggression-reducing signals in the baby’s tears?
Another theory: Kids cry all the time and sometimes it’s hard to tell when they really need attention. Might chemicals in emotional tears direct parents to respond appropriately when there is a real need for attention? Assume these chemical signals are only in emotional tears–not crocodile tears or sleepy-time tears. Do they help us intuitively know when it’s OK to let a child cry it out instead of rushing to soothe her?
Not long ago, people everywhere started to do the “finger game” on a first date. This is not as naughty as it sounds. As I describe in BLONDES, the finger game involves asking your companion for a look at his (or her) right hand. If his ring finger is longer than his index finger it’s a sign of prenatal exposure to high levels of testosterone. People with longer ring (than index) fingers are likely to be more aggressive, better at sports, and more musically inclined. They may have more sex partners in life.
Now you can take the game to the next level: fingerprints.
Take a close look at the ridges on your companion’s fingers. (Actually, they’re best seen under a magnifying lens or photocopied and enlarged.) Most people have slightly more ridges on the fingers of one hand than the other.
More ridges on the right-hand fingers: This indicates higher levels of prenatal testosterone. He or she might master mental rotation – knowing which one of four abstract figures, revolved in three-dimensional mindspace, matches a diagram (a “masculine” task). Right-ridge dominant people are also better at aiming at a target and getting a bull’s eye.
More ridges on the left-hand fingers: This indicates lower levels of prenatal testosterone. He or she may be a whiz at games like word associations, taking a word like clear and coming up with glass then Philip then opera then ghost, or naming as many round objects as she can in three minutes (considered “feminine” tasks). Compared to straight men, gay men have more ridges on their left pinkies and thumbs.
Four or more ridges on the fingers of one hand than the other: This reflects how much stress your companion weathered when he or she was a second-trimester fetus. For instance, researchers found that women who were 14-22 weeks pregnant when an epic ice storm hit Canada were more likely to have babies whose ridge counts varied greatly between hands. In nature, dramatic asymmetry is often a sign that the fetus has been stressed in some way. The more stress, the less symmetry. In fact, those with significantly asymmetric ridge counts between right and left hands were more likely to score lower in language and intellectual development as toddlers. Both fingerprint development and the brain may have been affected by constriction of blood flow to the placenta or stress hormone levels.
Other ridge count studies have also found interesting correlations: a significant difference between the ring and pinky fingers of the right hand is associated with less muscle mass in the lower extremities and a bulked-up upper body, including a thicker waist. A difference of around three or four more ridges between the thumb and pinky fingers is also associated with diabetes later in life. Asymmetries are also connected to cleft lip, dyslexia, schizophrenia, infections, and other prenatal problems.
Around ten weeks after conception is when the bottom (basal) layer of fetal skin outgrows the top (epidermis), and the tension between the two causes the skin to buckle. At this time fingerprints are like wet cement: any disturbance until mid-pregnancy may leave a lifelong impression. At this time the skin and the brain are both made of the same raw material — fetal ectodermal tissue. Any disruptive event in the womb leaves its mark on both. This means that fingerprints give us clues about the brain.
You would like to know more about the minds of the people you date, which is why you’re analyzing their fingers. Of course if you could read their minds, you’d know they think you’re crazy.
Many years ago, scientists first discovered that a large minority of women have Y-chromosome gene sequences in their blood. At first glance, this seems strange. Men are born with Y chromosomes; women are not. The male cells in these women must’ve come from somewhere else.
The most obvious source is a fetus. Nearly every woman who has ever been pregnant or had a baby has cells from her fetus circulating in her bloodstream. These cells filter through the placenta and reside in the mother’s bloodstream and/or organs — including her heart and brain — for the rest of her life. This condition is called microchimerism, named after the Greek chimera, a creature composed of the parts of multiple animals. Pregnancy-related microchimerism explains why women with sons would have Y-chromosome sequences in their blood.
This is fascinating enough. But how do you explain why women without sons also have male cells circulating in their bloodstream?
This was the subject of a study by immunologists at the Fred Hutchinson Cancer Center. They took blood samples from 120 women without sons and found that 21 percent of them had male DNA. Women were then categorized into four groups according to pregnancy history: women with daughters only, spontaneous abortions, induced abortions, and no children/no abortions.
While the number of women bearing male DNA was highest in the groups that had abortions (nearly 80 percent), women who had only girls or no babies (20 percent) also had male cells in their blood. For no apparent reason.
There are other possible reasons why women in the fourth group carried male cells: acquired in the womb from a male twin who didn’t survive, from a miscarriage they did not know about, from their mother via an older brother…
Or through sexual intercourse.
There remains a possibility, however remote, that cells from a lover may be transmitted during sex. Those cells may hang out forever in the recipient’s body, taking residence in any organ. These cells are the imprint of lovers past, a trace of living history.
Might a woman’s bodily fluids enter a cut in a man’s genitals as well? Could men carry around the genes of women they’ve slept with?
The imagination is stirred. What are those foreign cells doing in hearts and minds? Are they wreaking havoc in our heads? Do the cells of former lovers clash? In a science fiction scenario a person could even take a drop of her own blood, isolate a cell from her former boyfriend, and clone him. Then do with him what she will.
The upshot of this research? It’s yet another reason to use a condom.
“You’ve never seen a mother cat with postpartum depression, right?” a woman in my prenatal yoga class asked me. She had a challenging look in her eyes. Before I could respond she rushed to her punchline. “It’s because cats eat their placentas.”
The woman introduced herself as a doula-in-training who prepares placenta on the side. She thought I might be interested.
I learned that placentophagy, the act of eating the afterbirth, is common among other mammals. Animals probably eat it for the extra iron and other nutrients, to deter predators, or possibly to alleviate pain (not to thwart the kitty blues). My fellow yogi is among the small but passionate population of birthing specialists who believe that women should eat their placentas, too — especially to ward off postpartum depression. The placenta is rich in hormones: progesterone, estrogen, cortisol, and others. These hormones originate in the placenta, which means a woman’s levels take a plunge immediately after she gives birth. One theory of why women get depressed after birth is that their hormone levels are low. Eating the placenta, it seems, could raise hormone levels enough to ward off depression.
I once bought a placental cream in New Zealand, and the hormones in it made my face break out in violent pustules. That doesn’t make me want to eat the stuff.
“It’s spongy like liver,” the woman said, going for the hard sell. She could use it in lieu of meat in any dish: a simple sauté, lasagna, meatloaf, anything. “Placenta” means “cake” in Latin because it’s round and flat; she could make it into a burger. If none of this appeals, she could have it freeze-dried, emulsified, and made into capsules.
“Oh, but I’m vegetarian,” I said, moving my eyes reverently in the direction of a Krishna wall hanging. But the doula-in-training was armed with a response. “Placenta,” she said, “isn’t meat that is killed.” She patted me reassuringly. “It’s OK!”
“I’m OK,” I automatically responded, as if already stuffed and passing on seconds. I didn’t want to burst her bubble, but sautéing, stir-frying, or even baking placenta would likely change the molecular structure of the hormones in it. I suspect she’d have difficulty attracting clients if they had to eat their bloody organ raw, sushi-style.
As it turned out, my obstetrician had a difficult time removing my placenta. Once out, I let my eyes linger on the silver platter it was heaped on. Weighing in at about a pound and a half, this grayish bloody sack fed and protected my daughter and manipulated me for the nearly ten months of pregnancy. It removed her waste. “It’s got to be tough,” I thought.
But regret came over me as I watched it leave the room. Should I have kept it, tried it? I reminded myself there is no proof that consumption of the placenta wards off serious depression or even the baby blues. Humans in traditional cultures only very rarely eat the afterbirth. Hippies ate it but chimps won’t. Many ethnic groups, honoring the placenta’s indispensability, bury it ceremoniously.
I admit a medical incinerator is not a respectful end. But neither is a vegetarian’s hostile gut. I hate to be close-minded, but my jaws are locked shut.
At four o’clock in the morning, in the street in front of our home, I nearly lose it. Our three-week old has been crying for ten hours. I’ve wrestled her into a sling and am jumping up and down under a streetlight, singing “Amazing Grace” in agitated bursts.
Things have taken a turn for the worse. Earlier in the day when I lifted the baby up to my face, eyeball to eyeball, she jerked her head away and cried harder. The infant has been rocked and bounced, shushed and swaddled – with increasing force and desperation. It occurs to me that maybe I should ignore her for a spell. I could lay her down on the dewy grass, let her scream at the stars and the moon, while I drop my head in my hands and weep. How sweet the sound.
If there’s a mommy gene, I don’t have it.
Mommy genes! The idea started about fifteen years ago when a doctoral student named Jennifer Brown and her colleagues at Harvard Medical School noticed something wrong with their mouse experiment. Pups were dying. Whole litters, in fact, were wiped out just a day or two after birth. This was strange, because the babies were healthy and so were the mothers. One glance at the mouse cage solved the mystery. Pups were scattered everywhere, shivering and starving, while the mothers nonchalantly went about their business. Normal mother mice round up their brood and feed and lick them. But these dams didn’t give a damn. They acted oblivious to their babies’ frantic squeaks.
The mother mice were specially bred to lack a gene called fosB. Brown and her colleagues had no idea that knocking out fosB would make mice into deadbeat moms, but it apparently does. It turns out that the gene, when activated, creates a protein that turns on other genes and is partly responsible for the function of neurons in the hypothalamus, a region of the brain that controls emotional behavior — including nurture. If you’re a murine mother, just being around your babies usually triggers the fosB gene to express itself. Because mother mice lacking a working copy of the gene are not motherly, fosB hit the headlines as the first “mommy gene.”
Several years later researchers found that genes called Peg1/Mest and Peg3 also have an effect on the motherliness of mice. When scientists disabled these genes the result was similar to the fosB experiment: cold-hearted mothers, empty-bellied pups. Both these genes influence how oxytocin, the “love hormone” behind caressing and nursing and other mothering behavior, is processed in the brain. When oxytocin doesn’t get to where it needs to go, the result is less nurture, more neglect. (Interestingly, in mice and humans, only the Peg1/Mest and Peg3 genes are imprinted, and only the copy inherited from the father is active. This means an afflicted mouse can blame her lack of mothering instinct on her dad. An attentive one can give him credit.)
“More Mommy genes!” the headlines raved. Mice and humans share many of the same genes, so these genes may influence women’s nurturing instincts, too. Perhaps we can test every wannabe mom to see if she has working copies of fosB, Peg1/Mest, and Peg3. Then we’ll know who can soothe babies into submission and who thinks it’s a good idea to leave them to cry under the stars. Perhaps we can use genetic engineering to make us supermoms. No new parent would feel exasperated and hopeless again. Let’s make sure everyone has warm fuzzy mommy genes.
The scientists doing this research never claimed they found mommy genes. That sort of bravado would be embarrassing. Humans are obviously more complex than mice, and our behavior is more nuanced.
To say a gene makes a woman a good mother is a little like saying the carburetor is what makes a plane fly. Sure, the plane wouldn’t get off the ground without the device to blend air and fuel. But to credit the carburetor for flight? What about the wings, the pilot, the fuel? Or even the screws and the steel? And what about air around the plane, and the molecules in it? We can’t give all the credit (or blame) to one widget.
The same goes for “mommy genes.” Sure, genes influence how proteins are transcribed and neurons fire and signals are dispatched and hormones are received and processed, and so on. Every part of this infrastructure supports our nurturing behavior. We may be especially deficient if particular genes are defective or if they malfunction. There’s no doubt that researching these genes gives us valuable information about our nurturing behavior. But it’s likely that any one gene is just a widget in what makes us fly.
What’s a good mommy, anyway? That’s a debate beyond the realm of science. It’s slippery. When my newborn finally falls asleep in my arms, angelically, clutching my pinky, I feel like a good mommy again. It doesn’t require mommy genes.
It takes amazing grace.
Officially known as T. gondii, the toxoplasmosis parasite (or toxo) is a single-celled protozoan transmitted by exposure to cat excrement and by eating raw meat. We can also get it by gardening, eating unwashed fresh veggies and fruit, or walking barefoot on feces-rich soil.
My doctor tests all pregnant women for toxo, as do many doctors in Europe. Infection rates hover around 12 percent in the United States. In Brazil about 67 percent are infected (due to warm climate), in Hungary 59 percent, and in France about 45 percent (for the latter, blame all that steak tartare and pink lamb).
We’ve known for decades that toxo does weird things to the brain because rats infected with the parasite act a bit strange. By strange I mean they’re not only afraid of cat scents, they’re strangely aroused by them. And because they seek out cats, they’re often consumed, and in being consumed they infect the cats, completing toxo’s lifecycle. This is how the parasite perpetuates — by puppeteering. It manipulates rodents to sacrifice themselves to infect other cats and other rats, and so on.
Toxo may also invade and manipulate the human brain, which shares much of the same anatomy and neurotransmitters with rats — although mind control here is different (cats don’t usually eat humans, so there’s no evolutionary pressure on the parasite to tweak its effect on people). Parasitologist Jaroslav Flegr of Charles University in Prague found that people with a latent infection tend to be more apprehensive, guilt-prone, self-doubting, and insecure. They have slower reaction times, especially if they also lack a certain blood protein, and are three times as likely to get into traffic accidents due to impaired attention or reflexes. Infected women tend to be warmer-hearted, dutiful, moralistic, conforming, easy-going, persistent, and more outgoing and promiscuous. Infected men tend to be more jealous, rigid, slow-tempered, rule-flouting, emotionally unstable, and impulsive.
Correlation is not causation, as scientists say when fascinating associations like this arise. But toxo may have an impact on personality and behavior because it causes slight brain inflammation and alters its host’s levels of dopamine, the neurotransmitter associated with reward and anticipation (and also movement). The parasite does this by producing an enzyme called tyrosine hydroxylase, which drives dopamine production.
Dramatic as this sounds, most people are completely oblivious to the fact that toxo haunts their cells. Only pregnant women are commonly tested. And I’m one of them. Because I’m a hypochondriacal life-long cat owner who once worked on a farm, travels extensively, and doesn’t always scrub her veggies vigorously, I’m convinced I’ve been infected.
The nurse doesn’t think it’s an issue. “Not much happens if you’re positive,” she says, and shrugs. Her body language suggests it’s a silly test.
“Unless it’s a recent infection it doesn’t matter. We can tell by the antibodies if you’ve been infected in the last few months. If so, we give you antiparasitic drugs.”
Simple as that.
From a medical perspective, what she says is true. The risk to a fetus depends on the timing of infection and recent infection has the most disastrous consequences. If you happen to become infected with toxoplasmosis while pregnant, or soon before, the parasite or its toxins may cross over the placenta to infect your baby’s nervous system. Babies born to mothers infected in the first half of pregnancy often have shrunken or swollen brains and mental retardation. If infected in the second half, babies may not show symptoms at birth yet central nervous system problems may emerge years later. These babies are at a higher risk of developing schizophrenia — delusions, hallucinations — later in life, likely due to altered levels of dopamine triggered by the parasite.
The good news is that if you’ve been infected for years before pregnancy you probably won’t pass toxo to your baby, nor will you likely have any obvious signs of infection (although cysts form in the brain). According to Dr. Flegr, only an active infection in the mom suggests a causal link between infection and her baby’s temperament. This is because your immune system usually keeps the parasite in check. But don’t think it’s completely asymptomatic.
In the past decade or so, studies have found that moms with dormant toxo infections have more sons (up to two boys for every girl), and those fetuses develop slightly more slowly than other babies. Perhaps there are other side effects that are undocumented.
Reading up on the science of prenatal infection I get reflective. Viruses, bacteria, and other parasites have always entered us — and some, such as our mitochondria (originally bacteria), have become part of us and we cannot live without them. Ancient viruses now exist deactivated or defanged in our DNA (in fact, genes from the placenta are thought to be a legacy of ancient viruses). Some viruses may be reactivated, like half-cured villains released from prison, and are thought to be a cause of cancer. Some invaders, initially dangerous, have converted to commensalism, such as the thousands of good-guy varieties of healthy gut bacteria that make digestion possible. Strange but true: there are more bacterial than human genes in our bodies.
In a way, pregnancy has made me less fixed on the notion that my self is a singular identity over which I have total control. The fetus is me but not me, and she has changed me in ways I can’t yet fathom. The line between self and other is getting fuzzier.
But as philosophical as I get about self and other, me and microbe, my heart still races when I call the nurse to read my test results.
Negative for toxoplasmosis.
I’m relieved. Truth is, the only parasite I really don’t mind carrying is the baby.
Here are the astonishing statistics: 1 in 455 women doesn’t know she’s pregnant until after week twenty, and 1 in 2,500 is oblivious until she actually goes into labor. The latter are known to give birth, without medical assistance and in agonizing pain, in Walmart bathrooms and at proms, in dorm rooms and in their own bathrooms. They had no idea they were pregnant because they had irregular periods, were on birth control pills, were in perimenopause, had menstrual-like bleeding, and/or were overweight and less sensitive to weight gain.
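One back-of-the-envelope way to combine those two figures, assuming the women surprised in labor are a subset of those still unaware at week twenty:

```python
# 1 in 455 women is still unaware at week twenty;
# 1 in 2,500 is unaware all the way to labor.
p_unaware_week20 = 1 / 455
p_unaware_at_labor = 1 / 2500

# Of the women still in the dark at week twenty, the fraction
# who stay in the dark until labor:
fraction = p_unaware_at_labor / p_unaware_week20
print(round(fraction, 3))  # 0.182
```

In other words, under that assumption, roughly one in five or six of the women who miss the first twenty weeks goes on to miss the whole pregnancy.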
But I know what you’re thinking because I’ve thought it too: it’s denial. On some level the ladies must’ve known they were pregnant but couldn’t deal with the reality.
Yet the more I explore the origins of cryptic pregnancy, as the condition is clinically called, the more I realize that denial or mental illness doesn’t explain most of the cases. Only a minority of cryptic pregnancy cases has been attributed to personality disturbances (8 percent) or schizophrenia (5 percent). It appears that most of these women are perfectly sane, educated, and in stable relationships. Quite simply, they do not know they’re pregnant because they have no symptoms — no weight gain, no nausea, and little to no abdominal swelling. Or the symptoms are so subtle as to be easily mistaken for something else.
Every pregnancy is a tug-of-war over resources between Mom and fetus. Each has her own self-interest in mind. Most of the time the tug-of-war ends in a happy equilibrium: Mom provides enough nutrients, but not so much as to handicap herself. But sometimes Mom gets more rope … at the expense of the fetus.
According to evolutionary psychologist Marco Del Giudice, this might happen in a few ways. For one, the fetus might not be putting out enough signals that it exists and needs resources. One way fetuses announce their existence is through HCG, the hormone that makes a home pregnancy test turn positive. In many cases, the higher the HCG, the more severe the morning sickness and other symptoms. A baby that produces scant amounts of HCG might go “under the radar,” failing the pregnancy test and going unnoticed by the mother — physiologically and psychologically. This would mean the baby gets fewer resources than she otherwise would. The lack of HCG signaling in cryptic pregnancies explains why these babies are so often born preterm, underweight, and small for their gestational age. They didn’t ask for more resources from Mom, and they didn’t get any.
There are a few reasons why a baby wouldn’t produce enough HCG. One is chromosomal anomalies; that is, the fetus has a birth defect and is in danger of miscarrying. It’s also possible that an otherwise healthy fetus simply puts out low quantities of the hormone due to a genetic quirk.
Or, here’s an interesting theory: Maybe Mom has stress and relationship problems. In this case, biologically speaking, it may be in the fetus’s best interest for the mother to be completely oblivious to the fact that she’s carrying to prevent being rejected and miscarried, which may happen when a woman is stressed. As Del Giudice points out, in our evolutionary past a woman who did not know she was pregnant and had few to no symptoms could conserve precious energy. She would be able to move freely and eat food of any kind, and as a result be better able to survive in the face of stresses and threats. In this case, babies may put out less HCG or stressed-out moms may unconsciously lower their sensitivity to the hormones.
Seen this way, cryptic pregnancy is an adaptive “emergency” mechanism — essentially, the fetus sensing a threat and striking a bargain with the mother by demanding little and laying low. When the normal stresses of pregnancy might otherwise trigger a miscarriage, this “stealth strategy” allows the fetus to survive.
If you were to take the Trier Social Stress Test (TSST), as nearly one hundred fifty pregnant women did in a study led by Sonja Entringer at the University of California at Irvine, you’d be led to a windowless room with a video camera. There, an assistant would ask you to sit and would hook you up to instruments that measure your vital signs. In the room you’d also find three men and women sitting at a table, waiting for you.
They are your interview panel.
Facing them belly-on, your instructions are to pretend that you’re applying for a job and must deliver a five-minute speech to convince them that you’re right for the position. Someone would say 1-2-3-GO, and you’d start babbling, hopefully coherently. If you have nothing more to say before your time is up, one of your interviewers will blandly instruct you to continue. Run out of words again and twenty seconds of eerie silence will fill the room. And when you’re finally done, you’ll be asked to do a bit of mental math — say, to count down, in increments of thirteen, from a large prime number like 54,499. Before and after the fifteen-minute ordeal, a researcher will enter the room and hand you a swab to collect your saliva for testing.
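For the curious, the serial-subtraction task is simple to sketch, and a few lines of code confirm why 54,499 is a well-chosen starting point (it really is prime, so there are no easy divisibility shortcuts):

```python
def is_prime(n):
    """Trial division; plenty fast for numbers this small."""
    if n < 2:
        return False
    i = 2
    while i * i <= n:
        if n % i == 0:
            return False
        i += 1
    return True

def countdown(start, step=13, count=5):
    """The first few answers a test subject would have to produce."""
    return [start - step * k for k in range(count)]

print(is_prime(54499))   # True
print(countdown(54499))  # [54499, 54486, 54473, 54460, 54447]
```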
Analyzing all the data from their study, including an analysis of body language and hormone levels of women who took the TSST, the UC Irvine researchers confirmed something remarkable: the further along a woman was in her pregnancy, the less stressful she found the stress test. Compared to their stress levels in second trimester (17 weeks), volunteers in their third trimester (31 weeks) had lower blood pressure, slower heart rates, and a smaller spike in the hormone cortisol. Pregnant women also did not stress out as much as nonpregnant controls who took the same tests at the same intervals. This was not the first study to find that pregnant women, especially those in third trimester, are calmer than nonpregnant women under the same (short and moderately stressful) circumstances. But it was the first time that the same women were tracked at different stages of gestation.
So what is it that makes pregnant women more Zen as they approach their due dates? The likely answer is that the body reduces the sensitivity of cortisol receptors, even though baseline levels of the stress hormone are higher. In other words, it takes more stress hormones than usual to get the nervous system all hot and bothered. At the same time, the placenta increases production of an enzyme that changes cortisol to an inactive form, meaning that less of the toxic stuff filters through to the baby. Near the end of pregnancy, probably to calm you down before labor and help you bond with the baby, your body also produces more of the nervous-system-soothing hormones oxytocin and prolactin.
All this is good news for moms who are slammed with short-term mild to moderate stress late in their pregnancies.
But there’s an even bigger surprise to come out of this. You may think this is your body subconsciously protecting the baby at a time of stress. But it’s just as likely that it works the other way around: your baby protecting you (as well as herself), because her placenta is responsible for at least some of the stress-dampening response to cortisol. It’s a beautiful idea — mother and child soothing one another in the face of life’s assaults.
For women not trying to get pregnant, life should be easy. Conception can happen only in the 12-24 hours after ovulation. Sure, sperm may last as long as 3-4 days in the genital tract, hanging around for the egg to arrive. But you’d think not having sex during that 4-5-day window would be sufficient to avoid mishaps. That’s what the rhythm method is — a natural form of birth control that relies on abstinence on fertile days.
But slips happen even among the most careful practitioners of the rhythm method. Some of this may have to do with women not keeping perfect track of their menstrual cycles or having naturally irregular cycles. (I discuss in BLONDES the evolutionary reasons why ovulation is hidden to both women and their partners.) The failure rate for the rhythm method is 25 percent each year (with a perfect-use rate of 9 percent).
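Those yearly rates compound. A rough sketch, assuming each year’s risk is independent of the last, shows what the rhythm method looks like over, say, five years of use:

```python
def cumulative_failure(yearly_rate, years):
    """Chance of at least one unintended pregnancy over `years` years,
    assuming the same independent failure rate each year."""
    return 1 - (1 - yearly_rate) ** years

print(round(cumulative_failure(0.25, 5), 2))  # typical use: 0.76
print(round(cumulative_failure(0.09, 5), 2))  # perfect use: 0.38
```

Even the 9 percent perfect-use rate adds up to better-than-one-in-three odds over five years; typical use pushes past three in four.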
Why so high?
Another reason could be pheromones. The latest issue of my favorite journal, Medical Hypotheses, includes a submission suggesting that pheromones from men may cause early ovulation in women. By invoking an early release of the egg, in advance of the expected fertile window, a man may raise the chances of fertilization. As I mention in BLONDES, studies have found that androstadienone, a testosterone-related compound found in men’s sweat, semen, and saliva, increases the amount of luteinizing hormone in women, thereby triggering ovulation. High-testosterone men may be likelier to have this effect on their lovers. The smell of their sweat alone may do the trick.
As I mentioned in an earlier post, there are other properties in semen that may also trigger early ovulation. For instance, seminal fluid contains follicle stimulating hormone (FSH) and luteinizing hormone (LH), which may coax the ovary to release an egg.
Despite the high failure rate, the Roman Catholic Church continues to promote the rhythm method, now renamed natural family planning (adding cervical mucus and temperature data to the regimen). Problem is, we don’t live in a clockwork universe, nor do we have clockwork bodies.
If you’re like most women you probably think ovulation is something of a meritocracy — that both ovaries do equal work, and that they alternate every cycle.
If by chance you were not taught that the ovaries soldier on left-right-left-right, then you probably think ovulation is random, like a coin toss.
The second scenario is closer to the truth, but it’s not the whole truth. At least not all the time or for most women.
Fact is, your right ovary is likely to ovulate more often than your left. This means that in two consecutive months, the right side is probably the one doing more of the hard work of producing the dominant follicle that could become a baby.
At least this is what multiple studies have found, including here (57.7% of women have right-side ovulation), here (54.5 percent have right-side ovulation), and here (62% of total follicles are on the right), and here (larger, more numerous follicles).
Why is the right ovary often dominant?
Anatomical asymmetries between the left and right sides are thought to be the reason. The left ovarian vein drains into the left renal vein, and the right ovarian vein into the inferior vena cava. The left renal vein is thought to be under higher pressure than the right and therefore drains more slowly. Because the left ovary drains more slowly, the collapsed follicle (called a corpus luteum) takes longer to clear, which diminishes the chance that ovulation will occur on that side the following month. No such bottleneck exists on the right side, which is why successive right-side ovulation is more common. Estradiol and testosterone levels are also higher during a right-side cycle; this may be related to the right ovary’s more efficient plumbing as it flushes lining-plumping hormones into the uterus.
All this leads to some fascinating statistics. For instance, right-sided ovulation favors pregnancy more often than left-sided ovulation (64 percent of pregnancies came from women’s right ovaries), according to a study in Japan that tracked nearly 2,700 natural cycles. Then again, according to another study, odds of pregnancy are best when the dominant follicle develops in the ovary opposite to where ovulation took place in the previous cycle (with pregnancy occurring more often in a right-side cycle that follows a left-side cycle) because the dominant follicles in such cycles are healthier. Even if the right ovary drains faster than the left, the corpus luteum left over from the previous cycle still negatively affects the hormonal health of the dominant follicle. Best to start with a clean slate.
Interestingly, researchers in another study speculate that right-side ovulation dominates for most of a woman’s reproductive years. Toward perimenopause, women are more likely to become left-dominant, presumably because the supply of follicles in the right ovary has diminished.
Apart from ultrasound, there’s no reliable way of telling which ovary you’re ovulating from. (I devote a section of BLONDES to why ovulation is concealed, even to women themselves.) If you think about it, perhaps that’s a good thing.