Author: Jyoti Madhusoodanan

(Don’t) Watch that Mouth: Listen Up by Looking Away


A dish clatters to the floor, and you spin around to view the damage. A friend calls out from beyond your line of sight, and you turn toward the sound. We’re instinctively aware that looking at the source of a sound makes it easier to understand—except when your eyes trick your brain into hearing things.

In a phenomenon known as the McGurk illusion, the syllables you hear sound different if you simultaneously watch a person’s mouth moving in the shape of another syllable. Being aware of this audio-visual trick doesn’t stop your brain from falling for it over and over again, though watching subtitled movies can help a little.

Recently published PLOS ONE research shows that the illusion arises because visual signals reach the auditory cortex, the part of the brain that processes sound, before the corresponding sounds processed by the ears do. Researchers analyzed brain signals in the auditory cortex while volunteers were given a combination of videos and sounds to watch and hear.

The sound clips were of the syllables “ba,” “ga,” “va,” and “tha,” but the mouth movements in the concurrent video weren’t always the same. In some tests, the movements in the video matched the spoken sound perfectly; in others, the sounds were either completely mismatched, like watching a poorly dubbed movie, or just slightly mismatched. Listeners had no trouble identifying the sound they heard in the extreme case of an absolute mismatch, such as a video of “ba” paired with the sound of “tha,” and they did just as well when sound and video lined up perfectly. However, when the mismatch was less obvious, such as “ba” with “va,” participants “heard” what they saw (va), not what was played for them (ba).

When sounds and videos were perfectly matched or mismatched, participants’ brain activity corresponded to the auditory signals. But, when the mismatch wasn’t as obvious, activity in the auditory cortex increased in response to what participants saw, rather than what they heard. More simply, their brains ‘heard’ what they saw, not the sound that was played.

Understanding why the McGurk illusion occurs in the brain isn’t likely to change how we experience the effect (no really, try it for yourself), but the results take us a step closer to learning how we really hear voices in our heads.

Citation: Smith E, Duede S, Hanrahan S, Davis T, House P, et al. (2013) Seeing Is Believing: Neural Representations of Visual Stimuli in Human Auditory Cortex Correlate with Illusory Auditory Perceptions. PLoS ONE 8(9): e73148. doi:10.1371/journal.pone.0073148

Image: Dog looking at and listening to a phonograph, from Wikimedia Commons

Category: Topic Focus

Plant Roots May Lie Beneath Namibian Fairy Circles

Circles of barren land, ranging from one to several feet in diameter, appear and disappear spontaneously in Namibian grasslands. The origins of these ‘fairy circles’ remain obscure, and have been attributed to causes ranging from the fantastic (the poisonous breath of a subterranean dragon) to those backed by more evidence, such as the work of soil termites. A recent PLOS ONE paper suggests another possibility: patterns that emerge during normal plant growth. Author Michael Cramer elaborates on the results of this study:

How did you become interested in studying the Namibian fairy circles, and are similar circles seen elsewhere?

It would be hard not to be intrigued by these mysterious barren circles on the edge of the spectacular Namibian sand sea! These circles are also reminiscent of soil mounds in other places, for example mima mounds in the US, “heuweltjies” in South Africa and “campos de murundus” in South America that have primarily been ascribed to faunal activity. Like fairy circles, these mounds may, however, represent a distinct product of patterns formed by vegetation. My co-author, Nichole Barger, became intrigued by both these phenomena while I was on sabbatical in her lab.

Many other scientific ideas have been proposed to explain the occurrence of these circles. What’s missing from these explanations? 

Any explanation of fairy circles has to provide a plausible mechanism for regular spacing of these relatively large circles in the landscape. The most common explanation to date has been that termites cause the circles. While it is undoubtedly true that ants, termites and other fauna do occur in the circles and may play a role in maintenance of the circles, we suggest that inter-plant competition is the primary cause that drives circle formation. This places plant competition in focus as a possible mechanism for determining the shape, size and distribution of the circles.

What made you think the patterns could be formed by plant growth patterns themselves?

We stood on the shoulders of giants! Previous studies have alluded to vegetation patterning as a possible cause, and other researchers have produced computer models predicting fairy circle occurrence and found that plant growth may play a role. More generally, researchers have increasingly recognized the spatial patterns formed by plants and the fact that this emergent phenomenon is common in arid landscapes. Several groups have produced mathematical models that explain the production of vegetation patterns (gaps, bands and spots) and show that increasing aridity can result in a transition from one pattern to another.

How did you analyze the fairy circles?

We adopted two approaches. We used Google Earth to obtain images of sites across Namibia, analyzed these to determine circle morphological characteristics, and then combined the images with environmental data to predict the distribution of fairy circles. We also performed ground surveys to measure circle morphology and collect soil samples. Soils were sampled at various depths and regular intervals inside and outside the circles and analyzed for water and nutrient contents.

What did you find?

We found that we could predict the distribution of fairy circles with 95% accuracy based on just three variables. Rainfall strongly determined their distribution, and differences in rainfall from year to year may thus explain why circles dynamically appear and disappear in this landscape. The patterns of moisture depletion across the circles are also consistent with plant roots foraging for water in the circle soil. The size and density of the circles are inversely related to resource availability: bigger circles occur in drier areas and where soil nitrogen is lower.
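To make the flavor of such a correlative prediction concrete, here is a minimal sketch, in Python, of how circle presence could be scored from environmental variables. This is a toy logistic model with invented coefficients and invented input values, not the model or data used in the paper; it only illustrates the direction of the reported relationships (circles are likelier where rainfall and soil nitrogen are lower):

```python
# Toy illustration only: NOT the authors' actual model. Coefficients and
# inputs below are invented to show the direction of the reported
# relationships, not their magnitudes.
import math

def presence_probability(rainfall_mm, soil_n_pct, temp_c):
    """Return a 0-1 suitability score from a hand-tuned logistic model."""
    score = 2.0 - 0.02 * rainfall_mm - 3.0 * soil_n_pct + 0.05 * temp_c
    return 1.0 / (1.0 + math.exp(-score))

# An arid, nitrogen-poor site should score higher than a wetter, richer one.
arid_poor = presence_probability(rainfall_mm=60, soil_n_pct=0.05, temp_c=25)
wet_rich = presence_probability(rainfall_mm=400, soil_n_pct=0.3, temp_c=22)
print(arid_poor > wet_rich)  # prints True
```

A real analysis would fit such coefficients to field observations rather than choosing them by hand, which is why a correlative model like this can predict where circles occur without proving what causes them.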

Do the data in this study strengthen previous results or disprove any older explanations for the circles?

Our results corroborate previous findings and extend them, but we have interpreted the results in a novel manner. Since our study was correlative, i.e., we correlated the occurrence of fairy circles with certain environmental conditions, it does not disprove existing hypotheses. Direct experiments that cause fairy circles to form or close up are perhaps the only way to prove or disprove any of these ideas.

Do these results have implications for other ecosystems? For example, could similar ecological conditions cause fairy circles to form in other grasslands around the world?

Circular grass rings do occur in many contexts. For example, Stipagrostis ciliata in the Negev and Muhlenbergia torreyi (ring muhly) in the US (e.g. New Mexico, Utah) form rings. The distinction is that these are much smaller (ca. 1–2 m in diameter) and less regularly spaced than fairy circles. Nevertheless, their origins may have some commonalities with fairy circles. The special circumstance that gives rise to the spectacular Namibian fairy circles may be that the soils are very sandy and homogeneous.

More generally, the fairy circles are an example of how patterns formed by growing plants can create heterogeneous spaces in otherwise homogeneous grassland. Differences in soil moisture or composition across the span of a fairy circle can provide habitat for both grasses and fauna that would otherwise not thrive in this arid environment.

 

Citation: Cramer MD, Barger NN (2013) Are Namibian “Fairy Circles” the Consequence of Self-Organizing Spatial Vegetation Patterning? PLoS ONE 8(8): e70876. doi:10.1371/journal.pone.0070876

Images: fairy circles by Vernon Swanepoel (top); images below from 10.1371/journal.pone.0070876

 

Category: Author Spotlight

Coffee Plants Don’t Like It Hot

Guest blogger Atreyee Bhattacharya is a science correspondent and climate scientist, currently a research affiliate at the Department of Earth and Planetary Sciences, Harvard University. 

Like most people, I like my coffee black and piping hot. Coffee plants, however, may not be as fond of the heat.

When I think about the impact of a changing climate (increased droughts, wilder fluctuations in seasons) and increasing pest activity on food production, my thoughts tend toward crops such as rice, wheat, and corn, and not so much toward wine, chocolate, or coffee, though I probably consume more coffee throughout the day than I do these other staples.

However, two recent papers published in PLOS ONE deliver a double whammy to coffee, or more particularly to the Coffea arabica plant, a species that today accounts for more than 70 percent of the world’s coffee. (Another, less common, species is C. canephora, commonly known as robusta, which has twice the caffeine content.)

In a 2011 study, Juliana Jaramillo from the University of Hannover and her coauthors showed that warming air and land temperatures can change the distribution of the coffee berry borer, Hypothenemus hampei, in East African C. arabica-producing regions.

The borer, a pest that attacks coffee beans, “causes losses exceeding US $500 million annually, and worldwide affects many of the more than 25 million rural households involved in coffee production,” the study reports. A serious infestation can cut coffee production to less than a third of normal yields.

Until about ten years ago, reports of H. hampei attacks on coffee plants growing above 1500 m (the preferred altitude of cultivated and naturally occurring C. arabica) were few and far between. But thanks to a 0.2–0.5 degree Celsius temperature increase in the coffee-growing regions of East Africa, the pests are now found at higher-altitude plantations as well.

As temperatures continue to rise in line with projections from the Intergovernmental Panel on Climate Change (IPCC), coffee borer infestations in this region are likely to spread farther. Increasing temperatures will raise the number of H. hampei generations each year from 1-4.5 to 5-10 or more.

“These outcomes will have serious implications for C. arabica production and livelihoods in East Africa,” caution the authors, adding, “We suggest that the best way to adapt to a rise of temperatures in coffee plantations could be via the introduction of shade trees in sun grown plantations.”

Though C. arabica plants do like to grow in the shade, another study indicates that this protection may still not be enough to combat the threat of warming temperatures. According to this research by Aaron Davis from the Royal Botanic Gardens, Kew, in the United Kingdom, warming temperatures may make several localities within southwest Ethiopia and neighboring regions climatologically ill-suited to growing C. arabica.

“Based on known occurrences and ecological tolerances of Arabica, bioclimatic unsuitability would place populations in peril, leading to severe stress and a high risk of extinction,” write the researchers.

According to their estimates, the most favorable outcome of warming is a 65% decrease in areas with climate suitable for coffee plantations, and at worst, an almost 100% loss of these regions by 2080. In terms of available area for growing coffee, the most favorable outcome is a 38% reduction in suitable space, and at worst a 90% reduction. Neighboring areas could fare even worse by as early as 2020.

Coffee is a 90-billion-dollar industry, but it is an industry that depends on long-term planning. The beans that we grind every morning were planted about 7-10 years ago, and our morning brew a decade hence depends on today’s plantations.

Demand for coffee continues to rise in our ‘coffee culture’, and C. arabica still constitutes about 75-80% of the world’s coffee production. C. arabica is believed to be the first species of coffee to be cultivated, well over a thousand years ago. It epitomizes an incredible journey, and is one beverage that is certainly worth a second thought as rising temperatures threaten its existence.

Read these studies and more on the ecological impacts of climate change in the new PLOS Collection: http://www.ploscollections.org/ecoclimatechange

Citations: Jaramillo J, Muchugu E, Vega FE, Davis A, Borgemeister C, et al. (2011) Some Like It Hot: The Influence and Implications of Climate Change on Coffee Berry Borer (Hypothenemus hampei) and Coffee Production in East Africa. PLoS ONE 6(9): e24528. doi:10.1371/journal.pone.0024528

Davis AP, Gole TW, Baena S, Moat J (2012) The Impact of Climate Change on Indigenous Arabica Coffee (Coffea arabica): Predicting Future Trends and Identifying Priorities. PLoS ONE 7(11): e47981. doi:10.1371/journal.pone.0047981

Images: 

Espresso by Richard Masoner on Flickr 

Distribution of the coffee berry borer (Hypothenemus hampei) in Eastern Africa under current climate. EI values range from 0, indicating that the location’s climate is unsuitable, to 100, indicating a ‘perfect’ climate for the given species. doi:10.1371/journal.pone.0024528.g001

Predicted and actual distribution of indigenous Arabica. Green dots show recorded data-points. Colored areas (yellow to red) show predicted distribution based on modeling. A context map is given in the top left hand corner. doi:10.1371/journal.pone.0047981.g001

Category: Aggregators, Collections, Topic Focus

Announcing the Ecological Impacts of Climate Change Collection

Ecoclimate change collection

Post authored by Collection Curator Ben Bond-Lamberty 

The ecological impacts of climate change are broad and diverse, and include alterations to species’ range limits, plant phenology and growth, carbon and nutrient cycling, as well as biodiversity and extinction risk. Recent PLOS articles have used a variety of experimental and observational approaches to examine these subjects.

Identifying at-risk regions, taxa, and species is a critical first step in adaptation and conservation efforts. A study by Mouillot et al. suggested that rare species deserve particular attention in conservation efforts, as rare species in diverse ecosystems cannot be replaced by other species that fulfill the same ecological functions. At the same time, both rare and more common species experience the ecological impacts of climate change. Foden et al. combined biology and ecology to assess, on a global scale, the climate change vulnerability of birds, amphibians, and corals based on expert assessment and literature surveys. In a more regionally focused study, Gardali et al. assessed climate-change risk for California’s vulnerable bird species.

Birds were also the focus of two studies documenting how particular species can be ‘winners’ or ‘losers’ in a changing climate. Receding glaciers and thus increased breeding habitat have led to population increases for Adélie penguins in the southern Ross Sea. The outlook was more mixed for Pacific western grebes, which have shifted south, perhaps in response to changes in their forage fish prey. Further down the food chain, Suikkanen et al. used thirty years of marine data to infer that climate change and eutrophication drove a trophic shift in Baltic Sea food webs.

Long-term data were also used to study how flowering dates have changed since the mid-19th century. In a study that received extensive media coverage, Ellwood et al. used flowering records initiated as early as 1852 to show that high spring temperatures in 2010 and 2012 resulted in the earliest flowering on record in the eastern United States. The biological pathways through which temperature affects seasonal timing in endotherms were discussed by Caro et al. Two other widely covered studies focused on coffee: one predicting future trends and identifying priorities, the other examining climate change impacts on the plant and one of its important pests. Both examine adaptation possibilities for managing coffee crops over the coming century.

Adaptation and vulnerability were central themes for Guest et al., who reported that corals under thermal stress showed lower bleaching susceptibility at locations that bleached a decade earlier, implying an adaptive or acclimatization response. The molecular mechanisms behind such thermal tolerance were explored by Bellantuono et al.

Finally, the ecological impacts of climate change affect our health, the urban environment, and the agricultural economy. Airborne pollen counts have been increasing across Europe, and Ziello et al. suggest that rising CO2 levels may be influencing this increase. In another study, Meineke et al. used an elegant combination of observation and manipulative experiments to show that urban warming was a key driver of insect pest outbreaks in the southeastern U.S. Rising temperatures are a significant driver for the expanding range of Asian tiger mosquitoes, known vectors for West Nile and other viral infections. Warming was also found to contribute to the decreasing quality of grassland for grazers such as bison and cattle, although the effects are often exerted via complex interactions with other factors.

The broad range of these papers emphasizes not only the multi-faceted impacts of climate change on ecological and human systems, but also the breadth and depth of research on these subjects being reported in the PLOS journals. These journals seem a particularly appropriate venue for the ‘citizen science’ and other long-term data used by many of these studies.

Collection Citation: Ecological Impacts of Climate Change Collection (2013) http://www.ploscollections.org/ecoclimatechange

Image Credit: (Clockwise from top) William Warby. Flickr.com. Thomas Vignaud. PLOS Biology. 2011. 9(4). Colombi et al. PLOS ONE. 2013. Soto-Azat et al. PLOS ONE. 2013.

This Collection is also available on Flipboard, please search “PLOS Collections” to subscribe.

 

Category: Collections, Conferences, Topic Focus

Putting the brakes on blood clots

Whether we get a paper cut or have a bad accident, our bodies respond with a near-universal command: when bleeding, clot. Within seconds of skin being broken, a cascade of cells and proteins align at precise positions to hold the breach. They form a fine mesh to stop blood flow, identify offensive invaders (splinter or microbe?) and recruit cells to clean up the mess. The operation is swift, precise, and for minor injuries, leaves no trace.

For major wounds and during surgeries, though, doctors must use anticoagulant drugs to halt the clotting process and ensure a free flow of blood. However, once an anticoagulant is administered, the only way to restart clotting is to wait for the drug to wear off.

Now, a laser-controlled gold switch could end that wait: researchers have developed a way to switch blood clotting on and off with the flick of a nanoparticle switch. The switch relies on the ability of paired particles to release two different DNA molecules from their surfaces, depending on the wavelength of laser light used to turn the switch on. When released, one piece of DNA binds to thrombin, a key protein in the clotting cascade, and blocks its activity, preventing coagulation. When the complementary DNA piece is released from its nanoparticle, it acts as an antidote, releasing thrombin to restore clotting.

Prior to this advance, there was no way to restore clotting after an anticoagulant was administered. As Kimberly Hamad-Schifferli, senior author on the study, explained in an MIT news release, “It’s like you have a light bulb, and you can turn it on with the switch just fine, but you can’t turn it off. You have to wait for it to burn out.”

The new method developed in this study could provide doctors and researchers with a more precise way to control where and when blood coagulates during surgery and healing. In the MIT news release, Luke Lee, a professor of bioengineering at the University of California at Berkeley (not an author on the study), elaborates, “It’s really a fascinating idea that you can control blood clotting not just one way but by having two different optical antennae to create two-way control. It’s an innovative and creative way to interface with biological systems.”

Citation: de Puig H, Cifuentes Rius A, Flemister D, Baxamusa SH, Hamad-Schifferli K (2013) Selective Light-Triggered Release of DNA from Gold Nanorods Switches Blood Clotting On and Off. PLoS ONE 8(7): e68511. doi:10.1371/journal.pone.0068511

Image: Red blood cells with gold nanorods (yellow dots) on their surfaces. The blue represents a fixing polymer. Credit: Helena de Puig

Category: Topic Focus

Contextualizing the Hobbits

LB1

Some 18,000 years ago, the remote Indonesian island of Flores was home to a population of tiny humans. They stood only about 3.5 feet tall on their large feet, and their skulls housed unusually small brains, approximately the size of a grapefruit. The identity of these ‘hobbits’ has been hotly debated for years: Were they modern humans suffering from a disease, or a new species, Homo floresiensis?

Biological anthropologist Karen Baab first studied a model of LB1, the only skull recovered from the site, at the American Museum of Natural History in 2005. In a recently published PLOS ONE study, she and other researchers compare this specimen to a range of other modern human and extinct hominin skulls to get closer to settling the identity of Homo floresiensis, or ‘Flores man’.

The origins of ‘Flores man’ have been debated for quite a while now. What are the possible origins that are being discussed, and why the uncertainty?

The primary debate has centered on whether LB1 (and possibly the other individuals found on Flores) represents a new species that descended from an extinct species of the genus Homo, or whether it is instead a pathological modern Homo sapiens, i.e., the same species as us. If the Flores remains do in fact represent a distinct species, then the next question is whether they descended from Homo erectus, a species that may be our direct ancestor, or from an even more primitive species. The latter scenario implies an otherwise undocumented migration out of Africa.

What makes it so hard to settle the argument one way or the other?

One of the difficulties in settling this particular argument is that most studies have focused on one or the other of these ideas and compared the Flores remains to either fossil hominins or to pathological modern humans, each using a different set of features. This makes it challenging to compare the alternative hypotheses side-by-side.

What kind of diseases might have caused modern humans to have features similar to these ‘hobbits’?

The three that have been discussed most prominently (and the three we looked at) are microcephaly, endemic hypothyroidism (“cretinism”) and Laron Syndrome. Microcephaly is not a disease per se, but rather a symptom of many different disorders. It refers to having an abnormally small brain and therefore skull. “Cretins” suffer from a lack of thyroid hormone before and after birth, which leads to stunted growth and possibly a slight decrease in skull size. Laron Syndrome individuals produce growth hormone, but their bodies do not properly recognize it, again leading to stunted growth and other developmental issues.

Only a few specimens of this hominin have been found, and there’s only one known skull, from the specimen named LB1. Are there reasons why these specimens have not been discovered elsewhere?

If Homo floresiensis descended from Homo erectus, then their closest relative lived just “next door” on the nearby island of Java. In this case, the unique features of Homo floresiensis probably evolved in the isolated island environment of Flores. If, however, the ancestor was a more primitive species and Homo floresiensis didn’t branch off from H. erectus, they may have migrated out of Africa earlier than currently documented, and we could still find older sites in mainland Asia containing this ancestral species.

Liang Bua cave

You compared the morphology of the LB1 skull to many hominin ancestors and modern human populations from around the world. What were some of the most striking similarities and differences?

The LB1 skull is very distinct from the typical modern human’s, as it has a lower, more elongated silhouette when viewed from the side, greater width at the rear of the braincase, and a flatter frontal bone (the bone underlying the forehead) with a more pronounced brow ridge. Interestingly, these are some of the same features that distinguish archaic species like Homo erectus from modern humans.

Specimens of Laron Syndrome and “cretin” skulls from modern Homo sapiens presented large, round, globular braincases, which are very different from the smaller, lower and less rounded braincase of LB1. The microcephalic human skulls present a closer comparison to LB1, but still show clear distinctions from LB1 in much the same way that they differ from species like Homo erectus or Homo habilis.

Overall, the LB1 braincase is most similar in shape to small-brained Homo erectus from Eurasia that are 1.8 million years old.

How does this analysis add to, or change, what we knew about Flores man? 

This analysis provides a unique opportunity to evaluate these evolutionary and pathological hypotheses side-by-side based on the same criterion: cranial shape similarity. The results support a stronger affiliation of LB1 with fossil Homo than with any of the proposed pathologies. This study also offers an improvement over previous assessments of the microcephaly hypothesis by using a more extensive sample that better captures the variability in this disorder.

Do these results conclusively settle the discussion? What other possibilities still exist for the origins of H. floresiensis?

While very little in paleoanthropology is ever “settled,” I do think this study represents an important step forward in terms of putting the pathological hypotheses to rest. The question that remains to be answered definitively is which species of archaic Homo is the most likely ancestor of Homo floresiensis: Homo erectus or an earlier and more primitive species of Homo?

Citation: Baab KL, McNulty KP, Harvati K (2013) Homo floresiensis Contextualized: A Geometric Morphometric Comparative Analysis of Fossil and Pathological Human Samples. PLoS ONE 8(7): e69119. doi:10.1371/journal.pone.0069119

Images: Homo floresiensis by Ryan Somma; cave where the remains of Homo floresiensis were discovered in 2003, Liang Bua, Flores, Indonesia, by Rosino

 

Category: Author Spotlight

Malaria, Tuberculosis Caused Death on the Ancient Nile

 

Nile

Southwest of Cairo, the Nile branches into a network of canals that feed Fayum, a fertile agricultural basin that was a center of civilization and royal pyramid-building for several centuries. The unusual geology responsible for Fayum’s rich terrain may have also led to the prevalence of malaria and tuberculosis in the region during these ancient times.

Ancient DNA (aDNA) from sixteen mummified heads recovered from the region reveals that at least four of these individuals suffered both these infections simultaneously. Many of the others showed signs of infection with either malaria or tuberculosis, as scientists report in a recent PLOS ONE study.

DNA extracted from muscle tissue samples was tested for the presence of two genes specific to Plasmodium falciparum, the malarial parasite, and another gene specific to Mycobacteria, which cause tuberculosis. Two samples tested positive for DNA specific to Plasmodium, one tested positive for the mycobacterial gene, and four individuals tested positive for DNA from both infectious agents, suggesting they suffered both infections together while alive. A previous study suggests that both malaria and tuberculosis were rampant in the Fayum region in the early 19th century, but the age of these mummified samples extends evidence of these diseases in Lower Egypt as far back as approximately 800 B.C.

The World Health Organization estimates that malaria is almost non-existent in the Fayum basin and the rest of Egypt now, but before its eradication, high levels of infection were seen in certain parts of the country, and were strongly linked to certain geological features. The lakes and canals that made the Fayum region so fertile also served as breeding grounds for the mosquito that carries the malarial parasite.

The heads tested here (all were missing their bodies) were recovered from a village cemetery on the west bank of the lower Nile, and date from about 1064 BC to 300 AD, a period marked by an agricultural boom and dense crowding in the region, especially under the rule of the Ptolemies. These conditions may have increased the incidence and spread of tuberculosis. As the aDNA from these mummified heads attests, these living conditions and the unique irrigation of the Fayum basin likely created a harbor for both malaria and tuberculosis in ancient populations of this region.

Citation: Lalremruata A, Ball M, Bianucci R, Welte B, Nerlich AG, et al. (2013) Molecular Identification of Falciparum Malaria and Human Tuberculosis Co-Infections in Mummies from the Fayum Depression (Lower Egypt). PLoS ONE 8(4): e60307. doi:10.1371/journal.pone.0060307

Image: Sailing on the Nile by David Corcoran

Category: Topic Focus

Moms and babies respond to childbirth with different stress hormones

stress

A quick internet search reveals that many women rank giving birth as one of the most painful human experiences. Though pain can be hard to quantify objectively, the physiological stress of childbirth is clinically assessed by measuring blood levels of the stress hormone cortisol.

Cortisol is currently used to estimate the stress experienced by both mother and child during the process of giving birth, but recently published PLOS ONE research suggests that a different stress hormone, corticosterone, may be a more accurate way to measure the stress experienced by healthy, full-term babies.

For their study, researchers tested fetal levels of cortisol and corticosterone in 265 samples of umbilical cord blood from healthy deliveries. Though the total levels of cortisol detected were higher than corticosterone levels, fetuses produced the latter at a greater rate in response to the stress of labor and delivery. Newborns secreted more corticosterone when a Caesarean section was performed due to complications during labor than they did after a routine C-section. Fetal corticosterone levels were also higher after passage through the birth canal. These differences were not seen in cortisol production. Based on these data, the authors suggest that the full-term fetus is more likely to secrete corticosterone than cortisol in response to stress, and hence that corticosterone may be a more accurate clinical biomarker of fetal stress.

Corticosterone isn’t unheard of in the adult world: we continue to make the hormone throughout our lives, though in a much smaller proportion relative to cortisol. When babies switch to producing more cortisol than corticosterone isn’t yet clear, but the developmental changes involved may help track or diagnose adrenal gland function in newborns.

Citation: Wynne-Edwards KE, Edwards HE, Hancock TM (2013) The Human Fetus Preferentially Secretes Corticosterone, Rather than Cortisol, in Response to Intra-Partum Stressors. PLoS ONE 8(6): e63684. doi:10.1371/journal.pone.0063684

Image: stress by topgold


Hairy, Sticky Leg Pads are In: How Different Spiders Hunt


Spiders are everywhere (arachnophobes, stop reading now). They’re among the most successful predators on earth today, colonizing nearly every terrestrial habitat (that is, not just ceiling corners and under beds), occasionally in numbers large enough to take over small islands. Spider silk may be strong enough to stop a speeding train, and some webs, ten times stronger than Kevlar, are large enough to span rivers in tropical rainforests.

But more than half of today’s spider species don’t rely on webs or silk to capture their prey. Instead, these hunting spiders have evolved hairy adhesive pads on their legs to grab and hold down struggling prey, according to a recently published PLOS ONE study. The adhesive pads, called scopulae, had been observed in many spider species, but until now it wasn’t clear whether they occur in all species or mainly in hunting spiders.

In this study, researchers used a phylogenetic analysis of spider family trees to correlate different species’ prey-capture strategies with the presence or absence of adhesive pads on their legs. They found that most spiders are either web builders or free-ranging hunters, and that the latter most often have adhesive hairs on their legs (apart from these two groups, at least one rare variety may be mostly vegetarian). Nearly 83% of hunting spiders had adhesive bristles on their legs, compared with 1.1% of web-building varieties. Most of these hunters had either never developed silk-dependent strategies to capture prey or had abandoned web-building for hunting.

Why would so many spiders abandon an obviously successful way to catch prey? Web-building is a useful way to trap insects and even some small mammals, but even to a spider, silk is expensive. Creating a web requires work, damage caused by prey or passersby needs frequent repair, and certain kinds of webs require large amounts of silk to be effective. The classic orb web (seen in the picture here) radically reduced these costs, which may be why the spiders that make them are particularly common. Still, this new study reveals that hunting has proved at least as successful a strategy as web-building for more than half of today’s spiders.

Bristly scopulae on hunting spiders’ legs have played a big part in this, enabling spiders to grasp and hold on to struggling prey. The thin bristles on scopulae come in many shapes and forms, and also contribute to these spiders’ mad climbing skills. Read more about which spiders evolved these bristles or learn about other arachnid research published in PLOS ONE here.

 

Citations: Gregorič M, Agnarsson I, Blackledge TA, Kuntner M (2011) How Did the Spider Cross the River? Behavioral Adaptations for River-Bridging Webs in Caerostris darwini (Araneae: Araneidae). PLoS ONE 6(10): e26847. doi:10.1371/journal.pone.0026847

Rogers H, Hille Ris Lambers J, Miller R, Tewksbury JJ (2012) ‘Natural experiment’ Demonstrates Top-Down Control of Spiders by Birds on a Landscape Level. PLoS ONE 7(9): e43446. doi:10.1371/journal.pone.0043446

Wolff JO, Nentwig W, Gorb SN (2013) The Great Silk Alternative: Multiple Co-Evolution of Web Loss and Sticky Hairs in Spiders. PLoS ONE 8(5): e62682. doi:10.1371/journal.pone.0062682

Nyffeler M, Knörnschild M (2013) Bat Predation by Spiders. PLoS ONE 8(3): e58120. doi:10.1371/journal.pone.0058120

Images: Foot of the little jumping spider Euophrys frontalis, credit Jonas Wolff; varied shapes and sizes of bristles on scopulae, from pone.0062682; spider web on plant by mikebaird


Opportunistic pathogens evolve mostly harmlessly in healthy humans


Humans interact with bacteria almost every minute of our lives. Of the millions of these interactions, only a handful result in disease, and some bacteria cause infections only under certain conditions. In a recent PLOS ONE study, researchers probe this healthy human-bacterial relationship in one particularly notorious pathogen as it spends the majority of its time in our bodies, doing no harm.

Staphylococcus aureus can cause endocarditis, toxic shock syndrome and other diseases, killing approximately 1 in 100,000 infected people in the US each year. Strains like MRSA have also evolved to carry multiple antibiotic resistance genes, making infections extremely difficult to treat. If human-bacterial interactions are to be described as a ‘genetic arms race’, it may be tempting to cast S. aureus as an enemy that carries every available genetic weapon.

Yet despite a few sporadic skirmishes, the majority of our interactions remain peaceful, and these bacteria thrive in healthy human hosts. In fact, about a third of healthy adults carry S. aureus in their noses at some point in their lives. In this study, researchers analyzed the genetic changes in S. aureus carried by such hosts, sequencing the genomes of 130 strains from the nasal passages of 13 healthy adults, five of whom carried strains of MRSA (which is often harmless when carried nasally). Despite the arms-race metaphors, they found that S. aureus strains in healthy hosts are not incessantly beefing up their genetic arsenal of antibiotic-resistance or pathogenesis genes.

They found that bacterial genomes were changed by processes of ‘micro-mutation’, that is, small bits of genetic material being added or removed, or single-letter changes in the genetic code. Large insertions and deletions (‘macro-mutation’) were also common, as were changes caused by bacteria-infecting viruses or by small, independently moving rings of DNA called plasmids. Overall, the constant changes in S. aureus genomes were geared toward keeping the genome healthy by clearing erroneous or harmful mutations. Only on rare occasions did these bacteria acquire distinctive surface proteins or an enterotoxin that could alter their pathogenic potential. The researchers also analyzed changes in specific genes used to assess bacterial diversity and relatedness, and developed a new method to detect transmission of bacterial strains among human carriers. Read the full study to learn more about these interesting results.

Many of the changes identified in this study may not directly increase the virulence of disease-causing S. aureus. However, previous work by these researchers demonstrated that mutations arising in bacteria carried by healthy hosts may play an important role in tipping the balance between human health and disease. Here, the authors begin to paint a picture of what these mutations are and how they may occur.

Citation: Golubchik T, Batty EM, Miller RR, Farr H, Young BC, et al. (2013) Within-Host Evolution of Staphylococcus aureus during Asymptomatic Carriage. PLoS ONE 8(5): e61319. doi:10.1371/journal.pone.0061319

Image: Scanning electron micrograph of S.aureus with increased resistance to vancomycin. Credit CDC/ Matthew J. Arduino, DRPH

 
