Three simple tips to survive grant writing

Tip 2: To survive grant writing, take a break from the computer and go on a walk with friends. Image via Core Athletics.

Like any field, research has its ups and downs. Ask any scientist and they will likely name the freedom to guide their own inquiries as an upside, and grant writing as one of the main downsides.

A recent PLOS ONE paper by Ted and Courtney von Hippel notes that the average principal investigator (PI) spends a minimum of 116 hours per grant submission. That’s nearly three weeks of full-time work for the PI alone! Add to that the fact that grant funding can make or break a career and it’s no wonder that grant writing is stressful. To avoid burnout while writing a grant (or dissertation), try the following tips.

1. Give yourself a break

Keep your mind energized by taking breaks from work. Image courtesy: Marcel Oosterwijk from Amsterdam, The Netherlands [CC BY-SA 2.0], via Wikimedia Commons

Grant writing can be an all-encompassing process, in both positive and negative ways. It is a wonderful opportunity to take a deep dive into a body of literature, but it demands time you might rather spend doing something else (e.g., conducting research as opposed to writing about research conducted by others). To avoid turning into a dull boy or girl, I suggest you engage in at least one small pleasurable activity per day. The activity depends on your interests, but there’s a whole body of literature on the benefits of this approach, so choose whatever is right for you and be sure to stick to it. If you find that you’re making excuses to cancel fun activities, try asking yourself, “What makes more sense, a 15-minute break now or a 2-hour breakdown later?”

2. Give yourself an energy boost

When you’re on a deadline, it’s tempting to work around the clock, but this sort of schedule likely does more harm than good. The evidence shows, for example, that sleep deprivation reduces creative thinking. Without enough food or sleep, you’re unlikely to have enough energy for a task as cognitively complex as grant writing. There are at least three components to getting an energy boost. First, eat regularly: don’t go more than three to four hours without eating. Second, sleep regularly: go to sleep and wake up at the same time each day. Third, exercise regularly: engage in some physical activity every day; it can be as simple as going for a walk. You can even combine a break with an energy boost; walking with a friend to grab a snack is a way to have fun, get some exercise, and get enough to eat.

3. Give yourself some props

Tip 3: Think positive by writing affirmations around your workspace. Image by Ignacio Palomo Duarte, via Flickr.

Tip 3: Think positive by writing affirmations around your workspace. Image by Ignacio Palomo Duarte, via Flickr.

Getting feedback from mentors and peers is an important part of grant writing. It also exposes you to a near-constant stream of criticism, which, while (hopefully) constructive, can still take a toll on your confidence. To combat this, remind yourself of your past accomplishments and the exciting work you’ll do if the grant is awarded. It’s tempting to do this in your head, but it’s more effective to write these positive statements down and keep them on your phone or on a piece of paper by your computer. That way, if you’re feeling down and can’t think of many positive qualities, you’ll have a cheat sheet. If nothing else, the affirmations can improve your mood (though research shows the effect may depend on your initial level of self-esteem).

These tips may seem simple, but they’re often overlooked and undervalued. Even as a clinical psychologist, I needed weeks to realize that taking care of myself made grant writing easier. While there’s no guarantee that following these tips will increase the likelihood of getting funded (as the von Hippels note, ever-diminishing funds and the excellent quality of many grant applications make winning funding a “roll of the dice”), they are important to preserving your well-being and productivity. After all, having fun, eating and sleeping well, getting exercise, and building your confidence will probably improve your quality of life, which is ultimately more important than grant money. Right?

References
Dimidjian, S., Barrera Jr., M., Martell, C., Muñoz, R. F., & Lewinsohn, P. M. (2011). The origins and current status of behavioral activation treatments for depression. Annual Review of Clinical Psychology, 7, 1-38.

Hames, J. L., & Joiner, T. E. (2012). Resiliency factors may differ as a function of self-esteem level: Testing the efficacy of two types of positive self-statements following a laboratory stressor. Journal of Social and Clinical Psychology, 31(6), 641-662.

Landmann, N., Kuhn, M., Maier, J. G., Spiegelhalder, K., Baglioni, C., Frase, L., … & Nissen, C. (2015). REM sleep and memory reorganization: Potential relevance for psychiatry and psychotherapy. Neurobiology of Learning and Memory.

von Hippel, T., & von Hippel, C. (2015). To apply or not to apply: A survey analysis of grant writing costs and benefits. PLoS ONE, 10(3), e0118494.

Snark-Hunters Once More: Rejuvenating the Comparative Approach in Modern Neuroscience

By Jeremy Borniger

Sixty-five years ago, the famed behavioral endocrinologist Frank Beach wrote an article in The American Psychologist entitled ‘The Snark was a Boojum’. The title refers to Lewis Carroll’s poem ‘The Hunting of the Snark’, in which several characters embark on a voyage to hunt species of the genus Snark. There are many different types of Snarks, some that have feathers and bite, and others that have whiskers and scratch. But, as we learn in Carroll’s poem, some Snarks are Boojums! Beach paraphrases Carroll’s verse, outlining the problem with Boojums:

If your Snark be a Snark, that is right:
Fetch it home by all means—you may serve it with greens
And it’s handy for striking a light.

**************

But oh, beamish nephew, beware of the day,
If your Snark be a Boojum! For then,
You will softly and suddenly vanish away,
And never be met with again!

Instead of the Pied Piper luring the rats out of town with a magic flute, the tables have turned: the rat plays the tune and a large group of scientists follows. (From Beach, 1950)

Beach provides this metaphorical context to describe a problem facing comparative psychologists in the 1950s: an increasing focus on a few ‘model species’ at the cost of reduced breadth in the field. The comparative psychologists were hunting a Snark called “Animal Behavior”, but this Snark, too, turned out to be a Boojum. Instead of finding many animals on which to test their hypotheses, they settled on one: the albino rat. It was there that “the comparative psychologist suddenly and softly vanished away”.

Even in the mid-1900s, Beach recognized the funneling of biological and psychological research efforts toward a single or few ‘model species’. He went so far as to suggest that the premier journal in the field be renamed The Journal of Rat Learning, as its focus had almost entirely shifted to the rat. This trend has culminated in a true bottleneck, in which the vast majority of research now focuses on phenomena occurring in a small number of ‘model organisms’ like the laboratory mouse (Mus musculus), Norway rat (Rattus norvegicus), nematode worm (Caenorhabditis elegans), fruit fly (Drosophila melanogaster), or zebrafish (Danio rerio). Indeed, a 2008 analysis found that “75% of our research efforts are directed to the rat, mouse and human brain, or 0.0001% of the nervous systems on the planet”. Focusing on such a small fraction of the biological diversity available to us may be skewing or vastly restricting our conclusions.

The Genetics Revolution

In the last quarter of a century, incredible advances in genetic technology have pushed a few model organisms further toward the top. For example, the mouse was among the first mammals to have its genome sequenced, with the results published in 2002. Because sequence information was readily available, subsequent tools (primers, shRNA, oligonucleotides, etc.) and genetic techniques (conditional knockout/overexpression models) were developed specifically for use in the mouse. This further discouraged the use of other organisms in research, as most of the ‘cutting edge’ tools were being developed almost exclusively in the mouse. It also promoted ‘shoe-horning’: research questions poorly suited to this model organism were forced into it to take advantage of the available genetic tools. Indeed, this may be the case with research on the visual system, or on many mental disorders, in mouse models. The lab mouse primarily interprets environmental stimuli via olfactory (smell) cues rather than sight (as it is nocturnal), making it a suboptimal organism in which to study visual function. And because mental disorders are poorly understood, developing a robust animal model in which to test treatments remains a significant obstacle. Trying to force the mouse to become the bastion of modern psychiatry research is potentially hampering progress in a field that could benefit from the comparative approach.

For example, wild white-footed mice (Peromyscus leucopus), which are genetically distinct from their laboratory mouse relatives, show seasonal variation in many interesting behaviors. In response to the short days of winter, they enhance their fear responses and alter the cellular structure of their amygdala, a key brain region in the regulation of fear. Because these changes are reversible and controlled by a discrete environmental signal (day length), these wild mice contribute to the development of translational models involving amygdala dysfunction, such as post-traumatic stress disorder (PTSD).

What are the Benefits to the Comparative Approach?

Emphasis on a few model organisms became prevalent primarily because of their ease of use, rapid embryonic development, low cost, and accessible nervous systems. In recent decades, access to an organism’s genetic code provided another incentive to use it for research purposes. While these advantages are real, they encourage researchers to become short-sighted and neglect the contributions of diverse species to the advancement of science. As Brenowitz and Zakon write, “this myopia affects choice of research topic and funding decisions, and might cause biologists to miss out on novel discoveries”. Breakthroughs made possible by the comparative approach include the understanding of the ionic basis of the action potential (squid), the discovery of adult neurogenesis (canary), conditioned reflexes (dog), dendritic spines (chicken), and the cellular basis of learning and memory (sea slug). More recently, incredible advances in the temporal control of neuronal function (optogenetics) were made possible by the characterization of channelrhodopsins in algae.

The revolution in genetics ushered in new tools that were available only in the few organisms whose genomes had been sequenced. The comparative approach, however, is now gaining the tools necessary to become part of the 21st-century genetic revolution. New gene-editing techniques, such as TALENs, TILLING, or CRISPR/Cas9, allow for fast, easy, and efficient genome manipulation in a wide variety of species. Indeed, this has already been accomplished in Atlantic salmon, tilapia, goats, sea squirts, and silkworms. And as the price of sequencing an entire genome rapidly decreases, new tools will be developed for use in a wider variety of species than ever before. It is unlikely that many of the groundbreaking discoveries that stemmed from research on diverse and specialized organisms would be funded in the current ‘model species’ climate. We should not put all of our research ‘eggs’ in one model organism ‘basket’; instead, we should invest in a broad range of organisms fit for each question at hand. The time to revive the comparative approach has arrived. In the words of Brenowitz and Zakon, “Grad students, dust off your field boots!”

References

Beach, F. A. (1950). The Snark was a Boojum. American Psychologist, 5(4), 115.

Brenowitz, E. A., & Zakon, H. H. (2015). Emerging from the bottleneck: benefits of the comparative approach to modern neuroscience. Trends in Neurosciences, 38(5), 273-278.

Chinwalla, A. T., Cook, L. L., Delehaunty, K. D., Fewell, G. A., Fulton, L. A., Fulton, R. S., … & Mauceli, E. (2002). Initial sequencing and comparative analysis of the mouse genome. Nature, 420(6915), 520-562.

Edvardsen, R. B., Leininger, S., Kleppe, L., Skaftnesmo, K. O., & Wargelius, A. (2014). Targeted mutagenesis in Atlantic salmon (Salmo salar L.) using the CRISPR/Cas9 system induces complete knockout individuals in the F0 generation.

García-López, P., García-Marín, V., & Freire, M. (2007). The discovery of dendritic spines by Cajal in 1888 and its relevance in the present neuroscience. Progress in Neurobiology, 83(2), 110-130.

Goldman, S. A. (1998). Adult neurogenesis: from canaries to the clinic. Journal of Neurobiology, 36(2), 267-286.

Li, M., Yang, H., Zhao, J., Fang, L., Shi, H., Li, M., … & Wang, D. (2014). Efficient and heritable gene targeting in tilapia by CRISPR/Cas9. Genetics, 197(2), 591-599.

Ni, W., Qiao, J., Hu, S., Zhao, X., Regouski, M., Yang, M., … & Chen, C. (2014). Efficient gene knockout in goats using CRISPR/Cas9 system.

Manger, P. R., Cort, J., Ebrahim, N., Goodman, A., Henning, J., Karolia, M., … & Štrkalj, G. (2008). Is 21st century neuroscience too focussed on the rat/mouse model of brain function and dysfunction? Frontiers in Neuroanatomy, 2, 5.

Stolfi, A., Gandhi, S., Salek, F., & Christiaen, L. (2014). Tissue-specific genome editing in Ciona embryos by CRISPR/Cas9. Development, 141(21), 4115-4120.

Pavlov, I. P. (1941). Conditioned reflexes and psychiatry (Vol. 2). W. H. Gantt, G. Volborth, & W. B. Cannon (Eds.). New York: International Publishers.

Walton, J. C., Haim, A., Spieldenner, J. M., & Nelson, R. J. (2012). Photoperiod alters fear responses and basolateral amygdala neuronal spine density in white-footed mice (Peromyscus leucopus). Behavioural Brain Research, 233(2), 345-350.

Wei, W., Xin, H., Roy, B., Dai, J., Miao, Y., & Gao, G. (2014). Heritable genome editing with CRISPR/Cas9 in the silkworm, Bombyx mori.

Knowledge is where you find it: Leveraging the Internet’s unique data repositories

A Last.fm user shares the music recommendation system’s representation of his listening habits over a month. Photo courtesy of Aldas Kirvaitis via Flickr.

By Chris Givens

Sometimes, data doesn’t look like data. But when circumstances conspire and the right researchers come along, interesting facets of human nature reveal themselves. Last.fm and World of Warcraft are two entities made possible by the Internet, both aimed at entertaining consumers. Yet through new means of social interaction and larger scales of data collection, they have also, perhaps unintentionally, advanced science. Scientific achievement may seem like a stretch for a music service and a video game, but these unlikely candidates for scientific study show that the information age constantly offers new ways to study human behavior. Last.fm and World of Warcraft are contemporary social constructions, part of the new way humans interact in our rapidly changing digital world. By applying scientific rigor to the data unwittingly generated by two Internet-based companies, we see that knowledge is everywhere, but sometimes it takes creative routes to coax it out of hiding.

Last.fm: more than a musical concierge

Last.fm is a music service that uses consumers’ listening data and genre tags to recommend new music to the user. It has a huge cache of song clips in its databases, which were not viewed as a data set until recently, when a group of computer scientists mined the songs for certain characteristics and created a phylogeny of popular music. The lead author on the study, Dr. Matthias Mauch, formerly worked on the Music Information Retrieval (MIR) team at Last.fm. MIR is essentially automated analysis of musical data, usually from audio samples. Uses for the data gleaned from audio samples include improved music search, organization, and recommendation. This kind of research has clear benefit to a company like Last.fm, whose main goal is to catalog users’ listening habits and recommend music they would like based on past listening patterns. Dr. Mauch, however, is interested in more than simply improving musical recommendations; he wants to trace the evolution of the variety of music from around the world. In a recent study, he used a huge data set obtained from his time at Last.fm to start cracking the code on musical evolution.

Hip-hop is a confirmed revolution

When hip-hop burst into the public consciousness in the late 1980s, the music polarized Americans. Hip-hop originally centered on themes of social ills in inner-city America, providing a creative outlet for the frustration felt by many working-class African Americans at the time. Gangsta rap eventually grew out of hip-hop, characterized by at-times violent, masculine lyrical themes. After the release of their seminal album Straight Outta Compton, the hip-hop group N.W.A received a warning letter from the FBI over the album’s controversial songs. The explosive and politicized emergence of hip-hop created a new genre of popular music, thrusting a marginalized group of Americans into the pop culture spotlight. From humble roots, hip-hop is now a multi-billion dollar industry. But even with all of the popular exposure and controversy, until Dr. Mauch’s study the degree to which hip-hop revolutionized popular music was hard to quantify.


See Dr. Mauch’s TED Talk about music informatics here.

A group of researchers, led by Dr. Mauch, used MIR techniques on the Last.fm data set, and in doing so, found previously unknown relationships between hip-hop and other types of twentieth-century popular music. After recognizing the song clips obtained from Last.fm held a repository of data, the group devised a method of classifying songs based on two categories of attributes: harmonic and timbral. Harmonic attributes are quantifiable, encompassing chord changes and the melodic aspects of songs; timbral attributes are more subjective and focus on quality of sound, like bright vocals or aggressive guitar. The authors deemed these attributes “musically meaningful” and thus more appropriate for quantitative analysis than simple measures of loudness or tempo.

The researchers used modified text-mining techniques to carry out their analysis. They combined characteristics from the harmonic and timbral lists to create “topics,” then characterized each song by the topics present in it. Next, the researchers analyzed 17,000 songs from the Billboard Hot 100 charts spanning the 50 years between 1960 and 2010. After analyzing and clustering the songs by their harmonic and timbral characteristics, the researchers created a phylogenetic tree of popular music.
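
As a rough sketch of this kind of pipeline (not the authors’ actual code, and with made-up feature counts standing in for the study’s harmonic and timbral lexicons), topic modeling plus hierarchical clustering in Python might look like this:

```python
import numpy as np
from sklearn.decomposition import LatentDirichletAllocation
from scipy.cluster.hierarchy import linkage, dendrogram

rng = np.random.default_rng(0)
# Rows are songs; columns are counts of discrete harmonic/timbral
# "words" (hypothetical stand-ins for the study's chord-change and
# timbre classes).
song_features = rng.integers(0, 20, size=(200, 50))

# Text-mining step: describe each song as a mixture of latent "topics".
lda = LatentDirichletAllocation(n_components=8, random_state=0)
topic_mix = lda.fit_transform(song_features)  # shape: (songs, topics)

# Clustering step: group songs by topic profile into a hierarchy,
# loosely analogous to the paper's phylogenetic tree of popular music.
tree = linkage(topic_mix, method="ward")
dendrogram(tree, no_plot=True)  # compute the tree layout for inspection
```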

The tree empirically verified what we already knew — that hip-hop is in a league of its own. Out of four clusters on the tree, hip-hop is the most divergent. Using the tree of life as an analogy, if the genres of rock, soul, and easy listening are animals, fungi, and plants, hip-hop would be musical bacteria.

These data make an extensive, quantitative account of musical history possible. The authors state that instead of relying on anecdote and conjecture to understand musical evolution, their methods make it possible to pinpoint precisely where musical revolutions occurred. Thanks to their efforts, popular music now has a quantitative evolutionary history, and Dr. Mauch isn’t finished yet. He plans to do similar analyses on recordings of classical music and indigenous music from all over the world, in an attempt to trace the origins and spread of music predating the radio. The innovative techniques and range of this study are remarkable. Dr. Mauch and colleagues adapted research methods frequently used to improve music delivery (already an interesting field) and used them to unlock a small amount of transcendent musical knowledge. This study shows that tens of thousands of song clips aren’t a typical scientific data set, until someone decides they are. By taking what was available and forging it into something workable, Dr. Mauch and colleagues applied scientific methods to Last.fm’s unrecognized and unexamined data repository.

Surviving a pandemic in the wide World of Warcraft

World of Warcraft (WoW) is a highly social video game that connects players globally. It is also arguably the last place anyone would look for scientific insight. Launched in 2004, WoW is one of the most popular games ever created, with around ten million subscribers at its peak. WoW was designed as a “massively multiplayer online role-playing game”: when it launched, players from all over the world began interacting in real time throughout an intricately designed virtual world, built as a fantastical model of the real one, complete with dense urban areas and remote, low-population zones. In 2005, a glitch that caused a highly contagious sickness to spread between players revealed the game to be an apt model of human behavior under pandemic conditions. The glitch drastically affected gameplay and piqued the interest of several epidemiologists.

The “Corrupted Blood Incident”

The glitch came to be known as the “Corrupted Blood Incident” in the parlance of the game. It originated from one of the many things present in WoW that are not present in the real world: “dungeons”. Dungeons in WoW are difficult areas populated by powerful “boss” characters that possess special abilities not normally found in the game. In 2005, one of these abilities, the “Corrupted Blood” spell, was modified by a glitch to have effects outside of the zone in which it normally resided. Consequently, the highly contagious “Corrupted Blood” swept through WoW, killing many player characters and providing an accurate simulation of real-world pandemic conditions. “Corrupted Blood” infected player characters, pets, and non-player characters, which aided transmission throughout the virtual landscape. Only one boss character in one remote zone cast this spell, so its spread was a surprise to players and developers alike, adding to the accuracy of the “simulation”.

The glitch stayed active for about a week, and during that time, gameplay changed dramatically. Because pets and non-player characters carried the disease without symptoms, reservoirs of the plague existed in the environment and helped nourish the outbreak. Players avoided cities for fear of contracting the disease. Some players who specialized in healing stayed in cities, helping especially weak players stay alive long enough to do business. Weaker, low-level players who wanted to lend a hand posted themselves outside of cities and towns, warning other players of the infection ahead. After a week of the pandemic in the game, the developers updated the code and reset their servers, “curing” the WoW universe of this scourge of a glitch.

Some epidemiologists took note after observing the striking similarities between real world pandemics and the virtual pandemic in WoW. In the virtual pandemic, pets acted as an animal reservoir, as birds did in the case of avian flu. Additionally, air travel in WoW (which takes place on the back of griffins) proved analogous to air travel in the real world, thwarting efforts to quarantine those affected by the disease. Also, WoW is a social game full of tight-knit communities, and at the time had around 6.5 million subscribers, making it a reasonable virtual approximation of the social stratification that exists in real-world society.
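
To see why an asymptomatic reservoir matters, consider a toy SIR-style model with a reservoir compartment for pets and non-player characters. This is only an illustration of the dynamic described above, not the model Dr. Fefferman’s group actually used, and every number in it is invented:

```python
def step(S, I, R, V, beta=0.3, beta_v=0.1, gamma=0.2, N=1000.0):
    """Advance one time step; V is a reservoir of carriers that never recover."""
    new_inf = (beta * I + beta_v * V) * S / N  # infections from players and reservoir
    new_rec = gamma * I                        # recoveries (or character deaths)
    return S - new_inf, I + new_inf - new_rec, R + new_rec, V

S, I, R, V = 990.0, 10.0, 0.0, 50.0  # susceptible, infected, recovered, reservoir
for _ in range(100):
    S, I, R, V = step(S, I, R, V)
print(f"susceptible={S:.0f}, infected={I:.0f}, recovered={R:.0f}")
# Because V never empties, transmission pressure never drops to zero:
# the outbreak persists until the susceptible pool is exhausted.
```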

See Dr. Fefferman’s 2010 TED talk here.

The behavior observed in WoW was not taken as a prescription for how to handle a pandemic or a prediction of what will happen. Rather, as Dr. Nina Fefferman put it in a 2010 TED talk, the event provided “inspiration about the sorts of things we should consider in the real world” when making epidemiological models. Dr. Fefferman’s group identified two behaviors displayed by players during the virtual pandemic, empathy and curiosity, which are not normally taken into account by epidemiological models. Curiosity was the most notable, because it paralleled the behavior of journalists in real-world pandemics: journalists rush into an infected site to report, and then rush out, hopefully before becoming infected, which is exactly what many players did in the infected virtual cities of WoW.

The “Corrupted Blood Incident” is the first known time that an unplanned virtual plague spread in a way similar to a real-world plague. At first, most regarded the incident as simply an annoying video game glitch; it took some creative scientists to see what knowledge could be gleaned from it. Their observations suggest that sometimes the best agent-based model is one where actual people control the agents, and that simulations similar to computer games might “bridge the gap between real world epidemiological studies and large scale computer simulations.” Epidemiological models are now richer as a result. To learn more about how the “Corrupted Blood Incident” changed scientific modeling for pandemics, head over to the PLOS Public Health Perspectives blog to hear Atif Kukaswadia’s take on it.

Concluding Thoughts

The Last.fm study and the “Corrupted Blood Incident” show how scientists can use esoteric corners of the Internet to illuminate interesting pieces of human history and behavior. New means of social interaction and new methods for collecting information bring about interesting, if slightly opaque, ways to discover new knowledge and advance scientific discovery. It is to these scientists’ credit that they shed light on human history and interactions by looking past the traditional and finding data in novel sources.

PLOS Computational Biology Community Going All Out to Cover #ISMB15 with the ISCB Student Council

Calling All Bloggers for ISMB/ECCB 2015

ISMB/ECCB 2015 in Dublin, Ireland, is fast approaching and we invite you to be involved in the live coverage of the event.

If you can’t make it to Dublin, follow our live collaborative blog coverage at http://plos.io/CBFieldReports

In previous years, ISMB has been well ahead of the social media curve, with microblogging at the conference as early as 2008, one year after the US launch of the original Apple iPhone and just two years after Twitter was founded. At the last count, Twitter has averaged 236 million users, three million blogs come online each month, and Tumblr users publish approximately 27,778 new blog posts every minute. Social media is likewise a growing aspect of conferences (read more in our Ten Simple Rules for Live Tweeting at Scientific Conferences), and we think ISMB/ECCB 2015 is a great venue for progress.

How can you be involved?

We want you to take live blogging to ISMB/ECCB. If you are planning to attend the conference in Dublin and you blog or tweet, or even if you would like to try it for the first time, we want to hear from you — everyone can get involved.

Our invitation extends to attendees from all backgrounds and experience levels who could contribute blog posts covering the conference. In addition, we are looking for a number of ‘super bloggers’ who can commit to writing two to three high-quality posts or who would be interested in interviewing certain speakers at the conference. If you are speaking at ISMB/ECCB and would like to be involved, please do also get in touch.

For the “next generation computational biologists”

We will be working on this collaborative blog project with the ISCB Student Council.

In acknowledgment of the time and effort of all who work with us, each contributor will receive a PLOS Computational Biology 10th Anniversary t-shirt (only available at ISMB/ECCB 2015), and your work will be shared on the PLOS Comp Biology Field Reports page [http://plos.io/CBFieldReports], making it easier for all to contribute and collaborate.

Winning design for the PLOS Comp Biol 10th Anniversary T-shirt; get yours by blogging with us at ISMB15! PLOS/Kifayathullah Liakath-Ali

What are the next steps?

If you’re active on Twitter or the blogosphere and want to help us share the latest and greatest from ISMB/ECCB 2015 conference, please email us at ploscompbiol@plos.org with a bit about your background and how you’d like to contribute. See you in Dublin!

FOR MORE ON PLOS AT ISMB/ECCB 2015 read this post.

Walk the walk, talk the talk: Implications of dual-tasking on dementia research

A group of friends chat as they walk in St. Marlo, France. Photo courtesy of Antoine K (Flickr).

By Ríona Mc Ardle

You turn the street corner and bump into an old friend. After the initial greetings and exclamations of “It’s so good to see you!” and “Has it been that long?”, your friend inquires as to where you are going. You wave your hands to indicate the direction of your desired location, and with a jaunty grin, your friend announces that they too are headed that way. And so, you both set off, exchanging news of significant others and perceived job performance, with the sound of your excited voices trailing behind you.

This scenario seems relatively normal, and indeed it happens to hundreds of people every day. But have you ever truly marvelled at our ability to walk and talk at the same time? Most people discount gait as an automatic function, sparing no thought for the higher cognitive processes we engage in to place one foot in front of the other. The act of walking requires huge attentional and executive function resources; it is the sheer amount of practice we accumulate throughout our lives that makes it feel like such a mindless activity. Talking while walking recruits even more resources, with the additional requirement of splitting attention so that we don’t emphasize one task to the detriment of the other. And it’s not just talking! We can take on all manner of activities while walking – carrying trays, waving our hands, checking ourselves out in the nearest shop window. But what costs does such multitasking carry?

Dual-Tasking: How well can we really walk and talk?

Dual-tasking refers to the simultaneous execution of two tasks – usually a walking task and a secondary cognitive or motor task. It is commonly used to assess the relationship between attention and gait. Although there is no consensus yet on exactly how dual-tasking hinders gait, dual-task studies have demonstrated decreased walking speed and poorer performance of the secondary task. This is particularly evident in healthy older adults, who struggle more with the secondary task when prioritizing walking, likely in order to maintain balance and avoid falls. Such findings support the role of executive function and attention in gait, as both are implicated as part of frontal lobe function. Age-related changes in the brain include focal atrophy of the frontal cortex, with reductions of 10-17% observed in the over-65 age group. Compared with the reported 1% atrophy of the other lobes, it can be postulated that changes to the frontal lobes contribute to the gait disturbances experienced by the elderly. Older adults must recruit many of their remaining attentional resources to maintain gait, and thus have few left to carry out other activities at the same time.
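
One common way to quantify this interference is the dual-task cost: the percent drop in performance from single-task to dual-task conditions. The sketch below illustrates the calculation with invented numbers:

```python
def dual_task_cost(single, dual):
    """Percent decline in performance relative to the single-task condition."""
    return 100.0 * (single - dual) / single

gait_speed_single = 1.30  # m/s, walking alone (hypothetical value)
gait_speed_dual = 1.10    # m/s, walking while counting backwards (hypothetical)
print(f"DTC = {dual_task_cost(gait_speed_single, gait_speed_dual):.1f}%")  # 15.4%
```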

It has been incredibly difficult to confirm the frontal cortex’s role in gait from a neurological perspective. Imaging techniques have largely been found unsatisfactory for the endeavor, as many rely on a subject lying motionless in a fixed position. However, a recent study in PLOS ONE has shed some light on the area. Lu and colleagues (2015) employed functional near-infrared spectroscopy (fNIRS) to capture activation in the brain’s frontal regions: specifically, the prefrontal cortex (PFC), the premotor cortex (PMC) and the supplementary motor areas (SMA). Although fNIRS obtains an image of the cortex similar to that of functional magnetic resonance imaging (fMRI), it can acquire data while the subject is in motion. These researchers laid out three investigatory aims:

  1. To assess whether declining gait performance was due to different forms of dual-task interference.
  2. To observe whether there were any differences in cortical activation in the PFC, PMC, and SMA during dual-task versus normal walking.
  3. To evaluate the relationship between such activation and gait performance during dual-tasking.

The research team predicted that the PFC, PMC and SMA would be more active during dual-task due to the increased cognitive or motor demand on resources.

A waiter balances a tray of waters. Photo courtesy of Tom Wachtel (Flickr).

Lu and colleagues recruited 17 healthy young individuals. Each subject underwent each of three conditions three times, with a resting period in between. The conditions were as follows: a normal walking condition (NW), in which the participant was instructed to walk at their usual pace; a walking-while-carrying-out-a-cognitive-task condition (WCT), in which the subject had to engage in a subtraction task; and a walking-while-carrying-out-a-motor-task condition (WMT), in which the participant had to carry a bottle of water on a tray without spilling it. Results showed that both the WCT and WMT induced slower walking than the NW. Interestingly, the WMT produced a higher number of steps per minute and a shorter stride time, which could be attributed to an intentional alteration of gait so as not to spill the water.

An analysis of the fNIRS data revealed that all three frontal regions were activated during the dual-task conditions. The PFC showed the strongest, most continuous activation during the WCT. As the PFC is highly implicated in attention and executive function, its increased role in the cognitive dual-task condition seems reasonable. The SMA and PMC were most strongly activated during the early stages of the WMT. Again, this finding makes sense, as both areas are associated with the planning and initiation of movement; the researchers postulated that this activity may reflect a demand for bodily stability in order to carry out the motor task. This study has successfully demonstrated the frontal cortex’s role in maintaining gait, particularly when a secondary task is involved.

Why is this important?

Although gait disturbances are extremely rare in young people, their prevalence increases with age. Thirty percent of adults over 65 fall at least once a year, with the incidence rate climbing to 50% in the over-85 population. Falls carry a high risk of critical injuries in the elderly, and often occur during walking. This latest study has provided evidence for the pivotal roles of executive function and attention in maintaining gait, and has given us insight into the utility of dual-task studies. Now that frontal cortex activation has been correlated with carrying out a secondary task while walking, poor performance on one of these tasks may reveal a subtle deficit in attention or executive function. Gait abnormalities may therefore be able to act as a predictor of mild cognitive impairment (MCI) and even dementia. Studies have reported that slowing of gait may occur up to 12 years prior to the onset of MCI, with longitudinal investigations observing that gait irregularities significantly increased individuals’ likelihood of developing dementia 6-10 years later.

But what does the relationship between gait and cognition tell us about dementia? Dementia is one of the most prevalent diseases to afflict the human population, with 8% of over-65s and over 35% of the over-85s suffering from it. It is incredibly important for researchers to strive to reduce the amount of time individuals must live their lives under the grip of such a crippling disorder. Our growing knowledge of gait and cognition can allow us to do so in two different ways: through early diagnosis of dementia and through developments in interventions.

For the former, a drive in research has begun on correlating different gait deficits with dementia subtypes. The principle is that the physical manifestation of the gait disturbance could lend clinicians a clue as to the lesion site in the brain. For example, if a patient is asked to prioritize an executive function task when walking and displays a significant gait impairment while doing so, this may predict vascular dementia, because executive function relies on frontal networks, which are highly susceptible to vascular risk. Studies have shown that this type of dual task often causes decreases in velocity and stride length, which are associated with white matter disorders and stroke. On the latter point, practice of either gait or cognition may benefit the other. Individuals who go for walks daily have a significantly reduced risk of dementia; this is attributed to gait’s engagement of executive function and attention, which exercises the neural networks associated with them. Similarly, improving cognitive function may in turn help one to maintain a normal gait. Verghese and colleagues (2010) demonstrated this in a promising study, in which cognitive remediation improved the mobility of older sedentary individuals.

Closing Thoughts

As both a neuroscientist and a citizen of the world, one of my primary concerns is the welfare of the older generation. The aging population is growing significantly as life expectancy increases, and these individuals are susceptible to a range of medical issues. While any health problem is hard to see in our loved ones, dementia and failing cognition are a particularly heavy burden on the individual, their family, and society as a whole. Our exploration of the relationship between gait and cognition has so far offered a glimmer of hope for progressing our understanding of, and fight against, dementia, and I personally hope this intriguing area continues to do just that.

References

Beurskens, R., & Bock, O. (2012). Age-related deficits of dual-task walking: a review. Neural Plasticity, 2012.

Holtzer, R., Mahoney, J. R., Izzetoglu, M., Izzetoglu, K., Onaral, B., & Verghese, J. (2011). fNIRS study of walking and walking while talking in young and old individuals. The Journals of Gerontology Series A: Biological Sciences and Medical Sciences, glr068.

Li, K. Z., Lindenberger, U., Freund, A. M., & Baltes, P. B. (2001). Walking while memorizing: age-related differences in compensatory behavior. Psychological Science, 12(3), 230-237.

Lu, C. F., Liu, Y. C., Yang, Y. R., Wu, Y. T., & Wang, R. Y. (2015). Maintaining gait performance by cortical activation during dual-task interference: A functional near-infrared spectroscopy study. PLoS ONE, 10(6), e0129390.

Montero-Odasso, M., & Hachinski, V. (2014). Preludes to brain failure: executive dysfunction and gait disturbances. Neurological Sciences, 35(4), 601-604.

Montero-Odasso, M., Verghese, J., Beauchet, O., & Hausdorff, J. M. (2012). Gait and cognition: a complementary approach to understanding brain function and the risk of falling. Journal of the American Geriatrics Society, 60(11), 2127-2136.

Sparrow, W. A., Bradshaw, E. J., Lamoureux, E., & Tirosh, O. (2002). Ageing effects on the attention demands of walking. Human Movement Science, 21(5), 961-972.

Verghese, J., Mahoney, J., Ambrose, A. F., Wang, C., & Holtzer, R. (2010). Effect of cognitive remediation on gait in sedentary seniors. The Journals of Gerontology Series A: Biological Sciences and Medical Sciences, 65(12), 1338-1343.

Yogev-Seligmann, G., Hausdorff, J. M., & Giladi, N. (2008). The role of executive function and attention in gait. Movement Disorders, 23(3), 329-342.

Highlights from the 2015 Meeting of the Vision Sciences Society

Enjoying the view from the VSS Conference in St. Pete Beach, Florida. Photo courtesy of Minjung Kim.

Going to conferences is one of my favorite aspects of being a scientist. As a PhD student, I spend a lot of my life in solitude: when I read new literature, when I program new experiments, or when I conduct new analyses, I am very much alone in my thoughts. Outside of scheduled meetings with my advisors, there is little reason for interaction with other people. The loneliness and boredom that swallow up every graduate student – yes, it will happen to you, too – are an unfortunate, little-discussed aspect of being a PhD student. The isolation, together with the drive to succeed and the illusion that everyone else is doing better than you, can at times wreak havoc on mental health.

But there is a cure: talking to people! Of course, the number one confidants are lab mates or department friends (this is why it’s important to have a friendly, accepting lab culture). Another, perhaps less obvious, outlet for fun and relief is conferences. At conferences, you will:

1) Meet people who find your research novel and interesting
2) Learn about other people’s research
3) Discover that other people have experienced the same graduate school woes that you may be experiencing, and survived.

In short, being at a conference allows you to step back from the details of your own graduate research and think again about why it is you decided to become a scientist in the first place. It’s stimulating. In fact, I cured my ennui with this year’s meeting of the Vision Sciences Society (VSS).

What is VSS?
VSS is an annual conference dedicated to studying the function of the human visual system, and took place this year from May 15th to May 20th in idyllic St. Pete Beach, Florida. With 1400 attendees, VSS focused on all things visual perception, including 3-dimensional (3D) perception, attention, and methodologies spanning from psychophysics to neuroimaging to computational modeling.

This year’s VSS was filled with interesting panels, exhibits, and demos devoted to visual perception, including the viral phenomenon #theDress. I’ve highlighted in my round-up below some of the talks and posters that I found the most interesting.

The research round-up

Optimal camouflage under different lighting conditions
As a person who studies lighting and shading, I loved Penacchio, Lovell, Sanghera, Cuthill, Ruxton and Harris’s poster on how the effectiveness of countershading, a type of camouflage, varies with lighting condition. Countershading refers to shading patterns commonly found in aquatic animals, where the belly is coloured a brighter shade than the back: viewed from below, the animal’s bright belly is camouflaged against the bright light from the sun above, and viewed from above, the animal’s dark back is camouflaged against the darkness of the water.

But should the transition between back and belly be sharp or smooth? Penacchio et al.’s answer is that it depends on whether the lighting is sunny or cloudy. On a sunny day, shading is characterized by high contrast and sharp shadows; on a cloudy day, by low contrast and soft shadows. Penacchio et al. found that when the target’s countershading matched the lighting (e.g., a sharply countershaded target in a sunny scene), people had difficulty finding the target, whereas when the countershading was mismatched to the lighting (e.g., the same target in a cloudy scene), people found it easily. Interestingly, birds behaved the same way as humans, suggesting that optimal camouflage works across different species.

I really enjoy ecologically based studies like this, because they help me understand how biological systems exploit constraints posed by the environment.

Can computers discriminate between glossy and matte surfaces?
Tamura and Nakauchi won the student poster prize for the Monday morning session for answering this question. Glossy materials are interesting, because unlike matte materials, they are characterized by white highlights. It’s an old painting trick: to make objects look glossy and shiny, add white highlights. However, the location of the highlights matters: if highlights are haphazardly placed on the surface without attention to surface geometry, they will look like streaks of paint (Anderson & Kim, 2009).

Tamura and Nakauchi examined whether a computer algorithm (a “classifier”) without a sophisticated understanding of scene geometry could nevertheless learn to discriminate between images of matte, glossy, and textured (painted) surfaces. This does not mean that scene geometry is unimportant for glossiness perception, but rather that the image representation they used (Portilla & Simoncelli, 2000) conveys at least some information about surface geometry without explicitly encoding the shape.
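
In spirit, the classification step could look like the sketch below, which assumes each image has already been reduced to a vector of texture statistics (the random arrays here are placeholders, and the study’s actual features and classifier may differ):

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

rng = np.random.default_rng(1)
features = rng.normal(size=(300, 64))  # stand-in image-statistic vectors
labels = rng.integers(0, 3, size=300)  # 0 = matte, 1 = glossy, 2 = painted

X_tr, X_te, y_tr, y_te = train_test_split(features, labels, random_state=0)
clf = SVC(kernel="rbf").fit(X_tr, y_tr)
print(f"held-out accuracy: {clf.score(X_te, y_te):.2f}")  # near chance on random data
```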

I think this study is an excellent example of combining knowledge in two different fields, in this case human vision and computer vision, to answer an interesting question.

The shrunken finger illusion
One of my favorite talks at VSS this year was the talk on the shrunken finger illusion by Ekroll, Sayim, van der Hallen and Wagemans, a novel illusion that demonstrates how something as basic as your knowledge of your finger length can be overridden by visual cues.

Drawing by Rebecca Chamberlain. Reproduced from Ekroll, V., Sayim, B., Van der Hallen, R., & Wagemans, J. (2015). The shrunken finger illusion: Unseen sights can make your finger feel shorter. Manuscript in revision. Copyright by Ekroll et al. (2015).

Ekroll et al. gave human observers hollow hemispherical shells (imagine a ping pong ball cut in half) to wear on their fingers. When viewed from the top, the observers experienced two illusions: (1) they saw a full sphere, not a hemisphere, and (2) they felt their fingers were shorter.

The explanation has to do with amodal completion, the mental “filling in” of object parts that are hidden behind another object. When my cat peeks out from behind a door, I know that her body has not been truncated in half (perish the thought!); my visual system knows – has a representation of – the rest of her body. Amazingly, humans are not born knowing amodal completion, acquiring this ability at around four-to-six months of age (Kellman & Spelke, 1983).

So, the observers were amodally completing the hemisphere, thus seeing a sphere (Ekroll, Sayim & Wagemans, 2015). But, they also knew that their fingers started behind the “sphere.” So, the observers’ brains are “making room” for the back half of the sphere by assuming that the finger is shorter than usual.

This is a bizarre but interesting illusion that is consistent with previous work on the flexibility of body representations.

Demo Night and #theDress
The second night of VSS is demo night, where researchers and exhibitors share a new illusion, or new software, or anything fun that might not fit in the ordinary proceedings of the conference. At this conference, #theDress was a popular topic, with three demos dedicated to the viral sensation. I presented a demo of my own, as well, based on a project that I am working on with Dr. Richard Murray and Dr. Laurie Wilcox of York University Centre for Vision Research.

Let there be light
When people think of glowing objects, they typically assume that the object must be exceptionally bright. Our demo, an extension of my old master’s thesis work, showed that that’s not true — for some types of glow, it is the perceived shape of the object that determines whether it appears to glow.

We computer-rendered a random, bumpy disc under simulated cloudy lighting. From the front, this disc looks like an ordinary, solid, white object. However, as the disc rotates, revealing its underside, it takes on a translucent appearance and appears to glow.

Note that the luminance of the disc is the same from the front and the back – it is only left-right reversed. But, critically, the correlation between the luminance and the depth changes between the front view and the back view: viewed from the front, the peaks of the disc are bright and the valleys are dark; viewed from the back, the peaks are dark and the valleys are bright. Why are the valleys so bright? It must be because there is a light source either inside or behind the object!
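
The luminance-depth relationship described above is easy to state numerically: correlate per-pixel luminance with surface height. The sketch below uses synthetic arrays as stand-ins for our rendered images and their depth maps:

```python
import numpy as np

rng = np.random.default_rng(2)
depth = rng.normal(size=10_000)  # surface height per pixel (synthetic)
lum_front = depth + 0.5 * rng.normal(size=depth.size)  # peaks bright, valleys dark
lum_back = -depth + 0.5 * rng.normal(size=depth.size)  # peaks dark, valleys bright

r_front = np.corrcoef(depth, lum_front)[0, 1]
r_back = np.corrcoef(depth, lum_back)[0, 1]
print(f"front view r = {r_front:+.2f}, back view r = {r_back:+.2f}")
# A negative correlation (bright valleys) is the cue consistent with a
# light source inside or behind the object, and hence with glow.
```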

The demo was very well received. For me, this was a highlight of the conference — talking to people about something that I am enthusiastic about, and convincing them that it is, in fact, cool. I imagine most scientists feel the same way about their work.

I still see it as white/gold
“The dress” refers to a photo of a dress that went viral in March 2015. To many people, the dress appeared to be white with gold fringes, whereas to others, it appeared to be blue with black fringes; a small minority reported seeing blue/brown. As an unrepentant white/gold perceiver, I was astounded to learn that the dress is, in real life, blue/black.

Most vision scientists agree that the dress “illusion” is an example of color constancy gone wrong. Color constancy is the visual system’s remarkable ability to “filter out” the effects of lighting. For example, my salmon-coloured shirt appears salmon-coloured regardless of whether I look at it indoors or outdoors, on a sunny day or a cloudy day – even though, if I were to take a photo of the shirt, the RGB values of the shirt would vary tremendously across conditions. The predominant explanation of the dress is that different people’s visual systems assume different lighting conditions, and therefore filter differently, resulting in different percepts. One of the dress demos (Rudd, Olkkonen, Xiao, Werner & Hurlbert, 2015) showed that, indeed, the same blue/black dress under different lighting can appear very different, and that, had the dress been white/black, the illusion would not have occurred. (I should note that there were two other dress demos — Shapiro, Flynn & Dixon, and Lafer-Sousa & Conway — but sadly I did not get to see them, as I was busy with my own demo.)
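
A textbook simplification of this “filtering” is a von Kries-style correction: divide each colour channel by an estimate of the illuminant. The numbers below are invented, and this is an illustration of the principle rather than the visual system’s actual algorithm:

```python
import numpy as np

pixel_under_bluish_light = np.array([0.45, 0.50, 0.70])  # RGB, hypothetical
illuminant_estimate = np.array([0.80, 0.90, 1.20])       # assumed light colour

reflectance_estimate = pixel_under_bluish_light / illuminant_estimate
print(reflectance_estimate)  # roughly neutral once the light is divided out
# Two observers with different illuminant_estimate values would recover
# different surface colours from the very same pixels, as with the dress.
```
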
However, questions remain. Why is there such huge individual variability? Why are some people able to flip between percepts? I cannot answer all these questions, but I can direct you to this future issue of Journal of Vision dedicated to exploring the dress. If you have your own idea that you would like to test, the submission deadline is July 1, 2016. In the meantime, you can follow these links to see what vision scientists have said so far.
Gegenfurtner, Bloj & Toscani (2015)
Lafer-Sousa, Herman & Conway (2015)
Macknik, Martinez-Conde & Conway (2015)
Winkler, Spillman, Werner & Webster (2015)

David Knill Memorial Symposium
David Knill, renowned vision scientist, passed away suddenly in October last year, at age 53. He was known for his early work on Bayesian approaches to visual perception – that is, the notion that visual perception is the result of a computation that optimally combines noisy information from the environment with loose prior knowledge about the environment. As Bayesian inference is such an important foundation of computational theories of vision, it is hard to imagine now that there was ever a time when the Bayesian perspective was the minority view in vision science.
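
In its simplest Gaussian form, this Bayesian combination is just a reliability-weighted average of sensory data and prior expectation; the numbers below are purely illustrative:

```python
prior_mean, prior_var = 0.0, 4.0  # loose prior knowledge about the scene
measurement, meas_var = 2.0, 1.0  # noisy sensory measurement

w = (1 / meas_var) / (1 / meas_var + 1 / prior_var)  # weight on the data
posterior_mean = w * measurement + (1 - w) * prior_mean
posterior_var = 1 / (1 / meas_var + 1 / prior_var)
print(posterior_mean, posterior_var)  # 1.6 0.8: estimate pulled toward the prior
```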

In his memory, Weiji Ma – a former post-doctoral fellow of his who is now a professor at New York University – organized a symposium celebrating Dr. Knill’s life and work. Speaker after speaker talked about his dedication to science, and about his kind and gentle personality. Dr. Ma’s tribute, in particular, made me realize that I am lucky to have met Dave Knill when I did, when I applied to work with him for my PhD. The symposium was respectful and touching, and was the perfect way to commemorate a brilliant scientist.
Dr. Knill’s Forever Missed page is here.

The real reason for going to conferences
But the most memorable aspects of VSS were not part of any scheduled proceedings. I remember: catching up with old friends at the Tiki bar, cooking with my friends in the hotel room, complimenting a speaker on his talk, him complimenting me back on the question I asked, commiserating about the lack of job prospects and sharing new-hire stories… and of course, I can’t forget the annual night-time ocean dip that marks the end of VSS.

As banal as it sounds, scientists are what drive science. Science is not done in a vacuum, and some of the best collaborations come out of friendships forged at conferences. And even if nothing productive comes out – so what? Maybe it’s reward enough to know that there are friendly nerds out there who share your interests.

References
Anderson, B. & Kim, J. (2009). Image statistics do not explain the perception of gloss. Journal of Vision, 9(11):10, 1-17. doi:10.1167/9.11.10.

Ekroll, V., Sayim, B., van der Hallen, R. & Wagemans, J. (2015, May). The shrunken finger illusion: amodal volume completion can make your finger feel shorter. Talk presented at the annual meeting of the Vision Sciences Society, St. Pete Beach, FL.

Ekroll, V., Sayim, B. & Wagemans, J. (2015). Against better knowledge: the magical force of amodal volume completion. i-Perception, 4(8), 511–515. doi:10.1068/i0622sas.

Gegenfurtner, K.R., Bloj, M. & Toscani, M. (2015). The many colours of ‘the dress.’ Current Biology. doi:10.1016/j.cub.2015.04.043.

Kellman, P.J. & Spelke, E.S. (1983). Perception of partly occluded objects in infancy. Cognitive Psychology, 15(4), 483-524.

Kim, M., Wilcox, L. & Murray, R.F. (2015, May). Glow toggled by shape. Demo presented at the annual meeting of the Vision Sciences Society, St. Pete Beach, FL.

Lafer-Sousa, R., Hermann, K.L. & Conway, B.R. (2015). Striking individual differences in color perception uncovered by ‘the dress.’ Current Biology. doi:10.1016/j.cub.2015.04.053.

Lafer-Sousa, R. & Conway, B.R. (2015, May). A color constancy color controversy. Demo presented at the annual meeting of the Vision Sciences Society, St. Pete Beach, FL.

Macknik, S.L., Martinez-Conde, S. & Conway, B.R. (2015). How “the dress” became an illusion unlike any other. Scientific American Mind, 26(4). Retrieved from
http://www.scientificamerican.com/article/how-the-dress-became-an-illusion-unlike-any-other/

Penacchio, O., Lovell, P.G., Sanghera, S., Cuthill, I.C., Ruxton, G. & Harris, J.M. (2015, May).
Concealing cues to shape-from-shading using countershading camouflage. Poster presented at the annual meeting of the Vision Sciences Society, St. Pete Beach, FL.

Portilla, J. & Simoncelli, E.P. (2000). A Parametric Texture Model based on joint statistics of complex wavelet coefficients. International Journal of Computer Vision, 40(1), 49-71.

Rudd, M., Olkkonen, M., Xiao, B., Werner, A. & Hurlbert, A. (2015, May). The blue/black and gold/white dress pavilion. Demo presented at the annual meeting of the Vision Sciences Society, St. Pete Beach, FL.

Shapiro, A., Flynn, O. & Dixon, E. (2015, May). #theDress: an explanation based on simple spatial filter. Demo presented at the annual meeting of the Vision Sciences Society, St. Pete Beach, FL.

Tamura, H. & Nakauchi, S. (2015, May). Can the classifier trained to separate surface texture from specular infer geometric consistency of specular highlight? Poster presented at the annual meeting of the Vision Sciences Society, St. Pete Beach, FL.

Winkler, A.D., Spillman, L., Werner, J.S. & Webster, M.A. (2015). Asymmetries in blue-yellow color perception and in the color of ‘the dress.’ Current Biology. doi:10.1016/j.cub.2015.05.004.


Photo by Karen Meberg

Minjung (MJ) Kim is a PhD candidate in the New York University (NYU) Department of Psychology, in the Cognition and Perception program. She studies the visual perception of light and colour, with a keen interest in material perception (e.g., what makes glowing objects appear to glow?). She is co-advised by Dr. Richard Murray at the York University Centre for Vision Research and Dr. Laurence Maloney at NYU.


LAMP Diagnostics: The key to malaria elimination?

A medical researcher with the U.S. Army tests a patient for malaria in Kisumu, Kenya. Photo courtesy of the U.S. Army Africa.

By Patrick McCreesh

Malaria elimination is possible within a generation. But controlling malaria and eliminating malaria are different goals, and each poses distinct challenges. The barriers found in elimination settings call for different strategies and policies, and overcoming them is essential to meeting this goal. Researchers at the Malaria Elimination Initiative (MEI) at the University of California San Francisco (UCSF) are gathering evidence to inform elimination efforts going forward. I am a Master’s candidate in the UCSF Global Health Sciences program, working in the field in collaboration with the University of Namibia on a cross-sectional survey of malaria prevalence.

Malaria 101

Malaria is caused by parasites of the genus Plasmodium, which infect people through the bite of female Anopheles mosquitoes. The parasites hide inside the body’s cells before bursting into the bloodstream, leaving fever, chills, and body aches in their wake. Most people recover from the infection, but in severe cases the parasites invade the brain or cause organ failure. Children under five years old and pregnant women are at highest risk for severe complications.

History and Global Impact

The world has made substantial progress toward reducing the burden of malaria in the past 15 years, thanks to technological innovations in malaria prevention and treatment. For example, pyrethroid-impregnated bed nets are a low-cost and effective way to reduce malaria (for you basketball fans: Stephen Curry not only scores on the court, but assists around the world with his contributions to Nothing But Nets #gowarriors). Rapid diagnostic tests (RDTs) allow for point-of-care diagnosis of malaria. Artemisinin combination therapy (ACT) was a major breakthrough in malaria treatment. Thanks to expanded coverage of these interventions, the global incidence of malaria has decreased by an average of 3.27% per year since 2000, and child mortality due to malaria in sub-Saharan Africa has decreased by 31.5% over the same period. Because of this drop in malaria cases, many believe it is possible to release the world from the grip of malaria permanently. At a landmark conference in 2007, Bill and Melinda Gates galvanized global leaders to commit to malaria eradication.
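Small annual percentages add up. As a back-of-the-envelope illustration (my own arithmetic, not a figure from the sources above), a steady 3.27% yearly decline compounds to a cumulative drop of roughly 39% over the 15 years since 2000:

```python
# Compound a steady annual decline in malaria incidence (illustration only).
annual_decline = 0.0327                     # average yearly decrease since 2000
years = 15                                  # 2000 through 2015
remaining = (1 - annual_decline) ** years   # fraction of the 2000 incidence left
print(f"Cumulative reduction: {1 - remaining:.1%}")   # ~39.3%
```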

Despite this pledge to fight malaria around the world, about 3.2 billion people are still at risk of malaria infection. In 2013, there were an estimated 198 million cases and 584,000 deaths due to malaria. Transmission is most intense in Africa and disproportionately affects those in poverty. There is an urgent need to expand malaria control and elimination efforts to ultimately eradicate malaria. The current malaria eradication strategy calls for expanding control efforts in high transmission settings, eliminating malaria gradually at the fringes of transmission, and developing a preventive vaccine. The estimated cost of malaria elimination is $5.3 billion, but this investment will save millions of lives and dollars in the long run.

A mother and child rest under an insecticide-treated bed net in Zambia. Photo courtesy of the Gates Foundation.

Malaria Elimination
Elimination involves interrupting transmission in a defined geographical area and dropping incidence to zero. Eliminating malaria in low transmission settings raises several concerns that high transmission settings do not.

• Changing epidemiology

The epidemiology of malaria changes as incidence drops. Asymptomatic carriers, people who are infected by the malaria parasite but show no symptoms, are responsible for about 20-50% of transmission, and most of these carriers are adult men rather than children and pregnant women. The proportion of people with sub-microscopic infections also increases, making diagnosis difficult. Finally, imported malaria from highly endemic neighboring countries creates a source of epidemics in countries with low transmission.

• Increased importance of surveillance

In malaria control settings, local clinics see a large number of cases, too many to investigate and intervene at the source of each infection. Malaria surveillance in high burden settings therefore relies on passive case detection, where cases are detected when patients with fever seek care at health facilities, and reporting consists of aggregate counts by district. As the number of malaria cases decreases, it becomes practical and necessary to investigate and find additional cases, which are often asymptomatic. The problem of asymptomatic cases illustrates why the difference between incidence and prevalence matters for malaria elimination. Incidence is the number of new malaria cases out of the total population over a period of time. Prevalence is the number of people with malaria out of the total population at a set point in time (see the toy calculation after this list). Passive surveillance does not measure the true prevalence of malaria, nor does it address the epidemiological and diagnostic challenges in settings preparing for elimination. Effective surveillance in elimination settings requires active surveillance: finding and testing individuals in a selected geographic area, regardless of symptoms, to identify and treat additional cases of malaria.

• Diagnostic difficulties

The currently accepted malaria diagnostic tools all have limitations. Blood smear microscopy is time consuming and requires highly trained staff to be accurate at low parasite densities. Rapid diagnostic tests are quick and easy to use, but their sensitivity drops for infections with fewer than 100 parasites per microliter. The most accurate method, polymerase chain reaction (PCR), requires expensive equipment and reagents and highly trained laboratory personnel, and has long turnaround times. Diagnosing asymptomatic malaria is especially challenging because asymptomatic people do not seek care and their infections are often undetectable by microscopy or RDTs. The changing epidemiology and increased importance of surveillance in elimination settings call for a diagnostic tool that detects asymptomatic infections as well as PCR does, but costs less and gives results faster.
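To make the incidence/prevalence distinction concrete, here is a toy calculation (the numbers are invented for illustration, not taken from any survey):

```python
# Toy cohort: a village of 1,000 people followed for one year.
population = 1000
new_infections_during_year = 30   # infections that began during the year
infected_on_survey_day = 12       # anyone carrying parasites on one survey day,
                                  # including asymptomatic carriers

incidence = new_infections_during_year / population   # 30 per 1,000 per year
prevalence = infected_on_survey_day / population      # 12 per 1,000 on that day

print(f"Annual incidence: {incidence:.1%}, point prevalence: {prevalence:.1%}")
# Passive case detection sees only the symptomatic slice of these numbers,
# which is why elimination programs need active surveillance.
```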
Loop-mediated isothermal amplification (LAMP)

A promising new nucleic acid based method is loop-mediated isothermal amplification (LAMP). Like PCR, LAMP amplifies a specific section of DNA. Unlike PCR, LAMP does not require a thermocycler; the reaction takes place at a constant temperature. The process uses multiple primers and a DNA polymerase to create a stem-loop structure of DNA, which serves as a template for amplification. LAMP has a sensitivity similar to PCR at low parasite density (92.7% at 10 parasites per microliter), with lower costs, faster turnaround times, and lower technical requirements than PCR for malaria diagnosis. LAMP can detect malaria DNA from dried blood spots, which are commonly used to collect and test samples in active surveillance. In the lab at the University of Namibia, the speed of LAMP has enabled our group to process about 150 samples per day. Fast and accurate results make LAMP a good option for diagnosing malaria in elimination settings.

Closing thoughts

The new challenges presented in malaria elimination settings will require different surveillance and diagnostic tools. Researchers at the Malaria Elimination Initiative at UCSF are gathering evidence to inform elimination efforts going forward. As more countries transition from malaria control to elimination, policy should be made with these challenges in mind. Most importantly, policy makers should ask: “Does our health system have the tools to quickly investigate every case, test everyone around the case, and provide treatment to positives in a timely manner?” A recent article in PLOS Neglected Tropical Diseases about malaria elimination in Mesoamerica and Hispaniola notes that LAMP could help more countries answer “yes” to this question, ultimately bringing us closer to a malaria-free world.

Patrick McCreesh is a Master’s candidate in the Global Health Sciences program at University of California, San Francisco. His research focuses on infectious disease, with an emphasis on malaria elimination efforts in partnership with the UCSF Malaria Elimination Initiative.


Uncovering the biology of mental illness

The visual cortex of the brain is depicted here. Photo courtesy of Bryan Jones.

By James Fink

The human brain is capable of remarkably complex processes. It senses time and visualizes space. It allows us to communicate through language and create beautiful works of art. But what happens when these cognitive abilities go awry? The National Institute of Mental Health (NIMH) defines serious mental illness (SMI) as a mental, behavioral, or emotional disorder that interferes with or limits one or more major life activities. The survey NIMH cites estimates the prevalence of SMI in the United States at ~4%, with the estimated prevalence of any mental illness at ~18%.

Mental illnesses, also known as psychiatric disorders, span many diverse conditions, including anxiety disorders, obsessive-compulsive disorder (OCD), and post-traumatic stress disorder (PTSD). This group of illnesses presents a major global burden. In the 2010 Global Burden of Disease report, mental and substance use disorders accounted for 7.4% of total disability-adjusted life years (DALYs) globally and 8.6 million years of life lost (YLL), and they were the single greatest cause of years lived with disability (YLD) worldwide. Psychiatric disorders also pose a significant burden to individuals and their families, and a challenge for clinicians and scientists.

For clinicians, many psychiatric disorders are difficult to treat. Even individuals with the same disorder present with a spectrum of symptoms and symptom severity, and many of the drugs currently used to treat these disorders carry a multitude of undesirable side effects. For scientists, the mechanisms underlying this family of illnesses are still being unveiled; reliable biological explanations remain elusive, though it is known that biology and genetics play a role.

This lack of insight into the determinants of these disorders is part of what makes developing effective pharmacological treatments so difficult. Though various treatments exist for each of these disorders (ranging from drugs to cognitive behavioral therapy (CBT)), they leave much room for improvement. The field of neuroscience in particular is providing insight into the brain systems, cellular deficits, and genetics behind many of these disorders, which may aid the development of new therapies. Current research has limitations: many of these insights come from the study of animal models, and conflicting results are common in the literature. Despite these obstacles, neuroscience holds a wealth of promise for better understanding psychiatric illness, aided by a new set of model systems and new research techniques.

Schizophrenia – An example of what we know

Perhaps the most studied psychiatric disorder is schizophrenia, which affects about 1% of the general population and is characterized by a variety of impairments, including a loss of affect and motivation and, often, the presence of hallucinations and delusions. A search in PubMed for articles with “schizophrenia” in the title yields more than 50,000 results, an indication of just how much research focuses on this disorder. Researchers have identified several hereditary factors (genes) with diverse functions that may be tied to schizophrenia:

• DISC1 (disrupted in schizophrenia 1) is a gene that makes a protein with many interacting partners and plays a role in a variety of pathways within cells, including the ability of cells to divide, mature, and move toward their final location within a tissue. Such processes, if disturbed, have been shown to lead to neurological deficits and disorders.

• Neuregulin 1 is a member of a protein family that includes three other neuregulins. Perhaps most interestingly (and to make matters more complex), neuregulin 1 itself can undergo a type of processing in the cell, called alternative splicing, that produces many alternative forms of neuregulin 1 (up to 31!), all performing slightly different functions. The main job of neuregulin 1 seems to be to aid brain and nervous system development.

• The CACNA1C gene encodes a protein that forms part of an important calcium channel, which plays a role in a variety of neuronal functions.

• SHANK3 is used by the cell as a scaffold, providing support for other cellular molecules important for the signaling between individual neurons of the brain.

Genes are not the only story, though; researchers have also identified deficits at the cellular and network levels of the brain. The brain is composed of two major cell types: neurons and supporting cells called glia. Neurons themselves come in several classes (the exact number depends on the classification system you use), and each class plays its own important role in proper brain function. For instance, excitatory neurons use a chemical called glutamate to signal to other cells and promote the activity of their partners, whereas inhibitory neurons use a chemical called GABA and quiet their partners. There have been many reports of disrupted function of both excitatory and inhibitory neurons in mouse models of schizophrenia, across multiple brain regions and ages. But there have also been reports that fail to find a disruption of either of these cell types.
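To illustrate the push-pull relationship described above, here is a toy firing-rate simulation (my own illustration; the weights and constants are arbitrary values, not measurements from any schizophrenia model):

```python
# Toy excitatory/inhibitory (E/I) rate model: E excites itself and I,
# and I feeds back to quiet E. All parameter values are arbitrary.
dt, tau, steps = 0.001, 0.02, 1000   # time step, time constant (s), ~1 s total
E, I = 0.1, 0.0                      # initial firing rates (arbitrary units)
w_EE, w_EI, w_IE = 1.2, 1.0, 1.5     # synaptic weights (assumed)
drive = 0.5                          # external input to E

for _ in range(steps):
    dE = (-E + max(0.0, drive + w_EE * E - w_IE * I)) / tau
    dI = (-I + max(0.0, w_EI * E)) / tau
    E, I = E + dt * dE, I + dt * dI

print(f"Steady-state rates: E = {E:.3f}, I = {I:.3f}")
# With w_IE = 0 (no inhibitory feedback), E would grow without bound,
# because w_EE > 1 makes recurrent excitation self-amplifying.
```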

So what is the primary mechanism? This is exactly the problem outlined above: the complexity of mental illness makes it difficult for researchers to pin down a single biological explanation. Variations in mouse models, experimental approaches, animal age, or the brain region being studied may all contribute to inconsistencies across findings. The brain changes over time, and each brain region behaves a little differently from even its neighbors. These factors complicate the search for accurate and meaningful deficits in psychiatric disorders, a problem that new tools may soon help resolve.

A microscopy image of a mouse brain. One of the major barriers to greater understanding of the biological mechanisms of mental illness is reliance on animal models, which have different experiences of mental illness than humans. Image courtesy of Zeiss Microscopy.

The new toolbox
Optogenetics
A recent article in The New Yorker profiles Karl Deisseroth, a psychiatrist and neuroscientist at Stanford University. In the article, Deisseroth describes the difficulty of treating and understanding neurologic and psychiatric disorders, asking: “When we have the complexity of any biological system — but particularly the brain — where do you start?”

Dr. Deisseroth is known for more than treating patients. He is one of the inventors of one of the most exciting experimental techniques in neuroscience today: optogenetics. Optogenetics allows the expression of light-sensitive channels in neurons. By combining this technique with genetic and viral approaches, researchers can insert these channels into very specific populations of neurons. Ultimately, this allows researchers to control distinct groups of neurons and individual circuits of the brain using flashes of light, providing unprecedented control over cellular and circuit function.

The study of the neural circuits underlying behavior has long been a main aim of neurobiology. Various circuits that underlie human behaviors and cognitive functions are now known, and the specific circuits affected in psychiatric illnesses are starting to be uncovered. Applying optogenetics to the study of these disorders will give researchers a much more precise way to probe how various circuits function in models of neuropsychiatric disorders without affecting surrounding circuits. This precision matters, because non-specific circuit stimulation can confound results.

iPSC
The advent of induced pluripotent stem cells (iPSCs), a method published by Shinya Yamanaka’s group in 2007 in which skin cells are reverted to a stem cell-like state via the expression of “reprogramming factors,” now allows researchers to use human cells to study disease. iPSCs can be driven to form various cell types, including neurons, by exposing the cells to a cocktail of factors known to be important in driving the development of nervous tissue.

Before the Yamanaka group’s discovery, the only ways to study human brain tissue were post-mortem samples and human embryonic stem cells. Now, iPSC technology lets researchers collect skin cells from large groups of patients via skin biopsy and turn those samples into patient-specific neurons. Because these neurons can be derived from actual patients with mental disorders, researchers can study these diseases using the patients’ own human neurons.

Cerebral Organoids
Understanding the neural circuits disrupted in neuropsychiatric disorders remains a huge goal for neuroscience research. This is highlighted by the BRAIN Initiative, put forth by the Obama administration in 2013, which aims to understand how individual cells and neural circuits work together so that researchers and clinicians can better understand brain disorders. Until a few years ago, approaches to studying and probing brain circuits, even with optogenetics, were limited to animal models, because cells grown in a culture dish fail to form the neural circuits observed in the brain. But a 2013 paper from the Knoblich lab showed that iPSC-derived neurons can be used to create “cerebral organoids”: small bits of neural tissue (about 4 mm in diameter) that express markers characteristic of various brain regions, including cortical and hippocampal regions.

Since the publication of this innovative technique, other groups have published similar methods, creating additional versions of 3-D neural cultures, even cerebellar-like structures (the cerebellum is important for movement and coordination). In fact, WIRED magazine recently covered a paper describing what the authors call “cortical spheroids” (and what WIRED calls “brain balls”), a different method for developing organoid-like structures. These techniques cannot yet be used to study neural circuits as they truly exist in the actual mouse or human brain; the circuits and brain-like regions observed in culture are very rudimentary. However, the advancement of these techniques over the next decade or two could provide new and exciting ways to probe actual human brain circuits using patient cells.

Though every day we gain greater insight into how the human brain works and how it might be disrupted in psychiatric disorders, we are far from uncovering the exact circuits and mechanisms that underlie each of these disorders. It is clear that tools such as optogenetics, iPSC-derived neurons, and cerebral organoids enable tremendous control over, and detailed study of, human neurons from patients. Together, these approaches may yield a better understanding of how human neurons and neural circuits go awry in these disorders, leading to the identification of novel targets for drug therapies, providing promise for patients, and finally allowing scientists and clinicians to uncover the biology behind mental illness.

References

1. Uhlhaas, P. J. & Singer, W. Abnormal neural oscillations and synchrony in schizophrenia. Nature Reviews Neuroscience 11, 100–113 (2010).
2. Brandon, N. J. & Sawa, A. Linking neurodevelopmental and synaptic theories of mental illness through DISC1. Nature Reviews Neuroscience 12, 707–722 (2011).
3. Green, E. K. et al. The bipolar disorder risk allele at CACNA1C also confers risk of recurrent major depression and of schizophrenia. Molecular Psychiatry 15, 1016–1022 (2009).
4. Mei, L. & Xiong, W.-C. Neuregulin 1 in neural development, synaptic plasticity and schizophrenia. Nature Reviews Neuroscience 9, 437–452 (2008).
5. Gauthier, J. et al. De novo mutations in the gene encoding the synaptic scaffolding protein SHANK3 in patients ascertained for schizophrenia. Proceedings of the National Academy of Sciences 107, 7863–7868 (2010).
6. Feng, Y. & Walsh, C. A. Protein-protein interactions, cytoskeletal regulation and neuronal migration. Nature Reviews Neuroscience 2, 408–416 (2001).
7. Lewis, D. A., Curley, A. A., Glausier, J. R. & Volk, D. W. Cortical parvalbumin interneurons and cognitive dysfunction in schizophrenia. Trends in Neurosciences 35, 57–67 (2012).
8. Takahashi, K. et al. Induction of pluripotent stem cells from adult human fibroblasts by defined factors. Cell 131, 861–872 (2007).
9. Dolmetsch, R. & Geschwind, D. H. The human brain in a dish: the promise of iPSC-derived neurons. Cell 145, 831–834 (2011).
10. Lancaster, M. A. et al. Cerebral organoids model human brain development and microcephaly. Nature 501, 373–379 (2013).
11. Paşca, A. M. et al. Functional cortical neurons and astrocytes from human pluripotent stem cells in 3D culture. Nature Methods 12, 671–678 (2015).


Deepwater Horizon: Five years later, research continues but headlines fade

An aerial view of the oil leaked from Deepwater Horizon, taken May 6, 2010. Photo courtesy of The Boston Globe.

By Kristan Uhlenbrock

June 8th marked another World Oceans Day that has come and gone with Presidential Proclamations and local advocacy events. But the fanfare about the ocean doesn’t seem to last. It’s been five years since the Deepwater Horizon oil spill in the Gulf of Mexico, and while scientific research continues to uncover more details about the impacts of the spill, the public pays less attention as time moves on. Yet many questions still go unanswered. Where did all the oil go? How is it impacting the environment? What does this mean for future oil spill response?

Interest in the worst oil spill disaster in U.S. history has diminished, as measured by the number of web searches performed over the past five years (see Figure 1).

Figure 1. Web search interest in “Deepwater Horizon oil spill” over time, as measured by Google Trends. Data from Google 2015.
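Anyone curious to reproduce this kind of trend line today can sketch the query in a few lines. This sketch uses the unofficial pytrends package, which is my assumption here; the original figure was presumably exported directly from the Google Trends website:

```python
# Minimal sketch using the unofficial pytrends package (pip install pytrends).
# Google Trends has no official API, so this can break if Google changes things.
from pytrends.request import TrendReq

pytrends = TrendReq(hl="en-US", tz=360)
pytrends.build_payload(
    ["Deepwater Horizon oil spill"],      # the search term behind Figure 1
    timeframe="2010-04-01 2015-06-01",    # the spill through five years later
)
interest = pytrends.interest_over_time()  # weekly interest, scaled 0 to 100
print(interest["Deepwater Horizon oil spill"].idxmax())  # peak lands near mid-2010
```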

Yet research is still unveiling interesting new findings about the impacts of the spill, and the legal process to determine how much BP will pay continues to play out in the courts.

4.9 million

This is how many barrels of oil spilled into the ocean during the Deepwater Horizon disaster in the summer of 2010. For comparison, that is more than seven times the amount of oil that enters the Gulf of Mexico annually from natural seepage and other activities, like transportation.

As the oil spill was happening, researchers and other experts struggled to put an exact number on the amount of oil flowing out of the wellhead. Further complicating matters was the question of where the oil was going and where it would end up. Tourism to the Gulf region suffered as nervous travelers cancelled their trips, with lingering effects on the local economy for years after the spill.

Nonetheless, scientists today are making strides on some of those questions. In a recent PLOS ONE article, researchers from the University of South Florida found elevated hydrocarbon concentrations in the DeSoto Canyon, a 1,500-m-deep trench in the Gulf of Mexico northeast of the spill site, in sediment samples taken from December 2010 to February 2011. This deep-sea environment is home to a unique ecosystem, but impacts of the oil spill on the corals and other organisms there have been difficult to study. Finding above-normal amounts of hydrocarbons in this deep-sea region therefore sheds some light on where some of the oil ended up and what that could mean for the ecosystem.

Modeling where the oil went is complicated by multiple modes of mixing and transport: currents, eddies, and three-dimensional circulation patterns. The researchers identified a couple of possible pathways by which hydrocarbons could travel from the well blowout site to the DeSoto Canyon. One is the interaction of the oil with microbes and particulates that bind to it and carry it to the sea floor as they sink, commonly referred to as “marine snow.” The other is direct contact between the subsurface oil plume spilling out of the wellhead and the sediments on the trench slope.

8,743

The number of foraminifera species in the modern record, the majority of which are found on the sea floor. Forams, short for foraminifera, are an essential part of the marine food web and can serve as indicators of coral health and other environmental conditions. Many forams have calcium carbonate shells, similar in composition to the skeletons that build coral reefs, so how forams react to changing ocean chemistry can be a marker for other species. Forams are also sensitive to hydrocarbons and other toxins.

Another recent study, conducted by many of the same researchers and released in March 2015, examined how forams were affected by the oil spill. The density of forams on the deep-sea floor of the Gulf of Mexico (1000-1200 m) decreased by 80-93% at two sites sampled in December 2010. Another set of samples taken in February 2011 suggested that forams closest to the wellhead may have begun to recover; however, the site located further away showed foram populations approaching zero. The researchers encouraged further sampling to determine whether recovery continued or occurred at either site.

The eastern region of the Gulf of Mexico, and the deepest parts of the sea floor, were not well mapped before the disaster, but they are considered hot spots of abundance and diversity. As oil spilled into the ocean for months, researchers scrambled to determine baseline estimates for the ecosystem, including sediment cores to examine pre-spill bottom habitat. They also used remotely operated vehicles to compare sites contaminated with oil to sites that were not. The deep darkness of the DeSoto Canyon harbors corals and invertebrates like the brittlestar. These unique ecosystems serve as feeding grounds, nurseries, and homes for commercially valuable fisheries, so understanding how they were affected by the oil spill will help determine how recovery funds get distributed.

13.7 billion

This is the amount of money, in dollars, that BP faces in penalties for spill-related violations of the Clean Water Act. Eighty percent of these penalties will go to a trust fund supporting research, restoration, and recovery efforts in the region, as directed by the RESTORE Act, passed by Congress in 2012.

This large sum of money directed toward the Gulf region will fund research for many years to come. And although public attention may no longer be focused on the aftermath of the Deepwater Horizon oil spill, scientists are gaining a little more understanding every day to deal with the decades-long process of recovery.

References

McNutt, M.K., et al., 2012. Applications of science and engineering to quantify and control the Deepwater Horizon oil spill, Proc. Nat. Acad. Sci., doi:10.1073/pnas.1214389109

Paris, C.B. et al., 2012. Evolution of the Macondo Well Blowout: Simulating the Effects of the Circulation and Synthetic Dispersants on the Subsea Oil Transport, Environ. Sci. Technol., doi:10.1021/es303197h

Romero, I.C., et al., 2015. Hydrocarbons in Deep-Sea Sediments following the 2010 Deepwater Horizon Blowout in the Northeast Gulf of Mexico, PLOS ONE, doi:10.1371/journal.pone.0128371

Schwing, P.T. et al., 2015. A Decline in Benthic Foraminifera following the Deepwater Horizon Event in the Northeastern Gulf of Mexico, PLOS ONE, doi:10.1371/journal.pone.0120565

Kristan Uhlenbrock is a marine scientist and graduate student in the Johns Hopkins science writing program. Follow her @oceankris.


Celebrating 10 years of Athena SWAN Charter advancing women in science

A scientist performs some tests in a beaker for AIDS research. Photo courtesy of World Bank Photo Collection.

By Sara Carvalhal

Gender inequality in science has been in the news lately, particularly around the fall-out from Sir Tim Hunt’s biased comments about female scientists. His comments do not stand in isolation; rather, they point to the need for greater efforts to promote gender equality and advance women’s roles in the scientific workplace. The Athena SWAN Charter, launched in the UK 10 years ago as of June 2015, is one such effort that merits recognition. The charter is a policy framework designed to create greater opportunities for, and representation of, women in science, technology, engineering, mathematics and medicine (STEMM) fields. In recognition of this anniversary, I will take a closer look at how Athena SWAN has worked to advance gender equality in STEMM.

Women are underrepresented in STEMM

In 2002, Patricia Hewitt, then Secretary of State for Trade and Industry, was concerned that women were not appropriately represented in science, engineering and technology (SET). Baroness Susan Greenfield CBE was appointed to examine the representation of women in SET and to advise the UK government on best practices to address women’s underrepresentation along the scientific career path, in both the private and public sectors. Baroness Greenfield is a scientist, and at the time she was director of the Royal Institution, an organization devoted to science education and research.

In the resulting report, Set Fair, she concluded: “The problem is not just a social and cultural one, although inequity should be addressed in all its forms, but also economic, and as such cannot be ignored. If Britain is to remain a nation with successful and developing businesses of all sizes, it must make the most of its workforce.”

The British Government recognized that STEMM subjects were an important part of the UK economy and that women were not adequately represented in STEMM, an issue affecting innovation, economic growth, and productivity. In response to the report, and to promote women’s engagement in STEMM subjects, the Government partnered with non-governmental organizations (NGOs) to launch a series of policies addressing the issues facing women in SET and promoting science and innovation investment.

A joint effort to help elevate women in STEMM

In 2005, two organizations, the Athena Project and the Scientific Women’s Academic Network (SWAN), signed a charter, known as the Athena SWAN Charter. The main goal of this charter was to reverse the consistent loss of women from STEMM employment.

The charter is particularly novel because it recognizes achievements in recruiting, retaining, and promoting women at each career stage in STEMM with an official award.

In general, organizations have relied on enthusiastic programs built around events, websites, and/or handbooks to tackle the problems of retaining and recruiting women in STEMM subjects. The Athena SWAN Charter instead rewards institutions that adopt policies supporting the career development of women in STEMM. The official recognition, a bronze, silver, or gold award based on performance, also functions as (indirect) advertising of an institution’s internal policies.

Any British institution from the private or public sector can become an Athena SWAN member. By signing the charter, an institution promises to take action on the charter’s six principles. In summary, all Athena SWAN members should address gender inequalities at all levels of the organization by changing cultures and attitudes. For example, key institutional policies can be reviewed to favor long-term contracts over short-term employment, and institutions can improve pathways from the PhD to a sustainable academic career in science.

Each member may submit an application based on a self-assessment and a plan of future actions. A peer review panel evaluates the assessment and determines how well existing policies support women in STEMM. Even after an institution receives official recognition, new assessments are made routinely to ensure that policies supporting women’s progress in STEMM are maintained.

The first awards were presented in 2006. Currently, there are 253 award-holding universities and departments across the 129 Athena SWAN members, and the membership represents more than half of all higher education institutions in the UK.

University of Dundee — a bronze Athena SWAN Award

I am a PhD student at the College of Life Sciences at the University of Dundee in Scotland, a recipient of a bronze Athena SWAN award. Though we are only at the early stages, I have been inspired by the significant changes in the academic environment since the university adopted Athena SWAN policies. The university frequently hosts talks by female and male scientists about their career paths, as well as Q&A sessions where we can raise more personal issues. I also feel that departments have become more open to hiring women since adopting the charter. The policies have had a measurable effect on the recruitment of qualified women in science: the number of women applying to SET departments increased by almost 10% within a year after the University of Dundee adopted more flexible working patterns and family-friendly policies. Over the same period, the number of women selected for promotion in SET departments increased by more than 30%, suggesting that Athena SWAN policies can lead to real progress on gender equality in the sciences.

Female scientists have been taking to Twitter to poke fun at Sir Tim Hunt’s comments and bring attention to sexism in science under #distractinglysexy.

Athena SWAN Charter: The first step to changing social norms

When the Athena SWAN Charter was first created, statistics showed that the majority of graduate and post-graduate biology students were women, yet fewer than 10% of senior positions in higher education were held by women. Through the self-assessments, the national scheme found that many women lacked career support between the post-doc and tenure-track stages, and it is during this period that women drop out of SET careers. This period also coincides with the time many women elect to start families, which suggests that STEMM institutions are not adequately accommodating families.

Many young scientists also believe that it is not feasible to be both a scientist and a mother. Athena SWAN members have helped change this notion by promoting and supporting more flexible working patterns for taking care of dependents, including spouses, children, and even parents. Members also provide funding for family support programs (e.g., covering the cost of childcare during a conference or other commitments).

Expanding Athena SWAN Charter to other subjects

Recently, the Athena SWAN Charter was expanded to 10 key principles. Building on the initial six, Athena SWAN members now recognize that academia cannot reach its full potential unless it benefits from the talents of all individuals, and new institutional policies can be established to address the gender pay gap and discrimination based on gender identity. Women’s underrepresentation in senior roles is not limited to STEMM; women are also underrepresented in senior roles in the arts, humanities, social sciences, business, and law (AHSSBL). I believe these disciplines would see real progress toward gender equality if the principles of the Athena SWAN Charter were extended to their professional communities.

Some see the scheme as a feminist or women-centered movement and attempt to discredit programs designed to level the playing field. But I believe that, by promoting greater equality, the principles of the Athena SWAN Charter work to the benefit of all people.

References
Official Athena SWAN Charter: http://www.ecu.ac.uk/

WES (Women’s Engineering Society): http://www-womeninengineering.eng.cam.ac.uk/Athenaswan/history

British Department for Business Innovation & Skills: https://www.gov.uk/government/organisations/department-for-business-innovation-skills

Athena SWAN University of Dundee website: http://www.dundee.ac.uk/hr/athenaswan/

Baroness Greenfield. Set Fair. A Report on Women in SET, 2002. [ONLINE]:
http://image.guardian.co.uk/sys-files/Education/documents/2002/11/28/4408-DTI-Greenfield.pdf

Sarah Hawkes. Report on Athena SWAN Charter for Women in Science, 2011. [ONLINE]:
http://www.ecu.ac.uk/wp-content/uploads/2015/04/Athena-SWAN-Impact-Report-2011-1.pdf

Joe Cullen, Kerstin Junge, Chris Ramsden. Evaluation of the UK Resource Centre for Women in Science, Engineering and Technology, 2008. [ONLINE]:
http://www.tavinstitute.org/wp-content/uploads/2013/01/Tavistock_Report_Evaluation-of-the-UK-Resource-Centre-for-Women-in-Science-Engineering-and-Technology_2008.pdf

The Royal Society, Leading the way: Increasing diversity in the scientific workforce. [ONLINE]:
https://royalsociety.org/policy/projects/leading-way-diversity/athena-swan-charter-awards/

Alison Kingston-Smith. Wisdom, justice and skill in science, engineering and technology: Are the objectives of the Athena Project mythology? Bulletin The Society for Experimental Biology, March 2008. [ONLINE]:
http://www.sebiology.org/publications/Bulletin/March08/Athena.html
