We can’t live with anything less than Open

The Open Access Button has mapped over 6,000 paywalls since it launched four months ago. We know this is just the tip of a very large restricted-access iceberg, and there is still so much work to do. Currently we are recruiting new student team members and a steering committee. We’ve also started developing Button 2.0 and will have exciting announcements in the coming weeks. To make sure you’re up to date, follow us on Facebook and Twitter.

The Open Access Button seeks to make the often invisible problem of paywalls visible, but paywalls aren’t the only open access problem that needs attention. The last few days alone have highlighted problems with publishers and governments that deserve the same visibility.

There are the unethical and possibly illegal actions of the publisher Elsevier. Two years ago Dr. Mike Taylor blogged about Elsevier charging to download open access articles, and last August Dr. Peter Murray-Rust called attention to Elsevier charging to read open access CC-BY articles. On Sunday, Murray-Rust revisited the topic in “Elsevier are still charging THOUSANDS of pounds for CC-BY articles.” Murray-Rust found that many open access CC-BY articles were labeled as “All rights reserved” and that users would be charged hefty sums for permission to reprint them. In one example Murray-Rust noted, Elsevier charged 8,000 GBP merely for permission to make 100 reprints of a CC-BY article that was incorrectly labeled as “All rights reserved.” No one should have to ask permission to re-use a CC-BY paper in any way.

Authors pay an article processing charge (APC) of $500–5,000 ($3,000 is typical) to publish their articles open access under a CC-BY license in an Elsevier journal. So Elsevier profits twice: from APCs and from the large permission charges on incorrectly labeled articles. Alicia Wise, Director of Access and Policy at Elsevier, responded to Murray-Rust’s email stating that Elsevier is “improving the clarity of our OA license labeling … This is work in progress and should be completed by summer.” That means fixing the OA license labeling will take at minimum a full year. Taylor, Murray-Rust, and many other bloggers and Twitter users have made some noise about the issue, but there’s more work ahead to hold Elsevier and other publishers accountable.


On Monday, H.R. 4186, the Frontiers in Innovation, Research, Science and Technology (FIRST) Act, was introduced in the United States House of Representatives. Section 303 of the FIRST Act would be a leap backwards for open access to United States federally funded research articles. Among the harms in Section 303: embargoes would grow from the current six to 12 months to as long as three years, federal agencies would not be guaranteed full-text copies of their articles to archive, and it remains unclear what data would be made accessible and where it would be stored. Read more about the FIRST Act from SPARC here.

Paywalls are the most obvious blockade to articles, but we need to keep our eyes open to other problems on the open access front, even in areas we think we already “won” by publishing openly. As advocates we need to hold publishers and governments accountable. We need to follow up by ensuring publishers properly display an article’s open access and copyright status and are not confusing (whether intentionally or not) their sites’ users. We need to push for stronger public access to publicly funded research and fight back when governments roll back the progress open access advocates have made.

 

The Open Access Button team is currently composed of student volunteers. Most of us will soon become early career researchers, librarians, or doctors, and we see the need for open access. Younger academics are often told that open access is risky, or advocates call for “punk scholars” to pave the gold road first. But isn’t it more risky to let our research hide behind paywalls? To silently wait out a publisher’s year-long label fix while they continue to profit off the mistake? Or not to reach out to our elected representatives and challenge bills that will harm access?

Last week SPARC held their Open Access Meeting in Kansas City. One of the most notable presentations from the conference came from Dr. Erin McKiernan, an early career researcher working at the National Institute of Public Health in Mexico. Her institution only has access to 139 journals and she has pledged to be open. During the talk, McKiernan said, “If I am going to ‘make it’ in science, it has to be on terms I can live with.” (Presentation slides here, and video available in the next 2-3 weeks here.)

Over at team Button, we’re on the same page with McKiernan. This isn’t about what is “risky” or who is “punk” enough. It is about what we can live with. We can’t live with publishers incorrectly labeling open access articles and charging users just for the permission to make copies. We can’t live with governments setting back public access to research. We can’t live with anything less than open. How about you?

Start taking action to support open access by opposing Section 303 of the FIRST Act if you’re a U.S. citizen and by joining the Right to Research Coalition.


Chealsye Bowley is a solo librarian and Master’s in Library and Information Studies student. She presently coordinates social media for the Open Access Button and will soon be transitioning into Launch Coordinator. You can follow her on Twitter at @chealsye and the Open Access Button at @OA_Button. 


That Homeostat’s Got Rhythm!

Biological stability, or ‘homeostasis’, where an organism works to maintain an internal ‘steady state’ in response to the environment, is a concept familiar to all modern biologists. At all levels of organization, from cells to entire organisms, negative and positive forces act in a yin and yang fashion to dynamically respond to the environment and re-establish balance. In humans, for instance, our temperature ‘set point’ is ~98 °F. In response to an infection, our systems respond by raising the ‘set point’ via changes in neuronal activity in the hypothalamus (a key brain region in the regulation of temperature). This produces a fever, which facilitates immune system function and stifles the growth of the invading pathogen. When allowed to run rampant, however, fever can be detrimental to the health of the organism and eventually lead to death. This highlights the importance of precisely regulated homeostatic mechanisms in maintaining health and preventing disease.

If your education in the life sciences was similar to mine, a lot of emphasis was placed on homeostasis, with little if any placed on an equally important concept: biological rhythms. For a long time, researchers dismissed unexplained daily variations in what they were studying (like temperature, blood pressure, or growth hormone levels) as ‘noise’ rather than true biological phenomena. Indeed, until the latter half of the 20th century, research on ‘biological rhythms’ was lumped into an unfashionable category along with fringe subjects like astrology and ‘mood rings’. I wrote this post to address misunderstandings about the relationship between homeostasis and biological rhythms, and to share some real-world examples that highlight why that relationship matters.

Temperature regulation shows how the set point (or “homeostat”) can adapt in response to challenges from the environment to aid in survival (a process called allostasis). To add another layer to this, wouldn’t it be even more advantageous for an organism to pre-emptively change its set point to coincide with predictable environmental challenges (i.e., those that occur with some predictable frequency)?  In humans, for instance, body temperature and cortisol rise during the day to promote alertness, while growth hormone and melatonin secretion peak at night to aid in rest and recovery. At first glance, these variations seem to fly in the face of homeostasis, which dictates that each of these factors should remain at a fixed point to ensure optimum function. The reality is that this point must move up and down over time because the obstacles an organism must face change throughout the day and the year. Biological rhythms govern these changes and help explain why they occur.

Jürgen Aschoff, a fundamental figure in biological rhythm research, described the link between homeostasis and daily rhythms in the mid-1960s:

‘Homeostasis is a shielding against the environment, one might say, a turning away from it. For a long time, this phenomenon has been taken as the prime objective for an overall organization in physiology; and it evidently has great survival value. But there is another general possibility in coping with the varying situations in the environment; it is, instead of shielding, ‘to turn toward it’; instead of keeping the ‘milieu interne’ stable, to establish a mirror of the changing outside world in the internal organization. This has one clear prerequisite; the events in the environment must be predictable, which of course is the case when they change periodically.’

Nicholas Mrosovsky uses temperature regulation in the camel as a real-life illustration of Aschoff’s point. A camel faces a major problem every day of its life: how to keep cool. It is too big to burrow into the sand, there is hardly any shade, and it would quickly die of dehydration if it relied on evaporative cooling, dissipating heat through sweat. There is a fundamental opposition between water balance and temperature regulation. Evolution has given the camel the necessary tools to deal with this situation.

The camel offers a prime example of homeostatic ‘set point’ regulation over time (Credit: Wikipedia)

During the day, the camel’s body temperature can rise as high as ~106°F – a dangerous, potentially lethal fever if found in humans. At night, however, when water is scarce, the camel drops its temperature dramatically to ~93°F, which would classify as dangerous hypothermia in people. The camel drastically reduces its temperature to protect itself from the next day’s heat: because it drops its temperature so low at night, it takes longer to heat up the following day. In other words, the camel’s ‘set point’ is not fixed; it varies in response to predictable environmental challenges.
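To make the idea of a drifting set point concrete, here is a minimal Python sketch. It is my own toy model, not taken from Mrosovsky or from camel physiology data; the sinusoid and the gain are illustrative assumptions, chosen only so the nightly low sits near 93°F and the afternoon high near 106°F.

```python
import math

def set_point(hour: float) -> float:
    """Illustrative daily set-point rhythm: lowest (~93 F) in the early morning,
    highest (~106 F) mid-afternoon. The curve is invented for illustration."""
    return 99.5 + 6.5 * math.sin(2 * math.pi * (hour - 9) / 24)

def simulate(hours: int = 24, gain: float = 0.3) -> list:
    """Toy negative-feedback homeostat: each hour, body temperature moves a
    fraction (`gain`) of the way toward the current set point."""
    temp = set_point(0)
    trace = []
    for h in range(hours):
        temp += gain * (set_point(h) - temp)  # feedback toward a moving target
        trace.append(temp)
    return trace

if __name__ == "__main__":
    for h, t in enumerate(simulate()):
        print(f"hour {h:2d}: set point {set_point(h):5.1f} F, body temp {t:5.1f} F")
```

The point of the toy is simply that regulation and rhythm are not in conflict: the feedback loop still defends a set point; the set point itself just moves on a schedule.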

An organism’s physiology isn’t the only thing that needs precise timing; its behavior is set to a rhythm as well. Animals must adapt not only to a spatial niche (e.g., canopy, tide pools) but also to a temporal niche (e.g., nocturnal, diurnal). It’s not only what an animal does that’s important, but when it does it. The ability to predict future events (consciously, or in the case of biological rhythms, unconsciously) is of paramount importance in passing genetic information on to future generations. One dramatic example of precise timing is the 17-year cicada, which emerges in a predictable fashion after lying dormant for nearly two decades. Another is the short-tailed shearwater, a bird that arrives at its breeding site in mid-autumn on small islands north of Tasmania. All the individuals in the population lay their eggs between November 24th and 27th each year, and the eggs hatch at the same time. The cicada and shearwater use their exquisite timekeeping machinery to overwhelm potential predators with their progeny, allowing more newborns to survive than would if offspring emerged over a longer period.

Predator avoidance most certainly played a role in shaping the evolution of biological clocks. About 20 years ago, Pat DeCoursey and colleagues at the University of South Carolina conducted a study to investigate the adaptive function of rhythms in behavior. They lesioned the suprachiasmatic nuclei (SCN; a brain structure that acts as the primary ‘time keeper’ in vertebrates) of wild eastern chipmunks, released them back into the wild, and followed their survival for the next two years. To control for the effects of the surgery itself, they also “sham” lesioned several chipmunks, leaving their SCN intact. After just 3 months, only a single intact chipmunk had become the target of predators, while 40% of the SCN-lesioned animals became lunch. These deaths were attributed to the animals being active when they were not biologically inclined to be (i.e., their ‘clock’ was broken), making them easy prey.

A short-day (left) and long-day (right) adapted Siberian hamster (Credit: Gregory Demas, Indiana University)

In many rodents that live at non-tropical latitudes, the shortening amount of light each day signals the approach of winter months before the really cold weather hits. Siberian hamsters, for instance, have evolved to tell the time of year by measuring day length (photoperiod). With just two pieces of information, (1) day length and (2) whether days are getting longer or shorter, the hamster can tell what time of year it is and whether winter is coming or going (a toy version of this rule is sketched below). When days get shorter, males rapidly reduce their body size by ~20–30%, put on an extra layer of newly white fur, and all but eliminate their reproductive organs… they won’t be doing any mating when the weather hits -50°C. In response to short days, white-footed mice (commonly found in Ohio) actually reduce the size of their brain, putatively to save energy. Every year, species like these must radically reorganize their bodies to adapt to their changing environments, or die. This involves drastically changing that ever-stable ‘set point’ throughout the year.
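A toy version of that two-piece rule might look like the following sketch; the 12-hour threshold is an assumption made up for the example, since real critical photoperiods vary by species and latitude.

```python
def winter_is_coming(day_length_h: float, yesterday_length_h: float,
                     short_day_threshold_h: float = 12.0) -> bool:
    """Toy photoperiod rule: short days that are still getting shorter signal
    approaching winter. The 12-hour threshold is illustrative only."""
    days_are_shortening = day_length_h < yesterday_length_h
    return day_length_h < short_day_threshold_h and days_are_shortening

print(winter_is_coming(11.0, 11.1))  # True: short day, and days still shrinking
print(winter_is_coming(11.0, 10.9))  # False: days lengthening, winter is ending
```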

I hope some of the examples I’ve described above provide context for thinking about biological rhythms. I also hope that discussing these rhythms alongside homeostasis will help bring them into early lessons on the natural world. With the increased use of artificial lighting, shift work, and trans-meridian travel, our biological rhythms are being tested in contexts for which we did not evolve. Disruption of these rhythms is only now being appreciated as a contributing factor in many diseases, including metabolic syndrome, depression, and cancer.

Nothing is more interesting than discovering the remarkable strategies animals have evolved to survive in their (sometimes) extreme environments. By adding the additional ‘wrinkle’ of rhythmicity in physiology and behavior, animals can exploit environments at one time of day or year that would be dangerous or even lethal at other times! I am excited for the future of the still new field of ‘chronobiology’ (aka the study of biological timekeeping), and can’t wait to see what nature has in store for us next!

Many of the examples I describe above are discussed in more detail in the excellent “Rhythms of Life: The Biological Clocks that Control the Daily Lives of Every Living Thing” by Russell G. Foster and Leon Kreitzman.

Jeremy Borniger is currently a doctoral student in the Neuroscience Graduate Studies Program at The Ohio State University. He received his BA in anthropology with a minor in medical science from Indiana University and has worked with chimpanzees, orangutans, and gorillas. In his spare time, Jeremy enjoys playing the piano, scuba diving, cooking, and writing and reading as much about science as he can. You can follow Jeremy on Twitter: @JBorniger


Beakers, Ballplayers, and Failures

One of the most valuable experiences I have had in undergraduate research has been failure. My participation in research over the past three years has taught me how to fail gracefully, how to handle the emotional and psychological impact of failure, and even how to predict and minimize failure. I am fortunate to have had all of these lessons and experiences before successful experiments become a prerequisite for tasks like completing a PhD thesis or submitting a competitive grant application. I have learned at an early stage in my career that failure is a large part of science. If an individual is unwilling or unable to accept failure as a part of his or her career, it is unlikely that he or she will be well suited for a career in research.

I now recognize that when I entered my freshman year, I had a very idealized view of how scientific progress is made. I had imagined teams of researchers working at a furious pace toward curing diseases, innovating new electronics, and better understanding the universe in which we live. After all, the pace of scientific innovation over the past one hundred years can be described as nothing short of incredible. To an outsider, who reads news of treatments and technologies that could hardly have been imagined 15 years ago, science appears to progress with relatively few obstacles and rate-limiting steps. I never could have imagined just how naive I was.

In my high school biology class, I first learned of the polymerase chain reaction (PCR), one of the most beautiful, elegant, and ingenious techniques in modern biological research. This technique makes use of the cellular DNA replication machinery to amplify short segments of DNA to incredibly high quantities. PCR underlies much of the biological research currently being conducted and will only grow in importance as the field moves toward more sequence-driven research. Accordingly, when I tried to complete my first PCR, I was stunned to find that it failed to produce any level of DNA amplification.
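To get a feel for why PCR is so powerful, here is a back-of-envelope sketch of the idealized arithmetic; real reactions lose efficiency per cycle and eventually plateau, so treat the numbers as an upper bound rather than a measured yield.

```python
def pcr_copies(initial_copies: int, cycles: int, efficiency: float = 1.0) -> float:
    """Idealized PCR yield: each cycle multiplies the template by (1 + efficiency).
    An efficiency of 1.0 means perfect doubling; real reactions fall short and
    plateau at high copy numbers."""
    return initial_copies * (1 + efficiency) ** cycles

print(f"{pcr_copies(1, 30):.2e}")       # ~1.07e+09 copies from a single template
print(f"{pcr_copies(1, 30, 0.7):.2e}")  # ~8.2e+06 copies at 70% per-cycle efficiency
```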


An overview of the PCR process. (Courtesy http://www.paulvanouse.com/dwpcr.html)

In the years since, I have completed this reaction numerous times, often with great success, but occasionally with the unyielding sting of failure. I have failed for many different reasons, some my fault and some beyond my control. I have failed to add reagents, I have added reagents in incorrect volumes, and I have worked with expired or “cooked” enzymes. Over the past three years, I have come up with checks and systems that work to minimize failure in both PCR and other laboratory procedures that I must complete. I have learned to accept these setbacks and failures while simultaneously working diligently to minimize them.

In many ways, my experience in research thus far has been reminiscent of my experience playing baseball when I was younger. Take, for example, Ted Williams, the Boston Red Sox hall-of-famer considered by many to be one of the greatest offensive players to ever grace the game. Despite that reputation and the publication of one of the seminal works on hitting (The Science of Hitting), Ted Williams was only a .344 lifetime hitter. For every ten at-bats, he was expected to reach base on a hit only about 3.4 times. In other words, he was expected to fail more than 6 times out of 10, and he is still considered one of the greatest players of all time.

Ted Williams’ book, The Science of Hitting

In baseball, as in science, it is not the number, but rather the nature of the failures that ultimately determines the legacy of an individual. Small failures, like botched PCRs or failed crosses, can often be overlooked as long as they do not interfere substantially with the larger project. However, the best scientists and ballplayers alike avoid the large, critical failures. The best scientists often have the foresight to avoid dead ends and blind alleys. Likewise, the best ballplayers are those that can manage a hit in critical situations, even though the odds of reaching base are forever against them. My dad once told me that the best ballplayers are those with the shortest memories—those that work hard to perfect their swing and the variables within their control, yet do not mentally hold on to their failures. I believe that the same is true of scientists.

Tyler Shimko is an undergraduate studying and conducting research in biology at the University of Utah. You can follow him on Twitter @TylerShimko


Science Funding and Politics – Learning How to Play their Game

In this guest post, John Vernon, an undergraduate in the College of Science at Notre Dame, reflects on the lessons he learned about science and policy after a summer in Washington DC.

United States Capitol - west front

By Architect of the Capitol (aoc.gov) [Public domain], via Wikimedia Commons

This past summer I had the opportunity to work for a science policy consulting firm and get a glimpse into the real world outside of academia. I discovered that a strangely symbiotic partnership exists between scientists and those who craft and pass legislation on Capitol Hill. Many scientists and researchers try to stay out of the political game, but that can make it more difficult, and often frustrating, to get public health projects accomplished. Healthcare policy fuels funding, and navigating the Washington, DC political landscape in order to advocate for important scientific research is definitely not for the timid novice. There is no scarcity of interest groups on the scene, including professional consultants who focus on representing the public and global health sectors. But how do they do it, and can you learn to do it for your projects too?

The name of the game is money, and that means proving that your research interests are worth investing in. This affects university research departments that have a critical need for government funding in order to carry on their work.  However, convincing the powers that be about the value of research science is easier said than done.

On the Executive side, the White House has its own Office of Science and Technology Policy (OSTP) to help the President address scientific research questions and make judgments about new policies and programs. OSTP also forms relationships with the private sector to evaluate potential investments in industry, academia, and other sources. Researchers who are able to match their areas of interest with those identified as high priority by the government will clear the first hurdle in the funding process. However, multiple agencies also weigh in on what they think is worthwhile to pursue, and they all have the opportunity to influence what will actually happen once the “budgeting game” begins. Washington insiders are clearly familiar with this and have a sophisticated level of expertise that scientists frequently lack or loathe. This poses a challenge each spring when all the different government agencies submit their own budget proposals to the Office of Management and Budget (OMB). After lots of “horse trading,” the final recommendations are given to the President, who prepares the final budget for Congress to vote on in February of each year. Congress controls the purse strings dictating how the money should be spent, and the allocation of funds is often based on political considerations as much as, if not more than, proven need.

When there is a dearth of financial resources to go around, the competition within the scientific community heats up. This rose to new levels when drastic budget cuts were enacted through sequestration. For many scientists, sequestration is synonymous with the “Day the World Stood Still.” Prospects for continuing grants, fellowships, and ongoing research projects were drastically compromised. However, this government process of withholding funds actually began in 2011, and it has been a hotly debated topic because of its significant effect on the scientific and research community. For the past 2+ years since Congress passed the Budget Control Act, there have been caps on discretionary funds for public health, environmental protection, and law enforcement. From my perspective as a scientist, it is important to consider the potential consequences that accompany this mandate to reduce funding by over $1 trillion over the next 10 years. What price will society pay for this austerity?

According to a report from the American Public Health Association (APHA), the Department of Health and Human Services faced $3.7 billion in cuts on March 1, 2013. The Centers for Disease Control and Prevention specifically saw $35 million in funding cuts, and experts warn that this will reduce our ability to respond to health threats. In addition, the funding reduction for the National Institutes of Health was $1.6 billion. This lack of financial support in multiple areas of scientific research will undoubtedly hamper progress and impact the public health of millions.

A recent Huffington Post article, based on more than two dozen interviews with scientists and academic officials, summarized specific examples of the devastating impact that sequestration is having on research projects. The NIH’s $29.1 billion budget for the current year seems quite large, but it has dropped from a high of $30.8 billion before sequestration. There is every indication that circumstances will remain less than ideal and that competition for funding will only intensify. All the more reason to be able to make the case to the powers that be for why your research makes scientific and practical sense and serves the public interest.

Thomas Frieden official CDC portrait

Thomas Frieden. By the Centers for Disease Control and Prevention [Public domain], via Wikimedia Commons

Some scientists have recognized how important it is to become pro-science activists and engage in the legislative process in order to advance their cause in this ever-changing, politically driven climate. On April 19, 2013, Dr. Thomas R. Frieden, the Director of the Centers for Disease Control and Prevention, testified before the House Subcommittee on Africa, Global Health, Human Rights, and International Organizations. He got the attention of policymakers by discussing rising concern over the overuse of antibiotics, while also highlighting the noticeable void of research in this area. Dr. Frieden stressed the need for research and preventive treatment, citing the fact that diseases like MRSA and H7N9 influenza are becoming major health concerns.

By introducing a plan developed by the Centers for Disease Control (CDC) to protect against health threats, and reinforce containment and border protection, Dr. Frieden moved the conversation into a science policy discussion. His presentation was a plea for additional funds to combat growing antimicrobial resistance, and emerging threats including salmonella and multi-drug resistant tuberculosis. Congressional Hearings in May 2013 also identified other domestic and global health concerns that are preventable as long as there continues to be adequate research funding and support. This is the collaborative relationship that will promote the best science and result in the most effective public health policy.

There are several organizations worth knowing about if you want help getting support for science initiatives. The Coalition for Life Sciences (CLS) is a coalition of six nonprofit organizations focused on supporting public policy and advancing life science research and its many applications. One particularly well-known and impressive group with a strong presence in Washington, D.C. is the American Association for the Advancement of Science (AAAS). This organization recognizes the importance of communicating science and technology, and believes that the current needs for credible and objective information are not adequately fulfilled. As a solution, in 1973 it developed a special Fellowship Program with just seven Fellows; it now funds over 250 annually. Fellows include recent PhD graduates and accomplished scientists with many years in education and industry. Their mission is to link policy and science while creating an association of knowledgeable leaders who understand both fields and can provide solutions to the policymakers who make government decisions. The main program areas they cover are security and development, energy and the environment, health and education, big data, and global health. I have personally spoken with AAAS fellows about how they view their role in Congress, and the consensus is that the political process is very slow but thorough, and policy decisions can be influenced with sound scientific input.

Now more than ever, it is imperative that the scientific community learn to work collaboratively with policymakers on establishing a platform for funding significant research initiatives. The first strategy for achieving this is to build credibility and develop relationships with policymakers, rather than simply trying to disseminate information when they may not be receptive. As research scientists, we need to remember to speak in a language that is understandable to non-scientists and that includes practical applications. The “KISS (Keep It Simple, Scientists) Formula” is a good model to follow so that research is presented in a format that is user-friendly for policymakers, regardless of their background, training, or experience. They are charged with a very important job, and they will be better able to do it effectively if they have the support of scientific data and professional expertise. Another strategy is to make sure your information tells a story and includes some reference to a problem or concern to which many people can relate. Although certain areas of public and global health may not get a lot of media attention, lives are at stake, and the potential contributions from scientific developments are extremely important both domestically and throughout the world.

As members of a new breed of young scientists, we need to encourage and empower one another. We must be ready to rise to the challenge and become involved in the debate in order to advocate for the initiatives we believe need to be pursued. Science must grow beyond the lab, and I believe this will only happen if we are part of the solution. We must be willing to get out of our comfort zones and embrace the strategies that work well in the public policy arena in order to get the really important scientific messages out. It is our time to step up and make a real difference in the world of public and global health.

John Vernon is a senior Science Pre-Professional and Psychology double major at the University of Notre Dame. He has been involved in multiple research labs related to the study of autism as well as concussions. Next year, John will be pursuing a Master of Science at Notre Dame in Scientific Entrepreneurship and Innovation.

Sources:

1. Olsen KL, Gilbert L. Science Policy Ethics: Guiding Science Through Regulation of Research and Funding [PowerPoint]. Notre Dame, IN: College of Science, Science and Technology Policy Seminar; 2013.

2. Sequestration ushers in a dark age for science in America. Huffington Post website. http://www.huffingtonpost.com/2013/08/14/sequestration-cuts_n_3749432.html?utm_hp_ref=tw. Accessed August 20, 2013.

3. Sequestration’s impact on public health funding. American Public Health Association website. http://www.apha.org/advocacy/activities/resources.htm. Accessed August 20, 2013.

4. Frieden TR. Meeting the Challenge of Drug-Resistant Diseases in Developing Countries. Washington, DC: The U.S. House of Representatives Committee on Foreign Affairs Subcommittee on Africa, Global Health, Human Rights, & International Organizations; 2013.

5. Science and Technology Policy Fellowships. AAAS Fellowships website. http://www.aaas.org/program/science-technology-policy-fellowships. Accessed August 20, 2013.

6. Bogenschneider KP, Little OM. Advancing Evidence-Based Policy: Getting Your Research Across to Policymakers [PowerPoint]. Honolulu, HI: American Psychological Association Conference; 2013.


Let’s talk about science

Quick quiz: Does the earth go around the sun, or does the sun go around the earth?

If you answered that the earth goes around the sun, congratulations! You scored better than 26% of respondents in the NSF’s 2014 Science & Technology: Public Attitudes And Understanding survey.

Yes, let that sink in for a moment. One in four Americans answered that question incorrectly.

Aristarchus of Samos (310–230 BC) was a Greek astronomer who developed the first known model of the universe with the Earth orbiting the sun. It is now 2014 AD and 26% of Americans still don’t know what Aristarchus knew.

Some might be quick to blame our education system, but that may not be the whole story. The NSF’s data suggest that Americans aren’t just short on facts; rather, many don’t accept science as fact and instead treat it as a matter of belief. When presented with the statement, “Human beings, as we know them today, developed from earlier species of animals,” 48% of respondents answered “true,” but when the same statement was prefaced with “According to the theory of evolution,” the share answering “true” increased to 72% (a 24-percentage-point difference). Researchers found similar results when comparing responses to the statement “The universe began with a big explosion” (39%), versus when the same statement was prefaced by “According to astronomers” (60%; let’s not even get into the fact that 40% of Americans apparently don’t know about the Big Bang).

Perhaps most perplexing is that many of these statistics have not improved over time. For the past 35 years, the NSF has asked Americans whether astrology is scientific, and the percentage of respondents who believe astrology is based in science is virtually unchanged since 1979! Thirty-five years ago, 50% of respondents correctly answered that astrology was not at all scientific; in 2012, 55% of respondents answered correctly. In some groups, it seems that misinformation identifying astrology as a science is taking a firmer hold; between 2010 and 2012, the share of 35- to 44-year-olds who believe astrology is scientific increased by 13 percentage points.

Many media outlets have used these stats to argue that we’re “doomed”, but don’t believe the hype. For instance, it’s been reported that while many Americans (45%) answered that astrology is scientific, almost none of the Chinese respondents were fooled (8%). This statistic is only partially accurate; Chinese respondents were asked whether horoscopes, not astrology as a whole, were scientific. And, most interestingly, that’s the only question Chinese respondents scored higher on than Americans. In fact, Americans had a higher rate of accuracy on almost every other question compared with the other surveyed countries (China, India, Japan, Malaysia, Russia, South Korea, and the EU), except the one about evolution. Keep in mind that, as I mentioned above, Americans’ rate of accuracy improves when that statement is preceded by “according to the theory of evolution”, and that rate is similar to other countries’ responses. Cherry-picking facts to support sensationalist headlines does nothing to inform the public, but does reinforce stereotypes (Chinese people are good at science and math!). More rigorously researched science reporting can avoid misrepresentation of facts, and can inform people about recent findings.

Here’s some good news from the report, which, mysteriously, not many media outlets have reported: the American public is receptive to science. Four out of five Americans reported that they are interested in new scientific discoveries, and similar numbers said they agreed or strongly agreed with the statement, “Even if it brings no immediate benefits, scientific research that advances the frontiers of knowledge is necessary and should be supported by the federal government.” More than half (58%) of respondents said they’d been to a zoo, aquarium, natural history museum, or science and tech museum in the last year. Unfortunately, science reporting accounts for only 2% of traditional media (TV, newspapers, radio), but more and more Americans are turning to the Internet for their news.

This is where scientists come in. Though we are encouraged to write papers and give talks geared towards other researchers, we also need to reach out to students and non-scientists. Find ways to communicate your research to as many people as you can. Not every person has had the privilege of receiving a solid math & science education, but many Americans do consume media. Many Americans watch TV, read articles online, or browse blogs, Facebook, or Twitter (check out this interview James Coyne conducted with Gozde Ozakinci in a Mind the Brain blog post on why scientists should use Twitter as a tool, and how). Consider posting science news and facts on your social media accounts, or starting a blog or website about your research or your field. Elise Andrew’s Facebook page “I F-cking Love Science” has 10 million followers; this is a testament to how receptive people can be to science when it’s presented in a fun and easy-to-understand way.

Kirsten Sanford, “Dr. Kiki“, interviews Stanford researcher Benjamin Tee about his research on flexible, pressure-sensitive electronic skin at the Berkeley Science Review‘s fall outreach event, Touch Me.

Get active in your local communities, formally or informally. Think about how you’d describe your research to a new acquaintance at a dinner party, or to someone you meet in an elevator. Here in the Bay Area, we’re fortunate to have many established events where scientists can share their science knowledge with the public – Nerd Nite, TEDx, and The Bay Area Science Festival, among others. Check to see if there are groups in your area. If there aren’t, consider starting one.

Spending an hour a week—or even an hour a month—communicating your science to others could teach someone something new. Maybe you can reach one of the 26% of Americans who don’t know we are orbiting the sun.

———-

Test your science knowledge! Here are other science questions the NSF asked respondents.

(Results by nation can be found in this report, on pg. 23.)

  1. The center of the Earth is very hot.
  2. The continents have been moving their location for millions of years and will continue to move.
  3. Does the Earth go around the Sun, or does the Sun go around the Earth?
  4. All radioactivity is man-made.
  5. Electrons are smaller than atoms.
  6. Lasers work by focusing sound waves.
  7. The universe began with a huge explosion.
  8. It is the father’s gene that decides whether the baby is a boy or a girl.
  9. Antibiotics kill viruses as well as bacteria.
  10. Human beings, as we know them today, developed from earlier species of animals.

——

Answers:

1. True 2. True 3. Earth around Sun 4. False 5. True 6. False 7. True 8. True 9. False 10. True

——

This article is being cross-listed on The Berkeley Science Review. Check out some other really interesting pieces there!

——


Jane Hu is a Ph.D. candidate in the psychology department at University of California, Berkeley. Her research focuses on social cognition and learning in preschoolers. She is also an editor of the Berkeley Science Review and an organizer of the Beyond Academia conference. Follow her on Twitter @jane_c_hu, and check out her science blog: metacogs.tumblr.com

 

 

 

 


Nature’s Batman Utility Belt: The Genome

A trilogy and controversial main-character bait-and-switch later, the Dark Knight has resurfaced as a new age hero, bent on brooding and fighting crime from the rooftops of Gotham City. However, one of the greatest mysteries in the story of this powerful vigilante is how his utility belt, which usually stands out bright yellow against his otherwise matte black body armor, always seems to have the right tools on hand for saving the world from someone that day.

"Coolest Belt Ever Made" (credit: House of El. Retrieved February 25, 2014)


Since I am not Batman, I cannot answer that question, but it may be useful to think of another everyday phenomenon as an analogy for B-man’s belt: genomics. Throughout my time participating in research, I have been fortunate to have mentors who always taught me techniques with an eye permanently fixed on the scientific future. Their guidance helped me see what amounts to nature’s utility belt: a set of codes (genomic DNA), complete with a printer (RNA polymerase), that allows any cell to print out the exact instructions it needs (messenger RNA) for the tools (proteins) exactly when they are needed. In that sense, nature and Batman are not all that different.


Alignment of DNA sequences across species. Each color represents a successful alignment of DNA elements, and each block of the same pattern represents successful alignment of a section of DNA, which may signify a conserved protein. (credit: http://www.sequence-alignment.com/ Retrieved February 25, 2014)

However, genomics and gene expression do have some quirks of their own. One recurring theme in nature is the reappearance of the same gene during development in completely different organs! One of the most notable examples is the prevalence of messenger RNA sequences coding for a class of proteins known as claudins. Claudins act like the nails in a wall, sticking the membranes of adjacent cells together to form a water-tight junction between them.

During development, we can locate where these specific mRNA sequences are expressed using a process known as whole mount in situ hybridization, abbreviated WISH. Whole zebrafish embryos are exposed to RNA sequences complementary to the normal sequence found in the embryos; within a cell, the two sequences stick together in a process known as hybridization, in which two separate strands become one joint duplex. A protein carrying a dye can then attach to these pieces of hybridized RNA and, upon exposure to a particular mixture of chemicals, deposit the dye in the cell. Using this process, we can see that in early stages of zebrafish development, the sequences coding for members of the claudin family are expressed in many areas of the developing brain and, surprisingly, in the developing kidney as well!

Who would have thought that the organ we use to produce thoughts, feelings, and memories would share molecular-scale structural similarities with an organ used for less glorious purposes? Our genome is filled with such examples of multipurpose tools expressed in organs that perform completely different functions. From junction proteins used to hold organs together to sodium channels (also coincidentally conserved between the brain and the kidneys), these structures, derived from the codes which unite all of us, are a powerful example of how intricately connected the structures of our bodies are.

However, this is where we must diverge from Batman. His utility belt may be all-purpose for him, but Mother Nature takes the concept a step further. This code, this natural utility belt, is replicated with striking similarity across all living organisms! They all work on the same premise: from DNA to RNA to protein; manual to instruction to tool (a toy sketch of this flow appears below). Every student with even a fleeting interest in modern biology knows this fact, however, so I will attempt to illustrate its impact in a more concrete way.
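As a deliberately oversimplified illustration of that manual-to-instruction-to-tool flow, here is a toy Python sketch. The four-codon table and the 12-letter “gene” are invented for the example and bear no resemblance to a real claudin transcript.

```python
# Toy sketch of DNA -> RNA -> protein. The table holds only the codons this
# made-up gene uses, not the full genetic code.
CODON_TABLE = {"AUG": "Met", "GGC": "Gly", "UUU": "Phe", "UAA": "STOP"}

def transcribe(coding_strand: str) -> str:
    """Transcription reduced to its essence: the mRNA reads like the coding
    strand of DNA with T swapped for U (the template strand is skipped here)."""
    return coding_strand.replace("T", "U")

def translate(mrna: str) -> list:
    """Translation: read the message three letters at a time, look up each
    codon, and stop at the stop codon."""
    protein = []
    for i in range(0, len(mrna) - 2, 3):
        residue = CODON_TABLE[mrna[i:i + 3]]
        if residue == "STOP":
            break
        protein.append(residue)
    return protein

gene = "ATGGGCTTTTAA"        # a made-up 4-codon "gene"
message = transcribe(gene)   # "AUGGGCUUUUAA"
print(translate(message))    # ['Met', 'Gly', 'Phe']
```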

Imagine, if you will, a world in which every living system had a different form of molecular tool creation. In such a world, we would likely have few of the miracle cures of today, such as insulin for those afflicted with diabetes. The great strides of progress made through the use of animal models would never have happened. Almost none of modern biology’s progress could have been justified by medical relevance if this toolkit were not conserved between living species.

The remarkable functionality of the cells that constitute all living beings, and the degree to which they share commonalities within and among organisms, is an amazing fact that breathed life into every field of the biological sciences today. The ability to synthesize only the components necessary for survival from second to second makes our cells extremely energy-efficient, and it makes our genetic toolkit even more versatile than any gadget dreamed up by the brains it built in the first place. On the off chance that you may be feeling a little less than super, it may be comforting to remember that at any given moment, every living cell in your body is performing feats beyond the capabilities of the richest superhero ever created.


Why Every Science Student Should Attend a Conference


Illustration by Yoo Jung Kim

This is a guest post by Dartmouth senior Yoo Jung Kim. 

Last week, the American Association for the Advancement of Science (AAAS)–the world’s largest general scientific society–held its annual meeting in Chicago. Among globally recognized science conferences, the AAAS annual meeting is unique in that it represents a broad spectrum of the sciences: the conference hosts over 150 scientific symposia with topics ranging from science to technology to education to policy. AAAS also offers a wide range of opportunities to undergraduate students.

“At the Annual Meeting Student Poster Competition, undergraduate and graduate students present their research to other attendees, and scientists from many different fields provide feedback and advice,” said Tiffany Lohwater, the Director of Meetings and Public Engagement at the AAAS.

On top of the deeply discounted trainee registration rate, student attendees can take advantage of career development workshops and networking opportunities. The Student Poster Competition alone features around 160 to 180 students presenting their research.

According to Michelle Oberoi, a junior at the University of California, Irvine, who won best poster in the “Brain and Behavior” category last year, “Competing in the 2013 AAAS poster competition against both graduate and undergraduate students as an undergraduate sophomore was intimidating and challenging, but overall rewarding. I was asked interesting questions regarding my research, which have ultimately benefited my current findings in the lab.”

AAAS is by no means unique in providing educational opportunities to college students. Many other scientific societies and associations–particularly those in the basic sciences–feature a number of programs to get students involved in their disciplines. For instance, the combined undergraduate/graduate poster session during the American Society for Cell Biology Annual Meeting drew in nearly 200 student presenters during its December 2013 meeting in New Orleans, many of whom were also registered to present in the general poster session. Over 1,200 students are expected to present at the special undergraduate poster session at the upcoming American Chemical Society National Meeting & Expo in Dallas, Texas, and thousands more will participate in the 2.5 days of programming designed specifically for college students. Given these student-oriented learning opportunities, every student researcher should consider attending a professional science symposium during their undergraduate career.

Despite the popular media depiction of scientists as antisocial individuals, academia is an inherently collaborative profession. At the undergraduate level, it may be hard to see the exchanges across laboratories, but the accretion of knowledge requires the communication of complex ideas from one scientist to another. Although the advent of e-mail has largely supplanted the necessity of face-to-face conversations, this cross-fertilization of ideas still takes place at academic conferences, where researchers present their projects, discuss their findings, network with potential collaborators, and socialize with their peers. Other sorts of interactions occur during meetings as well: graduate students look for post-doc positions, post-docs look for post-post-doc opportunities, principal investigators look for ideas to take back to their labs, science reporters look for interesting scoops, and vendors look for attendees to whom they can hawk their services and convention tchotchkes. Seeing these exchanges first-hand helps students get a feel for a component of scientific research that is not readily observable in the laboratory.

By presenting at a conference, students can gain soft skills that will be valuable at every level of their academic careers. Students participating in a poster presentation must prepare a visual representation of their work and present a summary of their findings clearly and concisely to other attendees. The poster-making process requires students to organize their data and to delve into science writing at a deeper level than class lab reports allow. Many undergraduate poster symposia pair presenters with scientist judges who have some degree of expertise on the topic at hand, which requires the student to be well versed in the paradigms and methods used in his or her field. The entire process of preparing and presenting a poster requires a significant amount of sustained effort and helps student researchers internalize their research and build skills that will come in handy in the future.

Students also have the opportunity to explore the leading edge of the discipline, and making connections in the scientific community can be of huge benefit for an undergraduate interested in embarking on a scientific career. Walking around the general poster session and attending oral presentations will allow students to get acquainted with the most important recent discoveries circulating in the field. This can help students to identify potential projects, laboratories, and institutions that they would like to work with in the future.

“The AAAS conference reminded me that I’m not alone and that there were lots of other people who were interested in science. I’ve kept pursuing research opportunities and I’ll be participating in the Princeton REU [Research Experiences for Undergraduates] in the summer,” said Christopher Luna, an Arizona State University junior and winner of the “Math/Technology/Engineering” category in 2013.

Any undergraduate who is interested in academia should identify a conference or a meeting of interest–especially those that offer programming for college students–by asking their research advisers. Science is a cooperative endeavor, and conferences allow undergraduate researchers a chance to explore their field outside of their laboratories and to gain skills early in their scientific training.


Yoo Jung (Y.J.) Kim is a senior at Dartmouth College in Hanover, NH. She is a former editor-in-chief of the Dartmouth Undergraduate Journal of Science and is currently working on a book to be published by the University of Chicago Press in Fall 2015. You can find Y.J. on her website at http://www.yoojkim.com/


Building Button 2.0: The Open Access Button

David Carroll, co-lead of the Open Access Button, gives an update on the Open Access Button, with some news and plans for the future.

The Open Access Button is a bookmarklet that lets users track when they are denied access to research, then search for alternative access to the article. Each time a user encounters a paywall, they simply click the button in their bookmark bar, fill out an optional dialogue box, and their experience is added to a map alongside those of other users. Then, the user receives a link to search for free access to the article. The Open Access Button hopes to create a worldwide map showing the impact of denied access to research and to push for a more open scholarly publishing system. The second version of the Button is currently in development, but we need you to make it happen – read on to find out why.

We launched the Open Access Button Beta after 7 months of hard work at the Berlin 11 Student and Early Stage Researcher Conference in November 2013, with coverage in the Guardian, Scientific American and beyond.


Over 5000 paywalls have been marked since launch.

To date, the response has been overwhelming. Over 5000 paywalls have been mapped since launch (just the tip of the iceberg), and we’ve had people from all over the world sending us messages to tell us the Open Access Button has helped them get access to the research they need. What’s currently at openaccessbutton.org is just a taste of what we are now building; since November, the team has been hard at work on Button 2.0.

The next version of the Button will be more powerful, better, and more useful than the Beta. In order to build and launch Open Access Button 2.0, we’re seeking input from a variety of stakeholders to shape the project’s future. We want to be community centered, and this survey is the first of many opportunities for the community to shape the Button they want to see in the future. The survey will form a key part of our consultation work, and we encourage anyone interested in the Open Access Button to complete it. You can find the survey here.

The Open Access Button Team with Nick and Nicole from SPARC

As we build the bigger, better Button 2.0, we’re going to need a bigger team to make it all happen. Do you love Open Access and want to be part of the creation of Button 2.0? Then it’s your lucky day: we’re recruiting. You can find out more details, including the application form, here. The new team members will expand the already global team, reflecting the very nature of the Internet that we’ve grown up with. The people that make up the team embody the system we’re working towards: a world in which every single person on the planet has free access to the sum of all human knowledge.

If you’re not a student but you love Open Access, there will be opportunities coming up soon, promise! You can follow the latest updates on @OA_Button.

We are best when standing on the shoulders of giants, and we’re better when those shoulders are openly available to read and re-use. The Open Access Button hopes to make the problems of paywalls impossible to ignore; get yours at openaccessbutton.org, and get involved in building Button 2.0 here.


David Carroll is a medical student at the Queen’s University Belfast. He is co-founder and co-lead of the Open Access Button. You can find him on Twitter @davidecarroll and the Open Access Button @OA_Button

 

 


All the Small Things

Humans are always trying to breach the boundaries of the unimaginable. In recent years, thinking big has meant thinking significantly small: 10⁻⁹ meters small. Nanotechnology has fascinated scientists from diverse backgrounds with its possible applications. From medicine to everyday electronics, our capabilities have enabled us to build atom by atom and thus make enormous strides in science. Yet this trend should come as no surprise. In 1959, Richard P. Feynman gave what could be seen as the catalyst of the nanotechnology age. In “There’s Plenty of Room at the Bottom,” he describes a bottom-up way of thinking and building. Feynman expressed his surprise at the lack of nanoscale work while simultaneously providing a waterfall of possible methods and applications.

Now, fast-forwarding to the present day, a significant amount of work has been done in nanotechnology. When we play with atoms themselves, we must account for weird outcomes. Probably the most popular example is the carbon nanotube. Imagine a monolayer sheet of carbon rolled into a cylinder. Depending on the parameters of the tube, the strength-to-density ratio can be quite impressive. This strength is due to the covalent sp² bonds between each carbon atom. Moreover, the electrical and thermal properties of these tubes also make them an exciting material. Recently, a research team at Rice University led by professors Junichiro Kono and Matteo Pasquali created wet-spun nanowires out of trillions of carbon nanotubes. These wires have proven superior to copper wires in current-carrying capacity, stiffness, and convenience, as they are extremely light and thinner than a strand of human hair. Overall, these fibers can carry four times more current than copper wiring of the same mass.


Scanning electron microscope images of carbon nanowires produced by the Rice group in different gases. (Credit: Kono Lab/Rice University)
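“Four times more current for the same mass” is a comparison per unit weight rather than per unit cross-section, which is easy to misread. The minimal sketch below shows how that normalization works; the current-density numbers are placeholders chosen purely for illustration (only the approximate densities, roughly 8.9 g/cm³ for copper and on the order of 1.3 g/cm³ for a spun nanotube fiber, are meant to be realistic), so treat it as an explanation of the arithmetic, not as measured data.

```python
# Illustrative only: compare current-carrying capacity per unit mass.
# The current-density figures below are placeholders, NOT measured values.

def specific_ampacity(current_density_a_per_cm2: float, density_g_per_cm3: float) -> float:
    """Current per unit mass for wires of equal length.

    For a wire of length L and cross-section A:
        current = J * A
        mass    = rho * A * L
    so current / mass = J / (rho * L); with L fixed, comparing J / rho is enough.
    """
    return current_density_a_per_cm2 / density_g_per_cm3

copper = specific_ampacity(current_density_a_per_cm2=1.0e5, density_g_per_cm3=8.9)  # placeholder J
fiber = specific_ampacity(current_density_a_per_cm2=6.0e4, density_g_per_cm3=1.3)   # placeholder J

print(f"nanotube fiber vs. copper, per unit mass: {fiber / copper:.1f}x")  # ~4x with these numbers
```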

Scientists have also turned towards nanoribbons composed of graphene, a 2D crystal lattice one atom thick. This material can be considered the “star” of the nanotechnology world, with many scientists working on maximizing its capabilities. Recently, a study led by Georgia Institute of Technology professor Walt de Heer presented epitaxial graphene nanoribbons grown on silicon carbide that conduct electricity at room temperature ten times better than theory predicted. Moreover, compared to exfoliated graphene, the ribbons show a roughly 1,000-fold increase in conduction length, reaching an impressive distance of more than 10 micrometers. Interestingly, the flow of electrons along the edges of these ribbons resembles that of photons through an optical fiber. Sleek, when you think of the scattered electron paths inside a standard conductor.


Artist’s representation of electrons (blue) traveling unimpeded through the graphene, which was grown on silicon carbide (yellow steps). (Credit: John Hankinson/Georgia Tech)

At Notre Dame we have our own Nanofabrication Facility, complete with those flattering clean room gowns. I am currently developing my own recipe for growing a novel semiconductor material using MoS2 and WS2. Though it may seem like graphene has a monopoly on the hearts of scientists and engineers working on nanoelectronics, its lack of a band gap, and the resulting leakage current, restricts its use as a semiconductor material. Thus, transition metal oxides and sulfides have come onto the scene thanks to the band gaps they possess. This reduces both the amount of energy lost and the size of the electronics the material can be used in. The most successful growth method has been chemical vapor deposition, which requires precursors and a substrate. Generally, the precursors are vaporized so that they combine and deposit on the substrate to form the desired material. One can control the shape and thickness of the crystals by altering the temperature, pressure, amount of precursor, and time (a toy sketch of how such growth parameters might be tracked follows the photo below).


University of Notre Dame Nanofabrication Facility. (Credit: University of Notre Dame)
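For readers curious what “altering the temperature, pressure, amount of precursor, and time” looks like in practice, here is a minimal, hypothetical sketch of how a grower might record CVD runs in order to compare them. The parameter names and values are invented for illustration and are not an actual recipe from the Notre Dame facility.

```python
# Hypothetical bookkeeping for CVD growth runs; values are illustrative only.
from dataclasses import dataclass

@dataclass
class CVDRun:
    material: str            # e.g. "MoS2" or "WS2"
    temperature_c: float     # furnace temperature
    pressure_torr: float     # chamber pressure
    precursor_mg: float      # mass of solid precursor loaded
    growth_time_min: float   # time held at growth temperature
    notes: str = ""

runs = [
    CVDRun("MoS2", temperature_c=750, pressure_torr=760, precursor_mg=15,
           growth_time_min=10, notes="made-up baseline run"),
    CVDRun("MoS2", temperature_c=800, pressure_torr=760, precursor_mg=15,
           growth_time_min=10, notes="hotter run, to compare crystal size and thickness"),
]

for run in runs:
    print(run)
```

Keeping runs in a structured form like this makes it easier to see which single parameter changed between two growths.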

Even the government has begun to recognize the value of wide band gap (WBG) semiconductors. Earlier this year, President Obama announced the Department of Energy’s new manufacturing innovation institute focused on the proliferation of WBG semiconductors. The main motivation for the institute lies in the environmental benefits of reducing electronic heat waste by approximately 75-80%. This is a significant stride for nanoelectronics, which now sits at the intersection of science and government.


Illustration of the energy efficiency benefits and other implications of wide band gap semiconductors. (Credit: U.S. Department of Energy)

Nanotechnology obviously has an outstanding future in electronics, but I should also mention some of its medical promises. Scientists have worked with nanoshell solutions and lasers to replace traditional suturing, as well as with nanowires that can electronically identify the proteins present during the early stages of cancer. Though I could list many medical studies in this one post, and I encourage the interested reader to look through the many papers available online, I believe it is better to highlight a recent development that is both amazing and creepy. A group at the Pennsylvania State University has, for the first time, succeeded in placing gold-ruthenium nanomotors into living HeLa cells. Using ultrasonic waves to propel the motors and magnetic fields to steer them, the team was able to watch as they interacted with the cellular membranes. As uncomfortable as it is to think of several of these tiny bots constantly playing bumper cars with your cells, the ramifications of this breakthrough are actually quite inspiring. Using this nanotechnology we have better chances of targeting and destroying cancer cells, improving patient diagnosis, delivering drugs noninvasively, and even performing cellular surgery.


Scientists at Pennsylvania State University were able to insert nanomotors into living HeLa cells and guide their movements for the first time. (Credit: Mallouk Lab/Penn State)

Scientists and engineers are often asked to predict the future of technology. Where will we be in 50 years? Will science fiction become the norm? Even companies have taken it upon themselves to give the public a taste of their compelling, though somewhat ambitious, future predictions. Samsung has been boasting about its flexible screens for a little over a year now, and yet none of these products has come to market. Ultimately, we have not refined these products well enough. Time is still needed to make these tech dreams come true.

As a junior physics major, still dreaming of a lab of my very own, I am excited about where science and engineering are headed. We now have the capability to manipulate atoms; we can be more creative than ever. The brilliant minds before us combined materials into products that advanced our lifestyles as well as our knowledge of the surrounding Universe. With nanotechnology, the current generation of scientists and engineers can use the inventions of the past to create new and better materials. Combine childlike curiosity and imagination with mature scientific knowledge, and mankind experiences ingenious discoveries.

 

References:

1. Baringhaus, J., & Ruan, M. (2014). Exceptional ballistic transport in epitaxial graphene nanoribbons. Nature. Retrieved from http://www.nature.com/nature/journal/v506/n7488/full/nature12952.html
2. Feynman, R. P. (1959). Plenty of room at the bottom. Retrieved from http://www.pa.msu.edu/~yang/RFeynman_plentySpace.pdf
3. Gibney, E. (2014, February 6). Graphene conducts electricity ten times better than expected. Retrieved from http://www.nature.com/news/graphene-conducts-electricity-ten-times-better-than-expected-1.14676
4. Kahn, J. (2006, June). Nano’s big future. National Geographic. Retrieved from http://ngm.nationalgeographic.com/2006/06/nanotechnology/kahn-text/1
5. U.S. Energy Department. (2014, January 16). Next-generation power electronics: Reducing energy waste and powering the future. Retrieved from http://energy.gov/articles/factsheet-next-generation-power-electronics-manufacturing-innovation-institute
6. Weidner, K. (2014, February 10). Nanomotors are controlled, for the first time, inside living cells. Retrieved from http://news.psu.edu/story/303296/2014/02/10/research/nanomotors-are-controlled-first-time-inside-living-cells
7. Williams, M. (2014, February 13). Rice’s carbon nanotube fibers outperform copper. Retrieved from http://news.rice.edu/2014/02/13/rices-carbon-nanotube-fibers-outperform-copper-2/


Category: The Student Blog | Leave a comment

Genome editing just got a lot easier

This post is cross-posted with Berkeley Scientific Journal

If you’ve recently taken a glimpse at the front page of any major science news outlet, it is likely you are no stranger to an emerging genome editing technology known as CRISPR/Cas9. With the help of RNA, Cas9 (a bacterial enzyme) can be programmed to target specific locations within the human genome, enabling scientists to delete, modify, or insert sequences that may treat, or even cure, patients with genetic diseases. Although the CRISPR/Cas9 field is still in its early stages, major breakthroughs have been made recently, paving the road for a new line of gene therapy.

Just a couple of weeks ago, researchers at Nanjing Medical University and the Yunnan Key Laboratory reported the successful use of Cas9 in monkeys, a step toward the exciting possibility of editing human genomes. They injected Cas9 into monkey embryos and implanted the resulting embryos into female monkeys.


These twins were treated with Cas9 as embryos and, as a result, were born with targeted deletions in their DNA.

Quite remarkably, the newborn monkeys had deletions in the targeted gene of interest. In the world of science, where expected results sometimes never come to fruition, this marks an important stepping-stone for Cas9 technology. Targeting genes with this degree of specificity could eventually lead to therapies that prevent individuals from developing genetic diseases.


Cas9 searches for PAM sequences (gold) and matching sequences before cutting the DNA (picture by KC Roeyer)

In the same week, a paper published in Nature revealed the mechanism by which Cas9 finds target DNA sequences tens of base pairs in size within a genome that contains three billion base pairs. Interestingly, the researchers showed that Cas9 searches for a specific sequence known as the PAM: if the target doesn’t carry this short DNA tag, the sequence is neither recognized nor cut. Samuel Sternberg, lead author on the paper, explains that the presence of PAM sequences “accelerates the rate at which the target can be located, and minimizes the time spent interrogating non-target DNA sites.”
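To make the PAM-first search concrete, here is a minimal toy sketch of the idea in code. It is not the paper’s method or any real tool: it simply scans a DNA string for the “NGG” PAM used by the common Streptococcus pyogenes Cas9 and only then checks whether the 20-letter guide sequence matches the DNA immediately upstream, reporting where a cut would fall. The sequences and function name are made up for illustration, and real Cas9 tolerates some mismatches and works on double-stranded DNA.

```python
# Toy illustration of PAM-first target search (simplified; not a real genome-editing tool).

def find_cut_sites(genome: str, guide: str) -> list[int]:
    """Return top-strand positions where a cut would fall:
    about 3 bp upstream of an NGG PAM preceded by a perfect guide match."""
    sites = []
    for i in range(len(guide), len(genome) - 2):
        pam = genome[i:i + 3]
        if pam[1:] == "GG":                          # step 1: find an NGG PAM
            protospacer = genome[i - len(guide):i]
            if protospacer == guide:                 # step 2: check the 20-nt match next to it
                sites.append(i - 3)                  # cut ~3 bp upstream of the PAM
    return sites

# Made-up example sequences, for illustration only.
genome = "AAAA" + "GATTACAGATTACAGATTAC" + "TGG" + "CCCC"
guide = "GATTACAGATTACAGATTAC"                       # hypothetical 20-nt guide
print(find_cut_sites(genome, guide))                 # -> [21]
```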

One of the most recent discoveries came out last week in Science magazine: two structural biologists at UC Berkeley, Jennifer Doudna and Eva Nogales, published the structures of Cas9 from two different organisms, providing key insights into the mode of DNA recognition and cleavage by the RNA-guided enzyme. Fuguo Jiang, one of the lead authors on the paper, said, “although the two Cas9s are from different organisms, and overall they look very different, when you superimpose the two structures, their functional domains are very similar. This suggests all the Cas9s have a similar mechanism to make cuts in DNA.” Additionally, the paper revealed that an important loop within Cas9 is responsible for recognizing the PAM. As for what this means for the future of CRISPR/Cas9 technology, Jiang elaborated, “The original Cas9 recognizes one particular PAM, but if you can engineer it to recognize a different PAM sequence through mutagenesis, that would be great. The genome sequence is usually fixed. But what you could do is change the PAM sequences that Cas9 recognizes and tailor it to target more genomic loci. In this way, we can expand our Cas9-based genome-editing toolbox.” This discovery, along with the physical mechanism by which Cas9 locates target sequences, may help improve the efficiency of targeted gene editing.

And that’s not all. Editas Medicine, a new company co-founded by some of the leading scientists studying Cas9, aims to translate genome engineering technology into a novel class of human therapeutics. These therapies could make significant medical advances for people with genetic diseases including, but not limited to, Huntington’s disease, cystic fibrosis, and Alzheimer’s. Since modern sequencing technology has produced a massive number of human genome sequences, mapping diseases to particular genomic coordinates is becoming faster and easier. With this valuable sequence information, the CRISPR/Cas9 system can, in principle, be engineered to correct specific disease-causing DNA sequences and restore normal function.


Editas Medicine’s goal is to further develop CRISPR/Cas9 technology into a novel class of human therapeutics. Pictured from left: Jennifer Doudna, Feng Zhang, Keith Joung, and David Liu

Genome engineering earned researchers a Nobel Prize in 2007, but with Cas9 speeding ahead, I wouldn’t be surprised if one is awarded to a Cas9-er in the near future.

 

**While this article was being written, a paper was published in Cell that reveals another structure of Cas9, this time bound to its target DNA. This structure provides more information about the molecular mechanisms by which Cas9 cuts its targets and will further aid researchers in improving genome-editing tools.**

If you want to learn more about how Cas9 functions, check out this video produced by a student in Eric Greene’s lab at Columbia University:


Cas9: The Enzyme, The RNA, & The Virus (video by Myles Marshall)



Prashant is a senior undergraduate student studying biochemistry and molecular biology at the University of California, Berkeley. He is currently Editor-in-Chief of Berkeley Scientific Journal, where he became interested in science journalism and its propensity to motivate general audiences. Read the current issue here. Follow BSJ on Twitter.

Category: Bacteria, Cancer, Genetics, News, PLoS, PLoS Biology, PLoS Blogs, PLoS Genetics, PLoS Medicine, PLoS Medicine Week by Week, PLoS Medicine's Daily Click, ResearchBlogging, Science, science journalism, The Student Blog | 10 Comments