AmSci Reviews

The Rise of Open Access Scientific Publishing

Accessing the absolute latest in scientific communications has long been a financially daunting, practically impossible prospect for the independent amateur or citizen scientist. The top research journals carry high subscription rates (price one of the best out for yourself), and the science professional relies on their employing institution to cover the costs of access through the library budget, save perhaps for a personal subscription to their most treasured journal.

Of course, a great deal of front-line scientific research is funded by governmental agencies, which translates into taxpayer dollars. So, shouldn’t the taxpayer be able to access the fruits of their financial investment? Or should the taxpayer simply expect to benefit later, once companies and institutions implement the results of that research? To avoid opening a messy can of political warfare here, we will instead focus on the fact that a rapidly growing segment of the scientific publishing industry itself now provides open access (“OA”) to the most recent articles from scientific research. This trend has been driven by both the scientific authors and their funding agencies.

A traditional culture about to change

The traditional business model for scientific journal publishing is for professional scientists to submit their draft papers to a group of peers, typically in related fields of expertise, who rigorously review, critique, and provide feedback on the work. Revisions are made until the peers accept the paper, and it is then published in a future volume of the journal. Individuals or, more typically, institutions then pay for access to printed and digital versions of the published papers. Prices are high: it is expensive to pay for the printing, editing, and peer review of such technically challenging and critical information, and a bit of profit has to fit in there as well.

To change this culture and allow for the free dissemination of scientific advances, new models are being developed and tested. Money still has to be made, of course: the expenses remain, even if the published materials are openly available to readers, and even if they are solely digital. One such model requires the author to pay a significant fee to be published (assuming the paper passes muster with the peer reviewers); if the author can’t bite the financial bullet, their grant money or institution can help with the bill. This approach is referred to as “Gold OA.”

An alternate version, which fits into the traditional subscription publishing model, is self-archiving of published work by the individual author. If the journal offers this permission, which funding agencies now often require, the author maintains a digital copy of the work on a personal or institutional resource separate from the publisher. This “Green OA” approach depends on the individual author’s follow-through on archiving and requires a separate cataloging service to provide any sense of organization.

The OA Explosion

The Open Access movement significantly expanded during the first decade of the 21st century. A detailed study of this growth by Laakso et al. [1], published in 2011 (available as open access, of course), found that about 19,500 open access articles in 740 journals were available in 2000; by 2009, nearly 192,000 articles could be accessed for free in 4,769 journals.
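To get a feel for the pace those two data points imply, a quick back-of-the-envelope calculation (ours, not a figure from the study itself) puts the compound annual growth over that nine-year span at roughly 29% per year for articles and 23% per year for journals:

```python
def cagr(start, end, years):
    """Compound annual growth rate between two counts."""
    return (end / start) ** (1 / years) - 1

# Figures from Laakso et al. [1]: 2000 vs. 2009
articles = cagr(19_500, 192_000, 2009 - 2000)  # ~0.29, i.e. ~29% per year
journals = cagr(740, 4_769, 2009 - 2000)       # ~0.23, i.e. ~23% per year
```

Sustained growth near 30% per year is what turns a niche experiment into a tenfold expansion within a decade.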

The development of open access publishing 1993–2009. (ref. 1)

The Public Library of Science (PLoS), which started simply as an advocacy group in 2000 and expanded into an OA publisher in 2003, has arguably been the most successful innovator and proponent of the open access movement in scientific communications. PLoS currently follows the author-pays-to-publish-after-peer-review model and publishes cutting-edge scientific research across seven journals containing tens of thousands of peer-reviewed articles.

Announced quite recently (January 30, 2012), an even more extreme model of open access will launch later in 2012, using post-publication open peer review with the ability to revise published work after its public release. This new approach from the “Faculty of 1000” (F1000) group will use a similar pay-to-submit model. Called F1000 Research, the system will also allow an “open” format for how information is presented, which may include poster presentations, traditional written articles, graphs and charts, and even raw data. Published information will be available immediately after a simple formal check by F1000 advisers that the submission is scientifically relevant, and registered readers will be able to provide comments and ratings.

The other side of OA

On the other side of this seemingly exciting new trend for the seeker of open knowledge, the author-pays-to-publish model can be a major hurdle for independent amateurs who want to publish their work in peer-reviewed journals. The fees are certainly steep, and it is safe to assume that most amateurs do not have the personal funds to move their work through the publishing system. Just as major funding agencies, like the National Science Foundation, require open access availability and help scientific professionals cover publication fees, the independent researcher will need to find outside financing if their work is to intermingle with the professionals’.

Although an independent is certainly not required to publish through a traditional or open access route, it is the peer-review process that vets the work and helps develop it into a final form that provides accurate and valuable information to the rest of the world. The principle of peer review is a critical resource for the scientific community to ensure that rigorous information is disseminated, and the independent should not sit outside this, or at least some form of, quality control. Independents who want to be taken seriously should try to play with the majors, and deal with the financial burdens in innovative ways.

So, the long-term success of Open Access looks rather exciting for the amateur and citizen scientist. It is certainly exciting for Dynamic Patterns Research, which is not a large academic or commercial organization with a massive budget to cover massive journal subscriptions. However, OA is not a straightforward path for the future of the business of scientific publishing. Quality is never really free, and for organizations to provide the important peer review and publication services we need and desire, expecting them to be free might mean significant sacrifices somewhere else.

The debate between the traditional and OA models has been brewing for years, and although growth in OA resources has been substantial recently, there is still a long way to go to discover the secret formula for a sustainable Open Access business model. Nature, the premier (and rather expensive) publishing group, hosted a web debate forum [2] in 2004 on the issue of access to the scientific literature. In particular, Kate Worlock wrote a “pros and cons” review describing many important concerns that OA would bring to publishers, all of which would directly impact the end user.

Most notably, publishing in the digital age is heavily dependent on new technologies, and advances and innovations require great investments to develop and implement. With the popular pay-per-article model at current typical rates ($500 to $2,500), publishers can expect far less revenue to meet operational expenses, invest in the future, and make a profit. And making a profit is not an evil activity: it is profit that makes future growth and beneficial investments possible. With reduced profits, the resulting lack of investment ability could bring scientific publishing to stagnation and irrelevancy in the marketplace, with an obvious and direct negative impact on the scientific community.

As suggested above, although author fees could be covered by funding sources, host institution libraries might also be on the hook to support their resident researchers. The open access payment model only shifts the costs from one line item to another, and as fees grow in the future, budgets may remain just as stressed. In addition, larger research-focused academic institutions with significant output might end up subsidizing the per-article fees of smaller teaching-centric colleges or commercial organizations whose faculty submit fewer papers each year.

Possibly the most critical concern is that many important scientific journals are published by academic societies that host additional activities and benefits for the scientific community. The profits from a society’s publishing division allow for the development of membership programs, conferences, and other benefits. With a loss of publishing income, these societies will have to make up the revenue elsewhere, likely from the pocketbooks of the members themselves.

The Federal Government’s Conflicting Approach

As already mentioned, the United States federal government has mandated that research funded by taxpayer money be published in an openly accessible way. For example, the National Institutes of Health requires all of its funded research to be listed with the government-hosted PubMed Central. However, on December 16, 2011, a bill called the “Research Works Act” was introduced in the House of Representatives; it proposes to prohibit such mandates for network dissemination of published work without the publisher’s prior consent.

Certainly this is a tantalizing development, and one might cynically suspect that the publishers have something to do with it … and one might be right: the Association of American Publishers announced strong support for the legislation. A thorough review of this complex debate was offered a couple of weeks ago by The Chronicle of Higher Education [4], and it is interesting to note that many academic publishers, like MIT Press and Oxford University Press, have already expressed their opposition to the AAP’s position and are clearly trying to distance themselves from the anti-open-access side of the debate.

The publishers’ primary point is that although federal funds certainly initiate some of the research work, it is only through the publishers’ independent professional efforts, providing peer review, analysis, editing, and the final development of the work into a format the community desires, that the research can even be presented for others to access. And this independent presentation, they argue, should not be controlled by the government on a case-by-case basis.

Making OA revered

With all of these issues facing scientific publishers as they deal with the inevitable advancement of Open Access, it will still largely be up to the scientific authors themselves to complete the transition to sustainable OA. In the culture of academic scientific research, recognition is still critical, as the old mantra of “publish or perish” still drives many young post-docs and non-tenured faculty. Listing twenty articles in unknown journals might not provide the same prestige as one killer article in the top journal in the field.

Scientists are still getting a handle on what OA resources are available today, and although many may strongly support making their work freely available, there comes a necessarily selfish point where one’s own career and security are paramount and selecting the right journal for submission becomes vital. Alma Swan [3] reviewed a detailed 2006 survey of researchers describing authors’ state of mind in selecting journals and their concerns and ideas about Open Access. Of particular note, many researchers are still unaware of the OA resources available in their field, likely remaining focused on the publications they “grew up with” during their own education. With the current generation of new scientists, it will be up to the OA publishers to bring their journals to distinction, both in the view of their respective scientific communities and in the eyes of the individual scientists working in the field.

Dynamic Patterns Research has developed a concise reference list for reaching open access scientific knowledge, highlighting the most important resources currently available. This reference will evolve as the availability of resources changes. Not only will DPR be using these resources frequently, but we hope the list will provide an exciting portal for our readers into some of the top scientific advances happening today.

DPR EDU :: Open Access Resources

… …

Update: February 17, 2012
A significant group of professional mathematicians is taking a stand against a major publisher. A key issue is that many professional scientists provide critical editorial services to the publishing houses as volunteers, while the publishers’ profits increase at significant rates.

“Mathematicians Organize Boycott of a Publisher” :: New York Times :: February 13, 2012 :: READ

… …

For more information on the issues in the Open Access revolution, review the PLoS collection of published articles on the topic going back to 2003, including why PLoS became an open access publisher.

[1] Laakso M, Welling P, Bukvova H, Nyman L, Björk B-C, et al. 2011 The Development of Open Access Journal Publishing from 1993 to 2009. PLoS ONE 6(6): e20961. doi:10.1371/journal.pone.0020961

[2] Nature web focus: “Access to the literature: the debate continues.” 2004

[3] Swan, A. (2006) The culture of Open Access: researchers’ views and responses. In Jacobs, N. (ed.), Open Access: Key Strategic, Technical and Economic Aspects. Chandos.

[4] “Who Gets to See Published Research?” Jennifer Howard, January 22, 2012, The Chronicle of Higher Education

NSF Features Citizen Science and NestWatch

The National Science Foundation’s online magazine, Science Nation, features the latest efforts from the Cornell University Lab of Ornithology’s expanding citizen science program. Lab director and ecologist Janis Dickinson discusses how successful the fields of ornithology and astronomy have been in matching professional research activities with hobbyists who thoroughly enjoy what they do while simultaneously helping advance science.

In particular, the Lab has been collecting data on bird nesting events since the 1950s through its Nest Record Card Program. These records are still filed away on little, worn index cards written by amateur observers, but they include valuable nesting data, including basic climate information, for the great-great-great-great-grandparents of birds in the wild today.

With funding assistance from the National Science Foundation and collaborative development by Cornell and the Smithsonian Migratory Bird Center, nesting event data collection has moved into the current century with NestWatch. This online citizen science data collection tool is an efficient way for anyone interested to learn how to monitor the activity of backyard bird nests, report observations, and explore the activity of other citizen scientists all around North America (view an interactive map of the data).

Simple certification is required before submitting observations, but once set up with an account, anyone using NestWatch will have a great opportunity to support research that is a critical component of global environmental monitoring. The program is also perfect for families and schools looking for an at-home project that is fun and can lead to many educational moments spent outside, looking for bird species and behaviors you may never have witnessed before.

“Citizen Science” :: A special report from NSF’s Science Nation :: August 30, 2010 :: [ READ ]

Register your backyard nest site with NestWatch [ VISIT ]

Hackerspaces offer a Unique Opportunity for the Citizen Scientist

Webcam view of London Hackerspace

A typical dream of an active citizen scientist might be a fully equipped research laboratory and tinkering space conveniently established in one’s own garage or basement. Proper lab setup, whether a DIY bio lab, an electronics lab, or even a nuclear fusion lab, takes a great deal of planning, time, and at least some form of significant financial resources.

So, not everyone can implement personal lab spaces at home. And that is where the Hackerspace can be of assistance.

A hackerspace is a specialized open community lab where people with similar interests can meet, collaborate, experiment, and create. These are typically run as membership organizations with a board of directors and paying members, and many maintain non-profit 501(c)(3) tax status. Although Dynamic Patterns Research has not yet been directly involved with any particular hackerspace, the concept of this community format is exciting, and it is growing quickly in worldwide reach and popularity. Hackerspaces offer the essence of citizen science, and by distributing the burden of funding and management across the membership, they offer an accessible and efficient way for anyone to make their amateur research dreams come alive.

An international online hub (visit) provides a space for connection and collaboration between brick-and-mortar hackerspace organizations, along with how-to documentation and support for those interested in joining existing groups or creating their own. Many existing hackerspaces focus on the “physical sciences”: namely, electronics, software development, and making machines that go “ping”!

In a little more than one month, a San Francisco Bay Area hackerspace is attempting to raise enough funding to open a “biological” hackerspace called BioCurious. They have set up a Kickstarter project (visit) to quickly grow interest and support. If you live in the area, you should check out the new group: they will likely provide a great new opportunity for citizen scientists, as well as pave the way for the development of more “biohack”-spaces around the world.

… …

Hackerspaces All Over The World [ VIEW and Find one near you ]

A sampling of US big city hackerspaces…

For our friends in the UK, check out The Hackerspace Foundation [ VISIT ]

… …

UPDATE September 15, 2010: The BioCurious Kickstarter fundraising effort reached its goal of $30,000 with seven days to go! Congratulations to the group, and we’ll be watching their progress.

Distributed Computing Evolves into Distributed Thinking: A Path Toward The Singularity

The re-emergence of the citizen scientist hit fast-forward in 1999, when scientists at the University of California, Berkeley launched a new project to virtually connect millions of computers around the world to simultaneously process and evaluate radio signals from space. The gift wrapping of this program was a colorful and unique screensaver that would innocently chug along while a participant’s computer sat idle. In the background, the program was working hard to identify subtle clues of the existence of intelligent life somewhere out in the cosmos.

SETI@Home was not the first software of its kind (a little history), but it was the first popularized and most widely used distributed computing application that allowed anyone with an internet connection to take part in real scientific analysis. Although, over a decade later, we are still searching for life from above, the project is considered an amazing success: it harnessed more computing power, and processed the results with greater efficiency and (especially) at lower cost, than any supercomputer of its time.

Soon after the successful launch of SETI@Home, many more scientific applications were built on the distributed computing platform, and many more opportunities became available to the interested citizen. Evolutionary calculations, the search for drug designs to combat AIDS and cancer, analysis of data to extract the physical structure of the Milky Way, and the detection of gravitational waves all came into the arena (among many others). In particular, one extraordinarily complicated computational problem that scientists across many fields have been trying to crack for decades is protein folding: how does a chain of amino acids (the building blocks of life on Earth) configure itself into a complicated three-dimensional structure whose exact pattern determines its vital function?

The problem is one of energy minimization: every given chain naturally falls into a coiled state that happens to require the least amount of energy to maintain. Consider how much energy it takes to play the game Twister, and how much more energy it takes to win at Twister. Now consider how much energy it takes to sit on the couch with your arms at your sides watching a favorite movie. Couch potato might be your personal minimum-energy configuration, while your contortions during Twister require uncomfortably high energy.

Nature seems to figure this problem out because it can’t do it any other way: if the protein “feels” too kinked up, it just flips into something more comfortable. But human beings who want to understand the physics of this process are having a difficult time coming up with a clean mathematical representation of the protein’s experience. Computers, with their mighty brute force, are used to take a chain, calculate the energy required to maintain every possible configuration, and then look back over all of the calculated energies to see which configuration has the smallest value. A rather straightforward approach, but the number of statistical possibilities becomes immense as chains grow longer.
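To make the brute-force idea concrete, here is a minimal toy sketch (our illustration, not the actual software behind any folding project) using the classic two-dimensional HP lattice model: each candidate conformation is a self-avoiding walk on a grid, and the “energy” simply counts hydrophobic (“H”) residues that touch on the lattice without being neighbors along the chain.

```python
from itertools import product

MOVES = {"U": (0, 1), "D": (0, -1), "L": (-1, 0), "R": (1, 0)}

def fold_energy(coords, seq):
    """Toy HP-model energy: -1 for each pair of hydrophobic (H)
    residues that touch on the lattice but are not chain neighbors."""
    energy = 0
    for i in range(len(seq)):
        for j in range(i + 2, len(seq)):  # skip chain-adjacent pairs
            if seq[i] == "H" and seq[j] == "H":
                (x1, y1), (x2, y2) = coords[i], coords[j]
                if abs(x1 - x2) + abs(y1 - y2) == 1:
                    energy -= 1
    return energy

def brute_force_fold(seq):
    """Enumerate all 4**(n-1) candidate walks for an n-residue chain
    and return the minimum energy and one conformation achieving it."""
    best_energy, best_coords = float("inf"), None
    for walk in product("UDLR", repeat=len(seq) - 1):
        coords = [(0, 0)]
        for move in walk:
            dx, dy = MOVES[move]
            x, y = coords[-1]
            coords.append((x + dx, y + dy))
        if len(set(coords)) < len(seq):  # self-intersecting: unphysical
            continue
        energy = fold_energy(coords, seq)
        if energy < best_energy:
            best_energy, best_coords = energy, coords
    return best_energy, best_coords
```

For a four-residue chain like "HPPH", the hairpin turn that brings the two H ends together wins (energy -1), but the search space quadruples with every added residue, which is exactly the wall that distributed computing, and later Foldit’s human intuition, was built to climb.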

So here is where human intuition comes into play. More recently, the protein folding team decided to integrate the power of the human mind into the calculation process. They developed a game system (visit Foldit) that lets players, the citizen scientists, gaze at a potential configuration of a protein. The edges requiring higher energies are highlighted, and players can “play” with the configuration interactively. With each tweak of the structure, the energies are recalculated and displayed, and the human being can feel their way to the structure that seems to carry the least energy. Sort of how nature might do it, too … without the obvious conscious observer, but rather in a more self-organized fashion.

The results of the game can actually be tested: likely structures identified by the citizen scientists and recognized by the professional scientists can be generated in the laboratory and monitored to see how stable the folded pattern actually is for that protein. In fact, the first academic paper has just been published, and the author list includes some of the actual citizen scientist players of Foldit.

“Citizen science: People power” :: Nature News :: August 4, 2010

[ READ the Feature ] :: [ Download the PDF ]

So, it is this “people power” through distributed thinking that is bringing new success to the critically important and painfully difficult computational problem of protein folding. Similar results are being seen by Galaxy Zoo and its expanding Zooniverse platform, where nearly 312,000 at-home users help identify interesting astronomical structures or phenomena that require subjective classification decisions. The possibilities in these classifications are nearly endless, and programming an endless list of options into computer code is perhaps not the most effective use of a computer scientist’s time.

Identifying phenomena in our amazing universe through code is just not as effective as the slick, subconscious intuition of the alert human brain. This is a unique skill that neuroscientists do not even remotely understand, and it is obviously lacking in all computers. Yet it is citizen scientists who are using this unique skill to actually drive computational research and development. These efforts will help scientists better understand what results intuition can bring, and possibly how to develop computational platforms that perform in similar ways.

Maybe then, citizen science will help bring us even closer to the inevitable Singularity Event (learn more about it), predicted to occur around 2050. This is the future time when accelerating advances in computational power surpass those of the human brain, and things change dramatically. Computers will gain that unique, once human-only characteristic and will then continue, on their own, to accelerate their technology beyond that of human engineering. The final transitional technology that carries us to The Singularity will be the final invention of our species, and from that point on we will be connected to a powerful force that continues our evolutionary process at an incredibly high rate.

Maybe the further development of “distributed thinking” during the 2010s will be the key technology for The Singularity to occur, and maybe citizen scientists around the world will play the critical role in developing it. A fitting connection, of course, because it would represent the ultimate expression of the evolutionary advantage that Homo sapiens holds over all other existing species: that of invention. As a massive collective effort, our species will together participate in an evolutionary development leading to a fundamentally new era of life on Earth. As citizen scientists, we will work together to create and invent the next step in our own evolution, and with it bring about the next version of the humanoid, Human 2.0.

Firefly Watch REVIEW

Where did all of the fireflies go? Generations of families have spent countless evenings spotting and catching little glowing bugs in their backyards. After the sun set, seemingly hundreds of flashing lights would come out and float around outside. But many are now finding fewer of these flashes, and it is not yet clear what is causing the apparent decline in firefly populations.

Researchers from the Museum of Science, Boston, Tufts University, and Fitchburg State College are enlisting citizen scientists from all over the country to help map out and study firefly habitats to determine what environmental factors affect their geographic distribution and behavior during the summer. This is a wonderful opportunity for families to experience real amateur research and contribute to an important nationwide study of the evolution of habitats in our country.

The tasks involved in this project are relatively easy and will consume only a few minutes each week in the evenings … of course, more time may be spent collecting more valuable data. The project features an easy-to-use online data journal and provides updated maps of habitat observations across the country. For each data collection session, a few simple questions need to be answered based on ten minutes of observation in your backyard … a time that will prove a relaxing respite after a long day at work, or a fun time to bond with the children.

Several environmental factors are being explored in this study to find out more about what generally influences firefly activity. In particular, the researchers are looking at how lawn care practices, outdoor and street lighting during the evening and night, foliage coverage, farmland, and water sources might affect the habitat.

Firefly Watch also includes a nice educational overview of fireflies … or lightning bugs (they’re really flashing beetles!) … including how to identify different types and genders, and why and how the little buggies have flashing bums. This is yet another perfect opportunity to learn about science in nature with your family and then go out and experience it directly in your own backyard. Certainly, this is a way for younger students (and those newly-inspired adults!) to gain a deeper understanding and appreciation of nature. In addition, if time is spent first “book learning” about fireflies with children and then directly experiencing firefly behavior in the real world, a deeper connection between learning and experiencing can be developed, a skill that is certainly important as they continue their future education in the classroom.

Register for free online right away, so that you may collect as much data as possible this summer. Additional data collection will open up next summer, so that year-to-year firefly habitat trends may be developed. Recall your childhood memories, make new memories with your kids, and do real amateur research this summer with Firefly Watch.

“Firefly, oh so bright, how many in flight this night?” :: The Boston Globe :: July 7, 2008 :: [ READ ]

“Where Have All the Fireflies Gone?” :: Radio Interview with Adam South and Don Salvatore from Living on Earth distributed by Public Radio International :: July 18, 2008 :: [ LISTEN or READ THE TRANSCRIPT ]

If you have participated in the Firefly Watch program, please tell us about your experience and results by posting a comment below.

Last updated August 7, 2022