If you have been considering getting involved in Citizen Science and just haven’t found the time or the right project, then let this annual opportunity pique your interest! The Great Backyard Bird Count is hosted every February by the Cornell Lab of Ornithology and the National Audubon Society, and it takes as little as 15 minutes. It’s also fun, free, and perfect for the entire family to do together.
During the four-day event (Friday, February 16 through Monday, February 19, 2018), head out into your backyard and count the birds you observe. Next, simply submit your checklist of observations online or through a mobile app, and your data will be used by researchers for the rest of the year to study how birds are getting along in our environment. This is the 21st year for the GBBC and last year more than 210,000 participants provided bird observations of nearly 6,000 species!
Bird populations shift throughout the United States, and observations of these behaviors are a vital window into environmental trends. For more details, check out the 2017 results and the many great photos sent in along with the observations.
Sign up today to get prepared for your backyard observations and download the free mobile eBird app (iOS | Android) for all of your bird observations.
Let us know if you participate this year and anything interesting you observe, and share your bird photos on our Facebook page!
In the late 1800s, a small, well-formed cylinder composed of platinum and a little iridium (the same alloy used in fine platinum jewelry today) was defined by the international scientific community to have a mass of exactly one kilogram. This was not a special rock dug up from the Earth, nor a once-in-a-lifetime meteorite fallen from the heavens, but a man-made object bestowed with this great and important property, to be used by generations of scientists and non-scientists alike. (Happy 125th Birthday, Kilogram.)
For me as a student of physics, and likely for many professional scientists of the 20th Century, there was a lingering empty feeling from this type of “pull-it-out-of-the-air” proclamation for something so fundamental to the calculations and theories describing how Nature works. The speed of light, c, is a fundamental number that is directly measurable (try it yourself with chocolate). The second, s, our unit of time, is measured from a naturally occurring phenomenon with unprecedented regularity (originally based on the rotation of the Earth, with all of its wobbles; now on an energy level transition in an atom near absolute zero temperature). The meter is marked off (since 1983) by the distance traveled by light in a measured fraction of a second. So, most other important units are built up from more fundamental definitions. Yet the kilogram, with its smooth lump of metal, is still thrown into this fundamental mix.
For example, the Newton, N, is a unit of force measured from the well-loved equation F = ma, and carries the units of kilogram · meters per unit second. If the value of one kilogram was set only as the collective whim of humanity well over one hundred years ago, then what does that say about every calculation of force since that time? Well, probably not much, since we’ve been working just fine with it ever since. If the fundamental value changes, it just scales all other values with it. However, it might just be nice, or more reasonable, or more scientific to have the value of the kilogram defined from other measurable fundamental values so it may never be questioned or changed (or stolen for a private collection, or fall through a crevice in the Earth after a quake never to be found again).
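To make the unit bookkeeping concrete, here is a tiny sketch (in Python, with illustrative numbers of our own choosing) showing how a force in newtons quietly inherits the kilogram through F = ma:

```python
# A newton is defined in terms of the kilogram: 1 N = 1 kg * m / s^2,
# so every force calculation leans on the definition of the kilogram.
def force_newtons(mass_kg, accel_m_per_s2):
    """Force from Newton's second law, F = m * a."""
    return mass_kg * accel_m_per_s2

# A 1 kg mass under standard gravity (9.80665 m/s^2) weighs about 9.8 N.
print(force_newtons(1.0, 9.80665))
```

If the definition of the kilogram shifted, every force computed this way would simply scale along with it, which is exactly the point made above.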
The mission of cleaning up the fundamental definition of the kilogram has been underway for many years with an international resolution declared in 1999, at the turn of the century. Now, this latest collective whim of scientists is to derive the value of the kilogram from a very fundamental number found in the realm of quantum mechanics, called Planck’s constant, h.
First described at the turn of the previous century, in 1900, by Max Planck, this constant represents the ratio of the energy (E) of an atomically small oscillating object to its frequency (f) of vibration. The relationship, called the Planck-Einstein relation, E = hf, became a basic underpinning of the development of quantum mechanics. The proportionality constant h made an appearance in a plethora of key equations that came to describe the Universe at its tiniest scales, including the counter-intuitive notion that very small things can behave like a wiggling wave and a bouncing particle simultaneously.
The actual value of the Planck constant is likewise incredibly tiny, measuring in at only 6.626 × 10⁻³⁴ joule · seconds. So, to define something else directly from a measurement of this value, insane accuracy is required. This is where the expertise of the National Institute of Standards and Technology (NIST) became a valuable player in establishing the new life of the kilogram.
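To get a feel for just how tiny h is, here is a quick sketch of the Planck-Einstein relation E = hf (the frequency for green light is an illustrative round value we chose):

```python
PLANCK_H = 6.626e-34  # Planck's constant, in joule-seconds

def photon_energy_joules(frequency_hz):
    """Energy of one quantum from the Planck-Einstein relation, E = h * f."""
    return PLANCK_H * frequency_hz

# Green light oscillates at roughly 5.6e14 Hz, so a single photon
# carries only a few tenths of an attojoule (~1e-19 J).
print(f"{photon_energy_joules(5.6e14):.3e} J")
```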
Weighing in with h
The advanced measurement technology at NIST to be used for the kilogram is called a watt balance and is a modern-day extension of the classic equal-arm balance dating back to at least the second century BC. In that original design, an unknown mass is visually balanced by placing a collection of known masses on the opposite side of the device. When the two sides rest at an equal height (i.e., the same force due to gravity, F = mg, acts on each tray), it can be assumed that the unknown mass equals the known mass. This millennia-old approach may even have fostered the human drive to base any definition of mass on a known sample, leading to the double bell jar and platinum cylinder we find locked away today in a suburb of Paris.
The watt balance sets up a similar arrangement using a comparison of forces. This time, instead of watching gravity do its thing, the device measures electrical and mechanical power, hence the name “watt balance,” where the watt is the unit of measurement for power (as in 1.21 gigawatts… Great Scott!). Here, a highly controllable force resulting from electromagnetism balances the gravitational force on the unknown mass. Flowing a current through a coil of wire inside a magnetic field on one side of the watt balance creates a force, and, if aligned appropriately, this force shifts the two sides into balance at a particular current supplying this electrical power.
This initial measurement provides a value of the unknown object’s mass in terms of the current, the magnetic field, and the physical dimensions of the coil.
However, we are looking for more: a direct relationship with the tiny and fundamental value of Planck’s constant. So, a second measurement is taken on the exact same setup of coil, alignment, and magnetic field to determine the voltage generated in the circuit when the coil moves through the magnetic field. This is the mechanical power generated during the balancing experiment.
Finally, the math representing these two measurements is merged, giving a relationship between the mass of the unknown object and the current and voltage. Replacing the current and voltage with their “quantum” mathematical versions (via the Josephson effect and the quantum Hall effect), which both contain the fundamental Planck constant, the mass can be directly expressed in terms of h. (If you are interested, check out an overview of the math.)
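The merged math can be sketched numerically. In weighing mode the electromagnetic force balances the weight (m·g = B·L·I), and in moving mode the coil’s motion induces a voltage (U = B·L·v); dividing one relation by the other cancels the hard-to-measure geometric factor B·L and leaves m = UI/(gv). A minimal sketch with made-up (decidedly non-NIST) numbers:

```python
# The two watt-balance measurements, combined.
# Weighing mode:  m * g = B * L * I   (electromagnetic force balances weight)
# Moving mode:    U = B * L * v       (moving the coil at speed v induces U)
# Dividing cancels the geometric factor B * L:  m = (U * I) / (g * v)
def watt_balance_mass_kg(voltage_v, current_a, gravity_m_s2, velocity_m_s):
    return (voltage_v * current_a) / (gravity_m_s2 * velocity_m_s)

# Purely illustrative example numbers:
m = watt_balance_mass_kg(voltage_v=0.5, current_a=0.01,
                         gravity_m_s2=9.80665, velocity_m_s=0.002)
print(round(m, 4), "kg")  # roughly a quarter of a kilogram
```

In the real instrument, U and I are themselves tied to h through the Josephson and quantum Hall effects, which is how the mass ends up expressed in terms of Planck’s constant.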
Historically, this mathematics and experimentation on the watt balance have been used with a known mass to accurately calculate the value of h. Flipping the same equation on its head, if a “known” value of h is instead plugged in, then a value for the “unknown” mass, m, may be calculated.
And just with that one mathematical flip, we now have a fundamental definition of the kilogram based on Nature with quantum mechanics being used to describe a macroscopic quantity.
Extreme Accuracy Makes a New Standard
NIST has been building and operating watt balances since the early 1980s in order to nail down our “known” value of h. The latest generation, dubbed NIST-4, began operation in 2015 with specialty modifications to become an international standard for measuring mass. To be a standard, ultimate precision is the goal, and NIST-4 is working to master its measurements with a relative uncertainty of 0.00000003 (3 parts in 10⁸).
The international scientific community is serious about this new definition, and there is a deadline to complete all of this work. In late 2014, the International Committee for Weights and Measures (CIPM) established a roadmap of effort toward officially agreeing on the new definition of mass. This plan calls for consistent measurements of the Planck constant to within a relative uncertainty of 0.00000005 (5 parts in 10⁸), placing NIST’s goal in comfortable territory. The end of the road will come at the 26th Meeting of the General Conference on Weights and Measures (CGPM) in 2018, during which the new unit of mass is expected to be adopted.
Good Accuracy Makes for Extreme Science at Home
This level of extreme accuracy should certainly be left to national labs such as NIST. However, the foundational idea behind the balance has been around for centuries. It is with the advancement of our understanding of the quantum world that we now have mathematics relating this type of measurement to one of the most fundamental values representing our Universe, Planck’s constant, h.
So, what if we could now measure — with reasonably good accuracy — h at home? You can … just try building it with LEGO®.
The same team working on NIST-4 developed a recipe for designing and building an at-home version of the watt balance. For around $400, and with roughly 1% (0.01 relative) accuracy, masses may be measured at home using the same technical concept NIST will use in 2018 to provide internationally accepted scientific measurements of the kilogram. The shopping list includes LEGO® (of course), copper wire, off-the-shelf laser pointers, free data acquisition software, a data acquisition interface (this is the major expense, but it will open up an enormous new world of experimental opportunities at home!), several permanent magnets, and lots of building and testing fun with the family.
While this might seem a bit over-the-top for an at-home utility, the same device can also take a known value of a mass and measure the fundamental value of Planck’s constant. Tiny physics with big ideas right in your own basement or garage.
Now that the idea of building with LEGO® while doing some excellent experimental physics has you ready to start ordering parts, you might first dig deeper into the NIST efforts to develop the new standard for the kilogram (download article*). Then, go ahead and dive into the instructions for building it all at home, which are included below for your immediate reference.
Chao, L. S., et al. “A LEGO Watt Balance: An apparatus to determine the mass based on the new SI” [ download ]
* R. Steiner, E.R. Williams, D.B. Newell and R. Liu. “Towards an electronic kilogram: an improved measurement of the Planck constant and electron mass.” Metrologia. 42 (2005) 431-441. [ download ]
A note to the reader: This article requires following special instructions to watch the videos below. It’s also recommended you be on a desktop computer, but if you are on a mobile device (which won’t let you play two videos simultaneously), simply partner with a friend to play the second video.
There is a long-standing urban legend claiming toilets situated in the Northern Hemisphere flush the draining water with a counter-clockwise rotation, while in the Southern Hemisphere it all spins down clockwise. The Coriolis effect, a real observable effect described by physics, is said to be the culprit. However, if you have experimented with this observation in the past (yes, take a moment to go and check your toilet bowl now), you may have been disappointed to discover just the opposite. You might even have tried a different drain and seen yet another rotation in the same house.
Unfortunately, toilet bowls, sink drains, and household bathtubs are too small in scale for the effects of the rotation of the Earth to be visible in everyday observation. In fact, if you were standing at the equator, you’d be moving over 1,000 miles per hour, and this rotation speed gets slower as you move closer to each of Earth’s poles. It is this constant rotation, which you don’t even notice, that provides a rotating reference frame for any object moving about the surface of the Earth. Since one full rotation takes 23 hours, 56 minutes, 4.0916 seconds (called a sidereal day, the time for a single spot on the Earth’s surface to complete a full rotation, whereas the familiar 24-hour day is based on the Sun returning to approximately the same location in the sky), the effect of this rotating frame of reference is quite small on most objects we observe in our daily lives, like our flushing toilets. On the other hand, physical events on the scale of cyclones clearly demonstrate the clockwise vs. counter-clockwise rotations depending on the hemisphere of the storm.
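The 1,000-miles-per-hour figure is easy to check for yourself: divide the circumference of your circle of latitude by the length of the sidereal day. A quick sketch (using a standard value for Earth’s equatorial radius, and treating the Earth as a sphere, which is plenty close for this purpose):

```python
import math

EARTH_EQ_RADIUS_MILES = 3963.2                  # standard equatorial radius
SIDEREAL_DAY_HOURS = 23 + 56/60 + 4.0916/3600   # 23 h 56 m 4.0916 s

def rotation_speed_mph(latitude_deg):
    """Eastward speed of a point on Earth's surface due to rotation alone."""
    circumference = (2 * math.pi * EARTH_EQ_RADIUS_MILES
                     * math.cos(math.radians(latitude_deg)))
    return circumference / SIDEREAL_DAY_HOURS

print(round(rotation_speed_mph(0)))   # equator: just over 1,000 mph
print(round(rotation_speed_mph(45)))  # mid-latitudes: noticeably slower
```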
Hurricanes might be incredible to watch on the news, but they are too frightening to experience directly. So, an at-home physics experiment was conducted on each half of the world by Destin Sandlin from Smarter Every Day and Dr. Derek Muller from Veritasium, and cleverly recorded for simultaneous viewing of the results.
Now, here is where the important instructions come in: If you are on a desktop computer, click play on the upper video (recorded in the Northern Hemisphere) and watch for the countdown. At just the right moment, click play on the lower video (recorded in the Southern Hemisphere) and watch both videos simultaneously. If you are on a mobile device, have a friend click play on the second video at the end of the countdown. You might also try expanding your desktop web browser to full-screen mode (try hitting the F11 key) to make sure you can see both clearly. The videos and music are synchronized, so if you don’t think you have them rolling at the same time, reload this page and try again. It will be worth it.
Discover the truth about toilets and see firsthand what it is really like to live in a rotating frame of reference (since you probably never realized it before).
This evening just after sunset, the crescent Moon was positioned in a beautiful triangular alignment with Venus and Jupiter. (view the skymap) I took the kids out to try using the binoculars to see the Moon — which they certainly also just used to walk around the yard finding one another! — and to talk a little about the two planets and how cool it is that we can see them with our own eyes.
These slightly in-focus images were taken with a very simple Nikon CoolPix S8100 auto focus in night landscape mode on a tripod.
Accessing the absolute latest in scientific communications directly, as an independent amateur or citizen scientist, has been a financially daunting, practically impossible prospect for decades. The top research journals carry high subscription rates (price out one of the best for yourself), and the science professional relies on their employing institution to cover the costs of access through the resident library budget, save for a personal subscription to their most treasured journal.
Of course, a great deal of front-line scientific research is funded by governmental agencies, which translates into taxpayer dollars. So, shouldn’t the taxpayer be able to access the fruits of their financial investments? Or should the taxpayer simply expect to benefit later, after the results of research are implemented by companies and institutions? To avoid opening a messy can of political warfare here, we will instead focus on the fact that there has been a rapidly growing trend within part of the scientific publishing industry itself to provide open access (“OA”) to the most recent articles from scientific research. This trend has been influenced by both the scientific authors and their funding agencies.
A traditional culture about to change
The traditional business model for scientific journal publications is for professional scientists to submit their draft papers to a group of peers, typically in related fields of expertise, who brutally review, critique, and give feedback on the work. Revisions are made until the peers accept the paper, and it is published in a future volume of the journal. Individuals or, most typically, institutions then pay for access to printed and digital versions of the published papers. Prices for access are high: it’s expensive to pay for the printing, editing, and peer review of such technically challenging and critical information, and a bit of profit has to fit in there as well.
To change this culture and allow for the free dissemination of scientific advances, new models are being developed and tested. Money still has to be made, of course: the expenses are still present, even if the published materials are openly available to the readers, and even if they are solely digital presentations. One such model requires the author to pay a significant fee to be published (assuming they passed muster with the peer review), or if the author can’t bite the financial bullet, then their funding grant money or institution can help out with the bill. This approach is referred to as “Gold OA.”
An alternate version, which fits into the traditional subscription publishing model, is self-archiving of published work by the individual author. If the journal offers this permission, which is more often now required by funding agencies, the author maintains a digital copy of the work on a personal or institutional resource separate from the publisher. This “Green OA” approach is dependent on the individual author’s follow-through on archiving and requires a separate cataloging service to provide any sense of organization.
The OA Explosion
The Open Access movement significantly expanded during the first decade of the 21st Century. A detailed study of this growth by Laakso, et al., published in 2011 (available as open access, of course), found that about 19,500 open access articles in 740 journals were available in 2000; by 2009, nearly 192,000 articles could be accessed for free in 4,769 journals.
The Public Library of Science (PLoS), which started as an advocacy group in 2000 and expanded into an OA publisher in 2003, has arguably been the most successful innovator and proponent of the open access movement in scientific communications. PLoS currently follows the author-pays-to-publish-after-peer-review model and has been publishing cutting-edge scientific research across seven journals containing tens of thousands of peer-reviewed articles.
Announced quite recently (January 30, 2012), an even more extreme model of open access will launch later in 2012, utilizing post-publication open peer review with the ability to revise published work after releasing it to the public. This new approach, from the “Faculty of 1000” (F1000) group, will use a similar pay-to-submit model. Called F1000 Research, the system will also allow an “open” format for how information is presented, which may include poster presentations, traditional written articles, graphs and charts, and even raw data. The published information will be immediately available after a simple formal check by F1000 advisers that the submission is scientifically relevant, and registered readers will have the opportunity to provide comments and ratings.
The other side of OA
On the other side of this seemingly exciting new trend for the seeker of open knowledge, the author-pays-to-publish model can be a major hurdle for independent amateurs wanting to publish their work in peer-reviewed journals. The fees are certainly steep, and it is safe to assume that most amateurs do not have the personal funds to move their work through the publishing system. Just as major funding agencies, like the National Science Foundation, require open access availability and provide assistance with publication fees for scientific professionals, the independent researcher will need to search for outside financing to support the work if it is to intermingle with the professionals.
Although it certainly isn’t required that an independent researcher publish their work through a traditional or open access route, it is the peer-review process that really is required to vet the work and help develop it into a final form that provides accurate and valuable information to the rest of the world. The principle of peer review is a critical resource for the scientific community to ensure that rigorous information is disseminated, and the independent should not stand outside this, or at least some form of, quality control. If independents want to be taken seriously, they should try to play with the majors and deal with the financial burdens in innovative ways.
So, the long-term success of Open Access seems rather exciting for the amateur and citizen scientist. It is certainly exciting for Dynamic Patterns Research, which is not a large academic or commercial organization with a massive budget to cover massive journal subscriptions. However, OA development is not a straightforward path for the future of the business of scientific publishing. Quality is never really free, and for organizations to provide the important peer review and publication services that we need and desire, expecting them to be free might mean significant sacrifices somewhere else.
The debate between the traditional and OA models has been brewing for years, and although the growth in OA resources has been substantial recently, there is still a long way to go to discover the secret formula for a sustainable Open Access business model. The premier publishing group Nature (itself a rather expensive journal publisher) hosted a web debate forum in 2004 on the issue of access to scientific literature. In particular, Kate Worlock wrote a “pros and cons” review describing many important concerns that OA would bring to publishers, all of which would directly impact the end user.
Most notably, publishing in the digital age is heavily dependent on new technologies, and advances and innovations require great investments to develop and implement. With the popular pay-per-article model at current typical rates ($500 to $2,500), publishers can expect a great deal less revenue with which to meet operational expenses, invest in the future, and make a profit. And making a profit is not an evil activity: it is from this profit that future growth and beneficial investments become possible. With reduced profits, the potential lack of investment ability could bring scientific publishing to stagnation and irrelevancy in the marketplace, with an obvious and direct negative impact on the scientific community.
As suggested above, although author fees could be covered by funding sources, host institution libraries might also be on the hook to support their resident researchers. The open access payment model only shifts the costs from one line-item category to another, and as fees grow in the future, budgets may continue to be equally stressed. In addition, larger research-focused academic institutions with significant output might end up subsidizing the pay-per-article fees relative to smaller teaching-centric colleges or commercial organizations with fewer annual submissions from their faculty.
Possibly the most critical concern is that many important scientific journals are published by academic societies, which host additional activities and benefits for the scientific community. The profits from a society’s publishing division allow for the development of membership programs, conferences, and other benefits. With a loss of publishing income, member groups will have to make up the revenue elsewhere, likely from the pocketbooks of the members themselves.
The Federal Government’s Conflicting Approach
As already mentioned, the United States federal government has mandated that research funded by taxpayer money be published in an openly accessible way. For example, the National Institutes of Health requires all of their funded research to be listed with the government-hosted PubMed Central. However, just recently, on December 16, 2011, a bill called the “Research Works Act” was introduced in the House of Representatives, proposing to restrict the existing mandate for network dissemination of published work without the publisher’s prior consent.
Certainly this is a tantalizing development, and one might cynically think that the publishers have something to do with it … and one might be right: the Association of American Publishers announced strong support for the legislation. A thorough review of this complex debate was offered a couple of weeks ago by The Chronicle of Higher Education, and it is interesting to note that many academic publishers, like MIT Press and Oxford University Press, have already expressed their opposition to the AAP’s position and are clearly trying to distance themselves from the anti-open-access side of the debate.
The primary point from the publishers is that although federal funds certainly initiate some of the research, it is only through the independent professional efforts of the publishers, providing peer review, analysis, editing, and the final development of the work into a format desired by the community, that the research can even be presented for others to access. And this independent presentation, they argue, should not be controlled by the government on a case-by-case basis.
Making OA revered
With all of the issues scientific publishers face in dealing with the inevitable advancement of Open Access, it will still largely be up to the scientific authors themselves to complete the transition to sustainable OA. In the culture of academic scientific research, recognition is still critical, as the old mantra of “publish or perish” still drives many young post-docs and non-tenured faculty. Listing twenty articles in unknown journals might not provide as much prestige as one killer article in the top journal in the field.
Scientists are still getting a handle on what OA resources are available today, and although many may strongly support making their work freely available, there still comes a necessarily selfish point where one’s own career and security are paramount, and selecting the right journal for submission becomes vital. A detailed 2006 survey of researchers, reviewed by Alma Swan, describes the state of mind of authors in their selection of journals and their concerns and ideas regarding Open Access. Of particular note, many researchers are still not aware of the available OA resources in their field, as they likely remain focused on the publications they “grew up with” during their own education. With the current generation of new scientists, it will be up to the OA publishers to bring their journals to distinction, both in the view of their respective scientific communities and in the eyes of the individual scientists working in the field.
Dynamic Patterns Research has developed a concise reference list for reaching open access scientific knowledge, and tries to highlight the most important resources currently available. This reference will evolve as major changes occur in the availability of resources. Not only will DPR be utilizing these resources frequently, but we hope that it will provide an exciting portal for our readers into some of the top scientific advances happening today.
Update (February 17, 2012): A significant group of professional mathematicians is taking a stand against a major publisher. A key issue seems to be that many professional scientists provide critical editorial services to the publishing houses, as volunteers, while the publishers’ profits increase at significant rates.
“Mathematicians Organize Boycott of a Publisher” :: New York Times :: February 13, 2012 :: READ
On November 8, 2011, in the late afternoon (CST), a rather large space rock will fly within about 200,000 miles of our home. There is no chance that it will impact this time around, and it has very minimal chances of doing so for the next several hundred years.
This certainly isn’t the first time large asteroids have whizzed by Planet Earth, but what is exciting is that, for the first time, astronomers have had a reasonable heads-up to look for such a large object so close before the flyby. This might be a little disturbing, of course, as this “first” represents a significant weakness in our past success at identifying potentially dangerous near-Earth objects. Dynamic Patterns Research wrote about this important issue earlier this year, with a focus on how amateur researchers can play an important role in early detection.
The asteroid will take about 11 hours to cross our skies, and amateur astronomers in North America should be able to glimpse 2005 YU55 with nice backyard telescopes. A detailed path chart was generated courtesy of Sky and Telescope (VIEW MAP), and you may read more about the flyby along with additional observing tips:
Kelly Beatty, “Mini-Asteroid Makes a House Call,” Homepage Observing, Sky and Telescope, November 1, 2011. [ READ ]
Watch how NASA is planning to track the close approach of 2005 YU55: