Out of the pool and into the ocean

“…if scientists could communicate more in their own voices – in a familiar tone, with a less specialized vocabulary – would a wide range of people understand them better? Would their work be better understood by the general public, policy-makers, funders, and, even in some cases, other scientists?” – Alan Alda

I love science.

I have spent many years teaching science and I feel like I have a pretty broad understanding of most scientific concepts.

However, last week I attended a three-day meeting with a variety of respected researchers who are studying the effects of the Deepwater Horizon oil spill on the Gulf of Mexico, and I was struck by how little of the presentations and panel discussions I was able to absorb, especially when it came to the more abstract aspects of their work. If I, a person with a marked interest and a science background, was struggling to keep up, what must it be like for the general public?

Of course, the audience at this meeting was made up of people who were in the know and didn’t need their science “dumbed down,” if you will, but the language seldom changes when efforts are made to communicate with those outside the scientific sphere. Science communication and outreach are fast becoming important facets of research and are starting to be tied to funding in big ways. With the looming sequester threatening to shrink federal support for research and development (R&D) by US$57.5 billion over the next five years (Nature), it is imperative that scientists build a case for the relevance of their work, and that means reaching the people who do not necessarily speak the same language they do, namely taxpayers and policymakers!

Where do information professionals come into play in all this? Our work is intricately tied to that of the researchers and academics we serve. Their cause is our cause, and our background in outreach, communication, and research can be a valuable tool in making scientific research more accessible and relatable.

[Science communication table. Source: Richard C. J. Somerville and Susan Joy Hassol, Physics Today, October 2011, p. 48]

Some Helpful Resources

-A self-directed course on scientific communication from Nature

http://www.nature.com/scitable/topic/scientific-communication-14121566

-Similar resource from the American Association for the Advancement of Science

http://communicatingscience.aaas.org/

-The journal Science Communication

http://scx.sagepub.com/

-Presentations from the National Academy of Sciences Colloquium “The Science of Science Communication”

http://www.nasonline.org/programs/sackler-colloquia/completed_colloquia/agenda-science-communication.html

-Science communication competitions!

http://famelab.org/

-National Science Communication Institute

http://nationalscience.org/

-A great presentation on social media for scientists

http://socialnetworkingforscientists.wikispaces.com/

If taxpayers paid for it, they own it

The push for open access in research got a big boost on Friday with the release of a policy memorandum from White House Office of Science and Technology Policy Director John Holdren. The memo directs federal agencies with more than $100 million in R&D expenditures to develop plans to make the published results of federally funded research freely available to the public within one year of publication, and it requires researchers to better account for and manage the digital data resulting from federally funded scientific research. This is a precursor to what will hopefully be the eventual passage of the Fair Access to Science and Technology Research Act (FASTR), which was introduced in both the House and the Senate on February 14, 2013. Check out this wiki from the Harvard Open Access Project for more information and to track the progress of FASTR in Congress.

Those of us in the know on this issue (and, let’s face it, that should be all of us in academia) have seen the writing on the wall for quite some time. It’s thanks to the continued activism, advocacy, and articulate passion of so many in the academic community that we are at this point today. If you have a moment, please read this joint letter from ARL, ALA, Creative Commons, PLOS, and many others offering support and a simple but resonant rationale for open access to scientific research.

http://www.arl.org/sparc/bm~doc/oawg_thanks_fastr_final-copy.pdf

Happy Holidays, Citizen Scientists!

[Photo: cedar waxwing, by Georgi Baird]

Everyone knows that birders, or nerders as I like to call them, are a dedicated bunch, willing to spend hours in one spot waiting for a sighting of an elusive species. For those of us with self-diagnosed ADD, that seems like a true test of fortitude! Birders play an important role in the scientific community, and each year they act as citizen scientists, collecting invaluable data that helps to further research and fuel conservation efforts. These efforts are mobilized each year during the Audubon Christmas Bird Count.

The 113th annual Audubon Christmas Bird Count will take place December 14, 2012, through January 5, 2013. The longest-running citizen science survey in the world, the Christmas Bird Count provides critical data on bird population trends. Data from the more than 2,000 count circles are entered after the count and become available to query here.
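
For anyone who wants to poke at the results themselves, here is a minimal sketch in Python of the kind of summary the count data supports. The file name and the column names are hypothetical stand-ins, not the actual format of the CBC query tool’s export.

```python
# Sketch: summarize Christmas Bird Count results by species.
# The file name and column names ("circle", "species", "count") are hypothetical
# stand-ins for whatever the CBC query tool actually exports.
import pandas as pd

counts = pd.read_csv("cbc_results_113.csv")

# Total individuals of each species across all count circles.
totals_by_species = counts.groupby("species")["count"].sum().sort_values(ascending=False)

# Number of circles reporting each species, a rough measure of how widespread it is.
circles_reporting = counts.groupby("species")["circle"].nunique().sort_values(ascending=False)

print(totals_by_species.head(10))
print(circles_reporting.head(10))
```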

The first year, in 1900, 27 birders participated, counting in 25 North American locations. Last year, more than 63,000 volunteers in more than 2,200 locations took part. Counts were held in the U.S., Canada, Latin America, the Caribbean, and the Pacific Islands.

[Photo: 1936 Christmas Bird Count]

The original counters tallied about 90 species. Last year volunteers recorded nearly 2,300 species among more than 60 million birds.

Audubon’s chief scientist, Gary Langham, points out that the count’s purpose goes beyond tallying birds.

“Data from the Audubon Christmas Bird Count are at the heart of hundreds of peer-reviewed scientific studies and inform decisions by the U.S. Fish and Wildlife Service, the Department of Interior and the EPA,” he said.

From all accounts it’s also a lot of fun!  Find a circle near you…

One More Tool for Your Kit

Have you grown weary of sifting through the countless bits of information about how to manage research data? Not to worry: SURA (the Southeastern Universities Research Association) has recently launched an institutional tool for research data management (RDM), developed by a working group formed with the Association of Southeastern Research Libraries (ASERL). The working group brings together CIOs and library professionals from SURA member institutions to explore collaborations for improving their ability to manage the rapidly growing volume of research data.

The five-page document, entitled the “Step-By-Step Guide to Data Management,” is succinct and to the point, and it provides links to all relevant outside sources. It was developed following a survey of the SURA membership to identify goals and projects for improving the management of institutional data. The authors took their inspiration from the DataONE Best Practices Primer, and while the guide breaks no new ground, it does provide a clear, easy-to-digest picture of current trends and best practices in data management at universities. Reaching consensus on practices like these is an important step toward ensuring that research data remain accessible well into the future.
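
To make one of those best practices concrete, here is a small illustrative sketch in Python that documents a dataset at deposit time by writing a plain-text README next to the data file. The field names and file names are my own choices for illustration; they are not taken from the SURA guide or the DataONE primer.

```python
# Illustrative only: document a dataset at deposit time by writing a plain-text README
# next to the data file. The fields are hypothetical, not taken from the SURA guide.
from datetime import date
from pathlib import Path

def write_readme(data_file: str, title: str, creator: str,
                 description: str, methods: str, license_terms: str = "CC BY") -> Path:
    """Create a minimal README.txt alongside the data file and return its path."""
    readme = Path(data_file).with_name("README.txt")
    readme.write_text(
        f"Title: {title}\n"
        f"Creator: {creator}\n"
        f"Date deposited: {date.today().isoformat()}\n"
        f"Data file: {Path(data_file).name}\n"
        f"Description: {description}\n"
        f"Collection methods: {methods}\n"
        f"License: {license_terms}\n"
    )
    return readme

# Hypothetical usage:
write_readme(
    "gulf_salinity_2012.csv",
    title="Surface salinity transects, northern Gulf of Mexico",
    creator="J. Researcher",
    description="Weekly surface salinity readings, May through August 2012",
    methods="Handheld CTD casts at stations S1 through S12",
)
```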

Of course, it comes as no surprise that the university library is highlighted as a resource not to be overlooked. As one fellow blogger put it, “while this may not come as a shock from a group that is half comprised of Research Library professionals, towering or expansive university libraries often have a significant amount of data to handle. Creating a database that can be searched hundreds of different ways of the myriad titles that exist is no small data feat.” The argument can be made, and has been made multiple times, that libraries are uniquely suited to play a pivotal role in the research data management process. Now we have one more resource to offer our patrons to assist them in managing all that sexy data.

Making Data Accessible to All

In honor of Open Access Week, I’d like to take a moment to highlight the work that the World Bank has done in the last several months to make its data freely available. An Open Access Policy for its research outputs and knowledge products was formally implemented on July 1, 2012, and will continue to be expanded and improved upon over the course of the coming year.

The policy states that, “The World Bank supports the free online communication and exchange of knowledge as the most effective way of ensuring that the fruits of research, economic and sector work, and development practice are made widely available, read, and built upon. It is therefore committed to open access, which, for authors, enables the widest possible dissemination of their findings and, for readers, increases their ability to discover pertinent information.”

The Open Knowledge Repository, the centerpiece of the policy, is the new home for all of the World Bank’s research outputs and knowledge products. The repository currently contains works from 2009–2012 (more than 2,100 books and papers) across a wide range of topics and all regions of the world. This includes the World Development Report and other annual flagship publications, academic books, practitioner volumes, and the Bank’s publicly disclosed country studies and analytical reports. The repository also contains journal articles from 2007–2010 from the two World Bank journals, WBRO and WBER.

The repository will be updated regularly with new publications and research products, as well as with content published prior to 2009. Starting in 2013, the repository will also provide links to datasets associated with research. While the vast majority of the works are published in English, over time translated editions will also be added.

Last week, the latest effort in the World Bank’s push for transparency and open access was unveiled in the form of Health Stats, a new database featuring information and data on health, nutrition, and population topics.

According to their press release, “Health Stats provides access to more than 250 indicators on health, nutrition and population in 200+ countries, covering topics such as health financing, HIV/AIDS, immunization, health workforce and health facilities use, nutrition, reproductive health, cause of death, non-communicable diseases, and water and sanitation. Users can pull data by country, topic, or indicator, and view the resulting data (and wealth quintiles) in tables, charts or maps, or access pre-made tables for quick query.”

The site draws on a variety of data sources, including administrative statistics and household surveys compiled by the World Bank Group and its client countries, the World Health Organization (WHO), the United Nations Children’s Fund (UNICEF), the Food and Agriculture Organization of the United Nations (FAO), the United Nations Population Division, the United Nations Statistics Division, the United Nations Population Fund (UNFPA), and the Organization for Economic Co-operation and Development (OECD).

One new feature that I particularly like is the Data Visualization Map, which allows you to animate the data to show how indicators have changed over decades. For example, here is a map of worldwide female life expectancy in 2011.
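
The indicator behind that map can also be pulled programmatically. The sketch below assumes the World Bank’s public data API and the indicator code SP.DYN.LE00.FE.IN for female life expectancy at birth; treat both as assumptions to verify against the API documentation rather than details taken from this post. Swapping in a different indicator code or date range gives the same view for other topics.

```python
# Sketch: fetch female life expectancy at birth for 2011 from the World Bank data API.
# The endpoint and indicator code are assumptions; check the current API docs before use.
import requests

URL = "http://api.worldbank.org/v2/country/all/indicator/SP.DYN.LE00.FE.IN"
params = {"date": "2011", "format": "json", "per_page": 400}

response = requests.get(URL, params=params, timeout=30)
response.raise_for_status()
metadata, records = response.json()  # the API returns [paging metadata, data records]

for record in records:
    if record["value"] is not None:
        print(f"{record['country']['value']}: {float(record['value']):.1f} years")
```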

Additional Resources

Updated information and helpful resources

http://blogs.worldbank.org/

Examples of user generated data visualization

http://worldbank.tumblr.com/

Useful information on accessing World Bank data

http://blogs.worldbank.org/opendata/your-top-5-questions-about-world-bank-open-data

Deepwater Horizon, Research, and the Future of Data Management

The 2010 Deepwater Horizon explosion was in every way a tragedy, killing 11 crew members and causing the largest offshore oil spill in US history. The cleanup efforts are ongoing, as is litigation against BP and Transocean, the contractor in charge of Deepwater Horizon. If there is any silver lining to be found in all of this, it is that BP has given $500 million to the Gulf of Mexico Research Initiative (GoMRI) to fund research on the long-term effects on the ecosystem of the 4.4 million barrels of oil that gushed into the ocean in the spring and summer of 2010.

The Gulf of Mexico has been relatively unexplored in comparison to other regions. In the 20 years before the oil spill, research on the Great Lakes received more than $1 billion in funding, while the Chesapeake Bay got just shy of half a billion. Spending over the same period on the much larger Gulf of Mexico: $85 million.

“It’s the hardest working of our ocean basins, but it’s the most underfunded in terms of research monitoring and science,” said Florida State University oceanographer Ian MacDonald. It’s safe to say that $500 million will go a long way toward bridging this research gap. There are many challenges, however, in attempting to draw conclusions about the long-term effects of the oil spill, because there is little existing data on the Gulf of Mexico against which to compare new results and determine long-term ecological consequences.

The collection and management of data gathered through GoMRI-funded projects is vitally important to research in this area and serves as a model for collaborative research projects of the future. Unlike some federal funding agencies, which can be vague in their requirements for a data management plan and offer little in the way of training and support, GoMRI has an entire division devoted to data management, the Gulf of Mexico Research Initiative Information and Data Cooperative (GRIIDC), and many of the consortia funded by GoMRI are following suit. The mission of the GRIIDC is “to ensure a data and information legacy that promotes continual scientific discovery and public awareness of the Gulf of Mexico ecosystem.”

Having a data management team embedded within a project both takes the pressure off the researchers, in terms of navigating the somewhat complex territory of data management plans (DMPs) and long-term stewardship, and ensures that the data will be accessible and usable for years to come. It also promotes collaboration and a holistic approach to scientific research. This is a model that should be considered for future use, especially for large-scale interdisciplinary projects such as those funded by GoMRI.
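
One practical payoff of that arrangement is consistency: an embedded data team can require that every dataset arrive with the same minimal, machine-readable description before it is archived. The sketch below is purely hypothetical; the required fields are my own invention, not GRIIDC’s actual schema, but it illustrates the kind of lightweight check such a team might run on incoming submissions.

```python
# Hypothetical sketch of a submission check an embedded data management team might run.
# The required fields are illustrative only; they are not GRIIDC's actual schema.
REQUIRED_FIELDS = {
    "title", "creators", "abstract", "collection_start",
    "collection_end", "spatial_extent", "contact_email", "license",
}

def missing_metadata(record: dict) -> set:
    """Return the required fields that are absent or empty in a submission."""
    return {field for field in REQUIRED_FIELDS if not record.get(field)}

# Example submission (made up for illustration).
submission = {
    "title": "Water column hydrocarbon concentrations, northern Gulf of Mexico",
    "creators": ["A. Researcher", "B. Student"],
    "abstract": "Monthly profiles collected during 2012-2013 cruises.",
    "license": "CC0",
}

problems = missing_metadata(submission)
if problems:
    print("Not ready to archive; missing:", ", ".join(sorted(problems)))
else:
    print("Metadata complete; ready to archive.")
```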