
EasyBib’s New Extensions: two/too “easy buttons”?

This past week, in its Educator Blog, EasyBib announced its Chrome Toolbar Extension, available for free download in the Chrome Web Store.

Essentially, the tool allows users to do two things:

  1. Automatically cite web sites with one click using the EasyBib Toolbar.
  2. Receive advice on the credibility of the web site you’re citing.

[Screenshot: the EasyBib extension in the Chrome Web Store]

The toolbar extension offers a drop-down menu from which users may choose to Cite on EasyBib, View Bibliography, or check to see if a site is credible.

[Screenshot: the extension’s drop-down menu]

When you select Cite on EasyBib, a pop-up window shares citation information filled into a form.  Sometimes the form populates automatically with data.  Sometimes it does not.  The default citation type is not always right, and students should be alert to the opportunity to change the source type, for instance, from website to journal article.

[Screenshot: the Cite on EasyBib pop-up citation form]

[Screenshot: the citation form populated for a JSTOR article]

I am grateful for easy button number one.  Automatic bibliography generation is a mighty handy tool, especially for those of us who can recognize format types and can scan finished citations for accuracy.  It’s really nice to be able to send them our bibliographies on the fly.

Number 2 is a different story.

While I applaud EasyBib’s attempt to focus students’ attention on credibility, evaluating documents is not an activity I want to outsource.

IMHO, it’s simply not the right message. It’s just not so easy.

Evaluation is contextual.  It’s a thinking process.  It’s a learning process.

I don’t want my students to believe that they can opt out of the process and accept the decision of this particular easy button, especially when they have to guess at the reasoning behind the judgment.


[Screenshot: the credibility rating for an article from The Onion]

My quick searches reveal that my databases seem off the EB radar.  School Library Journal and its blogs are not yet evaluated.  Blogs, even those by experts, are not yet evaluated.

Everything from the New York Times, including every letter in the Letters section, is labeled credible.

Everything on CNN, merely because of its presence on the domain, appears to be labeled credible–comments, entertainment videos, ads, etc.

Pages from Wikipedia and YouTube are labeled maybe credible.

A site I refer to all the time for useful advice, eHow.com, is labeled not credible.

While it immediately found this article in The Onion suspect, it doesn’t really give me any clue as to why.

Stormfront’s Martin Luther King hate site is labeled not yet evaluated.

[Screenshot: Stormfront’s Martin Luther King site, labeled not yet evaluated]

Sites like NARA, PBS, The Library of Congress, and BBC came up as credible every time I searched.  But students need to evaluate the credibility of the specific items they choose to use on those portals individually.  And then, of course, even unreliable sources offer us a valuable lens on time and period, that is, when we critically evaluate them.

EasyBib does not explain its specific judgments.

When a site is labeled not yet evaluated, users are led to Evaluate it! using EasyBib’s 11-page Web Evaluation document, which guides them through thinking about and measuring credibility through a series of tests based on authorship, bias, citations and links, currency (when it was published), publisher, etc.  These metrics, combined with input from millions of student users, seem to form the basis for the toolbar extension’s judgments.

[Screenshot: EasyBib’s Web Evaluation guide]

Do we want this particular task to be easy?

In an information landscape newly rich with user-generated content, evaluation is a fuzzier process than it once was.  It can also be an exciting opportunity for learning.

Depending upon a variety of contextual factors, as well as authorship, blogs and tweets may have new credibility as primary sources.  But a toolbar extension cannot discern these things.

It’s not just about the traditionally measured authority of the source, it’s about the context beyond.  It’s about the specific information need and the reason behind the creation of the content.  In fact, it is possible that, depending upon context, all sites may be credible.

I prefer David Warlick’s goals-based evaluation approach.

David suggests:

As students’ information products should be based on teacher or student established goals, evaluating the material that they consider using in their products should also be goals-oriented. Rather than judging the material based solely on itself via an examination instrument that has nothing to do with the student’s work, it should be judged from the perspective of what the student wants to accomplish.

From this standpoint, we would not ask, “Is the author qualified?”, but, “What aspects of the author’s background help me accomplish my goal?” Under certain circumstances, a web page published by a neo-nazi organization might actually be appropriate for an assignment, while other resources, produced by people with credentials, would not. It depends on what the student wants to accomplish.

This approach actually serves three interesting purposes.

  • The student is focused on drawing supporting or appropriate information into the project rather than just filtering “bad” information out.
  • The student gathers information about the information.
  • As students approach information with their own goals to accomplish, they are less likely to be influenced by the goals of those who generated and published the information, which has interesting implications for media literacy.

I want my students to think through their credibility decisions one by one. I want them to understand what they are holding and decide on its relative value to their inquiry.  I want them to spend some time determining the authority and relevance of the documents they discover. I don’t want them to rely on the word of a yes/no crowdsourced algorithm.

Inherently, EasyBib, easy button number two is not an easy process.

Reference:

Warlick, David F. “Evaluating Internet-based Information: A Goals-based Approach.” Meridian. North Carolina State University, June 1998. Web. 23 Oct. 2012. <http://ncsu.edu/meridian/jun98/feat2-6/feat2-6.html>.
About Joyce Valenza

Joyce is the teacher-librarian at Springfield Township High School, a technology writer, and a blogger. Follow her on Twitter: @joycevalenza

Comments

  1. Kerry Kitka says:

    Hi Joyce,

    Thanks for taking the time to check out our new toolbar extension and tell your readers about it! We’re so happy that you use EasyBib’s tools.

    Just to clarify a bit about what seems to be confusion regarding the Website Evaluation component of this extension: The websites that are currently evaluated in our database were all done by a team of experts at EasyBib. The screenshot and link you provide at the bottom of your article, http://www.easybib.com/kb/index/view/id/172/page/1, is the same page the user will be taken to when they click “learn more” from the toolbar. This page explains in detail just how our actual humans evaluated these sites. We absolutely agree that credibility is not something to be left up to an algorithm or aggregate user opinions. We determined the best practices of top librarians from research universities while creating our criteria.

    This is also why you’ll see that many sites were not evaluated yet. We don’t rely on anything but good old-fashioned human brain power, so there is no way we could have every website evaluated. We also agree that this process is subjective, open to interpretation, dependent upon purpose, etc. That is why this tool is meant to be a starting point for students. It exists to provoke thought, to make students aware that they need to be considering the quality of their sources, rather than blindly citing the first Google result they happen upon.

    We understand your concern that this tool doesn’t appear to explain its reasoning to the user. This extension is just a representation of the full Website Evaluation tool you will find on EasyBib. At the moment you can only access all of its features with a premium account and at the actual EasyBib site. What you will find there, however, is that the full tool offers a complete breakdown of how every site is evaluated and how it scored within each specific criterion.

    I hope that clarifies some of the things you had misgivings about. This is absolutely a tool, and like any tool it is only as effective as the user who wields it. We never want to outsource critical thinking or make things “too easy”, but rather make the pathways to information literacy more evident so that our students will be more likely to use them.

    Best,
    Kerry Kitka

  2. Why not focus on teaching our students to THINK INDEPENDENTLY on sources, digital or not, instead of trying to find short cuts to knowledge?

    It has so much to do with understanding, culture, learning, broadening of horizons that they need to learn the skills the hard way. With help from librarians, teachers, experience and… life!
