Friday, September 21, 2012

Gamification 6 - CAP 115

Deep learning involves the transfer of learning to new situations.

To reiterate the goals: this game is a structured task in which students use problem solving collaboratively (by working in teams) to apply concepts (advanced searching techniques) to get relevant results (evaluation of resources). They will negotiate meaning (provide each other feedback in teams as they search and afterward as they report out).

The professor will monitor the follow-up transfer of learning to new situations (students searching their own topics, a.k.a. secondary research questions, in both databases and Google) while grading students' research papers.

In the next 2 class sessions, I will need to emphasize the goals, be firm about the rules (read through the directions together or have the students repeat the rules back to me), inform students that they will demonstrate their techniques on the instructor machine, and, when they are reporting out, ask the other students to listen to their peers to validate their own learning.

Application of phenomenographic principles:

  • teachers must draw out learners' understanding (calling on students reveals incorrect as well as correct concepts) and connect to prior knowledge (the structured task asks students to apply their theoretical knowledge of the advanced searching concepts); 
  • learning is coming to understand something differently after engaging in a learning task (seeing how the techniques lead to relevant resources), relating examples to the big picture/theory (evaluating the resources for relevancy), and reflecting on its meaning (relating techniques to multiple databases and Google; reporting out on their successes); 
  • the tasks must be relevant to the learner's world (their secondary research papers for the class); 
  • the teacher has to figure out how the tasks can lead to new perspectives (a.k.a. learning) - multiple class sessions in which to practice, observe, and revise, conversations with colleagues, and peer observation have all been helpful.

Thursday, September 20, 2012

Gamification 5 - CAP 115

In order to help the students focus on the search strategies, I reworked the form:


Team members names:_____________________________________________________

Title 1:__________________________________________________________________

Subjects or keywords if title isn’t sufficient to judge relevance:____________________
_______________________________________________________________________

A. Relevance: 10 points for each source related to topic: _____ (0-10)

B. Database used to find this title:_____________________________________________

Write down the search which gave you the title listed above:
________________________________________________________________________
________________________________________________________________________
________________________________________________________________________

Indicate the limiter boxes you checked if not obvious above:
________________________________________________________________________

Indicate the following in your search above:

C. Used the Boolean operator OR with synonyms; 1 point for each relevant synonym: _____ (0-3)

D. Used limiters such as truncation or wildcards, date range, quotation marks for
phrases, language, or scholarly/peer reviewed; 1 point for each type of limiter: _____ (0-4)

E. Used subject headings/terms/words dropdown; 1 point for each: _____ (0-3)

(repeat for Titles 2-3)


TOTAL for all Titles: _____ (0-60)
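As a hypothetical illustration (the topic, database, and exact syntax are my own invention, not part of the form), here is what a search earning points in categories C-E might look like when written out on the sheet:

```
Database: Academic Search Complete (example)

Search: ("social media" OR "social networking" OR Facebook) AND teen*

Limiter boxes checked: Scholarly (Peer Reviewed) Journals; Date range 2008-2012
Dropdown used: SU Subject Terms for the first search box

Scoring:
  C = 2  ("social networking" and Facebook are relevant synonyms ORed together)
  D = 4  (truncation teen*, quotation marks for phrases, date range, peer reviewed)
  E = 1  (one subject-terms dropdown used)
```

Asking students to transcribe their search at this level of detail would make the self-scores verifiable against the written search.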

Monday, September 17, 2012

Gamification 4 - CAP 115


Unexpected development: I noticed that after the game was finished last week, we had 3 teams with the max # of points, but the other teams revised their scoring during the discussion (erased & re-filled in their scores) to say they had won too…. Didn’t expect that. The prize was 3 extra-credit points vs. the 2 that everyone else got for participating, not that much of a difference. How to deal with it? I’m going to collect all of the sheets at the end of the game, then have the discussion, to foil cheating.


I also need to have the students report out more fully on which databases and terminology they used to find the relevant articles. One team found 6 relevant resources and another found 4, which was very surprising since the librarians who trialled the game struggled to find 1-3 resources! All in all, it was a successful first go.

Today I noticed that the students, while using fairly advanced techniques, weren't able to score themselves accurately, i.e., they reported in writing that they used certain techniques but weren't able to describe them when I asked how or if they did so. I wonder if I should have them come up to the instructor station. The scores weren't a good assessment of their practice. Maybe the scoring sheet needs to be more explicit: name the database, write down the exact search which yielded the good resource, including any "limiters" or narrowing menu items clicked.

Today was the 2nd section; there will be 2 additional sections next week. Stay tuned.... 

Friday, September 7, 2012

Libguides 2


I just read the following article on best practices for Libguides:
Ouellette, D. (2011). Subject Guides in Academic Libraries: A User-Centred Study of Uses and Perceptions. Canadian Journal of Information and Library Science, 35(4), 436-451.

Purpose and context: The articles I skimmed in the literature indicate that most Libguides users have the same mission/vision/definition problem we do – they are used for courses, disciplines, entire library websites, information literacy tutorials, library marketing, you name it.

Design of content: the focus groups I conducted in May replicated (almost) everything in the article, and I’ve modified my guides accordingly to the best of my ability. One of the things that is disabled in GVSU's Libguides, which students told me they wanted (& so did students in the article), is left navigation instead of tabs across the top.

The article has a lot of very useful student feedback on design.

Thursday, September 6, 2012

Libguides

What is the overall purpose of a library subject guide? I'm trying to write a philosophy rooted in the parlance of teaching and learning in the higher education arena. "Libguides" is the name of a specific template platform.

Libguides are a communication strategy formed of subject content and informed by the process of learning with information, all within a disciplinary context.

Libguides are "learning objects." They guide students in learning with information (whatever the format) while learning about the information practices within a discipline or profession. Libguides may be used to by learners to reflect on or frame the changes in the way they see, understand, conceptualize, or experience something (that which they are learning) and then communicate their new knowledge or perform tasks.

Okay, all that is the ideal. What my libguides currently do: frame the ways of finding different types of information within a specific field. They focus on navigating library tools, accessing info, with sideways links to using info ethically (citing, avoiding plagiarism) and evaluating. Is it possible to construct libguides which could come closer to the ideals above?

................

While looking at the library & information literacy literature on libguides, I was excited to come across this article, thinking that it might get me closer to my ideal:

Yelinek, K., Neyer, L., Bressler, D., Coffta, M., & Magolis, D. (2010). Using LibGuides for an information literacy tutorial. College & Research Libraries News, 71(7), 352-355, 

which portrays a tutorial that, while based on the ACRL “Information Literacy Competency Standards for Higher Education,” is not interactive (although each section includes self-grading multiple-choice quizzes). This is very similar to what we had as our Research Skills Tutorial and Library Virtual Tour in the early 2000's. I think it goes against the principles of deep learning and informed learning above, and it leaves me as frustrated with the "competency-based approach" as ever.