Friday, March 25, 2016

Working on my own racism

Today I attended a talk by Dr. Todd E. Robinson, author of A City within a City: The Black Freedom Struggle in Grand Rapids, Michigan, and Dr. Louis Moore, about historic racism in GR and how we can do better today at GVSU and in GR. One of our Financial Aid staff mentioned BL²END, which connects business leaders to "young professionals of color to learn, network, grow and become engaged in the Grand Rapids community." Today's speakers' advice for Caucasians was to go into the neighborhoods and shop at places we might not usually be in, to put ourselves into our discomfort zones, and to interact and communicate with people who are different from ourselves. I liked BL²END's Facebook page.

Earlier in the semester, I worked with a student whose thesis was that the Black Panthers were not a "radical" group in the socially negative sense. I was happy to connect him to the Articles/Databases page of the Library Subject Guide for African American Studies to find first-hand, participatory, and observational accounts from African Americans.

A couple of weeks ago, we watched The Black Panthers: Vanguard of the Revolution, a documentary that was fascinating and worth watching multiple times. I also just finished reading Rita Williams-Garcia's book One Crazy Summer, which follows three young girls in Oakland, CA, in 1968 as they interact with the Black Panthers. Both the film and the book showed me that there were really positive aspects of the Panthers, e.g., African American men cooking and serving free breakfasts, the summer camps for youngsters, and the power of 20 copies of a poster, printed on a kitchen printing press, to bring over 1,000 people to a rally.
 

Bias in search result algorithms

Matthew Reidsma recently posted about his research into search results from the Summon tool, and his work was referred to and built upon by both Barbara Fister and Dani B. Cook. Imperfections (a.k.a. bias) exist not only in information but also in the tools that are meant to help us find information. It isn't just a matter of "question everything" or "question authority" but of action, too: not just looking for the best (that of God) in everyone, but creating ways of finding the best in people and information. Creating and sharing meaning and purpose in community. Matthew's way of allowing all users to participate in scrutinizing and questioning judgments made by corporations, in order to improve information access, is truly radical information literacy (Drew Whitworth).


Friday, March 11, 2016

Advertising psychographics databases

The SRDS database is not really being supported well; it is a "failing" database as far as I am concerned. However, the data in its Local Market Audience Analyst reports comes from Nielsen Demographics, PRIZM Segmentation, and Experian Simmons Lifestyle data.

http://www.anderson.ucla.edu/rosenfeld-library/databases/business-databases-by-name/simplymap has the following info:


Nielsen Claritas PRIZM is the industry-leading consumer segmentation system which defines every U.S. household in terms of 66 demographic and behavior types or segments, identifying those consumers' likes, dislikes, lifestyles, and purchase behaviors.
Nielsen ConneXions
Nielsen Financial CLOUT
Nielsen P$YCLE
(See Databases folder, Nielsen Segmentation E-Resource Discovery Checklist)

Experian Simmons (Experian SimmonsLOCAL) is a powerful targeting and profiling system that provides insights into individual consumer behavior on a local level for all of America's 210 media markets, with 60,000+ data variables, including over 450 categories and 8,000 specific brands.

Excellent tutorials are available on the Business Blog of the Ohio University Libraries site: under "Browse by Topic," click the "SimplyMap" tag in the word cloud to retrieve all of the videos.

 

Wednesday, March 9, 2016

Text analysis



I found out that Neal Rogness in Statistics is doing research using lexical analysis (a mixed-methods approach that combines quantitative statistical analysis with qualitative surveys), and that he learned how to do text analysis coding mainly from Rachel Campbell in Sociology (whose mixed-methods approach combines quantitative statistical analysis with in-depth qualitative interviews and focus groups).

I read the article:

Kaplan, Jennifer J.; Haudek, Kevin C.; Ha, Minsu; Rogness, Neal; & Fisher, Diane G. (2014). "Using Lexical Analysis Software to Assess Student Writing in Statistics." Technology Innovations in Statistics Education, 8(1). Retrieved from http://eprints.cdlib.org/uc/item/57r90703


It compares using the expensive, licensed SPSS-TAS (Text Analysis for Surveys) software to the free, open-source LightSIDE to do text analysis of surveys. I found the article really helpful, as it described the iterative process of hand-coding student survey responses, defining rules, creating categories to represent ideas, building "libraries" of terms, phrases, and synonyms, correcting mistakes (false positives or negatives), and representing the analyses visually. TAS "supports the Grounded Theory method of qualitative research" and can create webmaps of connections between ideas (5).
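
To make that coding workflow concrete for myself, here is a minimal sketch of rule-based category coding in Python. It is only a sketch of the general idea, not how TAS or LightSIDE actually work internally, and the category names and term "libraries" below are invented for illustration, not the article's real rules.

```python
# A minimal, hypothetical sketch of rule-based coding of survey responses.
# The categories and term "libraries" below are made-up examples.
import re

TERM_LIBRARIES = {
    "variability": ["spread", "variation", "varies", "standard deviation"],
    "center": ["mean", "average", "median", "typical value"],
    "colloquial": ["normal", "weird", "random"],  # everyday, non-statistical usage
}

def code_response(response):
    """Return the set of categories whose terms appear in one survey response."""
    text = response.lower()
    categories = set()
    for category, terms in TERM_LIBRARIES.items():
        for term in terms:
            # Word-boundary match so "mean" does not fire on "meaning."
            if re.search(r"\b" + re.escape(term) + r"\b", text):
                categories.add(category)
                break
    return categories

if __name__ == "__main__":
    responses = [
        "The standard deviation tells you how much the data varies around the mean.",
        "It looked pretty random to me.",
    ]
    for r in responses:
        print(code_response(r), "<-", r)
    # In practice the libraries are refined iteratively: a person reviews a sample
    # of coded responses, finds false positives/negatives, and edits the rules.
```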

The article concludes with the goal of providing "real-time feedback to ... instructors about their students' understanding of ... concepts" (22) by using the refined TAS "libraries" "as a basis for subsequent analysis." This could be directly related to assessing teaching and student learning in various courses.
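
As a rough illustration of that feedback idea (and only an illustration, building on the hypothetical code_response() sketch above rather than the refined TAS libraries), a class-level summary could be as simple as counting which categories show up across a set of responses:

```python
# Hypothetical class-level summary built on the code_response() sketch above.
from collections import Counter

def summarize(responses):
    """Count how many responses in a class touch each category."""
    counts = Counter()
    for r in responses:
        for category in code_response(r):
            counts[category] += 1
    return {cat: f"{n} of {len(responses)} responses" for cat, n in counts.items()}

# e.g., summarize(class_responses) might show an instructor that only a handful
# of students used "variability" language when describing a distribution.
```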

The questions I have come from the webmaps. If students make more connections between categories when they respond using discipline-based (statistical) ideas about a particular word, and different and fewer connections between categories when they respond with more colloquial usage, does that indicate that coming to understand terminology in particular disciplinary ways correlates with, or causes, more connections between meanings? Do the webmaps represent our brains' synaptic connections?