Issues in Science and Technology Librarianship
Summer 2014
DOI:10.5062/F46H4FDD

[Refereed]

The Evolution of Library Instruction Delivery in the Chemistry Curriculum Informed by Mixed Assessment Methods

Meris A. Mandernach
Head of Research Services
The Ohio State University Libraries
Columbus, Ohio
mandernach.1@osu.edu

Yasmeen Shorish
Science Librarian
Libraries and Educational Technologies
James Madison University
Harrisonburg, Virginia
shorisyl@jmu.edu

Barbara A. Reisner
Professor
Department of Chemistry and Biochemistry
James Madison University
Harrisonburg, Virginia
reisneba@jmu.edu

Abstract

As the information landscape continues to evolve, the information literacy expectations for chemistry students also change. This article examines transformations to an undergraduate chemistry course that focuses on chemical literature and information literacy and is co-taught by a chemistry professor and a chemistry librarian. The article also describes results from assessment of both content knowledge and student perception, and discusses how the assessment was used to inform changes to the course. This type of student assessment and evaluation has not previously been examined in the delivery of required undergraduate chemistry information courses. Because this course has used in-person, online, and blended delivery methods, the article describes what students can learn from online modules and where they need more intensive classroom instruction.

Introduction

Over the last 20 years, there has been a dramatic change in how students acquire the chemical information skills needed for research or employment. In the past, searching was the most critical skill. Searching remains important, but students now also need strong filtering skills, as the world has moved from information scarcity to information overload. While students feel comfortable with technology, they still need to learn how to differentiate scholarly from popular articles and how to critically evaluate the research claims presented in scholarly articles. While many chemistry courses teach information-seeking skills, few have partnered a librarian with a chemistry faculty member to team teach a course, develop an assessment tool to measure learning gains, and understand student perception of the acquired skills. This paper outlines the evolution of such a course at James Madison University.

James Madison University (JMU) is a comprehensive public institution with over 18,000 undergraduates in Harrisonburg, Virginia. The Department of Chemistry & Biochemistry, housed within the College of Science and Mathematics, awards B.S. degrees in Chemistry and Biophysical Chemistry and has a strong undergraduate research culture. No graduate degrees are offered in chemistry. The Department is an American Chemical Society (ACS) Certified Program and has graduated approximately 30-40 majors each year for the past five years. The Literature and Seminar sequence, required for all students earning a B.S. in Chemistry, consists of two one-credit courses (CHEM 481 and CHEM 482). There are no prerequisites for the courses, but students typically take CHEM 481 after completing a year of General Chemistry (CHEM 131 and CHEM 132), a year of Organic Chemistry (CHEM 241 and 242), a semester of Inorganic Chemistry (CHEM 270), the Special General Chemistry Laboratory I and II (CHEM 135L and CHEM 136L), and the sophomore-level Integrated Inorganic/Organic Laboratory I and II (CHEM 287L and 288L). Most students complete the Literature and Seminar courses during their junior year, but 10-20% wait until their senior year. Table 1 shows the progression of courses leading up to the Literature and Seminar sequence.

Table 1: Chemistry major sequence of courses

Year                    | Fall            | Spring
First Year              | CHEM 131 & 135L | CHEM 132 & 136L
Sophomore Year          | CHEM 241 & 287L | CHEM 242, 270 & 288L
Junior (or Senior) Year | CHEM 481        | CHEM 482

As an ACS Certified Program, the JMU Chemistry Department is committed to meeting the information literacy standards identified by the society. Chemical information literacy is a well-established domain within the ACS Division of Chemical Information (CINF), and the standards detail the specific skills that chemistry majors are expected to have (Chemical Information Skills 2012). To that end, in 2007 CINF and the Special Libraries Association Chemistry Division first issued "Information Competencies for Chemistry Undergraduates," a document enumerating skills expected of chemistry undergraduates related to finding, using, and communicating chemical and scientific literature (Craig and Maddux 2007). These competencies were updated in 2011, moved to a wikibook format in 2012, and last revised in 2013 (Information Competencies for Chemistry Undergraduates 2012). As the expectations for information-literate chemistry students evolved, the Literature and Seminar sequence changed to respond to these competencies.

This article presents the evolution of the instructional team and delivery methods for the first semester of the Literature and Seminar course (CHEM 481). It describes how assessment can be used to identify student strengths and gaps and to focus course instruction, and it highlights the critical role of teamwork and communication between a librarian and chemistry faculty, the development of an assessment tool, and the evolution of assignments to address student needs.

Literature Review

The literature contains many examples that pair librarian expertise with departmental faculty delivery of information literacy courses. Developing personal relationships with an eye toward integrated instruction has long been a key factor in successful liaison relationships (Black et al. 2001). A scan of the chemistry instruction literature suggests that the most successful courses are those where the chemistry librarian and chemistry faculty member collaborate (Jensen et al. 2010; Somerville & Cardinal 2003). While many of these examples point to specific assignment design (Calderhead 1998) or participation in a full course, few attempt to address curriculum development on a more holistic level and use formal assessment measures to drive instructional decisions.

Some chemistry librarians teach semester-long courses, though many of these are offered at the graduate level (Emmett & Emde 2007; Currano 2005). A recent survey of ARL institutions examined the type of instruction and assessment offered at the graduate level (Fong 2014); many of those surveyed indicated that formal assessment did not occur and that instruction centered on traditional database-searching topics. Marion Peters details UCLA's embedded approach to chemistry information literacy instruction, in which the chemistry librarian worked with students throughout their time as undergraduate chemistry majors (2011); however, this approach relied on anecdotal evidence rather than a formal assessment technique. At the undergraduate level, there are examples where students receive chemistry library instruction early in the major (Gawalt & Adams 2011; Locknar et al. 2012). Other times assignments are constructed for specific classes (Ferrer-Vinent 2013) or seminar series (Garritano 2007). Still others take the form of application-based science information literacy instruction in general (Brown & Krumholz 2002) and for chemistry courses in particular (Somerville & Cardinal 2003; Walczak & Jackson 2007).

Modular approaches have also been used in chemistry classes: online tutorials allow information literacy concepts to be plugged in quickly and used in class or outside of class for self-paced student learning (Aydelott 2007). While these modules are aligned with general information literacy standards or subject-specific priorities, student interaction with and satisfaction with them is often not assessed. Some chemistry information literacy courses use pre-/post-test techniques to examine skills and/or attitudes related to the instruction. For example, Emmett & Emde conducted interviews as a means of pre-/post-assessment of content and saw evidence of skill development during interview observation (2007).

Evolution of the Course

Prior to 2000, chemistry faculty teaching Literature and Seminar had limited experience with, training in, and access to online searching tools. Typically, a subject librarian would come to two or three class meetings and deliver a traditional bibliographic instruction session. At JMU, from 2000-2006 the librarian partnered with new faculty hires, who had prior experience with and some instruction in online searching, but the course still focused on tools, such as Science Citation Index and Chemical Abstracts, that were solidly within the paper era. Since 2006 the librarians and the chemistry faculty, now fluent with digital chemical information tools, have developed a deep collaboration that includes curriculum development, co-teaching of the course, and the development of assessment tools. CHEM 481 has evolved since 2006 through the work of three librarians (MM, KV, YS) and three chemistry faculty (BR, KM, KL). The course has two major components: weekly seminar attendance and a classroom experience where students engage with databases and research tools, read the literature, and develop a research project. Students are required to attend 10 seminars in the College of Science and Mathematics over the course of the semester; the classroom portion of the course consists of one 60-minute meeting each week in a computer laboratory or classroom. This narrative details the evolution of the classroom component of the course since 2006.

2006. This was the first year that the course was redeveloped and co-taught by a chemistry faculty member (BR) and the chemistry librarian (MM). The number of sessions the librarian taught increased from three to seven of the 14 sessions, and the librarian was responsible for assignment design and grading for those classes. In 2006, the course was split between print and electronic resources, with an emphasis on databases such as SciFinder Scholar and Web of Science. The course had one introductory class, six classes on searching the literature taught by the librarian, and seven classes, taught by the chemistry faculty member, covering reading the literature and understanding how chemistry is communicated. The librarian constructed a review game to help students summarize the content covered during the first half of the semester; some of the questions developed for this activity would lead to the development of the assessment tool (see the Evolution of the Assessment Test section). In 2006, two new assignments, on database searching and constructing a bibliography, were added, and two existing ones, an assignment on handbooks and the introduction to the libraries, were significantly altered.

2007. The course continued to evolve as the chemistry librarian (MM) co-taught with a second chemistry faculty member (KM). In 2007, additional time was spent highlighting the advanced features of SciFinder Scholar as the school began subscribing to its substructure database. New components were also added to highlight and discuss the changing role of communication within the field and how to stay up to date using RSS and blogs. One new assignment was created in which students crafted entries for the fictional "Dictionary of Remarkable Chemistry," detailing chemical innovations discovered in the past year. Additionally, the molecule searching, handbooks, and bibliography assignments were revised.

2008-2010. During these years the course was co-taught by the chemistry librarian (MM) and a third chemistry faculty member (KL). In 2008, a new assignment on the chemistry of everyday items replaced the "Dictionary of Remarkable Chemistry" assignment, based on student feedback, and the bibliography and handbooks assignments were revised. Also in 2008, an assessment was piloted (post-test only) to explore question content. The assessment questions were developed by the chemistry librarian, with some questions borrowed from the general information literacy test administered at JMU, a 60-item multiple-choice test developed by librarians and assessment specialists (Cameron et al. 2007). In 2009, two assignments, one on databases and an in-class assignment on molecule searching, were condensed into a single assignment. Two additional assignments were revised to address performance issues, and a formal assessment test (pre- and post-test) was developed. In 2010, course enrollment doubled, resulting in two sections being offered, and many group assignments and projects were instituted in an attempt to manage the grading load. Students continued to complain about the workload of this one-credit required course. They also indicated that reference material and database instruction would be valuable during the sophomore laboratory sequence, when they must look up this information to prepare for lab. This suggests that despite the students' regular concerns about workload, they found the information literacy components useful and relevant to their other chemistry coursework.

2011. This year the librarian (MM) worked with the chemistry faculty member (BR) to revise the course over the summer. The goal of this revision was two-fold: to redevelop the course with an outcomes-based approach and to align it more closely with both the American Chemical Society (ACS) Committee on Professional Training (CPT) requirements (Chemical Information Skills 2012) and the second edition of Information Competencies for Chemistry Undergraduates: The Elements of Information Literacy from the Special Libraries Association Chemistry Division/ACS Division of Chemical Information (Craig and Maddux 2007). In conjunction with the switch to an outcomes-based course, the instructors attempted to move the course entirely online, which would allow for modular and repeated instruction of particular library tools as they were encountered in the curriculum. During this semester the class met face-to-face only three times. The pre-test was given during the first meeting. The second meeting was a discussion about careers and preparing for life after graduation. During the third meeting, science funding and the publication process were discussed and the post-test was administered. As the core content of the class moved online, several self-paced tutorial modules were developed: Introduction to the Library; Chemical Identifiers, Handbooks, and Spectroscopy Resources; Publication and Peer Review; and Searching in SciFinder and Scopus, each of which included quizzes that had to be completed before students progressed to the next module. Because they were now online, two modules (Chemical Identifiers, Handbooks, and Spectroscopy Resources; Searching in SciFinder and Scopus) were deployed in the sophomore lab, where students had indicated this information would be more useful. During 2011 (and the subsequent spring semester), these modules were implemented simultaneously in the sophomore lab (CHEM 287L and CHEM 288L) and CHEM 481.

To solicit feedback about content and online delivery methods, students filled out Google surveys at the end of each module, self-reporting their perception of the amount of instruction in each module. The percentage reporting each choice (too little detail = 1; just right = 2; too much detail = 3) and an average response using this number designation are presented in Table 2.

Table 2. Student perceptions on the level of instruction in the online course (N = 30).

Topic                          | Average | Too little detail | Just right | Too much detail
Introduction to the library    | 2.2     | 10%               | 63%        | 27%
CAS numbers                    | 2.0     | 7%                | 87%        | 7%
MSDS                           | 2.0     | 20%               | 60%        | 20%
Hill notation                  | 1.8     | 30%               | 63%        | 7%
Handbooks & encyclopedias      | 1.9     | 20%               | 73%        | 7%
Spectroscopy                   | 1.8     | 23%               | 73%        | 3%
Publication process            | 2.1     | 10%               | 73%        | 17%
Peer review                    | 2.0     | 7%                | 83%        | 10%
Citing sources                 | 2.0     | 13%               | 77%        | 10%
Topic searching in SciFinder   | 1.9     | 13%               | 83%        | 3%
Author searching in SciFinder  | 1.9     | 10%               | 87%        | 3%
Author searching in Scopus     | 1.9     | 10%               | 90%        | 0%
Structure finding in SciFinder | 2.0     | 7%                | 87%        | 7%
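
The per-topic averages in Table 2 follow directly from this coding. As an illustration, a minimal Python sketch that reproduces two of the averages from the response percentages (values copied from the table; the 1/2/3 weighting is the coding described above):

    # Response coding from the survey: too little detail = 1, just right = 2,
    # too much detail = 3. Response shares below are copied from Table 2.
    responses = {
        "Hill notation": {"too little": 0.30, "just right": 0.63, "too much": 0.07},
        "Peer review": {"too little": 0.07, "just right": 0.83, "too much": 0.10},
    }
    weights = {"too little": 1, "just right": 2, "too much": 3}

    for topic, shares in responses.items():
        # Weighted average of the coded responses.
        avg = sum(weights[choice] * share for choice, share in shares.items())
        print(f"{topic}: {avg:.1f}")  # Hill notation -> 1.8, Peer review -> 2.0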

Students also voiced their opinions on the delivery method. A total of 12 students (40%) thought that online delivery was the best method, 8 (27%) preferred class with an instructor present, and the remaining 10 (33%) thought either method was acceptable. Thus most students (73%) were either content with or preferred the online format.

More importantly, students in CHEM 481 felt that this material belonged in the sophomore lab sequence, where students complete a scaffolded research project that requires both laboratory and literature research (Amenta & Mosbo 1994). When asked to identify the ideal location for this material in the curriculum, 80% (N = 35) said the information on Chemical Identifiers, Handbooks, and Spectroscopy Resources belonged in CHEM 287L; 75% (N = 32) believed Searching SciFinder and Scopus also belonged in the sophomore year. Finally, the instructors polled students about whether the content covered in specific modules was duplicative (Table 3). While some students believed the material was duplicated, a majority felt that the review was useful and that they knew more about these topics than they did in the sophomore lab.

Table 3. Student self-reporting of learning. Students were told to check all statements that applied.

Statement                                                        | Chemical Identifiers, Handbooks and Spectroscopy (N = 35) | Searching SciFinder and Scopus (N = 32)
I did not learn any new material in this module.                 | 3%                                                         | 9%
This material duplicated what was learned in the Integrated Lab. | 31%                                                        | 22%
I found review of information about _____ useful.                | 83%                                                        | 75%
I know more about _____ than when I took the Integrated Lab.     | 77%                                                        | 63%

Despite the overwhelmingly favorable attitude toward online delivery reported in the survey, students encountered significant problems when completing their literature review project in CHEM 482, the second semester of the Literature and Seminar sequence. They were unable to identify and select the most appropriate sources, based on criteria provided in the first semester, for a literature review project in the modern chemical sciences. They also felt that having only half a semester to conduct research for their paper was not enough time. Based on this feedback and the poor performance on the literature review project, the chemistry librarian and the chemistry faculty member decided that a face-to-face format would be more effective for portions of the course and that instruction should center on preparing students for the literature review project.

2012. Rather than switching completely back to face-to-face instruction, a new chemistry librarian (YS) and chemistry faculty member (BR) transitioned the course into a hybrid model in which students reviewed the online modules via the course management system (students in 2012 had completed these modules in CHEM 287L, the sophomore laboratory) and attended in-person classes. The modules, developed in 2011, were placed on the course LibGuide (http://guides.lib.jmu.edu/chem481) and made available to all students. Additionally, the instructors related the activities throughout the semester to the literature review project. Assignments focused on reading and summarizing the literature, and the searching exercises were framed in the context of the reading and writing activities. To meet the outcomes developed in 2011, the instructors spent additional class time on analyzing the literature, the importance of citation management software, and understanding PubMed. A class period was devoted to data management, an emerging area of importance for the information-literate chemistry student.

2013. The course was again co-taught by chemistry faculty (BR) and the chemistry librarian (YS), with additional contributions from another librarian (KV). Based on the assessment test analysis, past semester projects, and student feedback, the course was revised to provide a more comprehensive introduction to the literature. In 2013, students were introduced to a review article early in the semester, and subsequent literature reading assignments of communications and full research papers were drawn from the same topic area. By starting with a review article and focusing on a single topic, students learned to follow citations in a comprehensive and holistic manner. To help students identify topics for their research paper, which they self-report as one of the most challenging aspects of the class, they completed short reflection assignments on chemistry articles in the news. The instructors scaffolded information about the importance of proper citation styles and the process of scientific communication as the students worked through the literature.

Other changes included course time set aside for coverage of patents, an SLA/CINF area that was not being addressed elsewhere in the chemistry curriculum. While both the ACS CPT requirements and the SLA/CINF standards stress the importance of crystallography and spectra, these topics are sufficiently covered elsewhere in the chemistry curriculum and will continue to be excluded from CHEM 481 and the assessment tool. Finally, the in-class data management assignment was revised into an interactive, team-based learning exercise.

Evolution of the Assessment Test

2009. The assessment test, hereafter referred to as the chemistry information literacy test (ILT), consisted of 24 items in 2009. Each item mapped to one of five areas: selecting and searching reference sources and databases, reading the chemical literature, peer review and literature types, citing sources, and understanding the JMU Libraries. A pre-/post-test format was recommended by JMU's Center for Assessment and Research Studies (CARS) to examine how well students retained information presented during the library instruction portion of the course. In 2009 the pre-test was administered during the first class and the post-test in the eighth week of the semester, at the end of the librarian-taught portion of the course. CARS conducted an item-by-item analysis of the questions and identified items for revision. While students improved from the pre-test to the post-test, they still missed 35% of the items.

2010. In the second year, several questions were revised to improve item construction, and items were removed if they were answered correctly prior to instruction. The pre- and post-tests were again given during the first and eighth weeks of the course, respectively. Students demonstrated gains similar to those in 2009, still missing 35% of the items.

2011. Ten additional questions were added to gather information about students' comfort level with specific tools or skills covered in the course. The pre-test continued to be administered in week one of CHEM 481, but the post-test was given during the spring semester in CHEM 482. This change sought to determine whether students retained the knowledge acquired during CHEM 481 while working on the paper and presentation in CHEM 482. Students performed higher on the pre-test than in years past but demonstrated a slight decrease in gains (15% to 13%) from pre- to post-test, missing 34% of the items.

2012. Minor revisions were made to the ILT. Eight questions that nearly all students answered correctly, or that were confusing or redundant, were replaced with positively written questions (i.e., not "all except...") focused on data management or resources. Again, the pre-test was administered in the first week of CHEM 481 and the post-test in the middle of CHEM 482.

Chemistry Information Literacy Test (ILT) Results: Student Performance on SLA Standards

Since 2009, chemistry majors have completed the pre- and post-test for the ILT. In academic year 2012-2013, students took between 5 and 10 minutes to complete both the pre- and post-tests, including the comfort-level items described below; these times are representative of other years. Descriptive statistics for the test are presented in Table 4. A dependent-samples t-test was run on the data. In every year, the mean from the pre-test was statistically significantly lower than the mean from the post-test. The confidence intervals provide a range of values within which the actual mean could fall.
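
A minimal sketch of this analysis in Python, assuming access to per-student paired scores (the arrays below are hypothetical stand-ins; the study's raw data are not reproduced here):

    import numpy as np
    from scipy import stats

    # Hypothetical paired ILT scores for illustration only.
    pre = np.array([11, 13, 10, 12, 14, 12, 11, 15, 13, 12])
    post = np.array([15, 16, 14, 15, 17, 16, 14, 18, 16, 15])

    # Dependent-samples (paired) t-test: post vs. pre.
    t_stat, p_value = stats.ttest_rel(post, pre)
    print(f"t = {t_stat:.2f}, p = {p_value:.4f}")

    def mean_ci(scores, confidence=0.95):
        """Confidence interval for a mean, using the t distribution."""
        half_width = stats.sem(scores) * stats.t.ppf((1 + confidence) / 2, len(scores) - 1)
        return scores.mean() - half_width, scores.mean() + half_width

    print("pre-test 95% CI:", mean_ci(pre))
    print("post-test 95% CI:", mean_ci(post))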

Table 4. Means, standard deviations, and 95% confidence intervals for the pre- and post-tests, 2009-2012/13.

Administration    | Mean  | Standard Deviation | 95% Confidence Interval
2009 Pre-test     | 12.12 | 2.91               | 11.62-13.11
2009 Post-test    | 15.70 | 2.70               | 14.78-16.62
2010 Pre-test     | 11.93 | 3.08               | 11.03-12.82
2010 Post-test    | 15.60 | 3.00               | 14.73-16.47
2011/12 Pre-test  | 12.70 | 2.91               | 11.61-13.79
2011/12 Post-test | 15.90 | 2.16               | 15.10-16.70
2012/13 Pre-test  | 12.36 | 2.59               | 11.45-13.27
2012/13 Post-test | 15.64 | 2.23               | 14.84-16.44

In 2013, an effect size was calculated for the difference between the pre- and post-test means in the 2012-13 cohort to gauge the meaningfulness of the statistically significant difference in test scores. The within-groups effect size (Cohen's d) was 1.69. This is a large effect size and indicates that the difference in scores is practically significant (a 1.69 standard deviation increase from pre-test to post-test). Students scored about 65% on the pre-test and about 82% on the post-test. These results are consistent with results from 2009-2012.
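
One common way to compute a within-groups (repeated-measures) Cohen's d is to divide the mean of the paired differences by the standard deviation of those differences; whether CARS used this exact variant is an assumption. A minimal Python sketch with hypothetical scores:

    import numpy as np

    def cohens_d_paired(pre, post):
        """Within-groups effect size: mean paired difference / SD of the differences."""
        diff = np.asarray(post) - np.asarray(pre)
        return diff.mean() / diff.std(ddof=1)

    # Hypothetical paired scores for illustration only.
    pre = [12, 9, 14, 11, 15, 10, 13, 12]
    post = [16, 13, 15, 14, 18, 12, 17, 14]
    print(f"d = {cohens_d_paired(pre, post):.2f}")  # a large effect for these data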

To understand student performance, student scores on the ILT were compared to preliminary performance expectations set by two librarians (MM, YS) and two chemistry faculty (BR, BB) (Table 5). The chemistry faculty member (BR) asked the librarians who had co-taught the class and a colleague who teaches the sophomore lab (BB) to participate in setting the performance expectations. The mapping of questions to the SLA/CINF learning objective areas is also provided.

Table 5. Student performance on the ILT administered in CHEM 482 in Spring 2013 vs. initial faculty expectations. The difference column represents how students perform relative to faculty expectations. A negative score indicates that students are performing below faculty expectations.

SLA Obj. | Item | Student Performance (%) | Faculty Expectation (%) | Diff.
1.2f | 1. Scholarly journal articles have typically gone through a quality control process called: | 97 | 91 | 6
1.2ab | 2. Which of the following would be classified as grey literature? | 67 | 64 | 3
2.1 | 3. Which reference source is the best source to find out whether a drug is toxic to humans? | 90 | 84 | 6
1.2c | 4. The proper way to cite the Journal of the American Chemical Society using the ACS style is: | 83 | 86 | -3
1.2b | 5. Select the correct pairing of resource type and resource? | 57 | 65 | -8
2.2a | 6. Which strategy would retrieve the most citations? | 93 | 91 | 2
2.2a | 7. Which strategy would retrieve the fewest citations? | 93 | 91 | 2
2.2a | 8. Which database is best for finding the citation statistics for an author? | 83 | 94 | -10
1.2a | 9. In a research article, the review of related literature is included in the: | 83 | 88 | -5
3.4 | 10. Identify the best information resource to determine the toxicity and safety handling procedures... | 97 | 83 | 14
2.4b | 11. The best database for searching for chemical structures is: | 90 | 89 | 1
4.1 | 12. Which example of a file naming protocol most closely follows data management best practices? | 100 | 75 | 25
1.2b/2.1 | 13. In Chemistry journals a review article is: | 97 | 89 | 8
1.2c | 14. What kind of publication is indicated by the following reference? / Jordan, R.B. Reaction mechan... | 87 | 90 | -3
1.1b | 15. The best way to find scholarly journal articles on a specific research topic is to: | 90 | 84 | 6
1.2c | 16. In the following citation what does the number 305 refer to? / Takahaski, T. The Fate of Industri... | 57 | 79 | -22
4.1b | 17. RefWorks is a key tool for research for all of the following reasons EXCEPT: | 93 | 78 | 16
4.1a | 18. When citing sources you should include: | 50 | 75 | -25
2.1 | 19. Which resource would be the best starting point to research a topic completely unfamiliar to you... | 57 | 78 | -21

For many questions, student performance approaches or exceeds faculty expectations (< 10% difference). Students consistently underperform on three items: identifying the volume number in a journal citation (Q16), what to cite in a research paper (Q18), and how to begin researching an unfamiliar topic (Q19). Although disappointing, none of these trends is surprising. Most students have not used print journals and do not understand the difference between volume and issue. Anecdotally, students seem to find articles by direct linking from a research database on the library web site or from a search engine; they rarely go to a journal web site to look for a specific article. Students have easy access to information online, and a general search engine satisfies most of their general information needs. The other two problems are larger issues in research writing. Most students begin higher education not understanding when to cite resources, and it takes time and practice to develop the skills to cite knowledge appropriately. While students are comfortable using databases, their challenge is choosing the appropriate way to learn about topics holistically and to cite information appropriately. In the next iteration of the class, activities will be refined to further develop these skills.

Self-Reported Student Learning Gains

In 2012, students taking the course had been exposed to similar material during CHEM 287L. As part of the assessment, the instructors examined where students were making gains in information literacy skills by looking at a matched set of students who completed CHEM 287L in Fall 2011, CHEM 481 in Fall 2012, and CHEM 482 in Spring 2013 (Fall 2012 and Spring 2013 had additional enrollees due to transfer students and seniors taking the course out of sequence). This represents the expected progression of courses for chemistry majors. Gains in self-reported comfort level with specific resources are shown in Table 6.

Table 6. Self-reported comfort level (mean and standard deviation; 4-point scale: 1 = never used, 2 = used, but not comfortable, 3 = comfortable, 4 = expert) of chemistry majors with resources. Fall 2011 data are from students at the start of CHEM 287L, Fall 2012 data are from students enrolled in CHEM 481, and Spring 2013 data are from students enrolled in CHEM 482.

Comfort Item               | Fall 2011 M (SD), N = 20 | Fall 2012 M (SD), N = 31 | Spring 2013 M (SD), N = 30
SciFinder                  | 1.25 (.55)               | 2.48 (.85)               | 3.07 (.58)
Scopus                     | 1.20 (.52)               | 2.13 (.81)               | 3.53 (.63)
RefWorks                   | 1.15 (.37)               | 1.42 (.62)               | 3.13 (.73)
Printed Handbooks          | 1.90 (.85)               | 2.26 (.89)               | 2.73 (.64)
Online Handbooks           | 1.85 (.75)               | 1.90 (.79)               | 2.80 (.66)
PubMed                     | 1.35 (.75)               | 1.90 (.79)               | 2.97 (.85)
Structure Databases        | 1.60 (.82)               | 2.26 (.82)               | 3.03 (.67)
Google Scholar             | 1.90 (.79)               | 2.29 (.90)               | 2.80 (.93)
Structure Drawing Programs | 1.55 (.95)               | 3.00 (.89)               | 3.20 (.48)
MSDS                       | 2.15 (.81)               | 2.77 (.81)               | 3.13 (.51)

Comfort increases for all items, as would be expected since all of these tools are used in coursework in the major. However, the largest gains follow direct classroom instruction. For example, the sophomore year places a heavy emphasis on structure drawing programs, but this tool is not covered in CHEM 481 or other parts of the junior curriculum; hence, there is little improvement during the junior year (Fall 2012-Spring 2013). RefWorks is covered only in CHEM 481, and most gains for this tool are observed between CHEM 481 and 482, from Fall 2012 to Spring 2013. Direct instruction on SciFinder, Scopus, handbooks, and structure databases occurs in both CHEM 287L and CHEM 481, i.e., in both Fall 2011 and Fall 2012.

The within-groups effect size (Cohen's d) was significant for all items as students progressed from the sophomore lab through the end of the Literature and Seminar sequence. These data, along with the differences in comfort levels, are presented in Table 7. The gains made in the timeframes corresponding to instruction in the sophomore lab (Fall 2011-Fall 2012) and the Literature and Seminar sequence (Fall 2012-Spring 2013) are consistent with where this material is covered in the curriculum.

Table 7. Differences for a matched set of students as they proceed through the chemistry curriculum. Fall 2011-Fall 2012 represents gains made during the sophomore year; Fall 2012-Spring 2013 represents gains made from instruction in Literature and Seminar; Fall 2011-Spring 2013 shows gains made over the course of the undergraduate career.

Comfort Item               | Fall 2011-Fall 2012 (N = 19): Dif. / p / d | Fall 2012-Spring 2013 (N = 28): Dif. / p / d | Fall 2011-Spring 2013 (N = 20): Dif. / p / d
SciFinder                  | 1.32 / <.01 / 1.76                         | 0.50 / <.01 / 0.69                           | 1.75 / <.01 / 3.08
Scopus                     | 0.79 / <.01 / 1.11                         | 1.43 / <.01 / 1.97                           | 2.35 / <.01 / 3.99
RefWorks                   | 0.11 / .49 / 0.20                          | 1.79 / <.01 / 2.65                           | 2.00 / <.01 / 3.26
Printed Handbooks          | 0.21 / .43 / 0.24                          | 0.43 / .03 / 0.55                            | 0.75 / .02 / 1.03
Online Handbooks           | 0.11 / .63 / 0.14                          | 0.89 / <.01 / 1.22                           | 0.85 / <.01 / 1.22
PubMed                     | 0.37 / .03 / 0.48                          | 1.11 / <.01 / 1.35                           | 1.55 / <.01 / 1.91
Structure Databases        | 0.63 / .04 / 0.77                          | 0.71 / <.01 / 0.95                           | 1.30 / <.01 / 1.77
Google Scholar             | 0.26 / .17 / 0.30                          | 0.39 / .02 / 0.43                            | 0.65 / <.01 / 0.74
Structure Drawing Programs | 1.68 / <.01 / 1.84                         | 0.11 / .56 / 0.15                            | 1.65 / <.01 / 2.34
MSDS                       | 0.84 / <.01 / 1.04                         | 0.36 / .02 / 0.53                            | 1.05 / <.01 / 1.63

In addition to the ILT, the chemistry department assesses majors using additional measures that examine both attitudes and proficiency with concepts. The Student Assessment of Learning Gains (SALG) instrument, developed in 1997 as part of the National Science Foundation ChemLinks and Modular CHEM consortia, focuses on the degree to which a course and specific aspects of a course have contributed to student learning (Seymour et al. 2000). Information on its validity and reliability can be found on the SALG web site (About SALG 2014). Students self-report answers on a Likert scale where 1 = no help, 2 = a little help, 3 = moderate help, 4 = much help, and 5 = great help. Students spent between 10 and 30 minutes completing the SALG and received extra credit for completing this assessment. The prompts asked students to reflect on gains made in understanding (questions 1-5), gains in skills (questions 6-8), and the utility of class resources (questions 9-11); data are reported in Table 8.

Table 8: Selected SALG data of self-reported student gains.

Item                                                                                | Average (N = 62) | Standard Deviation
1. Finding information that can be found in handbooks and other printed resources   | 3.97             | 0.92
2. Finding chemical information from online databases                               | 4.52             | 0.78
3. Using citations                                                                  | 3.98             | 1.10
4. Data management                                                                  | 3.40             | 1.22
5. Impact factors                                                                   | 3.74             | 1.04
6. Finding articles relevant to a particular problem in professional journals or elsewhere | 4.29      | 0.82
7. Identifying good resources                                                       | 4.21             | 0.75
8. Confidence that you can find chemical information                                | 4.26             | 0.70
9. The library course guide                                                         | 3.48             | 1.16
10. JMU tutorials on library resources                                              | 3.26             | 1.20
11. Tutorials provided by database vendors (e.g., Scopus, SciFinder, etc.)          | 3.53             | 1.17

It is clear that instruction in Literature and Seminar affects student comfort and familiarity with literature research tools and finding chemical information. Since gains are seen after both semesters of instruction, the authors believe that cycling through these ideas multiple times is valuable and may improve learning gains.

Conclusion

Assessment measures, student feedback, and direct collaboration between a librarian and a chemist were key to developing a chemistry information literacy course that satisfies the requirements of the ACS CPT and SLA while meeting the instructional needs of the students. These measures have allowed the Literature and Seminar course at JMU to respond to the changing information landscape. The course has evolved more dramatically since 2006 due to greater collaboration between the subject librarians and chemistry faculty members, and since 2009 with the use of a formal assessment tool to measure student performance and refine instruction. While students favored online delivery for handbook, database, and literature search instruction, and performed the same as students receiving face-to-face instruction, they were unable to critically evaluate the literature from their searches, cite references appropriately, or identify the best sources for future literature projects. With a hybrid approach, students are able to process information on their own as well as face-to-face, and they have demonstrated that they retain skills acquired through both methods. While the course will certainly continue to evolve, the successful partnership between the chemistry faculty members and the librarians allows for agile and responsive curriculum development based on assessment and in-class feedback.

References

About SALG - Student Assessment of Learning Gains. [Internet]. [Accessed August 13, 2014]. Available from: http://salgsite.org/about

Amenta, D.S. and Mosbo, J.A. 1994. Attracting the new generation of chemistry majors to synthetic chemistry without using pheromones: a research-based group approach to multistep syntheses at the college sophomore level. Journal of Chemical Education 71 (8):661-664.

Aydelott, K. 2007. Using the ACRL information literacy competency standards for science and engineering/technology to develop a modular critical-thinking-based information literacy tutorial. Science & Technology Libraries 27(4):19-42.

Black, C., Crest, S., and Volland, M. 2001. Building a successful information literacy infrastructure on the foundation of librarian-faculty collaboration. Research Strategies 18(3):215-25.

Brown, C. and Krumholz, L.R. 2002. Integrating information literacy into the science curriculum. College and Research Libraries 63(2):111-23.

Calderhead, V. 1998. Reflections on information confusion in chemistry information learning: The meaning of the shift from library instruction to information literacy. Research Strategies 16(4):285-99.

Cameron, L., Wise, S.L., and Lottridge, S.M. 2007. The development and validation of the information literacy test. College and Research Libraries 68(3):229-37.

Chemical Information Skills. [Internet]. [Updated 2012 March]. American Chemical Society Committee on Professional Training. Available from: http://www.acs.org/content/dam/acsorg/about/governance/committees/training/acsapproved/degreeprogram/chemical-information-skills.pdf

Craig, C. and Maddux, L. Information Competencies for Chemistry Undergraduates: The Elements of Information Literacy. [Internet]. [Updated 2007 January]. Special Libraries Association, Chemistry Division, Ad Hoc Committee on Information Literacy. Available from: http://chemistry.sla.org/dchearchive/il/cheminfolit2007.pdf

Currano, J.N. 2005. Learning to search in ten easy steps: A review of a chemical information course. Journal of Chemical Education 82(3):484-488.

Emmett, A. and Emde, J. 2007. Assessing information literacy skills using the ACRL standards as a guide. Reference Services Review 35(2):210-29. doi:10.1108/00907320710749146

Ferrer-Vinent I.J. 2013. Using in-class structured exercises to teach SciFinder to chemistry students. Science & Technology Libraries 32(3):260-73.

Fong, B.L. 2014. Searching for the formula: how librarians teach chemistry graduate students research skills. Issues in Science and Technology Librarianship 75. [Internet]. Available from: http://www.istl.org/14-winter/refereed1.html DOI: 10.5062/F4J1014M

Garritano, J.R. 2007. Ice cream seminars for graduate students: Imparting chemical information literacy. Public Services Quarterly 3(3-4):53-70.

Gawalt, E.S. and Adams, B. 2011. A chemical information literacy program for first-year students. Journal of Chemical Education 88(4):402-7.

Information Competencies for Chemistry Undergraduates: The Elements of Information Literacy. [Internet]. [Updated 2012 July]. Special Libraries Association, Chemistry Division and American Chemical Society Division of Chemical Information. Available from: http://en.wikibooks.org/wiki/Information_Competencies_for_Chemistry_Undergraduates

Jensen, D., Narske, R., and Ghinazzi, C. 2010. Beyond chemical literature: Developing skills for chemical research literacy. Journal of Chemical Education 87(7):700-2.

Locknar, A., Mitchell, R., Rankin, J., and Sadoway, D.R. 2012. Integration of information literacy components into a large first-year lecture-based chemistry course. Journal of Chemical Education 89(4):487-91.

Peters, M.C. 2011. Beyond Google: Integrating chemical information into the undergraduate chemistry and biochemistry curriculum. Science and Technology Libraries 30(1):80-8.

Seymour, E., Wiese, D., Hunter, A., and Daffinrud, S.M. 2000. Creating a better mousetrap: On-line student assessment of their learning gains. National Meeting of the American Chemical Society.

Somerville, A.N. and Cardinal, S.K. 2003. An integrated chemical information instruction program. Journal of Chemical Education 80(5):574-9.

Walczak, M.M. and Jackson, P.T. 2007. Incorporating information literacy skills into analytical chemistry: An evolutionary step. Journal of Chemical Education 84(8):1385.


This work is licensed under a Creative Commons Attribution 4.0 International License.