Issues in Science and Technology Librarianship
Spring 2009
DOI:10.5062/F4XK8CG6


[Refereed]

What Engineering Sophomores Know and Would Like to Know About Engineering Information Sources and Access

Zorana Ercegovac
InfoEN Associates
Los Angeles, California
zercegov@ucla.edu

Copyright 2009, Zorana Ercegovac. Used with permission.

Abstract

This exploratory study reports on what engineering undergraduate students know and would like to learn about engineering information sources and access. Responses were obtained on selected performance measures within the framework of Information Literacy Standards for Science and Engineering/Technology (ACRL/ALA/STS 2006). The results are based on 70 sophomores who completed a four-page paper-and-pencil questionnaire on April 4, 2008 at UCLA's School of Engineering and Applied Science. The paper discusses the implications of the findings for engineering education in general, and specifically for instructing all engineering students in the proper use of a variety of pathways for accessing high-quality engineering resources for lifelong learning purposes.

Motivation and Framework for the Study

This paper reports on an exploratory study that was designed to gauge levels of information literacy among computer science and computer engineering sophomores at UCLA. The purpose of the study was to examine the students' content knowledge on core information literacy performance indicators and outcome measures (ACRL/ALA/STS 2006). Little has been published on information-seeking patterns among undergraduate engineering students. Students' responses were also used to frame this investigator's 55-minute instruction on Engineering Information Sources and Access, presented two months after the completion of the survey.

The context of the study was a Computer Science 101 class within UCLA's Henry Samueli School of Engineering and Applied Science. The class is organized as a series of lectures on important computer science topics that all sophomores need to be acquainted with; grades were based on attendance, quizzes, a short essay, and presentations on a wide variety of self-selected topics in computer science. Lecturers included experts from UCLA and from industry and research organizations such as Sun Microsystems, Microsoft Research, and the Jet Propulsion Laboratory (JPL) at Caltech. Students who participated in this study were typical engineering undergraduates, mostly sophomores.

Prior Studies

I examined publications that dealt with a) the Information Literacy Standards for Science and Engineering/Technology (IL SET) (2006), b) engineering undergraduate students, and c) instruments to assess content knowledge of IL SET. I sought studies that were empirical rather than analytical or theoretical in nature and published in English after 2005. Since no assessment instruments were located, I report on the articles that provided the starting point for the current study.

Nearly ten years ago, a series of papers described research on the content knowledge of engineering students (Ercegovac 1999; 2001). The findings demonstrated that the most common obstacles to the use of resources were lack of knowledge about what was available and lack of adequate user training. More instruction was not necessarily better: units covering engineering resources and access were typically taught by reference librarians who may not have understood the specific information-seeking behaviors of engineering students, or the importance of customizing teaching and learning to their specific needs and requirements. The study was based on a sample of 27 upper-division undergraduate engineering students, with 16 graduate engineering students for comparison purposes (Ercegovac 1999). The students felt confident in their information literacy skills, and there was no difference based on gender, educational status, or whether they had previously taken library instruction on sources and how to use them.

The 1999 study was carried out during the early phase of Internet search engines. The following year, the Information Literacy Competency Standards for Higher Education provided a "framework for assessing the information literate individual" that could be applied to all majors (ACRL/ALA 2000). Other standards related to engineering education, such as ABET's, are outside the scope of this study.

More recently, studies in the United States (e.g., Barker, Cook & Whang 2006) and in Ireland (Kerins, Madden & Fulton 2004) have found that engineering students (n=260 and n=14, respectively) tend to give preference to quick, easy, and convenient sources (e.g., Internet search engines, friends) over online databases, library collections, and information professionals. One major difference between the prior studies and this one is the inclusion of plagiarism-related perceptions (per Standard 4 in ACRL/ALA/STS; see also Table 1D). Ethical use of information has been an important component of information literacy competencies for all college-level students, and increased cases of plagiarism across the board, including in science and engineering disciplines, made it critical to investigate these issues further in this study (Ercegovac & Richardson 2004; Ercegovac 2005; Ercegovac In press b).

Questions Under Study

This author wanted to gauge engineering undergraduates' content knowledge of information literacy skills. Specifically, she asked:

  1. Which of the resources available to them are students aware of?
  2. How well do students' responses match some of the performance indicators defined in the Information Literacy Standards for Science and Engineering/Technology?

Regarding Question 1, the questionnaire used computer science and engineering resources that are available on the UCLA library web page (e.g., ACM Digital Library, Compendex, IEEE Xplore, Inspec, and Web of Science, as well as Google Scholar and library catalogs). Examples of informal channels of communication included instant messaging, RSS, podcasting, blogs, and wikis, as well as friends and advisors. Examples of formal communication included databases that contain peer-reviewed journal articles and conference proceedings, as well as writings in technical reports and the patent literature. Books, handbooks, Computing Reviews, and encyclopedia articles are also examples of formal communication in science and engineering/technology.

Regarding Question 2, the questionnaire used the information literacy standards specifically created for science and engineering (ACRL/ALA/STS 2006). Table 1 paraphrases and collapses the information literacy performance indicators that were used as the basis for the design of this study's questionnaire. Thirteen of the 104 performance sub-indicators from the published Standards (ACRL/ALA/STS) were chosen. The questionnaire is given in the Appendix.

Each of the selected Standards is given as a heading, followed by its information literacy performance indicators (2006). The corresponding survey questions are given next to each indicator. We targeted outcomes that were relevant to our questions and kept the survey economical.

For example, outcomes a) through c) in Standard 1 from the published Standards were purposely omitted from the survey; these have to do with teaching students to a) identify a research topic, b) consult with an instructor on the appropriateness of a topic or laboratory exercise, and c) develop a hypothesis and formulate questions based on the information need. Other performance indicators that are important but excluded from the questionnaire are discussed under Further Work.

Table 1: Questions in the context of ACRL/ALA/STS Information Literacy Standards (2006)

Standard 1: The information literate student determines the nature and extent of the information needed
Performance Indicators (used in this study):

1d) Explores general information sources to increase familiarity with current knowledge of the topic. Question 6. (Details for all questions appear in the Appendix at the end of this article.)

2a) Identifies popular versus scholarly, current versus historical, and primary versus secondary sources. Question 6.

2c) Identifies the value and differences of potential resources in a variety of formats (e.g., database, web site, patent, book). Questions 1, 6, 16.

4a) Determines the availability of needed information and makes decisions on broadening the information seeking process beyond locally held resources (e.g., interlibrary loan). Question 16.

Standard 2: The information literate student acquires needed information effectively and efficiently
Performance Indicators (used in this study):

2d) Constructs a search strategy using appropriate commands for the information retrieval system selected (e.g., Boolean operators). Question 9.

3a) Uses various relevant search systems to retrieve information in a variety of formats. Questions 1, 6, 8, 10, 16.

3b) Uses various classification schemes and other systems to locate information sources. Question 11.

3c) Uses specialized online or in person services as needed to retrieve information (e.g., ILL, librarians, library staff, professional associations, subject experts). Questions 6, 16.

Standard 4: The information literate student understands the economic, ethical, legal, and social issues surrounding the use of information and its technologies and either as an individual or as a member of a group, uses information effectively, ethically, and legally to accomplish a specific purpose.
Performance Indicators (used in this study):

1. Understands many of the ethical, legal, and socio-economic issues surrounding information and information technology. Specifically: 1d) Demonstrates an understanding of intellectual property, copyright, and fair use of copyrighted material and research data. Questions 12, 14.

2e) Legally obtains, stores, and disseminates text, images, or sounds. Question 14.

2f) Demonstrates an understanding of what constitutes plagiarism and does not represent work attributable to others as his/her own. Questions 13, 14, 15.

3b) Posts permission granted notices, as needed, for copyrighted material. Question 7.

Standard 5: The information literate student understands that information literacy is an ongoing process and an important component of lifelong learning and recognizes the need to keep current regarding new developments in his or her field.
Performance Indicators (used in this study):

2. Uses a variety of methods and emerging technologies for keeping current in the field.

b) Uses online table of contents scanning, review journals, and other forms of rapid communication literature. Question 6.

This study revised the earlier questionnaire that was used ten years ago to study the information literacy skills of engineering students. Although many sources have changed, and interfaces have become web-based and more "user-friendly" than they were in the past, the main structure followed the older version.

Methods

Permission was obtained from UCLA's Office for Protection of Research Subjects to use students as human research subjects and to collect data on the types of sources they use in their class assignments. The design of the questionnaire was vetted with experts on wording, question sequence, and types of resources.

Students' confidentiality was secured: they were not identified in any way, their grades were not affected by their responses, and participation was completely voluntary. The response rate was 82% (70 of the 85 students enrolled in Spring Quarter 2008). Because responses were anonymous, we could not correlate student gender with scores and responses; other variables were hidden as well, including age, potential correlations between GPA and responses on information literacy skills, and native language.

Students were asked to complete a four-page questionnaire with 17 main questions, for a total of 107 response items per survey. Students gave useful information on the information literacy topics they thought would enhance their undergraduate studies and better prepare them for both employment and graduate studies. Question 17 read: "Please suggest two topics you would like to know/learn about in the area of engineering information sources and access." Their suggestions are discussed at the end of the next section. Students completed the questionnaire in their regular classroom; they were given about 15 minutes to complete it, and one of two teaching assistants administered the process.

Results

This study's results are discussed in terms of the two questions that guided this investigation; readers are referred to the Appendix for the survey questions cited in the discussion that follows.

Students' level of understanding of a wide variety of resources (Question #1)

In Question 16 (see the Appendix), students were presented with a full citation for a conference paper that appeared in Transactions on Graphics. Students were also given a screen capture of the holdings information for that journal from Orion (the library catalog for UCLA library holdings). They were asked whether, from the screen shown, they could find the specific issue of Transactions on Graphics that contained the paper. "How do I find this paper?" has been reported in the literature as one of the most prevalent reference questions (Chudnov 2008).

More than one third (34%) of the students incorrectly answered that they could find the needed journal volume in Orion (UCLA's online library catalog). Fifty-one percent answered correctly, while the rest "didn't know."

The students were also asked to rank order 12 sources by how likely each would be to give them access to some form of that paper; all 12 were potential candidates for this conference paper. Students selected Google Scholar and the web as their first picks, with the remaining sources following in descending order of importance.

Regarding the level of awareness of a wide variety of resources, 82.8% of the students in the study (n=58 of 70) did not know what Melvyl actually is (the University of California systemwide online catalog), while seven percent confused Melvyl with the Orion system (the web-based library catalog for UCLA holdings only). Twenty percent said they had never heard of Melvyl before they took this survey and did not know where it was located. Those who had never used Melvyl were divided between students who said they "have not had a need to use Melvyl" (35.7%), "don't know where Melvyl is located" (24.3%), "have not had time to learn to use Melvyl" (14%), and "have not taken training sessions on how to use it" (20%). More than one third of all students thought that it would take them between one hour and a day to learn how to use Melvyl; in addition, they thought that learning to use Melvyl would be between "somewhat easy" and "somewhat difficult." (See Questions 1 through 5 in the Appendix.)

Students were unclear about what Orion and Melvyl provide. While many students expected library catalogs to give bibliographic descriptions for books, sound recordings, CDs, DVDs, technical reports, dissertations, and titles of journals and magazines, one third (33.4%) thought that access to book chapters would also be given, and 82.8% expected access to individual magazine articles. The students could circle "all that apply."

Question 6 lists 12 very different types of resources, all available to the students under study. The selected sources include both subscription-based and free resources; publications that are peer reviewed and others that normally do not go through the same rigorous review; informal types of information that come in many different forms, such as e-mail from friends and other unfiltered sources; and, at the other end of the spectrum, dissertations and theses, which are approved by experts, carefully researched, and often supported by highly competitive funding agencies.

The students' task was to rank the resources in the order in which they thought each would give them the best critical review of the literature on a topic fairly new to them; the first choice would be the best source they would turn to for a literature review. The two sources they ranked at the top of the list were search engines, such as Google, and community-produced e-resources, such as Wikipedia, followed by books, articles in magazines and journals, technical reports, and peer-reviewed encyclopedias and handbooks. Computing Reviews (ACM) was ranked somewhere in the middle, as the seventh best source from the top, followed by informal channels of communication such as blogs and wikis. Dissertations and theses, conference papers, and patents ranked low on average. Podcasts and instant messaging were ranked at the bottom.

How well do students' responses match some of the information literacy outcomes (Question #2)

Some of the critical information literacy indicators that were explored include students' perceptions of the social issues of information and responsible use of information (per Standard 4) (Questions 7 and 12-15). Question 7 dealt with the mechanics of attribution, while the other four questions attempted to gauge students' conceptual understanding of academic honesty, including plagiarism. Question 7 asked students to identify the parts of a web citation that book citations do not have. Almost half of the students (46%) answered with the URL alone, not recognizing that both the web address and the access date distinguish a web citation from a book citation; some students gave only the access date, and very few gave both pieces of data, the complete correct answer.
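To make the distinction concrete, here is a hypothetical pair of citations in this journal's own reference style (author, title, publisher, and URL are invented for illustration). The bracketed elements in the first entry are exactly the ones a book citation lacks:

    Doe, Jane. 2008. An example web document. [Online]. Available: http://www.example.org/paper.html [Accessed: May 1, 2009].

    Doe, Jane. 2008. An Example Book. Chicago: Example Press.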

Question 12 asked students a hypothetical question regarding their attribution practices. Most would cite sources for text "copied exactly" (68%), "used in your own words" (77%), "quoted text" (87%), and a "copied graph taken off the web" (72%), but only 12% would cite an "instructor's class notes." Probing deeper, Question 13 asked students to evaluate two hypothetical cases and judge whether each was plagiarized or not plagiarized, or to mark themselves undecided. The first case mentions the source but does not explicitly give attribution; 78.5% of the students thought it was a case of plagiarism. The other case mentioned and correctly cited the source; nearly two thirds of the students (62%) correctly judged that it was not plagiarized.

Questions 14 and 15 were designed to explore students' perceptions of plagiarism-related issues across a wide variety of types and forms of literature. Earlier studies (Ercegovac 2005; Ercegovac and Richardson 2004) found that students were typically confused about attributing anything that was not textual and printed; the further a work was from printed text, the more confused students became. For example, we wanted to learn students' practices in attributing intellectual and/or artistic property that is not print-based. Answering the question "which of the following are NOT examples of plagiarism (circle all that apply)," 72.8% of the study participants thought that "not citing a source for ideas that are widely known" would not constitute plagiarism. In addition, 42.8% said that people who do not reference web sites would not be plagiarizing, 21% said the same of those who do not cite choreographers, 13% of those who do not cite map makers, and 13% of those who do not acknowledge information from e-mail. Eighty-one percent thought that it would be unethical to "use exactly the same logo design, like in Nike shoes, Coca Cola trade mark, in some of my art work." Only one third (34%) thought that it would be unethical to "make rendition of a branded stuffed animal and display in my school exhibit under my name."

The remaining questions probed students' understanding of query modification. Question 9, for example, asked: You are searching a library catalog (or database) on machine learning; you retrieve over 500 publications; in order to get fewer items, you will (circle one answer): a) add more terms and use the AND operator; b) add more terms and use the OR operator; c) try searching again. Question 10 asked students to search for a known item, and Question 11 explored students' understanding of call numbers.
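To illustrate the principle behind Question 9, the following minimal sketch (in Python; not part of the study's materials, with toy documents and keywords invented for illustration) shows why adding terms with AND narrows a result set while adding terms with OR broadens it:

    # Toy index: document id -> set of keywords describing that document.
    docs = {
        1: {"machine", "learning", "neural", "networks"},
        2: {"machine", "learning", "robotics"},
        3: {"machine", "translation"},
        4: {"learning", "theory"},
    }

    def search_and(*terms):
        # AND: every term must appear; each added term can only shrink the set.
        return {d for d, kws in docs.items() if all(t in kws for t in terms)}

    def search_or(*terms):
        # OR: any term may appear; each added term can only grow the set.
        return {d for d, kws in docs.items() if any(t in kws for t in terms)}

    print(search_and("machine", "learning"))              # {1, 2}
    print(search_and("machine", "learning", "robotics"))  # {2}: fewer items
    print(search_or("machine", "learning"))               # {1, 2, 3, 4}: more items

Adding terms joined by AND is therefore the correct way to reduce 500 retrieved publications to a smaller, more specific set (answer a); OR would have the opposite effect.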

At the end of the questionnaire, students were asked to write down those topics in information literacy that they would like to know more about. Topics "I'd like to learn" were grouped under these three headings:

A. PLAGIARISM PREVENTION:

  1. How to avoid all forms of plagiarism.
  2. What qualifies as plagiarism?
  3. Proper citation methods.
  4. Legal issues, like patenting software products.

B. ACCESS AND SEARCHING TECHNIQUES:

  1. How to search databases.
  2. How to find good sources.
  3. How to access literature in a field?
  4. How to find answers?
  5. Where to look for anything, including careers, salaries, applications for employment.
  6. Complete list of good sources to use and how to use them properly.
  7. How to refine searching to make it more specific to the topic at hand?
  8. How to find relevant information?

C. SEARCHING SPECIFIC SYSTEMS:

  1. What are the contents of Melvyl?
  2. Good ways to navigate search engines, like Melvyl.
  3. How to use Internet/UCLA resources to obtain scholarly research?

Discussion

Students recognized their shortcomings and wanted to learn more about the specific topics they identified. These topics empirically confirm the importance of performance indicators in the information literacy standards: the ability to search a wide variety of sources, to refine search results in order to get high precision and high recall, to critically evaluate sources in order to get "good sources on anything," and to use all types of materials ethically for a variety of purposes.

This study revealed patterns and issues similar to those in the 1999 study: not seeing the big picture, not connecting sources by their contents, poor understanding of organizational schemes, and poor interpretation of catalog entries and the corresponding data elements used to describe documents. Students struggled with notions of what library catalogs actually contain and how library catalogs differ from periodical databases.

In contrast to the students of ten years ago, who were highly confident of their information literacy skills, these 70 students perceived themselves as only somewhat competent (34%) or not competent at all (25.7%); combined, almost 60% thought their information literacy skills were inadequate.

Instructional units should engage students rather than lecture to them. Engineering classes typically consist of a lecture presented by the faculty member in charge and a separate recitation or lab session led by teaching assistants. To emulate that two-tier format, invited lecturers in information literacy could ask students to perform tasks that are meaningful to them, offer immediate constructive feedback, and give them a chance to practice the newly introduced skills. See examples in Ercegovac (2008a; In press a).

Instead of presenting numerous slides on advanced ways of searching various databases or search engines, it may be more powerful to discuss the big picture of how engineering information is organized and disseminated. Several in-context examples could then demonstrate how a large set of search results can be made more specific, or how to decide on the first best source for a variety of computer science-related topics of the students' choosing.

The third guideline would be to present and discuss the importance of information content regardless of medium and format. Since time is always critical, healthy collaboration with faculty and student societies might help in this regard. Such redundancy tends to increase the chance that students will be aware of, and motivated to learn, information literacy skills.

We do not claim that the results of this study are generalizable to other engineering student populations. However, we suspect that UCLA's students are not substantially different from students at the other top 15 engineering schools in the country in terms of age, gender, entrance scores, language proficiency, and other demographic and socio-economic variables.

Further Work

This class, Computer Science 101, is expected to be offered again during the 2008-09 academic year. Reflecting on this year's experience, it would be useful to maintain a two-tier process, and the information literacy unit should be presented earlier in the course rather than last, so that students could make better use of the material in other computer science-related topics. The instructor could require the students to collaborate in small groups on a modest project that includes designing and carrying out a literature search with a specific bibliographic composition, annotating the most relevant materials from the search using bibliographic software, managing time and the project, and presenting the results to the class. Drafts could be discussed ahead of time and presented during recitation sessions.

The proposed iteration would introduce students to a level of competency that was not possible during the Spring of 2008. For example, per Standard 1, students would have the opportunity to consult instructors and negotiate a topic they would like to learn about; they could profit from learning how to summarize the information gathered (per Standard 2); an important topic to include would be a discussion of free versus fee-based resources (per Standard 4). Finally, more attention should be given to specific applications of bibliographic management software, such as EndNote and Zotero (per Standard 5).

It would have been useful to correlate and explain students' responses and abilities with their prior exposure to library training and information literacy sessions, with their frequency of using resources, and with their academic standing: for example, whether they participate in honors undergraduate research seminars, whether they are required to write essays and research papers, and whether they are asked to present their work in classes.

With the increasing number of electronic resources available on library web pages, and with interfaces much improved over those of a decade ago, questions about how best to teach and assess these skills remain open for further study.

Cited References

Association of College and Research Libraries, American Library Association. 2000. Information Literacy Competency Standards for Higher Education. Chicago: ACRL, ALA.

Association of College and Research Libraries/STS Task Force on Information Literacy for Science and Technology, American Library Association. 2006. Information Literacy Standards for Science and Engineering/Technology. Chicago: ACRL, ALA.

Barker, Theresa, Cook, Julie, and Whang, Linda. 2006. Desperately seeking information: where and why engineering students find the information they need. [Online]. Available: http://depts.washington.edu/englib/eld/conf/06/aseeposter06.pdf [Accessed: April 30, 2009].

Chudnov, Daniel. 2008. You too can build a better search tool. Computers in Libraries, 28(5):44-46.

Ercegovac, Zorana. 2001. Accessing engineering global information for engineers: phase 2. In: Information in a Networked World: Harnessing the Flow: Proceedings of the 64th Annual Meeting of the American Society for Information Science and Technology. Medford, NJ: Information Today. p. 411-424.

Ercegovac, Zorana. 2008a. Information Literacy: Search Strategies, Tools and Resources for High School Students and College Freshmen, 2nd ed. Columbus, OH: Linworth Publishing.

Ercegovac, Zorana. In press a. Is the Google generation information literate? The case study. In: People Transforming Information -- Information Transforming People: Proceedings of the 71st Annual Meeting of the American Society for Information Science and Technology. Medford, NJ: Information Today.

Ercegovac, Zorana. 1999. LEArning portfolio for accessing engineering Information for engineers. In: Knowledge: Creation, Organization and Use: Proceedings of the 62nd Annual Meeting of the American Society for Information Science. Medford, NJ: Information Today. p. 450-461.

Ercegovac, Zorana. In press b. Plagiarism of printed and electronic resources. In: Bates, Marcia J. and Maack, Mary N., editors. Encyclopedia of Library and Information Sciences, 3rd ed. New York: Taylor and Francis.

Ercegovac, Zorana. 2005. What students say they know, feel, and do about cyber-plagiarism and academic dishonesty? A case study. In: Sparking Synergies: Bringing Research and Practice Together: Proceedings of the ASIST 68th Annual Meeting. Medford, NJ: Information Today.

Ercegovac, Zorana and Richardson, John V. 2004. Academic dishonesty, plagiarism included, in the digital age: a literature review. College and Research Libraries 65(4): 301-318.

Kerins, Gillian, Madden, Ronan, and Fulton, Crystal. 2004. Information seeking and students studying for professional careers: the cases of engineering and law students in Ireland. Information Research 10(1). [Online]. Available: http://InformationR.net/ir/10-1/paper208.html [Accessed: May 1, 2009].

 


Appendix A

Download PDF

