


MCB 32: Introduction to Human Physiology: AI tools




AI in literature searching and writing

AI Literacy: Uses and limitations of AI tools


As with any tool, it's important to understand the strengths and limitations of AI
tools before choosing to use them. Some considerations to be aware of from the start:

"Artificial Intelligence" tools are not intelligent: Large Language Models (LLMs)
ingest a vast text corpus and are then trained to produce text based on statistical
predictions of word probability and order based on context. Although they generate
text that sounds authoritative, these agents do not, and cannot, evaluate the accuracy
of their own output, and can produce responses that are biased, false, or harmful.
(A simplified illustration of this kind of next-word prediction follows this list.)

Free ChatGPT-based tools currently have a knowledge cutoff of 2021: Nothing
published or posted to the web after 2021 was included in the training corpus for the free
version of the ChatGPT LLM, so any information it references will be from 2021 or
earlier.

ChatGPT and other AI resources are not supported tools at UC Berkeley: They
have not been reviewed for accessibility, privacy, or security.
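To make "statistical predictions of word probability" concrete, here is a deliberately simplified sketch in Python. It is not how any real LLM is built (real models use neural networks trained on enormous corpora), but it shows the same basic move: pick a plausible next word from counted statistics, with no step that checks whether the resulting text is true. The tiny corpus and function names are invented for illustration only.

# Toy next-word predictor: counts word pairs in a tiny corpus and then
# generates text by sampling a likely next word. Purely illustrative.
import random
from collections import Counter, defaultdict

corpus = ("the heart pumps blood . the heart beats fast . "
          "blood carries oxygen . the lungs exchange oxygen .").split()

# Count how often each word follows each other word.
following = defaultdict(Counter)
for current_word, next_word in zip(corpus, corpus[1:]):
    following[current_word][next_word] += 1

def predict_next(word):
    """Sample a next word in proportion to how often it followed `word` in the corpus."""
    counts = following[word]
    words, weights = zip(*counts.items())
    return random.choices(words, weights=weights)[0]

# Generate a short passage by repeatedly predicting the next word.
word = "the"
generated = [word]
for _ in range(10):
    word = predict_next(word)
    generated.append(word)

# The output can read fluently, but nothing here checks whether it is true.
print(" ".join(generated))

Running this a few times produces fluent-sounding but sometimes nonsensical sentences, which is the point: fluency is not the same as accuracy.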

Outlined below are some possible uses of AI tools in literature searching and writing, along
with limitations to consider:

Brainstorming topics and research questions

Image: Drtonymc, CC BY-SA 3.0, via Wikimedia Commons

AI tools can:

provide a quick overview of a research field


suggest topics that might be appropriate for a writing assignment

describe relevant keywords


cite initial sources for further exploration.

Limitations:

Generative AI tools are based on LLMs that are typically trained on text scraped
from the open web during a fixed time period (as noted, for free ChatGPT tools
the time period ended in 2021). So generated query answers may not reflect the
most current information.
LLMs may reflect biases or errors in their source material or training process.
AI tools generally provide better answers when prompts are more specific and
precise. At the beginning of a research project it can be difficult to formulate
precise queries because you are just beginning to learn about a domain.
AI tools may provide better answers when the LLM has specifically been trained
on a relevant corpus. ChatGPT and many other tools were trained on text
scraped from the open web, which is not domain-specific and includes
misleading, biased, false, and harmful information.

Searching for information


Image (detail): Jean-François Millet, Des glaneuses (The Gleaners), 1857. Source: Musée d'Orsay

AI tools can:

provide citations to scientific journal articles and other sources on a topic


suggest similar sources
show citation, authorship, or topic networks among sources.
Limitations:

The process of iterative searching in databases helps you to learn more about a
research domain and to focus your results on the most relevant sources; AI tools
bypass those processes for the user.
Free ChatGPT-based tools currently have a knowledge cutoff of 2021: nothing
published or posted to the web after 2021 was included in the training corpus for
the free version of the LLM.
Generative AI tools can "hallucinate" false citations. As with any other form of
research, it's important to verify all sources and the information they contain.
Relevancy rankings on some AI tools rely on article citation metrics and journal
impact factors, neither of which is necessarily an indication of article quality,
timeliness, or relevance to the query.
AI tools are often proprietary black boxes that do not indicate how or why the
results were obtained. Database searches can be revised to improve results,
saved, and shared. Systematic reviews and evidence synthesis projects generally
supply the exact searches employed in the chosen databases so that others can
check the strategy and reproduce the results; neither is possible with AI tools.
Databases have powerful features, such as automated search expansion, filters,
and search and citation alerts, that can help focus and simplify your search
process.

Synthesizing information

Image (detail): ButuCC, CC BY-SA 3.0, via Wikimedia Commons

AI tools can:

provide summaries of single papers, related groups of papers, or research areas


help you identify information sources for deeper reading and exploration


provide explanations of specific terms or concepts to help you understand
technical or specialized sources.

Limitations:

Critical evaluation and assessment of relevance are important skills involved in
synthesizing information into new knowledge. AI tools are notoriously poor at
those tasks.
Summaries generated by AI tools often reuse the language of the source itself,
which will not necessarily clarify the content for you.
See also the limitations listed under Brainstorming topics and research questions above.

Writing

Image (detail): Subhashish Panigrahi, Wikimedia Commons, CC-BY-SA 4.0

AI tools can:

make suggestions for changes to word choice, grammar, and sentence structure
generate short- to medium-length texts responding to prompts

Limitations:

One purpose of courses like this one is to help you become a better writer.
Over-reliance on AI tools undermines that goal.
The process of writing helps you to clarify and deepen your understanding of a
topic (just as the process of teaching someone else helps you to learn). Over-
reliance on AI means that you don't get that benefit.
The output of AI tools can sound authoritative, but as noted above, generative AI
can provide misleading, false, biased, and/or harmful information, including
citations to sources that don't exist.
Copying text without citation from any source, including AI tools, is considered
plagiarism: "If a student uses text generated from ChatGPT and passes it off as
their own writing, without acknowledging or citing the influence of ChatGPT in
their process, they are in violation of the university’s academic honor code" (UC
Berkeley Center for Research, Teaching and Learning).

Last Updated: Nov 30, 2023. URL: https://guides.lib.berkeley.edu/mcb32

Copyright © 2024 The Regents of the University of California.


Except where otherwise noted, this work is subject to a Creative Commons Attribution-Noncommercial 4.0 License.
