This research blog was written by AI (sort of)
January 14, 2025
Artificial intelligence (AI) seems both inescapable and deeply divisive. Some people are enthusiastic about the myriad possibilities of AI; others are terrified; and the rest of us are trying to understand how AI fits into our personal and professional routines. While AI is impacting every profession, the settlement and integration sector faces unique challenges. ARAISA recently partnered with ISANS to deliver “FA.I.R FOR ALL: AI’s impact on settlement & language learning,” a webinar series that explores how AI technologies can support newcomer settlement. The series addresses the role of AI in language instruction, communities of practice, ethical considerations, and federal guidelines, among other topics.
Researchers are facing similar challenges. Is it appropriate to use AI when working with newcomer populations? How can we productively apply AI when doing research in settlement and integration? What are the strengths, weaknesses, and ethical considerations of AI as a research tool? I certainly don’t have the answers to these questions, but AI might.
AI shouldn’t be used to develop research questions or guide the research process.
First, I’d like to consider why AI isn’t appropriate for research. In particular, AI shouldn’t be used to develop research questions or guide the research process. As I have suggested elsewhere, research should be informed by the needs, interests, and experiences of the communities or populations involved. If you’re planning to study the experiences of newcomers, for example, then newcomers should actively inform research design. Ask people, not AI.
Researchers should be especially careful about using AI to review, analyze, or manage sensitive data.
AI also presents ethical challenges. Researchers should be especially careful about using AI to review, analyze, or manage sensitive data. Data shared through AI is not necessarily private or confidential. When I asked ChatGPT about its confidentiality policy, it advised me “not to share any private, confidential, or sensitive personal information (like passwords, social security numbers, bank details, etc.).”
So how can AI help me with research? Given my own lack of expertise on the subject, I asked ChatGPT. “Hey ChatGPT! How can I use AI when doing research with newcomers?” After receiving a very detailed, ten-item response, I rephrased the question. “Okay then: what are the top three ways AI can support me when doing research with newcomers?” This is what it suggested:
“1. Streamlining literature review and knowledge acquisition: AI can help newcomers quickly access relevant academic papers, articles, and resources. AI tools can also summarize lengthy research articles.” First of all, ChatGPT, I asked about research with newcomers, not as a newcomer. Nevertheless, that’s a good point. Finding and summarizing relevant research across multiple databases is time-consuming, and I wouldn’t mind some support.
“2. Data collection, analysis, and visualization: AI can assist newcomers in analyzing large datasets with minimal prior experience. It can suggest statistical methods, identify trends, or generate insightful visualizations automatically.” Yes, it certainly can do that. But again, be careful about sharing sensitive data unless you are absolutely certain the AI technology is secure and confidential (e.g., you have a formal agreement with the provider).
“3. Writing and content creation: AI-powered writing assistants can help newcomers draft, edit, and refine their research papers.” I’ll admit: I’m lost without spellcheck. I’ll also admit that AI is pretty good at structuring responses. When I first experimented with ChatGPT, I asked it to write a fully cited academic article on a subject in which I’m an expert. While it wasn’t the most insightful analysis, it definitely had all the right parts in all the right places.
Maybe that’s what AI is good for – to offer a second opinion during research, sort of like peer review.
So AI is good but not perfect. It offered relatively superficial suggestions that aren’t very relevant to my work in the sector. And while I may disagree with some of these suggestions, it nevertheless gave me something to think about. Maybe that’s what AI is good for – to offer a second opinion during research, sort of like peer review. I don’t need to agree with a peer reviewer, and I don’t always need to incorporate their suggestions. But by engaging and occasionally disagreeing with another perspective, I am able to better understand and strengthen my own.
About the Author
Jason Chalmers
Jason Chalmers holds a PhD in Sociology from the University of Alberta and was a Postdoctoral Fellow in the School of Community and Public Affairs at Concordia University. As an interdisciplinary researcher, Jason draws on diverse methodologies and is particularly inspired by creative and community-based practice. Jason’s research interests include Canada’s immigration history, Indigenous-settler relations, and social inequality.
You can reach Jason at jchalmers@araisa.ca