- “Secularism and Religious Accommodation in Modern Algeria: The Case of Abd al-Qadir al-Jazairi” by Jeremy B. E. DeLong. Journal of North African Studies, vol. 21, no. 5, 2016, pp. 758-772.
- “The Return of Abd al-Qadir al-Jazairi to Algeria as a Secular National Symbol” by Noora Lori. Middle Eastern Studies, vol. 55, no. 2, 2019, pp. 207-222.
- “Conceptualizing Islam in Relation to the Secular: The Case of Abdelkader al-Jazairi” by Michaelle Browers. Islam and Christian–Muslim Relations, vol. 20, no. 2, 2009, pp. 129-142.
- “Revisiting the Religious and Secular in Algeria: Abdulhamid Ben Badis and Abdelkader al-Jaza’iri” by Amir Ahmadi. Journal of Islamic Studies, vol. 30, no. 3, 2019, pp. 355-377.
- “Islam and the Secular State: The Emir Abdel Qadir and the French Occupation of Algeria” by Richard M. Eaton. The Journal of Religious History, vol. 14, no. 3, 1988, pp. 308-322.
Have a look at the above list of references, especially if you are an academic in Middle Eastern studies or any other related field. Do you notice anything suspicious about them? Unless you are a specialist on Emir Abdelkader (also written as Abd al-Qadir al-Jaza’iri, a nineteenth-century Algerian anticolonial hero), they would all probably seem legit to you. Only they aren’t! These are all made-up references by ChatGPT, but admittedly, or perhaps annoyingly or even alarmingly, very well made-up.

After all the hype about ChatGPT, last week I finally decided to try it myself. Since I have recently joined a research project about Emir Abdelkader, I wanted to test whether it could be helpful for my research, so I asked it to suggest the five academic resources about Emir Abdelkader that use the word ‘secular’ (or any of its derivatives) most often. Within seconds, or less, ChatGPT produced the above list. I said to myself: wow, these seem like wonderful suggestions! I copied and pasted them into a Word file, thanked ChatGPT for its assistance, and closed the chat.
A week later, I wanted to check these articles myself, so I went to Google to search for the first one, “Secularism and Religious Accommodation in Modern Algeria: The Case of Abd al-Qadir al-Jazairi” by Jeremy B. E. DeLong. To my surprise, I found nothing. Neither regular Google nor Google Scholar was able to locate the alleged article. Then I went to the website of the journal, the Journal of North African Studies, which is a real one, and looked for volume 21, issue 5, but there was no mention of such an article or such an author.
The strange thing is that ChatGPT was careful to name a real journal and give coherent volume, issue, year, and page numbers! There exists a volume 21 of this journal, published in 2016, with an issue numbered 5 covering pages 735-925. So everything about this reference, except the title of the article and the author, is real or at least coherent. There is even an article about Abdelkader in that very issue, but with a different title and a different author.
I thought maybe this was an isolated glitch, so I went to search for the second reference, “The Return of Abd al-Qadir al-Jazairi to Algeria as a Secular National Symbol” by Noora Lori, and, to my further surprise, I could not find it either! Again, everything about the second reference in terms of journal name, volume and issue numbers, year, pages, etc. is real or coherent. Even the title of the article makes perfect sense: the body of Emir Abdelkader was returned from Damascus to Algeria in the 1960s.
Indeed, this time ChatGPT went a step further by attaching the name of a real academic in a closely related field to the made-up article. Noora Lori is an Assistant Professor of International Relations at the Pardee School of Global Studies who has done research on “citizenship, migration, and statelessness”. Abdelkader could very well have been the subject of an article about “citizenship, migration, and statelessness”, but, as far as I could tell, she didn’t write anything about him. Not only that, but I found a real article with a similar title, “Following the Leader: A History and Evolution of the Amir ‘Abd al-Qadir al-Jazairi as Symbol”, by a similarly named author, Nora Achrati, published in 2007 in The Journal of North African Studies.
The same thing applies to the other three references in the list above. None of them exists, but all the elements of the citations are real or close to real. This seems less like making up random stuff than like maliciously trying to deceive the user. As a lay person in artificial intelligence, I can’t think of any technical explanation for ChatGPT fabricating answers, so I cannot help but feel as if it had been intent on fooling me!
It’s not that there are no academic references discussing Emir Abdelkader and secularity, leaving ChatGPT no choice but to invent some in order not to appear unhelpful. In fact, even when ChatGPT could have given a useful, legitimate academic resource, it chose (if one may use the word ‘choose’ with artificial intelligence) to fake one. In the same chat, I asked ChatGPT to recommend resources about the relationship between Khedive Ismail (the ruler of Egypt, 1863-1879) and Emir Abdelkader, which, to be clear, is a much narrower topic and a more complicated request than the earlier one. As far as I can remember (unfortunately, I didn’t archive the whole chat), it said that it could not suggest resources specifically about this topic but continued:
You may find more information about this conflict in books and journals on Egyptian history, particularly those discussing the Suez Canal and Khedive Ismail's reign. Some suggested titles are:
"Ismail Pasha: A Political Biography" by P.J. Vatikiotis
"The Emergence of the Middle East, 1914-1924" by Malcolm Yapp
"The Suez Canal Crisis and Its Consequences: A Reappraisal" edited by William Roger Louis and Roger Owen
The thing is that P.J. Vatikiotis (1928-1997) was a renowned Palestinian-born historian of modern Egypt who wrote the entry on Khedive Ismail in the Encyclopaedia of Islam (2nd edition). Yet ChatGPT ‘chose’ to fabricate a reference by Vatikiotis instead of citing an existing one by him on the same topic. As for the third title, William Roger Louis and Roger Owen did edit a book on the Suez Canal, but it’s not the one above. Theirs was titled Suez 1956: The Crisis and Its Consequences. There is, though, a similar-sounding article titled “Moscow and the Suez Crisis, 1956: A Reappraisal”, but it is by O. M. Smolansky.
So what to make of all that? As stated above, as a lay person in AI, I cannot comment on the programming aspect of it, or on whether this is a malfunction or a built-in feature of ChatGPT that gears it toward ‘lying’. To be honest, I didn’t search the internet for explanations, but I saw a segment of a video in which a professor says he spotted made-up references in a paper submitted by one of his students, so there might already be some explanations out there.
But I just wanted to share my experience, because it is really a nasty kind of lying, to put it in simple human (not bot) terms. In addition, I wonder what this means for the future of academic writing. Can we now trust any cited resources that we haven’t read or at least verified ourselves? Should there be programs that detect made-up resources, just as there are for plagiarism? With plagiarism, at least the knowledge is real: you’re claiming ownership of something that does exist. But with ChatGPT you’re claiming there is something that doesn’t exist. This is some Matrix-level deception!
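I don’t know of such a program off the shelf, but in principle a detector could lean on the same check I did by hand: ask a bibliographic database for the claimed title and see whether anything it returns actually resembles it. Here is a minimal sketch in Python (Crossref’s public REST API is one real source of candidates; the 0.8 similarity threshold is purely an illustrative assumption, not an established cutoff):

```python
# Sketch of a fabricated-reference check: build a query for a bibliographic
# database from the claimed title, then fuzzy-match the claim against the
# candidate titles the database returns. A claim that matches nothing well
# is flagged as possibly fabricated.
from difflib import SequenceMatcher
from urllib.parse import urlencode

CROSSREF_WORKS = "https://api.crossref.org/works"

def crossref_query_url(title: str, rows: int = 5) -> str:
    """Build a Crossref bibliographic search URL for a claimed title."""
    return f"{CROSSREF_WORKS}?{urlencode({'query.bibliographic': title, 'rows': rows})}"

def title_similarity(a: str, b: str) -> float:
    """Case-insensitive similarity between two titles, from 0.0 to 1.0."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

def looks_fabricated(claimed_title: str, candidate_titles: list[str],
                     threshold: float = 0.8) -> bool:
    """Flag a citation when no database candidate resembles its title."""
    return all(title_similarity(claimed_title, t) < threshold
               for t in candidate_titles)
```

Run against the examples above, the fabricated Louis/Owen title would score far below a near-duplicate of the real Suez 1956: The Crisis and Its Consequences, which is exactly the kind of gap such a detector could flag. Of course, a real tool would also need to check authors, journals, and DOIs, since, as my experience shows, those pieces can each be real on their own while the citation as a whole is invented.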
This happened to me yesterday!! I asked for a list of academic sources for the topic I was researching, and I went through exactly the same process as you did! I looked up the titles and couldn’t find anything. Then I found the volume and issue number of the REAL journal it named, and it contained an entirely different article!
A quick Google search revealed that ChatGPT is notorious for fabricating sources. Even when you ask what sources it bases its outputs on, those are often made up, too! The explanations were basically that ChatGPT is a language model, and its primary function is to produce outputs that sound the most natural and the most grammatically correct. It’s great if you ask it to make your writing more concise, write poems or song lyrics, paraphrase, or even translate into other languages, etc.
My initial reaction was that students can more easily cheat on assignments by having ChatGPT do all their writing and editing for them. But then I questioned whether this is so bad….. what if this liberates us to think more deeply…. Just like calculators allowed mathematicians to explore mathematical theories more deeply, by freeing the intellect from the labour of doing sums, perhaps these language AI models will unburden students, writers, and academics from writer’s block and difficulty with expression, and lead to more meaningful, considered ideas… I’m not sure how it will go, would love your thoughts.
Hello Marnie, thank you so much for your thoughts on my post and on the (un)usefulness of ChatGPT. To be honest, I hadn’t thought of it as a tool to free up writers, whether students or academics, from some of the burdens of writing. This sounds like an interesting idea, especially for someone like myself who struggles a lot with writing. Actually, maybe I will give it a try. I have been thinking about writing something for some time. I have jotted down some thoughts in my notebook but haven’t brought myself to transfer them into a coherent text on my computer. Maybe it’s worth trying to see if it can ease the writing and revision process. Please let me know if you try it!