Everything you have written on this subject rings true, save for your backtracking on ethnographic research. Having done a fair bit of research that would qualify as “field work” in addition to APSR/JCR/IO social “science,” I can say AI tools would have been quite useful when my colleagues and I were schlepping around Rwanda and India: developing questionnaires, transcribing taped interviews, translating foreign languages, and so on. Given the poor understanding of the rules of inference among so many qualitative researchers, I’d lay dollars to donut nuggets that Claude Code will be at least as useful to the scholars who bray the loudest today. And we haven’t even started in on teaching…
Thanks, that's a good point. What I meant is more that Claude literally cannot go into the field by itself yet, especially in hard-to-reach places without reliable internet.
But everything around the fieldwork (questionnaire design, transcription, translation, coding) is already fair game. And yeah, qualitative scholars can probably take more advantage of these tools than most realize.
I think the crux is how qualitative researchers (and really, everyone who produces and holds trustworthy data) can get paid. Today's research-industrial complex is absolutely not set up to incentivize this.
Qualitative signals of early opportunities to reduce externalities (in my role as a public-sector psychiatrist, that means coercion, overdose, suicide, public disorder, and extraordinary expense trajectories) could be paid for through a fiduciary system, a market structure for information. If your data doesn’t matter in the lives of people according to their values, or to the risks society bears for them (and optimally both), then what is its utility as social science?
I just deleted Bluesky this week. It's a wasteland. Even though it's not old Twitter, I think LinkedIn has become a better place for academic engagement.
I also appreciate the attention to qual work. I was thinking the same thing about the importance of fieldwork in this AI age, and then about how best to translate it into something that can be read by AI and online. I think those skills will be key moving forward. I have been messing with NotebookLM for some coding and it is very, very good for those needs.
Thanks, I agree that LinkedIn is increasingly the platform for academic exchange now (which still feels weird, but we don’t have any other choice).
I haven’t deleted my Bluesky yet since it’s still good for connecting with my European colleagues who are not on X/LinkedIn, but I’m increasingly contemplating it.
I think a lot of "academics" on Bluesky have persuaded themselves that insults are a valid form of argument. What's the point of calling yourself a scholar if, in practice, you determine your beliefs based on who has the most hilarious clapback?
Yay for going viral and also for amplifying those of us in this space. It is appreciated.
My own take is that if you are good, AI can make you better. But my fear is that AI will keep people from getting good in the first place. There are so many horror stories about students, even in grade school, using AI as a substitute for learning rather than as a way to enhance it.
I really like what Paul Allison is doing with his Code Horizons courses. But I suspect they’d be worth little to anyone who didn’t already have a good background on the topics.
https://aihorizons.io/public-ai-seminars/
Thank you for writing these pieces, Professor. You hit the nail on the head about those academics who assume AI use is copy/paste. It is insulting and incredibly unfair: academic policy cannot be dictated by individuals with this assumption, especially not after HEIs have spent so much money recruiting and training staff to deliver AI training and digital literacy programming. They just refuse to show up and take part, even when they know their students are using it and are expected to know how to use it outside of the classroom. Thanks again – enjoyed reading both of these! (And I hope you write more about AI in the social sciences!)
I worked in higher education for nearly 20 years before leaving, and this was a refreshing read. There are a lot of uncomfortable truths in this that people inside the system will recognize, but probably won’t want to admit.