In the spirit of Valentine’s Day 2023, I asked ChatGPT1 a burning question: How does one find romance? After a shortlist of five suggestions on finding romance, ChatGPT offered a thoughtful bit of wisdom in reply: “Focusing on your own happiness and well-being can help attract positive experiences and relationships into your life.” As some readers may already know, at the end of 2022 the fascinating prospects of a chatbot called ChatGPT (Generative Pre-trained Transformer) pushed the boundaries of natural language processing and artificial intelligence. In this application, the technology can respond smoothly in human-comprehensible question-and-answer conversation. ChatGPT is a large language model, trained on text scraped from the internet to both understand and generate conversational language, and it seems to provide mostly coherent, even if at times superficial, responses to questions asked in human natural language. But as with any new technology, there are potential unintended consequences alongside countless opportunities.

Already, large language models show promise in answering important personalized or population-wide questions that can be derived only by analyzing large numbers of clinical notes in electronic health records.2 There are even explorations into using ChatGPT technology to facilitate early Alzheimer’s disease diagnosis based on a person’s speech-to-text patterns, as one of many potential clinical applications.3 Yet there are also major concerns, including the generation and perpetuation of misinformation and people passing off generated language as their own original work. To get simple lists, I asked ChatGPT the following two questions:

What are potential applications of ChatGPT technology in medicine?

  1. Clinical decision support
  2. Medical education
  3. Natural language processing
  4. Health information for patients
  5. Virtual assistants for health care providers

What are the risks?

  1. Misinformation
  2. Bias
  3. Dependence (reliance on automated decision-making and a decrease in critical thinking and clinical judgment)
  4. Privacy and security
  5. Ethical considerations, such as automation in health care and the responsibility of health care professionals in using and interpreting ChatGPT-generated recommendations

The responses are sensible starting points, but not comprehensive. I have also paraphrased some of the content for inclusion in this column, though I do not yet know the appropriate way to cite an AI-powered chatbot. In the meantime, readers can ask these questions of ChatGPT themselves to retrieve the full responses, though it is possible that the answers will change over time.
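For readers who would rather pose these questions programmatically than through the chat interface, a minimal sketch follows. It assumes the openai Python package (the pre-1.0 ChatCompletion interface), an API key supplied in an OPENAI_API_KEY environment variable, and access to a chat-capable model such as gpt-3.5-turbo; none of these details come from the column above, and the interface itself may change as the service evolves.

    import os

    import openai  # assumes the openai Python package, pre-1.0 interface

    # Assumption: the API key is provided via the OPENAI_API_KEY environment variable.
    openai.api_key = os.environ["OPENAI_API_KEY"]

    questions = [
        "What are potential applications of ChatGPT technology in medicine?",
        "What are the risks?",
    ]

    # Keep a running message history so the second question is asked in the
    # context of the first, as in a ChatGPT conversation.
    messages = []
    for question in questions:
        messages.append({"role": "user", "content": question})
        response = openai.ChatCompletion.create(
            model="gpt-3.5-turbo",  # assumed model name; substitute any available chat model
            messages=messages,
        )
        answer = response["choices"][0]["message"]["content"]
        messages.append({"role": "assistant", "content": answer})
        print(question)
        print(answer)
        print()

As with the chat interface, the model’s output is not deterministic, so repeated runs of the same questions may yield different answers.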

As scientific communities expand the possibilities of artificial intelligence applications, the many accompanying ethical, legal, and social issues around ever more humanlike technology need to be addressed in parallel. Education and digital literacy will necessarily be among these issues to address, and they hit close to home for us as SGIM members. What opportunities and risks do new technologies like ChatGPT present? How do we engage in shaping their ethical design and application to our daily practices and work? On this last question, I look forward to hearing more voices and views from SGIM members on what a future technology-augmented general internal medicine practice, training, and overall mission would look like.

References

  1. ChatGPT: Optimizing language models for dialogue. OpenAI. https://openai.com/blog/chatgpt/. Published November 30, 2022. Accessed January 15, 2023.
  2. Gordon R. Large language models help decipher clinical notes. MIT News. https://news.mit.edu/2022/large-language-models-help-decipher-clinical-notes-1201. Published December 1, 2022. Accessed January 15, 2023.
  3. Agbavor F, Liang H. Predicting dementia from spontaneous speech using large language models. PLOS Digital Health. 2022;1(12):e0000168. https://doi.org/10.1371/journal.pdig.0000168. Published December 22, 2022. Accessed January 15, 2023.
