A ‘guest editorial’ … generated by ChatGPT?

The artificial intelligence (AI) program ChatGPT (generative pre-trained transformer) was introduced to the public by OpenAI on November 30, 2022. It has been described as a game-changer for society, culture, literature, the arts, and science, with a potential impact that rivals that of the internet and social media.

ChatGPT can produce human-like essays in response to queries. Some researchers used ChatGPT to generate abstracts based on the titles of articles published in 5 high-impact journals. These abstracts were so realistic that they escaped detection by expert peer reviewers in 32% of cases.1 In addition, several medical and scientific articles have been published with ChatGPT as a bylined author.2

Like other journals,3-7 Current Psychiatry is drafting guidelines regarding authors’ use of AI-generated content (see www.mdedge.com/currentpsychiatry/page/submission-guidelines). Although these guidelines remained a work in progress as we went to press in mid-March 2023, Current Psychiatry’s policy, like those of other journals, will be predicated on 4 principles:

  • Nonhuman AI technologies do not qualify for named authorship.
  • Authors should be transparent about the use of AI technologies to aid in writing content, disclosing which tool was used and how it was applied in the article submitted to the journal.
  • Human authors are responsible for the accuracy of work compiled by an AI tool, including the absence of plagiarism. Proper referencing of statements is necessary.
  • Images created by AI tools are not permitted for publication.

Because the overwhelming majority of articles published in Current Psychiatry are submitted by invitation, the use of AI will be discussed with authors before an article is written.

Can AI generate an acceptable ‘guest editorial’?

Among the concerns about using AI in scientific publishing are that, in addition to creating formulaic writing, it might produce results that are outdated, incorrect, or lacking appropriate references, and that it can generate “plausible-sounding but incorrect or nonsensical” answers to questions.3 To test this, I asked ChatGPT to write an article on the same topic as 4 of my articles published in Current Psychiatry so that readers can discern the differences (see “For artificial intelligence, the future is finally here”). I also asked ChatGPT about “myths surrounding psychiatry” and then used the result as a “guest editorial.” The Box is what it produced verbatim, without any copyediting, fact-checking, or addition of reference citations.

Box

Myths surrounding psychiatry

There are many myths and misconceptions surrounding psychiatry, the branch of medicine that deals with the diagnosis, treatment, and prevention of mental illness. Some of the most common myths include:

1. Psychiatry is not a real medical specialty: This is a false myth. Psychiatry is a recognized medical specialty, and psychiatrists are licensed medical doctors who have undergone extensive training in the diagnosis and treatment of mental illness.

2. Mental illness is not a real illness: Mental illness is a real and biologically based medical condition, just like physical illness. Research has shown that mental illness is caused by a combination of genetic, environmental, and lifestyle factors.

3. Antidepressants are not effective: Antidepressants have been shown to be effective in treating depression and other mental illnesses. While they are not a cure, they can provide significant symptom relief and improve quality of life for many people.

4. Psychiatric drugs are dangerous and have serious side effects: While all drugs have potential side effects, psychiatric medications have been proven to be safe and effective for many people. The risks and benefits of taking any medication should be carefully considered and discussed with a healthcare professional.

5. Therapy is not effective: Therapy, also known as psychotherapy, has been shown to be effective in treating a wide range of mental illnesses, including depression, anxiety, and post-traumatic stress disorder (PTSD).

6. People with mental illness are violent: People with mental illness are no more likely to be violent than the general population. In fact, people with mental illness are more likely to be victims of violence than perpetrators.

It is important to dispel these myths and misconceptions about psychiatry, as they can prevent people from seeking the help they need and can also contribute to stigma and discrimination towards those with mental illness.

In conclusion, there are many myths and misconceptions surrounding psychiatry. Dispelling these myths and increasing public understanding about mental illness and its treatment can help reduce stigma and encourage more people to seek the help they need.
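
For readers who would like to run a similar query programmatically rather than through the ChatGPT interface, the following is a minimal sketch using the OpenAI Python client as it existed in early 2023. The model name, prompt wording, and API key placeholder are illustrative assumptions; this is not necessarily how the text in the Box was generated.

    # Minimal sketch: issuing a ChatGPT-style query via OpenAI's API
    # (pre-1.0 openai Python library, current as of early 2023).
    # Model name, prompt, and key are illustrative assumptions.
    import openai  # pip install openai

    openai.api_key = "YOUR_API_KEY"  # hypothetical placeholder

    response = openai.ChatCompletion.create(
        model="gpt-3.5-turbo",  # the model behind ChatGPT at the time
        messages=[
            {
                "role": "user",
                "content": "Write a short editorial on myths surrounding psychiatry.",
            }
        ],
    )

    # The generated text arrives in the first choice's message content
    print(response["choices"][0]["message"]["content"])

Running such a script returns unedited model output much like the Box above; whether that output would hold up to the fact-checking and referencing standards of a peer-reviewed journal is precisely the question at issue.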

What do you think? I encourage you to send me your take on this AI-generated “guest editorial,” and whether you deem its quality to be similar to that of an article authored by a human psychiatrist.

References

1. Else H. Abstracts written by ChatGPT fool scientists. Nature. 2023;613(7944):423. doi:10.1038/d41586-023-00056-7

2. Stokel-Walker C. ChatGPT listed as author on research papers: many scientists disapprove. Nature. 2023;613(7945):620-621. doi:10.1038/d41586-023-00107-z

3. Flanagin A, Bibbins-Domingo K, Berkwits M, et al. Nonhuman “authors” and implications for the integrity of scientific publication and medical knowledge. JAMA. 2023;329(8):637-639. doi:10.1001/jama.2023.1344

4. Tools such as ChatGPT threaten transparent science; here are our ground rules for their use. Nature. 2023;613(7945):612. doi:10.1038/d41586-023-00191-1

5. Thorp HH. ChatGPT is fun, but not an author. Science. 2023;379(6630):313. doi:10.1126/science.adg7879

6. PNAS. The PNAS journals outline their policies for ChatGPT and generative AI. February 21, 2023. Accessed March 9, 2023. https://www.pnas.org/post/update/pnas-policy-for-chatgpt-generative-ai

7. Marušić A. JoGH policy on the use of artificial intelligence in scholarly manuscripts. J Glob Health. 2023;13:01002. doi:10.7189/jogh.13.01002

Author and Disclosure Information

Henry A. Nasrallah, MD, DLFAPA
Editor-in-Chief
