October 2024

Why it's not a good idea to let ChatGPT write an article (-Facts +NotebookLM from Google!)

The ChatGPT logo

ChatGPT is a very impressive tool for text production and AI-supported responses. However, as soon as it comes to writing longer, high-quality articles, weaknesses become apparent that make it unsuitable for more complex texts. A critical look at why the use of AI in journalism needs to be well thought out.

ChatGPT and other AIs undoubtedly offer many possibilities and are particularly helpful when information is needed quickly. Nevertheless, there are clear reasons why having an AI write sophisticated articles can be problematic and why human authors are often the better choice for important topics. A key issue is the lack of depth and context required for a well-researched, readable article. ChatGPT draws on vast amounts of data, but it doesn't understand content in a human sense. As a result, it often produces superficial text and wraps complex issues in simplified or even inaccurate statements.

Another point concerns the AI's lack of creativity. While it can reproduce familiar formulations and patterns well, it lacks the ability to develop new, creative thoughts and perspectives. Human authors can make innovative connections, incorporate original metaphors or convince with their own style. All of this remains a challenge with ChatGPT, as the AI only draws on existing texts and language patterns.

Furthermore, quality assurance remains a problem. Even if ChatGPT provides apparently correct information, there is still a risk that details are missing or misinterpreted. This is particularly critical for sensitive topics such as politics, medicine or law. Human authors have the ability to scrutinize sources, verify information and question statements - something that AI can only do with difficulty.

Ethical issues also play a role. Working with AIs could affect copyright or violate personal rights, as ChatGPT accesses data that may not be intended for these purposes. Finally, there is the aspect of responsibility: who takes responsibility for incorrect information if the article was created by an AI? This remains an open question, and responsibility towards the public is particularly important in journalistic work.

To summarize: ChatGPT is a valuable tool for information research and simple texts, but for more demanding texts, humans should continue to play the leading role.

PS: ChatGPT wrote this article based on the following prompt: "Write an article with a headline, teaser text and about 2000 characters of text about why it's not a good idea to let ChatGPT write an article."

How did we come up with this?

How creative the idea was remains debatable. It arose from a simple test: we asked ChatGPT whether the tool knows us - HENKELHIEDL. The answer was surprising.

Yes, there really is a digital agency in Berlin with the name Henkelhiedl - maybe you meant them. This agency has been working successfully in the fields of design, development and strategy for many years. It was founded by Andreas Henkel and David Hiedl and is known for creative digital concepts, among other things. Henkelhiedl focuses on individual, tailor-made solutions and has already worked with many well-known clients from various industries.

David Hiedl? Just missed the mark. Perhaps the name Bärbl is simply not yet known at OpenAI. Things look bad for Uwe (the author of these lines) too. (Except for "independent", of course - that part is true.)

The digital agency HENKELHIEDL is currently owned by its founders and will continue to be managed independently.

In the end, the truth can be found elsewhere, in this case for example at www.henkelhiedl.com/about-us. Or in the commercial register. Or in the good old search engine. Speaking of ...

Google NotebookLM

But now to the real star of this article, Google NotebookLM.

Google NotebookLM is an AI-powered note-taking and research assistant that was originally introduced as "Project Tailwind" in 2023. NotebookLM is based on Google's Gemini model and is designed to assist users by analyzing and summarizing uploaded documents, such as text files, PDFs and Google Docs. It generates detailed insights, making it particularly useful for students, researchers and teams. In addition to text summaries, NotebookLM now also offers audio overviews that present the content of documents in an AI-generated, conversational format, similar to a podcast. (Info from ChatGPT, translated with DeepL)

And what it spits out as a podcast is actually very impressive, at least in English - created from a PDF of the "article" about ChatGPT above.

The German version was enforced by an additional prompt with the following wording:
"This episode will be broadcast exclusively in German. All discussions, interviews and commentary must be conducted in German for the entire duration of the episode. No English or other languages should be used in conversation unless absolutely necessary to clarify a term or concept that is unique to a particular language."
