Implementing AI
May 23, 2023 ● By Kara Schweiss
Photo by Bill Sitzmann.
The term “generative AI” is getting a lot of buzz lately, especially following research company OpenAI’s launches of DALL-E in 2021 and ChatGPT in late 2022. In simple terms, ChatGPT (GPT stands for generative pre-trained transformer) is a newer generation of chatbot that can create not only responses to questions, but also more complex writing such as poems and essays. Similarly, DALL-E is among the newest generative image creators.
Artificial intelligence (AI) itself is far from new. The term’s first use is widely attributed to a proposal written by computer scientists for a 1956 academic workshop at Dartmouth College. Chatbots—programs that simulate human conversation—have also been around for years, as anyone who’s used a company’s pop-up customer service chat can attest, and voice recognition software is part of everything from satellite navigation devices to cellphones to virtual assistants.
“Most of us have probably seen by now apps that can take a photo of you and make you look 40 years older or painted like the Mona Lisa. Those are generative-AI art apps,” said Raj Lulla, principal brand strategist for Fruitful Design. “ChatGPT does the same kind of thing, but with writing. You can ask it to summarize a book in 500 words or write an ad selling cars in the style of William Shakespeare. However, it’s best to think of ChatGPT as a search engine that you can chat with. Instead of giving you a list of links, it compiles information for you in writing.”
“Using this basic platform, I can tell you that the legal profession has long used databases with search features within LexisNexis and Westlaw,” said attorney Thomas Locher, who is associated with Locher, Pavelka, Dostal, Braddy & Hammes LLC. “All of the filing that we do now in terms of litigation is done electronically; so, in that sense, much of the searching that is done is by natural language. Some of these features have been around for a long time.”
The newest incarnation of generative AI can be useful in many professions, said Katie LeDoux, founder and executive director of Sunflower Grant Writers.
“It could be a great way to get some really good general information about an organization and what they’re doing,” she said. “Think of it as…a great start.”
Lulla agreed. “For now, ChatGPT should be used behind the scenes,” he said. “ChatGPT can get you to a solid rough draft of emails, job descriptions, blog posts, and more.”
Locher pointed out that generative AI can’t build from a vacuum of information.
“It relies on the data that is accumulated predominantly from various sources, including the internet,” Locher said. “There are humans involved… I think it can be helpful in automating or shortening certain tasks, particularly repetitive tasks, such things as routine correspondence.”
However, he added, “I think it’s a long ways from being efficacious in those areas.”
AI-generated content has clear shortcomings. For instance, it doesn’t consider a person or organization’s unique “voice,” typically making its use transparent to teachers and professors grading papers or to supervisors and stakeholders reviewing business reports.
“I would say it’s a sort of generic kind of writing,” said LeDoux, adding that its application is limited in her field. “We try to approach things in a really personal way for everyone; each client has a unique way of wanting to tell their story and how they want things presented.”
Lulla said AI-generated content tends to be error-prone and poor at making fine distinctions.
“Tech reviewer Marques Brownlee asked ChatGPT to write a review of the iPhone 14 for him, and it included information about the iPhone 12 instead,” he said. “Asking ChatGPT to analyze the symptoms of an infection might either leave you convinced you have cancer or it could completely miss that you have COVID… Think of ChatGPT like a search engine; don’t trust it any more than you would trust a random link on Google.”
“My own anecdotal personal experimentation is such that, although it can be helpful in some areas, it is certainly not reliable,” Locher said. “It does have a propensity to give you dated or inappropriate information on legal topics. It even has given me a case law citation to a case that does not exist…In that respect, it’s very concerning.”
The ethical and legal ramifications of AI-created content are still ambiguous, he added, citing concerns such as the inability to guarantee that information is current, questionable confidentiality, and a lack of accountability compared to human content producers.
“I asked Google’s AI, Bard, if I had to attribute its content. Bard replied, ‘If you share my content, you should attribute it to me by including my name and the source of the content... By attributing my content, you are helping to ensure that I am properly recognized for my work,’” Lulla said. “Bard’s response raises a number of questions—Do I legally have to credit my source, or is that just the preference of Google’s marketing team? Does it depend on the type of content I’m writing, i.e., an email versus a scientific paper? Is Google using Bard to argue that AI has similar rights as human creators?”
“Will it replace writers? Well, I don’t think so,” Locher said. “I don’t think it’s going to replace lawyers, either.”
“I don’t think it’s ever going to replace that human connection,” LeDoux said. “We’ll just have to see how it goes in the next few years.”