[Lex Computer & Tech Group/LCTG] some very disturbing AI behavior
Harry Forsdick
forsdick at gmail.com
Sun Feb 9 08:49:11 PST 2025
I understand that one rule (providing guardrails to prevent answers that enable
suicide) can't solve all problems. But I also understand that trying to
solve widespread problems affecting many people (deepfakes influencing
national elections) can be very difficult. As Bob points out, we do have
laws against a person encouraging another person to commit suicide, so why
not generalize those to chatbots? We might add the many other laws that
punish providing information society has deemed harmful to others.
Again, the reason I feel pretty strongly about this is that I don't want us
to get to the point where this gets out of control and people start saying,
"Let's ban the use of all AI LLMs."
-- Harry
Harry Forsdick <http://www.forsdick.com/resume/>
Town Meeting Member Precinct 7 <http://lexingtontmma.org/>
harry at forsdick.com
<https://mail.google.com/mail/?view=cm&fs=1&tf=1&to=harry@forsdick.com>
www.forsdick.com
46 Burlington St.
Lexington, MA 02420 <https://goo.gl/xZXT2F>
(781) 799-6002 (mobile) <callto:17817996002>
On Sun, Feb 9, 2025 at 11:35 AM John Rudy via LCTG <lctg at lists.toku.us>
wrote:
> I posted it because I was unaware of how far this disturbing trend has gone.
> Your legal thoughts seem appropriate, though it is always hard to know how
> far the controls should reach. Deepfakes, internet fact-checking, etc.,
> are examples where just as much (or more) harm can be done. The
> suggestion to commit suicide affects only that one person and their close
> colleagues/relatives. A deepfake can affect millions, and countries are
> applying them to inter-country (and intra-country) warfare.
>
>
>
> John Rudy
>
>
>
> 781-861-0402
>
> 781-718-8334 cell
>
> 13 Hawthorne Lane
>
> Bedford MA
>
> jjrudy1 at comcast.net
>
>
>
> *From:* Robert Primak <bobprimak at yahoo.com>
> *Sent:* Sunday, February 9, 2025 11:07 AM
> *To:* Lex Computer Group <lctg at lists.toku.us>; jjrudy1 at comcast.net
> *Cc:* Jim Barron <barron at barronaw.com>
> *Subject:* Re: [Lex Computer & Tech Group/LCTG] some very disturbing AI
> behavior
>
>
>
> The courts have ruled that if a human did this, it would be a crime in the
> US. At least as much of a guardrail should apply to AI "companions". The
> AI's creators and/or maintainers are responsible for providing adequate
> guardrails. This should be codified into laws and regulations.
>
>
>
> One publication went so far as to refer to the AI as the man's
> "girlfriend". While no one can regulate whom or what people choose as
> their companions, it is clear that this AI was being used (and provided)
> unethically.
>
>
>
> Ultimately, we must protect ourselves. There is still a component of
> personal responsibility, and attachment to inanimate objects or computer-
> generated nonentities stretches my credulity about what strangeness the
> human mind is capable of, regarding both those who attach and those who
> provide these "things" to attach to.
>
>
>
> There is still no substitute for the human element in relationships.
> Nursing homes are often less than ideal from this standpoint. But allowing
> socially isolated people (including many older Americans) to be attended to
> by nonhuman (inhuman?) "companions" is a huge public health cop-out. This
> trend needs to be stopped before it gains momentum. No one should be left
> to live alone unless they truly want it that way. Maybe not even then.
>
>
>
> -- Bob Primak
>
>
>
>
>
> On Sunday, February 9, 2025 at 09:39:22 AM EST, John Rudy via LCTG <
> lctg at lists.toku.us> wrote:
>
>
>
>
>
> An AI chatbot told a user how to kill himself—but the company doesn’t want
> to “censor” it | MIT Technology Review
> <https://www.technologyreview.com/2025/02/06/1111077/nomi-ai-chatbot-told-user-to-kill-himself/?utm_source=engagement_email&utm_medium=email&utm_campaign=wklysun&utm_term=02.09.25.subs_eng&utm_content=TR10_25LIVE_ACQ&mc_cid=67021e1372&mc_eid=0e9040b93d>
>
>
>
> This is a recent article from MIT's Technology Review, and it reveals AI
> without adequate guardrails. Of course, such guardrails are not easy to
> set. Though I haven't personally used any of them, there are an increasing
> number of systems designed to be "companions" that you can chat with about
> a myriad of subjects.
>
>
>
> John
>
>
>
> John Rudy
>
>
>
> 781-861-0402
>
> 781-718-8334 cell
>
> 13 Hawthorne Lane
>
> Bedford MA
>
> jjrudy1 at comcast.net
>
>
>
> ===============================================
> ::The Lexington Computer and Technology Group Mailing List::
> Reply goes to sender only; Reply All to send to list.
> Send to the list: LCTG at lists.toku.us Message archives:
> http://lists.toku.us/pipermail/lctg-toku.us/
> To subscribe: email lctg-subscribe at toku.us To unsubscribe: email
> lctg-unsubscribe at toku.us
> Future and Past meeting information: http://LCTG.toku.us
> List information: http://lists.toku.us/listinfo.cgi/lctg-toku.us
> This message was sent to bobprimak at yahoo.com.
> Set your list options:
> http://lists.toku.us/options.cgi/lctg-toku.us/bobprimak@yahoo.com
> ===============================================
> ::The Lexington Computer and Technology Group Mailing List::
> Reply goes to sender only; Reply All to send to list.
> Send to the list: LCTG at lists.toku.us Message archives:
> http://lists.toku.us/pipermail/lctg-toku.us/
> To subscribe: email lctg-subscribe at toku.us To unsubscribe: email
> lctg-unsubscribe at toku.us
> Future and Past meeting information: http://LCTG.toku.us
> List information: http://lists.toku.us/listinfo.cgi/lctg-toku.us
> This message was sent to forsdick at gmail.com.
> Set your list options:
> http://lists.toku.us/options.cgi/lctg-toku.us/forsdick@gmail.com
>