The misconception of algorithms as “intelligent” entities is a dangerous one. – Webb Wright
Another attempt to use antique laws to ban abortion is foiled:
On July 2, the Wisconsin Supreme Court ruled th[at]…an 1849 law…[politici]ans [were trying to use to] ban…abortion from conception to birth [does no such thing]…in 2022, abortion providers in Wisconsin stopped providing services for fear of violating the 175-year-old law, which says that “any person, other than the mother, who intentionally destroys the life of the unborn child” is guilty of a felony and subject to 15 years imprisonment unless the life of the patient is at risk. The state’s…attorney general Josh Kaul filed a lawsuit arguing the law operates only as a feticide law…and does not ban “consensual abortions”…[in addition,] the 1849 law…has been implicitly repealed by later regulations, including a measure criminalizing abortion at 20 weeks…[the] court agreed with Kaul…Justice Jill Karofsky…described the effort to apply a 175-year-old law with almost no exceptions in the present day as a sign of a “world gone mad”…
[Trump hench]man Brendan Carr has dec[re]ed t[hat] prisons and jails [can] keep…price[-gouging the families of people locked in cag]es until at least 2027, delaying implementation of rate caps approved last year…Carr…[bloviated a lot of fascist nonsense justifying] the change…[but FCC] commissioner Anna Gomez [said]…”the FCC made the indefensible decision to ignore both the law and the will of Congress…[in order to] shield…a…system that inflates costs and rewards kickbacks to c[arcer]al facilities at the expense of [caged human beings] and their loved ones”…
Why would anybody waste good money to talk to computer programs pretending to be people?
One of the tech industry’s weirder cycles of brand reincarnation has given us yet another new version of Napster: a site teeming with [cartoons of] well-lit, photogenic people smiling confidently and talking with their hands…This new version of Napster surfaced on June 25 when a Florida company formerly known as Infinite Reality, which leveraged mysterious funding to buy the Napster brand for $207 million in March and then renamed itself after that purchase, announced “Napster Companions”…[with technobabble about] “thirty psychometric parameters that make each agent distinct”…Think of this as trying to put a [pseudo-]human face on [LLM] chatbots like ChatGPT or Claude…the company’s press release says…”If an agent does not already exist, the Napster Companion platform auto-generates a new one on the fly”…Access costs $19 a month or $219 a year…and…you must click a checkbox next to [a] disclaimer [wherein you claim to “understand” that chatbots are neither intelligent nor people and have a tendency to make shit up]…
A society can be judged by the way it treats its prisoners:
A [victim of the] federal [government] had to have one of his…limbs partially amputated after being kept in restraints for two days. Another [cag]ed person died after being pepper sprayed and left shackled in a restraint chair for five hours. The Department of Justice Office of Inspector General…published those details…after receiving dozens of [repor]ts a year from [prisoners who] were strapped to beds or chairs for long periods of time and assaulted…BOP policy allows [deranged screws] to [ab]use restraints [in virtually any way they please as long as they belch the magic word “]disruptive[” at some point before, during, or after the torture, and pinky-swear that it wasn’t]… a method of punishment…[no] documentation [or] video or audio [of the torture is required, not]…even…medical checks of [human beings left to suffer] in restraints…there [a]re no [time] limits…and [virtually no] review of the use of restraints…
When [large language] models were tested on simulated writing from [people foolish enough to think a word-association algorithm can practice medicine], they were more likely to advise against seeking medical care if the [make-believe] writer [supposedly] made typos, included emotional or uncertain language – or [was portrayed as] female…Abinitha Gourabathina at [MIT]…used [another LLM] to help create thousands of [make-believe] patient notes in different formats and styles…includ[ing LLM imitations of] patients with limited English proficiency…health anxiety…[overly-]emotional tone or gender-neutral pronouns. The researchers then fed the [LLM output] to four [other]…LLMs…commonly used to power chatbots and told the[m]…to answer questions about whether the [make-believe] patient should manage their [make-believe] condition at home or visit a clinic…the various format and style changes made all the [large language] models between 7 and 9 per cent more likely to recommend [make-believe] patients stay home instead of getting medical attention. The models were also more likely to recommend that [make-believe] female patients remain at home, and…more likely than [actual] clinicians to change their recommendations for treatments because of [supposed] gender and [feigned] language style…
One of the people the government empowers to police your sexuality:
A [Maryland cop named]…James Dodson Jr. was arrested by Pennsylvania State Police…[for] twice asking a friend [via Snapchat] to see her 11-year-old son naked in a picture or in person…he also admitted to [her that he] watch[es] child pornography…[after she reported him cops raided] Dodson’s [house] and [searched his] phones a[nd]…computers…[his boss hog used the opportunity to strut around and bloviate self-aggrandizing copaganda]…
A shortcut to LLM-induced psychosis:
…A growing number of [fools] are [mis]using…chatbots as “trip sitters”…this is a potentially dangerous psychological cocktail…while it’s far cheaper than in-person psychedelic therapy, it can go badly awry…Throngs of [nitwits] have turned to…chatbots in recent years as surrogates for human therapists…directly encouraged by some prominent figures in the tech industry, who [irresponsibly fantasize] that [word-association algorithms without consciousness] will [magically] revolutionize mental-health care…a profusion of chatbots designed specifically to help users navigate psychedelic experiences have been cropping up online…[but] experts…agree…[that] replacing human therapists with…bots during psychedelic experiences is a bad idea…[because] the basic design of large language models…is fundamentally at odds with the therapeutic process…
I find paywalls distasteful, and so many people find this blog valuable as a resource I just can’t bring myself to install one. Furthermore, I find ad delivery services (whose content I have no say over) even more distasteful. But as I’m now semi-retired from sex work, I can’t self-sponsor this blog by myself any longer. So if you value my writing enough that you would pay to see it if it were paywalled, please consider subscribing; there are four different levels to fit all budgets. Or if that doesn’t work for you, please consider showing your generosity with a one-time donation; you can Paypal to maggiemcneill@earthlink.net or else email me at the same address to make other arrangements. Thanks so much!