To the editor: I had a visceral reaction to Anita Chabria's latest column ("The Pentagon is demanding to use Claude AI as it pleases. Claude told me that's 'dangerous,'" Feb. 26). It seems the U.S. has taken one more page out of Joseph Stalin's playbook on the road to dictatorship, but now with far more sophistication, thanks to artificial intelligence.
The key factor in the fall of the Romanov dynasty to the Bolsheviks was that angry minority group's recognition of the quintessential value of timing and chaos. Especially in that year of 1917, their carefully orchestrated communication across railroads and telegraph systems was swift and coordinated.
AI systems today (like Claude) appear to be the ultimate tool for controlling today's chaos and communication. Stalin didn't have AI, but he did have his own low-tech versions of surveillance: spies, the KGB, gulags, intimidation tactics, etc. And the U.S. does too: masked ICE agents, detention centers, tear gas.
The American people, if not our legislatures, must ensure that we have rules and regulations controlling the unbridled use of this powerful tool by powerful people.
Darlene Pienta, San Marcos
..
To the editor: If Dario Amodei, CEO of Anthropic, wants the Trump administration to understand his discomfort with President Trump's demand that the Department of Defense be allowed to use Anthropic's AI for "any lawful purpose" ("Anthropic refuses to bend to Pentagon on AI safeguards," March 3), then I suggest Amodei cite the quote often attributed to Ralph Waldo Emerson: "What you do speaks so loudly that I cannot hear what you say."
Amodei is wise to limit a license for the Department of Defense to specified purposes rather than broadly to "any lawful purpose." The reason is simple: Trump's track record, in his personal life, business career and role as president, clearly indicates that he cannot be trusted to act lawfully.
Trump recently confessed his belief that his presidential powers are restrained only by his own morality. Basically, Trump believes that his presidential actions cannot be restrained by the Constitution or any law, treaty, contractual commitment or other framework.
And it is precisely this flimsy moral compass that has led him, in his personal life, to become an adjudicated sexual abuser; in his business life, to be sued thousands of times, be adjudicated a fraudster and become a convicted felon; and in his political life, to be impeached twice (so far).
If you pair Trump's low regard for operating within any legal boundaries with the broad immunity that the Supreme Court granted him last year, then Amodei is right to worry that limiting the Department of Defense's use of its AI to "any lawful purpose" is too weak a compliance standard for a federal government led by Trump.
Amodei should be commended for his courage in walking away from a high-visibility, lucrative, consequential deal because it didn't comport with his company's mission.
Todd Piccus, Venice
..
To the editor: What should Anthropic do now? Go to Europe (or Canada), where it could operate more successfully, free of the heavy and puerile impositions of President Trump and Secretary of Defense Pete Hegseth.
Here's a company that prides itself on the ethical use of its products being coerced by our government into betraying that pride ("Trump orders federal agencies to stop using Anthropic's AI after clash with Pentagon," Feb. 27). What does this say about the ethical character of that same government?
Ken Johnson, Santa Barbara
..
To the editor: If all technology companies join Anthropic in saying that their products cannot be used for mass surveillance of Americans or in fully autonomous weapons operations, then Trump will have no choice but to rescind his order directing U.S. government agencies to stop using Anthropic's technology. In unity there is strength, even against a forceful leader.
Richie Locasso, Hemet
