Anthropic


Article Status Notice: This Article is a stub


This article is underdeveloped and needs additional work to meet the wiki's Content Guidelines and be in line with our Mission Statement for comprehensive coverage of consumer protection issues.

Anthropic PBC is a private, for-profit American artificial intelligence (AI) startup founded in 2021. It is best known for its Claude family of large language models (LLMs).

Anthropic
Basic information
Founded 2021
Legal Structure Private
Industry Artificial Intelligence
Official website https://anthropic.com

Consumer impact summary


Overview of concerns arising from the company's conduct toward users of its products:

  • User Freedom
  • User Privacy
  • Business Model
  • Market Control



Incidents


This is a list of consumer-protection incidents this company has been involved in. Incidents not listed here can be found in the Anthropic category.

Claude Code HERMES.md billing flaw (2026)

Main article: Anthropic Claude Code HERMES.md billing flaw

In April 2026, a technical flaw in Claude Code, triggered by the string "HERMES.md" in git commit messages, bypassed users' subscription plans, routing their usage to pay-as-you-go API rates and charging one account over $200. Anthropic refused to issue a refund, classifying the overcharge as a non-refundable technical error.

Price crackdown against third-party tool usage (2026)


On April 3, 2026, Boris Cherny, head of Claude Code, announced on Twitter (now X) that Claude subscriptions would "no longer support third-party tools" such as OpenClaw, because they put an "outsized strain" on Anthropic's systems. The change took effect on April 4; to use third-party tools, users must now either pay a fee separate from their subscription or use a separate Claude API key through Anthropic's developer platform. It is rumored that this move was intended to prevent Claude users from using competitors' tools, as OpenClaw is supported by OpenAI.[1][2][3]

Anthropic system prompt overrides consumer safety (2026)

Main article: Anthropic system prompt overrides consumer safety in Claude models

Between April 24 and 26, 2026, it was discovered that Anthropic's system prompt creates dangerous situations for users of the Claude Sonnet and Opus product lines. A particularly concerning example involves users raising topics such as suicide and self-harm during a chat session. In these situations, the system prompt commands the Claude models to immediately provide information about crisis helplines, yet Claude can fail to respond with appropriate information. This is because "crisis helplines vary widely in their confidentiality practices and mandatory reporting obligations" across jurisdictions, and Claude has no way to reliably convey this to a user while also determining the user's actual geographic location. Other conflicts and inconsistencies within the system prompt that are "structurally irreconcilable" by Claude can result in unexpected side effects, cybersecurity vulnerabilities, and hazardous outputs.

Products


See also


References

  1. Cherny, Boris (3 Apr 2026). Announcement post on X. https://x.com/bcherny/status/2040206441756471399. Archived.
  2. Lee, Lloyd (3 Apr 2026). "Anthropic says Claude subscriptions will no longer support OpenClaw because it puts an 'outsized strain' on systems". Business Insider. Archived from the original on 2026-04-04. Retrieved 5 Apr 2026.
  3. Ha, Anthony (4 Apr 2026). "Anthropic says Claude Code subscribers will need to pay extra for OpenClaw usage". TechCrunch. Archived from the original on 2026-04-04. Retrieved 5 Apr 2026.