Friend app

From Consumer Rights Wiki
{{ProductCargo
|InProduction=Yes
|Logo=FriendAI.png
|ReleaseYear=2025
|Category=AI chatbot, wearable
|ArticleType=Service
|Website=https://friend.com/?page=hardware
|Description=“Friend” is a wearable AI pendant device that claims to provide conversational companionship, raising concerns about continuous recording and privacy liability.
}}
{{Ph-C-Int}}
 
'''Friend''' is a wearable [[artificial intelligence]] (AI) pendant device that aims to provide conversation and companionship.


==Consumer-impact summary==
{{Ph-C-CIS}}
Users of the Friend device may be subject to continuous recording of video, audio, and biometric data. The company’s policies shift liability for recordings involving bystanders onto users. The device’s data retention, model training, and deletion practices raise questions under evolving biometric data privacy laws such as Illinois’ BIPA and various state and federal frameworks.<ref name="FriendPPv2">{{cite web |url=https://www.friend.com/privacy.pdf |title=Friend – Privacy Policy v2 |website=Friend |date=14 Jun 2025 |access-date=1 Oct 2025 |archive-url=https://web.archive.org/web/20251001000000/https://www.friend.com/privacy.pdf |archive-date=1 Oct 2025}}</ref>
 
The device is sold for a one-time payment with no subscription. It is unclear how the company will fund ongoing service without recurring user revenue.
 
==Capabilities and data practices==
Friend’s official privacy policy (v2) states that it serves as a “data controller” under GDPR and collects personal data from use of the services.<ref name="FriendPPv2" />
 
Key practices include:
*Collection of name, phone, e-mail, account credentials, uploaded content, and ambient video/audio from device surroundings.<ref name="FriendPPv2" />
*Collection of biometric data including facial and voice recognition.<ref name="FriendPPv2" />
*Retention of data for over five years if "legal, regulatory, or technical reasons" prevent deletion.<ref name="FriendPPv2" />
*Use of personal data to train machine learning models powering Friend.<ref name="FriendPPv2" />


==Incidents==
{{Ph-C-Inc}}
Media reporting highlights possible discrepancies:
*NBC Bay Area reported the company claims it "will not store any audio recordings."<ref name="NBC">{{cite web |url=https://www.nbcbayarea.com/news/tech/friend-com-launches-product-preview/3611528/ |title=Meet the AI startup that wants to give you a friend |website=NBC Bay Area |date=1 Feb 2025 |access-date=1 Oct 2025 |archive-url=https://web.archive.org/web/20251001000000/https://www.nbcbayarea.com/news/tech/friend-com-launches-product-preview/3611528/ |archive-date=1 Oct 2025}}</ref>
*Wired reported the pendant is always on and continuously listening for conversation.<ref name="Wired">{{cite web |url=https://www.wired.com/story/friend-ai-pendant/ |title=Wear This AI Friend Around Your Neck |website=Wired |date=15 Mar 2025 |access-date=1 Oct 2025 |archive-url=https://web.archive.org/web/20251001000000/https://www.wired.com/story/friend-ai-pendant/ |archive-date=1 Oct 2025}}</ref>


==Legal and regulatory environment==
The Illinois Biometric Information Privacy Act (BIPA) requires entities to obtain written consent before collecting biometric identifiers, provide retention schedules, and limit data storage.<ref>{{cite web |url=https://www.ilga.gov/legislation/publicacts/fulltext.asp?Name=095-0994 |title=Biometric Information Privacy Act |website=Illinois General Assembly |date=2008 |access-date=1 Oct 2025 |archive-url=https://web.archive.org/web/20251001000000/https://www.ilga.gov/legislation/publicacts/fulltext.asp?Name=095-0994 |archive-date=1 Oct 2025}}</ref> The Illinois Supreme Court's 2019 ''Rosenbach'' decision expanded standing for consumers to sue.<ref>{{cite web |url=https://www.americanbar.org/groups/antitrust_law/resources/source/2025-june/biometric-privacy-litigation/ |title=The Who, Why, and Where of Biometric Privacy Litigation |website=American Bar Association |date=1 Jun 2025 |access-date=1 Oct 2025 |archive-url=https://web.archive.org/web/20251001000000/https://www.americanbar.org/groups/antitrust_law/resources/source/2025-june/biometric-privacy-litigation/ |archive-date=1 Oct 2025}}</ref>


Analysts warn that devices like Friend raise risks under BIPA and similar laws, especially when data is used for AI training.<ref>{{cite web |url=https://www.lexisnexis.com/pdf/practical-guidance/ai/biometric-privacy-and-ai-legal-dev.pdf |title=Biometric Privacy and AI Legal Developments |website=LexisNexis |date=2025 |access-date=1 Oct 2025 |archive-url=https://web.archive.org/web/20251001000000/https://www.lexisnexis.com/pdf/practical-guidance/ai/biometric-privacy-and-ai-legal-dev.pdf |archive-date=1 Oct 2025}}</ref> Federal and state regulators increasingly classify biometric identifiers as sensitive data requiring special protection.<ref name="KPMG">{{cite web |url=https://kpmg.com/us/en/articles/2025/ai-and-privacy-a-look-at-biometric-tech-and-data-reg-alert.html |title=AI and Privacy: A Look at Biometric Tech & Data |website=KPMG |date=2025 |access-date=1 Oct 2025 |archive-url=https://web.archive.org/web/20251001000000/https://kpmg.com/us/en/articles/2025/ai-and-privacy-a-look-at-biometric-tech-and-data-reg-alert.html |archive-date=1 Oct 2025}}</ref>
===Precedents and industry comparisons===
The [[Clearview AI]] litigation demonstrates how courts treat unauthorized biometric data collection. In 2025, Clearview agreed to a US$51.75 million settlement resolving claims under BIPA.<ref>{{cite web |url=https://www.regulatoryoversight.com/2025/04/51-75m-settlement-in-clearview-ai-biometric-privacy-litigation-illustrates-creative-resolution-for-startups-facing-parallel-litigation-and-enforcement-action/ |title=$51.75M Settlement in Clearview AI Biometric Privacy Litigation |website=Regulatory Oversight |date=12 Apr 2025 |access-date=1 Oct 2025 |archive-url=https://web.archive.org/web/20251001000000/https://www.regulatoryoversight.com/2025/04/51-75m-settlement-in-clearview-ai-biometric-privacy-litigation-illustrates-creative-resolution-for-startups-facing-parallel-litigation-and-enforcement-action/ |archive-date=1 Oct 2025}}</ref> 
Competitor devices, such as the Limitless pendant, advertise a more limited data policy that does not permit ambient biometric capture.<ref>{{cite web |url=https://www.limitless.ai/privacy |title=Privacy – Limitless |website=Limitless.ai |date=2025 |access-date=1 Oct 2025 |archive-url=https://web.archive.org/web/20251001000000/https://www.limitless.ai/privacy |archive-date=1 Oct 2025}}</ref>
==Company responses and disputes==
The company asserts that:
*Friend does not store audio recordings permanently.<ref name="NBC" />
*Data deletion requests can be submitted, though model training may make deletion incomplete.<ref name="FriendPPv2" />
Operational issues have been reported; the company delayed initial shipments announced for Q1 2025 until at least Q3 2025.<ref name="TechCrunch">{{cite web |url=https://techcrunch.com/2025/01/20/friend-delays-shipments-of-its-ai-companion-pendant/ |title=Friend delays shipments of its AI companion pendant |website=TechCrunch |date=20 Jan 2025 |access-date=1 Oct 2025 |archive-url=https://web.archive.org/web/20251001000000/https://techcrunch.com/2025/01/20/friend-delays-shipments-of-its-ai-companion-pendant/ |archive-date=1 Oct 2025}}</ref>
==Consumer response==
Consumer advocates and analysts have identified key risks for consumers:
*Constant ambient recording of private environments.<ref name="Wired" />
*Shifting liability to users for privacy violations of third parties.<ref name="FriendPPv2" />
*Biometric sensitivity: face and voice identifiers are permanent.<ref name="KPMG" />
*Risk that deletion requests are ineffective once data is used to train AI models.<ref name="FriendPPv2" />
===Shipment delay (January 2025)===
Friend announced initial shipments to pre-order customers in Q1 2025 but later delayed deliveries until Q3 2025.<ref name="TechCrunch" />


==See also==
{{Ph-C-SA}}
*[[Biometric Information Privacy Act]]
*[[Clearview AI]]


==References==
{{Reflist}}


[[Category:Friend AI Pendant]]
[[Category:Consumer electronics]]
[[Category:Privacy controversies]]
