Post-purchase EULA modification

Post-purchase [[End-User License Agreement]] (EULA) modification, colloquially referred to as the ‘[[EULA Roofie]]’, is an increasingly common practice whereby the terms that govern a customer’s use of a product are modified after the customer’s purchase of the product. Such changes are frequently impossible for a customer to reject without either losing access to the product they paid for or losing substantial functionality. In some cases, no ‘reject’ option is given, other than to power off the product and never use it again, such as in the case of the Roku smart TV.{{Citation needed}}
Such a modification can work in the consumer’s favor (such as in the case of [[Valve]] [[Valve Removes Arbitration Requirement From Steam Subscriber Agreement|changing the terms]] of the [[Steam]] subscriber agreement to remove [[forced arbitration]]), or simply serve to clarify or correct specific terms in a way that is reasonable and does not adversely affect the consumer.


A problem exists, however, when such a change is made in order to reduce the rights, or increase the obligations, of the consumer.
In extreme cases, companies may take a ''lack'' of action as consent, as was the case in this incident [link to that sock company thing], where non-response to an email was considered by the company to be appropriate consent for a change to the EULA.


Particularly insidious examples of this practice include [[Adobe]]’s EULA changes, which [[Adobe's AI Policy|required users to accept]] the use of their art and media for the training of AI, or face the loss of access to Adobe products [https://wiki.rossmanngroup.com/wiki/Adobe%27s_AI_Policy]. It was felt by a number of prominent creative professionals that this amounted to a substantial changing of the ‘deal’ they were offered at the time of purchase, effectively amounting to the theft of their creative efforts. Many creative professionals are deeply entrenched in the Adobe ecosystem, and would suffer substantial financial harm if they were to stop using Adobe products, as the time taken to learn alternative tools would directly correlate to lost work and payment. Combined with Adobe’s practice of charging a premium for the privilege of early subscription cancellation, users who did not want their art used for AI training were unethically forced into choosing between their livelihood and their integrity.
 
Because of the nature of the agreements, legal professionals[who?] have argued that many cases of such contract changes are unenforceable when users have not been properly informed of the changes and the changes go beyond what would be expected in a typical contract of this type.{{Citation needed}} The reality for the average user, however, is that they cannot realistically challenge such a change, because of the costs involved with litigation, and instead must accept the poisoned choice they are offered: suck it up and deal with the new terms, or lose access to a product they paid for.


== '''Legislative action''' ==