The "online retailer" (IABMFG) in this case is based in California.
A company in California, selling to a customer in California, shouldn't be able to say "California law doesn't apply because my payment processor is Canadian." And if Shopify wants to take a cut of every sale from retailers based in California, they should be willing to comply with California law as well, at least insofar as it applies to the services provided via those California-based retailers' web sites.
(The actual opinion is linked at the bottom of the submission; I humbly suggest folks commenting here should read it first: https://cdn.ca9.uscourts.gov/datastore/opinions/2025/04/21/2...)
To be clear, this isn't a choice-of-law case. It's not about whether California law applies. It's about whether a court in California has jurisdiction; that is, whether it can hear the case at all.
Could a tort claim under California state law be heard in any other court (assuming no accompanying Federal claims)?
Yes. Choice-of-law terms are frequently found in contracts. The court with jurisdiction will do its best to refer to and interpret the existing law of the state in question as it applies to the case.
(I deleted my original reply, which was maybe conflating several different arguments. I'll try to restate my point more succinctly. IANAL, so it's quite possible I'm misunderstanding the legal issues at play.)
The linked opinion's discussion of jurisdiction seems to be about whether the case can be heard in any court, not if there's an alternate venue that would be more proper. FWIW, the retailer's web site[0] does not seem to contain a terms of service or anything with a choice-of-law provision. If the argument is that no court has jurisdiction in which the plaintiff can seek redress, that seems equivalent to saying that the law doesn't apply to the defendants.
Shopify already exercises a huge amount of discretion as to which businesses it's willing to provide services for, and how much it charges them. It does not seem unreasonable to me that the company would either a) be willing to answer to California courts, or b) stop selling its services to California retailers.
[0]: https://www.iambecoming.com/pages/who-we-are
> And if Shopify wants to take a cut of every sale from retailers based in California, they should be willing to comply with California law as well,
For the moment, for the purpose of consumer protections, fine. But on longer time scales, I'm not sure. Does it really make sense for legacy states to be able to bind transactions on the internet? Doesn't that just make it a very large intranet?
Obviously information refuses to be stopped by borders. Are we going to have a situation where states of various sizes try to claim jurisdiction, but only those with sufficiently violent tendencies (and thus the ability to achieve compliance by fear) succeed? Won't that corrupt the internet even worse than an absence of state-based consumer protections?
If two people who live 500 miles apart in the area currently claimed by the State of California do not recognize the State of California, and regard themselves as citizens of the internet, who is right: them, or the government of California?
Most of us will probably say that there is some social contract by which, for better or worse, the State of California is right.
But what if, in 100 years, California goes bankrupt? Does that change the calculus? If so, why? And does it change retroactively, for the purposes of historical classification of internet transactions? The diplomatic and economic affairs of state don't change the operation of internet protocols. It's hard to even fully imagine how to create an internet whose shapes are coterminous with the boundaries asserted by various states.
I'm broadly skeptical of any judicial rulings which extend the laws of the legacy states onto the internet, even if they appear to be on the side of short-term justice. This whole thing is starting to feel like a bandaid better ripped off quickly.
[dead]
This ruling is correct, and good.
> If a company develops a web-based business for the purpose of conducting online transactions in all 50 States, it should not be surprised that it may be sued in any State for unlawful transactions that may occur within that State.
Obviously. But the author calls this "chilling". Without this, companies could circumvent state laws, to conduct actions that are illegal in that state within that state, simply by headquartering or hosting in another. That would be absurd.
It would create a race-to-the-bottom of consumer rights, where states wanting business tax revenue are incentivized to make their states surveillance/ data harvesting/ consumer exploitation havens, whose businesses could then operate across all other states freely.
I suspect this has something to do with "Shop Pay", Shopify's own payment system used on most (all?) Shopify stores. It enables you to have saved payment information for any Shopify store you come across, facilitating one-click checkout even if you have never shopped on that particular brand/website before. Webshop operators love it because it is very good at fraud detection (due to the pooled data on the backend), and removes barriers at checkout (digging out your wallet, filling out an address form, etc.). As far as I'm aware, it's optional on the Shopify platform. Using Shop Pay for payment is optional on the consumer level.
I suspect Shopify's terms inform their customers (webshop operators) that they are responsible for disclosure, etc., and for being compliant with state privacy laws. However, since the majority of web shops are exempt (due to small size, revenue, etc.), these shops did not (knowingly or otherwise) publish these terms. That's just speculation on my part...
If this is true, I find this case troubling and weak, and hope it is overturned. It is squarely on the shop operator to be compliant - Shopify is just a platform vendor and shoppers are not Shopify customers; rather, they are customers of the shop. This seems to be akin to suing Google because a website uses Google Analytics but didn't disclose it in their privacy statement - silly...
This particular case gives me ADA and Prop65 vibes... lots of bottom-feeding lawyers using serial plaintiffs to extort businesses out of money. At least in this case they're going after someone with deep pockets and not just small businesses...
I'm not familiar enough with California's law to know whether companies like Shopify/Google are meant to be liable (in the sense that the law says so), but certainly it would be a great thing if the companies actually performing the mass surveillance (Google, Shopify) were liable even if the payload deliverer is small. Absolutely what is needed is law saying that Google can be sued (or better, held criminally liable for harassment/stalking) for spying on people through its Google Analytics program, among others.
Relentlessly stalking millions of people makes it millions of times worse than stalking one person, not somehow okay.
Or hold enough of those small actors to account that nobody wants to do business with Google Analytics in its current form.
It disgusts me that companies who want to transact with me don't vet their partners better. Off-Meta is another one that's despicable. Companies like my bank or their partners have NO business uploading lists of their users to third parties like that (even if it was induced by use of their analytics SDKs).
> If this is true, I find this case troubling and weak, and hope it is overturned. It is squarely on the shop operator to be compliant - Shopify is just a platform vendor and shoppers are not Shopify customers; rather, they are customers of the shop. This seems to be akin to suing Google because a website uses Google Analytics but didn't disclose it in their privacy statement - silly...
Most of my work is in the Shopify app dev ecosystem, and while I haven't been following this case very closely, I do think it's ironic how Shopify is behaving here given the privacy standards they enforce on their app developers.
Some context: all Shopify app developers are required to follow the EU's GDPR rules for customer data, full stop. Your app must implement Shopify's mandatory GDPR webhooks. You must delete customer data when a shop's customer is deleted; you must produce all data you store on a shop's customer within 7 days upon receipt of a certain GDPR webhook; and you must delete all the data you store on the shop itself after the shop uninstalls your app.
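To make the obligations above concrete: Shopify's mandatory privacy webhooks use the documented topic names `customers/data_request`, `customers/redact`, and `shop/redact`, and every webhook request is authenticated by an HMAC-SHA256 of the raw body in the `X-Shopify-Hmac-Sha256` header. Here's a minimal sketch of a handler; the dispatch strings and storage behavior are hypothetical placeholders, not Shopify's API.

```python
import base64
import hashlib
import hmac


def verify_webhook(secret: str, body: bytes, hmac_header: str) -> bool:
    """Check the X-Shopify-Hmac-Sha256 header against the raw request body.

    Shopify signs the payload with the app's shared secret; the header value
    is the base64-encoded HMAC-SHA256 digest.
    """
    digest = hmac.new(secret.encode(), body, hashlib.sha256).digest()
    expected = base64.b64encode(digest).decode()
    return hmac.compare_digest(expected, hmac_header)


def handle_privacy_webhook(topic: str, payload: dict) -> str:
    """Dispatch on the three mandatory privacy topics.

    The returned strings stand in for real side effects (exporting or
    deleting stored records) in an actual app.
    """
    if topic == "customers/data_request":
        # Must produce all data stored on this customer within the deadline.
        return f"export data for customer {payload['customer']['id']}"
    if topic == "customers/redact":
        # Must delete all data stored on this customer.
        return f"delete data for customer {payload['customer']['id']}"
    if topic == "shop/redact":
        # Fired after the shop uninstalls the app: delete everything
        # stored about the shop itself.
        return f"delete all data for shop {payload['shop_id']}"
    raise ValueError(f"unexpected topic: {topic}")
```

A real app would wire `verify_webhook` into its web framework's request handler and reject anything that fails verification before touching the payload.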
Additionally, if your app requires access to any customer data (whether it's via the Customer API, or via other APIs, e.g. to get the name of a customer who placed an order), you need to apply for access to that data on an app-by-app basis – replete with an explanation for why your app needs the data. Shopify's app store staff has to manually review and approve that data access application before you can publish your app on their app store.
To be clear, I think these restrictions are a good thing†, as apps used to have access to a veritable firehose of private customer data. But it's ironic to see Shopify enforce such standards on their app developers, while at the same time arguing that they should be able to track their own potential customers anywhere and everywhere across the internet regardless of privacy laws.
† Though I think it's a little odd that a Canadian company is making me, an American app developer, think about/adhere to the EU's GDPR rules. Not to mention other privacy laws like the one in California. Why not just call it "Shopify's Privacy Standards?"
Shopify is not enforcing those rules out of the goodness of its heart. It is in Shopify's best interest that retailers have as little information about their customers as possible and that it's as difficult as possible to export the data they do have out of Shopify, because that ties retailers to the Shopify ecosystem.
Stripe also has a version of this called “Link”, which uses SMS authentication. Based on Stripe data on multiple platforms I have access to, quite a high percentage of people use it, probably due to how hard it’s pushed by the UI when adding a payment method
> It is squarely on the shop operator to be compliant - Shopify is just a platform vendor and shoppers are not Shopify customers; rather, they are customers of the shop.
I disagree energetically. If Shopify wants to run a service identifying people across every site that it serves as a backend to, it should ask those people if they want to be included in that. Otherwise, the only alternative to stop the illegal activity is to print a list of Shopify's customers, and visit (and sue) them one by one in California. Shopify is running the service, and the shop owner probably doesn't even know how it works.
I'd even think that a shop owner sued over this should in turn be able to sue Shopify. If Shopify knows that something it does is not legal in California, it should tell its clients who may do business in California.
You opt-into using Shop Pay, as a consumer. By default you are in "guest" mode.
> If Shopify knows that something it does is not legal in California
This is what is being debated. This ruling is mostly expected out of the 9th... we'll see what happens when a real court hears this case.
What are the odds the Supreme Court hears this?
Your guess is as good as mine. I doubt Shopify will let this rest, since the consequences are fairly huge.
Maybe the line of reasoning offered and argued against is dubious. But IMO there are literally dozens of other arguments that will come to the same conclusion if you want to avoid hand waving about the particular bits the author raises.
By and large states having different laws is a pain, but arguing that you can do business in every state while only following the laws of one state is a very messy rejection of states' rights, and leads to using the commerce clause to basically negate most state-level regulations and jurisdiction.
> What Could Shopify Have Done Differently?
For completion I think "cease to insecurely extract, aggregate and abuse all that user data" should also be mentioned as an alternative to the different ways they could skirt regulation.
You’re misunderstanding the question - he’s asking how Shopify could avoid jurisdiction, not avoid this suit. Jurisdiction is a threshold question before you get to the merits… maybe Shopify did the bad thing, maybe they didn’t, but before we decide that, we need to determine if California law even applies to Shopify.
The author seems to think that there should be some way for Shopify to avoid jurisdiction while still offering services in California, but I don’t really understand why he thinks so.
As a former student of the author, I don't think he's saying they should be able to avoid jurisdiction. I think he was musing on whether it would even be possible under this new Ninth Circuit framework/test. He concludes it's unlikely, and hence for Shopify (or any other company putting cookies in browsers) to have any chance of avoiding it, they're going to have to appeal to SCOTUS.
Not at all. I think he rightly concludes that jurisdiction is completely avoidable by geoblocking California.
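For what it's worth, request-level geoblocking is mechanically simple, even if IP geolocation is approximate and trivially defeated by VPNs. A sketch, where the region lookup is a stub standing in for a real geolocation database (e.g. a MaxMind-style lookup):

```python
# Regions we refuse to serve; "US-CA" is an illustrative region code.
BLOCKED_REGIONS = {"US-CA"}


def lookup_region(ip: str) -> str:
    """Stub geolocation lookup.

    A real implementation would query a geolocation database or service;
    this demo table exists only so the sketch is runnable.
    """
    demo_table = {
        "203.0.113.7": "US-CA",   # documentation-range IP, pretend Californian
        "198.51.100.9": "US-NY",  # documentation-range IP, pretend New Yorker
    }
    return demo_table.get(ip, "UNKNOWN")


def gate_request(ip: str) -> int:
    """Return an HTTP status for the request: block or allow.

    451 ("Unavailable For Legal Reasons", RFC 7725) is the conventional
    status for refusing service on legal grounds.
    """
    if lookup_region(ip) in BLOCKED_REGIONS:
        return 451
    return 200
```

Whether serving a known-Californian IP anyway would then count as "expressly aiming" at California is, of course, exactly the kind of question the ruling leaves open.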
It is baffling to hear the author ask the question “did Shopify ‘expressly aim an intentional act’ at California?” and subsequently conclude that Shopify’s entire business model is in doubt if it doesn’t do business in California.
I was going to quote, and respond in almost the exact same way.
The only change I would make to your suggestion would be to remove the word "insecurely".
They shouldn't extract or aggregate user data in any fashion whatsoever.
Backstory from eff:
https://www.eff.org/deeplinks/2024/07/courts-should-have-jur...
For anyone not clicking through, EFF supports the 9th Circuit in this case.
> then more privacy-protective option are not feasibly available to Shopify
I haven't laughed that hard in a while. Poor Shopify, they couldn't possibly protect the privacy and data of their customers.
The part where many may object:
> First, the majority might say that Shopify should not engage in privacy-invasive activities. I didn’t invest the energy to figure out the irreducible privacy elements of the plaintiffs’ claims, but if using cookies to track users is an essential part of the claim, then more privacy-protective option are not feasibly available to Shopify.
This seems like a very strange reading of "express aiming"; instead of those words meaning that a person has done something to 'target', it means that the person did not 'expressly avoid'? I am not sure that "expressly aim" has much meaning at all in this reading.
I don't have any horse in this race, though I know the EFF is very popular on HN, and that many people here are also against data collection.
I guess it's just a coincidence that the California Shopify meetup groups are abandoned without notice?
The full title is:
> Ninth Circuit Takes a Wrecking Ball to Internet Personal Jurisdiction Law–Briskin v. Shopify
What law was wrecked? The outcome appears to be the upholding of a CA law.
Case law on internet-specific personal jurisdiction made by the district court, presumably.