Thanks to Lily Li, Data Privacy Attorney and Founder of Metaverse Law, for joining us for an interview to share her insights on how businesses can prepare to comply with the California Privacy Rights Act (CPRA). The full interview can be watched here.

Mike: Hi everyone, if you’ve been following data privacy at all, you’ve probably already heard of California’s new landmark privacy law, the California Consumer Privacy Act, or CCPA as it is widely known.

The CCPA was the biggest data privacy shakeup in United States history. However, on November 3rd, California passed the California Privacy Rights Act, or CPRA, which adds teeth to the CCPA and further strengthens the rights of California consumers.

Here to talk about the upcoming CPRA is Lily Li, who is a Data Privacy Attorney and the founder of Metaverse Law.

Lily, thanks so much for joining us today.

Lily: Hey, thanks for having me.

Mike: Well, let’s jump right in. Can you please explain to everyone what the CPRA is?

Lily: Well, the CPRA is a law that amends the existing law on the books. As you mentioned there is this law called the California Consumer Privacy Act. It was passed by the California Legislature in 2018 and went into effect on January 1st of this year.

Now we have the CPRA, a ballot initiative that passed in the latest election, and it amends the CCPA even further to make it more protective of privacy rights, both in how companies use sensitive data and in how they handle children's data. We can definitely go more into the different changes the CPRA made to the CCPA, but that's a little bit of background on how it started.

Mike: What do you feel are some of the key changes that the CPRA brings?

Lily: Well, the CPRA brings in this idea of sensitive personal information or sensitive personal data. And this aligns with a lot of other global privacy laws like GDPR and the new Brazilian Data Protection law.

Previously, the CCPA treated all types of personal information the same with respect to data subject requests. So people could get copies of their data, people could delete their data, and people still have those rights with respect to companies.

Now, in the CPRA there's a new category of data: sensitive personal data, or sensitive personal information. These categories include things like health care information, precise geolocation, genetic information, and biometric data.

And what's important about these categories of data is that not only does the law prevent you from sharing this data without providing certain notices, it also allows consumers to limit how a company uses sensitive data for its own purposes.

So even if you're collecting geolocation information and not giving it out to third parties, if you're using it internally for purposes that aren't related to why you collected it from the consumer, the consumer has the right to ask you to limit your use of that sensitive data.

A good example of this is precise geolocation data. Uber got in trouble a little while ago because it would collect geolocation data from people using its rideshare app, even after people had stopped using the app. And so Uber could track people's location in their homes or while they were still waiting for their ride.

This is a big no-no, especially if you are not disclosing it. But now, consumers have the right to say, hey, only use these sensitive pieces of information to provide me the services that I've requested. Don't use it for anything else.

Another big change that the CPRA makes (some people now like to pronounce it "CIPRA") is that it increases the penalties for violations involving children's data.

So previously, you could suffer fines if you were using children's data in violation of the uses disclosed in your privacy policy, or if you refused to respond to consumer requests regarding children's data, and the fining regime was the same: $2,500 to $7,500 per violation.

The difference between the CPRA and the CCPA is that under the CCPA you could be fined $2,500 per violation or $7,500 per intentional violation. So to trigger the higher fine, you had to intentionally violate the law, not just accidentally violate it because you didn't know about the rules.

What the CPRA does is remove the intentionality requirement when you're dealing with children's data. So if you are using children's data in ways that you haven't disclosed in your privacy policy, or are not fulfilling consumer requests regarding children's data, then you are subject to that higher fine of $7,500 per violation without any showing that you did it on purpose.
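To make the difference in exposure concrete, here is a minimal sketch (hypothetical helper functions for illustration only, not legal advice) comparing fine calculations under the CCPA's intent-based regime and the CPRA's children's-data regime:

```python
def ccpa_fine(violations: int, intentional: bool) -> int:
    """CCPA-style regime: $2,500 per violation, $7,500 only if intentional."""
    return violations * (7500 if intentional else 2500)

def cpra_children_fine(violations: int) -> int:
    """CPRA children's-data regime: $7,500 per violation, no intent required."""
    return violations * 7500

# For 100 unintentional violations involving children's data, the exposure
# triples: $250,000 under the CCPA-style rule vs. $750,000 under the CPRA.
```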

And there are a lot of other changes in CPRA that affect businesses. One of them is concerning behavioral advertising.

Under the CCPA there was a lot of debate about whether or not remarketing, retargeting, and other types of cookies that track users across websites counted as sales of consumer data. And if something counted as a sale of consumer data under California law, you needed to put a lot of disclosures on your website, like a "Do Not Sell My Personal Information" link.

Some companies were arguing that targeted ads and behavioral advertising weren't a sale because there was no real exchange of money for personal information.

But the CPRA removes that ambiguity. Under the CPRA it is very clear that cross-contextual behavioral advertising, that is to say, cookies set on a device that track users across different platforms in order to build a profile and target them, counts as a sale of data, and so triggers a lot of the same disclosure requirements as selling data in more traditional formats. So that's another big change due to the CPRA.

Mike: What do you think are the most important steps for businesses to take to comply with the CPRA?

Lily: Well, one of the new concepts brought in by the CPRA is this idea of purpose limitation and data minimization. And for a lot of US audiences, these are completely new ideas. A lot of US companies are built on the idea that we'll collect a lot of consumer data, build our base, and then monetize it in some way in the future, even if we don't know yet what that's going to be.

Now the paradigm has shifted, and it's becoming a lot more like other regimes such as the GDPR, where a company needs to consider what purposes it is using individuals' data for, and to minimize its use of personal data within its own organization.

And so one of the most important steps a business can take is data mapping and data classification.

I know it’s a pain in the butt. It’s really really difficult.

But go through your company records and ask: where are you storing personal information? Where are you collecting personal information? And what is the purpose of collecting it?

A lot of people don't realize all the different ways they're collecting data. They might have an app that collects geolocation data. They might have a Salesforce platform that holds all of their B2B customer data. They might have in-person events (before COVID, at least) where they collect information at physical locations.

And then, once you’ve figured out where all your data is stored and what purpose it’s being used for, then you can start classifying the data. Is this sensitive data that will fall under the new CPRA categories of sensitive personal information?

If so, you should only be using that information for the limited purpose of providing consumers the service. If you are using it for any purpose outside of that, you will need to disclose that in your privacy policy, and, note, individuals will be able to limit that use to what is necessary to provide the services.

So again, knowing where the data is, knowing what type of data it is, and then being able to segregate the use of such data.
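As an illustration of that mapping-and-classification step, here is a minimal sketch. The field names and category labels are hypothetical assumptions loosely inspired by the CPRA's examples, not taken from the statute itself:

```python
# Hypothetical sensitive-data categories, loosely modeled on the CPRA's
# examples (health, precise geolocation, genetics, biometrics).
SENSITIVE_CATEGORIES = {"health", "precise_geolocation", "genetic", "biometric"}

def classify(inventory):
    """Split a data inventory into sensitive and non-sensitive entries.

    Each entry records where data comes from, what kind it is, and why
    it is collected, e.g.:
        {"source": "mobile_app", "category": "precise_geolocation",
         "purpose": "ride_matching"}
    """
    sensitive, other = [], []
    for entry in inventory:
        bucket = sensitive if entry["category"] in SENSITIVE_CATEGORIES else other
        bucket.append(entry)
    return sensitive, other

inventory = [
    {"source": "mobile_app", "category": "precise_geolocation",
     "purpose": "ride_matching"},
    {"source": "crm", "category": "contact_info", "purpose": "sales"},
]
sensitive, other = classify(inventory)
# The geolocation entry lands in `sensitive`; the CRM contact record in `other`.
```

In a real program, the `sensitive` bucket is the set of records whose internal uses a consumer can ask you to limit, so keeping it segregated from the rest of the inventory mirrors the "segregate the use of such data" step above.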

Another thing companies should be aware of: really take stock of whether or not you're collecting children's data.

The fining regime under CPRA is very harsh for children’s data. There have been a lot of FTC actions in the data privacy space regarding children’s data. Just look at litigation and regulatory investigations involving YouTube. Just look at what’s been going on with TikTok.

And so if there’s any situation where you’re collecting children’s data and it’s not really necessary for your business purpose, it doesn’t really fulfill your business mission, just delete that data. The risks will far outweigh the benefits in that type of scenario.

Obviously, if you do need children's data and it is core to your business, consider the CPRA, consider COPPA, and take a look at a lot of those recent FTC actions in the children's data privacy space.

Mike: Yeah, those are very interesting points. With the CPRA removing this intentionality factor for children's data, it raises some interesting questions. Devices nowadays, such as iPads or other Apple devices, might be recording you if you're speaking in a room. Who knows whether children walking into a store are being recorded by devices that are on? It's going to require businesses to take some interesting steps to avoid collecting that sensitive data.

Could you share with us how the CPRA aligns with global trends around data privacy?

Lily: Yeah, for any business that wants to market to audiences around the world, the CPRA is not unusual. Data privacy trends are moving more and more toward a pro-privacy stance. If we take a look at South America, for instance, Brazil has a new data privacy law which went into effect just recently, late this year, with requirements similar to the GDPR and the CPRA, and even shorter timelines for responding to data subject requests. So pay attention to that.

Another law that went into effect is Japan’s new data privacy law. Again, it’s similar to GDPR, similar to CPRA, but has stronger consent requirements than either law. So, if you plan to do any business with Japan, please take a look at that law.

And then the elephant in the room is the GDPR. There have been a lot of decisions out of the EU courts and supervisory authorities that really restrict data transfers out of Europe and into the United States. So there are a lot of different safeguards you now need to add to EU data that you are collecting.

Once again, all of these different data privacy regimes are forcing companies to really map their data flows, really classify them, and in some cases segment where data is stored and how it is managed in their ecosystems.

Mike: Yeah, that's very interesting. Definitely, there's a privacy wave that just seems to be flooding across the world, and it doesn't seem like it's going to stop anytime soon. Almost like a train without brakes.

One question that I get, and that companies always seem to be thinking about, is this: they see all these privacy laws, whether CCPA or GDPR, bubbling up, especially across the states, and in their eyes, are these landmines, or just something they don't need to worry about?

From a company standpoint, how do you advise on which privacy laws to spend the most time on?

Given that there are so many different approaches across the globe, and this fragmented legislative landscape of privacy laws, if you're a company just starting to make your compliance efforts, how do you determine where you should be spending your time? 'Cause is it realistic to comply with 85 laws? Or where should efforts really be directed?

Lily: It definitely depends on the industry. For instance, if you are a big e-commerce platform and you are going to be marketing across the globe, then I would definitely focus your efforts on the GDPR, because that's a giant bloc, plus the CCPA and the CPRA. And then take a look at the PCI standards with respect to credit card data. Between those three regimes, you will have a good sense of what type of privacy notices you need to create, how to deal with consumer requests, and basic security standards for credit card data and a lot of other e-commerce data.

The nice thing is that even though there are different rules in other jurisdictions, a lot of them are modeled upon basic principles in GDPR. So let’s not let the perfect be the enemy of the good. Get those three down and that will take you a long way. So that’s the e-commerce space.

If you are in the healthcare space at all, I would just tweak that a little bit. Again, you need to understand the CCPA and CPRA, because not all the data you collect is covered by HIPAA. You will need to know HIPAA and HITECH and the medical privacy laws there, and then also take a look at the GDPR.

There are a lot of commonalities between HIPAA and the GDPR, so if you already have a healthcare privacy program at your organization, you don't have to start from scratch. You can build off of what you already have.

And then there are companies in completely different industries, and I'm happy to chat with them about their particular compliance obligations and the types of standards they should be looking at. But those are the two big ones.

Mike: Yeah, that's really good advice.

Lily: I think there is a very positive trend in data privacy laws, but there's also a negative trend.

The positive trend is that it is very important for companies to secure data and to care about individual privacy. As we have seen with Cambridge Analytica and some of the Uber scandals, not protecting individual data can really affect individual liberties and even the democratic process. So I'm very, very happy that privacy is being implemented across different jurisdictions in all these different privacy regulations. That's great.

The negative part about this, though, is that the other trend I’m seeing is that countries are using data privacy laws and data security laws as trade barriers.

And so there's been this giant push to try and localize data within a country's boundaries, and there have been a lot of decisions that essentially say, oh, we're not going to allow transfers of data from one jurisdiction to another, even absent other considerations. I think that is a really, really negative trend. What we don't want is for data privacy and cybersecurity to become a proxy for a trade war between nations, because then it isn't about protecting individual rights and individual liberties. It becomes a way for governments to impose trade barriers when things aren't going right in other areas of the economy.

And so, for the sake of certainty, and for the sake of not tying these two concepts together, I really, really hope that legislatures will consider harmonizing data privacy laws, so they're easy for companies to comply with regardless of jurisdiction.

Mike: Yeah, that's a very interesting point. I think we're heading into a new world where business is conducted on this borderless internet highway of data, whereas before things used to be a bit more localized. But as business stretches across jurisdictions, governments are enacting laws, and as you said, those can be a very good thing, or they can become a barrier to transacting business, which can affect international trade and global e-commerce on a massive scale and create some serious issues. So I'm really happy you were able to shed some light on that, because it's a big issue and something we're facing every day.

You know, one last question on these issues. We've noticed technologies like artificial intelligence and machine learning collecting lots and lots of data, and companies are very successful if they can use that data for marketing, sales, and so on. So maybe this technology is a big part of why we're seeing some of this legislation, given the magnitude of new data.

In your eyes, if problems concerning privacy are, in large part, stemming from technology, do you think the solution to overcome privacy problems will be more technology?

Lily: Well, the nice thing about a lot of the new technologies out there is that while they can be used or built for privacy-intrusive purposes, they can also be used for privacy-protective purposes. And so, fortunately, if you can afford it and you're already implementing AI in your systems, there are a lot of vendors out there that use similar AI and machine learning tools to help you map and classify data and respond to data subject requests in a very efficient way.

So, I mean, I’m a science fiction geek. I love technology. The technology is morally agnostic, right? It’s up to the business owner to use it in the proper way.

Mike: Yeah, it's an interesting point. If you put bad data into a system to start, you're likely not going to have very good outcomes. So no matter how good or bad a company's technology is, there's a responsibility on the company to use those tools in a way that's moral and in line with the requirements.

Well, Lily, thank you again for sharing your views on the upcoming California Privacy Rights Act, as well as providing us with an outlook on how the CPRA aligns with global privacy trends, and lots of other great information. And again, thank you so much for joining us. We all appreciate it.

Lily: No yeah, thanks again for having me. It’s been a lot of fun.
