BEHIND THE FILTERS: CapCut And TikTok Are Making The Cut And Maybe Your Personal Data Too
Tuesday, March 18, 2025

Greetings CIPAWorld!

Let’s get techy with it. Ever edited a TikTok or Instagram Reel using CapCut? It turns out that you might have handed over more than just your creativity. The Northern District of Illinois has delivered a mixed but consequential ruling in Rodriguez v. ByteDance, Inc., about how video editing apps collect and utilize our personal data. See Rodriguez v. ByteDance, Inc., No. 23 CV 4953, 2025 U.S. Dist. LEXIS 37355 (N.D. Ill. Mar. 3, 2025). You guessed it. TikTok is at issue here. If you’ve ever used CapCut to perfect a TikTok video or Instagram reel, this decision deserves your attention!

Yikes. Imagine editing a quick vacation video only to discover the app might be scanning every photo in your gallery and capturing your facial features! That’s precisely the kind of privacy implication at the center of this case. Make sure to always check your app permissions!

The Opinion in ByteDance, Inc. offers a nuanced examination of modern privacy law. It allows several significant claims to proceed while dismissing others. So, let’s get into a brief background first.

CapCut, developed and operated by Chinese technology giant ByteDance (which also owns TikTok), has exploded in popularity since its 2020 U.S. launch. Now, it’s one of the most downloaded apps globally, with over 200 million monthly active users! CapCut allows users to create, edit, and customize videos using templates, filters, and visual effects. Everyone wants to look good, right? While predominantly free, users can access premium features through subscription models. It’s remarkable how quickly CapCut became essential for content creators. Yet this case forces us to confront the reality that the most user-friendly tools might also be the most invasive.

However, according to the Plaintiffs, this seemingly innocent video editor allegedly harbors a more problematic function—collecting vast amounts of user data without proper authorization. The Complaint alleges that CapCut collects everything from registration information and social network contacts to location data, photos, videos, and even biometric identifiers like face geometry scans and voiceprints. Yes, you read that right… Biometric identifiers.

First, the Court’s reasoning behind allowing the California constitutional and common law privacy claims to proceed reveals evolving judicial thinking about digital privacy. Judge Alexakis emphasized that privacy violations don’t depend solely on the sensitivity of the content collected but also on the manner of collection. See Davis v. Facebook, Inc. (In re Facebook Inc. Internet Tracking Litig.), 956 F.3d 589, 603 (9th Cir. 2020).

Many of us miss this critical distinction in our everyday tech interactions. We often focus on what data apps collect rather than how they collect it. The Court’s analysis suggests that even innocuous data could trigger privacy concerns if gathered through deceptive or overly invasive methods—a crucial lesson for developers and users alike.

Here, the Judge found particularly troubling the allegations that CapCut accesses and collects all the videos and photos stored on users’ devices, not just those they voluntarily uploaded to the CapCut app. If proven true, this broad data collection practice would violate reasonable user expectations. Ringing any bells here? Drawing parallels to Riley v. California, 573 U.S. 373, 397-99 (2014), which recognized that individuals have a reasonable expectation of privacy in the contents of their cell phones, Judge Alexakis noted that a reasonable CapCut user would not expect the app to access and collect all the photos and videos on their devices, regardless of whether they use those photos and videos to create content within the app. Makes perfect sense, right?

Let that sink in for a minute. An app potentially scans your entire photo library when you only intend to edit a single clip! This broad access would be like handing a stranger your family photo album when they only ask to see one vacation picture. The Court rightly recognized how this violates our intuitive sense of privacy.

For the California constitutional and common-law privacy claims regarding user identifiers and registration information, the Court specifically relied on United States v. Soybel, 13 F.4th 584, 590-91 (7th Cir. 2021) for the principle that a person “has no legitimate expectation of privacy in information he voluntarily turns over to third parties,” which was central to dismissing claims based on this type of information.

Next, the California Invasion of Privacy Act (“CIPA”) claims represented a significant but ultimately unsuccessful component of Plaintiffs’ case. As we know, under CIPA, individuals are protected against unauthorized electronic interception of communications. Section 631(a) prohibits any person from using electronic means to “learn the contents or meaning” of any “communication” without consent or in an “unauthorized manner.” Critically, neither CIPA nor the federal Electronic Communications Privacy Act (“ECPA”) imposes liability on a party to the communication. As Judge Alexakis noted, citing Warden v. Kahn, 99 Cal. App. 3d 805, 811, 160 Cal. Rptr. 471 (1979), section 631 “has been held to apply only to eavesdropping by a third party and not to recording by a participant to a conversation.”

Here’s where Plaintiffs ran into a fascinating legal hurdle. When you voluntarily use an app, the law often treats that app as a communication “participant” rather than an eavesdropper. Think about how different this is from our intuitive understanding… Few of us would consider a video editor an equal “participant” in our creative process, yet that’s essentially the legal fiction applied here.

Plaintiffs attempted to circumvent this limitation by asserting that Defendants effectively intercepted their data by “redirecting” communications to unauthorized third parties, including the Chinese Communist Party. They relied on legal authorities like Davis, 956 F.3d at 596, 607-08, where Facebook used plugins to track browsing histories even after users logged out.

However, Judge Alexakis found two factual flaws in this theory. First, Plaintiffs failed to plausibly allege that any communications were intercepted during transmission rather than merely shared after collection. Though the Court acknowledged that the Seventh Circuit hadn’t definitively ruled on whether interception must be contemporaneous with transmission, it noted that every court of appeals to consider the issue had reached that conclusion. See Peters v. Mundelein Consol. High Sch. Dist. No. 120, No. 21 C 0336, 2022 WL 393572, at *11 (N.D. Ill. Feb. 9, 2022). Second, Plaintiffs’ allegations fell short of the specific software-tracking mechanisms deemed sufficient in cases like Facebook Tracking. They identified no particular mechanism by which ByteDance contemporaneously redirected communications to third parties.

Let’s dig a little deeper so this makes sense. There’s a (legal) difference between an app intercepting your data in transit (like wiretapping a phone call) versus collecting it at the endpoint and sharing it later. Most individuals would see little practical difference in the outcome. Your private data ends up in unexpected hands either way, yet courts maintain this technical distinction that significantly impacts your legal protections.

Next, the Court rejected claims under Section 632 of CIPA, which imposes liability on parties who use an electronic amplifying or recording device to eavesdrop upon or record a confidential communication. Beyond conclusory assertions that ByteDance intercepted and recorded videos without consent, Plaintiffs failed to allege that Defendants used any electronic amplifying or recording device to eavesdrop on conversations.

Let’s switch it up now. We are going to talk about something a little different. Perhaps the most meaningful survival in the decision concerns claims under Illinois’ Biometric Information Privacy Act (“BIPA”). The Court rejected ByteDance’s argument that BIPA only applies when companies use biometric data to identify individuals. Looking at the statute’s plain language, which defines “biometric identifier” to include “voiceprint[s]” and “scan[s] of… face geometry,” the Court found no requirement that the data be used for identification purposes.

This is genuinely interesting to me. Illinois lawmakers created one of the strongest biometric protection laws in the country, and the Court’s ruling reinforces just how far those protections extend. The practical effect here is enormous. Companies can’t just escape liability by claiming they collected your facial geometry or voiceprints for purposes other than identification. The mere collection without proper consent is enough to trigger liability. This reasoning aligns with an emerging consensus in the Northern District of Illinois. In Konow v. Brink’s, Inc., 721 F. Supp. 3d 752, 755 (N.D. Ill. 2024), the Court held that a defendant may violate BIPA without using technology to identify an individual; instead, BIPA bars the collection of biometric data that could be used to identify a plaintiff.

The Court also found persuasive Plaintiffs’ detailed allegations that ByteDance employs engineers specializing in “computer vision, convolutional neural network, and machine learning, all of which are used to generate the face geometry scans that Defendants derive from the videos of CapCut users.” These technical specifications helped elevate the claims beyond mere conclusory allegations.

What’s particularly impressive here is how Plaintiffs connected the dots between ByteDance’s engineering talent, its patent applications for voiceprint technology, and the actual functions of CapCut. This level of technical detail is increasingly necessary in privacy litigation; vague claims of data collection often fail without a demonstration of the underlying mechanisms involved.

For BIPA’s Section 15(c) claim, the Court relied on the statutory interpretation principle of ejusdem generis in interpreting “otherwise profit.” In Circuit City Stores v. Adams, 532 U.S. 105, 114-15, 121 S. Ct. 1302, 149 L. Ed. 2d 234 (2001), the Court noted that when general words follow specific words, they “embrace only objects similar in nature to those objects enumerated by the preceding specific words.” This interpretive principle was key to the Court’s narrower reading of the statute, limiting “otherwise profit” to commercial transactions similar to selling, leasing, or trading data. Consequently, the Court rejected ByteDance’s argument that internal use of biometric data—such as improving CapCut’s editing features—constitutes “otherwise profiting” under BIPA. It is fascinating how this determination narrows the scope of liability under Section 15(c), signaling that plaintiffs must show an actual external transaction involving biometric data to succeed on these claims.

Next, the Court analyzed various consumer protection claims that failed because Plaintiffs couldn’t demonstrate economic injury. For their claims under California’s Unfair Competition Law (“UCL”) and False Advertising Law (“FAL”), Judge Alexakis emphasized that, unlike the broader Article III standing requirements, these statutes demand a showing that plaintiffs lost money or property. This highlights one of the most frustrating aspects of privacy litigation for consumers: proving financial harm from privacy violations is extraordinarily difficult. We intuitively understand that our personal data has value (why else would companies collect it so aggressively?). Yet, courts often struggle to quantify it or recognize its loss as economic injury. It’s like recognizing theft only when something tangible is taken.

The Court was particularly unpersuaded by theories based on the diminished value of personal data, noting Plaintiffs hadn’t alleged they attempted to sell their data or received less than market value. Davis, 956 F.3d at 599, rejected a similar argument that a loss of control over personal data constituted economic harm. The Court also referenced Cahen v. Toyota Motor Corp., 717 F. App’x 720, 723 (9th Cir. 2017), which held that speculative claims about diminished data value, without concrete evidence of lost economic opportunity, are insufficient to establish standing under consumer protection statutes.

Moreover, like in Griffith v. TikTok, Inc., No. 5:23-cv-00964-SB-E, 2023 U.S. Dist. LEXIS 223098, at *6 (C.D. Cal. Dec. 13, 2023), the Court observed that Plaintiffs failed to show they attempted or intended to participate in the market for their data. Additionally, the Court noted that Plaintiffs failed to allege any direct financial loss tied to CapCut’s data practices, distinguishing their claims from cases where courts recognized economic harm due to specific monetary expenditures, such as fraudulent charges or paid services rendered worthless by deceptive conduct.

Next, the Court turned to the core issue underlying many of Plaintiffs’ claims—what ByteDance actually did with the data it collected and whether users had truly consented to these practices. One of the most fascinating aspects of the Opinion is the battle over consent. ByteDance mounted an aggressive defense centered on its Terms of Service and Privacy Policies, arguing that users effectively waived their rights by agreeing to these documents. The company submitted three distinct versions of its Privacy Policy from 2020, 2022, and 2023, each making various disclosures about data collection practices.

Let’s be honest… When was the last time any of us actually read a privacy policy before clicking “agree”? Side note: you should. ByteDance, like many tech companies, is staking its legal protection on documents it knows full well most users never read. What’s remarkable is that courts are increasingly skeptical of this fiction, recognizing the reality of how users actually interact with digital products.

Judge Alexakis’s detailed analysis here offers insights for both app developers and users alike. She recognized that while the policies could potentially be incorporated by reference since Plaintiffs mentioned them in their Complaint, she refused to dismiss the case based on consent at this early stage. The Court emphasized that dismissing claims based on affirmative defenses like waiver is only appropriate if “the allegations of the complaint itself set forth everything necessary to satisfy the affirmative defense.” See United States v. Lewis, 411 F.3d 838, 842 (7th Cir. 2005).

Here, several critical factual questions prevented the Court from accepting ByteDance’s consent defense. Most notably, there was no conclusive evidence about exactly when and how the plaintiffs agreed to the terms. While ByteDance simply asserted that “[u]sers expressly or impliedly consent to the policy upon downloading and using the app,” Plaintiffs countered that they “were able to access the CapCut platform without having to scroll through and read such policies before they were allowed to sign up for the services.”

The Court was particularly skeptical of ByteDance’s reliance on what appeared to be a browsewrap agreement—where terms of service are presented passively and users are presumed to agree simply by using the service. Judge Alexakis emphasized that browsewrap agreements are only enforceable if users have actual or constructive notice of the terms. See Specht v. Netscape Commc’ns Corp., 306 F.3d 17, 30-31 (2d Cir. 2002). This means that merely linking to a privacy policy at the bottom of a webpage or app interface is insufficient to establish consent. As such, actual consent requires more than theoretical access to terms.

Additionally, the Court noted that the placement and formatting of ByteDance’s consent prompts were unclear, raising doubts about whether the plaintiffs were ever explicitly informed of the policy’s existence before using CapCut. This aligns with precedent from Nguyen v. Barnes & Noble Inc., 763 F.3d 1171, 1177 (9th Cir. 2014), where courts declined to enforce arbitration clauses hidden in inconspicuous terms of service.

This dispute highlights a pervasive problem I’m recognizing in digital consent. There is a gap between technical legal compliance and actual user understanding. Despite ByteDance presenting screenshots showing a prompt requiring users to click “Agree and continue,” Judge Alexakis noted this evidence couldn’t establish whether these particular plaintiffs had seen and agreed to these specific terms, especially because they explicitly alleged they never read any privacy policy or terms of use.

The Court also highlighted another crucial factual gap. Plaintiffs asserted that the attached policies were only three of the ten (or more) versions of the policy that existed over time. In Patterson v. Respondus, Inc., 593 F. Supp. 3d 783, 805 (N.D. Ill. 2022), the Court declined to dismiss claims based on policies because the case “may involve factual questions about what [defendant’s] policies looked like at different moments in time.”

The Court’s skepticism should serve as a wake-up call. Concealing invasive data practices within complicated legal documents and merely asserting user consent may be coming to an end. Companies that are truly dedicated to privacy must go beyond minimal compliance and strive for actual transparency and meaningful choices for users.

For the Computer Fraud and Abuse Act (“CFAA”) claim (Count I), the Court undertook a careful analysis of the “access without authorization” element. The CFAA, originally enacted to combat hacking, imposes liability on anyone who intentionally accesses a computer without authorization or exceeds authorized access to obtain information. While finding that Plaintiffs’ allegations were insufficient, Judge Alexakis specifically distinguished this matter from Brodsky v. Apple Inc., No. 19-CV-00712-LHK, 2019 WL 4141936 (N.D. Cal. Aug. 30, 2019), where the plaintiffs had “concede[d] that [they] voluntarily installed the software update,” which unambiguously established authorization. Here, the Court emphasized that essential questions of fact exist about the scope of the authorization and the design of Plaintiffs’ operating systems, making dismissal based on implied authorization inappropriate at this stage. The Court noted that factual disputes, such as the scope of Defendants’ access, are not appropriately resolved on a motion to dismiss.

Notably, the Court declined to follow cases like hiQ Labs, Inc. v. LinkedIn Corp., 31 F.4th 1180, 1197 (9th Cir. 2022), which held that scraping publicly available data does not constitute unauthorized access under the CFAA. Unlike hiQ Labs, where access restrictions were clear, Plaintiffs alleged that CapCut accessed files beyond what they knowingly permitted. However, the Court found that Plaintiffs failed to sufficiently allege that ByteDance exceeded authorized access under Van Buren v. United States, 141 S. Ct. 1648, 1652 (2021), which clarified that merely misusing information one is entitled to access does not violate the CFAA.

In dismissing the Stored Communications Act (“SCA”) claims (Count IV), Judge Alexakis found particularly significant the timing mismatch in Plaintiffs’ allegations about data sharing with the Chinese Communist Party (“CCP”). The Court notably observed that even if ByteDance shared user communications with the CCP in 2018 (as alleged by a former employee cited in Plaintiffs’ Complaint), it is too much to presume based on the engineer’s statement that these activities were ongoing several years later when CapCut became available to users in the United States. This temporal gap and the lack of specificity about what data was shared rendered the allegations too speculative to survive dismissal.

The Court also found that Plaintiffs failed to allege that ByteDance qualified as a remote computing service (“RCS”) or electronic communications service (“ECS”) under the SCA. Under the SCA, an ECS is any service that provides users with the ability to send or receive wire or electronic communications, while an RCS is a service that provides computer storage or processing services to the public by means of an electronic communications system. In Garcia v. City of Laredo, 702 F.3d 788, 792 (5th Cir. 2012), the Court noted that for a company to be liable under the SCA, it must provide services that facilitate the transmission, storage, or processing of electronic communications on behalf of users—not merely collect and store user data for its own purposes. Because Plaintiffs did not establish that CapCut functioned as an ECS or RCS, their SCA claims failed as a matter of law.

Lastly, Judge Alexakis granted Plaintiffs until April 2, 2025, to file an amended complaint addressing deficiencies in their dismissed claims. So whether you’re a legal professional, casual content creator, or simply concerned about data privacy, as you should be, the ongoing developments in Rodriguez v. ByteDance merit your continued attention.

So, all in all, I’m particularly encouraged by how the Court emphasized consumer expectations throughout its analysis. This suggests a shift from formalistic legal reasoning toward how privacy functions in people’s lives. Most users have never heard of BIPA or CIPA, but they instinctively recognize when an app crosses a line and invades their privacy.

For everyday app users (myself included), this case is a reminder that seemingly innocuous tools like video editors may be far more invasive than they appear. The allegations that CapCut collects all photos and videos on a device—not just those edited—should give pause to anyone who casually grants broad permissions during app installation.

So be careful out there, folks. Next time you download that trending app, think twice before blindly agreeing to permission requests. That innocent-looking video editor may allegedly be analyzing your face, recording your voice, or scanning through years of personal photos—all while you’re just trying to add a filter to your weekend outing with family and friends. Scary stuff.

And yet, whether you have any real recourse if an app oversteps its bounds depends entirely on which law—if any—happens to apply. The fact that some claims survived in this matter while others failed underscores the fragmented and inconsistent nature of privacy law in the U.S. Right now, a company’s liability for invasive data collection often hinges on whether a lawsuit is filed under a state biometric law, a consumer protection statute, or federal wiretap regulations—each with different requirements and loopholes. This patchwork approach leaves consumers vulnerable and businesses uncertain about compliance.

Imagine if physical property rights varied so drastically between states—where some protected against trespassing while others only recognized theft if an item was taken. That’s essentially our current digital privacy landscape, and without unified standards, the gaps in protection will only widen.

As always,

Keep it legal, keep it smart, and stay ahead of the game.

Talk soon!
