In 2026, could any five words be more chilling than “We’re changing our privacy terms”?
The timing could not have been worse for TikTok US when it sent millions of US users a mandatory privacy pop-up on January 22. The message forced users to accept updated terms if they wanted to keep using the app. Buried in that update was language about collecting “citizenship or immigration status.”
Specifically, TikTok said:
“Information You Provide may include sensitive personal information, as defined under applicable state privacy laws, such as information from users under the relevant age threshold, information you disclose in survey responses or in your user content about your racial or ethnic origin, national origin, religious beliefs, mental or physical health diagnosis, sexual life or sexual orientation, status as transgender or nonbinary, citizenship or immigration status, or financial information.”
The internet reacted badly. TikTok users took to social media, with some suggesting that TikTok was building a database of immigration status, and others pledging to delete their accounts. It didn’t help that TikTok’s US operation became a US-owned company on the same day, with Senator Ed Markey (D-Mass.) criticizing what he sees as a lack of transparency around the deal.
A legal requirement
In this case, things may be less sinister than you’d think. The language is not new—it first appeared around August 2024. And TikTok is not asking users to provide their immigration status directly.
Instead, the disclosure covers sensitive information that users might voluntarily share in videos, surveys, or interactions with AI features.
The change appears to be driven largely by California’s AB-947, signed in October 2023. The law added immigration status to the state’s definition of sensitive personal information, placing it under stricter protections. Companies are required to disclose how they process sensitive personal information, even if they do not actively seek it out.
Other social media companies, including Meta, do not explicitly mention immigration status in their privacy policies. According to TechCrunch, that difference likely reflects how specific their disclosure language is—not a meaningful difference in what data is actually collected.
One meaningful change in TikTok’s updated policy does concern location tracking. Previous versions stated that TikTok did not collect GPS data from US users. The new policy says it may collect precise location data, depending on user settings. Users can reportedly opt out of this tracking.
Read the whole board, not just one square
So, does this mean TikTok—or any social media company—deserves our trust? That’s a harder question.
There are still red flags. In April, TikTok quietly removed a commitment to notify users before sharing data with law enforcement. According to Forbes, the company has also declined to say whether it shares, or would share, user data with agencies such as the Department of Homeland Security (DHS) or Immigration and Customs Enforcement (ICE).
That uncertainty is the real issue. Social media companies are notorious for collecting vast amounts of user data, and for being vague about how it may be used later. Outrage over a particularly explicit disclosure is understandable, but the privacy problem runs much deeper than a single policy update from one company.
People have reason to worry unless platforms explicitly commit to not collecting or inferring sensitive data—and explicitly commit to not sharing it with government agencies. And even then, skepticism is healthy. These companies have a long history of changing policies quietly when it suits them.
We don’t just report on data privacy—we help you remove your personal information
Cybersecurity risks should never spread beyond a headline. With Malwarebytes Personal Data Remover, you can scan to find out which sites are exposing your personal information, and then delete that sensitive data from the internet.
TikTok may have found a way to stay online in the US. Late last week, the company announced TikTok USDS Joint Venture LLC, a new entity backed largely by US investors, in a deal valued at about $14 billion that allows the app to continue operating in the country.
This is the culmination of a long-running fight between TikTok and US authorities. In 2019, the Committee on Foreign Investment in the United States (CFIUS) flagged ByteDance’s 2017 acquisition of Musical.ly as a national security risk, on the basis that the Chinese owner’s links to the state would put US users’ data at risk.
In his first term, President Trump issued an executive order demanding that ByteDance sell the business or face a ban. That order was blocked by courts, and President Biden later replaced it with a broader review process in 2021.
In April 2024, Congress passed the Protecting Americans from Foreign Adversary Controlled Applications Act (PAFACA), which Biden signed into law. That set a January 19, 2025 deadline for ByteDance to divest its business or face a nationwide ban. With no deal finalized, TikTok voluntarily went dark for about 12 hours on January 18, 2025. Trump later issued executive orders extending the deadline, culminating in a September 2025 agreement that led to the joint venture.
Three managing investors each hold 15% of the new business: database giant Oracle (which previously vied to acquire TikTok when ByteDance was first told to divest), technology-focused investment group Silver Lake, and the United Arab Emirates-backed AI (Artificial Intelligence) investment company MGX.
Other investors include the family office of tech entrepreneur Michael Dell, as well as Vastmere Strategic Investments, Alpha Wave Partners, Revolution, Merritt Way, and Via Nova.
Original owner ByteDance retains 19.9% of the business, and according to an internal memo released before the deal was officially announced, 30% of the company will be owned by affiliates of existing ByteDance investors. That’s in spite of the fact that PAFACA mandated a complete severance of TikTok in the US from its Chinese ownership.
A focus on security
The company is eager to promote data security for its users. With that in mind, Oracle takes the role of “trusted security partner” for data protection and compliance auditing under the deal.
Oracle is also expected to store US user data in its cloud environment. The program will reportedly align with security frameworks including the National Institute of Standards and Technology (NIST) Cybersecurity Framework. Other TikTok-owned apps such as CapCut and Lemon8 will also fall under the joint venture’s security umbrella.
Canada’s TikTok tension
It’s been a busy month for ByteDance, with other developments north of the border. Last week, Canada’s Federal Court overturned a November 2024 government order to shut down TikTok’s Canadian business on national security grounds, giving Industry Minister Mélanie Joly time to review the case.
Why this matters
TikTok’s new US joint venture lowers the risk of direct foreign access to American user data, but it doesn’t erase all of the concerns that put the app in regulators’ crosshairs in the first place. ByteDance still retains an economic stake, the recommendation algorithm remains largely opaque, and oversight depends on audits and enforcement rather than hard technical separation.
In other words, this deal reduces exposure, but it doesn’t make TikTok a risk-free platform. For users, that means the same common-sense rules still apply: be thoughtful about what you share and remember that regulatory approval isn’t the same as total data safety.
Loyal readers and other privacy-conscious people will be familiar with the expression, “If it sounds too good to be true, it probably is.”
Getting paid handsomely to scroll social media definitely falls into that category. It sounds like an easy side hustle, which usually means there’s a catch.
In January 2026, an app called Freecash shot up to the number two spot on Apple’s free iOS chart in the US, helped along by TikTok ads that look a lot like job offers from TikTok itself. The ads promised up to $35 an hour to watch your “For You” page. According to reporting, the ads didn’t promote Freecash by name. Instead, they showed a young woman expressing excitement about seemingly being “hired by TikTok” to watch videos for money.
The landing pages featured TikTok and Freecash logos and invited users to “get paid to scroll” and “cash out instantly,” implying a simple exchange of time for money.
Those claims were misleading enough that TikTok said the ads violated its rules on financial misrepresentation and removed some of them.
Once you install the app, the promised TikTok paycheck vanishes. Instead, Freecash routes you to a rotating roster of mobile games—titles like Monopoly Go and Disney Solitaire—and offers cash rewards for completing time‑limited in‑game challenges. Payouts range from a single cent for a few minutes of daily play up to triple‑digit amounts if you reach high levels within a fixed period.
The whole setup is designed not to reward scrolling, as it claims, but to funnel you into games where you are likely to spend money or watch paid advertisements.
Freecash’s parent company, Berlin‑based Almedia, openly describes the platform as a way to match mobile game developers with users who are likely to install and spend. The company’s CEO has spoken publicly about using past spending data to steer users toward the genres where they’re most “valuable” to advertisers.
Our concern, beyond the bait-and-switch, is the privacy issue. Freecash’s privacy policy allows the automatic collection of highly sensitive information, including data about race, religion, sex life, sexual orientation, health, and biometrics. Each additional mobile game you install to chase rewards adds its own privacy policy, tracking, and telemetry. Together, they greatly increase how much behavioral data these companies can harvest about a user.
Experts warn that data brokers already trade lists of people likely to be more susceptible to scams or compulsive online behavior—profiles that apps like this can help refine.
We’ve previously reported on data brokers that used games and apps to build massive databases, only to later suffer breaches exposing all that data.
When asked about the ads, Freecash said the most misleading TikTok promotions were created by third-party affiliates, not by the company itself. That’s plausible, since Freecash runs an affiliate payout program for people who promote the app online. The company also promised to review and tighten its monitoring of those partners.
For experienced users, the pattern should feel familiar: eye‑catching promises of easy money, a bait‑and‑switch into something that takes more time and effort than advertised, and a business model that suddenly makes sense when you realize your attention and data are the real products.
If you’re curious how intrusive schemes like this can be, consider using a separate email address created specifically for testing. Avoid sharing real personal details. Many users report that once they sign up, marketing emails quickly pile up.
Some of these schemes also appeal to people who are younger or under financial pressure, offering tiny payouts while generating far more value for advertisers and app developers.
So, what can you do?
Gather information about any company before you give it your data. Talk to friends and relatives about your plans; shared common sense often leads to the right decision.
Create a separate account if you want to test a service. Use a dedicated email address and avoid sharing real personal details.
Limit the information you provide online to what makes sense for the purpose. Does a game publisher need your Social Security number? I don’t think so.
Be cautious about app installs that are framed as required to make the money initially promised, and review permissions carefully.