What DeleteMe and Incogni aren't telling you

00:32:24
https://www.youtube.com/watch?v=iX3JT6q3AxA

Summary

TLDR: The video covers data brokers and how they handle personal data, including how services like Delete Me and Incogni work to help individuals take control of their data. Data brokers collect and sell personal information, often without consent, across categories such as people search, marketing data, financial data, and health data. While some data brokers will delete information on request, it is not always possible. This raises questions about the value of services like Delete Me, which in many cases do little more than what you could do yourself. A key point is that consumers need to push for a stronger legal framework to protect against unnecessary data collection.

Takeaways

  • 🔒 Services like Delete Me help you take control of your personal data.
  • 🔍 Data brokers often collect and sell data without consent.
  • 📉 It is important to understand what data the different types of data brokers collect.
  • 📜 Users can request that their information be deleted, but success rates vary.
  • 🛡️ Stronger privacy laws are needed to protect consumers.
  • 👥 Risk mitigation data brokers are used for background checks and identity verification.
  • 📈 Consumers can limit their data exposure by switching services.
  • 📝 Inferred data derived from user behavior is used for targeted ads.
  • 🌐 Most people search data brokers honor deletion requests, even without a legal obligation.
  • 💡 Being mindful of data security can reduce the risk posed by data breaches.

Timeline

  • 00:00:00 - 00:05:00

    In the first segment, Incogni is introduced: a service that helps users take back control of their personal data by automatically removing their information from data brokers. Incogni handles the removal requests on the user's behalf, sparing them the tedious process of contacting every data broker individually. A 60% discount is offered for signing up.

  • 00:05:00 - 00:10:00

    The second segment discusses the problem at the heart of data brokerage: companies whose entire purpose is to collect and retain personal information. It also explains how ordinary companies that are not data brokers often refuse to delete data on request, which raises the question of whether companies only delete data when legally required to do so.

  • 00:10:00 - 00:15:00

    The third segment unpacks the term data broker in more detail, covering the five main categories: people search services, marketing data brokers, financial information data brokers, risk mitigation data brokers, and personal health data brokers. The goal is to build an understanding of how these brokers collect and share data.

  • 00:15:00 - 00:20:00

    The fourth segment focuses on personal health data brokers, which collect sensitive health information and are not covered by HIPAA. Examples show how data can be gathered from different devices and even used by advertising companies to target ads based on users' health patterns.

  • 00:20:00 - 00:25:00

    The fifth segment covers risk mitigation data brokers and their role in background checks for job and rental applications. It breaks down the types of information they collect, such as contact info, location, and financial details used to assess a person's background.

  • 00:25:00 - 00:32:24

    The final segment discusses data deletion services such as Delete Me and Incogni. While they technically provide a legitimate service, the argument is made that they are expensive and often no more effective than what users could do themselves. The discussion closes with what you can do to protect your data, including contacting local representatives to push for better privacy legislation.


Video Q&A

  • What is a data broker?

    A data broker collects and sells personal information about individuals, often without their consent.

  • Why would data brokers delete your information?

    Most data brokers resist deleting personal information because their business model depends on collecting and retaining it; people search services are the notable exception and typically comply to avoid future regulation, legal pressure, and bad press.

  • Is Delete Me a scam?

    While Delete Me offers legitimate services to delete your information from data brokers, its effectiveness and value compared to manual efforts are questioned.

  • What types of data do brokers collect?

    Data brokers can collect personal data like addresses, employment history, health data, and online behavior.

  • What are inferred data and user personas?

    Inferred data is built from observed online behavior and pooled into groups of shared traits used for ad targeting without identifying individual users; user personas are fictional profiles that guide which inferred data is sought (see the sketch after this list).

  • How can you protect your personal data?

    You can block ads, avoid public-facing social media, and contact representatives to demand better privacy laws.

  • What are risk mitigation data brokers?

    These brokers verify identities to prevent fraud and are commonly used for background checks on job and apartment applications.

  • Why are personal health data brokers considered problematic?

    They collect sensitive health data often with minimal regulations, leading to potential misuse.

  • How can you delete your information from data brokers?

    You can manually request deletion from each broker, usually via an opt-out form or email, or pay a service like Delete Me to do it for you; a do-it-yourself sketch follows this list.

  • What can consumers do about data collection?

    Consumers can advocate for privacy laws, limit personal data sharing, and switch to more privacy-focused services.
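
To make the inferred-data answer above concrete, here is a minimal Python sketch with hypothetical advertising IDs and behavior pools (none of this is real broker data). Each pool on its own covers many people, but intersecting enough pools can narrow the candidates down to a single advertising ID that still carries no name, address, or other directly identifying information.

    # Hypothetical "inferred data" pools: sets of opaque advertising IDs that
    # share an observed behavior. No names or addresses appear anywhere.
    pools = {
        "shopped_at_target":     {"id07", "id12", "id31", "id54", "id88"},
        "lives_in_atlanta":      {"id12", "id31", "id54", "id77"},
        "drinks_coffee":         {"id03", "id12", "id54", "id88"},
        "listens_to_punk_rock":  {"id12", "id54"},
        "owns_playstation":      {"id54", "id61"},
        "watched_privacy_video": {"id22", "id54"},
    }

    # Each added behavior shrinks the candidate set; the intersection of all
    # pools is the audience an advertiser would end up targeting.
    candidates = set.intersection(*pools.values())
    print(candidates)  # {'id54'} -- one profile, still no personally identifying data

As the video notes later, clearing cookies or resetting your advertising ID effectively swaps "id54" for a fresh identifier and breaks the link between you and these pools.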
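
And for the do-it-yourself route mentioned above, here is a hedged sketch of what drafting your own opt-out requests could look like. The broker names, contact addresses, and template are placeholders, not real opt-out endpoints; in practice most people search sites expose a web form (usually linked in their footer), and the guide linked in the video description lists the actual ones.

    # Sketch only: drafting deletion / opt-out requests yourself instead of
    # paying a data-deletion service. All brokers and addresses are hypothetical.
    BROKERS = [
        {"name": "ExamplePeopleSearch", "contact": "privacy@examplepeoplesearch.invalid"},
        {"name": "ExampleMarketingCo", "contact": "optout@examplemarketing.invalid"},
    ]

    TEMPLATE = """To: {contact}
    Subject: Request to delete my personal information

    Hello {name},

    Please delete all records you hold about me and suppress any future
    collection of my personal information.

    Name: <your name>
    State of residence: <your state>
    """

    def draft_requests(brokers):
        """Return one ready-to-review draft per broker."""
        return [TEMPLATE.format(**b) for b in brokers]

    if __name__ == "__main__":
        for draft in draft_requests(BROKERS):
            print(draft)
            print("-" * 40)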

Subtitles (English)
  • 00:00:01
    author and talking solution, the simple
  • 00:00:04
    service that puts control of your
  • 00:00:05
    personal data back to you. They
  • 00:00:07
    intercept the data brokers on your
  • 00:00:09
    behalf, requesting your data's removal
  • 00:00:11
    and handle any resistance. But your data
  • 00:00:14
    could include your address, employment
  • 00:00:16
    history, social media accounts,
  • 00:00:18
    telephone numbers, and much more.
  • 00:00:21
    Instead of you reaching out to data
  • 00:00:22
    brokers one by one, Incogni does it
  • 00:00:25
    automatically on your behalf. Get 60%
  • 00:00:27
    off your plan. So take back control of
  • 00:00:29
    your personal data today. To date, they
  • 00:00:31
    have deleted my personal information
  • 00:00:33
    from
  • 00:00:35
    670 digital.
  • 00:00:37
    [Music]
  • 00:00:38
    Why would a data broker delete your
  • 00:00:41
    information? Think about it. Isn't the
  • 00:00:43
    entire point of a data broker to collect
  • 00:00:45
    and retain your information? Like, isn't
  • 00:00:48
    that their whole business model? When I
  • 00:00:50
    requested Office Depot to delete my
  • 00:00:52
    personal information that they had
  • 00:00:53
    collected about me, like my name and
  • 00:00:55
    purchase history, they refused to do so
  • 00:00:57
    because of where I live. If I were
  • 00:00:59
    to nicely ask General Motors to delete
  • 00:01:01
    my data regarding what medications I
  • 00:01:03
    take, yes, that is a real thing. They
  • 00:01:05
    will also refuse for the same reason.
  • 00:01:08
    Neither of these companies are
  • 00:01:09
    considered data brokers, but what? They
  • 00:01:11
    care more about retaining my data than
  • 00:01:13
    an actual data broker? That doesn't make
  • 00:01:15
    any sense to me. Is it because they're
  • 00:01:17
    legally required to comply? Or do they
  • 00:01:19
    simply comply out of the goodness of
  • 00:01:21
    their own hearts? In this video, we're
  • 00:01:23
    going to find out what a data broker is,
  • 00:01:25
    how they collect your data, who they
  • 00:01:27
    share it with, and whether companies
  • 00:01:28
    like Incogni and Delete Me are a scam or
  • 00:01:30
    not. This video is not sponsored by
  • 00:01:32
    anyone. For full transparency, Delete Me
  • 00:01:35
    has tried to sponsor this channel in the
  • 00:01:36
    past, but I rejected their offer. That
  • 00:01:38
    said, I don't know that I would have
  • 00:01:40
    actually looked into making this video
  • 00:01:41
    had they not emailed me, so you can
  • 00:01:43
    thank them for that.
  • 00:01:49
    A major selling point of services like
  • 00:01:50
    Incogni and Delete Me is that these
  • 00:01:52
    creepy data brokers collect your data
  • 00:01:54
    and sell it off to the highest bidder.
  • 00:01:56
    In fact, their entire business relies on
  • 00:01:59
    data brokers existing. So, let's find
  • 00:02:00
    out what a data broker really is. It
  • 00:02:03
    turns out, like with most things, it's
  • 00:02:05
    more complicated than you'd probably
  • 00:02:07
    expect. The term data broker itself is a
  • 00:02:09
    pretty loose expression covering all
  • 00:02:11
    kinds of different companies. I actually
  • 00:02:13
    suspect the reason Incogni and Delete Me
  • 00:02:16
    use the terminology data brokers is
  • 00:02:18
    intentional as it's almost too broad to
  • 00:02:20
    actually describe what's really going
  • 00:02:21
    on. There are five major categories of
  • 00:02:24
    data brokers. I'll quickly give you an
  • 00:02:26
    idea of what they are and what they're
  • 00:02:27
    collecting and then we will dive deeper
  • 00:02:29
    into each category. The most common type
  • 00:02:31
    of data broker that data deletion
  • 00:02:33
    services work with are people search
  • 00:02:35
    services. These are companies that act
  • 00:02:37
    like modern-day phone books but online.
  • 00:02:40
    Much like a physical phone book, they're
  • 00:02:42
    available to pretty much anyone and
  • 00:02:43
    often have a majority of the important
  • 00:02:45
    information posted for free. Then there
  • 00:02:48
    are marketing data brokers. These
  • 00:02:50
    companies gather large amounts of
  • 00:02:51
    information about your online activity
  • 00:02:53
    and put it into pools of certain
  • 00:02:54
    behaviors, otherwise known as inferred
  • 00:02:57
    data. We'll dive into more detail as to
  • 00:02:59
    what this looks like in a bit. If you've
  • 00:03:02
    ever applied for a car loan or gotten a
  • 00:03:03
    credit card, you'll be familiar with the
  • 00:03:05
    next kind of broker, financial
  • 00:03:07
    information data brokers, otherwise
  • 00:03:09
    known as credit reporting bureaus. This
  • 00:03:11
    category is dominated by three
  • 00:03:13
    companies: Experian, Equifax, and
  • 00:03:15
    TransUnion. They track financial things
  • 00:03:18
    like payments to your phone bills,
  • 00:03:19
    utilities, when you pay rent, payments
  • 00:03:21
    to loans, when you apply to lines of
  • 00:03:23
    credit, or even when you have your
  • 00:03:24
    credit checked at all. In a similar
  • 00:03:26
    vein, there are risk mitigation data
  • 00:03:28
    brokers. These are companies that track
  • 00:03:29
    your identity often for preventing
  • 00:03:31
    fraud. When you apply for a job or even
  • 00:03:34
    apply for an apartment, this is how they
  • 00:03:36
    do background checks. They work with
  • 00:03:37
    risk mitigation data brokers. The last
  • 00:03:40
    on our list is personal health data
  • 00:03:42
    brokers. By far the creepiest in my
  • 00:03:44
    opinion. These companies track health
  • 00:03:46
    related data like when you purchase an
  • 00:03:48
    over-the-counter medication or even your
  • 00:03:50
    search history of a health related
  • 00:03:51
    topic. None of this data is protected
  • 00:03:53
    under HIPAA. So, these companies go nuts
  • 00:03:55
    in collecting and selling this data to
  • 00:03:57
    pharmaceutical companies. Before we
  • 00:03:59
    start diving into these categories, I
  • 00:04:00
    want to call out the last four on this
  • 00:04:02
    list are a vast majority of the data
  • 00:04:04
    brokers that make up the term data
  • 00:04:07
    brokers. Services like
  • 00:04:09
    Incogni only scratch the surface when it
  • 00:04:11
    comes to deleting your
  • 00:04:17
    data. You know what? I want to mix it up
  • 00:04:19
    today. We're going to start with a
  • 00:04:20
    category that just gives me the creeps.
  • 00:04:22
    Personal health data brokers are, in my
  • 00:04:25
    opinion, the worst kind of data broker.
  • 00:04:27
    They collect an obscene amount of highly
  • 00:04:29
    sensitive information with almost no
  • 00:04:31
    regulations at all. To help showcase
  • 00:04:33
    what I mean, I'm going to use the
  • 00:04:35
    privacy
  • 00:04:36
    visualizer. First, they collect health
  • 00:04:38
    data, which seems pretty obvious. This
  • 00:04:40
    would be direct health data from
  • 00:04:42
    something like a smartwatch or an app
  • 00:04:44
    that tracks health metrics, but they opt
  • 00:04:45
    to share that information for some
  • 00:04:47
    reason. We'll explore an example of this
  • 00:04:49
    in a moment. Usage data depends on the
  • 00:04:51
    service or tool in question, but how you
  • 00:04:54
    use it can sometimes qualify as health
  • 00:04:55
    data. An example of this would be
  • 00:04:57
    something like a smart toothbrush. The
  • 00:04:59
    usage of these tools can indicate
  • 00:05:01
    certain health patterns, hence the
  • 00:05:02
    category of usage data. If you've ever
  • 00:05:05
    searched for a health related product on
  • 00:05:06
    Amazon, guess what? They will sell that
  • 00:05:09
    information to a personal health data
  • 00:05:10
    broker who will then sell it to
  • 00:05:12
    advertisers. And would you believe it,
  • 00:05:14
    we have a category for that. It's search
  • 00:05:16
    history. Similarly, if you've read a few
  • 00:05:18
    health related articles using Google
  • 00:05:20
    Chrome, Google will sell that
  • 00:05:21
    information as well. Though Google is
  • 00:05:23
    sort of a data broker in all but name,
  • 00:05:25
    but they're also one of the biggest
  • 00:05:26
    advertising companies out there. So,
  • 00:05:28
    this example is a bit more complicated
  • 00:05:30
    than what we're going to actually look
  • 00:05:31
    into for this video. But browsing
  • 00:05:33
    history is a category, and we can count
  • 00:05:35
    that. Last, we have purchase history. If
  • 00:05:38
    you buy a health related item using
  • 00:05:40
    PayPal as your purchase method, PayPal
  • 00:05:42
    will happily sell that information to a
  • 00:05:44
    personal health data broker. Let's look
  • 00:05:46
    at a more complete example of how
  • 00:05:47
    personal health data brokers might get
  • 00:05:49
    this information. There's a sponsor here
  • 00:05:51
    on YouTube that I've seen a few times
  • 00:05:52
    before: Eight Sleep. If you've never
  • 00:05:55
    heard of them, they make an overpriced
  • 00:05:56
    smart mattress that tracks your sleeping
  • 00:05:58
    patterns and keeps you cool while you
  • 00:05:59
    sleep. Great. That all sounds cool. How
  • 00:06:02
    do they fit into this picture of
  • 00:06:03
    personal health data brokers? Well, they
  • 00:06:06
    clearly state in the privacy policy that
  • 00:06:08
    they sell usage data to advertisers.
  • 00:06:10
    This means that your usage of the
  • 00:06:12
    mattress and the attached services are
  • 00:06:14
    sold to companies who can use that
  • 00:06:15
    information to show you health related
  • 00:06:17
    ads. What does that look like? Well,
  • 00:06:20
    let's say you've been having a hard time
  • 00:06:21
    sleeping for a few nights. That data
  • 00:06:23
    could be sold to a pharmaceutical
  • 00:06:25
    company that makes sleeping pills and
  • 00:06:26
    then you'll start seeing ads for said
  • 00:06:28
    sleeping pills. Or let's say your heart
  • 00:06:30
    rate is a little elevated while you
  • 00:06:32
    sleep. They can now show you ads for
  • 00:06:34
    heart medication. Heck, that heart rate
  • 00:06:37
    tracking thing can be used for all kinds
  • 00:06:38
    of weird things like understanding how
  • 00:06:40
    you interact with other platforms like
  • 00:06:42
    Facebook. If Facebook shows you an
  • 00:06:44
    article while you're laying in bed and
  • 00:06:46
    they know you've been looking at it for
  • 00:06:47
    a while, they can combine that data with
  • 00:06:49
    what Eight Sleep collects and, bam, they
  • 00:06:51
    now know that you had an elevated heart
  • 00:06:53
    rate while reading it, implying that you
  • 00:06:55
    were angry. They can use that data to
  • 00:06:58
    show you more of those articles since it
  • 00:06:59
    keeps you on the platform for longer. I
  • 00:07:02
    know it sounds crazy, but it's clearly
  • 00:07:03
    spelled out in their privacy policy that
  • 00:07:05
    they share a lot of this information
  • 00:07:07
    with Facebook. And we all know how
  • 00:07:09
    Facebook is with data. Remember, none of
  • 00:07:11
    this is regulated and doesn't fall under
  • 00:07:13
    HIPAA guidelines at all. So, to them,
  • 00:07:16
    it's all fair game. Also, Eight Sleep charges
  • 00:07:19
    you a monthly fee to see your health
  • 00:07:21
    data and cooling and whatever. And if
  • 00:07:22
    you stop paying, you can't use those
  • 00:07:24
    features, but they still track all that
  • 00:07:26
    precious data. Please do not buy one of
  • 00:07:28
    these. They are a privacy nightmare.
  • 00:07:30
    Let's move on to the next kind of data
  • 00:07:36
    broker. Risk mitigation data brokers are
  • 00:07:39
    probably one of the most overlooked
  • 00:07:41
    kinds of data brokers out there. They
  • 00:07:43
    are most commonly used in scenarios
  • 00:07:44
    where your identity needs to be
  • 00:07:45
    validated for, believe it or not,
  • 00:07:48
    mitigating risk. Like I mentioned
  • 00:07:50
    earlier, if you've ever applied for a
  • 00:07:51
    job or sent in an application for an
  • 00:07:53
    apartment, they often use risk
  • 00:07:55
    mitigation data brokers to make sure you
  • 00:07:57
    are who you say you are and you don't
  • 00:07:58
    have a history of doing quote unquote
  • 00:08:00
    unfavorable things. With a job
  • 00:08:03
    application, that can be things like
  • 00:08:04
    changing jobs rapidly or in some
  • 00:08:07
    instances working multiple jobs. With an
  • 00:08:09
    apartment, it can contain things like
  • 00:08:11
    rental history, late payments, and other
  • 00:08:13
    similar items. If we plug this into the
  • 00:08:15
    privacy visualizer, here's what we get.
  • 00:08:17
    First, they collect contact info, which
  • 00:08:19
    would be things like your name, current
  • 00:08:21
    and previous addresses, phone numbers,
  • 00:08:23
    and email addresses. This doesn't seem
  • 00:08:25
    to be collected in all cases, but
  • 00:08:27
    identifiers are still pretty common from
  • 00:08:28
    what I found. This comes up when the
  • 00:08:30
    data is aggregated to identify you under
  • 00:08:32
    one unique identifier rather than your
  • 00:08:35
    name. Though, this normally does come in
  • 00:08:37
    addition to your name and other contact
  • 00:08:39
    information, but it still counts. Next
  • 00:08:42
    is location. No, they aren't tracking
  • 00:08:44
    where you are all of the time, just
  • 00:08:45
    things like your address and addresses
  • 00:08:47
    related to you, like your work address.
  • 00:08:49
    Lastly, there's financial information.
  • 00:08:51
    This can sometimes contain things that
  • 00:08:52
    would show up on a credit report, such
  • 00:08:54
    as a list of debts, payment history, and
  • 00:08:56
    other similar information, which leads
  • 00:08:59
    us perfectly into our next data broker
  • 00:09:04
    type. You may recognize this category by
  • 00:09:07
    another name, credit reporting bureaus.
  • 00:09:09
    If you live in the United States, this
  • 00:09:12
    is data that you cannot opt out of or
  • 00:09:14
    have deleted at all. This is all of the
  • 00:09:16
    information that makes up your credit
  • 00:09:18
    score, which is captured by the major
  • 00:09:19
    companies, Equifax, Experian, and
  • 00:09:21
    TransUnion. The information they have is
  • 00:09:24
    used most often when dealing with
  • 00:09:25
    creditors. If you want to get a loan for
  • 00:09:28
    a house or a car, that's all tracked. If
  • 00:09:30
    you get a credit card and pay it off
  • 00:09:32
    every month, that information is
  • 00:09:33
    reported to these companies. If you get
  • 00:09:35
    a dozen credit cards and they're all
  • 00:09:36
    maxed out and you're always late on
  • 00:09:38
    payments, guess what? They track that,
  • 00:09:39
    too. This data is considered highly
  • 00:09:41
    sensitive as if it gets breached, you're
  • 00:09:44
    at much higher risk of identity
  • 00:09:48
    theft. What's that? Equifax had a data
  • 00:09:51
    breach a while
  • 00:09:53
    ago. 15 million users, you say?
  • 00:09:57
    Oh, and what's that you say? They're not
  • 00:09:58
    only still a wildly successful company,
  • 00:10:01
    they have also wedged themselves so
  • 00:10:02
    deeply into our economic system that
  • 00:10:04
    you're not allowed to opt out of this
  • 00:10:05
    data collection despite their very
  • 00:10:07
    traceable history of mishandling it.
  • 00:10:10
    Dang, that's pretty darn annoying.
  • 00:10:12
    Anyways, here's what these look like on
  • 00:10:14
    the visualizer. Financial information
  • 00:10:16
    contains things like your credit score,
  • 00:10:18
    the debt you're in, the creditors you've
  • 00:10:19
    applied to, any late payments, and even
  • 00:10:21
    how much money you make. Purchase
  • 00:10:23
    history is generally limited to major
  • 00:10:25
    purchases, but that still counts as
  • 00:10:27
    purchase history. Location, once again,
  • 00:10:29
    is mostly just your address. They don't
  • 00:10:31
    really care where you currently are.
  • 00:10:33
    Identifiers, as they have sort of a
  • 00:10:35
    profile on you that can have an ID
  • 00:10:37
    associated with it. Also, your social
  • 00:10:39
    security number is technically an
  • 00:10:41
    identifier. Contact info, including your
  • 00:10:43
    name, address, phone number, and email
  • 00:10:45
    address. And sensitive info, including
  • 00:10:47
    things like your birth date, social
  • 00:10:48
    security number. And I mean, your credit
  • 00:10:50
    score is something I'd consider to be
  • 00:10:51
    pretty sensitive
  • 00:10:57
    information. Next up is marketing data
  • 00:11:00
    brokers. This one is fun, I promise. The
  • 00:11:02
    goal of a marketing data broker is to
  • 00:11:04
    acquire vast amounts of indirect user
  • 00:11:06
    data to sell to companies looking to
  • 00:11:07
    advertise their products. Much of this
  • 00:11:10
    is done with something called inferred
  • 00:11:11
    data. Another common misnomer for this
  • 00:11:14
    is user personas, but those are normally
  • 00:11:16
    made up people that can guide what
  • 00:11:18
    inferred data is actually sought after.
  • 00:11:20
    To help show how this works, I'll need
  • 00:11:22
    100 volunteers. Great. Hi,
  • 00:11:25
    everybody. As it currently stands, I
  • 00:11:27
    don't know anything about these 100
  • 00:11:29
    people. So, to help me identify them,
  • 00:11:31
    I'll ask a simple question that might
  • 00:11:33
    describe something they do. To make it a
  • 00:11:35
    little more clear, I'll highlight the
  • 00:11:36
    tile they're standing on to represent if
  • 00:11:38
    they do the thing in question. Let's
  • 00:11:40
    start off with a pretty specific example
  • 00:11:42
    of the people here who has shopped at
  • 00:11:44
    Target within the last 6 months. Great.
  • 00:11:47
    So, as you can see here, 54 people in
  • 00:11:49
    this sample fall under this description.
  • 00:11:52
    If you imagine the general scale of the
  • 00:11:54
    amount of people who have shopped at
  • 00:11:55
    Target within the last 6 months, you can
  • 00:11:57
    imagine that the number is far too high
  • 00:11:58
    to properly identify any one person.
  • 00:12:02
    This sort of feels like it identifies
  • 00:12:03
    you, but without actually identifying
  • 00:12:05
    you. It's sort of like a giant game of
  • 00:12:07
    Guess Who, but without names.
  • 00:12:10
    Let's see what other groups we could
  • 00:12:11
    apply here. So, we have our 54 people
  • 00:12:13
    who shopped at Target. Let's highlight
  • 00:12:15
    the people who live in Atlanta, Georgia.
  • 00:12:17
    Okay, that's 20 people in our sample.
  • 00:12:19
    What about people who drink coffee?
  • 00:12:21
    People who listen to punk rock music?
  • 00:12:23
    People who have a commute between 10 and
  • 00:12:25
    15 miles to work? Who here owns a
  • 00:12:27
    PlayStation? Anyone here love dogs? Who
  • 00:12:30
    here has recently watched an educational
  • 00:12:32
    video on YouTube? What about the people
  • 00:12:34
    who are concerned about privacy? These
  • 00:12:37
    groupings by themselves can't identify
  • 00:12:39
    any one person. But if we add it all up
  • 00:12:42
    with people who shop at Target that live
  • 00:12:43
    in Atlanta, Georgia, who also drink
  • 00:12:45
    coffee, listen to punk rock, have a 5 to
  • 00:12:47
    10 mile commute, owns a PlayStation,
  • 00:12:48
    loves dogs, and watched an educational
  • 00:12:50
    video about privacy on YouTube. Well,
  • 00:12:53
    that describes just one person from this
  • 00:12:54
    sample. But here's the thing. I still
  • 00:12:57
    can't identify who this person is. I
  • 00:13:00
    certainly have enough details to know
  • 00:13:01
    what I could advertise to this person,
  • 00:13:03
    but I don't know their name, address, or
  • 00:13:05
    any personally identifiable information.
  • 00:13:07
    I can infer a lot about them without
  • 00:13:09
    actually knowing them. These inferences
  • 00:13:12
    are just aggregated groups of hundreds
  • 00:13:14
    or thousands of people. This is the data
  • 00:13:16
    that is most often used to show you
  • 00:13:18
    personalized ads. Let's plug this all
  • 00:13:20
    into the privacy visualizer. Purchases
  • 00:13:22
    can describe the items that you buy.
  • 00:13:24
    Going back to our list of examples, this
  • 00:13:26
    would be people who drink coffee on a
  • 00:13:28
    PlayStation or even the people who
  • 00:13:29
    recently shopped at Target. Location
  • 00:13:32
    information can be used as well, from
  • 00:13:34
    something as broad as the people who
  • 00:13:35
    live in Atlanta to using location
  • 00:13:37
    services to determine the people who
  • 00:13:39
    have a commute of 10 to 15 miles. Usage
  • 00:13:42
    data is information about how you use certain
  • 00:13:44
    services. This would include people who
  • 00:13:47
    recently watched an educational video on
  • 00:13:48
    YouTube or even the people who listened
  • 00:13:50
    to punk rock music. Search history can
  • 00:13:53
    include things that you search for that
  • 00:13:54
    can infer certain things. If you search
  • 00:13:57
    for photos of dogs and puppies often,
  • 00:13:59
    you're probably a person who loves dogs.
  • 00:14:01
    If you searched for this video, you're
  • 00:14:03
    probably a person concerned about
  • 00:14:04
    privacy. In fact, the people who are
  • 00:14:06
    most concerned about privacy are the
  • 00:14:08
    people that Incogni and Delete Me are
  • 00:14:10
    aiming to show their ads to. They know
  • 00:14:12
    through inferring what my channel is
  • 00:14:14
    about that my viewers are likely
  • 00:14:15
    concerned about privacy and they would
  • 00:14:17
    be an easy sell to you. Do they know who
  • 00:14:20
    any of you are? No. Of course not. But
  • 00:14:22
    by watching my channel, they infer that
  • 00:14:24
    a number of you do care about this
  • 00:14:26
    stuff. This also contains things like
  • 00:14:28
    browser history. So all of the websites
  • 00:14:30
    you visit can be added to your inferred
  • 00:14:32
    data. Here's the big question, though.
  • 00:14:34
    If they collect the inferred data in the
  • 00:14:36
    first place, and they know they can show
  • 00:14:38
    you an ad, wouldn't they have to be able
  • 00:14:40
    to identify you in some way? Well, they
  • 00:14:43
    do. They normally tie your usage of the
  • 00:14:45
    internet to something called an
  • 00:14:46
    advertising ID. This is a randomized
  • 00:14:48
    string of numbers that is assigned to
  • 00:14:50
    your Google account or even your phone
  • 00:14:51
    itself. Heck, even cookies can be sort
  • 00:14:54
    of an identifier as we explored in my
  • 00:14:56
    video about cookies. It's worth calling
  • 00:14:58
    out that these methods of identifying
  • 00:15:00
    you are rarely associated to your name
  • 00:15:02
    or any other personally identifiable
  • 00:15:04
    information. It's more of a this
  • 00:15:06
    particular cell phone has a user that
  • 00:15:08
    does these things. That said, there can
  • 00:15:11
    be sensitive information attached to
  • 00:15:12
    this too. Depending on what you look at
  • 00:15:14
    online, it can be inferred if you are
  • 00:15:17
    pregnant. If you have a disability, if
  • 00:15:18
    you're religious, if you're part of a
  • 00:15:20
    trade union, even your own political
  • 00:15:22
    opinions can be inferred. What's weird
  • 00:15:24
    here is that if you tried to request
  • 00:15:26
    your information be deleted from these
  • 00:15:28
    data brokers, well, what would they
  • 00:15:30
    delete? I mean, if they have personally
  • 00:15:33
    identifiable information, sure, some
  • 00:15:35
    will delete that, but a majority of the
  • 00:15:37
    data these companies have is
  • 00:15:39
    nonidentifiable. So, they can't exactly
  • 00:15:42
    delete it because they can't trace it
  • 00:15:43
    back to you. The only identifier they
  • 00:15:45
    get is that random string which can be
  • 00:15:47
    changed or deleted by you. If you clear
  • 00:15:50
    cookies, that can remove one of these
  • 00:15:52
    links. If you delete or change your
  • 00:15:54
    advertising ID, that also removes one of
  • 00:15:57
    these links. So, while these companies
  • 00:15:58
    can't really delete this data, you can
  • 00:16:00
    at least obscure what data you're
  • 00:16:02
    attached
  • 00:16:07
    to. Okay, but on the surface, it seems
  • 00:16:09
    like there's nothing wrong with this
  • 00:16:10
    inferred data, right? I mean, it can't
  • 00:16:13
    really easily be traced back to you.
  • 00:16:15
    Well, as always, here's where data
  • 00:16:17
    breaches come in to ruin the fun.
  • 00:16:19
    Scammers could easily use this inferred
  • 00:16:21
    data to target people that are more
  • 00:16:22
    likely to fall for scams with shocking
  • 00:16:24
    accuracy. This data could be cross
  • 00:16:26
    referenced with public data. Let's look
  • 00:16:28
    at one of our dudes here. This guy goes
  • 00:16:30
    to coffee shops in the afternoon. He
  • 00:16:32
    lives in Seattle, Washington. He is
  • 00:16:34
    looking for a new job. He has a dog and
  • 00:16:36
    travels once per year. Well, let's say
  • 00:16:38
    that he also publicly posts to various
  • 00:16:40
    social media platforms. He posts selfies
  • 00:16:42
    of coffee shops some afternoons to
  • 00:16:44
    Snapchat. On LinkedIn, it shows that he
  • 00:16:47
    lives in Seattle and he's seeking a job.
  • 00:16:49
    On top of that, he shows photos of his
  • 00:16:51
    dogs and even his once per year travel
  • 00:16:53
    event on Instagram. Sure, cross
  • 00:16:56
    referencing that data could take a lot
  • 00:16:57
    of time, but this is where spear phishing
  • 00:17:00
    campaigns could come in. Scammers could
  • 00:17:02
    just send in a bunch of automated texts
  • 00:17:03
    and calls to a massive list of phone
  • 00:17:05
    numbers that were in a data breach.
  • 00:17:08
    If the data is attached to any inferred
  • 00:17:10
    data like companies like Toyota or
  • 00:17:12
    Facebook would have, they could make a
  • 00:17:14
    very targeted kind of attack without
  • 00:17:15
    knowing who they're hitting. The goal
  • 00:17:17
    isn't to trick everyone. The goal is to
  • 00:17:20
    effectively trick just a small group of
  • 00:17:22
    people. Since they know what city he
  • 00:17:23
    lives in, and they know that he's
  • 00:17:25
    looking for a job, they'll send out
  • 00:17:26
    texts to all the leaked numbers in that
  • 00:17:28
    city and they'll say something like,
  • 00:17:30
    "Hi, we were informed that you were
  • 00:17:32
    looking for a job with 50% travel. We
  • 00:17:34
    can offer such a position. If
  • 00:17:36
    interested, please apply here. Shady
  • 00:17:38
    link. Now, a majority of people would
  • 00:17:40
    probably ignore this, but someone who is
  • 00:17:42
    seeking a job who also loves to travel,
  • 00:17:45
    well, it seems like a dream come true.
  • 00:17:47
    Remember, the more data that scammers
  • 00:17:49
    have, the easier it is for them to
  • 00:17:50
    target you in a very specific
  • 00:17:56
    way. This perfectly leads us into our
  • 00:17:58
    last data broker category, people search
  • 00:18:01
    services. These are simply the evolution
  • 00:18:03
    of physical phone books being migrated
  • 00:18:05
    to the digital realm. If you're too
  • 00:18:07
    young to remember, there were these
  • 00:18:09
    companies that would basically track
  • 00:18:10
    everyone's name, phone number, and
  • 00:18:12
    sometimes even their address, and they
  • 00:18:13
    would print it out in a huge book that
  • 00:18:15
    they would send to everyone in the mail.
  • 00:18:17
    They'd even attach them to these things
  • 00:18:18
    called phone booths. How fun. Anyways,
  • 00:18:22
    all this information was moved online,
  • 00:18:24
    and they obviously put that directly
  • 00:18:25
    identifiable information like your name,
  • 00:18:27
    phone number, and address in there. The
  • 00:18:30
    thing about the online world is that you
  • 00:18:31
    can also have an online presence. So,
  • 00:18:34
    some of them started tracking things
  • 00:18:35
    like social media profiles. Some of
  • 00:18:38
    these companies took it in a weirder
  • 00:18:39
    direction by adding other public records
  • 00:18:41
    like arrest records, property records,
  • 00:18:43
    and more. More specific information is
  • 00:18:45
    normally behind a paywall. But for
  • 00:18:47
    everything else, like your name, phone
  • 00:18:48
    number, address, and even social media
  • 00:18:50
    profiles can all be seen for free by
  • 00:18:53
    anyone. Who's looking? Sometimes it can
  • 00:18:55
    be landlords who don't want to pay for
  • 00:18:56
    those risk mitigation data brokers.
  • 00:18:58
    Private investigators are probably
  • 00:19:00
    pretty pleased with these kinds of
  • 00:19:01
    services. Actually, any weirdo on the
  • 00:19:04
    internet, which has been problematic for
  • 00:19:06
    supporting things like doxing. And there
  • 00:19:08
    was even a reality TV show that used
  • 00:19:09
    these services. I think it was Catfish.
  • 00:19:12
    Okay, so this can be pretty creepy. Now,
  • 00:19:14
    we need to ask the scary question. Do
  • 00:19:16
    people search services comply with data
  • 00:19:18
    deletion requests? Weirdly enough, a
  • 00:19:21
    vast majority of them do comply. And not
  • 00:19:23
    all of them are even required to comply.
  • 00:19:25
    Many of these companies are based out of
  • 00:19:27
    states that are not required to comply,
  • 00:19:29
    yet they comply anyways. From what I
  • 00:19:31
    understand, they do this to avoid future
  • 00:19:33
    regulation, legal pressure, and just bad
  • 00:19:36
    press. Well, that's a good thing. You
  • 00:19:39
    can delete information using the forms
  • 00:19:40
    on the websites normally linked at the
  • 00:19:42
    bottom. In the description, I've linked
  • 00:19:44
    an amazing guide on how to delete your
  • 00:19:46
    information from a bunch of these
  • 00:19:48
    services for free. Actually, these are
  • 00:19:50
    the most common types of data brokers
  • 00:19:52
    that services like Incogni work with. So
  • 00:19:55
    things are starting to make more sense.
  • 00:19:57
    In fact, let's finally talk about data
  • 00:19:59
    deletion
  • 00:20:04
    services. These are companies like we've
  • 00:20:06
    been talking about: Incogni and Delete Me. Some
  • 00:20:08
    other examples would be Mozilla Monitor,
  • 00:20:10
    Aura, Optery, OneRep, and Atlas Privacy.
  • 00:20:14
    I'm certain there are more than that,
  • 00:20:15
    but they all work in the same way. They
  • 00:20:17
    offer to delete data that's been
  • 00:20:18
    collected about you for mostly people
  • 00:20:20
    search data brokers. Most of them use
  • 00:20:22
    automated systems to make this possible.
  • 00:20:24
    However, in the case of Delete Me, it
  • 00:20:27
    doesn't really seem like it's that much
  • 00:20:28
    faster than doing it yourself. Let's
  • 00:20:30
    break down one of the sponsored spots
  • 00:20:32
    for Delete Me. Just like my VPN video, I
  • 00:20:35
    am not being critical of the creator
  • 00:20:36
    shown in this video. I'm being critical
  • 00:20:38
    of Delete Me and the script they provide
  • 00:20:40
    to creators. That's it. It's been a
  • 00:20:43
    little over 6 months since I first told
  • 00:20:45
    you about Delete Me, the simple service
  • 00:20:47
    that gives control of your personal data
  • 00:20:49
    back to you. In that half a year, Delete
  • 00:20:51
    Me has reviewed over 4,000 listings from
  • 00:20:54
    data brokers across the web for me. This
  • 00:20:57
    is sort of misleading in my opinion.
  • 00:20:59
    That 4,000 number really makes it sound
  • 00:21:01
    like it's from 4,000 data brokers or
  • 00:21:03
    something, but it's actually 4,000
  • 00:21:05
    individual items. So, let's say one data
  • 00:21:08
    broker has your name, your personal
  • 00:21:10
    email address, your work email address,
  • 00:21:12
    your phone number, and your last four
  • 00:21:14
    addresses. That's seven listings in this
  • 00:21:16
    example, but just one data broker. They
  • 00:21:19
    all do this, by the way. Here, I'll use
  • 00:21:21
    myself as an example. Before researching
  • 00:21:23
    this video, I used Mozilla Monitor. They
  • 00:21:26
    showed that from one data broker, there
  • 00:21:28
    were three email addresses, one phone
  • 00:21:30
    number, nine addresses, and four family
  • 00:21:32
    members. Of that information, only eight
  • 00:21:35
    of the items were actually correct
  • 00:21:37
    pieces of information. The others were
  • 00:21:39
    incorrect emails. The phone number
  • 00:21:40
    wasn't one that had ever been mine at
  • 00:21:42
    any point in history. Some of the
  • 00:21:44
    addresses were wrong, and even the list
  • 00:21:46
    of family members contained people I
  • 00:21:47
    have never heard of before. All of these
  • 00:21:49
    data deletion services will count this
  • 00:21:51
    as 17 removals, but it was all from a
  • 00:21:54
    single source, half of which wasn't even
  • 00:21:56
    my data. Do you see how these numbers
  • 00:21:58
    feel bigger than they really are? And
  • 00:22:00
    it's removed my personal information
  • 00:22:02
    from almost 80 of them. Let's rewind for
  • 00:22:05
    a moment back to that screenshot. There
  • 00:22:07
    are 26 data brokers in this list. Of
  • 00:22:10
    those, there are over
  • 00:22:12
    4,347 pieces of information. And of all
  • 00:22:15
    of those, delete me has removed 38,
  • 00:22:18
    which is very likely from a single data
  • 00:22:19
    broker given how big that other number
  • 00:22:21
    is. So, they're claiming to have spent 9
  • 00:22:24
    hours contacting one data broker.
  • 00:22:27
    Additionally, according to the voiceover
  • 00:22:29
    itself from the first part, this was 6
  • 00:22:31
    months of paying for Delete Me. I'll
  • 00:22:33
    admit though, this isn't accounting for
  • 00:22:35
    search time, but think about this. I
  • 00:22:37
    don't know about you, but I'm fairly
  • 00:22:38
    confident that I could send more than 26
  • 00:22:40
    emails in a 6-month period. Heck, I know
  • 00:22:43
    that's true. I regularly send more than
  • 00:22:45
    that in a standard 8-hour workday. What
  • 00:22:48
    kind of listings? Well, we're talking
  • 00:22:50
    private information like my physical
  • 00:22:52
    address history, my property and court
  • 00:22:54
    records, even the names of my family
  • 00:22:56
    members now scrubbed from nearly 80 data
  • 00:22:59
    brokers archives without me having to do
  • 00:23:02
    any work on my own. As we discussed,
  • 00:23:04
    that is a tiny fraction of the actual
  • 00:23:06
    amount of data brokers that are out
  • 00:23:08
    there. Yes, removing this data is good,
  • 00:23:10
    but you can do this for free. And as
  • 00:23:12
    shown here, probably more efficiently
  • 00:23:14
    than Delete Me. Now, this all begs the
  • 00:23:17
    question, are these data deletion
  • 00:23:19
    companies legit? Well, for the most
  • 00:23:21
    part, yes. I'll admit though, I don't
  • 00:23:24
    like that some of them mess around with
  • 00:23:25
    dark patterns. Incogni, for example,
  • 00:23:27
    forces users to contact support to
  • 00:23:29
    cancel your subscription. They also post
  • 00:23:32
    obviously fake reviews to Reddit, but
  • 00:23:34
    overall they technically do what they
  • 00:23:36
    say they do. Is it a scam? No. It is
  • 00:23:39
    pretty clear to me that they do offer
  • 00:23:40
    the service of deleting your data from a
  • 00:23:42
    list of data brokers. A scam means to
  • 00:23:45
    take money with malicious intent of
  • 00:23:46
    offering nothing in return. That is
  • 00:23:49
    obviously not the case here. A better
  • 00:23:51
    question would be, is it a good deal? In
  • 00:23:53
    my opinion, not really. No. Unless you
  • 00:23:56
    live in a state with the right to delete
  • 00:23:58
    law. If you live in one of these states,
  • 00:24:00
    services like this could actually be
  • 00:24:01
    helpful as they will often look into
  • 00:24:03
    adding other data brokers for you if you
  • 00:24:05
    ask them to, and that could make the
  • 00:24:07
    process of cleaning up your online
  • 00:24:08
    presence a bit easier. That all said
  • 00:24:10
    though, a majority of the people in the
  • 00:24:12
    United States wouldn't exactly benefit
  • 00:24:14
    from this kind of service. Wait, I feel
  • 00:24:16
    like we've heard this before. Look,
  • 00:24:18
    everyone, it's friend of the channel,
  • 00:24:19
    Luch. Hey, I just overheard what you've
  • 00:24:22
    been talking about, and all of this
  • 00:24:23
    sounds really similar to what you said
  • 00:24:24
    about VPN companies. They kind of seem
  • 00:24:27
    like the same thing. Do you mind
  • 00:24:29
    elaborating? Well, both data deletion
  • 00:24:31
    services and VPNs technically provide
  • 00:24:33
    the services they claim to, and they do
  • 00:24:35
    it at a very low cost to them, but
  • 00:24:37
    charge a much higher price to you, the
  • 00:24:39
    consumer. Most people also don't really
  • 00:24:41
    need either service. VPNs provide a
  • 00:24:43
    solution to a problem most people don't
  • 00:24:45
    normally face. And in the case of data
  • 00:24:47
    deletion companies, all of that
  • 00:24:49
    information you're trying to have
  • 00:24:50
    deleted will always be replaced unless
  • 00:24:52
    you're a resident of a right to delete
  • 00:24:54
    state. Even if you do live in one of
  • 00:24:55
    these right to delete states, you would
  • 00:24:57
    still have to manually request that your
  • 00:24:59
    data be deleted by every other data
  • 00:25:01
    broker in the country. And that is a
  • 00:25:03
    timeconsuming task. There are over 1,700
  • 00:25:07
    registered data brokers in the US, which
  • 00:25:09
    is much higher than the 200 or so that
  • 00:25:11
    these companies actually delete data
  • 00:25:12
    from. But it's important to note that
  • 00:25:14
    only a handful of states have laws
  • 00:25:16
    requiring data brokers to actually be
  • 00:25:17
    registered. So in reality, that number
  • 00:25:19
    may be much higher when we take into
  • 00:25:21
    account the unregistered brokers. It's
  • 00:25:23
    estimated that worldwide there are over
  • 00:25:25
    5,000 data brokers. Oh, and as it turns
  • 00:25:28
    out, Incogni is owned by Surfshark and
  • 00:25:30
    Surfshark is owned by NordVPN. So, at
  • 00:25:32
    the end of the day, it kind of is just
  • 00:25:34
    the same thing. Funny how that works.
  • 00:25:36
    Fantastic points. Thanks for the extra
  • 00:25:38
    insight. Sure thing, man. I'll let you
  • 00:25:40
    get back to it. Thanks for stopping by.
  • 00:25:47
    This has me all thinking. Are we just
  • 00:25:49
    looking for a new company to hate on?
  • 00:25:51
    Hear me out. After the honey scandal,
  • 00:25:54
    people started giving these services the
  • 00:25:55
    stinky eye because it felt too good to
  • 00:25:57
    be true. As we just learned, it's not as
  • 00:26:00
    wonderful as they make it seem. They
  • 00:26:02
    have good marketing, but it's just a
  • 00:26:04
    simple tool that can only do so much
  • 00:26:05
    about the actual problem. These
  • 00:26:07
    companies are just taking advantage of
  • 00:26:08
    the fact that data brokers run freely
  • 00:26:10
    with no restrictions. It's an infinite
  • 00:26:12
    money glitch that they're just jumping
  • 00:26:13
    on before regulations eventually catch
  • 00:26:16
    up. They're not the source of the
  • 00:26:17
    problem, nor are they the solution to
  • 00:26:19
    the problem. The actual problem is the
  • 00:26:22
    data being collected in the first place.
  • 00:26:24
    Companies like Meta, Google, and Amazon
  • 00:26:25
    have been allowed to collect and sell
  • 00:26:27
    this data with almost no oversight in
  • 00:26:29
    the United States. Even now, 20 states
  • 00:26:32
    have some sort of consumer privacy law,
  • 00:26:34
    but to this day, there isn't a single
  • 00:26:36
    law that prevents a company from
  • 00:26:38
    collecting unnecessary data in the first
  • 00:26:40
    place. The European Union has
  • 00:26:42
    restrictions against these companies
  • 00:26:43
    with their GDPR. They absolutely prevent
  • 00:26:46
    companies from collecting certain types
  • 00:26:48
    of data from the start. It's hard to
  • 00:26:49
    imagine that there would be any kind of
  • 00:26:51
    federal change anytime soon given the
  • 00:26:53
    last three presidents have welcomed big
  • 00:26:55
    tech with open arms. This is not a
  • 00:26:57
    political jab at either side, but
  • 00:26:59
    instead at both. Both Democrats and
  • 00:27:02
    Republicans have enabled companies to
  • 00:27:03
    have this kind of power. Both Democrats
  • 00:27:06
    and Republicans have failed to protect
  • 00:27:07
    the people they're supposed to
  • 00:27:08
    represent. I mean, even the CEOs of
  • 00:27:11
    these companies aren't protected by
  • 00:27:12
    their own data collection. Mark
  • 00:27:14
    Zuckerberg's personal information has
  • 00:27:16
    been breached from Facebook data
  • 00:27:17
    breaches in the past.
  • 00:27:24
    Okay, so what can you do? It might feel
  • 00:27:26
    a little counterintuitive to what I just
  • 00:27:28
    said, but call your local representative
  • 00:27:30
    and demand for better privacy laws. Even
  • 00:27:32
    the states that have good privacy laws
  • 00:27:34
    can be greatly improved. Be loud, be
  • 00:27:37
    annoying. You have to pretend like these
  • 00:27:39
    local representatives have never heard
  • 00:27:41
    about what you're talking about before.
  • 00:27:43
    Be loud, but be clear in your language.
  • 00:27:46
    Be specific. For example, I don't like
  • 00:27:48
    that companies are allowed to collect
  • 00:27:50
    data that is solely used for targeted
  • 00:27:52
    advertising. It is oftentimes greatly
  • 00:27:54
    mishandled and results in getting leaked
  • 00:27:56
    to scammers, which costs Americans
  • 00:27:58
    hundreds of billions of dollars every
  • 00:28:00
    year. If you want to learn more about
  • 00:28:01
    how data breaches lead to scams, watch
  • 00:28:03
    my video about that here. When talking
  • 00:28:05
    to your local representative, give clear
  • 00:28:07
    examples. Identify what companies are
  • 00:28:09
    doing this and show it in their own
  • 00:28:11
    privacy policies. Here's an example.
  • 00:28:14
    Spotify shares data with these 94
  • 00:28:16
    companies by default. And I think it's
  • 00:28:18
    unnecessary for a company that streams
  • 00:28:20
    music to have all of this inferred data.
  • 00:28:22
    Data that is retained for any period of
  • 00:28:24
    time is at risk of a data breach, which
  • 00:28:26
    hurts consumers. When these 94 companies
  • 00:28:28
    are also given access to this
  • 00:28:30
    information, it stops being a question
  • 00:28:32
    of if there will be a data breach.
  • 00:28:34
    Rather, when will one of these 94
  • 00:28:36
    companies get hit with a data breach? In
  • 00:28:38
    2024, nearly half of all companies in
  • 00:28:41
    the United States experienced a data
  • 00:28:43
    breach. Using that information, we can
  • 00:28:45
    reasonably assume that 47 of these
  • 00:28:47
    companies will experience a data breach.
  • 00:28:50
    If there were regulations in place to
  • 00:28:51
    prevent Spotify from collecting this
  • 00:28:53
    information in the first place, the risk
  • 00:28:54
    of a data breach would be greatly
  • 00:28:56
    reduced as the unnecessary user data
  • 00:28:58
    wouldn't be there to begin with. Hackers
  • 00:29:00
    wouldn't have as much incentive to hack
  • 00:29:02
    Spotify. Here's another example. General
  • 00:29:05
    Motors knows what medications I take,
  • 00:29:06
    and they infer my intelligence through
  • 00:29:08
    various sources of data collection
  • 00:29:10
    spelled out in their privacy policy.
  • 00:29:12
    This is an inexcusable abuse of power,
  • 00:29:14
    and they do not need to know any of that
  • 00:29:16
    for me to be able to drive my car. In
  • 00:29:19
    the European Union, General Motors sells
  • 00:29:21
    vehicles that work exactly like they do
  • 00:29:23
    here in the United States. But laws like
  • 00:29:25
    the GDPR protect their citizens, and
  • 00:29:28
    this kind of data isn't collected at
  • 00:29:30
    all. Here's the EU version of the
  • 00:29:32
    privacy policy for General Motors. And
  • 00:29:34
    here's the US version. It's clear why
  • 00:29:37
    they collect as much as they do here.
  • 00:29:39
    There are no regulations to protect
  • 00:29:40
    consumers. If you want something more
  • 00:29:42
    actionable on your end, fight back with
  • 00:29:44
    your wallet if you're able to. For
  • 00:29:46
    example, stop using Gmail. For years
  • 00:29:49
    now, they've read every single email
  • 00:29:50
    that you get, which means that every
  • 00:29:52
    purchase you make is now valuable data
  • 00:29:54
    that they can sell to advertisers. I'll
  • 00:29:56
    note that this can take a very long time
  • 00:29:58
    to fully transition away, though. It
  • 00:30:00
    took me an entire year to move from
  • 00:30:01
    Gmail to Startmail, and even then I
  • 00:30:03
    faced minor inconveniences with services
  • 00:30:05
    that I forgot to move over. It takes
  • 00:30:08
    time, so be patient. Another thing I'll
  • 00:30:10
    always advocate for is to block all the
  • 00:30:12
    ads. A big point of this data being
  • 00:30:15
    collected is to sell to advertisers. So,
  • 00:30:17
    start blocking them from making their
  • 00:30:18
    money. Show that we don't want to be
  • 00:30:20
    tracked like this. I've heard other
  • 00:30:22
    creators compare this to piracy, but I
  • 00:30:25
    disagree. Blocking ads is not piracy. I
  • 00:30:28
    understand that it can hurt small
  • 00:30:29
    creators in the crossfire. So, if you're
  • 00:30:31
    able to support your favorite creators
  • 00:30:33
    directly. Another thing that can help
  • 00:30:35
    prevent cross referencing your data can
  • 00:30:37
    be to avoid public-facing social
  • 00:30:38
    media, at least where your name is
  • 00:30:40
    shown. This can range from things like
  • 00:30:42
    Facebook, LinkedIn, and even Steam. Be
  • 00:30:44
    cautious about where you publicly show
  • 00:30:46
    your name. Most platforms have a privacy
  • 00:30:49
    setting that you can use to hide some of
  • 00:30:50
    this information. If you want more
  • 00:30:52
    information on what you can do about
  • 00:30:54
    feeling powerless to data collection, I
  • 00:30:56
    highly suggest you watch my video on
  • 00:30:57
    that here. Believe me, there is always
  • 00:31:00
    an
  • 00:31:01
    option. As always, thank you for
  • 00:31:03
    watching. Please be kind and patient
  • 00:31:04
    with each other. I want to give a
  • 00:31:06
    special thanks to everyone supporting
  • 00:31:07
    this channel through memberships and
  • 00:31:09
    Patreon. Considering that a lot of my
  • 00:31:11
    channel is about showing the truth
  • 00:31:12
    behind a lot of common sponsors, it
  • 00:31:13
    doesn't feel right for me to start
  • 00:31:15
    taking them on like most growing
  • 00:31:16
    YouTubers do. However, I would love to
  • 00:31:18
    make these videos for you all full-time.
  • 00:31:20
    So, if you have some money to spare and
  • 00:31:22
    want to help make that dream become a
  • 00:31:23
    reality, consider either becoming a
  • 00:31:25
    member or subscribing to my Patreon
  • 00:31:27
    page. By doing so, you'll get early
  • 00:31:29
    access to videos, exclusive access to a
  • 00:31:31
    supporters-only Discord server, and
  • 00:31:33
    more. The link to Patreon is in the
  • 00:31:35
    description or at
  • 00:31:37
    patreon.com/rejectconvenience. Peace.
  • 00:31:39
    Heads up, boss. Spider-Man's headed your
  • 00:31:41
    way.
  • 00:31:43
    [Laughter]
  • 00:31:46
    Excellent. I'll get you this time,
  • 00:31:49
    Spider-Man.
  • 00:31:50
    Doc, you're looking more medicine than
  • 00:31:52
    usual. Do something with your hair. Get
  • 00:31:56
    over here, you
  • 00:31:57
    pest.
  • 00:32:00
    Oh, yowza, you got a punch there,
  • 00:32:03
    Spider-Man. Thanks. I've been working on
  • 00:32:05
    a new move. Want to
  • 00:32:07
    see? Ah, you've defeated me once again.
  • 00:32:12
    Better luck next time. Wow. Yeah,
  • 00:32:15
    Spider-Man. Yeah.
Tags
  • databrokers
  • personal data
  • privacy
  • Delete Me
  • Incogni
  • data deletion
  • information security
  • ad targeting
  • GDPR
  • consumer rights