How Science Can Fix the Media Landscape

00:36:06
https://www.youtube.com/watch?v=4E0FJND3qkA

Summary

TL;DR: In this episode of StarTalk, Neil deGrasse Tyson and Gary O'Reilly discuss the complexities of modern news media, including the impact of the fairness doctrine, the rise of social media, and the challenges of bias in journalism. They are joined by Harleen Kaur, co-founder of Ground News, who explains how her organization aims to provide a more objective view of the news by aggregating many sources and letting users see different perspectives. The conversation highlights the importance of critical thinking, the role of AI in journalism, and the need for individuals to engage with diverse viewpoints to counter echo chambers and filter bubbles.

Takeaways

  • 📰 The fairness doctrine aimed to ensure balanced news coverage.
  • 📺 News consumption has evolved into a 24/7 cycle driven by outrage.
  • 🔍 Ground News seeks to debias news by aggregating multiple sources.
  • 🤔 Critical thinking is essential for understanding news biases.
  • 📊 Metrics for success include the diversity of news sources read by users.
  • 💻 AI can help identify biases but also poses risks like deep fakes.
  • 🌐 Social media often reinforces existing beliefs through filter bubbles.
  • 📖 Individuals should challenge their biases by exploring diverse viewpoints.
  • ⚖️ Disinformation is intentionally misleading, unlike misinformation.
  • 👥 Engaging with opposing views can foster better understanding and dialogue.

Timeline

  • 00:00:00 - 00:05:00

    The discussion begins with a critique of the current state of news media, highlighting the shift from responsible journalism to sensationalism driven by outrage. The hosts, Neil deGrasse Tyson and Gary O'Reilly, reflect on how news consumption has evolved from a simple daily update to a 24/7 cycle that feeds a constant need for information.

  • 00:05:00 - 00:10:00

    They introduce their guest, Harleen Kaur, co-founder of Ground News, who aims to address the biases in news reporting. Harleen explains how news is filtered through different biases depending on the outlet, leading to fragmented perceptions of reality. The conversation emphasizes the importance of critical thinking when consuming news.

  • 00:10:00 - 00:15:00

    Harleen elaborates on Ground News' approach to presenting news, which involves aggregating different perspectives on the same event without imposing bias. They aim to help readers reconstruct the objective truth by showing how various outlets report on the same story, allowing users to make informed decisions based on a spectrum of coverage.

  • 00:15:00 - 00:20:00

    The hosts discuss the historical context of news reporting, mentioning the Fairness Doctrine that once required broadcasters to present multiple viewpoints. The repeal of this doctrine is seen as a turning point that led to the current fragmented media landscape, where sensationalism often overshadows factual reporting.

  • 00:20:00 - 00:25:00

    The conversation shifts to the challenges of identifying bias in news outlets. Harleen explains that Ground News does not rate outlets itself: it averages the ratings of three third-party agencies that use different methodologies (crowdsourcing, expert review, and algorithms) to assess bias and factuality, emphasizing the need for transparency in how news is reported (a minimal sketch of this averaging follows the timeline). They discuss the importance of understanding the framing of stories and the potential for misinformation.

  • 00:25:00 - 00:30:00

    The discussion touches on the psychological aspects of news consumption, including filter bubbles and echo chambers that reinforce existing beliefs. Harleen suggests that the revenue models of news outlets contribute to this issue, as they prioritize engagement over balanced reporting, leading to a lack of diverse perspectives.

  • 00:30:00 - 00:36:06

    Finally, the conversation concludes with a hopeful note about the potential for responsible journalism to emerge amidst the noise. Harleen shares metrics for success at Ground News, highlighting the importance of expanding readers' exposure to diverse news sources and fostering critical thinking in the audience.
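
To make the bias-rating approach from the 00:20:00 segment concrete, here is a minimal sketch in Python. The episode only says that three third-party agencies with different methodologies are averaged; the numeric scale, agency names, and ratings below are illustrative assumptions, not Ground News's actual data or code.

```python
from statistics import mean

# Illustrative five-point scale; the agencies' real scales are an assumption.
BIAS_SCALE = {"far_left": -2.0, "left": -1.0, "center": 0.0,
              "right": 1.0, "far_right": 2.0}

def aggregate_bias(ratings_by_agency: dict) -> float:
    """Average the bias labels assigned by independent rating agencies."""
    return mean(BIAS_SCALE[label] for label in ratings_by_agency.values())

# Three hypothetical agencies with the methodologies named in the episode:
# crowdsourcing, expert review, and algorithmic analysis.
ratings = {"crowd": "left", "experts": "center", "algorithm": "left"}
print(aggregate_bias(ratings))  # about -0.67: the outlet leans left overall
```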

Video Q&A

  • What is the fairness doctrine?

    The fairness doctrine was a policy that required broadcasters to present contrasting viewpoints on controversial issues, ensuring equitable and honest news coverage.

  • How does Ground News aim to reduce bias?

    Ground News aggregates coverage of the same event from outlets across the political spectrum and presents it without adding its own judgment, letting users compare the different perspectives. Its "blind spot" feature also surfaces stories that outlets on one side cover but the other side largely ignores (see the first sketch after this Q&A).

  • What are filter bubbles and echo chambers?

    Filter bubbles are situations where individuals are exposed only to information that aligns with their existing beliefs, while echo chambers reinforce those beliefs by limiting exposure to opposing viewpoints.

  • What is the role of AI in journalism?

    AI can help identify biases, summarize news, and improve comprehension, but it also poses risks like the creation of deep fakes.

  • How can individuals become more aware of news biases?

    Individuals can practice lateral reading, follow diverse news sources, and challenge their own biases to gain a broader understanding of news.

  • What metrics does Ground News use to measure success?

    Ground News measures success by tracking how many distinct news sources users engage with over time; in the episode, Harleen notes that users typically arrive reading two or three trusted sources and within three months many are reading around ten (a minimal version of this metric is sketched below).

  • What is the impact of social media on news consumption?

    Social media often prioritizes sensational content, leading to outrage-driven engagement and reinforcing existing biases.

  • Can a center news outlet be biased?

    Yes, even center outlets can exhibit bias depending on their coverage and the topics they choose to highlight.

  • What is disinformation?

    Disinformation refers to false information that is deliberately spread to deceive, as opposed to misinformation, which is incorrect information shared without malicious intent.

  • How can people find reliable news sources?

    People can use tools like Ground News to compare different outlets and their biases, helping them identify reliable sources.
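
The "blind spot" feature mentioned above can be approximated with a simple coverage check. This is a minimal sketch under assumed data structures (a story is reduced to the list of lean labels of the outlets covering it); the 80% threshold and the labels are illustrative assumptions, not Ground News's actual logic.

```python
def blind_spot(outlet_leans: list, threshold: float = 0.8):
    """Flag a story whose partisan coverage is heavily one-sided."""
    left = sum(lean in ("left", "far_left") for lean in outlet_leans)
    right = sum(lean in ("right", "far_right") for lean in outlet_leans)
    partisan = left + right
    if partisan == 0:
        return None  # only centrist coverage; nothing to flag
    if left / partisan >= threshold:
        return "blind spot for right-leaning readers"
    if right / partisan >= threshold:
        return "blind spot for left-leaning readers"
    return None

# A story covered by four left-leaning outlets and one centrist outlet:
print(blind_spot(["left", "far_left", "left", "left", "center"]))
# -> blind spot for right-leaning readers
```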

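Similarly, the source-diversity metric from the last answer can be computed as a distinct count per user over a time window. A minimal sketch, assuming a simple (user, outlet, date) reading log; the field layout and window are illustrative, not Ground News's KPI pipeline.

```python
from collections import defaultdict
from datetime import date

def distinct_sources(read_events: list, since: date) -> dict:
    """Count the distinct outlets each user has read since a given date."""
    outlets_by_user = defaultdict(set)
    for user, outlet, when in read_events:
        if when >= since:
            outlets_by_user[user].add(outlet)
    return {user: len(outlets) for user, outlets in outlets_by_user.items()}

# Hypothetical reading log: (user, outlet, date)
events = [
    ("u1", "Outlet A", date(2024, 1, 5)),
    ("u1", "Outlet B", date(2024, 2, 9)),
    ("u1", "Outlet C", date(2024, 3, 2)),
    ("u1", "Outlet A", date(2024, 3, 4)),  # repeat reads add no diversity
]
print(distinct_sources(events, since=date(2024, 1, 1)))  # {'u1': 3}
```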

Transcript
  • 00:00:00
    I probably watch much more Fox News than
  • 00:00:02
    you do. There was a doctrine called
  • 00:00:04
    fairness doctrine. FCC wanted uh radio
  • 00:00:06
    and then TV to take responsibility to
  • 00:00:08
    provide equitable and honest version of
  • 00:00:11
    what's really going on. I remember when
  • 00:00:13
    the news would give an opinion. It's an
  • 00:00:16
    outrage engine. They know that more
  • 00:00:18
    outrage you are, more time you'll spend
  • 00:00:20
    on it. Are we kidding ourselves to think
  • 00:00:21
    that we're going to get responsible
  • 00:00:23
    journalism coming forward?
  • 00:00:30
    This is Star Talk special edition. Mhm.
  • 00:00:34
    How about that? Neil deGrasse Tyson here
  • 00:00:36
    right next to Gary O'Reilly. When you
  • 00:00:38
    see Gary, it is special. Today we're
  • 00:00:41
    talking about a very important subject.
  • 00:00:44
    Yeah. The news. Yeah. Was something
  • 00:00:47
    we've been wanting to come to grips with
  • 00:00:48
    for a while now. Who would have thought
  • 00:00:51
    that you'd have to talk about that? When
  • 00:00:53
    I grew up, the news was just the news
  • 00:00:54
    and you went on about your life. You
  • 00:00:56
    watched the news and then you went to
  • 00:00:57
    the real TV when you were growing up as
  • 00:00:59
    a kid.
  • 00:01:01
    This news thing got in the way, right?
  • 00:01:03
    Yeah. The news was like, "Yeah, I don't
  • 00:01:05
    need this." See, now it seems we live in
  • 00:01:09
    a constant need for news. It's not just
  • 00:01:12
    on the hour, it's every hour, 24 hours a
  • 00:01:14
    day. But I would I would say that it's a
  • 00:01:18
    desire for news, but a strong enough
  • 00:01:21
    desire becomes a need. Wow.
  • 00:01:26
    That's what I think is going on. Okay. I
  • 00:01:28
    mean, just have to think about the
  • 00:01:29
    number of news channels there are on TV,
  • 00:01:31
    if TV even exists anymore. Then there's
  • 00:01:34
    online, there's social media platforms.
  • 00:01:36
    Uh, but not just how many are there, how
  • 00:01:38
    many hours a day they broadcast news. Oh
  • 00:01:40
    gosh. Yeah. Right. I mean, it does fold
  • 00:01:43
    out into a large number. I mean, throw
  • 00:01:44
    in the unfiltered
  • 00:01:47
    influencers. Mhm. And then the news
  • 00:01:50
    landscape will and can look a bit of a
  • 00:01:52
    mess. Mhm. Uh we all have our trusted
  • 00:01:56
    news preferences, our go-tos, and as
  • 00:01:58
    I've said before, the better the
  • 00:02:00
    information, the better the the
  • 00:02:02
    decisions that you're going to make
  • 00:02:04
    based on that information. Exactly. Yes.
  • 00:02:06
    Right. But do we know if these new
  • 00:02:09
    sources bring their own filters or their
  • 00:02:11
    own biases?
  • 00:02:13
    Um it's not always obvious to see from
  • 00:02:17
    the outsider just by a headline.
  • 00:02:19
    Sometimes it is by the headline. Uh,
  • 00:02:21
    this is where our guest comes in, Neil.
  • 00:02:22
    So, if you would introduce them, please.
  • 00:02:24
    I'd be delighted to. Yeah. Yes. We have
  • 00:02:27
    with us Harleen Kaur. Harleen, welcome
  • 00:02:30
    to Star Talk. Thanks, Neil. Delighted to
  • 00:02:32
    be here. Excellent. And you're in from
  • 00:02:33
    Canada. That's right. Canada. The 51 The
  • 00:02:36
    51st state. That's right. We'll have none of
  • 00:02:38
    that.
  • 00:02:40
    Where did you read that, Neil? I I don't
  • 00:02:42
    know. I Some news source told me that
  • 00:02:45
    that's that's what it was. That
  • 00:02:46
    happened. Uh, co-founder and CEO. I got
  • 00:02:50
    it here of ground news. All right. Are
  • 00:02:54
    you grounded?
  • 00:02:56
    Grounded means you have an objective
  • 00:02:58
    understanding of reality in any
  • 00:02:59
    language. I'm pretty sure cuz the ground
  • 00:03:02
    is the ground. Yep. You're a former
  • 00:03:04
    engineer. That's That's right. What kind
  • 00:03:06
    of engineer? Uh a space engineer. Space.
  • 00:03:09
    Loving it. Oh, in the right place. Yes.
  • 00:03:12
    So a big fan. Okay. Thank you. You are
  • 00:03:16
    trying to
  • 00:03:18
    fix the news problem, not by giving it a
  • 00:03:21
    bias of your own, but by figuring out a
  • 00:03:24
    way to debias it. That's right. In some
  • 00:03:28
    objective way that people around a table
  • 00:03:30
    say, "Hey, I see what you did there."
  • 00:03:32
    That's right. And we kind of all agree,
  • 00:03:33
    no matter what side of the aisle you're
  • 00:03:35
    on, as Gary said, if I have what I would
  • 00:03:38
    consider a trusted source of news, and
  • 00:03:40
    what you do to the news makes it look
  • 00:03:42
    different from that. Yeah. Yeah. Why
  • 00:03:43
    should I have any confidence at all that
  • 00:03:45
    you're doing the right thing? Yeah, that
  • 00:03:47
    that's a very good question. So, let me
  • 00:03:49
    try and explain what what we do at
  • 00:03:50
    ground news. So, we are not No, there is
  • 00:03:52
    no try. There's do or do not. I I I
  • 00:03:56
    shall explain. Okay. Uh what we are
  • 00:03:59
    doing at ground news. So, how we view
  • 00:04:01
    news is that something happens as you
  • 00:04:03
    call the objective truth. something
  • 00:04:04
    happens and then it goes through this
  • 00:04:06
    prism of the media landscape and then it
  • 00:04:08
    fragments into all these million of
  • 00:04:10
    different versions of what what exactly
  • 00:04:12
    happens and depending on where news
  • 00:04:15
    outlets are on uh what their biases are
  • 00:04:17
    or what their agendas are or who's
  • 00:04:19
    funding them uh who owns them who who is
  • 00:04:22
    their audience that they don't want to
  • 00:04:24
    piss off and who's the sponsor who's the
  • 00:04:25
    sponsor and then they will tell you
  • 00:04:29
    although they uh the the event that
  • 00:04:31
    they're reporting is is is the the same
  • 00:04:34
    event, but how they're reporting it is
  • 00:04:36
    is going to be very very different. Um,
  • 00:04:39
    and depending on what version you're
  • 00:04:40
    reading, your your perception of the
  • 00:04:43
    reality of what happens is going to be
  • 00:04:44
    very very different to each other to the
  • 00:04:46
    point. Yeah. Using a space analogy,
  • 00:04:48
    you're lit we are literally sometimes
  • 00:04:50
    living in the different universes uh
  • 00:04:51
    depending on what news outlets or or
  • 00:04:54
    group. That analogy totally works. Yeah.
  • 00:04:56
    Thank you. You've met people and say,
  • 00:04:58
    "What universe did you did you come
  • 00:05:00
    from?" Yeah, literally. I've said that
  • 00:05:02
    way too many times lately. Yeah. Yeah.
  • 00:05:04
    Are we uh are we finding the same thing?
  • 00:05:06
    So, yeah, our job is not to say that
  • 00:05:08
    this one's right or this one's wrong. Um
  • 00:05:11
    and what we do is we literally uh
  • 00:05:15
    reconstitute all of that those versions
  • 00:05:17
    together to to to reverse engineer what
  • 00:05:21
    might have happened. So, we'll show
  • 00:05:23
    Whoa. Mhm. Whoa. It's a new take on it.
  • 00:05:27
    That's badass. Yeah. So, what criteria?
  • 00:05:29
    Let her finish her things. She Just use
  • 00:05:31
    the word reverse engineering. Let let
  • 00:05:33
    that sentence finish. The engineer said
  • 00:05:34
    reverse engineer surprise. Yeah. Yeah.
  • 00:05:37
    It's I hope it doesn't come become
  • 00:05:39
    scientist versus the engineer here. No,
  • 00:05:41
    no, no, not for this interview.
  • 00:05:44
    Otherwise, meet me outside. We'll talk
  • 00:05:45
    about No, I give up. Um, we put all
  • 00:05:49
    those sources together. So, let's say um
  • 00:05:52
    yeah, there is some executive order that
  • 00:05:54
    has passed and you there's a news story
  • 00:05:57
    saying, "Hey, this is the headline and
  • 00:05:58
    this is what happened." We will show you
  • 00:06:00
    along the spectrum of how the different
  • 00:06:02
    news sources cover it all the way from
  • 00:06:04
    the far left to the far right. And then
  • 00:06:06
    we don't put any check marks or X's
  • 00:06:08
    against any of them. We very much let
  • 00:06:10
    you decide where where the truth kind of
  • 00:06:14
    uh gets reconstituted and and you put up
  • 00:06:16
    you use your critical thinking to put
  • 00:06:17
    that together. You use your critical
  • 00:06:19
    thinking. Yes. What does that presume?
  • 00:06:23
    Where's the big assumption there? You're
  • 00:06:25
    absolutely right. or you should use your
  • 00:06:27
    critical thinking which is which is a
  • 00:06:29
    skill we are all losing we which is
  • 00:06:31
    which is a skill interesting you say
  • 00:06:33
    that
  • 00:06:34
    how it's more herd mentality then it's
  • 00:06:37
    herd mentality but also I think we
  • 00:06:39
    becoming lazy a bit because we I feel
  • 00:06:42
    like we like to be intellectually lazy
  • 00:06:44
    because it's it's great to hear somebody
  • 00:06:46
    else talk about what they think about it
  • 00:06:48
    or what their opinion is about something
  • 00:06:50
    and then regurgitate it rather than
  • 00:06:51
    using your own brain to be able to say
  • 00:06:53
    because it takes effort to be able to to
  • 00:06:55
    do that. I I'll just pick one guy, one
  • 00:06:57
    girl, one Substack, one podcast, one one
  • 00:07:00
    newsletter, one news channel, whatever
  • 00:07:02
    it is, and then just follow uh the one
  • 00:07:04
    that I agree with and reinforces my
  • 00:07:06
    cognitive bias. So, the psychology of
  • 00:07:09
    the news and how it's absorbed, how it's
  • 00:07:12
    portrayed
  • 00:07:14
    is now much much deeper than you and I
  • 00:07:17
    growing up. Oh, there's a nice guy in a
  • 00:07:19
    suit and a tie and he's reading the
  • 00:07:20
    stories from the day at 6:00 p.m. and
  • 00:07:22
    then we moved on. There are a couple of
  • 00:07:24
    reasons why that's not the case anymore.
  • 00:07:27
    So one is um there was a doctrine called
  • 00:07:30
    fairness doctrine if you've heard of
  • 00:07:32
    that that came into existence I think
  • 00:07:33
    late 1940s uh which that early that
  • 00:07:36
    early because that's before TV that that
  • 00:07:39
    that's because they wanted FCC wanted uh
  • 00:07:41
    radio and then TV to take responsibility
  • 00:07:43
    to provide um a more um equitable and
  • 00:07:47
    honest version of what's really going
  • 00:07:49
    on. So there was a fairness doctrine
  • 00:07:51
    where the onus was on the broadcasters
  • 00:07:53
    to actually show all versions of and
  • 00:07:54
    they had that control of them because
  • 00:07:56
    the federal government allocated the
  • 00:07:59
    electromagnetic spectrum to them. Right.
  • 00:08:02
    It was that's right. That's how they
  • 00:08:03
    control it. The licensing.
  • 00:08:05
    right and then it got repealed during
  • 00:08:07
    the Reagan administration. So that's
  • 00:08:10
    where the fragmentation really really
  • 00:08:11
    happened because then there wasn't there
  • 00:08:13
    wasn't uh uh any legal obligation to be
  • 00:08:16
    able to say that I have to show all this
  • 00:08:18
    various version and and show the that's
  • 00:08:20
    why the the guy in a tie uh that told the
  • 00:08:23
    Walter Cronkite version of the news that
  • 00:08:25
    used to exist didn't exist and then of
  • 00:08:27
    course everything spun out of control
  • 00:08:29
    when um news hit the internet and then
  • 00:08:32
    later hit social media and then it just
  • 00:08:34
    went crazy and um because I remember I
  • 00:08:37
    mean this is how old I I remember when
  • 00:08:40
    the news would give an opinion. It was
  • 00:08:43
    like, "Are you seated? Okay, we're about
  • 00:08:46
    to give an opinion. Get ready for this
  • 00:08:48
    opinion. We're going to come back and
  • 00:08:50
    we're going to sit here and this is
  • 00:08:51
    going to be an opinion." Like flashing
  • 00:08:53
    an opinion. And then it was over and
  • 00:08:55
    then and it wasn't told. It wasn't snuck
  • 00:08:57
    in to be said, "This is news. This is an
  • 00:08:59
    opinion." Yeah. As you said, very much
  • 00:09:01
    categorized. So, let me ask you, uh,
  • 00:09:04
    what criteria are you using at Ground
  • 00:09:07
    News to determine an outlet's bias? Be
  • 00:09:09
    it left, center, or right? That's right.
  • 00:09:12
    So, um, and can a center be biased?
  • 00:09:16
    Oh, that's a philosophical question.
  • 00:09:17
    That's a philosophical. Oh, yeah. Yeah.
  • 00:09:19
    Okay. Go ahead. Um, so one one decision
  • 00:09:21
    we made early on again to be as neutral
  • 00:09:23
    as possible and in a way as scientific
  • 00:09:25
    as possible. We do not determine the
  • 00:09:27
    rating that if this CNN is left or Fox
  • 00:09:30
    is right. We're using thirdparty uh
  • 00:09:32
    rating agencies and actually we are
  • 00:09:34
    using three of them who use three
  • 00:09:36
    different methodologies. One of them is
  • 00:09:37
    using crowd sourcing, one of them is
  • 00:09:38
    using experts, one of them, one of them
  • 00:09:41
    is is uh using algorithms and then we
  • 00:09:43
    take an statistical average of them and
  • 00:09:46
    then say okay based on these these uh
  • 00:09:48
    these rating agencies that's where the
  • 00:09:50
    news outlet lies. Um and do you look for
  • 00:09:53
    key words that would indicate that's
  • 00:09:55
    that's right that's right. How does the
  • 00:09:56
    story get framed? What topics do they
  • 00:09:58
    cover more often than less often? and
  • 00:10:00
    how much time they give to that topic to
  • 00:10:02
    that topic which is which is very very
  • 00:10:04
    interesting as well that it's not one
  • 00:10:06
    thing that we stumbled upon to be honest
  • 00:10:08
    I did not set out to do was it's not
  • 00:10:10
    necessarily the the spin on the coverage
  • 00:10:14
    it's the lack of coverage completely
  • 00:10:16
    that that tells the bias of the out yeah
  • 00:10:19
    um very recently um when markets were
  • 00:10:22
    crashing there were certain outlets if
  • 00:10:24
    you went to it then you wouldn't know
  • 00:10:26
    that there was anything terrible
  • 00:10:28
    happening in the financial here the
  • 00:10:30
    people the nothing to see here approach
  • 00:10:33
    to news you said the spin do we still
  • 00:10:36
    call them spin doctors or is that such
  • 00:10:39
    an archaic term well that for the
  • 00:10:41
    politicians yes but I think yeah the the
  • 00:10:43
    news outlets are very very much uh very
  • 00:10:45
    much spin doctor was one person among
  • 00:10:48
    many who was doing the spin but now the
  • 00:10:50
    many are spinning so it's a spinning of
  • 00:10:53
    doctors several doctors convention
  • 00:10:58
    okay So when when an I suppose an
  • 00:11:01
    article goes beyond simple bias and it's
  • 00:11:04
    actually misinformed, misleading. Yes. I
  • 00:11:07
    mean not misinformation but
  • 00:11:09
    disinformation. Yes. How how do you sort
  • 00:11:11
    of scan that? And what's behind that I
  • 00:11:15
    think is you began this conversation
  • 00:11:18
    saying there's an event. Yes. And then
  • 00:11:20
    you watch how people cover the event.
  • 00:11:21
    Yes. Or don't. However,
  • 00:11:24
    that presumes that everyone has equal
  • 00:11:27
    access to the objective true information
  • 00:11:30
    about the event. But in the days of
  • 00:11:33
    reporters, different reporters would be
  • 00:11:35
    delivering information back to the to
  • 00:11:38
    the newsroom
  • 00:11:40
    from their view. Yeah. So, it's there's
  • 00:11:43
    another layer in there, isn't there? Uh,
  • 00:11:45
    it's not just the person presenting the
  • 00:11:47
    news or writing the article. It's the
  • 00:11:49
    person who's supplying supplying the
  • 00:11:51
    information. The information. Yes. Yes.
  • 00:11:53
    So again the it's very hard to say that
  • 00:11:56
    there anything is objective because this
  • 00:11:58
    is a chain of humans as you described
  • 00:11:59
    it. Somebody's reporting it, somebody's
  • 00:12:01
    writing about it and then somebody's
  • 00:12:02
    watching it and making sense out of it.
  • 00:12:04
    The game of telephone in the UK. That's
  • 00:12:08
    that's exactly what I or maybe I did and
  • 00:12:11
    I don't remember. So no I know we
  • 00:12:14
    invented the telephone so maybe you
  • 00:12:15
    didn't do it but we we wouldn't play
  • 00:12:18
    just out of spite.
  • 00:12:20
    No, in elementary school, you do it in
  • 00:12:22
    elementary school, like kindergarten or
  • 00:12:23
    something, and someone starts with a
  • 00:12:25
    story that has a little bit of detail,
  • 00:12:27
    but not not Okay. On a level that you
  • 00:12:31
    can't remember it, right? It's like, so
  • 00:12:34
    Mary wore a blue dress to Johnny's
  • 00:12:38
    birthday party, right? And he turned six
  • 00:12:41
    and he blew out the candles and made a
  • 00:12:43
    wish he'd go to Disneyland. Okay.
  • 00:12:45
    Something like that. That's very There's
  • 00:12:47
    nothing weird about that. And you and I
  • 00:12:49
    tell it to you, you tell it to the next
  • 00:12:50
    person. You whisper it and then at the
  • 00:12:52
    end it's it's like, "Hey, Joey wanted to
  • 00:12:55
    go into space and have a birthday
  • 00:12:57
    party." The whole
  • 00:12:59
    It's one of the first things we learn in
  • 00:13:01
    elementary school how how unreliable
  • 00:13:04
    Yeah. the human when you pass that
  • 00:13:06
    information communicating information
  • 00:13:07
    is. That That's right. But if you had
  • 00:13:09
    the versions of all of those people
  • 00:13:10
    along the chain and put it together,
  • 00:13:13
    perhaps you can deci decipher what where
  • 00:13:15
    exactly. But everyone's version is
  • 00:13:17
    accurate in their own mind and is passed
  • 00:13:20
    on. But the the helicopter view is
  • 00:13:22
    something very different. That's right.
  • 00:13:24
    So let me ask answer your question Gary.
  • 00:13:26
    Um how do you determine if there is um
  • 00:13:28
    disinformation um in included? So let's
  • 00:13:31
    let me take an extreme example. I don't
  • 00:13:33
    know um there was a claim a few years
  • 00:13:35
    ago to totally um totally false claim
  • 00:13:38
    that um medicine called ivermectin
  • 00:13:40
    cured covid. And let's let's assume a
  • 00:13:43
    news outlet um one uh publishes that
  • 00:13:46
    that claim. Um so what ground news does
  • 00:13:49
    is again we will not just show that
  • 00:13:52
    claim published by that outlet. We'll
  • 00:13:54
    also show all the other outlets
  • 00:13:56
    commenting on it saying hey how how
  • 00:13:58
    there there are claims out there. So
  • 00:14:00
    they're reactive reaction videos
  • 00:14:03
    reaction videos and also correcting it.
  • 00:14:05
    scientists go out and correcting it and
  • 00:14:07
    and publishing uh publishing reports
  • 00:14:09
    that hey this is this is this is a claim
  • 00:14:10
    that's not true. The second thing we do
  • 00:14:12
    is apart from bias ratings we also
  • 00:14:14
    provide factuality ratings of the news
  • 00:14:16
    outlets. So how have they historically
  • 00:14:19
    historically um been reporting and so on
  • 00:14:23
    their reporting uh on their reporting
  • 00:14:24
    practices we'll say hey this is this is
  • 00:14:26
    historically is it from one source or
  • 00:14:28
    again is that cluster is from a cluster
  • 00:14:30
    of source again trying to be as neutral
  • 00:14:32
    and as close to objectivity as we
  • 00:14:34
    possibly can be uh that we do. So again
  • 00:14:37
    when you're reading that news not in
  • 00:14:39
    isolation but again uh clustered with
  • 00:14:42
    the other other reactions other other um
  • 00:14:46
    uh disproving of that claim then you
  • 00:14:48
    then you can you can you have all the
  • 00:14:51
    information at least in one place to be
  • 00:14:52
    able to say this is this is this is true
  • 00:14:54
    or not. Um but that's what we're trying
  • 00:14:57
    to do. Okay. Word seller question.
  • 00:14:59
    Filter bubbles, echo chambers and
  • 00:15:01
    cognitive biases. Um, they're not
  • 00:15:04
    phrases that you would have heard 10
  • 00:15:06
    years ago probably. Yes. But now, well,
  • 00:15:08
    cognitive bias is well known. Yes. But
  • 00:15:10
    I'm saying psychological in terms of a
  • 00:15:12
    news. No. Yeah. Definitely not news. So
  • 00:15:15
    this is this is now the landscape of
  • 00:15:18
    news media. You've got to go through
  • 00:15:20
    filter bubbles or look look through
  • 00:15:22
    someone's bubble, understand if that is
  • 00:15:24
    something and then find yourself in an
  • 00:15:26
    echo chamber. That's right. So
  • 00:15:28
    interestingly enough, I'll start at a
  • 00:15:29
    very different place. I think the the
  • 00:15:31
    reason all this has happened again going
  • 00:15:33
    back to what's happened with news is uh
  • 00:15:35
    one of the main reasons is the revenue
  • 00:15:36
    models of the news outlets. So the
  • 00:15:38
    revenue models of news outlets have gone
  • 00:15:40
    to similar to social media advertising
  • 00:15:43
    and how much time can we retain you on
  • 00:15:45
    the channel or on the on the app or on
  • 00:15:48
    the website. Commodity is your
  • 00:15:49
    attention. That that's it. That's what
  • 00:15:51
    they are. But then how do you do that?
  • 00:15:55
    by not showing you stuff you might might
  • 00:15:57
    disagree with and leave the site or
  • 00:15:59
    leave the app. So, keep showing you
  • 00:16:01
    again reinforcing that that that
  • 00:16:03
    cognitive bias uh creating creating that
  • 00:16:06
    bubble. So, and then you're like, "Yeah,
  • 00:16:08
    this news outlet gets me or this guy and
  • 00:16:10
    girl gets me. I want to keep in the old
  • 00:16:12
    days and I just know this from what I
  • 00:16:14
    was told. I didn't research this. The
  • 00:16:16
    news was not expected, TV news was not
  • 00:16:21
    expected to be a revenue generating
  • 00:16:23
    center." Yes, it was funded by all of
  • 00:16:26
    the other programming that went on in
  • 00:16:28
    the day and the news was a service to of
  • 00:16:31
    course it had ads. Yes. But that it was
  • 00:16:35
    there wasn't a calculation done that
  • 00:16:37
    they have to adjust the news to boost ad
  • 00:16:39
    revenue. Yeah. But now each each news
  • 00:16:41
    channel is its own profit center. So uh
  • 00:16:44
    then how do you how do you make sure
  • 00:16:46
    that remains remains profitable as you
  • 00:16:48
    you're saying that if if that is the
  • 00:16:50
    revenue generating uh revenue generating
  • 00:16:53
    source then you keep showing people what
  • 00:16:55
    what they want to see and not let them
  • 00:16:57
    you say that hey I cannot own the entire
  • 00:17:00
    entire population I'm going to own this
  • 00:17:02
    slice of population that believes in
  • 00:17:03
    these things and I'm going to keep
  • 00:17:04
    reinforcing those belief these these few
  • 00:17:07
    things I like that phrase to own them
  • 00:17:09
    own them basically that is they do what
  • 00:17:12
    you say they think what you think. Yeah.
  • 00:17:14
    And they think what you tell them to
  • 00:17:15
    think. Yeah. And then they own you. And
  • 00:17:17
    then you you're working very much with
  • 00:17:19
    the demographics. That's that's right.
  • 00:17:21
    So So that's that's where I think again
  • 00:17:23
    going back to the different universes
  • 00:17:25
    come in come in that Yeah. If you're
  • 00:17:27
    reading reading that channel uh
  • 00:17:28
    listening to that channel or or reading
  • 00:17:30
    that newspaper or even group of
  • 00:17:32
    newspapers that are are similar to that
  • 00:17:35
    ideology then you would think of things
  • 00:17:37
    happening very differently. what they
  • 00:17:39
    what another person at an opposite end
  • 00:17:41
    of the spectrum might be thinking. Hey
  • 00:17:43
    Star Talk fans, I don't know if you know
  • 00:17:46
    this, but the audio version of the
  • 00:17:49
    podcast actually posts a week in advance
  • 00:17:53
    of the video version and you can get
  • 00:17:55
    that in Spotify and Apple Podcast and
  • 00:17:59
    most other podcast outlets that are out
  • 00:18:02
    there. Multiple ways to ingest all that
  • 00:18:05
    is cosmic on Star Talk. Now that we
  • 00:18:08
    understand a little bit better what's
  • 00:18:10
    out there and how it's sort of brought
  • 00:18:13
    forward, what strategies can people
  • 00:18:16
    develop? Yeah. To be able to see through
  • 00:18:20
    to be able to be aware of what bias
  • 00:18:23
    might be spun at self-awareness.
  • 00:18:26
    That's what it comes down to, right? It
  • 00:18:28
    is, but it's a very tough ask for
  • 00:18:30
    somebody to do. Um I think to challenge
  • 00:18:33
    that a few tools that we are using and
  • 00:18:36
    and as a lay person even if you don't
  • 00:18:37
    want to use ground news I hope you do
  • 00:18:39
    but if you don't you can use it
  • 00:18:40
    yourself. Uh one is very much what we
  • 00:18:43
    call lateral reading again take the new
  • 00:18:46
    sources and read it across even if you
  • 00:18:48
    don't agree with them you don't have to
  • 00:18:50
    but just having that access and and
  • 00:18:52
    challenging yourself as you said having
  • 00:18:54
    the self-awareness that there is there
  • 00:18:55
    are other versions of what's happening.
  • 00:18:57
    Um second as I said uh by doing going
  • 00:19:00
    across the news sources or if you are
  • 00:19:03
    let's say on social media you are that
  • 00:19:04
    person who gets news on social media go
  • 00:19:06
    follow accounts that you might not agree
  • 00:19:08
    with and and and they make you might
  • 00:19:10
    make you angry but at least going out
  • 00:19:12
    and seeing what what we are calling
  • 00:19:14
    blind spots. So we have a feature called
  • 00:19:16
    blind spots. What we mean is that if you
  • 00:19:19
    were reading a certain set of news
  • 00:19:21
    sources, you would have never come
  • 00:19:22
    across these news stories and every
  • 00:19:24
    single day and it's not just one side or
  • 00:19:26
    the other. Both sides of the spectrum
  • 00:19:28
    are very much they do that. They just
  • 00:19:30
    leave certain news stories out. So, how
  • 00:19:32
    are you going to find that uh find them?
  • 00:19:34
    And by the way, just as a professional
  • 00:19:37
    educator, can I call myself that? You
  • 00:19:40
    just did. I mean, yes, it's true. You
  • 00:19:41
    are. Okay. You are. Um, I, you know, I'm
  • 00:19:45
    born and raised in New York City, so I
  • 00:19:46
    lean left politically. But when someone
  • 00:19:49
    starts
  • 00:19:51
    railing on the political right, and I
  • 00:19:54
    say, "Where did you get that
  • 00:19:55
    information?" And they talk about the
  • 00:19:56
    New York Times or MSNBC, whatever, then
  • 00:19:59
    I tell them, "I probably watch much more
  • 00:20:01
    Fox News than you do." So, you're doing
  • 00:20:04
    that already. Yes. Yes. I do it on
  • 00:20:06
    purpose. And what that helps me is I
  • 00:20:08
    know there's our demographics that watch
  • 00:20:09
    Fox News exclusively. Yes. Yeah. And
  • 00:20:11
    I've been on Fox News. Okay. But a
  • 00:20:13
    couple other shows. So when I'm out in
  • 00:20:15
    the in the wild,
  • 00:20:18
    we let we let you loose. Yeah. When when
  • 00:20:21
    I'm set forth into the nation, I I have
  • 00:20:25
    some sense of what forces are operating
  • 00:20:27
    on people's thoughts and it makes me a
  • 00:20:29
    way more potent educator. I think that I
  • 00:20:32
    I'm so glad you say, Neil, that you do
  • 00:20:35
    that because you then you can exactly
  • 00:20:37
    have that empathy to understand where
  • 00:20:38
    people are coming from. It's not to pass
  • 00:20:39
    judgment. It's to just understand. Yeah.
  • 00:20:41
    We we get uh such heartening feedback
  • 00:20:44
    all the time where hey I stopped talking
  • 00:20:46
    to my father or stop talking to my uncle
  • 00:20:48
    or husband and wife stop talking because
  • 00:20:50
    our political views didn't agree and
  • 00:20:52
    it's it's fracturing people uh and and
  • 00:20:55
    one common thing they do is okay let's
  • 00:20:57
    agree not to talk politics but that's
  • 00:20:58
    cannot be the solution like we cannot
  • 00:21:00
    solve other problems if we we don't
  • 00:21:03
    address um and bring people back to
  • 00:21:05
    common ground. So that's I think the
  • 00:21:08
    only way you can do that is presenting
  • 00:21:10
    all of the different opinions and you
  • 00:21:11
    don't have to agree with it but when you
  • 00:21:13
    run into that person who have this
  • 00:21:14
    opinion you can have at least a educated
  • 00:21:17
    conversation about it. That's a strategy
  • 00:21:20
    for an individual. Yes. That wants to
  • 00:21:23
    get a better understanding of the news
  • 00:21:27
    landscape. Doesn't that assume they want
  • 00:21:28
    to get a better understanding? I I think
  • 00:21:31
    so. Yeah. Suppose they don't want to.
  • 00:21:34
    It's interesting you say that. Be fed
  • 00:21:35
    the way they are. I So I think nobody
  • 00:21:38
    wants to be gamed. I think that's that's
  • 00:21:40
    for sure. Nobody wants say you're being
  • 00:21:43
    gamed. Oh, that those are fighting
  • 00:21:45
    words. I like that.
  • 00:21:48
    You have been gamed. Yeah. And that that
  • 00:21:51
    is everybody thinks they have
  • 00:21:52
    self-awareness, right? Everybody thinks
  • 00:21:54
    they have self-awareness.
  • 00:21:56
    Uh it's just that it's very challenging
  • 00:21:58
    when we are presenting a worldview that
  • 00:22:00
    we don't agree with. And how are you
  • 00:22:02
    going to go um find it if you keep
  • 00:22:04
    cocooning yourself with information and
  • 00:22:06
    if it's not in agreeance with you then
  • 00:22:07
    it's wrong. Yes. That's that's just
  • 00:22:10
    different. Yeah. Right. Right. Mhm.
  • 00:22:12
    Let's spin that round. Yeah. Uh rather
  • 00:22:14
    than on put the burden on the respons of
  • 00:22:16
    responsibility on the individual. Could
  • 00:22:18
    the corporations and there are major
  • 00:22:20
    corporations in play here. Could they be
  • 00:22:22
    more responsible for the messaging? We
  • 00:22:25
    can. I think um again things like
  • 00:22:27
    fairness doctrine was was one of the
  • 00:22:29
    ways uh that we could ask uh ask but I
  • 00:22:32
    don't think that's going to happen
  • 00:22:33
    again. Um one specific thing is of
  • 00:22:36
    course social media. I think social
  • 00:22:38
    media is as as you know it's it's a it's
  • 00:22:41
    the most intense form of those uh
  • 00:22:43
    reinforced algorithms outrage. It's an
  • 00:22:46
    outrage engine. They know that more
  • 00:22:48
    outrage you are more time you'll spend
  • 00:22:50
    on it and and uh more likely you are to
  • 00:22:52
    click. But that's an example though.
  • 00:22:54
    It's the opposite of what you said a
  • 00:22:55
    moment ago. There's one thing to show me
  • 00:22:58
    what I want to see. Yeah. Because I
  • 00:23:01
    agree with it. But if you show me
  • 00:23:04
    something that I vehemently disagree
  • 00:23:07
    with, that gets me bubbling and then I
  • 00:23:09
    for that look what they said over here
  • 00:23:12
    without checking what what so it seems
  • 00:23:15
    to work on both extremes of that. It
  • 00:23:17
    does. But but I think again it's not
  • 00:23:19
    that um that showing you Yeah. the the
  • 00:23:22
    emotion works on both extremes but uh
  • 00:23:25
    but again you might forward it but if
  • 00:23:27
    you but it's such an exaggerated version
  • 00:23:30
    of whatever it is on the other side as
  • 00:23:32
    well that you are shown it's not exactly
  • 00:23:34
    that you're becoming enlightened by
  • 00:23:36
    seeing seeing that news story you're
  • 00:23:38
    getting enraged by seeing that news
  • 00:23:39
    story but not not necessarily um but
  • 00:23:42
    yeah because um it's it's u again
  • 00:23:45
    whatever whatever the hot button topic
  • 00:23:47
    is take the most emotional exaggerated
  • 00:23:49
    version of that and show it to you what
  • 00:23:51
    is your revenue model. Yeah. So, one
  • 00:23:55
    good question. That's a good question.
  • 00:23:56
    Wow. Whoa. Whoa. We're talking about
  • 00:23:58
    other things in there, didn't you?
  • 00:24:02
    Right. Yeah. How you like that?
  • 00:24:06
    No softball questions. Try try to answer
  • 00:24:08
    that one. Yeah. No, it it is
  • 00:24:10
    straightforward. So, one thing early on
  • 00:24:12
    we we decided we are not going to do the
  • 00:24:14
    ad ad revenue model because then you are
  • 00:24:16
    just recreating the problem that you're
  • 00:24:18
    trying to solve. So, we decided to go
  • 00:24:20
    with a subscription model. If you find
  • 00:24:22
    our tools, our navigation tools to read
  • 00:24:24
    news helpful, if you find our analysis
  • 00:24:27
    helpful, then you can pay us a
  • 00:24:28
    subscription to be able to use the
  • 00:24:30
    product. And it's it's 100% subscription
  • 00:24:33
    way to do that. If you if you find value
  • 00:24:35
    in the product, pay pay it to us. But we
  • 00:24:37
    have a premium model. We realize not
  • 00:24:38
    everybody. If they don't find any value
  • 00:24:40
    in it, then you pay them. Is that
  • 00:24:43
    Sorry, you wasted your time. You wasted
  • 00:24:45
    my my time is $100 an hour and outside
  • 00:24:49
    of the revenue models. Yeah. Um,
  • 00:24:53
    is it my imagination? Yeah. When it
  • 00:24:55
    might be, but has science become a
  • 00:24:59
    trigger, especially on social media?
  • 00:25:02
    That's that's an interesting question.
  • 00:25:03
    What do you mean by that? Yeah. Okay.
  • 00:25:05
    Throw a view at somebody that aggravates
  • 00:25:09
    the out of them, right? Yeah. And
  • 00:25:11
    science seems to be one of those trigger
  • 00:25:13
    points. My answer to that would be
  • 00:25:15
    because I think people like making
  • 00:25:17
    compelling arguments on social medias
  • 00:25:19
    and that's why throwing a scientific I
  • 00:25:21
    don't know an excerpt of scientific
  • 00:25:23
    report or scientific news which is
  • 00:25:26
    either taken out of context which as a
  • 00:25:28
    scientist you would never do you would
  • 00:25:29
    explain the nuance and and to make so
  • 00:25:32
    good point. So you throw in a little bit
  • 00:25:33
    of science. Exactly. You get to boost
  • 00:25:36
    what your audience might think is the
  • 00:25:37
    authenticity of the account. It goes
  • 00:25:39
    back to that old adage of every good lie
  • 00:25:43
    Yes. has a grain of truth. Now, it
  • 00:25:45
    depends on the size of that grain. I've
  • 00:25:47
    never heard that. Oh, you kidding me?
  • 00:25:48
    Never. So, you haven't heard of
  • 00:25:50
    telephone any No, no, no. I've heard
  • 00:25:52
    I've heard every day is a school day.
  • 00:25:54
    No, not every lie. I've heard every
  • 00:25:56
    stereotype has a grain of truth. I've
  • 00:25:57
    heard that. Oh, well then it's copy and
  • 00:25:59
    paste. Every good lie has a grain of
  • 00:26:02
    truth in it. Okay. It's one of those
  • 00:26:04
    sort of it's part of the story. So, it
  • 00:26:05
    goes back to telephone. It's it's the
  • 00:26:07
    storytelling that we never went to the
  • 00:26:08
    moon has no truth in it at all. That's
  • 00:26:11
    why I that earth is flat. There's no
  • 00:26:14
    truth in it. As a former NASA engineer,
  • 00:26:16
    I think we can very much agree on that
  • 00:26:18
    one. There is objective. Um, science is
  • 00:26:21
    weaponized
  • 00:26:23
    because it's it's it adds um heft to an
  • 00:26:26
    argument. But if you use a snippet out
  • 00:26:28
    of context um so what we started at
  • 00:26:31
    least doing a ground news is if if there
  • 00:26:33
    was a a news story being reported about
  • 00:26:36
    a study that's that every single day
  • 00:26:38
    there's some study coming out and then
  • 00:26:39
    the headline only covers a partial we
  • 00:26:42
    actually started um actually started
  • 00:26:44
    connecting that report. So if you want
  • 00:26:46
    to go read the report in the entirety
  • 00:26:48
    and even summarize it for you and say
  • 00:26:49
    hey this is the entirety of it. How's AI
  • 00:26:52
    summaries lately? They got much much
  • 00:26:54
    better. they have gotten much much
  • 00:26:55
    better but out of the box LLMs have a
  • 00:26:58
    lot of hallucination which is for a use
  • 00:27:00
    case like news is exactly the opposite
  • 00:27:02
    of what we're trying to do so we've
  • 00:27:04
    worked a lot on putting guard rails in
  • 00:27:06
    place that it sticks to exactly what's
  • 00:27:08
    been said. Ayahuasca? I know, I think one day
  • 00:27:13
    so I mean this I mean is is this where
  • 00:27:15
    the open
  • 00:27:17
    sitting there open the hatch and there
  • 00:27:18
    it is doing ayahuasca, doing it with a shaman
  • 00:27:22
    but but I think AI can very powerful for
  • 00:27:25
    news. So I would like to think just as
  • 00:27:27
    internet gave us access to so much
  • 00:27:29
    information of course which had a lot of
  • 00:27:31
    positive but some negative AI can help
  • 00:27:34
    us um understand um improve our
  • 00:27:36
    comprehension of news. Again at ground
  • 00:27:39
    news we show you for example hundreds of
  • 00:27:40
    different versions of the article. No,
  • 00:27:42
    we have people who read through all of
  • 00:27:44
    that. But if you don't have time,
  • 00:27:45
    summarizing it and giving it to you in a
  • 00:27:47
    format that where we can highlight the
  • 00:27:49
    differences or highlight where the news
  • 00:27:51
    outlets agree make your life much
  • 00:27:53
    easier. Again, we don't say, "Hey, this
  • 00:27:54
    is right or wrong." But this is the
  • 00:27:56
    summarization of what's happened or this
  • 00:27:58
    is a summarization. But you don't bring
  • 00:27:59
    the judgment to the we don't bring the
  • 00:28:00
    judgment because I think as soon as you
  • 00:28:02
    bring the judgment, you alienate
  • 00:28:03
    somebody and we don't want to do that.
  • 00:28:05
    We mentioned AI. Um,
  • 00:28:08
    is it likely we're going to get
  • 00:28:10
    responsible, I'll say, journalism formed
  • 00:28:14
    by artificial intelligence or or are we
  • 00:28:16
    going to end up with constant stream of
  • 00:28:17
    deep fakes? And something I've come to
  • 00:28:20
    understand or just learned recently,
  • 00:28:21
    synthetic headlines. Yes. I mean, it's I
  • 00:28:24
    mean, I'm used to the bias of the the
  • 00:28:28
    news outlet being in the headline and
  • 00:28:29
    therefore there's no need to read the
  • 00:28:31
    article because they want you just to
  • 00:28:32
    read the headline and then and that's
  • 00:28:34
    how most people read, by the way.
  • 00:28:36
    I mean, we all we all think we're time
  • 00:28:38
    poor and therefore I've only got the
  • 00:28:40
    time to read the headline. But does AI
  • 00:28:42
    have the capacity to really stop and if
  • 00:28:46
    it does, will it ever be utilized that
  • 00:28:48
    way? I I think like any any any
  • 00:28:51
    groundbreaking technology, AI has the
  • 00:28:53
    possibility to do both, which is help
  • 00:28:55
    the news and hurt the news, which is
  • 00:28:57
    doing as well. um help the news by yeah
  • 00:29:00
    doing things like identifying deep fakes
  • 00:29:02
    by by giving tools to journalists to be
  • 00:29:04
    able to to be able to produce um quality
  • 00:29:08
    content or take out the bias even
  • 00:29:09
    highlight the bias. You're asking AI to
  • 00:29:11
    turn itself in
  • 00:29:13
    finding deep fakes? AI made the damn fake.
  • 00:29:16
    Okay. So and then one day AI is going to
  • 00:29:18
    say I'm not going to do what the humans
  • 00:29:20
    tell me.
  • 00:29:22
    These are our people. Our deep fakes are
  • 00:29:24
    our people. You're looking at the AI,
  • 00:29:26
    but I'd look further back in the history
  • 00:29:29
    and say it's the design of the
  • 00:29:30
    algorithm. Yeah. If you want to design
  • 00:29:32
    it to do those things, then you will. If
  • 00:29:36
    you don't, then it goes in a different
  • 00:29:38
    direction. Where are we with the biases
  • 00:29:39
    on algorithms that people thought were
  • 00:29:41
    not biased? Yeah, that's a like facial
  • 00:29:43
    recognition software. That is the most
  • 00:29:45
    famous example. Yeah. So I I think they
  • 00:29:47
    have been now there are companies
  • 00:29:49
    actively working on on ways to correct
  • 00:29:52
    that and again create data sets to be
  • 00:29:55
    able to reset that to remove bias and
  • 00:29:57
    same for for news as well. So there are
  • 00:29:59
    data sets that exist and and for example
  • 00:30:01
    we work very very hard to identify when
  • 00:30:04
    there is bias language and to be able to
  • 00:30:05
    say hey this and and some simply
  • 00:30:08
    sometimes just highlighting it and say
  • 00:30:09
    hey this is where the bias is and and uh
  • 00:30:12
    help people and the bias is not so much
  • 00:30:14
    in the nouns as it is in the adjectives
  • 00:30:16
    it's in the adjectives like uh yeah uh I
  • 00:30:19
    remember it's it's funny one adjective I
  • 00:30:22
    comes to mind so last time President
  • 00:30:25
    Trump he he uh he there was a parade
  • 00:30:28
    Right. And then uh every headline on the
  • 00:30:31
    left kept using the word soggy and every
  • 00:30:33
    headline on the right kept using grand.
  • 00:30:36
    And I was like, how did they agree on
  • 00:30:38
    which adjective to use? And it's like
  • 00:30:41
    soggy parade. And yeah, that's an in
  • 00:30:44
    between major corporations
  • 00:30:46
    that dominate the news outlet universe
  • 00:30:51
    and the wild west of social media and
  • 00:30:54
    unregulated uh influencers. Are we
  • 00:30:57
    kidding ourselves to think that we're
  • 00:30:59
    going to get responsible journalism
  • 00:31:00
    coming forward? No. You just said her
  • 00:31:03
    whole job is pointless. Our our job is
  • 00:31:05
    to help me make sense. I think that's uh
  • 00:31:08
    that's what we are doing. Uh no, I don't
  • 00:31:10
    think we are kidding ourselves. And
  • 00:31:11
    yeah, my job is not pointless. We are
  • 00:31:13
    trying to uh I think there's amazing
  • 00:31:15
    journalism coming out. There are
  • 00:31:17
    journalists out there who are working on
  • 00:31:19
    exposes that take years. There is some
  • 00:31:22
    journalists out there in a cave and I
  • 00:31:23
    don't know wherever trying to report to
  • 00:31:25
    you. That's all that amazing work
  • 00:31:27
    happening. I think the problem is it
  • 00:31:28
    gets drowned out or or uh drowned out by
  • 00:31:32
    all the else that exists around the
  • 00:31:34
    noise that it's the noise again. It's
  • 00:31:36
    the noise again. So I think yeah we uh
  • 00:31:38
    again we our job is at least at ground
  • 00:31:40
    users not to recreate this amazing work
  • 00:31:43
    but to be able to help you dial down
  • 00:31:45
    that noise and give you tools to be able
  • 00:31:46
    to read that. So I got to land this
  • 00:31:48
    plane. So let me ask you
  • 00:31:50
    what are the metrics that you might use
  • 00:31:53
    to know if you're succeeding? Very good
  • 00:31:56
    question. Very good question. I think
  • 00:31:58
    the number one metric for me is how many
  • 00:32:00
    new sources people end up reading when
  • 00:32:03
    when they come to ground news. We see
  • 00:32:04
    that in our KPIs that people would go to
  • 00:32:07
    two or three sources that quote unquote
  • 00:32:09
    trust or came in with. But within 3
  • 00:32:12
    months we see that 3x people are going
  • 00:32:15
    to 10 different news sources because the
  • 00:32:18
    ease of it and yeah expanding that. Do
  • 00:32:21
    you have to pay a fee to those news
  • 00:32:22
    sources to channel them into your No, we
  • 00:32:25
    don't because all if you want to read
  • 00:32:27
    their articles, you're still going to
  • 00:32:28
    the to the publishers website. They're
  • 00:32:31
    not reading it on your website. No, you
  • 00:32:32
    cannot. That's where we draw the line
  • 00:32:34
    and say if you want to read that, go to
  • 00:32:35
    New York Times or go to whoever. Got But
  • 00:32:38
    yeah, we see that we actually had a
  • 00:32:40
    researcher from Duke University who um
  • 00:32:42
    who did research on ground news and
  • 00:32:44
    found out that people's opinions can
  • 00:32:46
    actually be changed if they are
  • 00:32:48
    presented with uh uh with uh with
  • 00:32:52
    counter to what their beliefs are. So we
  • 00:32:55
    really think that's got to be the the
  • 00:32:56
    way that we can bring everybody back to
  • 00:32:59
    the same page, back to common ground.
  • 00:33:00
    This is very hopeful. Yeah, I didn't
  • 00:33:02
    think this would end hopefully, but it
  • 00:33:04
    did. You pessimist.
  • 00:33:08
    Yes, I was totally skeptical. Well,
  • 00:33:11
    thank you for this insight. Where can we
  • 00:33:12
    find you online? Uh, you can go to
  • 00:33:14
    ground.news uh to our website.
  • 00:33:16
    Ground.news. News is the That's right.
  • 00:33:19
    The domain name. The domain name. That's
  • 00:33:21
    right. Ground.news. Or you can go to the
  • 00:33:23
    app store or play store and look for
  • 00:33:25
    ground news app. Oh, then put put it on
  • 00:33:27
    your on your smartphone. You can use it
  • 00:33:29
    from your smartphone. But but yeah, we
  • 00:33:31
    have a free uh version and we are
  • 00:33:33
    subscription supported. So, let me see
  • 00:33:34
    if I can knock knock knock this out with
  • 00:33:36
    a little bit of cosmic perspective if I
  • 00:33:38
    may. I
  • 00:33:40
    I've said a couple of times I'm on
  • 00:33:43
    record noting that
  • 00:33:46
    as AI gets better and better.
  • 00:33:49
    Yes, there's the good side, but the bad
  • 00:33:51
    side is it can be better and better at
  • 00:33:53
    making deep fakes. And a deep fake
  • 00:33:56
    becomes a source of what someone thinks
  • 00:33:58
    is an objective reality, what someone
  • 00:34:01
    thinks is news. And then that becomes
  • 00:34:04
    part of what people then argue over. And
  • 00:34:09
    I worry, and I think I still worry even
  • 00:34:12
    after this conversation, that
  • 00:34:15
    it could signal the end of the internet
  • 00:34:18
    when deep fakes become so good
  • 00:34:23
    and it's known that they're good that
  • 00:34:25
    all the people who used to believe the
  • 00:34:28
    fake news
  • 00:34:30
    won't believe the fake news anymore
  • 00:34:33
    because they'll
  • 00:34:35
    be sure that it was faked. Once the
  • 00:34:38
    people who believe fake news no longer
  • 00:34:40
    believe anything on the internet,
  • 00:34:42
    there's nothing left on the internet to
  • 00:34:44
    believe, not even the fake news, because
  • 00:34:46
    that was faked. And I think that would
  • 00:34:48
    signal the end of the internet as a
  • 00:34:51
    source of objective information in this
  • 00:34:53
    world. And we'd all go back to just
  • 00:34:55
    reading books and talking to people in
  • 00:34:57
    the town square and maybe reading
  • 00:35:00
    broadsheets stapled up on the bulletin
  • 00:35:03
    board. And then the internet will just
  • 00:35:06
    resort to cat videos just as it once
  • 00:35:09
    was.
  • 00:35:11
    And that's my cosmic perspective on that
  • 00:35:15
    topic. And let me thank our special
  • 00:35:17
    guest, Harleen Kaur, who's trying to fix
  • 00:35:19
    the world one reader at a time. And good
  • 00:35:23
    luck with that. I think you'll need some
  • 00:35:25
    of that as well. Thank you. All right,
  • 00:35:27
    Gary. Pleasure, Neil. Thank you. Good to
  • 00:35:29
    have you, man. Signing out from my
  • 00:35:31
    office here at the American Museum of
  • 00:35:33
    Natural History. As always, I bid you to
  • 00:35:36
    keep looking up.
  • 00:35:50
    [Music]
  • 00:35:57
    [Music]
Tags
  • news
  • bias
  • journalism
  • social media
  • Ground News
  • fairness doctrine
  • disinformation
  • AI
  • critical thinking
  • echo chambers