AI In Education: Shaping The Future Of Classrooms | Prof. Bharat N. Anand At #IndiaTodayConclave2025

00:44:20
https://www.youtube.com/watch?v=ssZZJ5ArWLo

Summary

TL;DR: The talk discusses the intersection of generative AI and education, highlighting the transformative potential of AI technologies like ChatGPT. It explores the rapid evolution of the technology and the implications for educational institutions like Harvard. Key points include the need for educators to reconsider traditional methods, embrace AI for efficiency, and ensure equal access to technology. The speaker emphasizes the importance of critical thinking, human judgment, and maintaining the role of educators in fostering social and emotional skills, rather than simply relying on AI capabilities. The discussion also touches on the risks of misinformation and the potential widening of educational disparities due to differences in access.

Key Takeaways

  • 🌟 Transformative potential of AI in education is significant.
  • 🕒 Generative AI can save time and automate mundane tasks.
  • ⚠️ Caution is needed due to hallucinations and misinformation from AI.
  • 🧠 Skills like critical thinking and empathy are essential for future generations.
  • 👩‍🏫 Role of teachers will shift towards fostering creative and social skills.
  • 📚 Lifelong learning and curiosity should be prioritized in education.
  • 💻 Access to technology is crucial for leveling the educational playing field.
  • 🚀 AI can streamline administrative duties in educational institutions.
  • 💡 Students today are using AI for deeper learning, not just surface-level tasks.
  • 📈 Online learning and AI tools have transformed educational accessibility.

Timeline

  • 00:00:00 - 00:05:00

    The speaker introduces the topic of generative AI and its intersection with education, referencing their role at Harvard and polling the audience about the anticipated impact of AI on educational institutions. A chart is presented showing how long various technologies took to reach widespread adoption in the US, with chatbots reaching that scale far faster than computers, radio, TV, or smartphones.

  • 00:05:00 - 00:10:00

    The speaker argues that the transformative potential of generative AI stems more from improved accessibility than from the intelligence of its output. They trace the evolution of human and human-computer communication and note that AI has only recently become accessible through user-friendly interfaces, so that everyone can now engage with the technology.

  • 00:10:00 - 00:15:00

    Moving forward, the speaker questions the ongoing reliance on advanced software expertise, suggesting that generative AI could democratize creative and design work and enable easier communication in professional fields, exemplified by the simplification of medical record-keeping through AI.

  • 00:15:00 - 00:20:00

    The speaker presents a two-dimensional framework for assessing the utility of AI in organizational tasks, emphasizing both the cost of errors and the type of data fed into AI systems, and discusses potential applications in education, such as answering queries and drafting educational materials (a minimal code sketch of this framework follows the timeline).

  • 00:20:00 - 00:25:00

    The speaker calls on organizations to consider the benefits of generative AI even where its intelligence falls short, drawing a parallel with low-cost carriers like Ryanair: value can come from cost and time savings rather than a better product, despite existing prediction errors.

  • 00:25:00 - 00:30:00

    In education, large language models can automate mundane tasks and improve efficiency. The speaker also notes that human involvement will shrink as automation spreads, prompting a reflection on organizational structures.

  • 00:30:00 - 00:35:00

    The speaker highlights concerns that generative AI may widen the knowledge divide, since those with existing expertise can leverage AI more effectively, which could increase inequality in educational outcomes. They emphasize the need to prepare students for the digital divide emerging around generative AI.

  • 00:35:00 - 00:44:20

    Finally, the speaker encourages thinking critically about the role of education and emphasizes the importance of creativity and empathy as foundational skills for students, stating that a strategic conversation about educational content and delivery methods is necessary in the face of evolving technologies.
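
To make the two-by-two framework from the 00:15:00 segment concrete, here is a minimal Python sketch of how an organization might triage tasks along the two dimensions discussed in the talk (type of input data, cost of an erroneous output). The task names, labels, and recommendations are illustrative assumptions, not taken from the talk.

from dataclasses import dataclass

@dataclass
class Task:
    name: str
    data: str        # "explicit" (files, numbers) or "tacit" (judgment, creativity)
    error_cost: str  # "low" or "high" cost of a wrong output

def recommend(task: Task) -> str:
    """Map a task onto the talk's two-by-two: data type x cost of error."""
    if task.error_cost == "low":
        # Low-stakes output: worth adopting now, even with imperfect predictions.
        return "automate now" if task.data == "explicit" else "use AI drafts freely"
    # High-stakes output: keep a human in the loop, or stay away for now.
    return "human in the loop" if task.data == "explicit" else "do not automate yet"

# Hypothetical examples echoing the talk's illustrations.
tasks = [
    Task("admissions FAQ chatbot", "explicit", "low"),
    Task("food-contractor legal agreements", "explicit", "high"),
    Task("social media content drafts", "tacit", "low"),
    Task("faculty hiring / disciplinary decisions", "tacit", "high"),
]

for t in tasks:
    print(f"{t.name:45} -> {recommend(t)}")

The point of the sketch is the triage logic, not the specific labels: the cheaper an error is, the less today's prediction accuracy matters.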


Video Q&A

  • What is generative AI?

    Generative AI refers to AI models, such as ChatGPT, that can generate text, images, or other content.

  • How will generative AI affect education in the future?

    It will likely transform educational practices by facilitating access to knowledge and automating mundane tasks.

  • What are some risks associated with generative AI in education?

    Risks include lazy learning, misinformation, and potential reliance on AI for critical tasks.

  • Why should educators be cautious about AI-generated content?

    AI can produce hallucinations or incorrect information, necessitating human review.

  • What skills are important for children to learn in the age of AI?

    Skills like creativity, empathy, and critical thinking are essential as they are less likely to be automated.

  • How can AI be used effectively in classrooms?

    AI can automate administrative tasks, assist in responding to common queries, and support personalized learning (a minimal sketch follows this Q&A list).

  • What should be the focus of future education?

    Teaching children how to think critically and fostering lifelong learning and curiosity.

  • How does generative AI compare to traditional teaching methods?

    AI can provide faster information retrieval but may lessen engagement and deeper understanding if relied on excessively.

  • What roles might teachers play in an AI-focused classroom?

    Teachers could focus on fostering critical thinking, creativity, and socio-emotional skills, transitioning away from rote learning.

  • What challenges exist for low-income students regarding AI access?

    There's a concern that disparities in technological access may widen the gap in educational opportunities.
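
As an illustration of "automating the mundane with a human in the loop" described above, here is a minimal Python sketch of a course FAQ assistant that answers only routine queries from a small knowledge base and escalates anything high-stakes or unrecognized to a person. The FAQ entries and escalation keywords are hypothetical examples, not from the talk.

# Minimal sketch: answer routine course queries automatically, escalate the rest.
FAQ = {
    "office hours": "Office hours are listed on the course page.",
    "syllabus": "The syllabus is posted on the course site under Materials.",
    "exam": "The exam schedule is in the course calendar.",
}

HIGH_STAKES = ("grade appeal", "accommodation", "plagiarism", "extension")

def answer(query: str) -> str:
    q = query.lower()
    # High-stakes topics always go to a human: the cost of an error is too high.
    if any(topic in q for topic in HIGH_STAKES):
        return "Escalated to the instructor."
    # Routine questions are cheap to answer and cheap to get slightly wrong.
    for keyword, reply in FAQ.items():
        if keyword in q:
            return reply
    return "No confident answer; forwarded to a teaching assistant."

for msg in ("When are your office hours?",
            "Where is the syllabus posted?",
            "I want to file a grade appeal."):
    print(msg, "->", answer(msg))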

Captions (en)
  • 00:00:00
    I need some more energy good
  • 00:00:01
    morning it's a pleasure to be here with
  • 00:00:04
    all of you uh today to talk about gen
  • 00:00:07
    and
  • 00:00:08
    education um for those who don't know
  • 00:00:11
    what gen AI is imagine a
  • 00:00:15
    person who's often wrong but never in
  • 00:00:20
    doubt now be honest with me how many of
  • 00:00:22
    you thought about your
  • 00:00:24
    spouse I did not okay but that's
  • 00:00:28
    gen AI um and what I want to talk about
  • 00:00:31
    is um what happens when we have large
  • 00:00:34
    language models like ChatGPT and generative
  • 00:00:37
    AI intersect with institutions like
  • 00:00:39
    Harvard where I sit and I've been there
  • 00:00:41
    for the last 27 years currently
  • 00:00:43
    overseeing teaching and learning for the
  • 00:00:46
    University let me just um ask you a
  • 00:00:49
    question how many of you think in the
  • 00:00:51
    next 5 to 10 years generative AI will
  • 00:00:54
    have a very large impact on education
  • 00:00:57
    just raise your
  • 00:00:58
    hands how many would say a moderate
  • 00:01:01
    impact so we have a few how many would
  • 00:01:04
    say little to no
  • 00:01:07
    impact pretty much none okay uh let me
  • 00:01:11
    come back to this here's a chart showing
  • 00:01:14
    the rise of Technologies and the time it
  • 00:01:17
    took for different Technologies to reach
  • 00:01:19
    50% penetration in the US economy so if
  • 00:01:23
    you look at computers it actually took
  • 00:01:26
    20 years to reach about 30% penetration
  • 00:01:31
    radio it took about 20 years to reach
  • 00:01:33
    half the
  • 00:01:35
    population uh TV about 12 years
  • 00:01:38
    smartphones about 7 years smart speakers
  • 00:01:41
    about uh 4
  • 00:01:43
    years and chatbots about 2 and a half
  • 00:01:46
    years this is part of the reason we're
  • 00:01:48
    talking about this
  • 00:01:50
    today here's what we know so far about
  • 00:01:53
    gen AI and
  • 00:01:55
    Education First the transformative
  • 00:01:58
    potential stems from its intelligence
  • 00:02:01
    that's the eye in AI okay
  • 00:02:06
    secondly as prudent Educators we should
  • 00:02:10
    wait until the
  • 00:02:12
    output is smart enough and gets better
  • 00:02:15
    and it's less prone to hallucinations or
  • 00:02:17
    wrong
  • 00:02:19
    answers third given the state of where
  • 00:02:22
    bot tutors are it's unlikely I think
  • 00:02:25
    many believe that it's going to be
  • 00:02:27
    ultimately as good as the best active
  • 00:02:30
    learning teachers who have refined their
  • 00:02:31
    craft over many many years and
  • 00:02:34
    decades fourth and Sal Khan talks about
  • 00:02:37
    this this is likely to ultimately level
  • 00:02:39
    the playing field in
  • 00:02:40
    education and finally the best thing we
  • 00:02:43
    can do is to make sure that we secure
  • 00:02:46
    access to everyone and let them
  • 00:02:48
    experiment before you take a screenshot
  • 00:02:50
    of
  • 00:02:51
    this
  • 00:02:53
    don't because I'm going to argue all of
  • 00:02:55
    this is
  • 00:02:58
    wrong now that I hopefully have have
  • 00:03:00
    your attention I'm going to spend the
  • 00:03:01
    next 10 minutes arguing
  • 00:03:03
    why uh let's actually start with the
  • 00:03:05
    first one which is the transformative
  • 00:03:07
    potential stems from how intelligent
  • 00:03:09
    the output
  • 00:03:10
    is I would argue and in fact we just
  • 00:03:13
    heard this from the previous speaker
  • 00:03:14
    we've been actually
  • 00:03:16
    experiencing AI for 70 years machine
  • 00:03:19
    learning for upwards of 50 years deep
  • 00:03:21
    learning for 30 years Transformers for 7
  • 00:03:23
    to 8 years this has been an improvement
  • 00:03:26
    gradually over time there were some
  • 00:03:28
    discrete changes recently but the
  • 00:03:30
    fundamental reason why this is taken off
  • 00:03:32
    I would argue has less to do with the
  • 00:03:35
    discrete improvements in intelligence 2
  • 00:03:36
    years ago as opposed to the Improvement
  • 00:03:39
    in Access or the interface that we have
  • 00:03:42
    with the intelligence what do I mean by
  • 00:03:44
    that I'm going to give you the one
  • 00:03:45
    minute history of human
  • 00:03:47
    communication so we started out sitting
  • 00:03:49
    around campfires talking to each other
  • 00:03:52
    from there we started writing pictures
  • 00:03:55
    on the walls that was Graphics from
  • 00:03:58
    there we started writing scrolls and
  • 00:04:00
    books that was formal text and finally
  • 00:04:03
    the Pinnacle of human human
  • 00:04:04
    communication which was ones and zeros
  • 00:04:07
    and that's
  • 00:04:08
    mathematics that's the evolution of
  • 00:04:10
    human to human communication the
  • 00:04:12
    evolution of human to computer
  • 00:04:13
    communication has gone exactly in the
  • 00:04:15
    opposite direction which is 60 70 years
  • 00:04:18
    ago starting with Punch Cards ones and
  • 00:04:20
    zeros for those of you old enough might
  • 00:04:22
    remember that then we move to things
  • 00:04:24
    like dos prompts commands that we had to
  • 00:04:27
    input by the way and this is the
  • 00:04:29
    fundamental thing the big difference
  • 00:04:31
    between Windows 1.0 and windows
  • 00:04:34
    3.0 functionally they were almost
  • 00:04:36
    identical the big difference was the
  • 00:04:38
    interface meaning we moved to a
  • 00:04:40
    graphical user interface and suddenly
  • 00:04:42
    7-year-old kids could be using computers
  • 00:04:45
    that I think is more similar to the
  • 00:04:46
    revolution we're seeing now which is AI
  • 00:04:49
    for a long time was the province of
  • 00:04:52
    computer programmers software Engineers
  • 00:04:54
    Tech experts with chat GPT it basically
  • 00:04:57
    became available to every one of us in
  • 00:04:59
    the planet through a simple search bar
  • 00:05:01
    that's basically the reason for the
  • 00:05:02
    revolution where is this going probably
  • 00:05:05
    towards just audio and I don't know if
  • 00:05:08
    anyone can guess what's the next
  • 00:05:09
    evolution of this in terms of
  • 00:05:14
    communication neural reading
  • 00:05:16
    emotions you might argue basically us
  • 00:05:19
    grunting and shaking our arms formally
  • 00:05:22
    that would be called the Apple Vision
  • 00:05:24
    Pro uh you could argue we are regressing
  • 00:05:28
    as a species on the other hand you could
  • 00:05:30
    argue that in fact what's happening is
  • 00:05:33
    that the distance between humans and
  • 00:05:35
    computers is fundamentally
  • 00:05:36
    shrinking so that's the first thing I
  • 00:05:39
    just want to say which is fundamentally
  • 00:05:40
    this is about access what does this mean
  • 00:05:44
    it means that does anyone know what this
  • 00:05:48
    is this is
  • 00:05:50
    Photoshop there's a lot of people who
  • 00:05:52
    spend one year 2 years four years trying
  • 00:05:54
    to master this Graphics
  • 00:05:57
    design arguably we don't need this kind
  • 00:06:00
    of expertise anymore we can simply get
  • 00:06:02
    it by communicating directly in natural
  • 00:06:04
    language with computers now this for
  • 00:06:07
    those of you who don't know is epic it's
  • 00:06:09
    a medical software record my wife who's
  • 00:06:11
    a cardiologist does not like this she
  • 00:06:14
    spends 2 hours every single day filling
  • 00:06:16
    in notes on these software records You
  • 00:06:19
    could argue sometime in the near future
  • 00:06:21
    that communication will become much
  • 00:06:23
    simpler by the way one of the things to
  • 00:06:25
    keep in mind is for every one of you
  • 00:06:28
    sitting in organizations and by
  • 00:06:30
    the way this is a happy organization to
  • 00:06:33
    think about what this is likely to do to
  • 00:06:35
    the org
  • 00:06:36
    structure if you think about the bottom
  • 00:06:38
    of this organization there's people who
  • 00:06:40
    have expertise in different kinds of
  • 00:06:42
    software okay some expertise in
  • 00:06:45
    Photoshop some in
  • 00:06:47
    concur uh some in different kinds of
  • 00:06:50
    software You could argue there's going
  • 00:06:52
    to be consolidation within those
  • 00:06:53
    functions the middle managers who used
  • 00:06:56
    to oversee all these software experts
  • 00:06:58
    it's likely we're going to see shrinkage
  • 00:07:00
    there in fact you could argue all the
  • 00:07:02
    way that the person at the top could in
  • 00:07:05
    fact do sales Graphics design design
  • 00:07:07
    marketing everything by just interacting
  • 00:07:09
    directly with the computer it's not a
  • 00:07:12
    stretch to say and some people predict
  • 00:07:13
    this that the first 1 person billion
  • 00:07:16
    dollar company is going to be likely to
  • 00:07:18
    be born pretty soon okay and people are
  • 00:07:20
    already working on this I would urge you
  • 00:07:22
    to think about this question which is
  • 00:07:24
    what does this mean for your expertise
  • 00:07:26
    and organizations or the organizations
  • 00:07:28
    you run
  • 00:07:30
    because that's going to have big
  • 00:07:31
    implications for how you run these
  • 00:07:33
    organizations all right so that's the
  • 00:07:35
    first point which is fundamentally this
  • 00:07:36
    is not about intelligence about how it's
  • 00:07:39
    accessed the implication of this is more
  • 00:07:41
    people will be able to use more
  • 00:07:43
    computers for specialized purposes but
  • 00:07:45
    it doesn't necessarily mean it's likely
  • 00:07:47
    to be the same people okay that's the
  • 00:07:49
    first
  • 00:07:50
    thing
  • 00:07:52
    second I think we all look at these
  • 00:07:54
    hallucinations and we say let's wait
  • 00:07:57
    let's wait till it gets better
  • 00:08:00
    by the way that begs the question that
  • 00:08:01
    hallucinations are a fundamental
  • 00:08:03
    intrinsic property of generative AI
  • 00:08:05
    because they are probabilistic models
  • 00:08:07
    but I would go further and say even when
  • 00:08:10
    AI capabilities fall far short and
  • 00:08:13
    impair the human value proposition
  • 00:08:15
    there's still a reason to adopt it why
  • 00:08:18
    do I say that I'm a strategist as
  • 00:08:21
    strategist we think of two sides of the
  • 00:08:23
    equation one is the benefit side what
  • 00:08:26
    are customers willing to pay the other
  • 00:08:28
    is the cost or the time
  • 00:08:30
    side even if there's no improvement in
  • 00:08:33
    intelligence simply because of cost and
  • 00:08:36
    Time Savings there might be massive
  • 00:08:38
    benefits to trying to adopt this so the
  • 00:08:41
    metaphor I want you to think
  • 00:08:42
    about is the following
  • 00:08:47
    company has anyone flown Ryan
  • 00:08:50
    Air uh what is the experience like
  • 00:08:54
    isan basic efficient basic efficient by
  • 00:08:57
    the way when I ask my students this they
  • 00:08:59
    often say I hate it every single time I
  • 00:09:01
    fly it and of course it begs the
  • 00:09:04
    question why are you repeatedly flying
  • 00:09:06
    it this is an airline like most lowcost
  • 00:09:09
    Airlines it doesn't offer any food on
  • 00:09:12
    board no seat selection you've got to
  • 00:09:14
    walk to the tarmac you got to pay extra
  • 00:09:16
    for bags no frequent flyers no lounges
  • 00:09:19
    and this is the most profitable airline
  • 00:09:20
    in Europe for the last 30 years
  • 00:09:22
    running why it's not providing a better
  • 00:09:26
    product it's saving cost
  • 00:09:30
    that's the metaphor I would love for you
  • 00:09:31
    to keep in mind when you think about
  • 00:09:33
    generative Ai and its potential so let
  • 00:09:35
    me just walk through this and sorry as a
  • 00:09:37
    strategist I have to put up a 2 x two
  • 00:09:39
    Matrix at some point there's two
  • 00:09:42
    Dimensions here I'd love for you to
  • 00:09:43
    think about the first is what is the
  • 00:09:45
    data that we're inputting into these
  • 00:09:47
    large language models and the data could
  • 00:09:50
    be explicit in the form of files like
  • 00:09:52
    text files numbers Etc that's explicit
  • 00:09:56
    data or it could be tacit knowledge
  • 00:09:59
    meaning
  • 00:10:00
    creative judgment etc etc okay but the
  • 00:10:04
    second dimension is as important which
  • 00:10:06
    is what's the cost of making an error
  • 00:10:10
    from the output not the prediction error
  • 00:10:13
    what's the cost of something going wrong
  • 00:10:16
    in some cases it could be low in some
  • 00:10:18
    cases it could be high so let's actually
  • 00:10:20
    talk through some
  • 00:10:21
    examples first is explicit data low cost
  • 00:10:25
    of Errors that's high volume customer
  • 00:10:27
    support for the last 30 years this thing
  • 00:10:29
    is being automated by the way that
  • 00:10:31
    trajectory is likely to continue why do
  • 00:10:34
    I say that it is virtually impossible
  • 00:10:36
    for any company to have people manning
  • 00:10:39
    the phones to talk to 100,000 customers
  • 00:10:42
    this is the direction where it's going
  • 00:10:44
    even if we have two or three or 4%
  • 00:10:46
    errors it's okay it's simply much more
  • 00:10:49
    efficient to respond to customers in
  • 00:10:51
    this way okay so that's one
  • 00:10:53
    dimension second dimension is drafting
  • 00:10:56
    legal agreements for all the lawyers in
  • 00:10:59
    the room just watch out it's going to be
  • 00:11:01
    much much easier it already is to draft
  • 00:11:03
    legal agreements but we can't rely on
  • 00:11:07
    generative AI to Simply give us this
  • 00:11:09
    thing without checking it some of you
  • 00:11:11
    may have heard of that lawyer who did
  • 00:11:13
    that a couple years ago basically didn't
  • 00:11:16
    review the agreements there were some
  • 00:11:17
    errors he got fired so we might have
  • 00:11:20
    human in the loop you don't want to
  • 00:11:22
    basically take the output at face value
  • 00:11:25
    okay because the cost of making an error
  • 00:11:26
    is simply too
  • 00:11:27
    high third on the top right is creative
  • 00:11:32
    skills design marketing copywriting
  • 00:11:36
    these are things where it's hard to
  • 00:11:37
    evaluate what's truly better or worse
  • 00:11:41
    and so in some sense the design outputs
  • 00:11:43
    we get the social media content we get
  • 00:11:46
    as suggestions from generative AI pretty
  • 00:11:48
    good the cost of making an error there
  • 00:11:50
    not that high and finally we get to the
  • 00:11:53
    top right where we want to be very very
  • 00:11:56
    careful because this is like large
  • 00:11:59
    enterprise software integration you
  • 00:12:01
    don't want to go there pretty soon okay
  • 00:12:03
    or designing an aircraft now what does
  • 00:12:06
    it mean for Education let's actually
  • 00:12:08
    play this out I'm going to use our
  • 00:12:10
    example as an illustration if I'm
  • 00:12:13
    sitting at
  • 00:12:15
    Harvard basically we get when we open up
  • 00:12:17
    the
  • 00:12:18
    website about 10,000 applications in the
  • 00:12:21
    first couple months for admission maybe
  • 00:12:24
    30,000 people who look at the website by
  • 00:12:26
    the way they have questions it's
  • 00:12:28
    impossible
  • 00:12:29
    to speak personally and individually to
  • 00:12:31
    everyone who has a question this is
  • 00:12:34
    beautiful for chatbots to be able
  • 00:12:37
    to Simply respond again if there's an
  • 00:12:40
    error in the response it's okay I mean
  • 00:12:43
    these are people who are simply thinking
  • 00:12:45
    about applying and they might find
  • 00:12:47
    information in other ways secondly legal
  • 00:12:50
    contracts with food contractors we want
  • 00:12:52
    to be careful about human in the loop
  • 00:12:54
    thirdly designing social media content
  • 00:12:56
    when we go to the top left this is
  • 00:12:58
    something we can do far more efficiently
  • 00:13:00
    today with generative Ai and finally I
  • 00:13:03
    can assure you we're not going to be
  • 00:13:04
    using this anytime soon for hiring
  • 00:13:06
    faculty or disciplinary actions against
  • 00:13:09
    students by the way think about this not
  • 00:13:11
    just for your organization think about
  • 00:13:13
    it for you individually so if I was to
  • 00:13:15
    do that responding to emails I get a lot
  • 00:13:19
    of emails every
  • 00:13:21
    day most of these emails are things that
  • 00:13:24
    are very standard Professor when are
  • 00:13:26
    your office hours where's the syllabus
  • 00:13:28
    posted
  • 00:13:29
    by the way even in other cases where
  • 00:13:31
    students ask questions like Professor I
  • 00:13:34
    have two offers one from McKinsey one
  • 00:13:36
    from Boston Consulting
  • 00:13:38
    Group the cost of an error is not that
  • 00:13:41
    high in my response you'll be okay or
  • 00:13:44
    I'm trying to decide whether to go to
  • 00:13:46
    Microsoft or Amazon you'll be okay okay
  • 00:13:48
    I'm just kidding by the way I can assure
  • 00:13:50
    you I respond to all those emails
  • 00:13:51
    individually but you get the
  • 00:13:54
    point writing a case study it takes us 9
  • 00:13:57
    months to write these famous Harvard
  • 00:13:59
    business school case studies the head of
  • 00:14:01
    the MBA program last year said I want to
  • 00:14:03
    teach a case on Silicon Valley Bank
  • 00:14:06
    tomorrow what he did was go to ChatGPT
  • 00:14:09
    said write a case like Harvard Business
  • 00:14:11
    School with these five sections
  • 00:14:14
    financial information competitor
  • 00:14:15
    information regulatory information it
  • 00:14:17
    spits it out he then said please tweak
  • 00:14:21
    the information give me this data on the
  • 00:14:23
    financials talk about these competitors
  • 00:14:25
    he iterated it kept spitting out
  • 00:14:27
    information from beginning to end he
  • 00:14:29
    had a case study complete in 71
  • 00:14:33
    minutes um if you're not scared by the
  • 00:14:36
    way we are about what the potential here
  • 00:14:39
    is brainstorming a slide for teaching
  • 00:14:42
    there's a couple slides in this talk
  • 00:14:44
    where I took some pictures and I started
  • 00:14:45
    trying to resize it PowerPoint designer
  • 00:14:48
    simply threw up some suggestions saying
  • 00:14:50
    here's how you might want to do it in 1
  • 00:14:52
    second it didn't take me 10 15 minutes
  • 00:14:55
    to try and redesign these slides a
  • 00:14:57
    beautiful application for using this and
  • 00:14:59
    finally thinking about exactly how I
  • 00:15:01
    teach in the classroom or my research
  • 00:15:03
    Direction I'm not going there anywhere
  • 00:15:05
    soon I'd love you to think about a
  • 00:15:08
    couple things from this simple framework
  • 00:15:11
    number one we are obsessed with talking
  • 00:15:14
    about prediction errors from large
  • 00:15:16
    language models I think the more
  • 00:15:18
    relevant question is the cost of making
  • 00:15:20
    these errors meaning in some cases the
  • 00:15:24
    prediction error might be 30% but if the
  • 00:15:27
    cost of error is zero it's okay to adopt
  • 00:15:29
    it in other cases prediction errors
  • 00:15:33
    might be only 1% but the cost of failure
  • 00:15:36
    is very high you want to stay away so
  • 00:15:39
    stop thinking about prediction errors
  • 00:15:41
    let's start thinking about the cost of
  • 00:15:42
    Errors for
  • 00:15:43
    organizations secondly if you notice
  • 00:15:45
    what I've done I've broken down the
  • 00:15:47
    analysis from thinking about Industries
  • 00:15:50
    what's the impact of AI on banking or
  • 00:15:52
    education or retail into jobs and in
  • 00:15:56
    fact gone a step further and broken it
  • 00:15:58
    down into into tasks so don't ask the
  • 00:16:01
    question of what is AI going to do to me
  • 00:16:04
    ask the question which are the tasks
  • 00:16:06
    that I can actually automate and which
  • 00:16:08
    are the tasks I don't want to touch and
  • 00:16:10
    the third is I don't know about you in
  • 00:16:12
    my LinkedIn feed every single day I get
  • 00:16:15
    new information about the latest AI
  • 00:16:17
    models and where the intelligence
  • 00:16:20
    trajectory is going getting better and
  • 00:16:22
    better that's basically about the top
  • 00:16:24
    right
  • 00:16:25
    cell I would say that's a red herring
  • 00:16:27
    for most organizations
  • 00:16:29
    because basically there's three other
  • 00:16:31
    cells where you can adopt it right now
  • 00:16:33
    and today with human in the loop okay so
  • 00:16:36
    that's just something I'd love you to
  • 00:16:38
    think about by the way we did this with
  • 00:16:40
    Harvard
  • 00:16:41
    faculty where we interviewed 35 Harvard
  • 00:16:45
    faculty who were using gen AI deeply in
  • 00:16:47
    their
  • 00:16:48
    classrooms those videos are up on the
  • 00:16:50
    web if you just type in Google
  • 00:16:52
    generative AI faculty voices Harvard you
  • 00:16:54
    see all these videos here are some
  • 00:16:56
    examples of what they were doing a
  • 00:16:59
    faculty co-pilot chatbot it's almost
  • 00:17:01
    like a teaching assistant that simulates
  • 00:17:04
    the faculty that answers simple
  • 00:17:06
    questions and is available to you
  • 00:17:09
    24/7 secondly one of the things that we
  • 00:17:13
    as faculty spend a lot of time thinking
  • 00:17:15
    about is designing the tests and the
  • 00:17:17
    quizzes and the assessments every year
  • 00:17:20
    and we've got to make it fresh because
  • 00:17:22
    we know our students probably have
  • 00:17:24
    access to last year's
  • 00:17:26
    quizzes large language models are
  • 00:17:28
    basically spitting this out in a couple
  • 00:17:30
    minutes and of course as individuals we
  • 00:17:32
    would refine it we're not going to just
  • 00:17:34
    take it at face value we refine it we
  • 00:17:36
    look at it but it's saving a lot of time
  • 00:17:39
    third when we're giving
  • 00:17:41
    lectures students often have questions
  • 00:17:44
    which they're too scared to ask live in
  • 00:17:46
    front of 300 students oh it's beautiful
  • 00:17:49
    if they can simply type in the questions
  • 00:17:51
    have gen AI summarize the questions and
  • 00:17:53
    put it up on a board The Faculty know
  • 00:17:55
    exactly what the sentiment is in the
  • 00:17:57
    classroom and where students are getting
  • 00:17:59
    confused by the way notice one thing
  • 00:18:01
    about all these
  • 00:18:03
    examples every single one of them is
  • 00:18:06
    about automating the mundane it's not
  • 00:18:09
    about saying let's rely on the
  • 00:18:11
    intelligence that's getting better and
  • 00:18:13
    better it's the left column of that
  • 00:18:15
    framework I was talking about so these
  • 00:18:18
    are ways that it's being used nowadays
  • 00:18:20
    in our
  • 00:18:21
    classrooms the third thing this premise
  • 00:18:25
    that bot tutors are unlikely to be as
  • 00:18:27
    good as the best instructors
  • 00:18:29
    we had a few colleagues at Harvard who
  • 00:18:32
    tested this for a course called physical
  • 00:18:34
    sciences 2 this is one of the most
  • 00:18:36
    popular courses and by the way the
  • 00:18:38
    instructors are very good in that course
  • 00:18:40
    they've been refining Active Learning
  • 00:18:41
    teaching methods for many years what
  • 00:18:44
    they did as an experiment was say for
  • 00:18:46
    half the students every week we'll give
  • 00:18:49
    them access to the human tutors for the
  • 00:18:51
    other half give them access to an AI bot
  • 00:18:54
    and by the way the nice thing about the
  • 00:18:56
    experiment is they flip that every
  • 00:18:57
    single week so some people always had
  • 00:19:00
    access to the humans some people had
  • 00:19:01
    access to the AI for that week but then
  • 00:19:03
    they'd flipped the next week every
  • 00:19:06
    single week they tested your Mastery of
  • 00:19:09
    the content during that
  • 00:19:10
    week and what was interesting
  • 00:19:13
    was the scores of the students using the
  • 00:19:17
    AI Bots were higher than with the human
  • 00:19:20
    tutors and these are tutors who've been
  • 00:19:22
    refining their craft year in and year
  • 00:19:25
    out what was even more surprising is
  • 00:19:27
    engagement was higher
  • 00:19:29
    by the way this is a first experiment
  • 00:19:32
    the only point is we better take this
  • 00:19:35
    seriously uh next will it level the
  • 00:19:38
    playing field in education part of the
  • 00:19:40
    premise is because everyone has access any
  • 00:19:44
    individual in a village a low-income
  • 00:19:47
    area is basically going to have access
  • 00:19:49
    to the same technology as those who are
  • 00:19:51
    in Elite universities and this is going
  • 00:19:53
    to level
  • 00:19:55
    everything there's a possibility it
  • 00:19:58
    might go exactly the other way which is
  • 00:20:01
    the benefits might accrue
  • 00:20:02
    disproportionately to those who already
  • 00:20:04
    have domain expertise why do I say this
  • 00:20:08
    think about a simple example when you
  • 00:20:10
    have knowledge of a subject and you
  • 00:20:12
    start using generative AI or ChatGPT the
  • 00:20:16
    way you interact with it asking it
  • 00:20:18
    prompts follow on prompts you're
  • 00:20:21
    basically using your judgment to filter
  • 00:20:23
    out what's useful and what's not useful
  • 00:20:25
    if I didn't know anything about the
  • 00:20:27
    subject I basically don't know what I
  • 00:20:29
    don't know so in some sense the prompts
  • 00:20:31
    are garbage in garbage out by the way
  • 00:20:34
    this is being shown in different studies
  • 00:20:36
    there was a meta-analysis summarized by
  • 00:20:38
    The Economist a couple of weeks ago
  • 00:20:41
    where they basically talk about
  • 00:20:42
    different kinds of studies that are
  • 00:20:43
    showing for certain domains and
  • 00:20:46
    expertise the gap between high
  • 00:20:50
    performance High knowledge workers and
  • 00:20:51
    no knowledge workers is actually
  • 00:20:53
    increasing we better take this seriously
  • 00:20:56
    why and this is not the first time this
  • 00:20:57
    has happened
  • 00:20:59
    12 years ago there was a big revolution
  • 00:21:01
    in online education Harvard and MIT got
  • 00:21:04
    together created a platform called edX
  • 00:21:07
    where we offered free online courses to
  • 00:21:10
    anyone in the world by the way they
  • 00:21:12
    still exist if you want to take a course
  • 00:21:14
    from Harvard for
  • 00:21:15
    free pay $100 for a certificate you can
  • 00:21:19
    get it on virtually every subject what
  • 00:21:21
    happened as a result edX reached 35
  • 00:21:24
    million learners as did Coursera and
  • 00:21:26
    Udacity and other platforms
  • 00:21:29
    what was beautiful is roughly 3,000 free
  • 00:21:32
    courses the challenge was completion
  • 00:21:36
    rates less than 5% Why by the way if
  • 00:21:39
    you're used to a boring lecture in the
  • 00:21:40
    classroom the boring lecture online is
  • 00:21:43
    10 times worse so there's virtually no
  • 00:21:45
    engagement people take a long time to
  • 00:21:47
    complete or may not complete but here's
  • 00:21:49
    what's
  • 00:21:50
    interesting the vast majority 75% of
  • 00:21:53
    those who actually completed these
  • 00:21:55
    courses already had college degrees
  • 00:21:59
    meaning the educated rich were getting
  • 00:22:01
    richer now think about that that's very
  • 00:22:04
    sobering why is that because those are
  • 00:22:07
    people used to curiosity intrinsic
  • 00:22:10
    motivation by the way they're used to
  • 00:22:11
    boring lectures they've gone to college
  • 00:22:13
    but this has big implications for how we
  • 00:22:15
    think about the digital divide so I just
  • 00:22:17
    want to keep that in your mind and the
  • 00:22:19
    last thing I just want to say is rather
  • 00:22:22
    than going out and trying to create
  • 00:22:23
    tutor Bots for as many courses as
  • 00:22:25
    possible I think what we really need to
  • 00:22:27
    do is have a strategic conversation
  • 00:22:29
    about what's the role and purpose of
  • 00:22:32
    teachers given the way the technology is
  • 00:22:35
    proceeding the one thing I will say here
  • 00:22:38
    is that when we think about what we
  • 00:22:40
    learned in school okay think back think
  • 00:22:43
    back many many
  • 00:22:44
    years we learned many
  • 00:22:47
    things tell me honestly how many of you
  • 00:22:51
    have used geometry proofs since you
  • 00:22:53
    graduated from high
  • 00:22:56
    school three people
  • 00:22:59
    why did we learn state capitals and
  • 00:23:01
    world capitals of every single
  • 00:23:05
    country okay uh foreign languages and by
  • 00:23:09
    the way this is Italian Devi is not a
  • 00:23:11
    goddess devi in Italian means you must
  • 00:23:15
    okay they have
  • 00:23:17
    similarities um why did we learn foreign
  • 00:23:19
    languages when we think about business
  • 00:23:21
    Concepts in our curriculum I often get
  • 00:23:23
    my students who come back 10 years later
  • 00:23:25
    and say those two years were the most
  • 00:23:27
    transformative years of my life I
  • 00:23:29
    often asked them what were the three
  • 00:23:31
    most important Concepts you learned they
  • 00:23:33
    said we have no idea I'm like no no okay
  • 00:23:35
    give me one no no we have no idea I'm
  • 00:23:38
    like so why do you say this was
  • 00:23:39
    transformative the point simply being
  • 00:23:41
    they're saying this was transformative
  • 00:23:44
    not because of the particular content
  • 00:23:46
    but because of the way we were learning
  • 00:23:48
    we were forced to make decisions in real
  • 00:23:49
    time we were listening to others we were
  • 00:23:52
    communicating what are they saying
  • 00:23:54
    they're saying that the real purpose of
  • 00:23:56
    case method was listening and
  • 00:23:58
    communication the real purpose of proofs
  • 00:24:01
    was understanding Logic the real purpose
  • 00:24:04
    of memorizing state capitals was
  • 00:24:07
    refining your memory by the way that
  • 00:24:08
    example there is the poem If by Rudyard
  • 00:24:10
    Kipling some of you might remember this
  • 00:24:12
    from school it goes something like this
  • 00:24:15
    if you can keep your head when all about
  • 00:24:16
    you are losing theirs and blaming it on
  • 00:24:17
    you uh I have PTSD because my nephew
  • 00:24:20
    when he was reciting this to me
  • 00:24:21
    preparing for his 10th grade exams I was
  • 00:24:23
    like what the heck are you doing but it
  • 00:24:25
    was basically refining memory skills and
  • 00:24:28
    for for languages it was just learning
  • 00:24:29
    cultures and syntax when we go deep down
  • 00:24:32
    and think about what we were actually
  • 00:24:35
    teaching I think that probably gives us
  • 00:24:37
    a little more hope because it means it
  • 00:24:40
    doesn't matter if some of these things
  • 00:24:41
    are probably accessible through
  • 00:24:43
    gen when calculators came along we
  • 00:24:46
    thought it's going to destroy math
  • 00:24:47
    skills we're still teaching math
  • 00:24:49
    thankfully 50 years later and it's
  • 00:24:51
    pretty good so this is something that I
  • 00:24:53
    think is going to be an important
  • 00:24:54
    strategic
  • 00:24:55
    conversation this is the slide I'd love
  • 00:24:57
    for you to keep in mind which is
  • 00:24:59
    basically everything I've just said if
  • 00:25:01
    you want to take a screenshot this is
  • 00:25:02
    the slide to take a screenshot thank you
  • 00:25:04
    all so much um and I hope to be in
  • 00:25:09
    touch and keep this
  • 00:25:12
    here at HBS I took Professor Anand's
  • 00:25:15
    class on economics for managers
  • 00:25:16
    listening to him feels like being back
  • 00:25:18
    in class fortunately he didn't cold call
  • 00:25:19
    anyone which is terrific so thank you
  • 00:25:21
    for that now I have a few questions
  • 00:25:24
    we've got young children and you've got
  • 00:25:27
    so much of knowledge available now on
  • 00:25:30
    chat prompts what's your advice to
  • 00:25:33
    everyone who's got young children and
  • 00:25:34
    are wondering about what should they be
  • 00:25:36
    teaching their children so that when
  • 00:25:38
    they grow up and when we don't know what
  • 00:25:40
    the actual capabilities of these
  • 00:25:41
    machines are that what they've learned
  • 00:25:43
    is still
  • 00:25:44
    useful how old are your kids Rahul so my
  • 00:25:47
    son is nine and my daughter is five what
  • 00:25:49
    are you telling them right now now I
  • 00:25:51
    want to learn from you and I know we
  • 00:25:53
    telling them a lot of stuff with a good
  • 00:25:54
    bad ugly I don't I'm trying to refine
  • 00:25:56
    that and give them a framework of what
  • 00:25:57
    we should be telling them so there's two
  • 00:25:59
    things so I think first of all this is
  • 00:26:01
    probably one of the most common
  • 00:26:02
    questions I get uh by the way it's
  • 00:26:05
    really interesting that the tech experts
  • 00:26:08
    and there was an article in the Wall
  • 00:26:09
    Street Journal about this 10 days ago
  • 00:26:12
    are basically telling their kids don't
  • 00:26:13
    learn computer
  • 00:26:16
    science that skill at least basic
  • 00:26:18
    computer programming is
  • 00:26:20
    gone Advanced Computer Science Advanced
  • 00:26:23
    Data analysis if you want to do that
  • 00:26:25
    that's going to be fine what are they
  • 00:26:27
    telling their kids to learn they're
  • 00:26:28
    telling their kids to learn how to teach
  • 00:26:30
    dance they're telling their kids to
  • 00:26:32
    learn how to do plumbing they're telling
  • 00:26:35
    their kids to learn about the
  • 00:26:37
    humanities why are they saying that
  • 00:26:40
    implicitly they're saying what are those
  • 00:26:42
    skill sets that are robust to machine
  • 00:26:46
    intelligence now I will say it is
  • 00:26:49
    virtually impossible to predict that
  • 00:26:50
    given the pace at which this Improvement
  • 00:26:52
    is
  • 00:26:53
    occurring I probably have a slightly
  • 00:26:55
    different kind of answer by the way my
  • 00:26:57
    daughter's majoring in Psychology
  • 00:26:59
    without me telling her anything so the
  • 00:27:01
    kids I think know basically where this
  • 00:27:02
    is going but the one thing I'll say
  • 00:27:04
    Rahul is I don't know when you started
  • 00:27:06
    out college what were you majoring in
  • 00:27:08
    journalism journalism you started out
  • 00:27:10
    with journalism okay that's enlightened
  • 00:27:13
    I started out doing
  • 00:27:15
    chemistry and then the reason I switched
  • 00:27:17
    to economics was probably like many of
  • 00:27:20
    you there was one teacher who inspired
  • 00:27:23
    me and that's what made me
  • 00:27:25
    switch and I would say to kids follow
  • 00:27:29
    the teachers who inspire you and the
  • 00:27:32
    reason is if you can get inspired and
  • 00:27:34
    passionate about a subject that's going
  • 00:27:36
    to build something that's going to be a
  • 00:27:38
    skill that would last all your life
  • 00:27:40
    which is
  • 00:27:41
    curiosity which is intrinsic motivation
  • 00:27:44
    we talked about in the last session this
  • 00:27:45
    is no longer about learning episodically
  • 00:27:49
    it's about learning lifelong and that's
  • 00:27:51
    I think going to be the most important
  • 00:27:52
    skill in the way that Indian families
  • 00:27:54
    operate and as do so many Asian families
  • 00:27:56
    too parents want to equip their children
  • 00:27:58
    with the skills that are likely to be
  • 00:28:00
    most useful when they grow up so it used
  • 00:28:03
    to be say engineering and doctors back
  • 00:28:06
    in the day then uh IT a few years ago so
  • 00:28:10
    if you were looking ahead what do you
  • 00:28:12
    think the children should be learning so
  • 00:28:14
    they acquire skills which are useful in
  • 00:28:16
    the job market years down yeah I think
  • 00:28:18
    that's honestly being too
  • 00:28:20
    instrumental as I said 10 years ago a
  • 00:28:22
    lot of my students were talking to me
  • 00:28:24
    and saying what should I major in I
  • 00:28:25
    never told them computer science if I
  • 00:28:27
    told them that that I would have
  • 00:28:28
    regretted it but I genuinely mean this
  • 00:28:31
    that's looking at it things too narrowly
  • 00:28:33
    what I would say is think about things
  • 00:28:35
    like creativity judgment human emotion
  • 00:28:38
    empathy psychology those are things that
  • 00:28:41
    are going to be fundamentally important
  • 00:28:43
    regardless of where computers are going
  • 00:28:45
    by the way you can get those skills
  • 00:28:47
    through various subjects it doesn't
  • 00:28:49
    matter it's not a one-o-one mapping
  • 00:28:51
    between those skills and a particular
  • 00:28:53
    topic or disciplinary area this is partly
  • 00:28:55
    what I'm saying really think about where
  • 00:28:57
    their passion is how do we teach our
  • 00:28:58
    children how to think because
  • 00:29:00
    everything's available on Google
  • 00:29:02
    co-pilot chat GPT you can just chat GPT
  • 00:29:05
    it so joining the dots giving them a
  • 00:29:08
    framework to be able to interpret
  • 00:29:10
    analyze and think how do you tell them
  • 00:29:12
    that when the easiest thing is go yeah
  • 00:29:16
    so uh it's a good question just two
  • 00:29:18
    things on that the first is there was an
  • 00:29:22
    interesting study done by colleagues at
  • 00:29:23
    MIT recently where they had groups of
  • 00:29:26
    students and they were asked to
  • 00:29:29
    undertake a particular task or learn
  • 00:29:31
    about a topic some students were given
  • 00:29:34
    AI chatbots some students were only
  • 00:29:36
    given Google search with no
  • 00:29:39
    AI what they found is the students with
  • 00:29:42
    access to AI
  • 00:29:44
    intelligence learned the material much
  • 00:29:47
    faster but when it came time to apply it
  • 00:29:50
    on a separate test which was different
  • 00:29:52
    from the first one they found it much
  • 00:29:55
    harder the students who learned the
  • 00:29:57
    material through Google search with no
  • 00:29:59
    other access took
  • 00:30:01
    longer but they did much better on those
  • 00:30:04
    tests why is that part of the issue is
  • 00:30:07
    learning is not simple it takes effort
  • 00:30:11
    okay and so part of the issue is you
  • 00:30:13
    can't compress that
  • 00:30:15
    effort um the harder it is to learn
  • 00:30:19
    something the more likely you'll
  • 00:30:21
    remember it for longer periods of time
  • 00:30:25
    and so I think for me the big
  • 00:30:26
    implication is when I tell my students
  • 00:30:28
    look all these Technologies are
  • 00:30:30
    available it depends on how you use it
  • 00:30:34
    my basic approach to them is just saying
  • 00:30:38
    study because if you get domain
  • 00:30:40
    expertise you will be able to use these
  • 00:30:42
    tools in a much more powerful way later
  • 00:30:45
    on uh so in some sense this goes back to
  • 00:30:48
    the notion of agency it's like we can be
  • 00:30:51
    lazy with tools and Technologies or we
  • 00:30:53
    can be smart it's all entirely up to you
  • 00:30:57
    but this is my advice you know some of
  • 00:30:59
    my friends in Silicon Valley have the
  • 00:31:02
    toughest controls on their children when
  • 00:31:03
    it comes to devices you know we look at
  • 00:31:06
    how much time our children can spend on
  • 00:31:07
    their iPads or TV we're far more lenient
  • 00:31:10
    and they're the guys who are actually in
  • 00:31:11
    the middle of the devices and they're
  • 00:31:13
    developing them and they know the
  • 00:31:15
    dangerous side effects now those devices
  • 00:31:17
    are also the repository of knowledge
  • 00:31:18
    which is where you can learn so much
  • 00:31:20
    from so as an educ every parent has his
  • 00:31:22
    own take and how much time children can
  • 00:31:24
    spend but as an educator how do you look
  • 00:31:26
    at this device addiction just spending
  • 00:31:28
    far more time picking up some knowledge
  • 00:31:30
    but also wasting a lot of time yeah I
  • 00:31:31
    think I mean there's a Nuance here which
  • 00:31:34
    is basically what they're doing is not
  • 00:31:36
    saying don't use devices they're saying
  • 00:31:39
    don't use social media and this goes
  • 00:31:41
    back again to one of the things we were
  • 00:31:43
    talking about
  • 00:31:44
    earlier uh we have gone through a decade
  • 00:31:48
    where things like misinformation
  • 00:31:50
    disinformation and so on there is no
  • 00:31:52
    good solution as far as we know
  • 00:31:54
    today there's also various other kinds
  • 00:31:56
    of habits and so on that are going to
  • 00:31:57
    improved that's partly what they're
  • 00:31:59
    saying stay away from they're not saying
  • 00:32:00
    stay away from computers we can't do
  • 00:32:02
    that and in fact you don't want to do
  • 00:32:04
    that but there's a Nuance in terms of
  • 00:32:05
    how we interact with tools and
  • 00:32:07
    computers that we just want to keep in
  • 00:32:09
    mind when we think about guardrails
  • 00:32:11
    right are you seeing your students
  • 00:32:13
    getting more and more obsessed with
  • 00:32:14
    their devices and how does that impact
  • 00:32:16
    what are you trying to do to get them to
  • 00:32:19
    socialize more you know to spend more
  • 00:32:21
    time with each other and not be stuck on
  • 00:32:23
    their phones or that yeah it's a very
  • 00:32:24
    interesting question so in some sense
  • 00:32:26
    last year we had a conference at Harvard
  • 00:32:28
    we had 400 people from our community
  • 00:32:30
    attend the conference and some of our
  • 00:32:33
    colleagues were saying we should have a
  • 00:32:35
    policy of laptops down no laptops and
  • 00:32:37
    class take out
  • 00:32:39
    devices I was coming in for a session
  • 00:32:41
    right afterwards but part of the reason
  • 00:32:44
    I wanted them to take out their mobile
  • 00:32:45
    phones was I had two or three polls
  • 00:32:48
    during my lecture where I wanted them to
  • 00:32:50
    give me their input so I said mobile
  • 00:32:52
    phone's out okay and this was sort of
  • 00:32:55
    crazy but the story illustrates
  • 00:32:57
    something interesting which is these
  • 00:32:59
    devices for certain things can be really
  • 00:33:01
    powerful it can turn a passive learning
  • 00:33:03
    modality into an active learning
  • 00:33:05
    modality where every single person is
  • 00:33:07
    participating we don't want to take that
  • 00:33:09
    away what we want to try and deal with
  • 00:33:11
    is people playing games while you're
  • 00:33:14
    lecturing now by the way me personally I
  • 00:33:16
    just put it on
  • 00:33:17
    myself if I'm not exciting enough or
  • 00:33:19
    energizing enough for my students to be
  • 00:33:21
    engaged use your Mobile phones that's on
  • 00:33:23
    me okay but that's partly the challenge
  • 00:33:28
    show of hands how many felt engaged during
  • 00:33:30
    the session and how
  • 00:33:33
    many okay no so uh that that that which
  • 00:33:36
    is why agentic AI and chatbots can
  • 00:33:40
    never do what professors can right so uh
  • 00:33:42
    we I'll take some questions KH has a
  • 00:33:44
    question KH go
  • 00:33:45
    ahead hi professor uh you mentioned that
  • 00:33:48
    one of the things that we should work on
  • 00:33:50
    to teach our children is empathy how do
  • 00:33:54
    you actually teach empathy in a
  • 00:33:56
    formal education system or does this
  • 00:33:59
    just go back to then parents and
  • 00:34:03
    family it's a it's a hard it's a hard
  • 00:34:06
    question um in fact this is by the way
  • 00:34:09
    one of the most important issues we're
  • 00:34:11
    facing today on campuses it's related in
  • 00:34:15
    part even in higher education not just
  • 00:34:17
    younger kids when we talk about
  • 00:34:19
    difficult conversations on
  • 00:34:21
    campus part of the reason we're facing
  • 00:34:24
    those issues is because people are
  • 00:34:28
    intransigent it's like I don't care what
  • 00:34:30
    you say I'm not going to change my mind
  • 00:34:33
    one of the things we introduced a couple
  • 00:34:35
    years ago on the Harvard application for
  • 00:34:37
    undergraduate is a question that says
  • 00:34:40
    have you ever changed your mind when
  • 00:34:42
    discussing something with anyone else
  • 00:34:44
    okay or something to that effect but
  • 00:34:46
    that's basically saying how open-minded
  • 00:34:48
    are we that's one version of empathy
  • 00:34:50
    there's many other
  • 00:34:51
    dimensions I think part of the challenge
  • 00:34:53
    is that we don't teach that in schools
  • 00:34:58
    right we don't teach that formally in
  • 00:34:59
    schools which is partly why there's this
  • 00:35:01
    whole wave now of schools not just in
  • 00:35:04
    other countries in India which has
  • 00:35:06
    started to talk about how do we teach the
  • 00:35:08
    second curriculum the hidden curriculum
  • 00:35:10
    how do we teach those social and
  • 00:35:11
    emotional skills The Book of Life so to
  • 00:35:14
    speak and I think I mean it's not rocket
  • 00:35:18
    science to say this it starts at home
  • 00:35:20
    right like that's basically what we do
  • 00:35:22
    with our kids every single day um but
  • 00:35:25
    that's something that's I think going to
  • 00:35:26
    become fundamentally more important
  • 00:35:28
    partly because of the reasons are what
  • 00:35:30
    what I talked about Dr sanjie B has a
  • 00:35:32
    question okay I see lots of hands going
  • 00:35:37
    up yes Dr wonderful wonderful listening
  • 00:35:40
    to you um just with regards Ai and
  • 00:35:43
    Technology I've always said that uh Ai
  • 00:35:45
    and digital technology is not an
  • 00:35:48
    expenditure It's actually an investment
  • 00:35:51
    so very quickly if you allow me just 60
  • 00:35:53
    seconds in healthcare it gives you
  • 00:35:55
    better clinical outcomes it has
  • 00:35:57
    decreased the number one cause of death
  • 00:36:00
    hospital acquired infections in many
  • 00:36:03
    hospital chains to practically less than
  • 00:36:05
    1% so it gives you a safer outcome it
  • 00:36:08
    gives you a better patient experience
  • 00:36:10
    the turnaround of the bed strength is a
  • 00:36:12
    lot quicker and more importantly is it
  • 00:36:14
    gives you better operational excellence
  • 00:36:17
    so all the hospitals as far as medical
  • 00:36:19
    facilities are concerned who have not
  • 00:36:20
    embraced it as yet will find it
  • 00:36:23
    difficult to operate in the present
  • 00:36:24
    environment the what Ai and digital
  • 00:36:27
    technology has made us learn as doctors
  • 00:36:30
    is that data is the new gold if you
  • 00:36:32
    don't analyze data if you don't see what
  • 00:36:34
    your results are if you don't see where
  • 00:36:36
    your clinical outcomes are then you
  • 00:36:37
    can't go forward so AI is what is in the
  • 00:36:40
    future for us all of us thank you that's
  • 00:36:42
    more in the form of an observation let me just
  • 00:36:44
    elaborate on that in two ways one is I
  • 00:36:47
    think I would just go back and useful to
  • 00:36:50
    contextualize AI right like right now we
  • 00:36:53
    we often get obsessed by the latest
  • 00:36:55
    technology when we think about
  • 00:36:56
    upskilling reskilling in education
  • 00:36:59
    there's a revolution that started a
  • 00:37:01
    decade ago as I alluded to there's
  • 00:37:04
    basically 3,000 courses available to all
  • 00:37:06
    of you today on any subject so the
  • 00:37:09
    notion of let's wait for AI no no no
  • 00:37:11
    it's already there my father-in-law
  • 00:37:13
    who's 92 years old, during COVID he said,
  • 00:37:15
    Bharat, what should I do? I said we have all
  • 00:37:17
    these courses from Harvard
  • 00:37:19
    available. In the last two or three
  • 00:37:21
    years he's completed 35 courses. Wow, okay,
  • 00:37:26
    at the age of 92 wow wow by the way he's
  • 00:37:29
    paid $0 for that because he said I don't
  • 00:37:31
    need a certificate and so I told him
  • 00:37:33
    you're the reason we have a business
  • 00:37:34
    model problem okay but that's one
  • 00:37:37
    aspect the the second aspect is sort of
  • 00:37:40
    thinking about where you're going I
  • 00:37:42
    think you're exactly right, Sanjie, which is
  • 00:37:44
    every organization is going to have
  • 00:37:46
    low-hanging fruit. The one thing I just
  • 00:37:48
    caution is there's going to be a paradox
  • 00:37:50
    of access meaning if every organization
  • 00:37:54
    every one of your peers has access to
  • 00:37:56
    the same technology as you
  • 00:37:58
    it's going to be harder for you to
  • 00:37:59
    maintain competitive advantage that's a
  • 00:38:02
    fundamental question okay this is just a
  • 00:38:04
    basic observation so I just want to sort
  • 00:38:07
    of mention that but you're absolutely
  • 00:38:09
    right about the low-hanging fruit in
  • 00:38:10
    medicine and healthcare. Okay, Toby Walsh
  • 00:38:12
    has a question or an observation and
  • 00:38:14
    then we... there are lots of hands up, okay, I
  • 00:38:16
    don't frankly know what to do because
  • 00:38:18
    we're also out of time so let this just
  • 00:38:19
    be where we conclude one of the greater
  • 00:38:21
    challenges especially in higher
  • 00:38:22
    education is the cost has gone through
  • 00:38:24
    the roof are you optimistic that AI is
  • 00:38:27
    going to be able to turn that around so
  • 00:38:31
    again I'll just go back uh to what's
  • 00:38:34
    happened in the last decade as I said
  • 00:38:37
    you can now get access to credentials
  • 00:38:39
    and certificates at a minimal cost
  • 00:38:42
    compared to the cost of getting a degree
  • 00:38:44
    okay just to put it in perspective we
  • 00:38:46
    have 177,000 degree students every year
  • 00:38:49
    who come to Harvard they are paying a
  • 00:38:51
    lot of money those who need financial
  • 00:38:53
    aid get financial aid by the way can
  • 00:38:55
    anyone guess how many students we have
  • 00:38:57
    touched over the last
  • 00:39:00
    decade? Ten times? A hundred times that? It's
  • 00:39:03
    about 15
  • 00:39:04
    million. That is not a story we publicize,
  • 00:39:07
    but that's a story about the number of
  • 00:39:09
    students who've actually taken a Harvard
  • 00:39:11
    course or enrolled in a Harvard course
  • 00:39:13
    so in some sense I think where we are
  • 00:39:14
    today is the marginal cost of providing
  • 00:39:17
    education is very very low what we need
  • 00:39:20
    for that is not incremental improvement
  • 00:39:23
    on the existing model we need to
  • 00:39:25
    basically break it apart and say how do
  • 00:39:28
    we put it back together again in a way
  • 00:39:30
    that makes sense for everyone um there's
  • 00:39:33
    an organization that we just started at
  • 00:39:35
    Harvard called Axim, jointly with MIT,
  • 00:39:37
    with the endowment from the sale of the
  • 00:39:38
    edX platform, whose only function is to
  • 00:39:41
    increase access and equity in education
  • 00:39:43
    and by the way their focus is on 40
  • 00:39:45
    million people in America who start
  • 00:39:47
    college but never complete it, not just
  • 00:39:49
    because of cost but for many other reasons.
  • 00:39:51
    Right? In sum, the potential to reduce
  • 00:39:54
    the cost is massive but it's going to
  • 00:39:57
    require leadership and strategy this
  • 00:39:59
    gentleman here has a question can
  • 00:40:01
    someone just take the mic to him
  • 00:40:08
    please. Uh, so earlier it was: okay, use
  • 00:40:12
    AI and it will summarize and help you in
  • 00:40:14
    productivity, but with the latest OpenAI
  • 00:40:16
    models like o3-mini and all that, they
  • 00:40:19
    are doing reasoning which is much better
  • 00:40:21
    than humans, so the people who are not
  • 00:40:24
    using it are at a disadvantage.
  • 00:40:28
    So isn't it right that the students use
  • 00:40:30
    AI and be familiar with it and be
  • 00:40:35
    up to speed with it, rather than not
  • 00:40:37
    using it and be at a disadvantage to
  • 00:40:39
    other students? Yeah, absolutely, there's
  • 00:40:41
    no question about that by the way I sit
  • 00:40:44
    at Harvard overseeing the generative AI
  • 00:40:46
    task force for teaching and learning and
  • 00:40:48
    we have 17 faculty the most interesting
  • 00:40:50
    conversations I've had about adoption
  • 00:40:53
    are with our
  • 00:40:54
    students. Now, when we understand
  • 00:40:57
    that behavior, it just throws up things
  • 00:40:59
    that we wouldn't even have thought about
  • 00:41:01
    I'll ask you one question we had a
  • 00:41:03
    sandbox that we created for the entire
  • 00:41:04
    Harvard community, which was a safe and
  • 00:41:06
    secure sandbox giving them access to
  • 00:41:08
    large language models as opposed to
  • 00:41:10
    using public OpenAI. The adoption rate
  • 00:41:12
    amongst our faculty was about 30-35% in
  • 00:41:15
    the first year what do you think the
  • 00:41:17
    adoption rate was amongst our
  • 00:41:21
    students it was about
  • 00:41:23
    5% so we were surprised when we went to
  • 00:41:26
    them we said what's going on are you
  • 00:41:28
    familiar with the sandbox they said yeah
  • 00:41:30
    we are we said are you using it they
  • 00:41:32
    said no we said are you using AI in any
  • 00:41:34
    way? Yeah, yeah, we have access to ChatGPT,
  • 00:41:36
    we have our own private accounts there.
  • 00:41:38
    So we're like, wait, wait, why are you
  • 00:41:40
    not using the secure Harvard
  • 00:41:42
    sandbox what do you think their answer
  • 00:41:46
    was they said why would we use something
  • 00:41:49
    where you can see what we're
  • 00:41:51
    inputting now by the way as faculty
  • 00:41:54
    members if the number one question we
  • 00:41:56
    talk about with generative AI is oh
  • 00:41:58
    we're worried about cheating and
  • 00:41:59
    assessments our students are listening
  • 00:42:01
    to us they're like oh if that's what
  • 00:42:03
    you're worried about we're not coming
  • 00:42:04
    anywhere close to you okay so part of
  • 00:42:06
    the point is the students are far ahead
  • 00:42:08
    of us in terms of using this they're
  • 00:42:10
    using it to save time they're using it
  • 00:42:11
    for engaging in deep learning. We'd better
  • 00:42:14
    understand that ourselves to figure out
  • 00:42:15
    what we can do.
  • 00:42:17
    Brilliant presentation, just wanted
  • 00:42:19
    to understand: on one side of the spectrum
  • 00:42:21
    you have all the, you know, the positives.
  • 00:42:24
    What's on the other side? What risk do
  • 00:42:26
    you think is there on the other side? It
  • 00:42:28
    starts coding on its own, gets out of
  • 00:42:30
    hand, is that a possibility? What's the
  • 00:42:32
    possibility? So, the risks are the
  • 00:42:33
    things I talked about towards the end
  • 00:42:35
    okay which is number one we put our head
  • 00:42:38
    in the sand as institutions and we don't
  • 00:42:41
    take this seriously that's the first
  • 00:42:43
    risk. The second risk is lazy learning,
  • 00:42:46
    as I would call it. Now again, that's
  • 00:42:48
    agency it partly depends on you as a
  • 00:42:51
    student do I want to be lazy do I not
  • 00:42:53
    want to be lazy the third risk is
  • 00:42:56
    everything we were talking about in the
  • 00:42:57
    previous session with respect to
  • 00:42:58
    misinformation
  • 00:42:59
    disinformation the fourth big risk is
  • 00:43:02
    asking the fundamental question what's
  • 00:43:04
    our role as teachers and I'll just share
  • 00:43:06
    one anecdote in closing there's a
  • 00:43:08
    colleague at another school who called
  • 00:43:09
    me and said my students have stopped
  • 00:43:12
    reading the cases they're basically
  • 00:43:15
    inputting the assignment questions into
  • 00:43:16
    generative AI, and by the way they're so
  • 00:43:18
    smart they're saying give me a quirky
  • 00:43:20
    answer I can use in class okay the
  • 00:43:23
    assessments are compromised and get this
  • 00:43:25
    the faculty have stopped reading cases
  • 00:43:27
    they're inputting the cases and
  • 00:43:29
    basically saying give me the teaching
  • 00:43:32
    plan that's the
  • 00:43:34
    downside you know we we met on a flight
  • 00:43:37
    from Delhi to Mumbai and we had a long
  • 00:43:39
    conversation about the future of
  • 00:43:40
    Education we've been able to in the past
  • 00:43:42
    45 minutes recreate the magic of that
  • 00:43:44
    conversation here on stage can we have a
  • 00:43:46
    very warm round of applause for the
  • 00:43:47
    professor for making the effort of
  • 00:43:49
    coming here and for joining us and for
  • 00:43:52
    delivering this master class thank you
  • 00:43:54
    absolute pleasure thank you so much
  • 00:43:56
    thank you
Tags
  • Generative AI
  • Education
  • Harvard
  • Technology
  • Learning
  • ChatGPT
  • Future of Education
  • Critical Thinking
  • Empathy
  • Emotional Intelligence