The Next Paradigm Shift in Human-Machine Interaction | Magnus Arveng | TEDxTrondheim

00:08:52
https://www.youtube.com/watch?v=lC5AN4fJAcs

Summary

TL;DR: The video traces the evolution of human-machine interaction, from ancient technologies like the potter's wheel to the latest innovations such as touchscreens, and on to potential future interactions using sensors and extended reality. The speaker emphasizes the importance of designing intuitive, human-centric interfaces to ensure inclusive and empowering use of technology. The development of a "Smart Glove" for intuitive control in space exploration, featured in a NASA project, exemplifies how new interaction paradigms can enhance functionality and integration in both digital and physical realms.

Key Takeaways

  • 🦾 Evolution of interfaces: From mechanical to digital, evolving to be more intuitive.
  • 🌐 Human-centric design: Essential for inclusive and empowering tech use.
  • 🚀 Space tech innovation: 'Smart Glove' exemplifies new interaction paradigms.
  • 🤖 Future interactions: Sensors can transform how we control machines.
  • 📱 Powerful devices: Modern smartphones exceed Apollo-era capabilities.
  • 🧠 Processing power: Idle computing resources can enhance interaction.
  • 🌌 Extended reality: Augmented and virtual realities will play roles in future interaction.
  • 👥 Inclusive tech: Aims to engage all users effectively.
  • 🛠️ Universal translator: Envisioned to bridge human actions and machine commands.
  • 🤝 Empowerment through tech: Designs should focus on human engagement and control.

Timeline

  • 00:00:00 - 00:08:52

    Human-machine interaction has evolved from using physical actions like controlling a potter’s wheel to digital interfaces like touchscreens. The speaker discusses the shift from mechanical to digital interaction, emphasizing the need for design that makes this interaction intuitive. The emergence of new technologies, such as sensors and augmented reality, aims to simplify these interactions, making them more inclusive. The speaker's company is working on a smart glove that can intuitively control machines, which could revolutionize interactions, especially in contexts like space exploration with NASA. The ultimate goal is a human-centric future where technology empowers rather than alienates people.

Video Q&A

  • What is the main focus of the video?

    The evolution and future of human-machine interaction.

  • How did machine interfaces evolve over time?

    From mechanical interfaces to graphical user interfaces, and eventually to touchscreens.

  • What role do sensors and microphones play in new machines?

    They serve as new input methods for human-machine interaction.

  • What technological advancement is compared to the Apollo missions?

    Today's smartphones have more processing power than what was used during the Apollo missions.

  • What is a significant theme discussed regarding future interaction?

    Creating intuitive and human-centric machine interactions.

  • How is the 'Smart Glove' relevant?

    It's a sensor-based glove developed for intuitive control of drones and explored for use in space exploration.

  • What is the potential impact of intuitive interaction with machines?

    To create a more inclusive and empowering world.

  • What is a 'universal translator' in the context of this video?

    A system proposed to interpret human actions into machine commands.

  • Why is designing human-centric systems important?

    To ensure technology is inclusive and empowering, not alienating.

  • What example of space exploration technology is discussed?

    The Astronaut Smart Glove used in NASA's Haughton-Mars project.

Transcript (en)
  • 00:00:00
    Transcriber: gaith Takrity Reviewer: Michael Nystrom
  • 00:00:05
    Ever since the invention of the potter’s wheel
  • 00:00:08
    back in ancient Mesopotamia,
  • 00:00:11
    we humans have had machines in our lives.
  • 00:00:15
    The way we interact with these machines has varied over time, but mostly
  • 00:00:19
    we’ve used our bodies and our hands.
  • 00:00:24
    Technology helps us humans in our everyday lives,
  • 00:00:27
    and it even propels us to the Moon.
  • 00:00:32
    We have just entered the second era of human deep space exploration,
  • 00:00:38
    and I believe that we are on the brink of the next paradigm shift
  • 00:00:43
    in human-machine interaction.
  • 00:00:44
    And it might be used on the Moon first.
  • 00:00:47
    I’ll tell you why in a minute.
  • 00:00:50
    But first, what is interaction?
  • 00:00:54
    The Cambridge Dictionary defines interaction as:
  • 00:00:57
    “when two or more people or things communicate with or react to each other.”
  • 00:01:06
    We have human-object interaction, like when I sip on my glass of water.
  • 00:01:12
    Like when you have your first cup of coffee in the morning
  • 00:01:15
    or when you inadvertently push on a door that you were supposed to pull.
  • 00:01:22
    Then we have human-machine interaction.
  • 00:01:26
    And that occurs when we as humans want to interact with
  • 00:01:30
    or control any kind of machine.
  • 00:01:32
    And that's what I'm going to focus on here today.
  • 00:01:37
    Like when you want to pull a lever to control a crane,
  • 00:01:43
    if the interaction is designed well and you know what to do,
  • 00:01:48
    it usually goes right.
  • 00:01:49
    But this requires some sort of afterthought, training, or practice.
  • 00:01:56
    A hundred years ago,
  • 00:01:58
    machines were mostly mechanical in their nature,
  • 00:02:02
    and they were built in such a way that they require
  • 00:02:05
    a very specific set of input actions in order to work as intended.
  • 00:02:10
    It’s a simple action-reaction interaction.
  • 00:02:15
    Then, with the advent of the microprocessor in the early 1970s,
  • 00:02:21
    we were able to create programmable interfaces.
  • 00:02:25
    Suddenly, we could choose what button to push or lever to pull
  • 00:02:30
    and decide what would happen.
  • 00:02:34
    But with great power comes great responsibility.
  • 00:02:37
    And since then,
  • 00:02:38
    we have had to consciously think about how we create these interactions
  • 00:02:43
    in order to make them intuitive for us humans.
  • 00:02:49
    In the 1980s, the graphical user interface was introduced,
  • 00:02:54
    and the mouse and keyboard became the standard
  • 00:02:57
    on how to interact with machines.
  • 00:03:00
    It would take nearly 40 years until this interaction was dethroned
  • 00:03:05
    by the touchscreen as the most widely adopted one.
  • 00:03:09
    At this point in time, suddenly, anyone would be able to control a machine,
  • 00:03:14
    and you didn’t need to be a programmer to do it.
  • 00:03:18
    During this time, technology has been rapidly advancing.
  • 00:03:23
    We have, suddenly, computers in our pockets
  • 00:03:28
    with so much excess computational power
  • 00:03:32
    that it can do more than its core tasks.
  • 00:03:36
    At any given point in time,
  • 00:03:37
    you have idle processing power,
  • 00:03:40
    and it’s a shame that we don’t do more with it
  • 00:03:43
    because we could use this processing power to enhance the interaction.
  • 00:03:49
    In fact, we have more processing power in our pockets
  • 00:03:53
    than was used for the Apollo missions.
  • 00:03:58
    So we’ve come a long way, but there’s a long way to go still.
  • 00:04:03
    Even though we have advanced from analog inputs like buttons and levers
  • 00:04:08
    into more digital spheres like the touchscreen,
  • 00:04:14
    the next paradigm shift is yet to come.
  • 00:04:17
    We’re now able to equip new machines with sensors,
  • 00:04:21
    microphones, and other means of sensing the environment.
  • 00:04:26
    And suddenly, we could use that as input methods
  • 00:04:30
    so our bodies could become the control system.
  • 00:04:39
    There is a digital divide
  • 00:04:44
    where some people proficient in technology know how to use
  • 00:04:50
    all of the gadgets that we now carry around,
  • 00:04:52
    while others might be left behind.
  • 00:04:56
    But we can choose to design a different future,
  • 00:05:00
    a future that is human-centric,
  • 00:05:04
    where interaction with machines is easy,
  • 00:05:08
    effortless, and leaves people in control.
  • 00:05:13
    We can create a future where humans experience happiness
  • 00:05:18
    with positive interactions with their environment.
  • 00:05:23
    But then we need to make human-machine interaction intuitive,
  • 00:05:28
    and that is what we have been doing at my company for the past six years.
  • 00:05:33
    We believe that there is a need for a universal translator
  • 00:05:38
    that can take into account the environment and the context
  • 00:05:42
    in which an action is performed,
  • 00:05:44
    and then analyze the human behavior,
  • 00:05:47
    try to extract the intention behind that action,
  • 00:05:52
    and translate that into machine command.
  • 00:05:55
    If we manage to do this properly,
  • 00:05:57
    we could revolutionize the way humans and machines interact.
  • 00:06:02
    It will become more visible
  • 00:06:04
    once technology for extended and augmented realities
  • 00:06:09
    becomes more mature and usable.
  • 00:06:11
    We see headsets, glasses, and other means of adding a digital layer
  • 00:06:16
    on top of our world or merging them all together.
  • 00:06:23
    My journey into this realm started with a smart glove.
  • 00:06:27
    It’s a sensor-based glove that could control drones.
  • 00:06:32
    Then it was discovered by Dr. Pascal Lee,
  • 00:06:36
    a planetary scientist and researcher at NASA.
  • 00:06:39
    Now a friend of ours, he encouraged us to develop this technology
  • 00:06:44
    for potential use in exploring the Moon, Mars, and beyond.
  • 00:06:49
    It quickly became
  • 00:06:53
    the “Astronaut Smart Glove.”
  • 00:06:59
    And we used it during a field test with the NASA Haughton-Mars Project,
  • 00:07:03
    where it was integrated
  • 00:07:05
    into an analog prototype of a next-generation spacesuit
  • 00:07:08
    replacing the astronauts’ need for a conventional controller.
  • 00:07:14
    We created an interaction system that intuitively let astronauts
  • 00:07:20
    interact with and control robotic assets in space.
  • 00:07:25
    And this could be used on Earth as well.
  • 00:07:29
    So why would this benefit us?
  • 00:07:34
    Because intuitive interaction with machines
  • 00:07:38
    would help create a more inclusive world,
  • 00:07:41
    a world where humans are engaged with
  • 00:07:46
    and empowered by the use of technology
  • 00:07:49
    rather than being estranged in a world
  • 00:07:52
    where it goes literally out of our control.
  • 00:07:58
    I envision a future, a future for all of us,
  • 00:08:02
    where we, as humans, can impact the way we interact with machines
  • 00:08:08
    and the choices we make today.
  • 00:08:10
    When designing systems for interacting with technology,
  • 00:08:15
    we need to think about the humans in the loop.
  • 00:08:19
    We need to make it human-centric.
  • 00:08:23
    We believe that human-machine interaction
  • 00:08:27
    should be universal and on human terms.
  • 00:08:32
    Thank you.
  • 00:08:33
    (Applause)
Tags
  • human-machine interaction
  • technology evolution
  • intuitive design
  • Smart Glove
  • inclusive technology
  • extended reality
  • human-centric design
  • space exploration
  • digital interfaces
  • NASA