Lecture 01: Vectors in Machine Learning

00:39:48
https://www.youtube.com/watch?v=Y1Gndz4sNeE

Summary

TL;DR: In the first lecture of 'Essential Mathematics for Machine Learning,' the focus is on understanding vectors, their properties, and how they apply to machine learning. A vector is a fundamental mathematical entity representing both magnitude and direction within a vector space. In machine learning, vectors are crucial for representing data attributes. The lecture covers vector addition, scalar multiplication, the dot product, and magnitude calculation. It also discusses linear combinations, linear dependence and independence of vectors, and orthogonal and orthonormal vectors. These concepts underpin more advanced topics in machine learning and are briefly implemented in Python for practical understanding.

Takeaways

  • 📏 Vectors represent both magnitude and direction in a vector space.
  • ➕ Vector addition is component-wise, resulting in a new vector.
  • ✖️ Dot product results in a scalar, useful for angle computation.
  • 📐 The magnitude of a vector is the square root of the sum of the squares of its components.
  • 🔗 Linear independence means no vector is a linear combination of others.
  • 🟦 Orthogonal vectors are perpendicular with a dot product of zero.
  • 🌀 Orthonormal vectors are orthogonal and have a magnitude of one.
  • 📊 Vectors in machine learning represent data features for various observations.
  • 🔧 Vector operations can be implemented in Python using NumPy.
  • 📚 Understanding vectors is foundational to learning machine learning concepts.

Timeline

  • 00:00:00 - 00:05:00

    The lecture begins by reintroducing vectors, a fundamental concept in machine learning and mathematics. Vectors are explained as mathematical objects that have length and direction, and exist in a vector space where they share common properties. They can be represented as 1-dimensional arrays, either as row or column vectors, in n-dimensional space. The lecturer emphasizes understanding the geometrical and mathematical representation of vectors, which is crucial for grasping machine learning algorithms.

  • 00:05:00 - 00:10:00

    In this section, the lecturer explains the geometric representation of vectors in two-dimensional and three-dimensional space. The basic operations of vector algebra, addition and subtraction, are then covered, with examples illustrating how these operations work component-wise and why the vectors involved must have the same dimension. The speaker notes that the results of these operations stay within the vector space, adhering to its closure rules.

  • 00:10:00 - 00:15:00

    The lecture delves into dot products and how they result in scalar quantities. The concept is illustrated with examples, clarifying how the operation is performed in vector algebra. The lecturer proceeds to explain the concept of vector magnitude or length, which is obtained through the dot product of a vector with itself, followed by calculating the square root. This understanding is foundational for deeper vector algebra and machine learning.

  • 00:15:00 - 00:20:00

    Focus shifts to more complex operations such as calculating the angle between two vectors using their dot product and magnitudes. Then, the lecturer introduces linear combinations of vectors, explaining how new vectors can be formed through scalar multiplications and additions. Understanding linear combinations is stressed as pivotal for forming complex models and algorithms in machine learning.

  • 00:20:00 - 00:25:00

    The lecture explores the concepts of linear independence and dependence of vectors, including how to identify them through vector equations. Examples demonstrate these principles, showing how vectors can or cannot be represented in terms of others. This section lays the foundation for recognizing vector relationships that are crucial for algorithm development in machine learning.

  • 00:25:00 - 00:30:00

    Orthogonal and orthonormal vectors are introduced: orthogonal vectors have a dot product of zero, and orthonormal vectors additionally have a length of one. The lecturer notes that orthogonality implies linear independence, a key fact used later in machine learning. Examples illustrate these principles, and the usefulness of converting linearly independent vectors into orthogonal ones is outlined.

  • 00:30:00 - 00:39:48

    Finally, the practical application of vectors in machine learning is exemplified through feature vectors in datasets. The process of creating and manipulating feature vectors is touched upon. A brief introduction to using Python for vector operations is given, highlighting the NumPy library for vector algebra. Students are encouraged to explore further vector manipulations in Python to solidify their understanding. A consolidated NumPy sketch of these operations follows this timeline.
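
A minimal NumPy sketch of the operations this timeline describes, using the lecture's R^2 example vectors (1, 3) and (1, 1). This is an illustration assembled from the formulas above, not the lecturer's own notebook (that snippet appears near the end of the transcript).

    import numpy as np

    v1 = np.array([1, 3])          # the lecture's first R^2 example
    v2 = np.array([1, 1])

    print(v1 + v2)                 # [2 4]  component-wise addition
    print(v1 - v2)                 # [0 2]  component-wise subtraction
    print(3 * v1)                  # [3 9]  scalar multiplication
    print(np.dot(v1, v2))          # 4      dot product: 1*1 + 3*1
    print(np.linalg.norm(v1))      # 3.162... = sqrt(1^2 + 3^2)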



Video Q&A

  • What is a vector?

    A vector is a mathematical object that encodes length and direction, represented in a vector space.

  • How can vectors be represented?

    Vectors can be represented using one-dimensional arrays, either as column or row vectors.

  • What is vector addition?

    Vectors can be added by summing their components, resulting in a new vector within the same space.

  • What is a dot product?

    The dot product is a scalar resulting from multiplying corresponding components of two vectors and summing them.

  • How is the length of a vector calculated?

    The length or magnitude of a vector is the square root of its dot product with itself.

  • What are orthogonal vectors?

    Orthogonal vectors are vectors whose dot product equals zero, indicating they are perpendicular.

  • What does it mean for vectors to be linearly independent?

    Vectors are linearly independent if no vector in the set can be expressed as a combination of others.

  • How do you calculate the angle between two vectors?

    The angle between two vectors is calculated as the inverse cosine of their dot product divided by the product of their magnitudes (a short sketch follows this Q&A).

  • What are orthonormal vectors?

    Orthonormal vectors are both orthogonal and have a length of one.

  • How are vectors used in machine learning?

    In machine learning, vectors represent data attributes, where each vector is an observation comprised of features.
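
The angle formula from the Q&A above, sketched in NumPy; the two vectors are the lecture's R^2 examples, and np.arccos returns the angle in radians.

    import numpy as np

    v1 = np.array([1, 3])
    v2 = np.array([1, 1])

    # theta = arccos( (v1 . v2) / (||v1|| * ||v2||) )
    cos_theta = np.dot(v1, v2) / (np.linalg.norm(v1) * np.linalg.norm(v2))
    theta = np.arccos(cos_theta)
    print(theta)                # 0.4636... radians
    print(np.degrees(theta))    # 26.56... degrees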

Subtitles
  • 00:00:22
    Hello friends, welcome to the first lecture of this course, Essential Mathematics for Machine Learning. In this lecture we will learn, or in fact recall, the concept of vectors. We have done vectors in school-level mathematics, so we will recall some concepts from there and then try to relate them to the context of machine learning.
  • 00:00:57
    To be very frank, the vector is the most basic entity of any machine learning algorithm. If I define it: a vector is a mathematical object that encodes a length and a direction. Speaking as a mathematician, vectors are elements of a vector space, in which we put together infinitely many vectors that share some common properties. A vector space is a collection of objects that is closed under an addition rule, meaning that if you add two vectors of a vector space, the resulting vector also belongs to the same vector space, and under a rule for multiplication by a scalar, meaning that if you multiply a vector by a scalar, the resulting vector also lies in the same vector space.
  • 00:02:04
    In terms of representation, we represent vectors by one-dimensional arrays. This may be a vertical array, that is, a column vector, or a horizontal array, that is, a row vector. Geometrically, vectors typically represent coordinates within an n-dimensional space, where n is the number of components in that particular vector. A simplified representation of a vector is an arrow in a vector space, with an origin, a direction, and a length, which is also called the magnitude of the vector.
  • 00:02:45
    Let us try to understand all these concepts. Let me take a vector v = (v1, v2, ..., vn); it is a vector in an n-dimensional space. If all these components v1, v2, ..., vn belong to the set of real numbers, then I will say it is a real vector of dimension n, and in that case it is a vector in the vector space R^n. We will study the concept of a vector space more formally in the third lecture. We can represent it by a one-dimensional array, which can be a row, or we can write it in the form of a column, again belonging to R^n, where each of v1, v2, ..., vn is a real number.
  • 00:04:08
    For example, take n = 2; it means I am talking about the vector space R^2, which is nothing but R × R, so an element is an ordered pair of real numbers. Let us represent it on the x and y axes: let me take a vector in R^2, v = (1, 2). The first component is the component in the direction of the x-axis, and the second component corresponds to the y direction. So I have a point (1, 2), and the arrow from the origin to it is my vector v. The length of this arrow is the magnitude of v, and the angle the vector v makes with the x-axis represents the direction of v.
  • 00:05:36
    Similarly, in three-dimensional space, with axes x, y, and z, we will have a vector with three components: one in the x direction, one in the y direction, and one in the z direction. For example, take v = (1, 2, 3). In R^n a vector will have n components, so I cannot plot an n-dimensional vector here the way I can in 2D, or in 3D using some software; vectors with dimension more than three cannot be plotted easily. This is the abstract setting of a vector.
  • 00:06:38
    Now let us see some vector algebra. First, addition and subtraction: we can add or subtract two vectors if they have the same dimension. For example, in R^2, let me take a vector v1 = (1, 3), and v2 = (1, 1), because it will be easier to plot. Then for v1 + v2 you add the x component of v1 to the x component of v2, so 1 + 1 = 2, and the y component of v1 to the y component of v2, so 3 + 1 = 4. Geometrically, on the x and y axes, v1 is the arrow to (1, 3) and v2 is the arrow to (1, 1), and the sum of these two vectors is the arrow to (2, 4). Similarly, for v1 − v2 you subtract the first component of v2 from the first component of v1, so 1 − 1 = 0, and the second component of v2 from the second component of v1, so 3 − 1 = 2. So this is addition and subtraction.
  • 00:09:18
    Similarly, we can do this in R^3, or in general in R^n. In R^n, if you have two vectors, say v1 = (x1, x2, ..., xn) and v2 = (y1, y2, ..., yn), then for v1 + v2 you add the first components of both vectors and write the result as the first component of the sum, and so on: v1 + v2 = (x1 + y1, x2 + y2, ..., xn + yn). Similarly, v1 − v2 = (x1 − y1, x2 − y2, ..., xn − yn).
  • 00:10:24
    Now, the dot product of two vectors. Let me take two vectors v1 = (x1, x2, ..., xn) and v2 = (y1, y2, ..., yn), both belonging to R^n. Then the dot product of v1 with v2 is a scalar, which is nothing but the component-wise products summed up: v1 · v2 = x1 y1 + x2 y2 + ... + xn yn, or in short, the summation over i = 1 to n of xi yi. For example, take in R^3 a vector v1 = (1, 1, -1) and another vector v2 = (2, 3, 1); then the dot product of v1 and v2 is 1 × 2 + 1 × 3 + (-1) × 1, which comes out to be 4. So this is the dot product between two vectors; later we will see the concept of the inner product, which is a generalized version of this dot product.
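
    The summation formula above translates directly into a few lines of Python; a minimal sketch (not from the lecture), checked against the R^3 example just worked out.

        def dot(x, y):
            # sum over i of x_i * y_i, as in the definition above
            assert len(x) == len(y), "vectors must have the same dimension"
            return sum(xi * yi for xi, yi in zip(x, y))

        print(dot([1, 1, -1], [2, 3, 1]))   # 1*2 + 1*3 + (-1)*1 = 4
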
  • 00:12:19
    The third operation is the length or magnitude of a vector. Let me take a vector v in R^n with components (x1, x2, ..., xn). The length of v, denoted ||v||, is again a scalar: it is nothing but the square root of the dot product of v with itself. So v · v becomes x1^2 + x2^2 + ... + xn^2, and we take the square root of this. In this way we can calculate the length or magnitude of a vector v. For example, take v = (1, -1, 2); then the length of this vector v is the square root of 1^2 + (-1)^2 + 2^2, which comes out to be √6. If the length of a vector is zero, then the vector is the zero vector; for a nonzero vector, the length or magnitude will be greater than zero.
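
    Following the definition just given, ||v|| is the square root of the dot product of v with itself; a short sketch (not from the lecture) verifying the √6 example.

        import math

        def length(v):
            # ||v|| = sqrt(v . v) = sqrt(x1^2 + ... + xn^2)
            return math.sqrt(sum(x * x for x in v))

        print(length([1, -1, 2]))   # 2.449... = sqrt(6)
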
  • 00:14:23
    Another concept: the angle between two vectors. Let me take two vectors v1 and v2 belonging to R^n. Then the angle between these two vectors is given by theta = cos⁻¹( (v1 · v2) / (||v1|| ||v2||) ). In the numerator I have the dot product of v1 and v2, and in the denominator I have the product of their lengths. This gives the angle between the two vectors v1 and v2; here I am making use of the dot product.
  • 00:15:30
    Now come to the concept of a linear combination of vectors. Consider a set S of, let us say, k vectors v1, v2, ..., vk from some vector space. Then a new vector of the same dimension, in fact in the same vector space, v = alpha1 v1 + alpha2 v2 + ... + alphak vk, is called a linear combination of v1, v2, ..., vk, where alpha1, alpha2, ..., alphak are scalars. They come from the field on which the vector space is defined, which we will learn about very soon; at this moment we can assume they are real numbers.
  • 00:17:23
    For example, take three vectors v1 = (1, 2, -1), v2 = (1, 1, 0), and v3 = (0, 1, -1). Then I will have the linear combination alpha1 v1 + alpha2 v2 + alpha3 v3 = alpha1 (1, 2, -1) + alpha2 (1, 1, 0) + alpha3 (0, 1, -1). In another way, I can write it as a new vector whose first component is alpha1 + alpha2, whose second component is 2 alpha1 + alpha2 + alpha3, and whose third component is -alpha1 - alpha3. If you vary alpha1, alpha2, and alpha3 over the set of real numbers, you will get different vectors from R^3, and those vectors can be formed by linear combinations of v1, v2, and v3.
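
    A sketch of this linear combination in NumPy (not from the lecture); varying the scalars produces different vectors of R^3, and the components match the formulas just derived.

        import numpy as np

        v1 = np.array([1, 2, -1])
        v2 = np.array([1, 1, 0])
        v3 = np.array([0, 1, -1])

        def combo(a1, a2, a3):
            # alpha1*v1 + alpha2*v2 + alpha3*v3
            return a1 * v1 + a2 * v2 + a3 * v3

        print(combo(1, 2, 3))   # [ 3  7 -4]: (a1+a2, 2*a1+a2+a3, -a1-a3)
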
  • 00:19:05
    The next concept is linearly independent and linearly dependent vectors. A set of vectors, say S = {v1, v2, ..., vn}, is linearly independent if the vector equation alpha1 v1 + alpha2 v2 + ... + alphan vn = 0 holds only when alpha1 = alpha2 = ... = alphan = 0. Here the zero on the right-hand side is the zero vector of the same dimension as v1, v2, ..., vn. So the vectors are linearly independent if this vector equation holds only when the scalars alpha1, alpha2, ..., alphan are all zero. What I want to say is that you cannot write any of the vectors v1, v2, ..., vn in terms of the others: you cannot write v1 in terms of vectors from the subset v2 to vn, and the same is true for the other vectors v2 up to vn. If this is not true, meaning the vector equation equals zero while some or all of the alphas are nonzero, then we say the set of vectors is linearly dependent. For linearly dependent I will use LD in short, whereas for linearly independent I will use LI.
  • 00:22:22
    Let me take some examples. The first example is from R^2: I am taking a set S containing the vectors (1, 0) and (1, 1). Let me take the linear combination alpha1 (1, 0) + alpha2 (1, 1) and set it equal to the zero vector of R^2, that is (0, 0). The first equation gives me alpha1 + alpha2 = 0, and the second equation gives me alpha2 = 0. When alpha2 is zero, you put it into the first equation and get that alpha1 is also zero. So this vector equation holds only when alpha1 and alpha2 are both zero; it means S contains linearly independent vectors.
  • 00:23:34
    vectors on the other hand if I take
  • 00:23:37
    another set s Das which is having Vector
  • 00:23:41
    1
  • 00:23:42
    1 and 3
  • 00:23:46
    3 so here if I
  • 00:23:50
    take uh 3
  • 00:23:53
    * 1
  • 00:23:55
    1 plus orus 3 3 * 1 1 + 3
  • 00:24:02
    3 this comes out to be 0 0 it means if
  • 00:24:06
    you take alpha 1 =
  • 00:24:09
    -3 and Alpha 2 = to 1 then the vector
  • 00:24:14
    equation holds and hence s d
  • 00:24:21
    contains linearly dependent
  • 00:24:25
    vectors believe me we will make lot of
  • 00:24:29
    use of this concept of linearly
  • 00:24:31
    independent and linearly
  • 00:24:33
    dependent in subsequent lectures and in
  • 00:24:37
    machine learning
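
    One programmatic way to test the two sets above, not shown in the lecture but a standard NumPy idiom: stack the vectors as rows of a matrix and compare its rank with the number of vectors; full rank means linearly independent.

        import numpy as np

        def is_linearly_independent(vectors):
            # LI iff the matrix having the vectors as rows has full row rank
            m = np.array(vectors)
            return np.linalg.matrix_rank(m) == len(vectors)

        print(is_linearly_independent([[1, 0], [1, 1]]))   # True  (the set S)
        print(is_linearly_independent([[1, 1], [3, 3]]))   # False (the set S')
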
  • 00:24:39
    Similarly, we can see an example in R^3. Take a set of vectors (1, -1, 0), (1, 0, 1), and (0, 1, 1), each belonging to R^3. What can you say about these vectors? If I take alpha1 = 1, alpha2 = -1, and alpha3 = 1, just check what I get: 1 × (1, -1, 0) + (-1) × (1, 0, 1) + 1 × (0, 1, 1) comes out to be (0, 0, 0). Hence I am getting alpha1 v1 + alpha2 v2 + alpha3 v3 equal to the zero vector while alpha1, alpha2, and alpha3 are nonzero; it means S is an LD set, that is, these vectors are linearly dependent. This means there is a linear relationship between them: calling them v1, v2, v3, I can write v2 = v1 + v3, and you can verify this, since (1, -1, 0) + (0, 1, 1) = (1, 0, 1), which is nothing but v2. If alpha1 v1 + alpha2 v2 + alpha3 v3 = 0 holds only when the alphas are zero, then we say the vectors are LI; one example in R^3 is (1, 0, 0), (0, 1, 0), (0, 0, 1), which is the standard basis of R^3. Similarly, we can extend this concept to R^n.
  • 00:27:46
    Before going to orthogonal vectors, I have some remarks about LI and LD vectors. The first remark: in R^n, a set of more than n vectors is LD. For example, in R^2, a set having three or more vectors is LD; in R^3, a set having four or more vectors is LD. Second: any set of vectors containing the zero vector is LD, and this you can easily see from the definition.
  • 00:29:07
    Now come to the concept of orthogonal vectors. We say that a set of vectors v1, v2, ..., vn is orthogonal, that is, v1, ..., vn are mutually or pairwise orthogonal, if vi · vj = 0 for all i ≠ j. For example, in R^3, take the set of vectors (1, 0, -1), (1, √2, 1), and (1, -√2, 1). Then you can check: (1, 0, -1) · (1, √2, 1) = 0, (1, 0, -1) · (1, -√2, 1) = 0, and the dot product of the second and third vectors, (1, √2, 1) · (1, -√2, 1), is also 0. So they are pairwise orthogonal.
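
    A sketch (not from the lecture) that checks the example set above for pairwise orthogonality: every pair with i ≠ j must have dot product zero, up to floating-point tolerance.

        import numpy as np
        from itertools import combinations

        vectors = [np.array([1.0, 0.0, -1.0]),
                   np.array([1.0, np.sqrt(2), 1.0]),
                   np.array([1.0, -np.sqrt(2), 1.0])]

        def is_orthogonal_set(vs):
            # vi . vj == 0 for all i != j
            return all(np.isclose(np.dot(a, b), 0.0)
                       for a, b in combinations(vs, 2))

        print(is_orthogonal_set(vectors))   # True
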
  • 00:31:19
    We have another concept, orthonormal vectors. A set of orthogonal vectors is orthonormal if each vector in the set has length (magnitude) one. For example, take the vectors (1/√2, 1/√2) and (1/√2, -1/√2) in R^2; you can verify that each of these vectors has length one and that they are orthogonal as well.
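
    Both properties of the orthonormal pair above, unit length and mutual orthogonality, can be verified in a few lines (a sketch, not from the lecture):

        import numpy as np

        u1 = np.array([1, 1]) / np.sqrt(2)    # (1/sqrt(2),  1/sqrt(2))
        u2 = np.array([1, -1]) / np.sqrt(2)   # (1/sqrt(2), -1/sqrt(2))

        print(np.linalg.norm(u1), np.linalg.norm(u2))   # 1.0 1.0 -> unit length
        print(np.dot(u1, u2))                           # 0.0     -> orthogonal
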
  • 00:32:37
    Here is one remark I want to tell you: orthogonality implies linear independence, but the converse is not true. In a later part of this course we will see a process by which we can convert a set of LI vectors into a set of orthogonal vectors.
  • 00:33:22
    Now, an example of feature vectors, meaning how we see vectors in machine learning. Take a very simple data set: I have data on the employees in an office, namely their height and weight. I have employee IDs, let us say E1, E2, ..., Ek, and then I have some numbers: alpha1 and beta1, alpha2 and beta2, and so on up to alphak and betak. For this data set, E1, E2, ..., Ek are observations or samples, and height and weight are features or attributes. Now if I take any row corresponding to a sample, say (alpha20, beta20), it is the feature vector of the 20th employee, E20. In that way, for each data set we make the feature vectors.
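
    The employee table as a data matrix, sketched in NumPy; the numeric values are hypothetical, since the lecture leaves the entries alpha_i, beta_i symbolic. Each row is one observation's feature vector.

        import numpy as np

        # rows = employees E1..E4 (hypothetical values),
        # columns = (height_cm, weight_kg)
        data = np.array([[170.0, 65.0],    # E1
                         [160.0, 55.0],    # E2
                         [175.0, 80.0],    # E3
                         [168.0, 62.0]])   # E4

        print(data[2])   # [175.  80.] -> feature vector of employee E3
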
  • 00:34:45
    Let us see a brief implementation in Python of the concepts we have covered so far. For Python I will use Google Colab; one can use a Jupyter notebook or any other editor as well. To open Google Colab, you can type "Google Colab" into Google. First of all, I will import a very important Python package, NumPy. NumPy is used for all array-type operations in Python; it has a lot of functionality related to multi-dimensional arrays, linear algebra, and much more. How do we define a vector? Vectors are one-dimensional arrays, so I define v as a one-dimensional array with entries (1, -1, 2), and I take another vector w, again with np.array, with entries (2, 5, 2). Both of these vectors v and w are from R^3.
  • 00:36:22
    Now print(v + w) gives you the addition of these two vectors: you can see (3, 4, 4), since 1 + 2 = 3, -1 + 5 = 4, and 2 + 2 = 4. Similarly, you can print v - w; you can see (-1, -6, 0). You can also see scalar multiplication: I print 3 * v, meaning 3 times v, and you can see (3, -3, 6), that is, 3 × 1 = 3, 3 × (-1) = -3, and 3 × 2 = 6.
  • 00:37:26
    I can find the length of a vector; let me find the length of v. I can simply use the command np.linalg.norm, where linalg stands for linear algebra, a sub-package of NumPy, and norm gives the length. Applying it to the vector v, you can see the result is about 2.449, which is nothing but the square root of 6.
  • 00:38:17
    Let me also compute the dot product. You can get it simply with np.dot applied to the vectors v and w; it gives the dot product, which is a scalar, and I assign it to s. Then you print s to see the result: 1 × 2 + (-1) × 5 + 2 × 2, that is 2 - 5 + 4, which is 1.
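
    The Colab snippet the lecturer types, reconstructed from the transcript; the printed values match the ones read out above.

        import numpy as np

        v = np.array([1, -1, 2])      # vectors are one-dimensional arrays
        w = np.array([2, 5, 2])

        print(v + w)                  # [3 4 4]
        print(v - w)                  # [-1 -6  0]
        print(3 * v)                  # [ 3 -3  6]
        print(np.linalg.norm(v))      # 2.449... = sqrt(6)

        s = np.dot(v, w)              # dot product, a scalar
        print(s)                      # 1*2 + (-1)*5 + 2*2 = 1
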
  • 00:38:59
    Similarly, you can explore more operations related to vectors, and we will do so in subsequent lectures. In the next lecture we will take up some basic matrix algebra. With this, let me close the lecture; I hope you have enjoyed it. Thank you very much.
Tags
  • vectors
  • machine learning
  • vector space
  • dot product
  • magnitude
  • linear independence
  • orthogonal vectors
  • Python