Data Structures and Algorithms in 15 Minutes

00:16:19
https://www.youtube.com/watch?v=oz9cEqFynHU

Summary

TL;DR: This video humorously underscores the importance of mastering data structures and algorithms for computer science professionals. It describes how proficiency in these areas can boost one's career by attracting prestigious job offers and raising perceived intelligence, while noting that the journey is fraught with challenges such as imposter syndrome. The video demystifies concepts such as big O notation, binary search trees, and hash maps, offering insights into efficient problem-solving. It also contrasts BFS and DFS and discusses further learning resources, highlighting the creator's preference for breadth-first learning over traditional depth-first educational approaches. For continued learning, it suggests Joma Class for practical lessons and MIT lectures for theoretical knowledge. The content is framed in a light-hearted, engaging manner to appeal to viewers.

Takeaways

  • 💡 Mastering data structures and algorithms enhances career opportunities in tech.
  • 🧠 Understanding big O notation is crucial for evaluating algorithm efficiency.
  • 🔗 Linked lists allow flexible data storage but lack constant-time indexing.
  • 🌳 Binary search trees help maintain ordered data with efficient search capabilities.
  • 🔍 BFS and DFS are essential concepts for traversing tree and graph data structures.
  • 🏷️ Hash maps enable constant-time operations for data retrieval and storage.
  • 📚 Breadth-first learning can offer a more comprehensive understanding than depth-first approaches.
  • 🌐 Free resources, like MIT lectures, provide comprehensive theoretical knowledge.
  • 📺 Joma Class is recommended for practical, concise lessons on data structures and algorithms.
  • 😂 The video uses humor to make complex topics more approachable.

Timeline

  • 00:00:00 - 00:05:00

    The speaker discusses how proficiency in data structures and algorithms boosts one's social and professional value in computer science, but emphasizes challenges such as frustration and imposter syndrome. They suggest learning these subjects not just for prestigious job offers but to improve problem-solving skills and efficiency. The video aims to demystify concepts like algorithm efficiency, using big O notation to explain time and space complexity in simple terms.
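
    As a minimal Python sketch of the two functions the video contrasts (the function names are illustrative, not from the video): summing every number costs one operation per element, O(n), while returning the first number is O(1) regardless of input size.

        def total_sum(numbers):
            # O(n): one addition per element, so work grows linearly with len(numbers)
            total = 0
            for x in numbers:
                total += x
            return total

        def first_element(numbers):
            # O(1): a single operation no matter how long the list is
            return numbers[0]

        print(total_sum(range(10)))      # 10 additions
        print(total_sum(range(20)))      # 20 additions
        print(first_element([5, 2, 9]))  # always one step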

  • 00:05:00 - 00:10:00

    Different data structures like linked lists, binary trees, and heaps are explained. The speaker elaborates on traversal methods such as depth-first and breadth-first search, highlighting how these structures are applied to various computational problems. Trees are compared to graphs, emphasizing their scope and usage in algorithms like shortest-path finding and topological sorting. Binary trees, heaps, stacks, and queues are introduced as foundational structures.
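
    One of those graph algorithms, topological sort, is walked through later in the video with a course-prerequisite example: count each class's incoming edges and repeatedly take classes that have none left. A minimal Python sketch of that procedure (the course names are made up for illustration):

        from collections import deque

        def topological_sort(prereqs, courses):
            # prereqs: list of (prerequisite, course) edges; courses: list of all vertices
            graph = {c: [] for c in courses}
            indegree = {c: 0 for c in courses}
            for before, after in prereqs:
                graph[before].append(after)
                indegree[after] += 1
            # start with every course that has no incoming edges (no prerequisites)
            queue = deque(c for c in courses if indegree[c] == 0)
            order = []
            while queue:
                course = queue.popleft()
                order.append(course)
                for nxt in graph[course]:
                    indegree[nxt] -= 1        # one prerequisite satisfied
                    if indegree[nxt] == 0:
                        queue.append(nxt)
            return order

        # hypothetical schedule: Calc I before Calc II, Intro CS before Algorithms
        print(topological_sort([("Calc I", "Calc II"), ("Intro CS", "Algorithms")],
                               ["Calc I", "Calc II", "Intro CS", "Algorithms"]))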

  • 00:10:00 - 00:16:19

    Graph theory is explored through practical scenarios, like calculating degrees of separation and finding shortest paths using Dijkstra's algorithm, highlighting the versatility of graphs in areas like social networks and maps. The video recommends resources to further understanding, such as the Joma Class for structured learning and free MIT lectures. The speaker encourages a breadth-first approach to learning, acquiring a general overview before deeper exploration.
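
    The video deliberately skips the details of Dijkstra's algorithm, so the following is a standard textbook sketch in Python rather than the video's own implementation. It uses a min-heap (heapq) to always expand the closest known vertex; the example graph mirrors the video's scenario of a direct edge weighing 20 versus a two-edge path weighing 8 each.

        import heapq

        def dijkstra(graph, source):
            # graph: {vertex: [(neighbor, positive edge weight), ...]}
            # returns the smallest total weight from source to every reachable vertex
            dist = {source: 0}
            heap = [(0, source)]               # min-heap keyed on distance so far
            while heap:
                d, node = heapq.heappop(heap)
                if d > dist.get(node, float("inf")):
                    continue                   # stale entry; a shorter path was already found
                for neighbor, weight in graph.get(node, []):
                    nd = d + weight
                    if nd < dist.get(neighbor, float("inf")):
                        dist[neighbor] = nd
                        heapq.heappush(heap, (nd, neighbor))
            return dist

        roads = {"A": [("B", 20), ("C", 8)], "C": [("B", 8)], "B": []}
        print(dijkstra(roads, "A"))            # {'A': 0, 'B': 16, 'C': 8}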

Frequently Asked Questions

  • What is the main topic of the video?

    The video covers data structures and algorithms, explaining their importance in computer science.

  • Why are data structures and algorithms important?

    Mastery of data structures and algorithms can lead to high-paying job offers and stronger problem-solving skills.

  • What is big O notation?

    Big O notation describes an algorithm's time complexity: how its number of operations grows as the input size n grows. The same notation is also used for space complexity.
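
    The video's dictionary-lookup example is a binary search: check the middle element and halve the search window each time, so doubling the input adds only one more step, giving O(log n). A minimal Python sketch (the word list is made up for illustration):

        def binary_search(sorted_words, target):
            # O(log n): each comparison halves the remaining search window
            lo, hi = 0, len(sorted_words) - 1
            while lo <= hi:
                mid = (lo + hi) // 2
                if sorted_words[mid] == target:
                    return mid
                elif sorted_words[mid] < target:
                    lo = mid + 1           # target must be in the right half
                else:
                    hi = mid - 1           # target must be in the left half
            return -1                      # not found

        words = ["ant", "bee", "cat", "dog", "eel", "fox"]
        print(binary_search(words, "dog"))  # 3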

  • What is a linked list?

    A linked list is a data structure consisting of nodes, where each node contains data and a pointer to the next node.
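
    A minimal Python sketch of such a node, where appending means setting the last node's next pointer and traversal follows next pointers until None (names are illustrative, not from the video):

        class Node:
            def __init__(self, data):
                self.data = data
                self.next = None           # points to the next node, or None at the end

        head = Node(1)
        head.next = Node(2)
        head.next.next = Node(3)           # appending: point the last node at the new one

        node = head
        while node is not None:            # traversal: follow next pointers until None
            print(node.data)
            node = node.next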

  • What is a binary search tree?

    A binary search tree is a tree data structure where each node has at most two children, with the left child's value being smaller and the right child's value being larger.
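
    A minimal Python sketch of insertion and search under that rule; both run in O(h), where h is the height of the tree, which is about log n when the tree is balanced (names and sample values are illustrative):

        class TreeNode:
            def __init__(self, value):
                self.value = value
                self.left = None           # smaller values go left
                self.right = None          # greater-or-equal values go right

        def insert(root, value):
            if root is None:
                return TreeNode(value)
            if value < root.value:
                root.left = insert(root.left, value)
            else:
                root.right = insert(root.right, value)
            return root

        def search(root, value):
            # worst case we walk the height of the tree
            if root is None:
                return False
            if value == root.value:
                return True
            if value < root.value:
                return search(root.left, value)
            return search(root.right, value)

        root = None
        for v in [10, 4, 13, 1, 6, 12]:    # illustrative values
            root = insert(root, v)
        print(search(root, 6), search(root, 7))  # True False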

  • What is a hash map?

    A hash map is a data structure that stores key-value pairs and supports retrieving, storing, and deleting data in constant time on average.
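
    In Python the built-in dict is a hash map, as the video notes. A small illustrative sketch (the keys and values are made up); the last line hints at how a key's hash code is reduced modulo the underlying array's length to pick an index:

        ages = {}                          # Python's dict is a hash map under the hood
        ages["alice"] = 30                 # store a key-value pair: O(1) on average
        ages["bob"] = 25
        print(ages["alice"])               # retrieve by key: O(1) on average -> 30
        del ages["bob"]                    # delete by key: O(1) on average
        print("bob" in ages)               # membership check -> False

        # conceptually: hash the key, then take the hash code modulo the underlying
        # array's length to choose the bucket where the value is stored
        print(hash("alice") % 8)           # bucket index for an illustrative 8-slot array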

  • What is the difference between DFS and BFS?

    DFS (Depth-First Search) explores as far down a branch as possible before backtracking, while BFS (Breadth-First Search) explores all neighbors at the current level before moving on to the next level.
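
    A minimal Python sketch of the iterative pattern the video describes: pop a node, visit it, then add its children, where a stack (LIFO) yields DFS and a queue (FIFO) yields BFS. The sample tree is built so the printed orders match the video's example (10 13 4 6 12 8 1 for DFS, 10 13 12 4 6 8 1 for BFS):

        from collections import deque

        class Node:
            def __init__(self, value, left=None, right=None):
                self.value, self.left, self.right = value, left, right

        root = Node(10, Node(13, Node(4), Node(6)), Node(12, Node(8), Node(1)))

        def dfs(root):
            stack = [root]                 # last in, first out
            while stack:
                node = stack.pop()
                print(node.value, end=" ")
                for child in (node.right, node.left):  # push right first so left is visited first
                    if child:
                        stack.append(child)

        def bfs(root):
            queue = deque([root])          # first in, first out
            while queue:
                node = queue.popleft()
                print(node.value, end=" ")
                for child in (node.left, node.right):
                    if child:
                        queue.append(child)

        dfs(root)   # 10 13 4 6 12 8 1
        print()
        bfs(root)   # 10 13 12 4 6 8 1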

  • What does the video say about learning resources?

    The video recommends Joma Class for an accessible, practical learning experience, or free resources such as MIT lectures for more comprehensive, theoretical teaching.

  • What is the creator's attitude towards current educational systems?

    The creator suggests that traditional education dives too deeply into topics one at a time and recommends a breadth-first approach to learning.

  • What is the tone of the video?

    The video maintains a humorous and somewhat irreverent tone throughout.

Transcript

  • 00:00:00
    being good at data structures and
  • 00:00:01
    algorithms is a computer science
  • 00:00:02
    equivalent of having
  • 00:00:04
    i mean everyone assumes you're a genius
  • 00:00:06
    you get high paying offers from
  • 00:00:07
    prestigious companies
  • 00:00:08
    and you even have a high social market
  • 00:00:10
    value on internet forums but the journey
  • 00:00:12
    from lead cell to algo chat is not an
  • 00:00:14
    easy one
  • 00:00:14
    it's filled with frustration and
  • 00:00:16
    imposter syndrome i know i've been there
  • 00:00:18
    but now i have offers from faang which
  • 00:00:20
    basically qualifies me to cure cancer
  • 00:00:22
    so to give back and help even the most
  • 00:00:24
    confusing programmers
  • 00:00:25
    consider this video my cure since this
  • 00:00:27
    industry is cancer
  • 00:00:30
    [Music]
  • 00:00:38
    the most important part of learning
  • 00:00:40
    something new is that you actually want
  • 00:00:41
    to learn it
  • 00:00:42
    this is why i failed out of spanish one
  • 00:00:44
    so before we start
  • 00:00:45
    ask yourself why do you want to learn
  • 00:00:47
    this don't do it to get a job at google
  • 00:00:49
    do it because learning this changes the
  • 00:00:51
    way you think data structures and
  • 00:00:52
    algorithms
  • 00:00:53
    is about solving problems efficiently a
  • 00:00:55
    bad programmer solves the problem
  • 00:00:57
    inefficiently and a
  • 00:00:58
    really bad programmer doesn't even know
  • 00:01:00
    why their solution is inefficient so
  • 00:01:01
    let's start with that
  • 00:01:02
    how do we rank an algorithm's efficiency
  • 00:01:04
    [Music]
  • 00:01:06
    say we wrote a function that goes
  • 00:01:08
    through every number in a list and adds
  • 00:01:09
    it to a total sum variable if you
  • 00:01:11
    consider
  • 00:01:12
    adding to be one operation then running
  • 00:01:13
    this function on a list with 10 numbers
  • 00:01:15
    costs 10 operations running it on a list
  • 00:01:17
    with 20 numbers costs 20 operations
  • 00:01:20
    say we wrote another function that just
  • 00:01:21
    returns the first number in a list then
  • 00:01:23
    no matter how large the list is this
  • 00:01:25
    function will never cost more than one
  • 00:01:27
    operation
  • 00:01:27
    clearly these two algorithms have a
  • 00:01:29
    different time complexity or
  • 00:01:30
    relationship between growth of input
  • 00:01:32
    size and growth of operations we
  • 00:01:34
    communicate these time complexities
  • 00:01:35
    using big
  • 00:01:36
    o notation referring to input size as n
  • 00:01:38
    common complexities are
  • 00:01:40
    constant linear quadratic
  • 00:01:43
    logarithmic n log n exponential
  • 00:01:47
    and factorial our first algorithm runs
  • 00:01:49
    in o of n
  • 00:01:50
    meaning its operations grow in a linear
  • 00:01:52
    relationship with the input size which
  • 00:01:54
    in this case is the amount of numbers in
  • 00:01:55
    the list our second algorithm
  • 00:01:57
    is not dependent on the input size at
  • 00:01:59
    all so it runs in constant time let's
  • 00:02:00
    take a look at how many operations a
  • 00:02:02
    program will have to execute in a
  • 00:02:03
    function with an input size of 5
  • 00:02:06
    versus 50. it might not matter when the
  • 00:02:08
    input is small but this gap gets very
  • 00:02:10
    dramatic as the input size increases if
  • 00:02:12
    n were 10,000
  • 00:02:13
    a function that runs in log of n
  • 00:02:15
    would only take 14 operations while the
  • 00:02:17
    function that runs in n factorial
  • 00:02:19
    would set your computer on fire for big
  • 00:02:21
    o notation we drop constants so
  • 00:02:23
    o of 10 times n and o of n over 10 are
  • 00:02:26
    both equivalent to o of n because the
  • 00:02:28
    graph is still linear and big o notation
  • 00:02:30
    is also used for space complexity which
  • 00:02:32
    works the same way for how much space an
  • 00:02:34
    algorithm uses as n grows
  • 00:02:36
    unless you sit in your room doing
  • 00:02:38
    algorithms all day there's a good chance
  • 00:02:39
    you forgot what logarithms are
  • 00:02:41
    logarithms are the inverse to
  • 00:02:42
    exponential functions let's say i wanted
  • 00:02:44
    to find a word in a dictionary
  • 00:02:46
    one method is to check every word one by
  • 00:02:48
    one this is o of
  • 00:02:49
    n but nobody does that so how are humans
  • 00:02:52
    able to find
  • 00:02:53
    any word in a dictionary with a hundred
  • 00:02:55
    thousand words in seconds we do
  • 00:02:57
    something more along method two
  • 00:02:59
    cut the search window in half each time
  • 00:03:01
    by checking the middle element
  • 00:03:02
    if we passed our word search on the left
  • 00:03:04
    half otherwise search on the right half
  • 00:03:06
    and then we repeat this until we find
  • 00:03:08
    our word
  • 00:03:08
    if our input n doubled in size that
  • 00:03:10
    would only grow our operations by one
  • 00:03:13
    in computer science this is the binary
  • 00:03:15
    search but for the binary search to work
  • 00:03:17
    the collection has to be sorted which
  • 00:03:19
    opens up a huge field of computer
  • 00:03:21
    science
  • 00:03:21
    sorting algorithms a very basic sort is
  • 00:03:24
    a selection sort you have one pointer at
  • 00:03:25
    the start of the list and another
  • 00:03:27
    pointer that does a linear scan to find the
  • 00:03:28
    next minimum element
  • 00:03:30
    then you swap those elements and
  • 00:03:31
    increment the pointer since you're doing
  • 00:03:32
    a linear scan for every element in the
  • 00:03:34
    collection
  • 00:03:35
    this runs in o of n squared a more
  • 00:03:37
    efficient algorithm is the merge sort a
  • 00:03:39
    merge sort splits a collection in
  • 00:03:40
    half into sub-collections until those
  • 00:03:42
    sub-collections can be sorted in
  • 00:03:44
    constant time and then it works its way
  • 00:03:46
    back up by merging sorted subcollections
  • 00:03:48
    until the entire collection is sorted
  • 00:03:49
    since it's splitting in half there will
  • 00:03:51
    be log n splits and thereby
  • 00:03:53
    log n merges it is o of n to merge two
  • 00:03:56
    sorted collections into one sorted
  • 00:03:57
    collection
  • 00:03:58
    since we're doing that log n times the
  • 00:04:00
    time complexity is o of
  • 00:04:01
    n times log n no algorithm can sort an
  • 00:04:04
    arbitrary collection in a better time
  • 00:04:06
    complexity than that so we consider
  • 00:04:08
    n log n to be the cost of sorting
  • 00:04:11
    perhaps the most fundamental data
  • 00:04:12
    structure is an array an array is an
  • 00:04:14
    indexable contiguous chunk of memory
  • 00:04:16
    arrays are fixed size you can't insert a
  • 00:04:18
    ninth element into an array meant for
  • 00:04:20
    eight elements a list data structure
  • 00:04:21
    with a flexible size is a linked list
  • 00:04:23
    the idea is to package your data into
  • 00:04:25
    nodes that hold one
  • 00:04:26
    your data and two a pointer that points to
  • 00:04:29
    the next node
  • 00:04:29
    traversing a linked list reminds me of
  • 00:04:31
    those youtube direction chains
  • 00:04:32
    like you see a comment that says read my
  • 00:04:35
    profile picture
  • 00:04:36
    and their profile picture says click on
  • 00:04:38
    my profile which brings you to their
  • 00:04:39
    header which says
  • 00:04:40
    read my bio so you go to their bio and
  • 00:04:43
    it says click the link below and you
  • 00:04:44
    click it and then it installs a virus
  • 00:04:46
    onto your browser
  • 00:04:48
    if a node's next pointer points to null
  • 00:04:50
    it's the end of the list you can add a
  • 00:04:52
    new node to the list in constant time by
  • 00:04:54
    just setting that pointer to the new
  • 00:04:55
    node
  • 00:04:56
    by doing additional tom [ __ ] with
  • 00:04:57
    pointers we can also insert and delete
  • 00:04:59
    nodes in constant time
  • 00:05:00
    sometimes nodes also contain a pointer
  • 00:05:02
    to the previous node in a variation
  • 00:05:04
    called the doubly linked list but a
  • 00:05:05
    major downside of linked list is that
  • 00:05:07
    you can't
  • 00:05:07
    access elements in constant time via
  • 00:05:09
    index like an array in practice most
  • 00:05:11
    programming languages have dynamically
  • 00:05:13
    sized arrays or lists which work by
  • 00:05:15
    creating arrays with a fixed size
  • 00:05:17
    if the array is full capacity and you
  • 00:05:18
    try to add a new element
  • 00:05:20
    the programming language automatically
  • 00:05:21
    creates a new array with double the
  • 00:05:23
    capacity
  • 00:05:24
    copies the current elements over and
  • 00:05:25
    sets your pointer to that new array
  • 00:05:27
    since this happens so infrequently we
  • 00:05:29
    generally consider appending to a list
  • 00:05:31
    to be a constant time operation
  • 00:05:34
    linked lists aren't the only data
  • 00:05:35
    structure that include nodes referring
  • 00:05:37
    to other nodes what if instead of
  • 00:05:38
    pointing to a next node nodes pointed to
  • 00:05:40
    a left and a right child node this is
  • 00:05:42
    called a binary tree
  • 00:05:43
    these child nodes can also be called
  • 00:05:45
    subtrees since trees are recursive in
  • 00:05:47
    nature
  • 00:05:47
    a node with no children is called a leaf
  • 00:05:49
    node while a node with seven children is
  • 00:05:50
    my
  • 00:05:51
    father-in-law a common type of tree is
  • 00:05:53
    the binary search tree which follows
  • 00:05:55
    these rules
  • 00:05:55
    basically a node's left child must have
  • 00:05:57
    a smaller value while a node's right
  • 00:05:59
    child must have a greater or equal value
  • 00:06:02
    let's say you're looking for element x
  • 00:06:04
    you start at the root and if the number
  • 00:06:05
    is smaller than the root you go to the
  • 00:06:07
    left tree if it's larger you go to the
  • 00:06:09
    right and you keep repeating this until
  • 00:06:10
    you arrive at your number
  • 00:06:12
    in the worst case you have to traverse
  • 00:06:13
    the height of the tree so the time
  • 00:06:15
    complexity to search
  • 00:06:16
    insert and delete elements is o of h where
  • 00:06:19
    h is the height
  • 00:06:20
    this is efficient because on average the
  • 00:06:21
    height will be log n
  • 00:06:23
    but what if you inserted every element
  • 00:06:25
    of a sorted array into an empty binary
  • 00:06:27
    search tree
  • 00:06:28
    well that's a linked list meaning the
  • 00:06:31
    height would be
  • 00:06:31
    n but we can guarantee log n time
  • 00:06:34
    complexities using a self-balancing
  • 00:06:36
    binary search tree such as red black
  • 00:06:38
    trees which maintain
  • 00:06:39
    additional properties to guarantee that
  • 00:06:41
    the height of the tree is o of log n
  • 00:06:43
    binary search trees are kind of
  • 00:06:44
    considered to be a workhorse
  • 00:06:45
    data structure that can solve most
  • 00:06:47
    problems with decent efficiency but i
  • 00:06:48
    found situations where binary search
  • 00:06:50
    trees
  • 00:06:51
    really shine are when you're asked a
  • 00:06:53
    question about binary search trees
  • 00:06:56
    another type of tree is a heap the
  • 00:06:59
    primary difference between this and the
  • 00:07:00
    binary search tree is
  • 00:07:02
    actually use this one it's also
  • 00:07:03
    sometimes called a priority queue
  • 00:07:04
    because at the root of it is always the
  • 00:07:06
    highest priority element
  • 00:07:08
    a min heap will have the min element
  • 00:07:09
    and the max heap will have the max
  • 00:07:11
    element at the root but don't get me
  • 00:07:12
    wrong the rest of the heap is unsorted
  • 00:07:14
    it is an absolute wasteland down there
  • 00:07:16
    searching in the rest of the heap
  • 00:07:18
    may as well be an unsorted array we only
  • 00:07:20
    care about what's at the top
  • 00:07:23
    just like capitalism to insert a new
  • 00:07:25
    element into a heap
  • 00:07:26
    first we find the next available spot to
  • 00:07:28
    add a leaf node then we compare it with
  • 00:07:29
    its parent and if it's a higher
  • 00:07:30
    priority it swaps and bubbles its way up
  • 00:07:32
    we're doing at most
  • 00:07:34
    log n comparisons you can build a heap
  • 00:07:36
    from a random collection by inserting n
  • 00:07:38
    times which cost
  • 00:07:39
    o of n log n but there's also a way to
  • 00:07:41
    build a heap in o of n time
  • 00:07:43
    i'd suggest reading this wikipedia page
  • 00:07:45
    because it's super interesting stuff and
  • 00:07:47
    lastly
  • 00:07:47
    since the heap is nearly complete
  • 00:07:49
    although we can visualize it as a tree
  • 00:07:51
    we could actually use these properties
  • 00:07:52
    to represent it with an
  • 00:07:53
    array one way to traverse a tree is a
  • 00:07:57
    depth first search
  • 00:07:58
    which is like going deep fully exploring
  • 00:08:00
    one path and then moving on to the next
  • 00:08:02
    one way to visit this tree in a depth
  • 00:08:04
    first order could be to start at 10
  • 00:08:06
    go to 13 go to 4 then 6 then 12
  • 00:08:09
    then 8 and then 1 whereas another option
  • 00:08:12
    is a breadth first search where we view
  • 00:08:14
    it more
  • 00:08:14
    level by level a way to print this same
  • 00:08:16
    tree in a bfs manner could be
  • 00:08:18
    10 13 12 4 6
  • 00:08:22
    8 1. when it comes to implementation
  • 00:08:24
    depth first search can use a stack a
  • 00:08:26
    stack is a list-based data structure
  • 00:08:28
    where the only two operations
  • 00:08:29
    are add to the end and pop from the end
  • 00:08:32
    this makes a stack
  • 00:08:32
    lifo last in first out dfs are often
  • 00:08:36
    done using recursion
  • 00:08:37
    which indirectly uses a stack the
  • 00:08:39
    recursive stack so if your recursion
  • 00:08:41
    exceeds a certain amount of depth then
  • 00:08:43
    you get a stack overflow let's look at a
  • 00:08:45
    way to print this tree in a dfs manner
  • 00:08:47
    using a stack so first initialize the
  • 00:08:48
    stack and add the root
  • 00:08:50
    then while there are still elements in
  • 00:08:52
    the stack pop from the stack print that
  • 00:08:54
    element and then add its children to the
  • 00:08:56
    stack on the contrary a breadth first
  • 00:08:58
    search uses a queue
  • 00:08:59
    a queue is fifo first in first out so
  • 00:09:03
    instead of popping from the end of the
  • 00:09:04
    list
  • 00:09:05
    you pop from the beginning doing the
  • 00:09:07
    same algorithm as before but
  • 00:09:08
    using a queue instead of a stack the tree
  • 00:09:10
    will now print in a bfs
  • 00:09:12
    order all trees are graphs but not all
  • 00:09:16
    graphs are trees a graph is a collection
  • 00:09:18
    of vertices which are like points
  • 00:09:20
    and edges which are like connections
  • 00:09:22
    from one vertex to another vertex a
  • 00:09:24
    graph can be directed meaning edges can
  • 00:09:26
    only go one way
  • 00:09:27
    so this edge means you can only go from
  • 00:09:28
    a to b or undirected meaning you can
  • 00:09:31
    also go from b to a
  • 00:09:32
    two common ways to model edges are
  • 00:09:34
    adjacency lists and adjacency matrices
  • 00:09:37
    graphs are my favorite part of
  • 00:09:38
    data structures and algorithms so i'd like to
  • 00:09:40
    show how cool they are with three
  • 00:09:41
    example scenarios on where you can use
  • 00:09:43
    them
  • 00:09:45
    so it recently dawned on me that i know
  • 00:09:47
    kanye west
  • 00:09:48
    and by that i mean i know joma tech but
  • 00:09:50
    joma tech knows jarvis johnson who knows
  • 00:09:53
    mkbhd who knows elon musk
  • 00:09:55
    who knows kanye west which is five
  • 00:09:57
    degrees of separation between me and
  • 00:09:58
    kanye west i wanted to find something
  • 00:10:00
    shorter
  • 00:10:01
    well turns out i know bretman rock who
  • 00:10:03
    knows rihanna who knows kanye three
  • 00:10:05
    degrees of separation
  • 00:10:06
    which also means that anyone who knows
  • 00:10:08
    me is at most
  • 00:10:09
    four degrees of separation away from
  • 00:10:11
    kanye if we wanted to make a program
  • 00:10:12
    that computes the smallest degrees of
  • 00:10:14
    separation between two people
  • 00:10:16
    a graph would be the perfect choice you
  • 00:10:18
    model people as vertices and you model
  • 00:10:20
    relationships as edges the shortest path
  • 00:10:22
    between the source node
  • 00:10:23
    me and the target node kanye can be
  • 00:10:26
    found with a breadth first search
  • 00:10:27
    since we're moving out level by level we
  • 00:10:29
    can guarantee that the first time
  • 00:10:31
    reaching the kanye node will also be the
  • 00:10:33
    shortest path
  • 00:10:34
    the problem with implementing this
  • 00:10:35
    algorithm the way i just described is
  • 00:10:36
    that
  • 00:10:37
    nobody has a concrete list of all the
  • 00:10:39
    people they know nor would that be
  • 00:10:41
    publicly accessible by me but a possible
  • 00:10:43
    variation could be in a social network
  • 00:10:45
    like facebook where we can use the
  • 00:10:46
    friend list as edges
  • 00:10:48
    we would then be able to run this
  • 00:10:49
    algorithm and find the smallest degrees
  • 00:10:51
    of separation between
  • 00:10:52
    any two users in that social network
  • 00:10:55
    imagine a map system like google maps
  • 00:10:57
    that wants to find the shortest path
  • 00:10:58
    between two locations
  • 00:11:00
    this is different than the last example
  • 00:11:01
    because although they both deal with
  • 00:11:03
    shortest paths the degrees of
  • 00:11:04
    separations did not have
  • 00:11:05
    weighted edges where a map system does
  • 00:11:07
    for example if we're computing the
  • 00:11:08
    shortest path between two vertices and
  • 00:11:10
    there's a direct edge between them
  • 00:11:12
    weighing 20 versus a path of two edges
  • 00:11:14
    weighing eight
  • 00:11:14
    each a breadth first search would give us
  • 00:11:16
    the shortest path in terms of the
  • 00:11:17
    vertices away but not the smallest
  • 00:11:19
    weight one algorithm that computes the
  • 00:11:21
    shortest path in a graph with positive
  • 00:11:22
    weighted edges is dijkstra's algorithm
  • 00:11:24
    using the heap data structure in the
  • 00:11:26
    original version of this video
  • 00:11:28
    i explained dijkstra's and my video shot
  • 00:11:29
    up 6 minutes so instead just research it
  • 00:11:32
    for yourself if you're interested
  • 00:11:34
    but graphs aren't just good for path
  • 00:11:36
    finding imagine a course schedule in
  • 00:11:37
    school with classes
  • 00:11:38
    and prerequisites for each class and say
  • 00:11:40
    you wanted to find the order in which
  • 00:11:42
    you can take all your classes while
  • 00:11:43
    still fulfilling your prerequisites well
  • 00:11:45
    model the classes as vertices and
  • 00:11:47
    prerequisites as edges count how many
  • 00:11:49
    incoming edges each vertex has add
  • 00:11:51
    vertices with zero incoming edges to a
  • 00:11:52
    queue then pop and print the element
  • 00:11:54
    from the queue and decrement the
  • 00:11:55
    incoming edges of all its children
  • 00:11:57
    if a child now has zero incoming edges
  • 00:11:59
    add it to the queue and repeat while
  • 00:12:00
    there are still elements in the queue
  • 00:12:02
    this algorithm is known as a topological
  • 00:12:04
    sort
  • 00:12:05
    [Music]
  • 00:12:07
    hash maps are sort of the holy grail of
  • 00:12:08
    data structures with basically constant
  • 00:12:10
    time retrieval of data
  • 00:12:12
    the saying goes that if you don't know
  • 00:12:13
    what you're doing just try throwing
  • 00:12:15
    hash maps at the question
  • 00:12:16
    as an example one time i was in a coding
  • 00:12:17
    interview and froze so
  • 00:12:19
    i just told the interviewer hmm i think
  • 00:12:22
    the solution to use a hash map
  • 00:12:23
    unfortunately the question was what are
  • 00:12:25
    your biggest weaknesses
  • 00:12:27
    so the better answer was hang up the
  • 00:12:28
    phone to show that i have none a hashmap
  • 00:12:30
    is a data structure built on top of an
  • 00:12:32
    array optimized to store
  • 00:12:33
    key value pairs what makes it so
  • 00:12:35
    powerful is you can retrieve delete and
  • 00:12:37
    store data
  • 00:12:38
    in on average constant time here we have
  • 00:12:40
    a map where keys are strings
  • 00:12:41
    representing people's names and values
  • 00:12:43
    are their corresponding ages
  • 00:12:45
    we can directly access tren black's age
  • 00:12:47
    like this it's almost as if strings can
  • 00:12:49
    be used as indices
  • 00:12:50
    but how is that even possible because of
  • 00:12:53
    this little thing called a hash function
  • 00:12:55
    a hash function will take a key and
  • 00:12:56
    return a hash code if we take that
  • 00:12:58
    number modulus the length of its
  • 00:12:59
    underlying array we can use that as an
  • 00:13:01
    index to store its value but the hash
  • 00:13:03
    function has to compute the same hash
  • 00:13:05
    code if
  • 00:13:06
    given the same key so hash of tren
  • 00:13:07
    black should always return the same hash
  • 00:13:09
    code or else we lose the index but what
  • 00:13:11
    if hash of tren black and hash of ziz
  • 00:13:13
    both end with the same index well this
  • 00:13:15
    is called collision
  • 00:13:17
    one way to deal with this is to store
  • 00:13:18
    our value in a linked list
  • 00:13:20
    so when we go to store ziz and see that
  • 00:13:21
    index three already has a value we have
  • 00:13:24
    that value
  • 00:13:24
    point to the new value this is why a
  • 00:13:26
    hash map is not strictly o of one because
  • 00:13:29
    if you write some
  • 00:13:30
    god awful hash function then it won't be
  • 00:13:32
    balanced and we will have to do a lot of
  • 00:13:34
    linked list traversing good hash functions
  • 00:13:36
    evenly spread out hash codes
  • 00:13:37
    in practice modern languages use very
  • 00:13:39
    good hash functions so you don't have to
  • 00:13:41
    write your own
  • 00:13:41
    an example of a hashmap is the
  • 00:13:43
    dictionary in python or the object in
  • 00:13:45
    javascript and remember
  • 00:13:46
    constant lookup of data is very
  • 00:13:48
    overpowered so try to use it when you
  • 00:13:50
    can
  • 00:13:50
    [Music]
  • 00:13:52
    and that's pretty much all you need to
  • 00:13:53
    know for data structures and algorithms
  • 00:13:55
    a six month college course taught in 13
  • 00:13:58
    minutes now of course you didn't learn
  • 00:13:59
    anything in great detail but trust me it
  • 00:14:01
    is a lot easier to learn something in
  • 00:14:03
    great detail if you already learned the
  • 00:14:05
    big picture first
  • 00:14:06
    this is why college sucks at teaching it
  • 00:14:08
    dives very deep into one topic and then
  • 00:14:10
    moves on to the next
  • 00:14:11
    it's like a depth first search i believe
  • 00:14:14
    the better way of learning is to get a
  • 00:14:15
    general understanding and then build on
  • 00:14:17
    it
  • 00:14:17
    like a breadth first search so if you
  • 00:14:19
    want to keep going what are my
  • 00:14:20
    recommendations to build on your
  • 00:14:22
    knowledge well the first one
  • 00:14:23
    is joma class you know those really nice
  • 00:14:25
    animations in this video like this one
  • 00:14:28
    yeah i got them from joma class if you
  • 00:14:30
    don't care about getting extremely
  • 00:14:31
    theoretical and just want simple
  • 00:14:33
    to the point everything you need to know
  • 00:14:35
    and nothing you don't type lessons then
  • 00:14:37
    joma class makes this difficult topic
  • 00:14:39
    very accessible but i will admit that i
  • 00:14:41
    roasted the
  • 00:14:42
    [ __ ] out of joma a while back because
  • 00:14:44
    he was pushing some
  • 00:14:46
    actual snake oil well unlike some people
  • 00:14:48
    he took my criticism very well and to
  • 00:14:50
    vindicate his name
  • 00:14:52
    he made something that is actually
  • 00:14:54
    valuable and i will defend that
  • 00:14:55
    he put much more effort into these
  • 00:14:57
    explanations dropped the price from a
  • 00:14:59
    thousand dollars to eight dollars a
  • 00:15:01
    month and most importantly he dropped
  • 00:15:03
    the
  • 00:15:05
    dead weight if you're a complete
  • 00:15:06
    beginner it also comes with a course on
  • 00:15:08
    how to code in python and of course on
  • 00:15:10
    sql if you're a bit more advanced but i
  • 00:15:12
    gotta say the best part is the community
  • 00:15:14
    aspect
  • 00:15:14
    each lesson has a whole community behind
  • 00:15:16
    it where people ask questions
  • 00:15:18
    discuss topics and just learn together
  • 00:15:20
    well i like the course so much that i
  • 00:15:22
    actually called joma
  • 00:15:24
    and i was like yo what the [ __ ] dude
  • 00:15:27
    this is actually good i'm gonna start
  • 00:15:29
    recommending this to people
  • 00:15:30
    and he was so excited to hear that i
  • 00:15:32
    liked it that he was actually willing to
  • 00:15:34
    offer my audience a discount
  • 00:15:35
    so if you go to trend.jomoclass.com
  • 00:15:38
    then you'll get 15% off but even though
  • 00:15:41
    paying for stuff increases the chance
  • 00:15:43
    that you'll actually complete it you
  • 00:15:44
    don't
  • 00:15:44
    have to spend money not everyone has
  • 00:15:46
    eight dollars a month to drop and i want
  • 00:15:48
    to assure you that you can still learn
  • 00:15:50
    everything for free
  • 00:15:51
    so it might take a bit more
  • 00:15:52
    self-discipline but you can do it so
  • 00:15:54
    don't listen to anyone who says you
  • 00:15:56
    have to pay so if you do want to go down
  • 00:15:58
    the free route i'd recommend these
  • 00:16:00
    college lectures
  • 00:16:01
    they're literally taught by mit
  • 00:16:02
    professors and posted for free online
  • 00:16:04
    they're pretty theoretical but very
  • 00:16:06
    comprehensive and it's perfect if you
  • 00:16:08
    like that old-school
  • 00:16:09
    chalkboard setting so yeah good luck
  • 00:16:11
    guys and go get that guck 3000
Tags
  • data structures
  • algorithms
  • computer science
  • big O notation
  • linked list
  • binary search tree
  • hash map
  • DFS
  • BFS
  • learning resources