Applying AI to the SDLC: New Ideas and Gotchas! - Leveraging AI to Improve Software Engineering

00:50:02
https://www.youtube.com/watch?v=67uxiGfCWzY

Summary

TL;DR: The presentation explores the parallels between the evolution of navigation technology and today's developments in generative AI within the realm of software engineering. The speaker, Tracy Bannon of the MITRE Corporation, highlights how technology, specifically AI, has become ubiquitous, and yet cautions against getting swept up in the hype around it. Bannon suggests that we are at the peak of inflated expectations in the AI hype cycle, where generative AI, while holding groundbreaking potential, comes with limitations and challenges, primarily in terms of security, trust, and integration with existing systems. The discussion delves into the phases of AI development, how it is reshaping software engineering practices, and the need for rigorous testing and human oversight in AI processes. Concerns over data silos, the need for platform engineering, and changes in team dynamics with AI agents are addressed. Best practices are encouraged, like version control of prompts, testing independence, and avoiding data exposure. The session concludes with a call for organizations to thoughtfully integrate AI by understanding and preparing for its impacts on software engineering.

Key takeaways

  • 📱 QR codes simplify downloading materials.
  • 📍 Technology evolution from maps to AI in navigation.
  • 🔍 Current AI technology mirrors digital transition in navigation.
  • ⚠️ Peak of inflated AI expectations noted by the speaker.
  • 👥 Emphasis on keeping humans in the loop with AI.
  • 🛡️ Concerns about AI's traceability and security.
  • 🧩 Generative AI is a part of larger AI/ML frameworks.
  • 📉 Identification of potential reduced collaboration in software teams.
  • 🔧 Best practices include testing independence and security measures.
  • 🚀 Encouragement of strategic AI pilot initiatives for organizations.

Timeline

  • 00:00:00 - 00:05:00

    The speaker begins by discussing the ubiquitous nature of navigation technology, comparing past experiences of using maps to the current use of digital navigation systems like GPS. They draw a parallel between this technological evolution and the current state of AI, emphasizing the importance of AI in software engineering, particularly with the transition to generative AI.

  • 00:05:00 - 00:10:00

    The speaker comments on the current hype cycle of AI, noting that AI technologies are at their peak of inflated expectations. They discuss the complexities of software engineering, emphasizing the need for understanding different AI types beyond generative AI as tools within the software development lifecycle.

  • 00:10:00 - 00:15:00

    They highlight that generative AI is non-deterministic, which comes with potential and limitations. The speaker suggests treating AI like a young apprentice, capable but requiring oversight. They also reference the need for humans in the loop, especially to ensure security, traceability, and auditability in software development processes.

  • 00:15:00 - 00:20:00

    The speaker discusses a survey finding about developers using AI, emphasizing a potential reduction in collaboration due to the use of generative AI tools. They recount a personal story about leveraging AI for requirements analysis, underlining the importance of diverse data sets and rigorous testing with humans in the loop.

  • 00:20:00 - 00:25:00

    Emphasizing potential privacy issues and hallucinations in AI-generated tests, the speaker warns about the use of AI in code generation, noting the shift from 'code generation' to 'code completion' due to reliability issues. There's a need for independence between generated code and tests to maintain quality.

  • 00:25:00 - 00:30:00

    They mention studies showing AI-generated code frequently contains security vulnerabilities and highlight the importance of thorough code reviews. Recognizing AI's groundbreaking potential, the speaker insists on having rigorous testing and maintaining human oversight.

  • 00:30:00 - 00:35:00

    The speaker stresses the importance of infrastructure readiness before embedding AI into workflows, recommending continuous integration practices. They discuss the balance between perceived and actual productivity gains with AI, advocating for team-based productivity measurement.

  • 00:35:00 - 00:40:00

Recommended organizational strategies for adopting AI include setting clear governance and performing needs assessments. They stress creating a focused pilot to identify the skills required and the challenges faced during AI integration, and advocate dedicating thought-leadership time so organizations stay abreast of rapid AI advancements.

  • 00:40:00 - 00:50:02

    Looking ahead, they foresee increased data silos and slower workflow initially due to AI integration. They suggest focusing on platform engineering to minimize errors and highlight the shift from AI as a tool to AI functioning as integrated agents in the software development lifecycle.



Frequently asked questions

  • What is the main topic of the video?

    The video discusses the evolution of technology, focusing on the transition to generative AI in software engineering.

  • How does the speaker describe the current stage of AI development?

    The speaker suggests we are at the peak of inflated expectations in the development of generative AI.

  • What does the speaker compare generative AI to?

    The speaker compares generative AI to a young apprentice who is promising yet requires supervision and guidance.

  • What are some of the concerns raised about generative AI in the video?

    Concerns include traceability, auditability, reproducibility, security, and reduced collaboration in software engineering.

  • How does the speaker propose organizations should engage with generative AI?

    By implementing pilots to assess needs, ensuring diverse data sets, maintaining rigorous testing, and always keeping humans in the loop.

  • What changes does the speaker predict will occur with the use of generative AI in software development?

    The speaker predicts more data silos, increased platform engineering needs, and a shift in the structure of teams to incorporate AI agents.

  • What are some best practices mentioned for using generative AI in software development?

Best practices include keeping code generation and test generation independent, performing thorough code reviews, and addressing security vulnerabilities.

  • Does the speaker view current generative AI technology positively or negatively?

    The speaker recognizes both the groundbreaking potential and the current limitations and challenges of generative AI.

  • What is the importance of trust in AI according to the speaker?

    Trust is crucial in AI, especially given its non-deterministic nature, requiring more oversight and understanding of its outputs.

  • What is suggested to avoid while using generative AI tools?

    It's suggested to avoid relying solely on AI-generated content without human oversight and not to expose sensitive information to public models.

Subtitles (English)
  • 00:00:03
    [Music]
  • 00:00:12
    good morning everybody thank you for
  • 00:00:14
    coming out this morning Chris thank you
  • 00:00:16
for the warm welcome if you
  • 00:00:18
    could all do me a favor and take out
  • 00:00:20
    your smartphones and point them at the
  • 00:00:22
    QR code and that'll take you to a
  • 00:00:24
    download of today's materials that way
  • 00:00:27
    if you'd like to during the course of
  • 00:00:28
our chat you don't need to take as many
  • 00:00:33
    photos now I got into town over the
  • 00:00:35
    weekend I got in this is my first time
  • 00:00:37
    in London and I've been navigating the
  • 00:00:40
    city and I've been having a fantastic
  • 00:00:42
    time and it really got me thinking about
  • 00:00:44
    something it got me thinking about the
  • 00:00:47
    fact that I could use my phone to get
  • 00:00:49
    anywhere I needed to
  • 00:00:53
    go and it got me to think about how
  • 00:00:55
    ubiquitous it is that we can navigate
  • 00:00:58
    easily anywhere we want to go it's built
  • 00:01:01
into our cars I ride a road bicycle and I have
  • 00:01:04
a computer on my road bike and I always
  • 00:01:06
    know where I am you can buy a little
  • 00:01:08
    chip now and you can sew it into the
  • 00:01:09
    back of your children's sweatshirts and
  • 00:01:12
    things and always know where they're at
  • 00:01:14
    so it's really ubiquitous but it didn't
  • 00:01:16
    start out that
  • 00:01:18
    way when I learned to drive I learned to
  • 00:01:21
    drive with a map and as a matter of fact
  • 00:01:25
    I was graded on how
  • 00:01:27
    well I could refold
  • 00:01:31
    the map obviously a skill that I haven't
  • 00:01:34
    worried about since
  • 00:01:36
    then but I was also driving during the
  • 00:01:39
    digital transition when all of that
  • 00:01:41
    amazing cartography information was
  • 00:01:44
    digitized and somebody realized we can
  • 00:01:46
    put a front end on this and we can ask
  • 00:01:48
    people where they're starting where
  • 00:01:50
    they're going and then we can give them
  • 00:01:53
    step by step place to go but they still
  • 00:01:55
    had to print it out and if you happen to
  • 00:01:58
    be the first person who is in the
  • 00:02:00
    passenger seat you got to be the voice
  • 00:02:03
in 100 m take a left onto the ramp onto the
  • 00:02:09
    M4 and it wasn't long until we had
  • 00:02:12
    special Hardware now we had a Garmin or
  • 00:02:16
    we had a
  • 00:02:17
    TomTom and it was mixing the cartography
  • 00:02:20
information it was mixing the voice
  • 00:02:23
    aspect and it was mixing that Hardware
  • 00:02:24
together and it was fantastic now when my
  • 00:02:26
    children started to drive they started
  • 00:02:28
with a TomTom but I made them learn
  • 00:02:30
    to read a map because if you can see
  • 00:02:32
    what it says there the signal was
  • 00:02:35
    lost but now it's everywhere it is
  • 00:02:38
    ubiquitous for us in 2008 the iPhone was
  • 00:02:42
    released the iPhone 3G and it had that
  • 00:02:46
    sensor in it and now everywhere that we
  • 00:02:48
    went we have the ability to tell where
  • 00:02:51
    we are we can track our packages we can
  • 00:02:53
    track when the car is coming to pick us
  • 00:02:55
    up we can track all sorts of different
  • 00:02:56
    things we've just begun to expect that
  • 00:02:59
    what does that have to do with AI with
  • 00:03:03
    software
  • 00:03:04
    engineering that's because I believe
  • 00:03:06
    that this is where we're at right now I
  • 00:03:09
    think we're at the digital transition
  • 00:03:10
    when it comes specifically to generative
  • 00:03:12
AI and leveraging that to help us to
  • 00:03:15
    build
  • 00:03:17
software so yes my name is Tracy Bannon and
  • 00:03:20
I go by Trace I like word clouds and I
  • 00:03:23
    am a software architect I am a
  • 00:03:25
    researcher now and that's been something
  • 00:03:27
    newer in my career over the last couple
  • 00:03:29
of years I work for a company
  • 00:03:31
called the MITRE Corporation we're
  • 00:03:33
    federally funded research and
  • 00:03:35
    development the US government realized
  • 00:03:38
    that they needed help they needed
  • 00:03:39
technologists they weren't trying to sell
  • 00:03:40
    anything so I get paid to talk straight
  • 00:03:43
    it's kind of
  • 00:03:44
    cool so let's go back in time everybody
  • 00:03:47
    2023 where were you when you heard that
  • 00:03:50
100 million people were using ChatGPT
  • 00:03:53
I don't know I do remember that all
  • 00:03:56
    of a sudden my social feed my emails
  • 00:03:59
newsletters everything said AI
  • 00:04:02
AI AI right chronic FOMO it's almost as
  • 00:04:06
    though you expect to go walking down the
  • 00:04:08
    aisle in the grocery and see AI stickers
  • 00:04:12
    slapped on the milk and on the biscuits
  • 00:04:13
and on the cereal because obviously it's
  • 00:04:15
    everywhere it's
  • 00:04:17
    everything please don't get swept up in
  • 00:04:21
    the
  • 00:04:23
hype now I know here at QCon and with
  • 00:04:26
InfoQ we prefer to talk about
  • 00:04:29
    crossing the chasm but I'm going to use
  • 00:04:32
    the Gartner hype cycle for a
  • 00:04:35
    moment the words are beautiful are we at
  • 00:04:38
    the technology trigger when it comes to
  • 00:04:40
AI and software engineering are we at
  • 00:04:42
    the peak of inflated expectations the
  • 00:04:45
trough of disillusionment have we started
  • 00:04:48
    up the slope of Enlightenment yet are we
  • 00:04:51
    yet at that plateau of productivity
  • 00:04:53
    where do you think we are it's one of
  • 00:04:55
    the few times that I agree with
  • 00:04:57
Gartner we are at the peak of inflated
  • 00:05:01
    expectations now granted Gartner is
  • 00:05:03
    often late to the game no offense to
  • 00:05:05
    anybody from Gartner who's here but by
  • 00:05:07
    the time they realize it oftentimes I
  • 00:05:08
    believe that we're further along the
  • 00:05:10
    hype cycle but what's interesting here
  • 00:05:12
    is two to five years to the plateau of
  • 00:05:15
    productivity how many people would agree
  • 00:05:17
    with that based on what I'm seeing based
  • 00:05:20
    on my experience based on Research I
  • 00:05:22
    believe that's
  • 00:05:26
    correct what we do as software
  • 00:05:29
    architects as software Engineers is
  • 00:05:31
    really complex and it's not a straight
  • 00:05:34
    line in any decision that we're making
  • 00:05:36
    we use architectural trade-off I love
  • 00:05:38
the quote by Grady Booch the entire
  • 00:05:41
    history of software engineering is one
  • 00:05:42
    of rising levels of abstraction and
  • 00:05:44
    we've heard about that this week we've
  • 00:05:46
    heard about the discussions of needing
  • 00:05:47
    to have orchestration platforms of many
  • 00:05:50
    many different layers of many many
  • 00:05:52
    different libraries that are necessary
  • 00:05:53
to abstract and make AI generative AI
  • 00:05:56
in specific helpful
  • 00:06:00
    yes I like word clouds I mentioned that
  • 00:06:03
    before I have the luxury of working with
  • 00:06:05
    about 200 of the leading data scientists
  • 00:06:09
    and data engineers in the world so I sat
  • 00:06:12
    down with a couple of them and said I'm
  • 00:06:13
going to QCon this is the audience how
  • 00:06:16
    would you explain to me all of the
  • 00:06:18
    different types of AI that exist the ml
  • 00:06:22
    universe beyond generative AI boy did we
  • 00:06:25
    draw Frameworks we had slide after slide
  • 00:06:27
    after slide so I came back to it and
  • 00:06:30
    said you know let's take this instead
  • 00:06:31
    like Legos and dump them on the
  • 00:06:33
    table what's important to take away from
  • 00:06:36
    this slide is that generative AI is
  • 00:06:39
    simply one piece of a massive puzzle
  • 00:06:42
    there are many many different types of
  • 00:06:44
    AI many types of ml many different types
  • 00:06:46
    of algorithms that we can and should be
  • 00:06:51
    using so where do you think AI can be
  • 00:06:55
used within DevSecOps within the
  • 00:06:57
    software development life cycle
  • 00:07:01
    now the next slide I'm going to show you
  • 00:07:02
    is one that's worth coming back to it's
  • 00:07:04
an eye chart please do download it
  • 00:07:07
    because you're not going to be able to
  • 00:07:08
    read
  • 00:07:10
    it the first time I published this was
  • 00:07:13
    in October of last year and there are at
  • 00:07:15
    least a half a dozen additional areas
  • 00:07:17
    that have been added to that during the
  • 00:07:19
    this time what's important is that
  • 00:07:22
    generative AI is only one piece of the
  • 00:07:24
    puzzle here we've been using AI we've
  • 00:07:28
    been using ml for years and years and
  • 00:07:31
    years how do we get after digital twins
  • 00:07:33
    if we're dealing with cyber physical
  • 00:07:35
    systems we're not simply generating new
  • 00:07:38
    scripts and new codes we're leveraging
  • 00:07:40
    deterministic algorithms for what we
  • 00:07:43
    need to do and remember that generative
  • 00:07:45
    AI is
  • 00:07:48
    non-deterministic with it though it has
  • 00:07:51
    groundbreaking potential generative AI
  • 00:07:53
    in specific groundbreaking potential and
  • 00:07:57
it has limitations and it has challenges
  • 00:08:00
    I love this slide you're going to see
  • 00:08:01
    the slide a couple of times I simply
  • 00:08:03
    love the photo treat generative AI like
  • 00:08:07
    a young apprentice and I don't mean
  • 00:08:09
    somebody who's coming out of college I
  • 00:08:11
    mean that 15-year-old brings a lot of
  • 00:08:14
    energy and you're excited to have them
  • 00:08:16
    there and occasionally they do something
  • 00:08:18
    right and it really makes you happy but
  • 00:08:21
    most of the time you're cocking your
  • 00:08:23
    head to the side and say what the heck
  • 00:08:25
    were you
  • 00:08:26
    thinking we heard that with stories this
  • 00:08:29
    week in the tracks especially around Ai
  • 00:08:32
    and ml so pay close attention pay very
  • 00:08:35
    close
  • 00:08:38
    attention and yes I learned my lesson
  • 00:08:41
    and I do use note cards now I'm going to
  • 00:08:44
    take you back for a moment and just make
  • 00:08:45
    sure that I say to you that this is not
  • 00:08:48
    just my
  • 00:08:49
    opinion this is what the research is
  • 00:08:52
    showing there are service providers who
  • 00:08:55
    have provided AI capabilities who are
  • 00:08:57
    now making sure that they have have all
  • 00:08:59
    kinds of disclaimers and they have all
  • 00:09:01
    kinds of advice for you that they're
  • 00:09:02
    providing guidance that says make sure
  • 00:09:04
    you have humans in the
  • 00:09:08
    loop do you think that generative AI
  • 00:09:11
contradicts DevOps
  • 00:09:14
    principles any thoughts on that well I
  • 00:09:18
    will tell you that sort of it does so
  • 00:09:21
    when I think about
  • 00:09:22
    traceability if it's being generated by
  • 00:09:24
    a black box that I don't own that's much
  • 00:09:27
    more difficult how about auditability
  • 00:09:29
that's part of DevSecOps how am I going
  • 00:09:32
    to be able to audit something that I
  • 00:09:33
    don't understand where it came from or
  • 00:09:35
    the provenance for it reproducibility
  • 00:09:38
    anybody ever hit the regenerate button
  • 00:09:40
    does it come back with the same thing
  • 00:09:42
    reproducibility explainability do you
  • 00:09:45
    understand what was just generated and
  • 00:09:47
    handed to you whether it's a test
  • 00:09:49
    whether it's code whether it's script
  • 00:09:51
    whether it's something else do you
  • 00:09:53
    understand and then there's
  • 00:09:55
    security we're going to talk a lot about
  • 00:09:57
    Security today so I'm glad that we are
  • 00:09:59
    having a security track today as well
  • 00:10:02
    there was a survey of over 500
  • 00:10:05
    developers and of those 500
  • 00:10:08
developers 56% of them are leveraging AI
  • 00:10:11
    and of that 56% all of them are finding
  • 00:10:14
    security issues in the code completion
  • 00:10:16
    or the code generation that they're
  • 00:10:17
    running into there's also this concept
  • 00:10:20
    of reduced
  • 00:10:22
    collaboration why why would there be
  • 00:10:24
    reduced
  • 00:10:25
    collaboration well if you're spending
  • 00:10:28
your time talking to your genAI
  • 00:10:31
    friend and not talking to the person
  • 00:10:33
    beside you you're investing in that
  • 00:10:35
    necessary prompting and chatting it has
  • 00:10:37
    been shown so far to reduce
  • 00:10:42
    collaboration so where are people using
  • 00:10:44
    it today for building software we've
  • 00:10:47
    spent a lot of time this week talking
  • 00:10:49
    about how we can provide it as a
  • 00:10:51
    capability to end users but how are we
  • 00:10:53
    using it to generate software to build
  • 00:10:56
    the capabilities we deliver into
  • 00:10:57
    production well I don't ignore the
  • 00:11:01
    industry or the commercial surveys
  • 00:11:03
    because if you're interviewing or
  • 00:11:04
    surveying hundreds of thousands of
  • 00:11:06
    people even tens of thousands of people
  • 00:11:09
    I'm not going to ignore that as a
  • 00:11:10
researcher so yes Stack Overflow
  • 00:11:14
    friends so
  • 00:11:16
    37,000 developers answered the survey
  • 00:11:20
    and of that 44% right now are attempting
  • 00:11:23
    to use AI for their job 25 additional
  • 00:11:28
    percent said they want to they really
  • 00:11:29
want to perhaps that's FOMO perhaps not
  • 00:11:32
    but what are they using it for of that
  • 00:11:34
    44% that are leveraging it well let me
  • 00:11:36
    read you some
  • 00:11:38
    statistics 82% are attempting to
  • 00:11:41
    generate some kind of code that's a
  • 00:11:43
    pretty high number 48% are debugging
  • 00:11:48
    another 34% documentation love that one
  • 00:11:52
    this is my personal favorite which is
  • 00:11:54
    explaining the code base using it to
  • 00:11:57
    look at language that already exists but
  • 00:12:01
    less than a quarter are using it for
  • 00:12:03
    software
  • 00:12:06
    testing so this is a true story this is
  • 00:12:08
    my story from the January time frame
  • 00:12:10
    about how I was able to leverage with my
  • 00:12:12
    team AI to assist us with requirements
  • 00:12:15
    analysis what we did was we met with our
  • 00:12:18
user base and we got their permission
  • 00:12:21
    I'm going to talk with you I'm going to
  • 00:12:23
    record it we're going to take that
  • 00:12:24
    transcriptions are you okay if I
  • 00:12:26
    leverage a GPT tool to help us and
  • 00:12:28
    analyze it
  • 00:12:29
    the answer was yes we also crowdsourced
  • 00:12:32
    via survey now it was free form by and
  • 00:12:35
large very little of it was rationalized
  • 00:12:37
using Likert scales or anything along that
  • 00:12:40
    line and when we fed all of that in
  • 00:12:42
    through a series of very specific
  • 00:12:45
    prompts we were able to uncover some
  • 00:12:47
    sentiments that were not really as overt
  • 00:12:51
    as we had thought there were other
  • 00:12:52
    things that people were looking for in
  • 00:12:54
    their requirements so when it comes to
  • 00:12:56
    requirements analysis I believe it is
  • 00:12:58
    strong use of the tool because you're
  • 00:13:00
    feeding in your language and you are
  • 00:13:02
    extracting from that it's not generating
  • 00:13:04
    that on its own things to be concerned
  • 00:13:08
    about make sure you put your prompt into
  • 00:13:11
    your version control and don't just put
  • 00:13:12
    the prompt into Version Control but keep
  • 00:13:14
    track of what model or what service that
  • 00:13:18
    you are posting it against because as
  • 00:13:19
    we've heard as we know those different
  • 00:13:23
    prompts react differently with different
  • 00:13:25
models.
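A minimal sketch of that practice, assuming a simple JSON registry kept in the repository; the file layout, field names, and model identifier here are illustrative rather than any specific tool's format:

```python
# Hypothetical prompt registry: each prompt is committed alongside the
# exact model and service it was validated against, so a normal diff
# shows when either one changes.
import json
from dataclasses import dataclass, asdict
from pathlib import Path

@dataclass
class PromptRecord:
    prompt_id: str     # stable id referenced from code and docs
    text: str          # the prompt itself
    model: str         # pin the exact model version it was tested against
    service: str       # self-hosted gateway vs. external subscription
    validated_on: str  # when this prompt/model pairing last behaved as expected

record = PromptRecord(
    prompt_id="requirements-sentiment-v2",
    text="Identify recurring sentiments in these interview transcripts...",
    model="example-model-2024-01",
    service="self-hosted-gateway",
    validated_on="2024-04-10",
)

out = Path("prompts") / f"{record.prompt_id}.json"
out.parent.mkdir(exist_ok=True)
out.write_text(json.dumps(asdict(record), indent=2))
print(f"wrote {out}")  # commit this file so prompt changes go through review
```

Because the record travels with the code, rerunning an analysis against a different model becomes a visible, reviewable change rather than a silent one.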
  • 00:13:27
Now why would I talk about diverse data sets? Well the models
  • 00:13:31
    themselves have been proven to have
  • 00:13:33
    issues with bias it's already a leading
  • 00:13:36
    practice for you to make sure that
  • 00:13:38
    you're talking to a diverse User Group
  • 00:13:40
    when you're identifying and pulling
  • 00:13:42
    those requirements out but now you have
  • 00:13:44
    that added need that you have to make
  • 00:13:47
    sure that you are balancing the
  • 00:13:49
    potentiality that the model has a bias
  • 00:13:52
    in it so make sure that your data sets
  • 00:13:54
    make sure that the interviews make sure
  • 00:13:55
    the people you talk to represent a
  • 00:13:57
    diverse set and of course
  • 00:13:59
    rigorous testing humans in the
  • 00:14:03
    loop now I personally like it for test
  • 00:14:06
    cases and there was some research that
  • 00:14:08
    was published in the January time frame
  • 00:14:10
    that made me take pause it said that
  • 00:14:12
    only 42 I'm sorry 47% 47% of
  • 00:14:16
    organizations have automated their
  • 00:14:19
    testing I need you to hear that again
  • 00:14:21
    47% have automated their testing now in
  • 00:14:23
    some of the places where I work where
  • 00:14:25
    there's cyber physical systems when I'm
  • 00:14:26
    working with the military I want it to
  • 00:14:28
    be
  • 00:14:29
    that but that also means that 53% have
  • 00:14:33
    manual testing going on well let's
  • 00:14:36
    realize and let's be okay with the fact
  • 00:14:38
    that there's manual testing going on and
  • 00:14:39
    let's set our QA professionals down in
  • 00:14:42
    front of a chat engine let's make sure
  • 00:14:44
    that they have their functional
  • 00:14:46
    requirements they have their manual test
  • 00:14:48
    cases they have their scenarios that
  • 00:14:49
    they have their user stories that they
  • 00:14:51
    have Journey Maps let them sit down and
  • 00:14:53
    let them go through Chain of Thought
  • 00:14:55
    prompting and allow the GPT to be their
  • 00:14:58
    Muse because you will be surprised how
  • 00:15:01
well it can really help.
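As a rough illustration of that chain-of-thought prompting, here is one way a QA analyst might assemble such a prompt; the artifacts and wording are invented, and no specific chat engine's API is implied:

```python
# Sketch: build a step-by-step prompt from the QA artifacts mentioned
# above (functional requirements, manual tests, user stories).
artifacts = {
    "user_story": "As a shopper, I can apply one discount code at checkout.",
    "manual_test": "Apply SAVE10 to a 100 GBP cart; total becomes 90 GBP.",
}

prompt = f"""You are assisting a QA professional who writes manual tests.

User story: {artifacts['user_story']}
Existing manual test: {artifacts['manual_test']}

Reason step by step:
1. List every behavior the story implies, including edge cases
   (expired codes, stacking codes, empty cart).
2. For each behavior, say whether the existing test already covers it.
3. Propose one new manual test case per uncovered behavior.
"""

print(prompt)  # in practice, send this to your chat engine and have a
               # human review every suggested case before adopting it
```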
  • 00:15:04
Now back to Stack
  • 00:15:06
    Overflow
  • 00:15:07
    55% said that they were interested in
  • 00:15:10
    somehow using generative AI specifically
  • 00:15:13
    for testing yet only 3% trust it it
  • 00:15:17
    could be because it is
  • 00:15:20
    non-deterministic now I bring that up
  • 00:15:22
because you can use generative AI to
  • 00:15:25
    help you with synthetic test data
  • 00:15:27
    generation
  • 00:15:29
    but it's not always going to give you
  • 00:15:31
    anything that is as accurate as you
  • 00:15:33
would like and there are some gotchas
  • 00:15:34
    we'll come back
  • 00:15:36
to one of the gotchas is privacy if you're
  • 00:15:41
    taking your data elements of your data
  • 00:15:43
    aspects of your data and feeding it into
  • 00:15:45
    anybody else's subscription model if you
  • 00:15:48
    are not self-hosting and owning it
  • 00:15:49
    yourself you could have a data privacy
  • 00:15:52
    concern you could also have issues with
  • 00:15:55
    the Integrity of that data so you have
  • 00:15:56
    to be highly in tune with what's
  • 00:15:59
    happening with your information if
  • 00:16:00
    you're sending it out to a subscription
  • 00:16:03
    service also beware we've talked about
  • 00:16:06
    hallucinations it happens when you
  • 00:16:08
    generate tests as well you can have
  • 00:16:10
    irrelevant tests I've seen it I've
  • 00:16:12
    experienced it it's kind of funny but it
  • 00:16:15
    happens and back to transparency and
  • 00:16:19
    explainability the tests that come
  • 00:16:20
    forward the code that comes forward
  • 00:16:23
    sometimes it's not as helpful as you'd
  • 00:16:24
    like it to
  • 00:16:27
    be so let's talk about about the
  • 00:16:29
    elephant in the
  • 00:16:30
    corner no technical conference would be
  • 00:16:33
    complete without talking about code
  • 00:16:35
    generation
  • 00:16:38
    right
  • 00:16:41
    oh well there we go that was my dramatic
  • 00:16:44
    ad to the day
  • 00:16:47
    um
  • 00:16:48
so when it comes to coding there's an
  • 00:16:53
    interesting Trend that's happening right
  • 00:16:54
    now major providers are pulling back
  • 00:16:57
    from calling it code generation to
  • 00:17:00
    calling it code completion and that
  • 00:17:01
    should resonate with us that should
  • 00:17:04
point out to us that something's afoot
  • 00:17:06
    if they're pulling back from saying code
  • 00:17:08
    generation to code
  • 00:17:10
    completion there's a reason for that now
  • 00:17:13
    it is fantastic it is amazing when it
  • 00:17:16
    comes to explaining your existing code
  • 00:17:18
    base now you have to be okay with
  • 00:17:20
    exposing your existing code base to
  • 00:17:22
    whatever that language model is whether
  • 00:17:24
    it's hosted or not and generally the
  • 00:17:28
    code that you get out of this thing will
  • 00:17:30
    be wonderfully structured it will be
  • 00:17:32
    well formatted and occasionally it'll
  • 00:17:35
    work now there's a study from Purdue
  • 00:17:39
    University that has shown that when they
  • 00:17:42
prompt for software engineering
  • 00:17:44
    questions that about 52% of the time the
  • 00:17:47
    answers are
  • 00:17:48
    wrong so that means we're getting
  • 00:17:51
    inaccurate code generated we have to be
  • 00:17:54
    cognizant of it remember this is
  • 00:17:56
    groundbreaking potential this is amazing
  • 00:17:59
    stuff limitations and challenges just go
  • 00:18:02
    in with eyes wide open gang these tools
  • 00:18:05
    can help to generate code what it can't
  • 00:18:08
    do is it can't build software not yet
  • 00:18:11
    not
  • 00:18:14
    yet look at the blue arrow that's what I
  • 00:18:17
    want you to focus on one of three that's
  • 00:18:21
    one of three choices for any one piece
  • 00:18:24
    of code so in this instance I've Seen It
  • 00:18:27
    Go as high as six and you're simply
  • 00:18:29
asking for a module a function a small
  • 00:18:34
    tidbit the person that you see there is
  • 00:18:37
    suffering from what we call decision
  • 00:18:39
    fatigue now decision fatigue in the past
  • 00:18:42
    has been studied with medical
  • 00:18:43
    professionals military the Judiciary
  • 00:18:47
    places where people have to make really
  • 00:18:49
    important decisions constantly they're
  • 00:18:52
    under high pressure and their ability to
  • 00:18:55
    make those decisions
  • 00:18:57
    deteriorates in what World should we be
  • 00:19:00
studying decision fatigue in software
  • 00:19:02
engineering we shouldn't be In-IDE help
  • 00:19:06
    can be fantastic when it comes to
  • 00:19:09
    helping you with that blank page
  • 00:19:11
    mentality that we get to it can really
  • 00:19:13
    help with that but I can tell you day in
  • 00:19:15
    and day out it can cause some
  • 00:19:17
    fatigue groundbreaking potential know
  • 00:19:20
    the limitations know the
  • 00:19:23
    challenges some things to be concerned
  • 00:19:26
    about or at least to be aware of
  • 00:19:28
    consideration ations you will see
  • 00:19:30
    unequal productivity gains with the
  • 00:19:32
    different individuals who are using it
  • 00:19:34
    somebody new in career new to the
  • 00:19:37
    organization will have less individual
  • 00:19:40
    productivity gains than somebody who is
  • 00:19:42
    more senior who can look at the code and
  • 00:19:44
    can understand there's a problem I see
  • 00:19:48
    it I see the problem code churn this is
  • 00:19:52
something that a company named GitClear
  • 00:19:54
has been studying on GitHub for
  • 00:19:58
    years
  • 00:19:59
    from
  • 00:20:00
    2019 until
  • 00:20:02
    2023 the code churn value by industry
  • 00:20:06
    was roughly the same what code churn is
  • 00:20:09
    is I take that code that I've written or
  • 00:20:11
    I've had help writing I check it in I
  • 00:20:14
    then check it out I tinker with it I
  • 00:20:16
    check it in I check it out there's a
  • 00:20:17
    problem with it I check it in I check it
  • 00:20:19
    out code churn in
  • 00:20:22
    2024 we are on Pace to double double
  • 00:20:26
code churn is it caused
  • 00:20:29
    by generation I don't know is there
  • 00:20:32
    correlation I don't know but we are
  • 00:20:34
    going to watch that because that's an
  • 00:20:36
    interesting number to see
  • 00:20:38
rising.
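To make that definition concrete, here is a toy calculation of churn as the share of lines reworked soon after being committed; the data and the two-week window are invented for illustration, while GitClear derives its figures from real git history:

```python
# Toy churn metric: the fraction of lines changed again within a short
# window of first being committed. The line histories are fabricated.
from datetime import datetime, timedelta

# (when the line was committed, when it was next modified, or None)
line_history = [
    (datetime(2024, 4, 1), datetime(2024, 4, 4)),   # reworked after 3 days
    (datetime(2024, 4, 1), None),                   # never touched again
    (datetime(2024, 4, 2), datetime(2024, 4, 30)),  # reworked much later
    (datetime(2024, 4, 3), datetime(2024, 4, 5)),   # reworked after 2 days
]

WINDOW = timedelta(days=14)
churned = sum(
    1 for committed, changed in line_history
    if changed is not None and changed - committed <= WINDOW
)
print(f"churn rate: {churned / len(line_history):.0%}")  # 50% here
```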
  • 00:20:41
And the code is less secure. I know people don't want to believe that
  • 00:20:43
    it is I'll tell you a personal story
  • 00:20:46
    first second week of March I sat through
  • 00:20:49
    an entire afternoon Workshop I was using
  • 00:20:51
GitHub Copilot good tool it has some
  • 00:20:54
real value we were using a Java codebase
  • 00:20:58
    and I was able even with what I thought
  • 00:21:00
    was pretty articulate and elegant
  • 00:21:02
prompting to have OWASP Top 10 right
  • 00:21:06
    there I had my SQL injection right there
  • 00:21:08
    in front of me unless I very clearly
  • 00:21:10
    articulated don't do this don't do this
  • 00:21:12
    don't do this be aware be aware be aware
  • 00:21:16
    that means that the code is less secure
  • 00:21:18
by nature, by nature.
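The workshop codebase was Java, but the pattern is language-agnostic; here is a small Python illustration of the kind of injectable query an assistant can hand you, next to the parameterized form a reviewer should insist on:

```python
# Illustrative only: string-built SQL (the OWASP injection risk) versus
# a parameterized query. Schema and data are made up.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, role TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 'admin')")

def find_user_unsafe(name: str):
    # A name like "' OR '1'='1" turns this into "return every row".
    return conn.execute(
        f"SELECT * FROM users WHERE name = '{name}'").fetchall()

def find_user_safe(name: str):
    # The driver binds the value; the injection attempt matches nothing.
    return conn.execute(
        "SELECT * FROM users WHERE name = ?", (name,)).fetchall()

print(find_user_unsafe("' OR '1'='1"))  # leaks all rows
print(find_user_safe("' OR '1'='1"))    # returns []
```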
  • 00:21:21
Now there was a Stanford study
  • 00:21:24
    that came out and all of the studies all
  • 00:21:27
    of the reports that I mentioned are
  • 00:21:29
    referenced in the bibliography that
  • 00:21:31
    you'll get when you download this by the
  • 00:21:33
    way but that Stanford report clearly
  • 00:21:36
    demonstrated it's a security
  • 00:21:38
    professional's worst
  • 00:21:40
    nightmare we tend to think that it's
  • 00:21:43
    right we tend to overlook it because it
  • 00:21:45
    is well formatted it's it's almost as
  • 00:21:47
    though it has authenticity right it's
  • 00:21:50
    it's speaking to us it looks correct so
  • 00:21:52
    more more issues are sneaking into the
  • 00:21:55
    code so what's that mean we need
  • 00:21:58
    rigorous testing we need humans in the
  • 00:22:02
    loop as a matter of fact now we actually
  • 00:22:04
    need more humans not fewer humans don't
  • 00:22:07
    worry about losing your job there's a
  • 00:22:10
    lot for us to
  • 00:22:13
    do generative AI can be
  • 00:22:16
    unreliable so pay close attention pay
  • 00:22:19
    very close attention you'll notice that
  • 00:22:21
    I'm emphasizing the person who has the
  • 00:22:23
    oversight this
  • 00:22:25
    time so there was a North Carolina State
  • 00:22:29
    University study that came out that said
  • 00:22:31
    that 58% of us when we are doing code
  • 00:22:33
    reviews are now doing what's called
  • 00:22:36
copping out it means that we only look at
  • 00:22:38
    the
  • 00:22:39
    diffs why does that matter I was talking
  • 00:22:43
    to a team member of mine his name is
  • 00:22:44
    Carlton he's a technical lead has a
  • 00:22:47
    beautiful team um one of his Rockstar
  • 00:22:50
    developers is named Stephen these are
  • 00:22:52
    real people so if you want a social
  • 00:22:53
    engineer and find out who they are you
  • 00:22:55
    can I asked Carlton how do you do code
  • 00:22:59
    reviews for Stephen he said well I pull
  • 00:23:02
it up I've worked with Stephen for five
  • 00:23:03
    years I trust his capabilities I know
  • 00:23:07
    his competencies I only look at the
  • 00:23:09
    diffs it's okay when you have someone
  • 00:23:12
    new in your organization new to your
  • 00:23:14
    team new to to this domain what do you
  • 00:23:17
    do with their code changes well I open
  • 00:23:19
    them up I studied it I make sure that
  • 00:23:21
    they understand what they were doing I
  • 00:23:22
    back out into other pieces of the code I
  • 00:23:25
    really study
  • 00:23:27
    it okay
  • 00:23:29
    so if Stephen starts to use a code
  • 00:23:32
    completion tool or a code generation
  • 00:23:34
    tool and there's pressure on him to get
  • 00:23:36
    something done quickly do you trust him
  • 00:23:40
    with the same amount of trust that you
  • 00:23:42
    had before and Carlton's eyes got pretty
  • 00:23:46
    big I'm going to have to not cop out now
  • 00:23:49
if you're doing something like pair
  • 00:23:51
    programming where you are not
  • 00:23:52
    necessarily doing the code reviews in
  • 00:23:53
    the same way you're going to want to
  • 00:23:55
    rotate Partners more quickly you may
  • 00:23:57
want to rotate in a domain expert
  • 00:24:00
    at some point consider more frequent
  • 00:24:02
    rotations also think about bringing
  • 00:24:05
together individuals who can help you
  • 00:24:08
with more SAST more static analysis with
  • 00:24:12
    all of these it's interesting I think it
  • 00:24:14
    was the end of last week that there was
  • 00:24:16
an announcement from GitLab I believe
  • 00:24:19
    this is not in the bibliography I'll
  • 00:24:20
    have to double check this but they've
  • 00:24:22
    purchased a tool they've purchased a
  • 00:24:24
corporation that provides SAST because
  • 00:24:26
    they want to make sure that there's more
  • 00:24:27
SAST scanning going on in the DevOps
  • 00:24:30
    pipeline going on in our ability to turn
  • 00:24:32
    out this code because we have to pay
  • 00:24:35
    closer
  • 00:24:37
    attention by the way if you're
  • 00:24:39
    generating code don't generate the tests
  • 00:24:42
    if you're generating the tests don't
  • 00:24:44
    generate the code you need to have that
  • 00:24:47
    independent verification this is just
  • 00:24:49
    smart stuff right this is just smart
  • 00:24:52
stuff.
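A tiny sketch of that independence, assuming the function body came from a code assistant while the checks were written by a human directly from the requirements; every name here is invented:

```python
# The function below stands in for assistant-generated code.
def slugify(title: str) -> str:
    # (imagine this body arrived from a code-completion tool)
    return "-".join(title.lower().split())

# Human-authored checks, derived from the spec, never generated by the
# same model that wrote the code.
assert slugify("Hello World") == "hello-world"
assert slugify("  spaced   out  ") == "spaced-out"
assert slugify("MiXeD Case") == "mixed-case"
print("independent checks passed")
```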
  • 00:24:55
There can be bias and there can be blind spots there can also be this
  • 00:24:57
    really interesting condition that I
  • 00:24:59
    learned about maybe six or seven months
  • 00:25:01
    ago called
  • 00:25:02
    overfitting it's when a model is trained
  • 00:25:05
    and there's some noise in the training
  • 00:25:07
    data and it causes it to be hyperfocused
  • 00:25:09
    in one area and what can happen with
  • 00:25:12
    your tests is that they can be
  • 00:25:14
    hyperfocused in one area of your code
  • 00:25:16
    base to the exclusion of other areas
  • 00:25:19
    does that mean to not use generative AI
  • 00:25:22
    tools no it means be aware know the
  • 00:25:25
    limitations prepare for it
  • 00:25:29
    so is your organization ready to use
  • 00:25:32
    generative AI for software engineering
  • 00:25:35
    anybody anybody I don't see a lot of
  • 00:25:38
    hands come on folks all
  • 00:25:40
right my question to you is is your SDLC
  • 00:25:44
    already in pretty good
  • 00:25:46
    shape if it is hot diggity you might
  • 00:25:49
    want to amplify leveraging generative AI
  • 00:25:52
    but if you have some existing
  • 00:25:54
    problems
  • 00:25:56
    sprinkling some generative AI on top
  • 00:25:59
    it's probably not a good
  • 00:26:02
    idea so let's go back to the basics for
  • 00:26:05
    just a moment when I get parachuted into
  • 00:26:08
    a new organization into a new team one
  • 00:26:11
    of the first questions that I ask is do
  • 00:26:13
    you own your path to production and by
  • 00:26:17
    asking that simple question it gives me
  • 00:26:19
    an entire waterfall of cascading other
  • 00:26:22
    questions to ask if you can't make a
  • 00:26:24
    change and understand quickly how it's
  • 00:26:26
    going to get fielded probably has some
  • 00:26:30
    challenges and that's when I usually
  • 00:26:32
    tell teams that we need to step back and
  • 00:26:34
    start to do the
  • 00:26:37
    minimums in 2021 during the height of
  • 00:26:40
    the lockdowns I attended the devops
  • 00:26:42
    Enterprise Summit with a number of
  • 00:26:44
    different friends it was virtual and you
  • 00:26:46
    if any of you attended there are lots of
  • 00:26:48
    different tools where you could belly up
  • 00:26:49
    to the virtual bar and I bellied up to
  • 00:26:52
    the bar with a friend of mine actually
  • 00:26:54
    someone who introduced me to Chris Swan
  • 00:26:56
and my friend Bryan Finster and six or
  • 00:26:59
    seven other people were arguing and
  • 00:27:01
    frustrated with one another why is
  • 00:27:03
    everybody telling us that they can't use
  • 00:27:06
DevSecOps that they can't have a CI/CD
  • 00:27:09
    pipeline why are there so many dang
  • 00:27:11
    excuses you know what we'll do we're
  • 00:27:14
    going to write down what those minimums
  • 00:27:15
    are and we did so that QR code will take
  • 00:27:18
you to MinimumCD.org but you can
  • 00:27:20
    remember that easy enough and it's an
  • 00:27:22
    open source listing we simply are
  • 00:27:25
    maintainers of documentation providing
  • 00:27:27
    people what the minimums are so what are
  • 00:27:29
    the minimums what do you need to do
  • 00:27:32
    before you start sprinkling AI on
  • 00:27:34
    top make sure you're practicing
  • 00:27:37
    continuous integration that means don't
  • 00:27:39
    leave the code on your desktop overnight
  • 00:27:42
    tell the people on your team don't leave
  • 00:27:44
    the code outside the repository check it
  • 00:27:46
    in and if it's not done that's okay put
  • 00:27:49
    a flag around it put a feature flag
  • 00:27:51
    around it so that if it does flow
  • 00:27:52
    forward it's not going to cause a
  • 00:27:54
problem.
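A minimal sketch of that feature-flag idea; the flag name and the environment-variable mechanism are stand-ins for whatever flag system your team actually uses:

```python
# Unfinished work is merged but sits behind a flag, so trunk stays
# releasable. Flags default to off.
import os

def flag_enabled(name: str) -> bool:
    # Enable with, e.g., FEATURE_NEW_CHECKOUT=1 in the environment.
    return os.environ.get(f"FEATURE_{name.upper()}", "0") == "1"

def legacy_checkout(cart):
    return sum(cart)

def new_checkout(cart):
    raise NotImplementedError("still under construction behind the flag")

def checkout(cart):
    if flag_enabled("new_checkout"):
        return new_checkout(cart)  # half-finished work, safely dark
    return legacy_checkout(cart)   # the path production actually runs

print(checkout([10, 20]))  # 30 -- flag off, legacy path runs
```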
  • 00:27:58
Once you check that code in how does it get into production the pipeline
  • 00:28:01
    the pipeline determines deployability it
  • 00:28:05
    determines releasability and how does
  • 00:28:08
    that magical pipeline do that because we
  • 00:28:11
    as humans sat down and decided what our
  • 00:28:14
    thresholds were for
  • 00:28:15
    deployability and then we codified it
  • 00:28:18
into that pipeline.
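One way those codified thresholds can look, sketched as a gate script a pipeline stage might run; the threshold values and the sample metrics are invented stand-ins for real test and scan reports:

```python
# Deployability gate: the build fails when the team's agreed thresholds
# are not met. In CI this would parse real reports.
import sys

THRESHOLDS = {              # decided by humans, versioned with the code
    "test_pass_rate": 1.00,
    "line_coverage": 0.80,
    "critical_vulns": 0,
}

def gate(metrics: dict) -> int:
    failures = []
    if metrics["test_pass_rate"] < THRESHOLDS["test_pass_rate"]:
        failures.append("tests failing")
    if metrics["line_coverage"] < THRESHOLDS["line_coverage"]:
        failures.append("coverage below threshold")
    if metrics["critical_vulns"] > THRESHOLDS["critical_vulns"]:
        failures.append("critical vulnerabilities present")
    for reason in failures:
        print(f"GATE FAIL: {reason}")
    return 1 if failures else 0

sys.exit(gate({"test_pass_rate": 1.0, "line_coverage": 0.83,
               "critical_vulns": 0}))  # sample numbers: gate passes
```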
  • 00:28:19
What else is involved once that
  • 00:28:23
    code becomes an electronic asset it's
  • 00:28:26
    immutable humans don't touch it again
  • 00:28:28
    you don't touch the environments you
  • 00:28:30
    don't touch anything stop touching
  • 00:28:32
    things let the pipeline take care of it
  • 00:28:35
    that's a big piece of Dev SEC Ops
  • 00:28:37
    principles and it matters and it helps
  • 00:28:41
    you also whenever you're doing any kind
  • 00:28:42
    of testing you want any of the other
  • 00:28:44
    environments that you're leveraging to
  • 00:28:46
be at what's called parity parity to
  • 00:28:48
    production because I can give you a lot
  • 00:28:50
    of stories we'll share drinks tonight
  • 00:28:53
    and I'll tell you about having
  • 00:28:54
    environments that were not identical
  • 00:28:58
    a thing that you can do to get started
  • 00:29:00
is to take a look at the DORA metrics
  • 00:29:02
    pick one you don't have to pick four
  • 00:29:04
don't bite off more than you can chew
  • 00:29:06
    pick one deployment frequency is not a
  • 00:29:07
    bad place to start that QR code will
  • 00:29:09
    take you to the research site and when
  • 00:29:12
    you're there you can also find another
  • 00:29:14
    tool that it's a quick survey I think
  • 00:29:16
    it's four or five questions that'll help
  • 00:29:18
    you decide which of those metrics to
  • 00:29:20
start to track.
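If deployment frequency is the one you pick, a first cut can be as small as counting production deploys per week from your pipeline's deploy log; the timestamps below are made up:

```python
# Deployment frequency, the simplest DORA starting point: group deploy
# timestamps by ISO week and count them.
from collections import Counter
from datetime import datetime

deploys = [
    "2024-04-01T10:02:00", "2024-04-03T16:40:00",
    "2024-04-08T09:15:00", "2024-04-08T14:30:00",
]

per_week = Counter(
    datetime.fromisoformat(ts).isocalendar()[:2]  # (year, ISO week)
    for ts in deploys
)

for (year, week), count in sorted(per_week.items()):
    print(f"{year}-W{week:02d}: {count} deploy(s)")
```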
  • 00:29:22
Don't you love this picture I just love this
  • 00:29:25
picture let's talk about the gotchas as
  • 00:29:27
we're going forward
  • 00:29:30
    gang if you're adding generative AI into
  • 00:29:33
    your workflow your workflow is going to
  • 00:29:36
    change that means your measurements and
  • 00:29:38
    your metrics are going to change so if
  • 00:29:40
    you have people who are really paying
  • 00:29:41
    attention and looking at your metrics
  • 00:29:43
    and studying your measurements let them
  • 00:29:45
    know that things are going to waver and
  • 00:29:47
    that you're going to have to train some
  • 00:29:49
folks and be aware that if your
  • 00:29:52
    processes were in okay shape people have
  • 00:29:55
    what I call muscle memory sometimes
  • 00:29:57
they're resistant to change does that
  • 00:29:59
    mean to not do it no it just means some
  • 00:30:01
    things to be aware
  • 00:30:03
    of let's talk about productivity this
  • 00:30:06
drives me freaking
  • 00:30:07
    batty because it's perceived
  • 00:30:10
    productivity that the surveys that the
  • 00:30:12
    current research that the current
  • 00:30:15
    advertisements are all talking about you
  • 00:30:17
    are going to have greater productivity
  • 00:30:18
    you are going to have greater
  • 00:30:20
    productivity personal
  • 00:30:22
    productivity it's perceived at this
  • 00:30:24
    point by and large that productivity is
  • 00:30:26
    a perceived gain it means I'm excited I
  • 00:30:29
    got a new tool this is really cool this
  • 00:30:32
    is going to be great it doesn't
  • 00:30:34
    necessarily mean that I'm dealing with
  • 00:30:36
    higher order issues that I am putting
  • 00:30:38
    features out at a faster Pace with
  • 00:30:40
    higher quality doesn't necessarily mean
  • 00:30:42
    that at all it means I perceive it we
  • 00:30:44
    have to give time for there to be
  • 00:30:45
    equalizing of the perceived gain to real
  • 00:30:48
    gain but that leads to a really much
  • 00:30:51
    bigger
  • 00:30:52
    thing we measure team productivity not
  • 00:30:55
    individual productivity it's how well
  • 00:30:58
    does a team put software into production
  • 00:31:00
    right it's not how fast does Tracy do it
  • 00:31:03
    alone it's how fast do we do it as a
  • 00:31:06
    team now if you're measuring
  • 00:31:08
    productivity and you should think about
  • 00:31:10
    it I recommend using Dr Nicole
  • 00:31:13
    forsen's um framework this came out
  • 00:31:16
    around 2021 with a number of other
  • 00:31:19
    researchers from Microsoft what's
  • 00:31:20
    important is that you see all those
  • 00:31:22
    human elements that are there
  • 00:31:24
    satisfaction we actually need to
  • 00:31:26
understand if people feel satisfied
  • 00:31:28
    with what they're doing to understand
  • 00:31:29
    their productivity now I met with Nicole
  • 00:31:31
    about three weeks ago and we're talking
  • 00:31:32
    about adding in uh another dimension
  • 00:31:36
kind of throws off the whole SPACE
  • 00:31:37
    analogy there but we're talking about
  • 00:31:40
    adding in
  • 00:31:41
    trust why does trust
  • 00:31:44
    matter if I'm using traditional
  • 00:31:47
traditional AI and ML and it's
  • 00:31:50
    deterministic I can really understand
  • 00:31:53
    and I can recreate algorithmically
  • 00:31:56
    repetitively again and again and again
  • 00:31:59
    that same value so think about a heads
  • 00:32:01
    up display for a pilot I want them to
  • 00:32:05
trust what the AI or the ML algorithm
  • 00:32:09
has given them and I do that by
  • 00:32:11
    proving to them again and again and
  • 00:32:13
    again that it will be identical that is
  • 00:32:15
    the altitude that is a mountain you
  • 00:32:17
    should turn
  • 00:32:19
    left generative AI is by its nature
  • 00:32:23
    non-deterministic it lies to you so
  • 00:32:26
    should you trust it so we have to
  • 00:32:29
    understand as things change as we start
  • 00:32:32
    to use generative AI we have to
  • 00:32:33
    understand are we going to be able to
  • 00:32:35
    trust it and that's going to give people
  • 00:32:37
    angst and we're already seeing some
  • 00:32:38
    beginnings of that so we're going to
  • 00:32:40
    have to understand how do we measure
  • 00:32:41
    productivity going forward can't tell
  • 00:32:44
    you 100% how that's going to happen
  • 00:32:47
    yet the importance of
  • 00:32:49
    context I love this library because this
  • 00:32:52
    represents your code base this
  • 00:32:54
    represents your IP this represents all
  • 00:32:57
    the things that you need to be willing
  • 00:32:59
    to give over access to a
  • 00:33:02
    model if you own the model if it's
  • 00:33:05
    hosted in your organization that's a
  • 00:33:06
    whole lot different than if you decided
  • 00:33:08
    to use a subscription service I'm not
  • 00:33:11
    telling you to not use subscription
  • 00:33:13
    Services what I'm telling you is to go
  • 00:33:15
    in eyes wide open and make sure that
  • 00:33:17
    your organization is okay with things
  • 00:33:20
    crossing your boundary I deal a lot with
  • 00:33:22
    infosec organizations and we talk about
  • 00:33:24
    the information flow and if all of a
  • 00:33:26
    sudden I say yeah I'm just going to take
  • 00:33:27
    code base to provide as much context as
  • 00:33:30
    possible and shoot it out the door you
  • 00:33:32
    guys don't mind do you they
  • 00:33:36
mind now this is not to poke an eye at
  • 00:33:40
Snyk I love Snyk I love their tools
  • 00:33:43
    but I want you to to take away from this
  • 00:33:45
    is read the
  • 00:33:47
    popups read the end user licensing
  • 00:33:50
    agreements read them when I saw this for
  • 00:33:53
    just a moment I went well how do I flush
  • 00:33:56
the cache
  • 00:33:58
    now it happened to be that I was using
  • 00:34:00
    some training information actual
  • 00:34:02
    Workshop code but if it had been
  • 00:34:05
    something of greater value I would have
  • 00:34:08
    taken pause so read read those things
  • 00:34:11
    read the popups be
  • 00:34:14
    aware oh Public Service
  • 00:34:17
    Announcement keep the humans in the
  • 00:34:23
    loop so we're going to talk about how we
  • 00:34:25
    add AI to the Enterprise the next slide
  • 00:34:27
might be worthy of coming back
  • 00:34:30
to how do you add AI to your
  • 00:34:33
    strategy or how do you create an AI
  • 00:34:36
    strategy doesn't matter if you're an
  • 00:34:38
    organization that has two people it
  • 00:34:40
    doesn't matter if you're an organization
  • 00:34:41
    with 200 or 2,000 or 20,000 people you
  • 00:34:45
    may already have a data strategy what
  • 00:34:47
    matters is that you do A needs
  • 00:34:49
    assessment don't roll your eyes I saw
  • 00:34:52
    that by the
  • 00:34:53
    way what matters is that you get some
  • 00:34:56
    people together perhaps just sit around
  • 00:34:58
    the table with some Post-it notes and
  • 00:35:00
    you talk about what might be a valuable
  • 00:35:03
    place to leverage this make a decision
  • 00:35:06
    it's not everything at all times not
  • 00:35:08
    automatically scaling which takes me to
  • 00:35:10
    the second Point Define a pilot make
  • 00:35:13
    sure you have a limited focused pilot so
  • 00:35:16
    you can try these things out what I'm
  • 00:35:18
    telling you is that this has what
  • 00:35:20
    groundbreaking potential groundbreaking
  • 00:35:23
    potential and there are limitations and
  • 00:35:26
    there are challenges when you're going
  • 00:35:28
    through that pilot it's going to help
  • 00:35:29
    you to understand the different types of
  • 00:35:31
    skills that you're going to need in your
  • 00:35:33
    organization or if you're going to need
  • 00:35:34
    to hire more people or if you're going
  • 00:35:36
    to need to bring more people
  • 00:35:39
    in it'll also help you get after those
  • 00:35:41
    first couple of tranches of governance
  • 00:35:43
and hopefully your governance isn't just don't
  • 00:35:45
    do it no your governance needs to be
  • 00:35:47
    relevant and relative to what you are
  • 00:35:49
    attempting to do monitoring and feedback
  • 00:35:53
    loops always important but I want to
  • 00:35:56
    point out the bottom bullet this that's
  • 00:35:57
    here may seem a little strange to you
  • 00:36:01
    why am I telling you that you have to
  • 00:36:02
    have thought leadership as part of your
  • 00:36:04
    AI strategy I'm not talking about
  • 00:36:06
    sending your people to get up on stage
  • 00:36:08
    I'm not talking about writing white
  • 00:36:10
    papers what I'm telling you is to make
  • 00:36:12
    sure that in your organization that you
  • 00:36:14
    give dedicated time to more than one
  • 00:36:17
    person to stay AB breast and help your
  • 00:36:19
    organization to stay on top of what's
  • 00:36:21
    happening because it's a tital wave
  • 00:36:23
    right now isn't it I I some days don't
  • 00:36:26
    even like to turn on my phone or read
  • 00:36:28
    any of my feeds because I know what it's
  • 00:36:30
    going to say another automated picture
  • 00:36:32
    generated from Dolly yeah too much too
  • 00:36:37
    much choose when and where to start how
  • 00:36:43
    map it to a business need map it to a
  • 00:36:46
    need make sure it's relevant and if your
  • 00:36:49
    need is that you need to get some
  • 00:36:51
    experience that's fine make a decision
  • 00:36:55
    write it down architectural decision
  • 00:36:57
    records
  • 00:36:59
    and then make sure that you have some
  • 00:37:01
    measurements against
  • 00:37:04
    it all right time to design your AI
  • 00:37:08
    assisted software engineering tool
  • 00:37:12
    chain why is it that suddenly we've
  • 00:37:15
    forgotten about all of the software
  • 00:37:17
    architectural principles capabilities
  • 00:37:20
    and things that we've been doing for
  • 00:37:21
    decades why have we suddenly forgotten
  • 00:37:24
about trade-off analysis about the ilities
  • 00:37:27
    when you're designing your tool chain
  • 00:37:30
    apply that same lens is it more relevant
  • 00:37:34
    for you to take something that's off the
  • 00:37:36
    shelf because you need time to Market
  • 00:37:38
    what are my trade-offs well it may be
  • 00:37:41
    faster it'll be less tailored to my
  • 00:37:43
    exact domain need and it may be less
  • 00:37:46
    secure but that may be a choice that we
  • 00:37:49
    make it could be that I have the time
  • 00:37:52
    energy finances abilities to do the
  • 00:37:55
tailoring myself maybe I instantiate a model
  • 00:37:58
    internally maybe I have an external
  • 00:38:00
service but I have a RAG internally lots
  • 00:38:02
    of different variations but make those
  • 00:38:03
    choices let's not forget about all the
  • 00:38:06
    things that we've known about for all
  • 00:38:07
    these
  • 00:38:11
    years leading practices got to have a
  • 00:38:14
leading practices slide on this slide I want to
  • 00:38:16
    point out that we need to keep humans in
  • 00:38:18
the loop someone had an HITL I'm going
  • 00:38:22
    to start to hashtag that you're going to
  • 00:38:23
    get sick of it if any of you if any of
  • 00:38:25
    us are connected online make sure that
  • 00:38:28
    everything everything everything is in
  • 00:38:31
    source code the
  • 00:38:33
    prompts the model numbers and names that
  • 00:38:35
you're using them against. Secure your
  • 00:38:37
vulnerabilities, and don't put your
  • 00:38:41
private information into public models,
  • 00:38:43
into public engines.
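One way to make that concrete, sketched here in Python with invented names, is to keep the prompt and the pinned model identifier together under version control and to run a crude pre-flight check before anything goes to a public endpoint; the patterns below are illustrative, not exhaustive:

    import re

    # Lives in the repo next to the code that uses it: the prompt text plus
    # the exact model name/version it was written and tested against.
    PROMPT_RECORD = {
        "id": "summarize-requirements-v3",
        "model": "example-model-2024-04-09",  # pinned, never "latest"
        "template": "Summarize these requirements:\n{body}",
    }

    # A naive screen for obvious secrets before a prompt is sent out.
    SECRET_PATTERNS = [
        re.compile(r"(?i)api[_-]?key\s*[:=]"),
        re.compile(r"-----BEGIN [A-Z ]*PRIVATE KEY-----"),
    ]

    def safe_to_send(text: str) -> bool:
        # Returns False if the text looks like it contains credentials.
        return not any(p.search(text) for p in SECRET_PATTERNS)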
  • 00:38:46
I love this picture; it's another
  • 00:38:48
    one that I love because it makes me take
  • 00:38:50
pause. This guy is on a tightrope; he's
  • 00:38:53
walking between mountains. Take a look at
  • 00:38:54
that: he's mitigated his risk, he has
  • 00:38:57
tethers. So is he doing something
  • 00:39:00
dangerous? Yeah, but that's okay, because
  • 00:39:03
he's mitigating it. I need you to think
  • 00:39:05
    about 2023 as a year where we really
  • 00:39:08
    didn't have a lot of good
  • 00:39:09
regulation. It's coming about; we're
  • 00:39:12
seeing that regulation catch up, but
  • 00:39:14
there are challenges with IP. It can be
  • 00:39:17
    that a model was trained with public
  • 00:39:19
information, and so you actually don't
  • 00:39:21
    own the copyright to the things that
  • 00:39:23
you're generating, because it tracks back,
  • 00:39:26
from a lineage perspective, to
  • 00:39:27
something somebody else owned. Or
  • 00:39:30
worse: once you've sent it out the door,
  • 00:39:33
    even if it hasn't been used to directly
  • 00:39:35
train a model, let's say that they are
  • 00:39:37
    keeping on your behalf all of your
  • 00:39:39
    conversation threads and that they're
  • 00:39:41
    analyzing those conversation threads and
  • 00:39:43
that they're taking IP from that: you can
  • 00:39:46
lose ownership of your IP. In the US we
  • 00:39:49
    have copyright law and our copyright law
  • 00:39:52
    says that a human hand must have touched
  • 00:39:54
it. That means I have to be really careful,
  • 00:39:56
doesn't it, when it comes to
  • 00:39:57
generated
  • 00:40:00
code. So what questions should you be
  • 00:40:03
asking of your
  • 00:40:05
providers, or, if you are the people who
  • 00:40:08
    are providing that service to your
  • 00:40:11
enterprise? In the appendix for this,
  • 00:40:14
    there are two different sheets of
  • 00:40:16
    different types of questions that I want
  • 00:40:18
    you to take home and I want you to
  • 00:40:20
leverage. I'll give you one or two as a
  • 00:40:22
snippet.
  • 00:40:23
One: how are you ensuring that
  • 00:40:27
    the model is not creating malicious
  • 00:40:31
vulnerabilities? What are the
  • 00:40:33
guardrails that you have in place, if I'm
  • 00:40:35
using your model? Or, if you're providing
  • 00:40:38
that model, how are you ensuring that
  • 00:40:41
that's not happening? Two: if there's an issue
  • 00:40:43
    with the model and the model needs to be
  • 00:40:45
changed, how are you going to notify me
  • 00:40:47
    so that I can understand what the
  • 00:40:49
ramifications are to my value chain, to
  • 00:40:52
my value stream? Questions to ask, peeps.
  • 00:40:58
So let's look
  • 00:41:00
ahead. I'm not going to go into this slide in
  • 00:41:02
detail, because it covers generative AI,
  • 00:41:05
it covers regular AI, it covers ML. What's
  • 00:41:09
important to know is that red arrow:
  • 00:41:10
where are we? We're at the peak of
  • 00:41:13
inflated expectations. We absolutely are;
  • 00:41:16
    I completely believe that and I'm sure
  • 00:41:19
all of your social feeds tell you that
  • 00:41:20
    as
  • 00:41:21
well. AIOps is on the rise; in other places,
  • 00:41:25
other types of AI and ML will continue
  • 00:41:28
to improve. So we're at the beginning of
  • 00:41:31
generative AI, but we're well on the way
  • 00:41:34
    with the
  • 00:41:36
others. What do you think it looks like
  • 00:41:38
    over the next 12 to 24
  • 00:41:42
months? Recently I've had the opportunity
  • 00:41:44
to interview folks from Microsoft, from
  • 00:41:47
IT Revolution, from Yahoo, from the
  • 00:41:50
Software Engineering Institute, and even
  • 00:41:52
some of my colleagues within the MITRE
  • 00:41:55
Corporation. What we believe is going
  • 00:41:57
to happen is happening now:
  • 00:41:59
we're seeing more
  • 00:42:01
    data silos because each one of those
  • 00:42:04
    areas where a different AI tool is being
  • 00:42:07
    leveraged is a conversation between me
  • 00:42:09
and that tool. You and I are not sharing
  • 00:42:13
a session, so we're not having the same
  • 00:42:16
experience, especially with those
  • 00:42:17
generative AI tools. So for right now, for
  • 00:42:20
this moment: more data silos. Data
  • 00:42:23
silos mean slower flow, and slower flow often
  • 00:42:26
means more quality issues. It's going
  • 00:42:28
to get worse before it gets better, and
  • 00:42:32
it has groundbreaking potential, but we
  • 00:42:34
    need to know the limitations and the
  • 00:42:36
risks.
  • 00:42:39
There's an entire track today about
  • 00:42:42
platform engineering. I'm going to
  • 00:42:44
foot-stomp that: there is going to be a
  • 00:42:46
continued increase in the need, because
  • 00:42:48
what are platforms for? Whether it's
  • 00:42:50
low-code, no-code, or the new kid on the block
  • 00:42:53
where we're doing it for our custom
  • 00:42:55
developers, it's making it hard for
  • 00:42:57
people to make mistakes; it's codifying
  • 00:42:59
leading practices. This is going to
  • 00:43:00
continue to increase. If you have a
  • 00:43:02
chance to go to today's track, I strongly
  • 00:43:05
    suggest
  • 00:43:06
it. What about this guy? What about this
  • 00:43:10
guy? Any of you with adult children who
  • 00:43:12
    are going to send them off to coding
  • 00:43:14
    boot
  • 00:43:15
camp? Jensen Huang would say: do not do
  • 00:43:20
that. The
  • 00:43:22
    pessimists are saying that AI will
  • 00:43:27
replace the coder. The optimists are
  • 00:43:31
    saying that those who are qualified
  • 00:43:34
software engineers, software developers,
  • 00:43:37
will be in a great place. So I want you
  • 00:43:39
to hear the nuances that are there. If
  • 00:43:42
you're good at your craft, if you
  • 00:43:45
understand the principles, if you're able
  • 00:43:46
to leverage those principles, if you're
  • 00:43:48
able to teach others, you'll be
  • 00:43:52
fine. What about Devin? Have you heard
  • 00:43:54
about Devin, or have you followed
  • 00:43:58
OpenDevin, which came out about three
  • 00:44:00
days after Devin was announced? It's kind
  • 00:44:02
of fun to watch. You see a little
  • 00:44:04
video there; I think there are six videos on
  • 00:44:06
the site, and it is saying that this is
  • 00:44:08
    an AI software engineer and what they've
  • 00:44:10
done is a form of AI swarming: they have
  • 00:44:12
    different agents that are plugged in
  • 00:44:14
where one is triggering, one is reacting
  • 00:44:16
to it. There are different patterns; one
  • 00:44:18
is a coder-critic pattern. It's
  • 00:44:20
essentially those patterns.
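A minimal sketch of that coder-critic loop in Python, with `llm` as a stand-in for whatever completion call you actually use; the stop condition is deliberately naive:

    def llm(role: str, prompt: str) -> str:
        # Placeholder: route the prompt to a model playing the given role.
        return "no issues"

    def coder_critic(task: str, max_rounds: int = 3) -> str:
        code = llm("coder", f"Write code for: {task}")
        for _ in range(max_rounds):
            review = llm("critic", f"Find defects in:\n{code}")
            if "no issues" in review.lower():
                break
            # One agent reacts to the other: the critic's findings trigger
            # the coder to revise, which is the swarming pattern described.
            code = llm("coder", f"Revise per this review:\n{review}\n\n{code}")
        return code  # a human still reviews before merge (#HITL)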
  • 00:44:22
We're going to see AI go from
  • 00:44:25
    being a tool that we independently and
  • 00:44:27
    individually use to agents that are
  • 00:44:30
plugged into our SDLC, and when they get
  • 00:44:33
plugged into our SDLC, we're going to
  • 00:44:36
    have to be cognizant of what that does
  • 00:44:38
to the humans in the mix. We're going to
  • 00:44:39
give them very defined, small roles. So
  • 00:44:41
    you may have somebody on your team that
  • 00:44:44
is a gen AI: not a Gen X, not a Gen Z, a
  • 00:44:49
    gen
  • 00:44:51
AI. I want to pause for a moment, guys. I
  • 00:44:54
want to take you back to 1939.
  • 00:44:57
What's that have to do with
  • 00:45:00
software? It has to do with black and
  • 00:45:02
white. 1939 was when The Wizard of Oz was
  • 00:45:06
filmed, and it started out in black and
  • 00:45:08
white. And I don't know if
  • 00:45:10
there's anybody here who hasn't seen it:
  • 00:45:13
    go watch
  • 00:45:14
it. Dorothy's house is picked up by a
  • 00:45:18
tornado, and it is cast over the rainbow,
  • 00:45:21
and it lands in Oz, smashes the Wicked
  • 00:45:25
Witch, and
  • 00:45:27
she opens the door. And as she opens the
  • 00:45:30
door, she looks out at things that she
  • 00:45:33
has never seen before: Munchkins, flying
  • 00:45:37
monkeys, an Emerald City, all in beautiful
  • 00:45:41
Technicolor. And do you know where we
  • 00:45:44
are? Same Technicolor, my friends. The
  • 00:45:48
future is amazing; what we're going to do
  • 00:45:52
    will be
  • 00:45:53
amazing. But we're going to need to
  • 00:45:55
optimize differently.
  • 00:45:57
Right now, our software practices are
  • 00:45:59
optimized for humans. I limit work in
  • 00:46:02
progress. Why? Because I'm a human. In agile I
  • 00:46:06
take one user story at a time. Why?
  • 00:46:08
Because I'm human. We're worried about
  • 00:46:10
cognitive overload. Why? Because we're
  • 00:46:13
human. It's not negative; it's just a
  • 00:46:15
    fact that we finally learned to optimize
  • 00:46:17
for the humans. So as we go from having
  • 00:46:20
    AI agents to having more
  • 00:46:23
capable team members, or perhaps teams
  • 00:46:26
that are made up of many different
  • 00:46:29
generative AI agents, we're going to have
  • 00:46:31
to figure out how we optimize, and who
  • 00:46:34
we optimize for. Exciting! Damn
  • 00:46:38
exciting stuff, guys, damn exciting stuff.
  • 00:46:41
But I'm going to take you from
  • 00:46:41
    Technicolor back to where we are right
  • 00:46:45
now. I like to say we cannot put the
  • 00:46:49
    genie back in the
  • 00:46:51
bottle. Prompt engineering: we need to
  • 00:46:54
understand it as a discipline. We need to
  • 00:46:56
understand the ethics of prompts, who owns
  • 00:46:58
the generated outcomes, machine-human
  • 00:47:01
teaming. We need to understand all this.
  • 00:47:03
What about software team performance,
  • 00:47:05
trust, and reliability? But why am I
  • 00:47:07
    showing you a horse's
  • 00:47:09
    backside
  • 00:47:11
Well, because a friend of mine named
  • 00:47:13
    Lonnie Rosales hails from the great
  • 00:47:15
    state of
  • 00:47:17
Texas, and she said, 'Trace, you can
  • 00:47:20
    actually trick the genie back into the
  • 00:47:23
bottle, but you can't put the poo back in
  • 00:47:26
the horse.'
  • 00:47:28
Now, she's from the great state of Texas,
  • 00:47:30
    and I can tell you that the word that
  • 00:47:31
    she used was not
  • 00:47:33
'poo.' But I want you to take that with you:
  • 00:47:36
we cannot go back, ever, ever, ever, to
  • 00:47:40
where we were. We cannot go back to where
  • 00:47:42
we were, and that's okay. We can go into it
  • 00:47:45
eyes wide open, understanding the
  • 00:47:47
challenges and the limits that are there,
  • 00:47:48
    and working together to figure these
  • 00:47:50
    things
  • 00:47:51
out. Your call to
  • 00:47:54
action: go back and pulse your
  • 00:47:56
organization,
  • 00:47:57
find out where the shadow gen AI is being
  • 00:48:00
used, where the shadow AI is being used. Bring
  • 00:48:03
it to the surface; don't shame people.
  • 00:48:05
Understand how they're using it, and then
  • 00:48:07
enable them to do the kinds of research,
  • 00:48:10
or, if they bring forward a need, that you
  • 00:48:12
help them with that need. Make sure you
  • 00:48:15
are looking at cybersecurity as your
  • 00:48:17
numero uno issue: number one, number one.
  • 00:48:21
Establish your guardrails, then connect
  • 00:48:24
with your providers. Use those questions,
  • 00:48:26
    or be ready to answer those questions if
  • 00:48:29
    you are the provider of generative AI
  • 00:48:31
    capabilities to your
  • 00:48:33
organization. Now, that's your call to
  • 00:48:36
action, but I need something from all of
  • 00:48:38
you. You're actually the missing piece of
  • 00:48:41
my puzzle. As a researcher, I want to
  • 00:48:45
understand: how are you using generative
  • 00:48:48
AI? How is your organization preparing?
  • 00:48:51
How are you personally focusing on
  • 00:48:54
getting ready for this? What are you
  • 00:48:55
doing? I'm going to be going down to the
  • 00:48:58
second floor after this, and I would love it
  • 00:49:01
    if anybody wants to swing by and have a
  • 00:49:03
chat on what you're doing. Share your
  • 00:49:05
organization's lessons learned, tell me
  • 00:49:08
about your stories, tell me about the
  • 00:49:09
challenges that you have, or tell me
  • 00:49:11
    about the things that you want to learn
  • 00:49:12
    about because you haven't gotten there
  • 00:49:16
yet. By the
  • 00:49:18
way, this is in color. What matters in all
  • 00:49:22
of this is the humans. We've talked about
  • 00:49:24
it all week; this is what matters.
  • 00:49:27
Grab that QR code; that'll take
  • 00:49:30
you to a download of today's materials;
  • 00:49:33
    it'll take you to the bibliography as
  • 00:49:35
    well as that Continuum
  • 00:49:37
slide. And I've been asked to pop this up
  • 00:49:40
and ask you to vote and provide
  • 00:49:42
feedback. Thank you guys very much.
  • 00:49:46
    [Applause]
  • 00:49:54
    [Music]
Tags
  • Generative AI
  • Software Engineering
  • AI Challenges
  • Technology Evolution
  • Digital Transition
  • Trust in AI
  • AI Hype Cycle
  • AI Implementation
  • IT Strategy
  • Data Silos