"Stranger than Fiction Case Studies in Software Engineering Judgment," Steve McConnell

01:04:00
https://www.youtube.com/watch?v=PFcHX0Menno

Summary

TL;DR: ACM's webinar with Steve McConnell focuses on software engineering judgment. He presents case studies such as healthcare.gov to demonstrate the difference between flawed analysis and the need for good judgment on projects, introduces Bloom's taxonomy, and describes the four factors that shape software projects: size, uncertainty, defects, and human variation. The aim is to help professionals develop the ability to make better decisions on complex software projects.

Key Takeaways

  • 📚 Bloom's taxonomy helps in understanding levels of understanding in education.
  • 🛠 Using checklists can streamline decision-making.
  • 🔍 Project dynamics depend on four factors: size, uncertainty, defects, and human variation.
  • 🤔 Judgment is a higher-order skill than analysis in software engineering.
  • 📉 Failures of project judgment can lead to major failures, as seen with healthcare.gov.
  • 👥 Human factors can have a large impact on a project's success or failure.
  • 🚦 Projects with multiple red flags are likely to fail regardless of any single good factor.
  • 💡 Better judgment in decision-making can be developed by applying experience and models.
  • 🗣 Good communication between technical staff and management can improve project outcomes.
  • 📊 Better project estimation and risk management can prevent budget overruns.

Timeline

  • 00:00:00 - 00:05:00

    The ACM learning webinar is introduced as an opportunity for professionals and students to improve their skills and advance their careers in computing. The organization offers a range of educational resources and professional development opportunities that help members navigate a rapidly changing field.

  • 00:05:00 - 00:10:00

    Steve McConnell is introduced as the day's speaker, with extensive experience in software engineering. His presentation concerns judgment in software development, which he plans to address using Bloom's taxonomy as a model for improving judgment.

  • 00:10:00 - 00:15:00

    Bloom's taxonomy is used to understand levels of learning and cognition. It is built on a hierarchy running from knowledge and comprehension to application and evaluation. Steve discusses how this hierarchy can be used to improve judgment in software development.

  • 00:15:00 - 00:20:00

    Bloom's taxonomy spans levels from recall of factual knowledge to the application and practice of that knowledge in software engineering. Steve illustrates this with examples such as designing and implementing software.

  • 00:20:00 - 00:25:00

    Analysis in software engineering involves critical thinking and breaking problems apart to understand their relationships. It is often an overdeveloped skill in the software industry, which can lead to "analysis paralysis" and a preoccupation with detail.

  • 00:25:00 - 00:30:00

    Synthesis, or creative thinking, involves putting parts together to form something new. Steve argues for its importance in management and design and addresses the bias against this competence among technical staff.

  • 00:30:00 - 00:35:00

    Evaluation, at the top of Bloom's hierarchy, is about judging ideas and concepts. Steve gives examples from software engineering and notes how rare this skill is compared with analysis and synthesis.

  • 00:35:00 - 00:40:00

    Steve introduces the four factors model as a framework for supporting better judgment on software projects. It covers project size, uncertainty, defects, and human variation.

  • 00:40:00 - 00:45:00

    Applying the four factors model, Steve walks through case studies of software projects to illustrate how judgment can predict a project's success or failure. His first case concerns the problems with healthcare.gov.

  • 00:45:00 - 00:50:00

    Analysis of failed projects such as healthcare.gov suggests that the problems stemmed from compressed schedules, changing requirements, and inadequate planning rather than simply "not enough testing." In contrast, successful projects such as Cheyenne Mountain ATAMS show how risk management, early testing, and skilled teams can lead to success by handling the four factors effectively.

  • 00:50:00 - 01:04:00

    In closing, Steve discusses the importance of developing good judgment in software professionals, using examples and models such as the four factors model to make better decisions on projects.



FAQ

  • What is Bloom's taxonomy?

    Bloom's taxonomy is a framework that classifies learning objectives into six levels: knowledge, comprehension, application, analysis, synthesis, and evaluation.

  • What is the purpose of the four factors in the evaluation model?

    The four factors help in understanding project dynamics and can support judgment in software engineering.

  • Who is Steve McConnell?

    Steve McConnell is CEO and Chief Software Engineer at Construx Software and the author of several books on software engineering.

  • How can judgment in software engineering be improved?

    Judgment can be improved through experience, by applying evaluation models such as the four factors, and by analyzing case studies.

  • What were the challenges with projects like healthcare.gov?

    The challenges included a compressed schedule, last-minute requirements changes, and a lack of effective planning and oversight.

Transcript
  • 00:00:06
    hello and welcome to today's ACM
  • 00:00:08
    learning webinar this webcast is part of
  • 00:00:11
    acm's commitment to lifelong learning
  • 00:00:13
    and serving the over 100,000 computing
  • 00:00:16
    professionals and students who are ACM
  • 00:00:18
    members. I'm Will Tracz, Lockheed Martin Fellow
  • 00:00:21
    Emeritus and chair of ACM SIGSOFT, the
  • 00:00:24
    special interest group on software
  • 00:00:27
    engineering. I'm also a member of the ACM
  • 00:00:30
    Professional Development Committee and
  • 00:00:31
    have served as the editor of ACM SIGSOFT
  • 00:00:33
    Software Engineering
  • 00:00:36
    Notes. For those of you who may be
  • 00:00:38
    unfamiliar with ACM or what it has to
  • 00:00:40
    offer ACM offers educational and
  • 00:00:42
    professional development resources that
  • 00:00:44
    boost skills and enhance career
  • 00:00:47
    opportunities our members can stay
  • 00:00:49
    competitive in the constantly changing
  • 00:00:51
    world of computing with a range of ACM
  • 00:00:54
    learning resources at learning.acm.org
  • 00:00:57
    you can see some of those highlights on
  • 00:00:59
    your screen
  • 00:01:00
    ACM recognizes the role of computing in
  • 00:01:02
    driving the innovations that sustain
  • 00:01:04
    competitiveness in a global environment. As
  • 00:01:06
    such ACM provides timely Computing
  • 00:01:09
    information published by ACM including
  • 00:01:11
    Communications of the ACM and Queue magazines,
  • 00:01:14
    access to the ACM Digital Library the
  • 00:01:16
    world's most comprehensive database on
  • 00:01:18
    Computing literature and international
  • 00:01:20
    conferences that draw leading experts on
  • 00:01:22
    a broad spectrum of computing topics
  • 00:01:25
    furthermore we support Education and
  • 00:01:27
    Research including curriculum
  • 00:01:29
    development, teacher training, and the ACM
  • 00:01:31
    Turing and ACM-Infosys
  • 00:01:34
    Foundation Awards ACM enables its
  • 00:01:37
    members to solve critical problems using
  • 00:01:39
    new technology that enriches our lives
  • 00:01:41
    and advances our society in the digital
  • 00:01:43
    age and I might add two things first
  • 00:01:46
    Google generously has increased the
  • 00:01:47
    Turing Award to $1 million; that's a lot
  • 00:01:50
    of money and second ACM membership is
  • 00:01:53
    really inexpensive that's not a lot of
  • 00:01:55
    money and if you can't afford a full
  • 00:01:57
    membership you might look into joining
  • 00:01:59
    one of the 37 special interest
  • 00:02:01
    groups of course I'm biased about ACM
  • 00:02:03
    sigsoft uh where student memberships
  • 00:02:06
    start at $10 and professional membership
  • 00:02:09
    start at
  • 00:02:12
    $20. Okay, housekeeping: before we get
  • 00:02:15
    started once again for those who are
  • 00:02:18
    first timers I'd like to quickly mention
  • 00:02:20
    a few housekeeping items shown on
  • 00:02:22
    the slide in front of you in the top
  • 00:02:24
    right hand corner of the slide area of
  • 00:02:25
    your screen there is a button that will
  • 00:02:27
    allow you to enlarge the slides. You may
  • 00:02:29
    enlarge the slides at any time by
  • 00:02:31
    dragging the corner of the slide window
  • 00:02:33
    the slides will advance automatically
  • 00:02:35
    throughout the event during the
  • 00:02:37
    presentation you can minimize the slide
  • 00:02:39
    area question and answer and bio screens
  • 00:02:42
    using the buttons on the bottom panel
  • 00:02:44
    you can also use a number of widgets
  • 00:02:46
    also found at the bottom panel including
  • 00:02:47
    Facebook Twitter Wikipedia and a
  • 00:02:49
    resource list where you can get copies
  • 00:02:51
    of the slides if you're experiencing
  • 00:02:54
    problems with our web interface please
  • 00:02:56
    refresh your console by pressing the F5
  • 00:02:58
    key on Windows, Command plus R if you're
  • 00:03:00
    on a Mac uh refresh your browser if
  • 00:03:03
    you're on a mobile device or close and
  • 00:03:05
    relaunch the presentation you can also
  • 00:03:07
    visit the webcast help guide by clicking
  • 00:03:09
    on the help widget below the slide
  • 00:03:11
    window obviously to control volume
  • 00:03:13
    please adjust the master volume on your
  • 00:03:15
    computer and at the end of the
  • 00:03:17
    presentation as usual we will have time
  • 00:03:19
    for questions and we hope you have
  • 00:03:20
    several if you think of a question
  • 00:03:22
    during the presentation please type it
  • 00:03:24
    into the Q&A box and click on the submit
  • 00:03:27
    button you don't need to wait until the
  • 00:03:28
    end of the presentation begin submitting
  • 00:03:30
    questions you may also use a Q&A box uh
  • 00:03:33
    and the survey at the end to suggest
  • 00:03:35
    topics for future webinars. At the end of
  • 00:03:37
    the presentation
  • 00:03:40
    you will see a survey open in your
  • 00:03:42
    browser please take a minute to fill it
  • 00:03:44
    out and help us improve our webinars the
  • 00:03:46
    session is being recorded and will be
  • 00:03:49
    archived you will receive an automatic
  • 00:03:51
    email notification when it is available
  • 00:03:53
    and you can check learning.acm.org in a
  • 00:03:56
    few days for updates
  • 00:04:01
    you can also use Facebook and the
  • 00:04:03
    Twitter widgets on the bottom panel to
  • 00:04:05
    share the presentation link with your
  • 00:04:06
    friends as well as to tweet comments and
  • 00:04:08
    questions using the hashtag (pound sign)
  • 00:04:11
    ACMWebinarJudge.
  • 00:04:15
    We'll be watching for those
  • 00:04:19
    tweets. Today's presentation is Stranger
  • 00:04:22
    Than Fiction: Case Studies in Software
  • 00:04:25
    Engineering Judgment, by Steve McConnell.
  • 00:04:28
    Steve is the CEO and chief software
  • 00:04:31
    engineer at Construx Software, where he
  • 00:04:33
    consults to a broad range of industries
  • 00:04:35
    and
  • 00:04:36
    oversees Construx's consulting and
  • 00:04:39
    training offerings Steve is the
  • 00:04:41
    bestselling author of the industry
  • 00:04:44
    classic Code Complete, as well as
  • 00:04:48
    Software
  • 00:04:50
    Estimation: Demystifying the Black
  • 00:04:54
    Art, Rapid Development, and other titles.
  • 00:04:58
    Steve has served as editor in chief of IEEE
  • 00:05:01
    Software magazine and chair of the IEEE
  • 00:05:03
    Computer Society's Professional
  • 00:05:04
    Activities Board. Finally, readers of
  • 00:05:07
    Software Development magazine voted
  • 00:05:09
    Steve one of the three most influential
  • 00:05:13
    people in the software industry, along
  • 00:05:16
    with such guys as Bill Gates and
  • 00:05:19
    Linus, I can't pronounce this last name,
  • 00:05:21
    Torvalds. Yes. Steve, we look forward to
  • 00:05:26
    your presentation here today
  • 00:05:30
    will uh thank you for that great
  • 00:05:32
    introduction uh and I'll put in a second
  • 00:05:34
    plug from Will for joining SIGSOFT;
  • 00:05:37
    that's how I became aware
  • 00:05:39
    of Will in the first place. I've enjoyed
  • 00:05:41
    his writings uh in that uh uh capacity
  • 00:05:43
    for a long time and I'm really happy to
  • 00:05:46
    be here today to talk about uh case
  • 00:05:48
    studies in software engineering judgment
  • 00:05:50
    uh Stranger Than Fiction uh we have a
  • 00:05:53
    fairly ambitious agenda for the talk
  • 00:05:55
    today and uh the title should be taken
  • 00:05:58
    quite literally uh case studies in
  • 00:06:00
    software engineering judgment uh that
  • 00:06:02
    means that we're going to talk some
  • 00:06:03
    about judgment uh maybe more than you
  • 00:06:05
    would have expected uh but we're going
  • 00:06:07
    to talk some about that I'm going to
  • 00:06:09
    present a model uh that is one possible
  • 00:06:11
    way to support better judgment in
  • 00:06:12
    software engineering uh and then uh we
  • 00:06:16
    will uh uh look at a few case studies in
  • 00:06:18
    applying software engineering judgment I
  • 00:06:21
    I think that uh this is a pretty
  • 00:06:23
    important topic uh we're going to just
  • 00:06:24
    really hit the tip of the iceberg today
  • 00:06:26
    but I think you'll at least get enough
  • 00:06:28
    of a flavor of it to see there actually
  • 00:06:30
    is a pretty important uh issue
  • 00:06:33
    here so turning then to that first topic
  • 00:06:35
    the topic of judgment uh I think a good
  • 00:06:38
    way to look at judgment is through
  • 00:06:40
    something known as Bloom's taxonomy uh
  • 00:06:43
    Bloom's taxonomy is often used in uh I
  • 00:06:46
    think primarily used in education
  • 00:06:47
    circles typically for the design of
  • 00:06:49
    class materials or perhaps exam
  • 00:06:51
    questions uh I think it actually is
  • 00:06:54
    often treated as though it's kind of
  • 00:06:56
    just a superficial description of
  • 00:06:57
    something but I think if you really
  • 00:06:59
    study Bloom's taxonomy it yields some
  • 00:07:02
    really significant insights uh for
  • 00:07:04
    software engineering uh Bloom's taxonomy
  • 00:07:07
    basically lists levels of
  • 00:07:11
    understanding, going from knowledge
  • 00:07:13
    and comprehension and application down
  • 00:07:15
    to analysis, synthesis, and judgment or
  • 00:07:18
    evaluation. We will talk about each of
  • 00:07:19
    those in some detail um so as I as now
  • 00:07:24
    we officially see on the slide I think
  • 00:07:26
    Bloom's taxonomy is often treated almost
  • 00:07:28
    superficially or even flippantly uh but
  • 00:07:31
    I I don't think it's an overstatement to
  • 00:07:33
    say that a real understanding especially
  • 00:07:35
    of the upper levels of the taxonomy uh
  • 00:07:37
    which is to say uh uh analysis synthesis
  • 00:07:41
    and judgment has profound implications
  • 00:07:44
    for software
  • 00:07:46
    professionals so the first level as you
  • 00:07:49
    see typically represented as the bottom
  • 00:07:51
    level on a pyramid is knowledge or also
  • 00:07:53
    known as recall this is the remembering
  • 00:07:55
    of previously learned material uh we see
  • 00:07:58
    recall or knowledge and software
  • 00:08:00
    engineering when we for example recall
  • 00:08:02
    book learning or we recall personal
  • 00:08:04
    experience or we recall specific details
  • 00:08:07
    needed to implement a particular
  • 00:08:08
    technical practice uh certainly
  • 00:08:10
    recalling patterns of practices would
  • 00:08:12
    fall into that knowledge category and
  • 00:08:14
    recalling successes in design Code test
  • 00:08:17
    project management and so on would also
  • 00:08:19
    be examples of recall or
  • 00:08:23
    knowledge the next level up is
  • 00:08:25
    comprehension comprehension refers to
  • 00:08:27
    grasping the meaning of of material this
  • 00:08:31
    would be the lowest level of
  • 00:08:33
    understanding according to Bloom's
  • 00:08:34
    taxonomy examples in software
  • 00:08:36
    engineering uh would include summarizing
  • 00:08:39
    a methodology so not just recalling
  • 00:08:41
    details but summarizing it which would
  • 00:08:43
    show comprehension explaining scrum
  • 00:08:45
    either in words or as a diagram being
  • 00:08:47
    able to explain a concept in different
  • 00:08:49
    forms would be a sign of comprehension
  • 00:08:52
    describe an example of scrum would be uh
  • 00:08:54
    one kind of comprehension explain why
  • 00:08:56
    scrum is not a design approach so you
  • 00:08:58
    can see how that goes beyond
  • 00:08:59
    simple recall, or explain how Scrum is
  • 00:09:02
    different from extreme programming those
  • 00:09:03
    are all examples of comprehension and I
  • 00:09:06
    think you can probably see how that
  • 00:09:07
    contrasts with uh recall which is a a
  • 00:09:10
    lower level on Bloom's
  • 00:09:12
    taxonomy the next level up is
  • 00:09:14
    application and in application uh we use
  • 00:09:18
    the knowledge from the recall level as
  • 00:09:20
    well as comprehension to solve problems
  • 00:09:22
    in new and concrete situation so in
  • 00:09:25
    software engineering that would include
  • 00:09:27
    examples like using general design
  • 00:09:29
    knowledge to solve a specific design
  • 00:09:32
    problem so in other words applying our
  • 00:09:33
    general knowledge to a specific case
  • 00:09:35
    that's application or using general
  • 00:09:37
    project planning knowledge to plan a
  • 00:09:39
    specific project or general software
  • 00:09:41
    construction knowledge to write a
  • 00:09:42
    specific piece of code so this is
  • 00:09:45
    considered in Bloom's taxonomy to be
  • 00:09:47
    another level up in the
  • 00:09:49
    taxonomy the next level up is analysis
  • 00:09:52
    and analysis consists of breaking a
  • 00:09:54
    problem into its parts so that its
  • 00:09:55
    relationships and organizational
  • 00:09:57
    structure can be understood
  • 00:09:59
    we do an awful lot of analysis in
  • 00:10:02
    software engineering uh some examples
  • 00:10:04
    would include breaking a large class
  • 00:10:06
    into two smaller classes or breaking a
  • 00:10:08
    class into methods and data uh certainly
  • 00:10:10
    when we allocate functionality and data
  • 00:10:12
    to methods within a class that would be
  • 00:10:14
    another example of breaking a problem
  • 00:10:16
    into its parts uh finding flaws in a
  • 00:10:20
    proposed design basically we use phrases
  • 00:10:21
    like pick something apart that is
  • 00:10:23
    fundamentally describing analysis that
  • 00:10:26
    is breaking something into its parts
  • 00:10:28
    finding the source of a coding error
  • 00:10:29
    tracing into the code, i.e., looking at
  • 00:10:32
    increasingly fine levels of detail would
  • 00:10:35
    be another example of analysis now I'd
  • 00:10:38
    like to dwell on analysis for a little
  • 00:10:40
    bit because this is really key to the
  • 00:10:42
    practice of software engineering
  • 00:10:44
    analysis is also known as critical
  • 00:10:46
    thinking and in the software field at
  • 00:10:48
    least at the practitioner level we
  • 00:10:50
    screen extensively for analysis skills
  • 00:10:53
    as an entry criteria for entering the
  • 00:10:55
    programming profession uh the whole idea
  • 00:10:57
    of programming identifying the correct
  • 00:11:00
    sequence of operations in a section of
  • 00:11:02
    code is fundamentally an analysis
  • 00:11:04
    activity uh identifying boundary
  • 00:11:06
    conditions identifying off by one errors
  • 00:11:09
    those are fundamentally analysis
  • 00:11:10
    activities now I think it's important to
  • 00:11:12
    understand that these are not common
  • 00:11:14
    human skills most of the human race does
  • 00:11:17
    not find it that easy to identify the
  • 00:11:20
    exact correct sequence of operations and
  • 00:11:23
    uh I don't know how many of you might
  • 00:11:24
    have uh approached programming in your
  • 00:11:26
    younger days but I still remember when I
  • 00:11:28
    was first learning my first programming
  • 00:11:30
    language I remember how brain busting it
  • 00:11:33
    was to have to think through the steps
  • 00:11:36
    in a program line by line keeping in
  • 00:11:39
    mind that certain things had to be done
  • 00:11:42
    before other things and the holistic
  • 00:11:44
    sense of the thing didn't matter uh in
  • 00:11:46
    that it didn't really I couldn't just
  • 00:11:47
    put everything down that needed to be
  • 00:11:49
    done and assume it would be done in the
  • 00:11:50
    correct order I found it to be quite a
  • 00:11:52
    challenge at first to get things uh
  • 00:11:55
    listed step by step so that I didn't do
  • 00:11:58
    the dependent step until I had done
  • 00:12:00
    the antecedent step and I remember how
  • 00:12:03
    much that hurt my brain at first of
  • 00:12:05
    course you know a couple years later I
  • 00:12:07
    was doing that sort of thing in my sleep
  • 00:12:08
    as is probably everyone on this on this
  • 00:12:11
    uh webinar but I think it's good to
  • 00:12:13
    understand that these are not common
  • 00:12:14
    human skills and most of the human race
  • 00:12:17
    uh doesn't really possess these skills
  • 00:12:19
    to any significant degree uh and the
  • 00:12:21
    interesting thing is that most software
  • 00:12:23
    professionals are really really good at
  • 00:12:26
    analysis, and that is partly a strength
  • 00:12:29
    and partly a weakness, as it turned out.
  • 00:12:31
    I would say that analysis often
  • 00:12:33
    constitutes an overdeveloped muscle for
  • 00:12:36
    many technical staff uh developed is
  • 00:12:38
    fine but overdeveloped really means that
  • 00:12:41
    it's out of balance or overly strong
  • 00:12:43
    compared to the next two highest levels
  • 00:12:45
    on blooms taxonomy of synthesis and
  • 00:12:47
    evaluation when we have overdeveloped
  • 00:12:49
    analysis skill that can lead to the
  • 00:12:51
    common problem that's so common it has a
  • 00:12:53
    name analysis paralysis uh it can also
  • 00:12:56
    lead to excessive focus on individual
  • 00:12:59
    details and I think this is almost a
  • 00:13:01
    cliche in software that people who can
  • 00:13:04
    be quite excellent in some respects
  • 00:13:06
    actually can have an inability to see
  • 00:13:07
    the forest for the trees they become
  • 00:13:09
    excessively preoccupied with uh things
  • 00:13:12
    that are very specific and really maybe
  • 00:13:14
    don't have that much General
  • 00:13:16
    importance so let's talk about synthesis
  • 00:13:19
    for a moment we'll come back and revisit
  • 00:13:21
    analysis in a little bit uh synthesis is
  • 00:13:24
    the uh the next highest level on Bloom's
  • 00:13:26
    taxonomy and this is putting Parts
  • 00:13:28
    together to form a new organization or
  • 00:13:30
    whole that requires original or creative
  • 00:13:33
    thinking and in software engineering
  • 00:13:35
    some examples would include things like
  • 00:13:36
    combining two classes
  • 00:13:39
    into one new class that provides an
  • 00:13:41
    interface at a different level of
  • 00:13:42
    abstraction so that would be a good
  • 00:13:44
    example of synthesis I think uh making
  • 00:13:47
    Global versus local trade-offs in design
  • 00:13:49
    of a system to create a better overall
  • 00:13:51
    design the whole idea that the
  • 00:13:53
    considerations globally might not simply
  • 00:13:55
    be enumerating the entire list of local
  • 00:13:58
    tradeoffs but the we might get into
  • 00:13:59
    something that's qualitatively different
  • 00:14:01
    at the global level would be a synthesis
  • 00:14:04
    kind of idea assembling a team based on
  • 00:14:06
    strengths and weaknesses of a particular
  • 00:14:08
    set of individuals rather than thinking
  • 00:14:10
    there is one archetypal best contributor
  • 00:14:12
    in trying to have as many of those
  • 00:14:14
    people as we can the idea that we would
  • 00:14:16
    assemble a team Based On A diversity of
  • 00:14:18
    strengths and weaknesses and that the
  • 00:14:19
    whole would somehow be greater than the
  • 00:14:21
    parts would be a synthesis activity
  • 00:14:23
    adjusting overall project plans based on
  • 00:14:26
    progress of a set of individual teams so
  • 00:14:28
    this is really at the executive
  • 00:14:30
    management level would be a synthesis
  • 00:14:32
    activity in Bloom's taxonomy synthesis
  • 00:14:34
    or create is one of the highest
  • 00:14:37
    levels of
  • 00:14:38
    understanding. Let's dwell a little
  • 00:14:41
    bit more on synthesis so synthesis is
  • 00:14:44
    also known as creative thinking and uh
  • 00:14:47
    under Bloom's taxonomy as you can see
  • 00:14:49
    from the way it's represented as a
  • 00:14:50
    pyramid this is a higher level skill and
  • 00:14:53
    not as many people are good at it uh I
  • 00:14:55
    find I found over the years that
  • 00:14:57
    technical people often discount
  • 00:14:59
    the value of synthesis uh for example
  • 00:15:02
    it's very common for technical staff to
  • 00:15:03
    be skeptical of upper management which
  • 00:15:05
    really by its nature needs to be
  • 00:15:07
    more focused on synthesis and creation
  • 00:15:10
    than on analysis and so we actually see
  • 00:15:12
    not just an under appreciation but often
  • 00:15:14
    a disrespect for the whole synthesis
  • 00:15:17
    activity I think the bottom line here is
  • 00:15:19
    that software industry does a much
  • 00:15:20
    better job of recruiting and screening
  • 00:15:23
    for analysis skill than we do for
  • 00:15:25
    synthesis or creative skill despite the
  • 00:15:27
    fact that you could see from the list of
  • 00:15:29
    activities that I listed under synthesis
  • 00:15:31
    that there are some really important
  • 00:15:32
    activities uh that fall under that
  • 00:15:36
    umbrella the top level in Bloom's
  • 00:15:39
    taxonomy uh is Judgment or evaluation uh
  • 00:15:42
    and that means uh evaluate the value of
  • 00:15:45
    ideas Concepts principles or solution
  • 00:15:47
    methods for a given purpose and like
  • 00:15:50
    synthesis evaluation is one of the
  • 00:15:52
    highest levels of understanding um now
  • 00:15:56
    interestingly enough, judgment or
  • 00:15:59
    evaluation has been applied to Bloom's
  • 00:16:00
    taxonomy itself uh in the original
  • 00:16:02
    version of Bloom's taxonomy created in
  • 00:16:05
    1956 uh the terms were nouns going from
  • 00:16:08
    knowledge to evaluation with the two top
  • 00:16:11
    levels being synthesis second and
  • 00:16:13
    evaluation first the taxonomy was
  • 00:16:15
    revised in 2001 to revise the terms to
  • 00:16:18
    be verbs rather than nouns and the two
  • 00:16:20
    top levels were essentially switched so
  • 00:16:22
    that in 2001 the thinking was that
  • 00:16:24
    evaluation was the second uh highest
  • 00:16:26
    level and create was the highest level, so
  • 00:16:29
    kind of an interesting application of
  • 00:16:30
    judgment to Bloom's taxonomy itself uh I
  • 00:16:34
    personally interpret it more like this
  • 00:16:36
    to say that I'm not sure that I believe
  • 00:16:38
    that either create or evaluate is the
  • 00:16:40
    thing at the top I think they are both
  • 00:16:42
    at the top I think they both feed off
  • 00:16:45
    each other and they both depend on
  • 00:16:47
    analysis uh so really I think what we
  • 00:16:50
    have is really kind of two things at the
  • 00:16:51
    top um there may be others who know more
  • 00:16:53
    about Bloom's taxonomy than I do who
  • 00:16:55
    would uh dispute that but I think that's
  • 00:16:57
    a a pretty good way to look at
  • 00:17:00
    it so what are some examples of judgment
  • 00:17:02
    and software engineering well choosing
  • 00:17:05
    the better of two technology paths okay
  • 00:17:07
    that's something we need to do fairly
  • 00:17:08
    often uh choosing the best of three
  • 00:17:10
    design approaches that's something we
  • 00:17:12
    need to do very often uh justifying a re
  • 00:17:15
    architecture project kind of interesting
  • 00:17:17
    you know when is the right time to
  • 00:17:19
    actually make that leap well there's a
  • 00:17:21
    lot of judgment involved in that um
  • 00:17:23
    choosing which proposed projects best
  • 00:17:25
    support a business's objectives clearly
  • 00:17:27
    that involves uh uh making some
  • 00:17:29
    judgments or evaluations assessing the
  • 00:17:31
    degree to which a new methodology will
  • 00:17:33
    benefit an organization or harm it uh
  • 00:17:36
    certainly an example predicting the
  • 00:17:38
    likelihood of success of a project plan
  • 00:17:40
    another good example or conducting root
  • 00:17:43
    cause analysis on a failed project
  • 00:17:44
    that's I think actually the one that is
  • 00:17:46
    maybe the most questionable particularly
  • 00:17:49
    because it has the word analysis in it
  • 00:17:51
    so that makes it sound like maybe it has
  • 00:17:52
    more to do with analysis but the point
  • 00:17:54
    here is it's not just about the analysis
  • 00:17:56
    it's also about picking which which
  • 00:17:59
    factor actually or factors are actually
  • 00:18:01
    the ones that made the difference so
  • 00:18:03
    identifying the factors would be
  • 00:18:04
    analysis but then discriminating among
  • 00:18:07
    the factors and picking correctly the
  • 00:18:09
    ones that actually matter that would be
  • 00:18:11
    the uh judgment activity now I think uh
  • 00:18:15
    uh uh the difference between analysis
  • 00:18:17
    and judgment can be sometimes a little
  • 00:18:19
    bit hard to grasp but the basic idea I
  • 00:18:22
    think a way to look at it is that the
  • 00:18:24
    analysis is the ability to go very
  • 00:18:26
    far down a decision tree and along
  • 00:18:29
    multiple paths so the ability and I
  • 00:18:31
    don't know how well you'll be able to
  • 00:18:32
    see this on your screen but the ability
  • 00:18:34
    to go all the way down to a fine level of
  • 00:18:36
    detail and then discriminate by going
  • 00:18:38
    down other branches far down the tree
  • 00:18:40
    going down to other fine levels of
  • 00:18:42
    detail and being able to organize and
  • 00:18:44
    keep those straight and understand
  • 00:18:47
    comprehend all those paths at a fine
  • 00:18:49
    level of detail would really be
  • 00:18:51
    characteristic of analysis now judgment
  • 00:18:54
    on the other hand really has more to do
  • 00:18:57
    with deciding
  • 00:18:59
    uh which path is the right path to go
  • 00:19:01
    down for a particular purpose and this
  • 00:19:04
    is something where a lot of technical
  • 00:19:05
    people get hung up because they're so
  • 00:19:07
    focused on the analysis that they don't
  • 00:19:09
    look at the very top decision and say
  • 00:19:11
    you know what we can go deep in the tree
  • 00:19:13
    but that very first decision was
  • 00:19:15
    actually the one that was wrong uh and
  • 00:19:18
    from a judgment point of view we might
  • 00:19:19
    very well have a circumstance where any
  • 00:19:22
    path on the right is better than any
  • 00:19:25
    path on the left and further analysis on
  • 00:19:27
    the details of the paths on the left is
  • 00:19:30
    not going to change that fundamental uh
  • 00:19:32
    decision because we actually we actually
  • 00:19:36
    made the right or wrong decision at the
  • 00:19:38
    very very first decision on the tree uh
  • 00:19:41
    and so this is something that a lot of
  • 00:19:42
    technical people who are very
  • 00:19:44
    analytically focused have a really hard
  • 00:19:46
    time wrapping their brains around they
  • 00:19:48
    have a hard time wrapping their brains
  • 00:19:50
    around the idea that in some cases
  • 00:19:53
    further analysis doesn't matter that
  • 00:19:55
    it's okay to ignore details that more
  • 00:19:58
    details are actually not going to
  • 00:20:00
    influence the overall outcome of the
  • 00:20:02
    decision and so this would be
  • 00:20:04
    characteristic of judgment not analysis
  • 00:20:06
    but judgment to the degree that
  • 00:20:08
    technical people are uncomfortable with
  • 00:20:10
    that it's symptomatic of that
  • 00:20:12
    overdeveloped analysis muscle and may be
  • 00:20:15
    an underdeveloped judgment
  • 00:20:17
    muscle.
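
A minimal sketch in Python of the dominance point just made; the payoff numbers are illustrative assumptions, not figures from the talk. If every outcome reachable down the right branch beats every outcome reachable down the left branch, deeper analysis of the left branch cannot change the top-level choice.

```python
# Hedged sketch: illustrative payoffs, not data from the talk.
# The point: when the right subtree's worst case beats the left
# subtree's best case, further analysis of the left branch is moot.

left_branch_outcomes = [3, 5, 8]      # payoffs found by deep analysis on the left
right_branch_outcomes = [12, 15, 20]  # payoffs found by deep analysis on the right

left_best = max(left_branch_outcomes)
right_worst = min(right_branch_outcomes)

if right_worst > left_best:
    # The judgment call lives at the first decision, not in the leaves.
    print("Choose right; no amount of extra detail on the left changes this.")
else:
    print("The branches overlap, so deeper analysis could still matter.")
```
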
  • 00:20:20
    So one of the other interesting aspects of analysis and
  • 00:20:22
    judgment is because we're so analysis
  • 00:20:24
    heavy in software we often mistake uh
  • 00:20:27
    analysis for judgment. We will be in
  • 00:20:30
    circumstances that really do call for
  • 00:20:31
    judgment we'll see a heavy application
  • 00:20:33
    of analysis instead and we'll think that
  • 00:20:36
    that's judgment even though it's not so
  • 00:20:38
    examples uh might be uh something like
  • 00:20:40
    criticizing each of two technology paths
  • 00:20:43
    that's not judgment that is analyzing
  • 00:20:45
    each of the two paths uh finding faults
  • 00:20:48
    in three design approaches well you know
  • 00:20:50
    if you're tasked with choosing the best
  • 00:20:52
    of three design approaches then it kind
  • 00:20:54
    of looks like if you were to find faults
  • 00:20:55
    in each of those three that that might
  • 00:20:57
    kind of be sort of like judgment but
  • 00:20:59
    actually finding the faults would be the
  • 00:21:00
    analysis step and that would need to be
  • 00:21:02
    followed by a judgment step depending on
  • 00:21:05
    how blatant The Faults Are there might
  • 00:21:06
    not be a lot of analysis needed actually
  • 00:21:08
    because the right path might actually be
  • 00:21:11
    uh determined much higher in the
  • 00:21:13
    decision tree identifying limitations of
  • 00:21:16
    a current system to justify a re
  • 00:21:17
    architecture project uh identifying the
  • 00:21:20
    limitations fundamentally and Analysis
  • 00:21:22
    task. Advocacy for projects doesn't get
  • 00:21:25
    beyond advocacy of one
  • 00:21:28
    favored project so that's interesting
  • 00:21:30
    too you know we can advocate for one
  • 00:21:32
    project but that doesn't imply judgment
  • 00:21:34
    it really implies a lower level on
  • 00:21:36
    Bloom's taxonomy assessment of a new
  • 00:21:38
    methodology amounts to religious
  • 00:21:40
    advocacy for one methodology so again
  • 00:21:43
    there assessment is kind of in the same
  • 00:21:44
    family as evaluate or judge uh and uh
  • 00:21:48
    advocacy for one thing is not the same
  • 00:21:50
    as assessment of a set of things
  • 00:21:53
    assessment of project plans focuses on
  • 00:21:55
    minutia that's the whole idea of not
  • 00:21:56
    being able to see the forest for the
  • 00:21:58
    trees. We certainly see that fairly
  • 00:22:00
    often in our field or root cause
  • 00:22:03
    analysis on a failed project consists of
  • 00:22:05
    rehashing unpopular decisions, same kind
  • 00:22:07
    of issue where uh again rather than
  • 00:22:10
    really doing judgment uh as
  • 00:22:13
    counterintuitive as that might be for
  • 00:22:14
    something called root cause analysis uh
  • 00:22:16
    instead of applying judgment uh we
  • 00:22:18
    actually get hung up on just looking at
  • 00:22:21
    again at details that may or may not
  • 00:22:22
    really
  • 00:22:24
    matter so um judgment capability in my
  • 00:22:27
    view is even rarer than synthesis in
  • 00:22:29
    software engineering and I think we
  • 00:22:31
    hardly screen for judgment in our
  • 00:22:33
    recruiting and screening practices
  • 00:22:35
    at all uh for example Microsoft's famous
  • 00:22:38
    interview questions are nearly all about
  • 00:22:40
    synthesis uh Which is higher on Bloom's
  • 00:22:42
    taxonomy than typical interview
  • 00:22:44
    questions are, which are
  • 00:22:45
    often more about analysis but poor
  • 00:22:48
    business judgment is so common among
  • 00:22:50
    technical staff that it is a cliche uh
  • 00:22:52
    if you spend much time with a C Level
  • 00:22:54
    that is CEO coo CFO Executives in
  • 00:22:57
    companies they just shake their heads at
  • 00:23:00
    the business judgment demonstrated by
  • 00:23:01
    technical staff and I've had numerous
  • 00:23:03
    experiences where I would actually
  • 00:23:04
    confirm that myself so rather than just
  • 00:23:08
    kind of uh saying wow that's terrible
  • 00:23:10
    too bad we have to live with that the
  • 00:23:11
    $64 question is then how do we develop
  • 00:23:15
    good judgment in software professionals
  • 00:23:17
    if they don't come by it naturally and
  • 00:23:19
    if we're not recruiting for it how do we
  • 00:23:21
    develop that good judgment uh so that
  • 00:23:24
    that's not a weakness of our field and
  • 00:23:28
    uh I think there are a variety of ways
  • 00:23:29
    to do that but a way that I will propose
  • 00:23:31
    today is use of what I call the four
  • 00:23:32
    factors model I have another
  • 00:23:34
    presentation that goes into a lot of
  • 00:23:36
    detail on the four factors model uh but
  • 00:23:39
    um I will just summarize this very
  • 00:23:41
    briefly today the four factors model
  • 00:23:43
    basically says that if we have a deep
  • 00:23:45
    understanding of the influence of
  • 00:23:47
    project size or the product size of the
  • 00:23:51
    role of uncertainty in software projects
  • 00:23:52
    of the role that defects play and of
  • 00:23:55
    human variation, and of the myriad
  • 00:23:57
    interactions between these four
  • 00:23:59
    factors we have a good understanding a
  • 00:24:01
    deep understanding of those uh four
  • 00:24:03
    factors that goes a long way toward
  • 00:24:05
    explaining everything you need to know
  • 00:24:07
    basically about uh software Project
  • 00:24:09
    Dynamics um I'm not going to go through
  • 00:24:12
    these in a huge amount of detail but
  • 00:24:13
    size would include things like the idea
  • 00:24:15
    of diseconomy of scale the notion that
  • 00:24:18
    the larger the project gets the lower
  • 00:24:20
    per person productivity becomes uh
  • 00:24:22
    failure rate of large versus small
  • 00:24:24
    projects specializations of large versus
  • 00:24:26
    small projects mix of different kinds of
  • 00:24:28
    of activities on large versus small
  • 00:24:30
    projects, and so on.
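
As a rough illustration of diseconomy of scale, here is a minimal Python sketch assuming a COCOMO-style power law; the coefficients a and b are illustrative placeholders, not numbers from the talk.

```python
# Hedged sketch: COCOMO-style effort = a * size^b with b > 1, so total
# effort grows faster than size and per-person productivity falls.
# The coefficients are illustrative assumptions, not data from the talk.

def effort_person_months(size_kloc: float, a: float = 3.0, b: float = 1.12) -> float:
    return a * size_kloc ** b

for kloc in (10, 100, 1000):
    effort = effort_person_months(kloc)
    productivity = kloc * 1000 / effort  # lines of code per person-month
    print(f"{kloc:5d} KLOC: {effort:8.0f} person-months, {productivity:5.0f} LOC/PM")
```

Running it shows productivity per person-month shrinking as the project grows, which is the diseconomy-of-scale point.
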
  • 00:24:33
    Uncertainty is a huge topic; one could spend many hours
  • 00:24:35
    lecturing on that but includes a notion
  • 00:24:37
    of intellectual phases which uh I've
  • 00:24:39
    talked about a lot but otherwise it
  • 00:24:41
    hasn't really achieved much currency
  • 00:24:43
    cone of uncertainty which really has
  • 00:24:44
    achieved some currency uh and other
  • 00:24:47
    topics that you can see on the slide.
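
The cone of uncertainty can be made concrete with the approximate range multipliers McConnell popularized in Software Estimation; a minimal sketch follows, where the 12-month nominal estimate is an invented example.

```python
# Hedged sketch of the cone of uncertainty. The multipliers are the
# approximate ranges from McConnell's Software Estimation; the 12-month
# nominal estimate is an invented example.

CONE = [
    ("initial concept",             0.25, 4.00),
    ("approved product definition", 0.50, 2.00),
    ("requirements complete",       0.67, 1.50),
    ("UI design complete",          0.80, 1.25),
    ("detailed design complete",    0.90, 1.10),
]

nominal_months = 12.0
for milestone, low, high in CONE:
    print(f"{milestone:28s} {nominal_months * low:5.1f} to {nominal_months * high:5.1f} months")
```
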
  • 00:24:49
    Defects: really understanding the notion
  • 00:24:51
    of defect cost increase of the effect of
  • 00:24:54
    a lag between defect insertion and
  • 00:24:56
    defect detection uh the effect of
  • 00:24:59
    different defect removal techniques and
  • 00:25:01
    the consequences of putting those into a
  • 00:25:03
    series of removal techniques, and so on.
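
One way to see why a series of defect-removal techniques matters: each stage catches only a fraction of the defects that reach it, so the survival rates multiply. A minimal sketch, with removal efficiencies that are illustrative assumptions rather than figures from the talk:

```python
# Hedged sketch: defects surviving a series of removal techniques.
# Each efficiency is the fraction of incoming defects a stage removes;
# the numbers are illustrative assumptions, not data from the talk.

def defects_remaining(initial_defects: float, efficiencies: list[float]) -> float:
    remaining = initial_defects
    for e in efficiencies:
        remaining *= (1.0 - e)
    return remaining

# e.g. design review -> code review -> unit test -> system test
stages = [0.55, 0.60, 0.30, 0.40]
print(defects_remaining(1000, stages))  # ~76 of 1000 defects survive all four stages
```
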
  • 00:25:05
    again lots and lots of details that are
  • 00:25:07
    worth talking about there and then human
  • 00:25:09
    variation I think a huge factor that
  • 00:25:11
    doesn't get emphasized enough especially
  • 00:25:13
    not in a research context frankly
  • 00:25:15
    because the human variation actually
  • 00:25:17
    undermines a lot of the research that's
  • 00:25:19
    been done if we take it seriously and
  • 00:25:21
    yet if we really want to get to a point
  • 00:25:23
    where we have consistently successful
  • 00:25:24
    software projects I think it's really
  • 00:25:26
    important to understand the ins and
  • 00:25:28
    outs of how human variation affects
  • 00:25:30
    software project dynamics. So that's a
  • 00:25:32
    very very brief introduction to uh the
  • 00:25:35
    four factors um so the four factors
  • 00:25:39
    model essentially uh if we spin it the
  • 00:25:41
    right way can provide a set of templates
  • 00:25:44
    against which we compare what we see on
  • 00:25:45
    a project versus what we would expect to
  • 00:25:47
    see on healthy project and I think that
  • 00:25:50
    that can support judgment I mean if we
  • 00:25:52
    think about how people commonly talk
  • 00:25:54
    about judgment we often talk about it as
  • 00:25:57
    though judgment is something that comes
  • 00:25:58
    significantly from experience and
  • 00:26:01
    experience really is actually something
  • 00:26:03
    that kind of if you think about it uh
  • 00:26:05
    involves creating mental templates it
  • 00:26:07
    create it's a process of creating
  • 00:26:09
    expectations based on having seen
  • 00:26:11
    similar things over and over again and
  • 00:26:13
    then judgment can be supported by
  • 00:26:15
    comparing something new to the
  • 00:26:17
    expectations that we have formed based
  • 00:26:19
    on seeing a variety of things in the
  • 00:26:21
    past that would be one way to apply
  • 00:26:23
    judgment so one example of using the
  • 00:26:26
    four factors model to support judgment would
  • 00:26:28
    be to create checklists based on the
  • 00:26:31
    four factors and so I'm not going to
  • 00:26:33
    read through every point on these slides
  • 00:26:35
    as Will said in the introduction,
  • 00:26:37
    the presentation will be
  • 00:26:39
    recorded and archived and available so
  • 00:26:42
    if you're interested you can come back
  • 00:26:43
    and take a look at these details later
  • 00:26:46
    uh but we would have a checklist related
  • 00:26:47
    to size and it would have questions like
  • 00:26:49
    uh is the project estimated close to its
  • 00:26:51
    actual size, and that would imply, are
  • 00:26:53
    the project plans really scaled
  • 00:26:55
    appropriately for the size does the
  • 00:26:57
    project schedule permit completion of a
  • 00:27:00
    project of the estimated effort uh other
  • 00:27:02
    questions like uh do we have appropriate
  • 00:27:04
    staff specializations for the size of
  • 00:27:06
    the project do we have appropriate
  • 00:27:08
    management for the size of the project
  • 00:27:10
    uh and so on so basically a cluster of
  • 00:27:13
    issues related to size and we'll get
  • 00:27:15
    into more details on that when we look
  • 00:27:17
    at the case studies uh second of the
  • 00:27:19
    four factors would be uncertainty and
  • 00:27:21
    here too we have questions like do the
  • 00:27:23
    projects estimates and plans account for
  • 00:27:25
    the cone of uncertainty where will the
  • 00:27:27
    project's challenges come from in terms of
  • 00:27:29
    the intellectual phase profiles uh are
  • 00:27:32
    we addressing design uncertainty in
  • 00:27:34
    an appropriate way, are we addressing
  • 00:27:35
    requirements uncertainty and technology
  • 00:27:38
    uncertainty in appropriate ways, and
  • 00:27:41
    and other questions along these lines
  • 00:27:43
    and again we'll get into the application
  • 00:27:45
    of some of these as we look at uh some
  • 00:27:47
    of the case studies uh we also have a
  • 00:27:49
    defect checklist again uh you can look
  • 00:27:52
    at the details at your leisure perhaps
  • 00:27:54
    uh after the talk has been archived but
  • 00:27:56
    we're really looking at how effective is
  • 00:27:58
    the approach the project is taking to
  • 00:28:00
    detecting and removing defects in a
  • 00:28:02
    timely way uh really these questions
  • 00:28:04
    amount to answering that question uh and
  • 00:28:08
    finally human variation uh again here
  • 00:28:11
    are lots of important questions uh do
  • 00:28:12
    the people on the project have the
  • 00:28:14
    skills needed to complete a project of
  • 00:28:16
    the intended size uh do they have the
  • 00:28:18
    skills needed to complete a project with
  • 00:28:20
    the uncertainty characteristics
  • 00:28:21
    demonstrated on the project uh do they
  • 00:28:24
    have the skills needed to deliver
  • 00:28:25
    software at the intended or Target
  • 00:28:27
    quality level these are the kinds of
  • 00:28:29
    questions we look at when we want to
  • 00:28:30
    talk about human variation so these
  • 00:28:33
    checklists really are intended for
  • 00:28:35
    purposes of illustration I think they
  • 00:28:37
    are uh reasonable uh presentation of
  • 00:28:40
    checklists for presentation purposes
  • 00:28:42
    they aren't necessarily the be-all end-all
  • 00:28:44
    checklists uh uh somebody who wanted to
  • 00:28:47
    spend a lot more time looking at these
  • 00:28:48
    might be able to come up with better
  • 00:28:50
    checklists but I think they're good
  • 00:28:51
    enough to get the idea across of how we
  • 00:28:53
    could package these up and put them into
  • 00:28:56
    a model that would actually
  • 00:28:58
    allow us to practice using judgment in
  • 00:29:00
    looking at software projects and so
  • 00:29:02
    really what I've done for purposes of
  • 00:29:04
    this talk is to create a simplified
  • 00:29:06
    model where we simply have a red yellow
  • 00:29:08
    green classification for each of the
  • 00:29:11
    four factors size uncertainty defects
  • 00:29:13
    and human
  • 00:29:15
    variation.
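
A minimal sketch of that simplified red/yellow/green screen in Python. The verdict rule here, that two or more reds spell likely failure, is an illustrative assumption layered on the talk's point that multiple red flags doom a project regardless of any single good factor; it is not McConnell's exact rule.

```python
# Hedged sketch of the simplified four-factors screen. The "two reds"
# verdict rule is an illustrative assumption, not McConnell's exact rule.
from enum import Enum

class Rating(Enum):
    GREEN = "green"
    YELLOW = "yellow"
    RED = "red"

FACTORS = ("size", "uncertainty", "defects", "human_variation")

def verdict(ratings: dict[str, Rating]) -> str:
    reds = sum(1 for f in FACTORS if ratings[f] is Rating.RED)
    yellows = sum(1 for f in FACTORS if ratings[f] is Rating.YELLOW)
    if reds >= 2:
        return "likely failure: multiple red flags outweigh any single strength"
    if reds == 1 or yellows >= 2:
        return "at risk"
    return "healthy"

# healthcare.gov as rated in the case study below: size, uncertainty, and
# defects all red (the excerpt cuts off before human variation is rated,
# so a yellow is assumed here purely for illustration).
print(verdict({"size": Rating.RED, "uncertainty": Rating.RED,
               "defects": Rating.RED, "human_variation": Rating.YELLOW}))
```
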
  • 00:29:18
    So with all that as prologue, given that the title of the talk is
  • 00:29:20
    case studies in software engineering
  • 00:29:21
    judgment uh that lays some groundwork
  • 00:29:24
    for the Judgment part of the talk now
  • 00:29:26
    let's go ahead and take a look at uh
  • 00:29:28
    case studies part of the talk and here
  • 00:29:30
    uh we will look at a handful of case
  • 00:29:32
    studies, some of projects that went
  • 00:29:34
    well some of projects that didn't go so
  • 00:29:35
    well and apply that four factors model
  • 00:29:38
    and see if we can actually essentially
  • 00:29:40
    practice applying judgment to uh
  • 00:29:43
    software projects the first case study
  • 00:29:46
    uh the value of case studies I think is
  • 00:29:49
    um that we can improve judgment um I
  • 00:29:52
    think understanding the four factors
  • 00:29:54
    model which is really kind of baked into
  • 00:29:56
    the way I'm presenting the case studies
  • 00:29:58
    also support synthesis or creative in
  • 00:30:00
    the blooms taxonomy sense in in that we
  • 00:30:03
    really need to be applying synthesis or
  • 00:30:05
    creative in planning and management uh
  • 00:30:07
    and so really this is all just very
  • 00:30:09
    condensed form of practice of applying
  • 00:30:11
    judgment. You know, real
  • 00:30:13
    experience can be a great teacher but it
  • 00:30:15
    takes a long time uh I really am a fan
  • 00:30:17
    of case studies for condensing that kind
  • 00:30:19
    of uh experience into a much shorter
  • 00:30:21
    time frame so the first case study I'd
  • 00:30:24
    like to take a look at is healthcare.gov
  • 00:30:27
    certainly the highest profile
  • 00:30:29
    challenged software project we've had in
  • 00:30:31
    recent years and I think one that really
  • 00:30:33
    does benefit from a look at
  • 00:30:36
    uh the factors that were in play on that
  • 00:30:41
    project so the background uh for people
  • 00:30:44
    who uh don't read the newspaper every
  • 00:30:46
    day in the US is that the Affordable
  • 00:30:48
    Care Act was passed in December of 2009
  • 00:30:51
    signed into law in March
  • 00:30:52
    2010 private sector development
  • 00:30:55
    contracts were awarded in
  • 00:30:56
    2011 the original project budget was
  • 00:30:59
    about $100 million um coding by CGI the
  • 00:31:02
    prime contractor began in Spring
  • 00:31:05
    2013 for October 1st 2013 go live date
  • 00:31:10
    now if you know anything about software
  • 00:31:13
    Project Dynamics right away you should
  • 00:31:15
    be nervous about that statement coding
  • 00:31:17
    for a $100 million project began in spring
  • 00:31:20
    2013 for an October 1st go-live
  • 00:31:23
    date, but we will go on. Cost by
  • 00:31:25
    the time the system went live was almost
  • 00:31:27
    $300 million
  • 00:31:29
    uh and as most of the people who were in
  • 00:31:30
    the US at the time would know, when the
  • 00:31:32
    system went live it was plagued by slow
  • 00:31:33
    performance downtime lost data
  • 00:31:35
    incomplete functionality other problems
  • 00:31:38
    one estimate was that only 1% of people
  • 00:31:40
    were able to use the site as intended at
  • 00:31:43
    first so clearly a challenged project in
  • 00:31:45
    many respects now what's interesting
  • 00:31:48
    about this is how the people who were
  • 00:31:49
    involved in the project diagnosed the
  • 00:31:52
    problems with healthcare.gov so
  • 00:31:54
    contemporary newspaper reports meaning
  • 00:31:56
    October of 2013
  • 00:31:58
    characterized the problems with the site
  • 00:32:00
    as being not enough testing so you can
  • 00:32:02
    see from the headline here from USA
  • 00:32:04
    Today the government did not test the
  • 00:32:06
    healthcare site as needed and in case
  • 00:32:08
    you missed the fact that it was about
  • 00:32:09
    testing uh the subheading also says did
  • 00:32:12
    not test the health exchange website as
  • 00:32:14
    much as it needed to and in case it's
  • 00:32:16
    still not clear the first sentence in
  • 00:32:17
    the article says not enough tests
  • 00:32:19
    were performed and in case you still
  • 00:32:21
    haven't gotten the picture the quote in
  • 00:32:23
    the first sentence of the second
  • 00:32:24
    paragraph says the system just wasn't
  • 00:32:26
    tested enough so it seems pretty clear
  • 00:32:29
    that the people directly involved with
  • 00:32:31
    the project reporting the issues at the
  • 00:32:33
    time thought that the problems had
  • 00:32:34
    something to do with testing now let's
  • 00:32:37
    dig into that a little bit and see uh
  • 00:32:39
    whether we agree that that's the case
  • 00:32:41
    other details reported in articles at
  • 00:32:43
    the time said things like we all know we
  • 00:32:45
    were working under a compressed time
  • 00:32:47
    frame to launch this on October 1st well
  • 00:32:50
    compressed time frame says something
  • 00:32:52
    different to me than not tested enough I
  • 00:32:54
    wrote a whole book called rapid
  • 00:32:56
    development about development under
  • 00:32:57
    compressed time frames. There are
  • 00:32:59
    all kinds of dynamics that come into
  • 00:33:01
    play when you start working under a
  • 00:33:02
    compressed time frame that don't have a
  • 00:33:04
    whole lot to do with not enough testing
  • 00:33:07
    another report another comment said that
  • 00:33:09
    during house hearing contractors said
  • 00:33:11
    CMS decided at the last minute not to
  • 00:33:14
    allow people to shop for plans now
  • 00:33:16
    that's interesting decided at the last
  • 00:33:18
    minute what does that mean well to me
  • 00:33:20
    with my software guy hat on that says
  • 00:33:22
    there were late requirements changes in
  • 00:33:24
    the project which again doesn't really
  • 00:33:25
    have anything to do with testing another
  • 00:33:28
    comment officials were still changing
  • 00:33:29
    features of the website as late as the
  • 00:33:31
    last week of September for an October 1
  • 00:33:34
    release absolutely crazy and nothing to
  • 00:33:36
    do with testing absolutely has to do
  • 00:33:38
    with
  • 00:33:39
    requirements uh and then here's one they
  • 00:33:41
    had just two weeks to test the site
  • 00:33:42
    before all the pieces from several
  • 00:33:44
    contractors had to work together the day
  • 00:33:46
    of the launch now I could do a whole
  • 00:33:48
    talk on that one sentence uh this I
  • 00:33:50
    think is characteristic of projects that
  • 00:33:52
    start out being behind schedule
  • 00:33:54
    basically uh there's work going on in
  • 00:33:56
    the project that no one wanted to admit
  • 00:33:58
    needed to be done namely significant
  • 00:34:01
    integration work from the several
  • 00:34:03
    contractors but basically uh because
  • 00:34:05
    everyone was under so much schedule
  • 00:34:06
    pressure nobody took ownership of that
  • 00:34:09
    work and the work just didn't happen the
  • 00:34:11
    idea that they were going to just put
  • 00:34:12
    the pieces together at the last minute
  • 00:34:14
    and see if they worked on the day of the
  • 00:34:16
    launch is just laughable that's
  • 00:34:18
    something that people would have
  • 00:34:20
    reported uh as a bad practice in the 80s
  • 00:34:23
    we certainly should know better today uh
  • 00:34:25
    and then interestingly enough
  • 00:34:28
    determining many of the problems the
  • 00:34:29
    system would have after the various
  • 00:34:31
    parts were integrated was difficult
  • 00:34:32
    until the site actually went online wow
  • 00:34:35
    that's quite a statement too so they
  • 00:34:37
    weren't going to actually know whether
  • 00:34:38
    it worked until it went online uh a very
  • 00:34:41
    scary statement it was the agency's
  • 00:34:42
    responsibility to make sure all the
  • 00:34:44
    parts work together so again basically
  • 00:34:47
    uh there was too little time on the
  • 00:34:49
    project nobody wanted to take
  • 00:34:50
    responsibility by default it was the
  • 00:34:52
    agency's responsibility to make sure all
  • 00:34:54
    the parts work together but the agency
  • 00:34:55
    didn't have the expertise to do that
  • 00:34:57
    which is why they hired a contractor for
  • 00:34:59
    this project in the first place um the
  • 00:35:02
    technology is there to do that it just
  • 00:35:04
requires foresight. Now this galls me
  • 00:35:07
    more than anything else on this uh on
  • 00:35:09
    this slide which is it just requires
  • 00:35:11
    foresight is basically saying it just
  • 00:35:13
    requires an adequate amount of planning
  • 00:35:15
    which of course I would completely agree
  • 00:35:17
    with but it's sort of implying that they
  • 00:35:19
    didn't have an adequate amount of
  • 00:35:20
    planning which I think is actually quite
  • 00:35:22
    clear from the results so bottom line is
  • 00:35:25
    how does all this add up to not enough
  • 00:35:27
    testing it doesn't add up to not enough
  • 00:35:30
    testing it has very little to do with
  • 00:35:31
    not enough testing yet the people
  • 00:35:33
    intimately involved with the project
  • 00:35:35
    thought that testing was the culprit uh
  • 00:35:37
    when I think quite clearly just from
  • 00:35:39
    reading the newspaper reports at the
  • 00:35:41
    time you could tell that it really
  • 00:35:43
    didn't have very much to do with testing
  • 00:35:45
now interestingly enough if we applied the
  • 00:35:47
    four factors model uh which is
  • 00:35:50
    essentially what I did at uh my
  • 00:35:52
    executive uh Summit in 2013 if we look
  • 00:35:55
    at the size of the project it's a short
  • 00:35:57
    schedule it's a huge budget $100 million
  • 00:36:01
    huge staff ramp up there's a good
  • 00:36:02
    argument to be made that you cannot
  • 00:36:05
    meaningfully spend $100 million worth of
  • 00:36:08
    development effort in the amount of time
  • 00:36:09
    they had to do the project and clearly
  • 00:36:11
    the planning was not matched to the
  • 00:36:13
    project size I think this gets a pretty
  • 00:36:15
good red circle on the size
  • 00:36:18
factor. On uncertainty, we had numerous
  • 00:36:21
    immovable requirements or laws and so uh
  • 00:36:25
    those uh put some constraints on the
  • 00:36:27
    project but had massive and late
  • 00:36:29
    requirements changes and we also had
  • 00:36:31
some significant unprecedentedness in
  • 00:36:33
    the nature of the application that was
  • 00:36:34
    being built I would actually give this
  • 00:36:36
    uh uh characterize this as red on the
  • 00:36:38
    uncertainty Factor as well on the defect
  • 00:36:41
front, the approach to QA was not matched to
  • 00:36:44
    the size of the project or the nature of
  • 00:36:46
    the uncertainty we don't really even
  • 00:36:48
    have to talk about the details we can
  • 00:36:50
    just tell from the reports that it
  • 00:36:52
    really doesn't have to do with testing
  • 00:36:53
it has to do with end-to-end quality
  • 00:36:55
    assurance practices and here too I give
  • 00:36:57
    this a red circle now human variation
  • 00:37:00
interestingly enough — what about that? We
  • 00:37:02
    don't get a lot of information about how
  • 00:37:04
    good or bad the staff was on the project
  • 00:37:06
    and you know what it doesn't matter
  • 00:37:08
    because when we're talking about
  • 00:37:09
    software engineering judgment as opposed
  • 00:37:11
    to analysis if we've got red light on
  • 00:37:14
    the first three factors it doesn't
  • 00:37:16
    really matter whether the final factor
  • 00:37:18
    is red yellow or green we know that
  • 00:37:20
    we're going to have a failed project and
  • 00:37:22
    that goes back to that whole idea of
  • 00:37:23
    judgment not needing to know every
  • 00:37:25
    single detail
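
To make that coarse rule concrete, here is a minimal sketch in Python — mine, not McConnell's — that encodes only the claim just made: red ratings on size, uncertainty, and defects predict failure no matter how the human-variation factor is rated.

```python
# A minimal sketch (not from the talk) of the coarse judgment rule:
# red on size, uncertainty, and defects predicts failure regardless
# of how the human-variation factor is rated.

RED, YELLOW, GREEN = "red", "yellow", "green"

def coarse_verdict(size, uncertainty, defects, human_variation):
    """Return a rough project verdict from the four factor ratings."""
    if size == uncertainty == defects == RED:
        # The fourth factor can't save the project at this point.
        return "likely failure, regardless of staff quality"
    if RED in (size, uncertainty, defects, human_variation):
        return "at serious risk"
    return "no obvious red flags"

# healthcare.gov as assessed above: three reds, human variation unknown.
for hv in (RED, YELLOW, GREEN):
    print(hv, "->", coarse_verdict(RED, RED, RED, hv))
# All three lines print "likely failure, regardless of staff quality".
```
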
  • 00:37:27
    all right now interestingly enough uh
  • 00:37:29
the GAO, the Government Accountability Office,
  • 00:37:31
    issued a report in July of
  • 00:37:33
    2014 that concluded substantially the
  • 00:37:36
    same thing uh nine months after I gave
  • 00:37:39
    my talk in October of
  • 00:37:41
    2013 uh is that nine months maybe it was
  • 00:37:43
    November of 2013 that concluded
  • 00:37:45
    basically the same thing that
  • 00:37:46
healthcare.gov suffered from a rushed
  • 00:37:47
schedule, changing requirements, lax
  • 00:37:50
oversight of contractors, and a lack of
  • 00:37:52
    effective planning and oversight
  • 00:37:53
    practices and so the interesting thing
  • 00:37:56
    is I think that uh that I gave a very
  • 00:37:59
    similar assessment just a few days after
  • 00:38:01
    it went live just from reading the
  • 00:38:03
    newspaper and I believe that anyone can
  • 00:38:06
    do this if they understand the four
  • 00:38:08
    factors model so I'm not saying I have
  • 00:38:10
    any particular unique attributes in this
  • 00:38:12
    area or skills in this area but I do
  • 00:38:14
    think that I have a way of understanding
  • 00:38:16
    projects through the four factors model
  • 00:38:18
    that if others can apply that they would
  • 00:38:20
    understand things just as well now let's
  • 00:38:22
    turn to a more recent example uh as if
  • 00:38:24
    we didn't get enough problems with
  • 00:38:26
healthcare websites in
  • 00:38:28
2013, we had similar issues with
  • 00:38:31
a health care website in Oregon called
  • 00:38:33
Cover Oregon, and the background on
  • 00:38:36
Cover Oregon is that in 2011 Oregon
  • 00:38:39
    decided to develop its own State Level
  • 00:38:41
    Health exchange rather than use the
  • 00:38:42
    federal government's exchange and work
  • 00:38:45
began on Cover Oregon in 2012 for that
  • 00:38:47
    same October 1st 2013 go live date uh
  • 00:38:51
    Oregon contracted with Oracle to develop
  • 00:38:53
the exchange. They received
  • 00:38:55
    $300 million in Federal Grant money to
  • 00:38:57
    develop the site now from a judgment
  • 00:39:00
    point of view that's interesting too uh
  • 00:39:02
    the federal government budgeted their
  • 00:39:04
    project for the entire country at $100
  • 00:39:06
    million but for some reason they gave
  • 00:39:08
    the state of Oregon $300 million to
  • 00:39:10
    develop the site just for the one state
  • 00:39:12
    um I'll leave that as an exercise to the
  • 00:39:14
reader to figure out how that logic
  • 00:39:16
    worked out um so the exchange was still
  • 00:39:19
    not working in December 2013 and Oregon
  • 00:39:22
    reassigned 500 staff to process paper
  • 00:39:25
applications. By April 2014, the
  • 00:39:28
exchange was still not working; Cover
  • 00:39:30
    Oregon was closed and Oregon adopted the
  • 00:39:32
    federal program or the federal website
  • 00:39:33
    of healthcare.gov uh beginning at the
  • 00:39:36
    beginning of this year
  • 00:39:38
    2015 um the interesting thing about this
  • 00:39:41
    is that there was some business judgment
  • 00:39:42
    here that was about as bad as the
  • 00:39:44
    software judgment uh they spent
  • 00:39:46
    two-thirds of their $300 million budget
  • 00:39:48
    about $200 million they had signed up
  • 00:39:50
    more than 63,000 people uh for private
  • 00:39:54
insurance through the exchange, which
  • 00:39:55
    generated a per member per month fee of
  • 00:39:58
a little over $9. Now if we look at
  • 00:40:00
    that $200 million divided by 63,000
  • 00:40:03
    people means it costs them over $3,000
  • 00:40:06
per enrollee to enroll people in this
  • 00:40:09
    system and that's to just to access the
  • 00:40:11
    exchange that doesn't include the cost
  • 00:40:13
    of the actual Healthcare now if we do
  • 00:40:15
    some math on that they're getting
  • 00:40:16
$9.38 per month per enrollee, and that
  • 00:40:19
    implies that they would pay off the
  • 00:40:21
    system in 28 years so again just the
  • 00:40:25
    business judgment here is just kind of
  • 00:40:26
appalling and really goes to show that
  • 00:40:29
    absence of judgment is by no means
  • 00:40:31
    unique to software professionals uh and
  • 00:40:34
    all that is interesting except for the
  • 00:40:36
    fact that none of those people were
  • 00:40:37
    actually signed up using the exchange
  • 00:40:39
the exchange never actually worked so
  • 00:40:41
    the 63,000 people were actually signed
  • 00:40:44
    up on paper so essentially all that
  • 00:40:46
    money went to sign up no one uh which is
  • 00:40:49
an even worse matter still.
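
The arithmetic behind those figures is easy to replay. A quick worked example using only the numbers quoted above ($200 million spent, 63,000 enrollees, a $9.38 per-member-per-month fee):

```python
# Replaying the Cover Oregon business math quoted above.
spent = 200_000_000           # dollars spent (of the $300M federal grant)
enrollees = 63_000            # people signed up (on paper, as it turned out)
fee_per_member_month = 9.38   # per-member-per-month fee, in dollars

cost_per_enrollee = spent / enrollees                          # ~ $3,175
payback_years = spent / (enrollees * fee_per_member_month * 12)

print(f"cost per enrollee: ${cost_per_enrollee:,.0f}")
print(f"payback period:    {payback_years:.1f} years")         # ~ 28.2 years
```
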
  • 00:40:52
All right, so reported problems with Cover Oregon
  • 00:40:54
included: code quality is subpar; there is
  • 00:40:57
no impact analysis prior to coding; no
  • 00:40:59
peer review; details on software check-in,
  • 00:41:02
    checkout and merge processes are lacking
  • 00:41:05
    no skilled software development
  • 00:41:06
engineering manager; the
  • 00:41:09
    status reporting lacked basic
  • 00:41:10
    information including number of calendar
  • 00:41:13
days; had poor design, even worse code; the
  • 00:41:15
quality of work was atrocious; they broke
  • 00:41:18
    every single development best practice
  • 00:41:19
    that Oracle themselves have defined and
  • 00:41:21
these are all direct quotes from the
  • 00:41:23
assessment of the project: OHA and
  • 00:41:26
Cover Oregon lacked the skills, knowledge, or
  • 00:41:28
ability to be successful. So how would
  • 00:41:30
    this stack up using our four factors
  • 00:41:32
model? Well, on the size factor: that huge
  • 00:41:34
    budget huge staff ramp up and apparently
  • 00:41:38
    no aspect of the development was matched
  • 00:41:40
    to the large size of the project none of
  • 00:41:42
    it so we have to give that a red a red
  • 00:41:44
    rating a red light rating uncertainty
  • 00:41:46
we're going to skip for now. Defects: they
  • 00:41:49
had zero meaningful QA practices in
  • 00:41:50
evidence — again, red light on that pretty
  • 00:41:52
clearly. Human variation: they were
  • 00:41:54
massively underskilled according to
  • 00:41:56
    the report so we give that a red light
  • 00:41:58
    as well going back to uncertainty again
  • 00:42:01
    from a judgment point of view it really
  • 00:42:03
    doesn't matter if we've got three red
  • 00:42:05
    lights on size defects and human
  • 00:42:06
    variation that's pretty much all we need
  • 00:42:09
to know; the project's not going to
  • 00:42:10
    be successful with a bad performance in
  • 00:42:13
    those three areas no matter what
  • 00:42:15
    approach is taken to uncertainty now
  • 00:42:17
I would also say that these are
  • 00:42:19
    not just little red lights they're huge
  • 00:42:22
    red lights they were blatant massive
  • 00:42:24
    failures uh really across the board and
  • 00:42:27
    uh I think it's important to understand
  • 00:42:30
    that there is nothing subtle about what
  • 00:42:32
went wrong with this project. If
  • 00:42:34
    it hadn't been a project that was
  • 00:42:36
    performed uh last year I think you could
  • 00:42:39
    argue that it's even contrived for
  • 00:42:41
    purposes of presenting in this
  • 00:42:42
    presentation but this was a project that
  • 00:42:44
    was still ongoing to the tune of $200
  • 00:42:46
    million last year and that's interesting
  • 00:42:49
    in itself uh because what we see here is
  • 00:42:53
    that it isn't just subtleties in
  • 00:42:54
judgment — it's really large-grained
  • 00:42:56
    decisions on this project were wrong uh
  • 00:43:00
    and so we don't have to get down into
  • 00:43:01
    the details we should understand that
  • 00:43:03
    the problems on this project were so
  • 00:43:05
    conspicuous that this case study seems
  • 00:43:07
    almost contrived to make a point but
  • 00:43:09
    it's not you would think we would have
  • 00:43:11
    learned these lessons decades ago but
  • 00:43:13
    this project was still ongoing very
  • 00:43:16
    recently now there's a very nice book
  • 00:43:18
called Made to Stick, which is a
  • 00:43:20
    marketing book but it describes
  • 00:43:21
    something called The Curse of knowledge
  • 00:43:23
    the curse of knowledge is basically
  • 00:43:25
    where you understand something so well
  • 00:43:27
    that you become incapable of explaining
  • 00:43:29
    it to others and I've been doing this
  • 00:43:31
    software development thing for about 30
  • 00:43:32
    years now my first book came out 21
  • 00:43:35
years ago — or actually 22 years
  • 00:43:38
    ago so I've been doing this for a long
  • 00:43:40
    time the more time goes by the more
  • 00:43:42
    difficulty I have knowing what is
  • 00:43:44
    obvious to other people and what is not
  • 00:43:46
the problems with this project
  • 00:43:48
    seem completely obvious to me and yet
  • 00:43:51
    somehow this project was allowed to go
  • 00:43:53
    wrong by intelligent people and in a
  • 00:43:56
government context where there are
  • 00:43:57
    multiple levels of oversight to the tune
  • 00:44:00
    of $200 million and I find it almost
  • 00:44:03
    inconceivable that this was allowed to
  • 00:44:05
    happen and yet clearly it
  • 00:44:08
    did so I think it's pretty clear that
  • 00:44:11
    the problem in this case was not the
  • 00:44:13
    absence of analysis it wasn't subtle
  • 00:44:15
    miscalculations or subtle errors in
  • 00:44:17
    judgment but it was gross errors in
  • 00:44:19
    judgment I think we end up asking the
  • 00:44:21
wrong question when we ask the
  • 00:44:23
    question what went wrong with this
  • 00:44:24
    project we assume that it was ever
  • 00:44:26
    positioned for success and something
  • 00:44:27
    specific went wrong really I think the
  • 00:44:30
    more meaningful question to ask is why
  • 00:44:31
    did anyone ever think this project would
  • 00:44:34
    be successful and I think that is
  • 00:44:36
    actually a much harder question to
  • 00:44:38
    answer uh all right so
  • 00:44:42
    uh so let's skip ahead then to uh our
  • 00:44:46
    third case study which is the Chrysler
  • 00:44:47
    C3 project and now we're going to look
  • 00:44:50
    at uh projects that aren't so
  • 00:44:51
    conspicuously bad as the first two
  • 00:44:53
    projects we looked at the C3 project of
  • 00:44:56
    course is the original extreme
  • 00:44:57
    programming project now on this project
  • 00:45:00
    uh Chrysler wanted to replace a number
  • 00:45:02
of different legacy COBOL payroll
  • 00:45:04
    systems with one system um the project
  • 00:45:06
    didn't make much progress from 1993 to
  • 00:45:08
    1995 and in 1996 Kent Beck was hired to
  • 00:45:11
    build the system he in turn hired Ron
  • 00:45:13
Jeffries. Kent and Ron implemented pair
  • 00:45:16
programming, continuous integration,
  • 00:45:17
on-site customer, unit testing,
  • 00:45:19
refactoring, "you ain't gonna need it" —
  • 00:45:21
    all the practices that eventually became
  • 00:45:23
extreme programming. The initial release was
  • 00:45:25
2 months late on a 12-month schedule,
  • 00:45:27
    which the team considered to be
  • 00:45:28
basically on time. After that, progress
  • 00:45:31
    for the next few years was mixed and
  • 00:45:32
    characterized by just one more
  • 00:45:34
    requirement syndrome they had some
  • 00:45:36
    difficulty actually making further
  • 00:45:37
    releases and then further development
  • 00:45:39
    was finally halted when Daimler bought
  • 00:45:41
    Chrysler in
  • 00:45:42
    2000 so from the four factors model what
  • 00:45:46
    does this look like well from the size
  • 00:45:48
    point of view this is a pretty small
  • 00:45:49
    project and the plan scope was pretty
  • 00:45:52
    close to the real scope so I give it a
  • 00:45:53
    green light on the size category on the
  • 00:45:56
uncertainty side, I think payroll is a well
  • 00:45:58
    understood area but we had this very uh
  • 00:46:01
    kind of rich and possibly contradictory
  • 00:46:04
set of legacy systems, so I think there's
  • 00:46:05
    some uncertainty that arises from that
  • 00:46:08
    uh on the defect side I think the
  • 00:46:10
    practices for removing defects were
  • 00:46:12
    reasonable uh pair programming and so on
  • 00:46:14
    and match to the size of the project I
  • 00:46:16
    think the reality is this is not a high
  • 00:46:18
    defect potential project in the first
  • 00:46:20
    place because it's essentially taking
  • 00:46:21
    existing system or systems and simply
  • 00:46:24
    reproducing them in a different
  • 00:46:25
    technology and on the human variation we
  • 00:46:28
    have to give this a green circle not
  • 00:46:30
    only do we have to give it a green
  • 00:46:31
    circle we have to give it a huge green
  • 00:46:33
    circle because we've got a couple of
  • 00:46:35
    Rockstar programmers we've got Kent Beck
  • 00:46:37
we've got Ron Jeffries. You know, on a small
  • 00:46:40
    project with this kind of human uh
  • 00:46:43
    high-end of human contribution it's
  • 00:46:45
    actually going to be pretty hard for us
  • 00:46:46
    to be unsuccessful as long as those guys
  • 00:46:49
    are making uh the kinds of contributions
  • 00:46:50
    they're capable of making so um so what
  • 00:46:55
    else can we say about uh the the
  • 00:46:57
    Chrysler C3 Project based on this four
  • 00:47:00
    factors model what would I say surprises
  • 00:47:02
    me about this project my answer is
  • 00:47:04
    nothing nothing surprises me about this
  • 00:47:06
    project uh because of the human
  • 00:47:09
    variation that we see on the project I
  • 00:47:10
    certainly don't think there was any
  • 00:47:12
    secret sauce uh of the extreme
  • 00:47:14
    programming practices that I would
  • 00:47:15
    consider significant on that project uh
  • 00:47:18
    and if I asked the question I asked on
  • 00:47:20
the Cover Oregon project — why did anyone ever
  • 00:47:22
    think this project would be successful I
  • 00:47:24
    think it's pretty clear why it was
  • 00:47:26
    successful we had great talent we had
  • 00:47:28
    reasonable practices and the lesson here
  • 00:47:31
    is not about extreme programming it's if
  • 00:47:33
    you pay attention to the needs of the
  • 00:47:35
    project and plan and execute accordingly
  • 00:47:37
    uh in terms of uh handling uh variation
  • 00:47:40
    and uncertainty in terms of having good
  • 00:47:42
    people on the project in terms of your
  • 00:47:44
    defect uh uh practices then the project
  • 00:47:46
    will be successful so as I said nothing
  • 00:47:49
    about this project really surprises me I
  • 00:47:51
think, at least in the first
  • 00:47:53
version, it worked fine, and I
  • 00:47:55
    think that's easy to explain
  • 00:47:58
    all right the last case study we'll look
  • 00:48:00
at today is the Cheyenne Mountain ATAMS
  • 00:48:02
    project uh I think pretty clearly a
  • 00:48:05
successful project, and I think
  • 00:48:07
we'll see why. The US Air Force
  • 00:48:10
had a Cheyenne Mountain upgrade project
  • 00:48:12
    that was originally scheduled to last
  • 00:48:13
    six years and cost just under a billion
  • 00:48:16
dollars. Thirteen years later, the Government
  • 00:48:18
Accountability Office estimated that CMU, the
  • 00:48:21
Cheyenne Mountain upgrade project, was a
  • 00:48:22
    billion dollars over budget and 11 years
  • 00:48:25
behind schedule. The new systems
  • 00:48:27
had not been completed, and those that had been
  • 00:48:28
completed were not even usable. So in
  • 00:48:31
    this context CMU managers commissioned
  • 00:48:34
Kaman Sciences to conduct a specific
  • 00:48:36
project called ATAMS, and the goal was to
  • 00:48:38
    replace displays on 20 monitors with
  • 00:48:41
    displays on just two monitors and
  • 00:48:42
improve response time at the same time
  • 00:48:45
    the project constraint was a schedule of
  • 00:48:46
    one year and a budget of $2
  • 00:48:49
    million so the response to this is that
  • 00:48:51
Kaman Sciences appointed an experienced
  • 00:48:53
    project manager um they conducted
  • 00:48:55
development with an intact 11-person
  • 00:48:58
    development team uh the team extensively
  • 00:49:00
    prototyped the user experience and when
  • 00:49:03
    they did that they found that user
  • 00:49:04
demands turned the two-message, four-display
  • 00:49:06
system into a 57-message, 35-display
  • 00:49:09
system, but this was discovered
  • 00:49:11
    very early during prototyping at which
  • 00:49:14
    point the team tackled the riskiest
  • 00:49:15
    elements first and they proceeded to uh
  • 00:49:18
    buy the risk down by working from
  • 00:49:19
    highest risk item to lowest risk item in
  • 00:49:21
a disciplined way. Design reviews caught
  • 00:49:24
    more than 200 major defects and 500
  • 00:49:26
    minor defects at design time at a cost
  • 00:49:28
    of slightly less than one staff hour per
  • 00:49:30
    defect found now if we had let those
  • 00:49:32
    defects persist into design or
  • 00:49:34
    construction of course it would have
  • 00:49:36
    cost a lot more than one staff hour per
  • 00:49:38
    defect found so not great news to find
  • 00:49:40
    those but it certainly prevented the
  • 00:49:41
    even worse news of finding them much
  • 00:49:43
    later in the
  • 00:49:44
project.
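
As a rough back-of-the-envelope on why that early detection mattered: the defect counts and the roughly one-staff-hour cost come from the talk, while the tenfold late-fix multiplier below is an assumption in line with commonly cited rules of thumb, not a figure from the talk.

```python
# Back-of-the-envelope value of the design reviews described above.
major_defects = 200
minor_defects = 500
hours_per_defect_at_design = 1.0   # "slightly less than one staff hour"
late_fix_multiplier = 10           # ASSUMPTION: a common rule of thumb,
                                   # not a number from the talk

design_time_cost = (major_defects + minor_defects) * hours_per_defect_at_design
late_cost = design_time_cost * late_fix_multiplier

print(f"fixed at design time: {design_time_cost:.0f} staff-hours")   # 700
print(f"if found late (10x assumed): {late_cost:.0f} staff-hours")   # 7000
```
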
  • 00:49:47
Root cause analysis was performed for each defect found,
  • 00:49:49
    technical peer reviews continued
  • 00:49:50
    throughout the project and active
  • 00:49:52
    management was conducted to ensure the
  • 00:49:54
    peer reviews were performed in a timely
  • 00:49:56
    way uh the team adopted a standard of
  • 00:49:58
    perfecting each component that is
  • 00:50:00
    removing all defects before moving on to
  • 00:50:02
    the next component project status and
  • 00:50:05
    task status were displayed in a graphic
  • 00:50:07
    format that anyone can understand and
  • 00:50:09
    project management used the status
  • 00:50:10
    information to seek out project risks
  • 00:50:13
    and actively address
  • 00:50:15
them. The result of this was that
  • 00:50:18
    this project was delivered one month
  • 00:50:20
    early on a 12-month schedule and only
  • 00:50:22
    two defects were found uh within the
  • 00:50:24
    first 16 months of operation and so I
  • 00:50:26
    think really by any measure this has to
  • 00:50:29
    be considered to be a highly successful
  • 00:50:33
    project so how would we score this in
  • 00:50:35
    terms of the four factors model size
  • 00:50:38
    this is actually a pretty small project
  • 00:50:40
    11 people for a year um this is not
  • 00:50:42
    causing me any heartburn in terms of the
  • 00:50:44
    the project challenge related to the
  • 00:50:46
    scope of the project uncertainty well
  • 00:50:48
    clearly there was some uncertainty here
  • 00:50:50
    there were some significant requirements
  • 00:50:52
    changes but those changes were
  • 00:50:54
    discovered early uh and then the project
  • 00:50:56
    actively attacked uncertainty in
  • 00:50:58
    requirements in quality in project plans
  • 00:51:00
    in Risk in general and so while there
  • 00:51:02
    was some inherent uncertainty on the
  • 00:51:04
    project I think the actions of the team
  • 00:51:06
    kept that uncertainty in the yellow zone
  • 00:51:09
    and kept it from drifting into the Red
  • 00:51:10
    Zone which from some of the other
  • 00:51:11
    project teams we've seen today would
  • 00:51:14
    easily have drifted into the Red Zone in
  • 00:51:15
a big way. Defects — I think this is a
  • 00:51:19
    great story early requirements defect
  • 00:51:21
    detection through prototyping detected
  • 00:51:23
    tons of defects and allowed them to be
  • 00:51:24
    fixed early and cheaply they thoroughly
  • 00:51:27
    reviewed uh their work on an ongoing
  • 00:51:29
    basis again detecting defects early
  • 00:51:31
    keeping them under control there was a
  • 00:51:33
    cultural focus in the project on
  • 00:51:34
    maintaining high quality through that
  • 00:51:36
    standard of perfecting each component
  • 00:51:38
    before moving on and consistent
  • 00:51:40
    application of high discipline at the
  • 00:51:41
    team level also reinforced by the
  • 00:51:43
    manager on the project to make sure the
  • 00:51:45
    team followed through on its good
  • 00:51:46
    intentions so I would say another
  • 00:51:48
    another green light here and then human
  • 00:51:50
    variation again pretty good story
  • 00:51:52
    skilled project team intact project team
  • 00:51:54
    so people who are used to working
  • 00:51:56
    together and know how to work with each
  • 00:51:57
    other and skilled management and overall
  • 00:52:00
    this leads to a successful project and
  • 00:52:03
    I'm not sure that I see anything
  • 00:52:04
    terribly surprising here either actually
  • 00:52:07
    so when we compare that to the
  • 00:52:09
    commonalities from some of the other
  • 00:52:10
    projects we've seen the first two case
  • 00:52:12
    studies I mentioned people on the
  • 00:52:14
    project seem unable to identify even
  • 00:52:16
    basic Dynamics on their own projects
  • 00:52:18
    even in hindsight we see quite a
  • 00:52:20
contrast here: there was an awareness of
  • 00:52:22
    risk and explicit steps taken to address
  • 00:52:24
    the risks and then the project ends up
  • 00:52:26
    being
  • 00:52:29
    successful so if we apply the other
  • 00:52:32
    question I've asked a couple times why
  • 00:52:33
    did anyone ever think this project would
  • 00:52:35
    be successful here again we have an
  • 00:52:38
    affirmative answer there are lots of
  • 00:52:39
    reasons for this project to be
  • 00:52:40
successful — not really terribly
  • 00:52:43
    mysterious
  • 00:52:46
    actually and then uh another common
  • 00:52:48
question: problems are not subtleties but
  • 00:52:50
gross errors in judgment. I don't think
  • 00:52:52
we see any gross errors in judgment in
  • 00:52:54
    this project in fact the causes of
  • 00:52:56
    success in this project seem as
  • 00:52:58
    conspicuous as the causes of failure did
  • 00:53:00
    on the other projects which really kind
  • 00:53:02
    of says we're not really talking about
  • 00:53:04
    subtleties in judgment we're really
  • 00:53:06
    talking about fairly coarse judgment uh
  • 00:53:09
    the causes of success are conspicuous
  • 00:53:10
    here the causes of failure on the other
  • 00:53:12
    projects are equally conspicuous and I
  • 00:53:14
    actually think that's that's pretty
  • 00:53:16
representative of the software world at
  • 00:53:18
large. My company has spent almost 19
  • 00:53:20
    years uh doing assessments of projects
  • 00:53:23
    that have been successful and of
  • 00:53:24
    projects that have failed or been
  • 00:53:25
    challenged and would say that in general
  • 00:53:28
    this is characteristic of what we find
  • 00:53:30
    that it is not very subtle that if you
  • 00:53:31
    know what to look for once you go in and
  • 00:53:33
look at it, you see that the
  • 00:53:36
    deficiencies are actually quite uh
  • 00:53:38
    glaring so let's uh summarize what we've
  • 00:53:40
    covered uh today uh I think uh as I said
  • 00:53:44
    earlier most of what I've described
  • 00:53:46
    today seems obvious to me possibly
  • 00:53:48
    because of that curse of knowledge I've
  • 00:53:50
    been doing this for a long time and uh
  • 00:53:52
I've kind of lost the ability
  • 00:53:54
    to tell what's obvious to other people
  • 00:53:55
    and what's not
  • 00:53:57
    uh but one common theme in the failed
  • 00:53:58
projects is that basic project dynamics
  • 00:54:01
    were not obvious to the people involved
  • 00:54:02
in these projects. And giving credit where
  • 00:54:04
    credit is due these are not stupid
  • 00:54:06
    people these are highly intelligent
  • 00:54:07
    people and they're often missing the key
  • 00:54:09
    points even in hindsight as we saw in
  • 00:54:11
    those comments on healthcare.gov from
  • 00:54:13
    USA Today so the question really arises
  • 00:54:16
    how can people who are so smart make
  • 00:54:18
    such bad decisions and I think the
  • 00:54:20
    answer comes back to that distinction
  • 00:54:22
    between analysis and evaluation or
  • 00:54:25
judgment. Software professionals tend
  • 00:54:26
    to be very strong in analysis uh so I
  • 00:54:29
    don't think the deficiency in analysis
  • 00:54:31
    is really the issue but of course I do
  • 00:54:33
    think that the deficiency in judgment is
  • 00:54:35
    the issue which can lead to not just
  • 00:54:37
    subtle errors in judgment but in some
  • 00:54:39
    cases gross errors in judgment and for
  • 00:54:42
    this reason I think that a focus on
  • 00:54:43
    developing judgment in software
  • 00:54:45
    professionals is important uh perhaps
  • 00:54:47
even more important than in professions
  • 00:54:49
    that do not select so strongly for
  • 00:54:51
    analysis skills in the first place with
  • 00:54:54
    that I will turn the microphone back
  • 00:54:56
over to Will and see if we have a
  • 00:54:57
    few minutes for comments or
  • 00:55:01
    questions thank you Steve uh yes and we
  • 00:55:04
    do have questions and uh it just shows
  • 00:55:08
    I'm very impressed with the maturity of
  • 00:55:10
    our attendees uh we have some very good
  • 00:55:13
    questions here and let's just move on um
  • 00:55:17
    doesn't the use of checklists reduce
  • 00:55:20
    judgment to
  • 00:55:23
    analysis yeah I think that's a fair
  • 00:55:25
question: doesn't the use of checklists
  • 00:55:27
    reduce judgment to analysis the way I
  • 00:55:29
    would describe it is that it puts the
  • 00:55:32
first rung of the judgment ladder low
  • 00:55:35
    enough so that someone whose main
  • 00:55:37
    strength is analysis can jump onto that
  • 00:55:39
    first rung of the ladder at least that
  • 00:55:40
would be the way I would look at it.
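
One way to picture that answer: a checklist turns an open-ended evaluation into discrete questions that an analysis-oriented reviewer can answer one at a time. A hypothetical sketch — the questions are invented for illustration, not taken from McConnell's materials:

```python
# Hypothetical project-review checklist -- how a checklist can give an
# analysis-minded reviewer a first rung on the judgment ladder. The
# questions are illustrative inventions, not McConnell's.
CHECKLIST = [
    "Is the plan matched to the size of the project?",
    "Have requirements stopped changing close to the release date?",
    "Is there end-to-end quality assurance, not just late testing?",
    "Does someone own the cross-contractor integration work?",
]

def red_flags(answers):
    """answers maps each question to True (ok) or False (red flag)."""
    return [q for q in CHECKLIST if not answers.get(q, False)]

# healthcare.gov, per the discussion above, would trip every item.
print(len(red_flags({q: False for q in CHECKLIST})), "red flags")  # 4
```
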
  • 00:55:43
Very good. Okay, next question: what happens if
  • 00:55:45
    you have two Reds and two greens uh does
  • 00:55:48
the uncertainty factor dominate the judgment — you know,
  • 00:55:50
which one would be the
  • 00:55:53
highest? Yeah, so that's a really
  • 00:55:55
    good question that would require going
  • 00:55:57
    into a lot more detail on the four
  • 00:55:59
    factors model and in particular going
  • 00:56:01
    into the interaction effects between
  • 00:56:03
    each of the four factors because there
  • 00:56:05
    are interactions between the factors in
  • 00:56:08
    every combination of the factors and
  • 00:56:11
    what that means really is that I think
  • 00:56:13
    you would have a really hard time
  • 00:56:15
    constructing even a contrived example
  • 00:56:17
    where you had two Reds and two greens um
  • 00:56:20
    just because of those interactions uh if
  • 00:56:23
    you really did have two Reds there would
  • 00:56:25
be interaction effects with the things
  • 00:56:27
that you want to have as two greens, and I
  • 00:56:29
think you would have a hard time — I don't think
  • 00:56:30
    you're going to see that in practice and
  • 00:56:32
    and I think even if you tried to
  • 00:56:33
    contrive an example you would have a
  • 00:56:35
    hard time contriving an example that
  • 00:56:37
    wasn't just
  • 00:56:39
absurd.
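
Purely as an illustration of what such interaction effects might look like (the talk asserts they exist but gives no formal model), a red rating on one factor can be made to drag neighboring factors toward red, which is why "two reds and two greens" is hard to even contrive:

```python
# Purely illustrative sketch of interaction effects between factors --
# the directions of influence below are assumptions, not talk content.
RED, YELLOW, GREEN = 2, 1, 0   # higher means worse

def with_interactions(size, uncertainty, defects, human_variation):
    """Degrade neighboring factors when another factor is red."""
    if size == RED:
        # A plan mismatched to project size tends to amplify both
        # uncertainty and defect outcomes (assumed influence).
        defects = min(RED, defects + 1)
        uncertainty = min(RED, uncertainty + 1)
    if human_variation == RED:
        defects = min(RED, defects + 1)  # weak staff -> more defects
    return size, uncertainty, defects, human_variation

# Try to contrive "two reds, two greens": the reds pull the greens down.
print(with_interactions(RED, GREEN, GREEN, RED))  # -> (2, 1, 2, 2)
```
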
  • 00:56:43
Very good. And I just noticed that in the last two minutes I
  • 00:56:45
    received an additional 35 questions
  • 00:56:48
    which obviously in the time that we have
  • 00:56:50
    are not going to be able to get to but
  • 00:56:53
    if Steve is so kind and has the time to
  • 00:56:56
answer the questions, we will post
  • 00:56:58
    his answers on our website as usual uh
  • 00:57:02
    at least the ones that he thinks would
  • 00:57:04
— well, he'll do the final filtering. So,
  • 00:57:07
okay, moving on: in the four-factor model,
  • 00:57:11
where does the project lifespan factor
  • 00:57:15
in? Where is the project lifespan? I'm not
  • 00:57:18
    sure what is meant by project lifespan
  • 00:57:20
does that mean like an ongoing program for
  • 00:57:22
    several years or I'm not I'm not sure
  • 00:57:24
how to interpret it. I would say: is it a
  • 00:57:26
one-time delivery or, you know,
  • 00:57:29
something that's anticipated to have
  • 00:57:32
several versions to roll
  • 00:57:35
out — a product that will live on beyond
  • 00:57:38
    its initial release yeah okay so that
  • 00:57:40
    would be basically the service life of
  • 00:57:42
    the software that's produced yes yeah so
  • 00:57:45
I would basically say that
  • 00:57:48
    comes in particularly on the the defect
  • 00:57:52
    side uh um on the uncertainty side and
  • 00:57:56
    depending you know may or may not come
  • 00:57:57
    in on the size side but um that's kind
  • 00:58:00
    of where I would put that I think
  • 00:58:02
you know, Will, with your
  • 00:58:04
    background you probably are more
  • 00:58:05
    equipped to answer that question than I
  • 00:58:07
    am uh but I would say that I think one
  • 00:58:09
    of the failure modes that people run
  • 00:58:11
    into with projects that are anticipated
  • 00:58:13
    to have a long service life is trying to
  • 00:58:15
    anticipate to too high a degree what uh
  • 00:58:19
    capabilities might be required of that
  • 00:58:21
    system in the future and not focusing
  • 00:58:23
    enough on the requirements of that
  • 00:58:25
    system at the present time um and my you
  • 00:58:29
    know in the agile movement we've had
  • 00:58:30
this whole idea of YAGNI, or you ain't
  • 00:58:32
    going to need it uh and uh so basically
  • 00:58:35
    focusing on design for today not for
  • 00:58:37
    tomorrow I I think that's the wrong way
  • 00:58:40
to look at it, frankly. And the
  • 00:58:42
    agile movement has also criticized big
  • 00:58:44
design up front; they say BDUF as a
  • 00:58:46
derogatory term. I don't think
  • 00:58:49
    that's the real issue we've spent quite
  • 00:58:50
a bit of time at my company trying
  • 00:58:52
    to really put our finger on exactly what
  • 00:58:54
    is the issue I do think that that's
  • 00:58:56
    circling around an issue and what we've
  • 00:58:58
    really decided is that the issue is what
  • 00:59:01
we've come to describe as
  • 00:59:03
    design for speculative requirements and
  • 00:59:06
    anytime you find yourself designing for
  • 00:59:08
    speculative requirements I think that
  • 00:59:11
creates something that basically gets you
  • 00:59:14
    pretty close to a red circle in the
  • 00:59:16
    uncertainty box um and we have a hard
  • 00:59:19
    enough time coming up with good designs
  • 00:59:21
    for the requirements we know about the
  • 00:59:23
    idea that we're going to speculate about
  • 00:59:24
    requirements correctly and then that
  • 00:59:26
    we're actually going to come up with
  • 00:59:28
    effective designs on top of that for
  • 00:59:30
    those requirements we're speculating
  • 00:59:31
about, I think, is a really tall order.
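
As a small, invented illustration of designing for speculative requirements (the example is mine, not from the talk): the first class below carries hooks for providers, currencies, and policies no one has asked for yet, while the second is designed for the one requirement actually in hand.

```python
# Invented illustration (not from the talk) of speculative design vs.
# designing for the requirement you actually have.

class SpeculativePaymentGateway:
    """Speculative: hooks for providers, currencies, and retry policies
    that are still imaginary requirements."""
    def __init__(self, providers=None, currency_converters=None,
                 retry_policy=None, audit_sinks=None):
        self.providers = providers or {}
        self.currency_converters = currency_converters or {}
        self.retry_policy = retry_policy
        self.audit_sinks = audit_sinks or []

class PaymentGateway:
    """Designed for today: one provider, one currency, no speculation."""
    def __init__(self, provider):
        self.provider = provider

    def charge(self, amount_cents: int) -> bool:
        # Delegate to the single provider the current requirement names.
        return self.provider.charge(amount_cents)
```
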
  • 00:59:34
You know, I think some of that — actually I
  • 00:59:37
think most of it, as I think out
  • 00:59:39
loud here — in terms of a long-lived
  • 00:59:41
    system the success or failure would
  • 00:59:43
    probably be determined most in the
  • 00:59:45
uncertainty factor: basically, how is the
  • 00:59:48
    team addressing the uncertainty of a
  • 00:59:50
    long lifespan of that system are they
  • 00:59:52
    getting overly preoccupied with things
  • 00:59:54
    that aren't pinned down yet that may or
  • 00:59:56
    may not happen at some point in the
  • 00:59:57
    future so I that's the way I would look
  • 00:59:59
    at it very good I I'm going to thank all
  • 01:00:04
    the people you know we're we're
  • 01:00:05
    approaching a 100 questions here and I'm
  • 01:00:07
just going to try to move off target
  • 01:00:10
    here a bit and talk about
  • 01:00:12
    professionalism because there were
  • 01:00:15
several questions dealing with what I
  • 01:00:17
    call professionalism the role of the
  • 01:00:19
    software engineer versus how do you
  • 01:00:22
    handle management how do you deal with
  • 01:00:24
customers; how, as a professional, do you
  • 01:00:27
    as an individual then embrace the model
  • 01:00:31
    uh the the the four factors the Bloom's
  • 01:00:34
    taxonomy in yourself and how do you go
  • 01:00:37
    about becoming a better person and then
  • 01:00:39
    influencing the overall outcome of uh
  • 01:00:42
    you know project
  • 01:00:44
    success yeah you know in philosophy
  • 01:00:46
    there's a concept called the principle
  • 01:00:47
    of Charity and the basic idea of the
  • 01:00:49
    principle of Charity is that if some
  • 01:00:53
    historically significant philosopher
  • 01:00:54
    wrote something down they weren't idiots
  • 01:00:57
    and there was probably some reason for
  • 01:00:58
    it it may not ultimately be correct but
  • 01:01:01
    it's also not just completely stupid and
  • 01:01:04
Jim McCarthy in his book Dynamics of
  • 01:01:06
Software Development kind of said the
  • 01:01:08
    same thing when he said don't flip the
  • 01:01:10
    bozo bit you know don't just say Okay
  • 01:01:12
    this guy's a bozo so I'm not taking
  • 01:01:14
    anything he says seriously technical
  • 01:01:16
    people tend to flip the bozo bit on
  • 01:01:18
    upper management or non-technical
  • 01:01:20
    managers or sales and marketing people
  • 01:01:23
    pretty readily in my experience and I
  • 01:01:25
    think part of
  • 01:01:26
improving our skills — not in analysis, but
  • 01:01:30
rather in
  • 01:01:33
synthesis or creation, and in judgment or
  • 01:01:35
    evaluation really is actually being more
  • 01:01:37
    charitable in our interactions with
  • 01:01:39
    upper management sales and marketing and
  • 01:01:41
    so on those folks are often very
  • 01:01:44
    intelligent people as well but the
  • 01:01:45
    nature of their intelligence is
  • 01:01:47
    different than the nature of the
  • 01:01:48
    software staff's intelligence they are
  • 01:01:50
    not selected primarily on the basis of
  • 01:01:52
    their analysis skill they may be
  • 01:01:53
    selected primarily on the basis of their
  • 01:01:56
    synthesis skill or their evaluation
  • 01:01:58
    skill and so we can actually learn a lot
  • 01:02:01
    from them if we actually go into those
  • 01:02:03
    interactions with an open mind and don't
  • 01:02:05
    flip the bozo bit on them so that would
  • 01:02:06
be my knee-jerk reaction
  • 01:02:08
    to that question very good and I'm
  • 01:02:10
    afraid we've run out of time today so
  • 01:02:12
    I'd like to thank Steve once again for
  • 01:02:14
    his informative presentation and
  • 01:02:16
    insightful answers to the many questions
  • 01:02:20
    and a special thanks for you out there
  • 01:02:22
    taking the time to attend and
  • 01:02:24
    participate today this webinar was
  • 01:02:26
    recorded and will be available online in
  • 01:02:28
a few days at
  • 01:02:32
learning.acm.org/webinar. You can find announcements of
  • 01:02:34
upcoming webinars and other ACM
  • 01:02:36
activities at learning.acm.org and
  • 01:02:39
    acm.org and then now we're going to ask
  • 01:02:42
    if you wouldn't mind uh filling out a
  • 01:02:45
    quick survey where you can suggest
  • 01:02:46
    future topics or speakers which uh which
  • 01:02:49
    as I said should be on your screen the
  • 01:02:52
so in closing, this is Will Tracz
  • 01:02:54
saying goodbye for now. Thanks again
  • 01:02:56
    for joining us hope you will join us
  • 01:02:58
    again in the future and in particular
  • 01:03:01
    our next webinar speaker is tentatively
  • 01:03:03
scheduled to be Bertrand Meyer speaking
  • 01:03:06
on Agile!: The Good, the Hype and the Ugly
  • 01:03:11
    so take care and uh talk to you
  • 01:03:24
soon.
Tags
  • ACM
  • software engineering
  • Bloom's taxonomy
  • judgment
  • case studies
  • four factors
  • Steve McConnell
  • technology projects