00:00:01
here okay so we are recording hello
00:00:04
hello so glad you could make it Christy
00:00:07
and hopefully we'll get some others in
00:00:08
here but you never you never
00:00:11
know so welcome to module one uh this
00:00:15
week what you're going to do we can pull
00:00:17
up the screen here one moment um
00:00:20
basically we're going to talk a little
00:00:22
bit about ethical theories and
00:00:25
specifically consequentialist ethical
00:00:28
theories such as utilitarian ethics
00:00:30
which is about doing the greatest good
00:00:32
for the greatest number and good for
00:00:35
most not all utilitarians is happiness
00:00:37
or pleasure and kind of contrasting that
00:00:40
with ethical egoism which is you know if
00:00:43
it's personal ethical egoism it's like I
00:00:46
should do what's in my best interest and
00:00:48
you do whatever you want if it's
00:00:50
universal ethical egoism I would say you
00:00:52
should do what's in your best interest I
00:00:54
should do what's in my best interest and
00:00:56
those people over there should all do
00:00:58
what's in their best interest too and of
00:01:01
course any of the theories uh we study
00:01:04
throughout this semester you know
00:01:06
there's going to be people that really
00:01:08
support it and we're going to look at
00:01:09
the critiques of these theories too
00:01:11
right like what are the advantages and
00:01:14
disadvantages of these uh theories so oh
00:01:18
Tiana glad to see you too so a little
00:01:21
bit about ethics right it's a branch of
00:01:26
philosophy um and it just kind of gets
00:01:28
down to
00:01:30
what do we believe is morally right and
00:01:33
more importantly why why do we believe
00:01:36
it how do we know what's right or wrong
00:01:39
what kind of moral principles or ethical
00:01:42
foundations guide our decisions right so
00:01:46
some folks um may be moral absolutists
00:01:48
they might believe the death penalty
00:01:51
is wrong whether you're in France or
00:01:53
Texas or lived in the 1700s or lived in
00:01:57
2025 things are either right or wrong
00:02:00
there's cultural
00:02:02
relativism this idea that you should do
00:02:05
whatever um is the cultural norm or the
00:02:09
cultural values there's subjective
00:02:11
relativism which is whatever you think
00:02:14
is right or wrong is actually right or
00:02:16
wrong right for you and you know
00:02:19
everything's got its problems moral
00:02:21
absolutism folks will say well aren't
00:02:24
there legitimate exceptions to some of
00:02:26
these rules like don't kill or something
00:02:29
but what about self-defense you know
00:02:31
that kind of thing um for cultural
00:02:34
relativism well there have been cultures
00:02:36
that have institutionalized slavery
00:02:38
refused to let women have the right to
00:02:41
vote we could go on and on right so um
00:02:45
is that necessarily moral just because
00:02:47
the majority of folks in a culture
00:02:50
institutionalized it you know that
00:02:52
doesn't necessarily seem right and then
00:02:54
subjective relativism if just if if
00:02:57
there's no standard and we just all
00:02:59
think different things then I'll give
00:03:01
you an example that usually resonates
00:03:03
with students let's say if I promise you
00:03:06
hey if you get 90% or above you'll get
00:03:08
an A you're like Jennifer I got 96% I'm
00:03:12
like yeah I give you a c like that's not
00:03:14
right well I'm a subjective relativist I
00:03:16
don't believe in promises you know you
00:03:18
might be like but I'm going to tell the
00:03:19
students they better believe in
00:03:21
promises right there's going to be this
00:03:23
idea that promises are morally correct
00:03:27
but are they you know are they always
00:03:28
morally correct we can dive into that
00:03:31
too but ethics is really to me and I
00:03:34
love ethics a little bit about me um I
00:03:37
got my Master's Degree at Purdue and
00:03:39
back then it was all logic analytic
00:03:42
philosophy philosophy of math
00:03:45
Computability and Logic the course I
00:03:47
took oh my goodness and the philosophy
00:03:50
of math class I'll be honest kind of
00:03:52
broke me and I'm like I'm gonna study
00:03:54
something else so I went to Michigan
00:03:56
State and got engaged in applied ethics
00:03:59
and
00:04:00
and social and political philosophy and
00:04:02
Global Justice and things like that
00:04:04
because I just I was taking up so much
00:04:07
time like trying to figure out if
00:04:09
numbers exist or not and I and I don't
00:04:12
know you know it depends on how you
00:04:13
think about it um but was that a waste
00:04:16
of time no I really started to
00:04:20
understand at a very deep level how our
00:04:23
metaphysical
00:04:25
foundations shape our ethical
00:04:27
perspectives do you believe in a God or
00:04:29
gods um do you believe that people have
00:04:31
free will or how much are they
00:04:34
determined by where and when they're
00:04:35
born who their parents are that sort of
00:04:38
thing um what's human nature is it
00:04:41
basically good or bad right all these
00:04:43
different foundational issues that are
00:04:46
really hard to prove but people build a
00:04:49
lot of their lives around them right and
00:04:52
so and then their ethics often times
00:04:54
comes from those metaphysical
00:04:56
foundations so it's it's all real
00:04:59
fascinating
00:05:00
um we won't really have a lot of time to
00:05:02
go that deeply into it so basically what
00:05:06
I do each week is I show the assignment
00:05:08
go over that a little bit see if y'all
00:05:10
have any questions um and then I have a
00:05:13
PowerPoint I try to go over that and
00:05:16
just kind of go over some of the
00:05:17
concepts and ideas each week um you know
00:05:21
we're going to talk about physician
00:05:23
assisted death is that moral very
00:05:25
controversial topics like abortion
00:05:28
rights and that sort of thing um
00:05:30
immigration big topic right now in the
00:05:33
news and so we talk about controversial
00:05:36
uh topics and we really focus on the
00:05:39
topics and I really want you to truly do
00:05:42
your level best to look at the best
00:05:44
argument for an opposing position what
00:05:46
was the very very best argument you
00:05:48
found for an opposing position and if
00:05:50
you don't agree with it why or why not
00:05:53
right um and and it's usually a fun
00:05:56
class we usually have folks with all
00:05:58
kinds of diverse perspectives which
00:06:01
makes it fun so every once in a while
00:06:03
very rarely every once in a while you
00:06:04
get a class where they all tilt I don't
00:06:06
know one way or the other I never never
00:06:09
know which way it'll be right however
00:06:10
the chips may fall and if there's an
00:06:12
underrepresented position you may see
00:06:14
me chiming in on the discussion board well
00:06:16
what about this argument or that
00:06:18
argument just so we have the opportunity
00:06:21
right to look at ethics through some
00:06:23
different perspectives so that is what
00:06:26
this class is about I hope you're
00:06:28
excited for for it um I love teaching
00:06:31
ethics I think it's great great fun
00:06:34
but we'll see we'll see what y'all think
00:06:36
um so first of
00:06:38
all try to share my screen
00:06:41
here there we go um and I told you we're
00:06:45
going to talk about ethics in um in
00:06:48
what's called consequentialist
00:06:50
ethics um which is what is ethical you
00:06:53
have to look to the Future to know what
00:06:55
are the consequences next week we'll
00:06:57
talk about non-consequential theories
00:06:59
that say Hey the intention matters more
00:07:01
than the
00:07:03
consequences okay so you'll have a
00:07:06
discussion um board which some folks
00:07:08
have already done let's start to think
00:07:10
about ethics and where you might have
00:07:12
seen them outside of the class right
00:07:14
where do you currently encounter the
00:07:15
idea of ethics and moral principles in
00:07:17
your everyday life um how much do you
00:07:21
consciously think about ethics and your
00:07:23
everyday uh decision making uh does your
00:07:26
workplace talk about ethics if so how
00:07:28
often what issues does it bring up uh if
00:07:31
you've never heard or seen your
00:07:33
workplace ever consider ethics uh what
00:07:36
consequences might that have have you
00:07:38
seen news stories about situations where
00:07:40
ethical concerns were specifically
00:07:42
mentioned um and those are just um some
00:07:45
of the questions right you you don't
00:07:48
have to necessarily answer all of the
00:07:50
questions and if you have something else
00:07:52
related to ethics you want to talk about
00:07:54
that's fine uh the discussion boards are
00:07:56
usually what are called relatively like
00:07:58
low-stakes grading
00:08:00
assignments right they're not the big
00:08:01
essays or anything like that they're
00:08:04
just a chance for you to kind of share
00:08:06
your thoughts demonstrate you know of
00:08:08
course understanding of some of the
00:08:09
concepts and then provide two
00:08:12
substantive responses uh to other
00:08:15
students posts you know um and
00:08:17
substantive just means don't say I agree
00:08:20
good point or something right you want
00:08:22
to say this is why I agree or this is
00:08:25
why it's a good point or this is why I
00:08:27
disagree you know or hey check out this
00:08:29
article related to what you were just
00:08:31
talking about whatever it is right so
00:08:34
like I don't really care what side of an
00:08:35
issue you're on or what your position is
00:08:37
or that but you know for this how are
00:08:39
you advancing the discussion are we all
00:08:41
learning from each other that's that's
00:08:43
really
00:08:45
important okay then we'll
00:08:48
go um to your assignment of um module
00:08:52
one ethical egoism and utilitarian
00:08:56
ethics just what's the strength of each
00:08:58
Theory what's the weakness of each
00:09:00
Theory um what are these theories in
00:09:02
your own words um and then which ethical
00:09:05
Theory do you believe is the strongest
00:09:09
and why now be careful with AI right
00:09:12
some students use it to brainstorm or
00:09:14
whatever you don't want to just like put
00:09:16
the discussion prompts into AI and copy
00:09:19
it down first of all if you get found
00:09:21
out on that that's plagiarism right and
00:09:23
that gets a plagiarism uh report there's
00:09:25
even a little checkbox for AI if I don't
00:09:29
know for sure a lot of times I might say
00:09:31
well I'm not sure why you thought this
00:09:34
was related to the reading material
00:09:36
let's have a discussion about it just to
00:09:38
kind of understand what's going on
00:09:40
explain why you chose to write about
00:09:43
what you do and that sort of thing but
00:09:45
one of the things I've personally found
00:09:46
with AI is it never has an opinion now
00:09:49
I've heard someday it might but right
00:09:51
now as far as I've seen it still doesn't
00:09:54
so it will always answer which one is
00:09:56
strongest well that depends on your
00:09:58
metaphysical foundations and your
00:10:00
perspective and you know and and that
00:10:03
you know it's all different well I'm not
00:10:05
asking do different people have
00:10:07
different views wouldn't that be funny
00:10:09
like hey you think different people have
00:10:10
different views yeah different
00:10:12
perspectives depends of course we all
00:10:14
know that even before we step into an
00:10:16
ethics class I think so it's what do you
00:10:19
think and why and that's going to be the
00:10:22
heart of it this is a normative ethics
00:10:25
class right so what ought we to do and
00:10:28
why you know and so you're going to
00:10:30
explain what you think we ought to do
00:10:32
and why which hopefully will be fun for
00:10:34
you right if if you're just like like
00:10:37
just
00:10:38
regurgitating theories that gets kind of
00:10:40
boring when I say well if like you were
00:10:42
in charge of the world what would you do
00:10:44
and why hopefully it gets a little more
00:10:46
exciting there uh then you have the
00:10:49
module one assignment um two the
00:10:52
utilitarian response to the news so so
00:10:57
you choose a story from the last 90 days
00:10:59
that presents an ethical issue uh you
00:11:01
can use the McMillan Library databases
00:11:04
um those can be a little bit tricky for
00:11:06
some students to to find articles so if
00:11:09
you just want to use Yahoo news or or
00:11:12
Reuters or AP news or if you've got a
00:11:15
subscription to New York Times or
00:11:18
something like that whatever it is as
00:11:20
long as it's not satire I've had students
00:11:22
use satire sites that are not true don't
00:11:24
do that but otherwise whatever or if you
00:11:27
go into the McMillan Library database
00:11:29
you can you know you can set a search
00:11:31
for newspaper articles and things like
00:11:34
that you can always ask a librarian they
00:11:36
have a little chat ask a library and hey
00:11:38
how do I find this they'll be happy to
00:11:40
help you but if that's too much you can
00:11:42
just use the internet that's fine right
00:11:45
um and then you'll create a deliverable
00:11:47
a paper a video or a PowerPoint
00:11:49
presentation I love it I usually get
00:11:51
some of each and that really makes me
00:11:53
happy um craft a response to the
00:11:56
specific ethical issue as a utilitarian
00:11:58
would respond um do you agree that
00:12:01
utilitarian ethics is the best way to
00:12:03
address the issue why or why not and
00:12:05
then use your own ethical reasoning to
00:12:07
support your position of course don't
00:12:09
forget to cite your news story in apa
00:12:11
format have a link to it however you
00:12:14
accessed it give me that information so
00:12:16
I can access it right um and those
00:12:20
are your assignments uh for this
00:12:22
week uh obviously a lot of Topics in the
00:12:26
news that could relate to ethics um so
00:12:29
just pick one that interests you uh
00:12:31
pretty much and then the other
00:12:34
assignment of course is the tech live
00:12:36
Reflections now uh Christy and Tiana
00:12:39
are here you don't have to do anything
00:12:41
you don't even have to write I was there
00:12:44
right so the only bad thing is you'll
00:12:46
see a zero at first like like on Sunday
00:12:49
night at midnight I think it rolls
00:12:51
Sunday at night like right 11:59 p.m.
00:12:53
into Monday morning it rolls over into
00:12:56
zeros automatically and I don't grade
00:12:58
the Tech Lives until after the due
00:13:01
date has passed for all the people that
00:13:03
are still submitting you'd be surprised
00:13:05
how many people you know honestly submit
00:13:07
things 11:57 on Sunday 11:58 11:59
00:13:11
sometime one time I had 11 students like
00:13:13
in the last three minutes so I always
00:13:15
wait until Monday to grade them so
00:13:17
you'll see zeros at least till Monday and
00:13:19
everything will always be graded by
00:13:20
Wednesday maybe Monday Tuesday Wednesday
00:13:23
depending how all the week's going um
00:13:27
but at any rate so you have that um and
00:13:31
and then I'll go ahead and I what I do
00:13:35
is I look at the statistics so if I
00:13:36
forget uh who was there that week you
00:13:38
know especially week one right then I go
00:13:41
into statistics they'll say oh Christy
00:13:42
and Tiana are there and I just put in
00:13:44
the tens at that time so just simply
00:13:46
being here you get the 10 points um
00:13:49
everybody else you got to write a little
00:13:51
reflection paragraph you know what's an
00:13:53
unanswered question you have what did you
00:13:55
learn that sort of
00:13:57
thing all right so now and if you have
00:14:01
any questions uh feel free to throw them
00:14:04
in you can just type in the chat if you
00:14:06
want you can unmute your microphone you
00:14:08
can use your video whatever makes you
00:14:11
happy truly
00:14:13
um communicate how you wish yeah but
00:14:16
you're not you know you're not required
00:14:18
to communicate uh to get your 10 points
00:14:22
now I got a I got yeah I got this a
00:14:25
little out of order
00:14:26
so I had it um my folders by date
00:14:31
instead of in alphabetical
00:14:34
order now I got to find Indiana Tech
00:14:37
ethics there we
00:14:45
go
00:14:47
right we're opening this up
00:14:52
now and it'll be up in just a
00:14:55
moment yes so the lesson does explain
00:14:58
what is meant by the um utilitarian
00:15:00
ethical view let me go ahead I'm just
00:15:03
clicking on there I actually um I was
00:15:06
really to me it was important to try to
00:15:09
set this up um I designed the course
00:15:12
just so you know and it was important
00:15:14
for me to set it up without hopefully
00:15:16
relying on a a textbook that you had to
00:15:19
purchase right I was hoping to avoid
00:15:22
that and and I did but how I've done it
00:15:26
is let me let me go back in here a
00:15:28
minute so module one you'll go into
00:15:33
um I clicked on the wrong thing um the
00:15:37
module one lesson you'll click on that
00:15:40
when you click on that you will see what
00:15:43
um what I have done is you got the you
00:15:46
know module learning objectives why the
00:15:49
lesson is important and that stuff what
00:15:51
is ethical egoism and I've put some
00:15:54
links some of the stuff is links like
00:15:56
crash course links uh some of it is
00:15:59
articles on the internet uh some of it
00:16:02
has been um some of it's my writing
00:16:06
where I'm just writing what it is um and
00:16:09
so I tried to put in a lot of videos a
00:16:12
lot of other different things uh for you
00:16:14
to to look at um and so that is uh how
00:16:19
how I have set that up and uh you can
00:16:23
you can just go through that at your
00:16:24
leisure but yeah it's all it's all in
00:16:26
there under module one
00:16:29
lesson a good
00:16:32
question but I'm going to talk a little
00:16:34
bit about that um today too
00:16:37
so but yeah the crash course videos
00:16:40
you'll see I tried to put a lot of Crash
00:16:42
Course videos in here because I think
00:16:44
they are um really good at explaining
00:16:47
what you're reading right um in a really
00:16:50
fun interesting way and if we'd have
00:16:52
bought a big expensive textbook nothing
00:16:54
against them except for the the the cost
00:16:56
sometimes where I'm like oh but um but a
00:16:59
lot of times the crash courses do do
00:17:01
well and I you know I like I said my PhD
00:17:04
specialization is ethics so I know this
00:17:07
um they really do a good job of taking
00:17:09
what you would find in most intro to
00:17:11
ethics textbooks and explaining the
00:17:13
major points so I love those things and
00:17:15
I've included those wherever I
00:17:18
can okay so when we think about moral
00:17:21
theories what makes an action right uh
00:17:24
what makes a person or a thing good you
00:17:27
know is there even such a thing as good
00:17:29
people and bad people or they're just
00:17:31
people and and sometimes we do good
00:17:34
actions and sometimes we do bad ones you
00:17:36
know people debate that too um and then
00:17:39
what's the difference between a moral
00:17:41
theory and a moral code moral codes are
00:17:44
rules they'll say follow these rules in
00:17:46
the workplace follow this follow that
00:17:48
follow this list of you know of of
00:17:51
Commandments or rules or what have you
00:17:53
but then the moral theory is the why
00:17:56
behind it so you might have um like
00:17:59
philosophy of religion arguments about
00:18:02
why you should do what's pleasing to God
00:18:04
or um you know the ethical egoism and
00:18:06
the utilitarian ethics will say well you
00:18:09
know if we're trying to decide what's
00:18:10
right or wrong we can't just you know be
00:18:13
arbitrary capricious right we want to
00:18:15
have reasons hopefully at least
00:18:17
according um to most of these ethical
00:18:19
theories so what are the reasons right
00:18:23
and and you sometimes see three
00:18:25
categories um consequentialist ethics
00:18:28
non-consequentialist ethics virtue
00:18:30
ethics they can kind of um Twist and
00:18:32
Turn For example Divine command Theory
00:18:34
kind of religious ethics is often
00:18:37
considered non-consequentialist because
00:18:39
you're supposed to follow you know
00:18:41
whatever the Quran or the Bible whatever
00:18:43
you're I mean in Indiana mostly
00:18:45
Christianity though we do have some
00:18:47
Muslims and Buddhists and and pagans and
00:18:49
and so on right um but whatever it is
00:18:52
that whatever you think you should do um
00:18:56
to meet the requirements of your
00:18:58
religion right regardless of
00:19:00
consequences so thou shalt not kill
00:19:03
regardless of consequences type of thing
00:19:05
but I have had religious people say hey
00:19:07
I'm religious because I heard I'm going
00:19:09
to get to heaven if I am and and hell if
00:19:12
I don't so that's the reason I'm
00:19:13
religious so he kind of could for some
00:19:15
people maybe fit into a consequential
00:19:18
category but most people say no I just
00:19:20
do it because it's the right thing to do
00:19:22
which then puts it back into the
00:19:25
non-consequentialist category things
00:19:27
like that and even it's so fun because
00:19:29
even Thou shalt not kill right I've
00:19:31
taught these classes for Years first of
00:19:34
all you got all kinds of different
00:19:35
religions um you've got atheists
00:19:37
whatever all kinds of people with
00:19:39
different views but even in one religion
00:19:41
like uh Christianity being the most
00:19:43
popular in Indiana you'll have some
00:19:45
people that say thou shalt not kill
00:19:47
that's why I'm an absolute pacifist
00:19:49
right you'll have others that say thou
00:19:52
shalt not kill but I I think it's okay
00:19:54
to be a soldier I think it's okay to
00:19:57
support the death penalty think it's
00:19:59
okay to kill in self-defense and then
00:20:01
you'll have others that fall maybe in
00:20:03
between absolute pacifism and
00:20:06
self-defense the death penalty Soldier
00:20:09
type of thing you know so even how you
00:20:11
interpret that with the rest of a sacred
00:20:14
text so many people I promise you
00:20:16
interpret it you probably already know
00:20:17
that but interpret it so differently
00:20:19
same with utilitarian ethics what should
00:20:21
we do that's the greatest good for the
00:20:23
greatest number I'll have two people
00:20:25
that are utilitarians completely
00:20:27
disagree about what's the greatest good
00:20:30
for the greatest number now like I said
00:20:32
the beauty is I could care less you know
00:20:35
what side of an issue you're on but you
00:20:37
just have to briefly explain your
00:20:39
utilitarian calculation to me for
00:20:41
example you know that's kind of how this
00:20:44
uh class works just to give you a a feel
00:20:46
for it virtue ethics uh that we should
00:20:48
focus on the character of the person
00:20:50
what kind of habits we develop that's
00:20:52
sometimes gets its own Theory um but you
00:20:56
now here we go if you have not already
00:20:58
done the reading material right then
00:21:01
this will give you
00:21:02
a jump start um first actions are to be
00:21:06
judged right or wrong solely in virtue
00:21:08
of their consequences Nothing Else
00:21:11
Matters right actions are simply those
00:21:13
that have the best consequences so it's
00:21:15
a very future-looking uh philosophy right
00:21:19
it's not um it's not well you're getting
00:21:22
punished because you deserve it you can
00:21:24
only say well you're getting punished
00:21:26
because we believe that will be the
00:21:28
greatest good for the greatest number in
00:21:29
the end right that kind of a thing um so
00:21:33
second in assessing consequences for for
00:21:36
most classic utilitarians now there's
00:21:39
value pluralists and things like that
00:21:41
that say well knowledge should count or
00:21:44
or Integrity should count fair enough
00:21:46
but for most classic utilitarians
00:21:48
they're going to say the amount of
00:21:50
happiness or unhappiness is what matters
00:21:53
everything else is not relevant thus right
00:21:55
actions are those that produce the
00:21:56
greatest balance of happiness
00:21:59
over unhappiness third in calculating
00:22:01
the happiness or unhappiness that will
00:22:03
be caused no one's happiness is to be
00:22:06
counted as more important than anyone
00:22:08
else's um so and we'll see like we're
00:22:11
going to see when we get into Ross's
00:22:12
prima facie duties theory some folks say
00:22:15
well that's a little bit hardcore right
00:22:17
like don't we owe maybe our
00:22:20
caregivers a duty of responsibility or a
00:22:23
duty of gratitude our children a duty of
00:22:26
responsibility things like that but for
00:22:29
John Stuart Mill he says the happiness
00:22:31
which forms the utilitarian standard of
00:22:34
what is right in conduct is not the the
00:22:37
agent's own happiness but the overall
00:22:41
happiness right so it's not that your
00:22:43
own happiness doesn't count at all but
00:22:46
it counts as one person right and then
00:22:48
there's all these other people in the
00:22:50
world and since the ethics of utility
00:22:52
often focuses on Pleasure too you know
00:22:55
and we could get big discussions what's
00:22:57
the difference between pleasure and
00:22:58
happiness we might get into that a
00:23:00
little bit in this class but pleasure
00:23:03
sentience right it's about a certain way
00:23:06
of feeling so we're going to see
00:23:09
utilitarians are going to include
00:23:11
animals in the moral community in a way
00:23:14
some other theories may
00:23:16
not okay what's the difference between
00:23:19
act utilitarian and rule utilitarian
00:21:23
ethics well act utilitarians say do
00:23:25
whatever Act is going to produce the
00:23:27
greatest good for the greatest number
00:23:29
rule utilitarians are going to say wait
00:23:31
a minute that can be tricky so what if
00:23:34
I'm on campus and I'm cutting through
00:23:35
the grass and y'all say I don't want the
00:23:38
grass to die I love looking at the grass
00:23:40
and I say ah I'm just going to cut across
00:23:42
it my feet aren't going to kill the
00:23:44
grass all right fair enough but you're
00:23:47
like yeah one person's feet can't kill
00:23:49
the grass she's got a point there I'm
00:23:51
going to make that same shortcut we all
00:23:53
make the same shortcut and eventually
00:23:56
the grass dies y'all are like oh I didn't
00:23:58
want the grass to die now I'm sad right
00:24:00
so for rule utilitarians they're
00:24:02
basically saying what rule utilitarians
00:24:06
are basically saying is we can um we
00:24:09
should we can all be rational act
00:24:11
utilitarians and still have bad results
00:24:14
from a utilitarian perspective the best
00:24:16
we can do is follow a rule you know like
00:24:19
don't walk in the grass if we all
00:24:21
ultimately care about the Grass more
00:24:23
than we care about getting to class five
00:24:25
minutes early or whatever right that's
00:24:28
going to be their point but act utilitarians
00:24:30
they got their own push back they're
00:24:33
like wait a minute we don't want to be
00:24:35
rule worshippers you know because a lot
00:24:37
of rules have
00:24:39
legitimate exceptions you know example I
00:24:41
give is if you go like to a subdivision
00:24:43
a lot of times you see those retaining
00:24:45
ponds and it'll say no swimming well
00:24:48
what if a child's drowning and they're
00:24:50
like help me help me and you're like no
00:24:52
it says no swimming and I'm a
00:24:54
rule follower and you just let the child
00:24:56
drown right most people are going to say
00:24:58
well that's wrong you should have saved
00:24:59
the child that was a rule meant to
00:25:01
be broken in that extreme situation
00:25:04
right so you got to look at the
00:25:06
situation and make decisions whether you
00:25:08
want to or not that'll be the ACT
00:25:10
utilitarian perspective so which which
00:25:13
is better why or why not you know you
00:25:15
can you can make a case either way for
00:25:17
that and if you want to do your news
00:25:19
article and say my position aligns with
00:25:22
act utilitarian ethics or this is how I
00:25:24
think act utilitarian would respond or
00:25:26
what have you or a rule utilitarian
00:25:28
that's that's good you know narrow it
00:25:32
down um utilitarians like I said have
00:25:35
different conceptions of intrinsic good
00:25:37
and we're going to see that when we get
00:25:39
into things like uh physician assisted
00:25:41
death uh for most utilitarians
00:25:44
maximizing intrinsic good means
00:25:46
maximizing happiness um it's called
00:25:48
hedonistic utilitarian ethics but as I
00:25:51
mentioned some can be ideal utilitarian
00:25:53
ethics now let me tell you what I mean
00:25:55
intrinsic good means it's good for its
00:25:59
own sake right so happiness is good for
00:26:02
its own sake we're going to see when we
00:26:05
get into physician assisted death and
00:26:06
I'll talk more about that then but
00:26:08
you're going to see these differences
00:26:10
you're going to see utilitarians
00:26:14
a lot of not all
00:26:16
utilitarians a lot of utilitarians say
00:26:19
if somebody wants to die and they're in
00:26:19
suffering you know and it produces the
00:26:21
greatest amount of Happiness overall let
00:26:24
them die right because happiness is what
00:26:28
matters that's what's good for its own
00:26:31
sake but you'll have some natural law
00:26:34
theorists Divine command theorists
00:26:36
generally that might say well human life
00:26:39
is intrinsically valuable it's valuable
00:26:42
for its own sake whether you are
00:26:43
suffering or not right and so to take an
00:26:47
infinitely valuable life even if a
00:26:50
person wants to end it is wrong well
00:26:54
okay is happiness an intrinsic value is
00:26:57
human life an intrinsic value what if
00:27:00
there is a conflict between the two and
00:27:02
how in the world do you prove what's an
00:27:04
intrinsic value as far as I've seen no
00:27:06
one's done it yet so but we've got all
00:27:09
got our you know intuitions about it or
00:27:12
our reasoning about it or what we would
00:27:14
want and that sort of thing but it's
00:27:16
really hard to prove which may be why
00:27:18
we're always still uh discussing these
00:27:21
issues even though we've been talking
00:27:23
about some of them thousands of years
00:27:26
some of them hundreds of years right
00:27:29
um and then the other Theory I wanted to
00:27:31
briefly uh discuss is ethical egoism the
00:27:34
idea that says we ought to do whatever
00:27:36
is in our best interest and that
00:27:38
others ought to do um what is in their
00:27:41
best interest too now there's a difference I
00:27:43
won't show the video here um but it's a
00:27:45
difference between psychological egoism
00:27:48
and ethical egoism ethical egoism is a
00:27:51
normative ethical Theory it's about what
00:27:53
we ought to
00:27:55
do on the other hand psychological
00:27:59
egoists say well you are kidding yourself if
00:28:02
you think you can do anything other than
00:28:05
the ACT you most want to do I'll give
00:28:07
you a really clear example you might say
00:28:10
I'd rather be doing a lot of things than
00:28:12
sitting in this Tech live session let me
00:28:15
tell you right it's it's possible um but
00:28:19
the psychological egoist would say well
00:28:21
no based on your temperament your
00:28:23
environment the way you were raised all
00:28:25
kinds of different things you calculated
00:28:27
yeah I'll go I'll go sit in on the tech
00:28:29
live session who knows why but it was
00:28:31
what you calculated was the thing that
00:28:34
was best for you to do for however that
00:28:36
calculation was and that we all always
00:28:38
do whatever we think is the best it
00:28:41
doesn't have to be selfish I mean it
00:28:43
could be but maybe you just want to help
00:28:46
somebody and it just you just feel that
00:28:47
pull so strong you're like I'm going to
00:28:49
help them and and that's what you're
00:28:51
going to do but you couldn't really say
00:28:53
no I'm going to pick the second thing
00:28:55
the thing I second most want to do
00:28:57
right overall um but that's in tension
00:29:01
with ethical egoism do we have the free
00:29:03
will to choose what we um ought to do
00:29:07
and if we think it's in our
00:29:08
self-interest versus do we just do
00:29:10
whatever we most want to do anyway
00:29:12
because that's kind of the way we're
00:29:14
programmed or designed depending on your
00:29:16
metaphysical commitments right so you
00:29:19
get those kinds of kinds of issues uh
00:29:22
too so you know it's really interesting
00:29:27
um like I said next week we're going to
00:29:28
talk about uh we can talk a little bit
00:29:30
about Kantian ethics I don't know
00:29:32
natural law theory Ross's theory we'll
00:29:35
talk a little bit about virtue ethics
00:29:37
and Care ethics at some point I've got
00:29:39
some slides uh for that we're going to
00:29:41
get into applied ethics whereas I
00:29:44
mentioned we'll talk about things such
00:29:45
as immigration animal ethics physician
00:29:49
assisted death maybe even euthanasia
00:29:51
around the world and in the Netherlands
00:29:55
active euthanasia is legal for people
00:29:58
that have like post-traumatic stress
00:30:00
disorder and they've gone to treatment
00:30:02
they've taken different meds and they're
00:30:03
not getting better they can say I am
00:30:06
tired of living in pain and they're
00:30:08
allowed to make that choice
00:30:10
because the societal social Norm
00:30:13
cultural norm is that of the principle
00:30:17
of autonomy of being able to self-direct
00:30:20
our lives in the United States we don't
00:30:24
see as
00:30:27
much of a push in that area at
00:30:30
least for the principle of autonomy or
00:30:32
making your own decisions and
00:30:34
self-governing active euthanasia is not
00:30:36
legal in any state a physician assisted
00:30:39
death is but it's illegal in Most states
00:30:43
right and and um who's right and why and
00:30:47
we'll explore some of those uh questions
00:30:49
how should animals uh be treated based
00:30:52
on sentience should they have rights why
00:30:55
or why not you know and are you like
00:30:58
logically consistent throughout the
00:31:00
class like right are you consistently a
00:31:03
utilitarian or a Kantian or Divine
00:31:05
command there or an ethical egoist right
00:31:07
we'll explore that I won't grade on that
00:31:09
I will ask you that and if you know but
00:31:12
you can say no I'm not and if not why
00:31:14
not and just show like why you think
00:31:16
some moral principles sometimes take
00:31:18
precedence like if you're a utilitarian
00:31:20
some of the time I'm going to do the
00:31:22
greatest good for the greatest number
00:31:25
you know but sometimes you might not you
00:31:26
might be like well not if it involves killing
00:31:28
a person or something like that right
00:31:31
okay so you know how do you draw those
00:31:33
lines right William James the famous
00:31:37
American pragmatic philosopher said if
00:31:39
we all agreed on everything we we
00:31:42
wouldn't have to have ethics right we
00:31:44
just say Christy you do what you want
00:31:46
Tiana you do what you want Jennifer will
00:31:48
do what she wants woohoo woohoo right
00:31:50
we'd all be happy but he says ethics is
00:31:52
about drawing lines right and you know
00:31:56
what um you know we don't have to get
00:31:58
political here but we can too right what
00:32:01
Donald Trump thinks is ethical might be
00:32:03
different from what Kamala Harris thinks
00:32:05
is ethical which might be different from
00:32:07
what um Jill Stein thinks is ethical
00:32:10
which might be different from what
00:32:12
Buddha thought was ethical or what Jesus
00:32:14
Christ would have thought was ethical or
00:32:15
what Einstein I don't know would have
00:32:18
thought ethical we keep going on with
00:32:20
names right it doesn't matter and those
00:32:23
different conceptions of the good bump
00:32:25
into each other sometimes if I think
00:32:28
physician assisted death should be
00:32:30
illegal and you think it should be legal
00:32:32
or vice versa we can't we we can't do
00:32:36
both right we got to make a decision and
00:32:40
you know and so for James drawing the
00:32:43
lines you know ethics is about drawing
00:32:45
lines and he says the cries of the
00:32:48
Wounded will let you know who you hurt
00:32:51
and even if you said I'm not drawing any
00:32:52
lines I'm going to just not even I'm
00:32:54
going to try never to think about ethics
00:32:56
you're still abdicating responsibility
00:32:59
which is kind of drawing a line so how
00:33:02
do you justify where you're drawing
00:33:05
those lines who you're hurting how do
00:33:06
you justify it right that's I mean
00:33:09
that's a big part of Ethics just
00:33:10
thinking about your reasoning all right
00:33:13
so I will stop the recording here