00:00:19
Okay, social media is interesting here, because scholars were saying, remember our multiple selves: when we join communities, we also project several versions of ourselves. The narratives that we form about ourselves are not coherent. Sometimes they are still in the process of becoming, still in the process of being constructed. And of course that is true, because every day we change, and so does the environment we live in. The many different people that we meet shape our lives, and we shape theirs. It is almost unthinkable that the person we were before is still the person we are today, because constructing identities is a dynamic process.
00:01:20
In other words, what some scholars are saying is that there is no coherent narrative of ourselves; it is always a bit of this and that. And sometimes we go overboard: we become so self-conscious, so self-indulgent. Have you seen people who travel around the world and post all their photos? Some people would see this as just natural, but others were annoyed by that urge to document all the cool places they have been or the restaurants they have visited; some people were not really happy seeing other people's posts. One scholar was saying that these incoherent selves actually emerge from this potential for performing for others.
00:02:23
In other words, while it is true that we create identities, it is another thing to say that maybe I should create a false notion of myself because I want to be famous, popular, known, and so on. The scholars are warning us that there is a disconnect because we have strayed too far from who we are, and this incoherence is a result of being obsessed with fame, with being rich or well known.
00:03:00
So we perform for others rather than for ourselves, as if we want others to know us in certain ways, even if this self is very different from who we are. Cyberspace, or the internet, offers us a chance to be that, because the platform is there; something like the stage is all set, and we just perform. And sometimes these performances present conflicting potentials.
00:03:34
And we tend to delete information. Remember when you took selfies and had to erase some photos because they were not from the best angle? You edit yourself, because we always want to present ourselves at our best before others in cyberspace. So we edit ourselves, and we edit and edit until we don't even know whether the result is really us. That's why we have so many selfie photos and don't post all of them: we take a lot, we choose a lot, and we pick the best angle, the best one. The same is true with posts; we can always edit our comments, and we also edit the way we present ourselves. In the process of editing we come to ask: is it really us, or just a part of us that is there, and not the whole of us? So we monitor ourselves; we check and redact information.
00:04:41
Okay, so in other words, this problematic content has to do with our editing, but more so because these incoherent selves, or the many ways in which we construct ourselves, are also being monitored by others for certain ends. One of them, of course, is commercial: they want to sell to us. Remember when you Googled a keyword like Boracay, and suddenly your Google account showed an array of resorts in Boracay (Boracay is closed now, but this was before)? Or you travelled to, say, Vigan in Ilocos Sur, and suddenly you got potential hotel bookings. In other words, our aspirations and dreams are being monitored by others because they want to make money out of them. And of course the agenda could also be political, because they want us to vote for them. These days the election is still far off, but people are already pushing themselves into our attention, as if they are already campaigning. So what could the other agendas be? They want to sell something to us, or they might want us to vote for them.
00:06:06
Okay. Some information is simply available, but some people push certain information: that is the job of the trolls, of groups in social media that try to push information to us because they want to convince us that their ways are better, and they want us to believe that their points of view are better. In the process they undermine rational, critical debate and discussion. Not all of them are friendly, patient, or tolerant of other views; they might be virulent in their comments, they might curse you, or they might not even welcome opposing views. And we become targets of this problematic content because they know where we are.
00:07:01
Because of our notifications they know our location. They know that we go to FEU, to this college, this department; that we live in this city, in this area; and that we have friends from this municipality, from this country. In other words, they were able to reconstruct our connections from the digital footprint that we left. They say, well, if you don't like Facebook, don't sign up; but do we really have a choice? Can't we have a free Facebook where this company, a multi-billion-dollar corporation based in California, also protects our privacy and does not render us vulnerable to trolls and to third-party advertisers that harvest our accounts, that feed us hoaxes, for example, and all sorts of claims, like medical claims? Remember the supposed cures where you have to take this and that, medical "knowledge" which is available but does not add up to being true or verified by the scientific community? Those are the kinds of problematic content that we are exposed to as a result of our engagement online.
00:08:21
online Scholars actually said that there
00:08:24
are two kinds of uh problematic content
00:08:28
and they call that the miss information
00:08:30
and thisinformation and both refer to
00:08:33
how information uh is considered to be
00:08:36
inaccurate Incorrect and misleading so
00:08:38
I'm going to discuss this two I got this
00:08:41
from a uh study created by data and
00:08:45
society and this is the dataon society
00:08:48
this is how um defined
00:08:51
misinformation who uh something whose
00:08:54
inaccuracy is an intention other
00:08:57
words an information could be wrong but
00:09:00
the way that it was wrong was not really
00:09:02
the intention of the maker something
00:09:05
For example, there is breaking news about a shooting incident or a car accident, and a news organization tweets the incident without verifying the details. Some of the information could be incomplete, some could be inaccurate, but they are still verifying it, so perhaps you forgive them, because eventually they will revise the news; the inaccuracy is unintentional. Those are the errors that journalists commit when they fail to verify the source, especially during an unfolding crisis. But even when it is not an unfolding crisis, some groups or government offices just release information that is actually inaccurate, and that is more disinformation than misinformation, because disinformation is information that is deliberately false and misleading. In other words, there is no debate about it: it is plainly misleading and inaccurate.
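The distinction turns entirely on intent, which makes it easy to state precisely. Here is a minimal sketch of the two categories; the type names and fields are my own framing, not Data & Society's:

from dataclasses import dataclass
from enum import Enum, auto
from typing import Optional

class ProblematicContent(Enum):
    MISINFORMATION = auto()   # inaccurate, but the inaccuracy is unintentional
    DISINFORMATION = auto()   # deliberately false and misleading

@dataclass
class Claim:
    text: str
    accurate: bool
    deliberate: bool  # did the maker intend the inaccuracy?

def classify(claim: Claim) -> Optional[ProblematicContent]:
    # Accurate content falls outside both categories.
    if claim.accurate:
        return None
    return (ProblematicContent.DISINFORMATION if claim.deliberate
            else ProblematicContent.MISINFORMATION)

# An unverified breaking-news tweet that is later corrected: misinformation.
print(classify(Claim("initial casualty figure", accurate=False, deliberate=False)))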
00:10:28
Disinformation could be driven by politics and propaganda, by political ends; it could be driven by profit, because they want to drive traffic to their site and monetize the clicks; or it could be a prank, where they just want you to have a bit of fun. Remember the monsoon floods of the past few weeks? Have you seen the post that there is a shark swimming in the Marikina River? That is a form of prank. People wouldn't really believe it, but someone posted a picture, I think on Facebook, of a shark swimming in the Marikina River, and people were sharing it. Some said it was funny, or that they might as well have posted a crocodile. People just wanted a bit of fun because it was raining and spirits were so dampened by the rain, so they posted something to laugh at. Those are the probable agendas for spreading misinformation and disinformation.
00:11:37
First Draft, a group based in the US that actually operates globally, classified misinformation and disinformation into seven types, collected in the sketch below.
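As a compact reference for the walkthrough that follows, the seven category labels can be gathered into one small data structure. A minimal sketch; the one-line glosses paraphrase this lecture's descriptions rather than First Draft's official wording:

from enum import Enum

class FirstDraftType(Enum):
    """First Draft's seven types of mis- and disinformation."""
    SATIRE_OR_PARODY    = "no intent to harm, but potential to fool"
    MISLEADING_CONTENT  = "misleading framing of an issue or individual"
    IMPOSTER_CONTENT    = "genuine sources are impersonated"
    FABRICATED_CONTENT  = "wholly false, designed to deceive and do harm"
    FALSE_CONNECTION    = "headlines, visuals, or captions don't support the content"
    FALSE_CONTEXT       = "genuine content shared with false contextual information"
    MANIPULATED_CONTENT = "genuine imagery or content altered to deceive"

for t in FirstDraftType:
    print(f"{t.name}: {t.value}")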
00:11:50
Maybe you should help me give examples. The first type is satire and parody, which do not intend to cause harm but have the potential to fool; some people just like to have a little fun. What is satire? Give an example: The Onion, or are you familiar with Professional Heckler? That's a form of satire. Parody is more imitation, something like those funny videos on YouTube on how to put on makeup, where it's not real makeup; they just put paint on their faces and make it all funny. That's parody: imitation which is humorous and could make you laugh. But it is also a form of commentary, especially satire, which is a form of political commentary that at certain times could even be subversive. That is the power of satire.
00:12:45
another is misleading content um to
00:12:48
frame or the an end an issue sorry or
00:12:52
individual so um give an example of
00:12:55
misleading
00:12:57
content how about out saying that Mayon
00:13:02
Volcano is in Naga oh so that's a form
00:13:06
of
00:13:07
uh is it intentional no right so it's
00:13:11
misleading um because I think the
00:13:14
context of uh the way that she said it
00:13:17
is that she wants to pin uh the vice
00:13:20
president that of course you're from
00:13:22
Naga you haven't even uh uh taken care
00:13:26
of the refugees of in other words
00:13:29
it's not just because she mentioned it
00:13:31
in a way that's humorous humorous but
00:13:34
she really wants to pin down it's make
00:13:36
it appear uh that
00:13:38
she yeah to embarrass the vice president
00:13:42
so that makes it a form of sing content
00:13:45
it's uh a bit intentional because of the
00:13:48
agenda that she puts into it
00:13:51
Imposter content is when genuine sources are impersonated. Can you give me an example? Think of how memes, for example, operate online, yet some people believe them to be true. About the Marcoses? Yes, the Marcos memes: that they were the greatest couple ever, and the love life of Imelda and Ferdinand was even posted as a model for the millennials. That, I think, is a form of misleading information.
00:14:33
Fabricated content is false content designed to deceive and do harm. What is an example? During the US presidential election between Hillary Clinton and Trump, there were Facebook groups, Twitter posts, and even announcements telling Hillary Clinton supporters that they did not need to go to the polls; they could just email their votes to a given address. That is of course a form of deception, meant to prevent Hillary's supporters from going to the polls: just email, you can vote online. It is deception whose intent is to do harm, because they wanted Trump to win, and that is a form of dirty tricks during an election.
00:15:35
clickbaits could form part of number
00:15:39
five remember the clickbaits that uh
00:15:42
when headlines visuals or captions do
00:15:44
not support the
00:15:46
content um I think this is true for some
00:15:49
media
00:15:50
organizations right clickbaits they're
00:15:52
really meant to uh uh attract viewers or
00:15:57
listeners or readers
00:15:59
and um even if the the visuals or
00:16:04
captions there's nothing in here
00:16:07
which correspond to which correspond to
00:16:11
the headline but it's had way just to
00:16:13
attract uh audience right false
00:16:16
connection that's number five so we have
00:16:18
two more so false
00:16:22
False context is when genuine content is shared with false contextual information. I have seen this during the campaign of Bongbong Marcos: they said, oh, they built the Cultural Center or something, therefore they were the greatest, or they built whatever, and therefore he would be the better president. The context is false. Is credit-grabbing a form of false context? Yes, I think so. It was also true during the recent accusations: I noticed that the spokesman of President Duterte was saying that extrajudicial killings were actually worse in the time of President Benigno "Noynoy" Aquino, that the problem was already worse in his time, which is of course not true. The context is different: the extrajudicial killings that took place in the time of Noynoy were not in any way as huge a problem as they are now. So that is false contextual information.
00:17:39
The last one is manipulated content: when images are photoshopped. Are you familiar with how pictures are photoshopped? A recent example is the People's SONA protest, where they changed the banner to be pro-Duterte when the photo was actually taken from the march of the protesters. Some of it could be as crude as that, but other manipulation could be so sophisticated that you would not even know, unless you are trained to spot this kind of content online, unless you have the techniques to verify information or photographs, such as reverse image search. You can actually do that in Google, and there are apps for it that can help you tell whether a photograph is manipulated or photoshopped. That is why Mocha Uson was outed, actually called out, when she posted pictures supposedly of troops in Marawi, of soldiers in Marawi, and the photograph turned out to be taken in another country, Vietnam or was it Costa Rica or something. The Photoshop work can involve very sophisticated techniques, but you can also detect it by employing some verification tools, as I said.
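As a rough illustration of that workflow, a reverse image search can be launched by handing a photo's URL to a search engine's by-URL endpoint. A minimal sketch; the two endpoint patterns are assumptions based on the services' public search forms at the time of writing, not stable documented APIs:

import webbrowser
from urllib.parse import quote

def reverse_image_search(image_url: str) -> None:
    """Open reverse image searches for a suspect photo in the default browser."""
    engines = [
        "https://lens.google.com/uploadbyurl?url={}",  # assumed Google Lens by-URL form
        "https://tineye.com/search?url={}",            # assumed TinEye by-URL form
    ]
    for pattern in engines:
        webbrowser.open(pattern.format(quote(image_url, safe="")))

# A photo from a viral post you want to trace back to its original context:
reverse_image_search("https://example.com/suspect-photo.jpg")

If the same image surfaces on older pages from a different place or event, the post's claimed context is suspect.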
00:19:06
It is not that false information is always bad through and through; as I said, it could also be a form of protest and of political and cultural commentary, like the use of satire and parody. But hoaxes range from the shark swimming in the Marikina River to cures on Facebook, which is the place where all sorts of claims are made: a cure for cancer, a cure for varicose veins, what else ("ano pa ba"), all sorts of medical elements, superfoods, effective diets, what to eat, what to take, and whatnot.
00:19:56
So, hoaxes. In some countries they are even more dangerous, because in the Philippines it appears that our hoaxes are largely confined to Facebook, and we tend to know a bit about them because they are out in the open. But in some countries it is more dangerous because the hoaxes are confined to encrypted apps like WhatsApp, where a group has at most 256 members and is very closed, and it is encrypted, so outsiders would not know what is going on; the hoaxes spread within the groups, and you don't even know what is out there if you are an outsider. In the Philippines we tend to know, because some hoaxes are found on Facebook and some are found on fake news sites: sites that appear to be news sites, that have the appearance of news-producing organizations, but actually invent content, and most of them are identified with President Rodrigo Duterte. If you are familiar with the verification work of VERA Files and Rappler, they have a list of fake news sites.
00:21:22
right um some of it could be fun like
00:21:25
remember that duterte said that he
00:21:27
signed a decree that he will ban
00:21:29
homework for
00:21:31
students uh of course some people would
00:21:34
share it a million times
00:21:37
um some groups would share share it a um
00:21:41
a million times in fact it garnered like
00:21:43
3 million shares it's fun you know it's
00:21:45
not true but um people were just sharing
00:21:48
it out
00:21:50
of what fun or they just want to prank
00:21:53
other people and say okay there's no
00:21:55
more homework according to the th but
00:21:58
they could also be powerful political
00:22:01
commentaries as long as people
00:22:04
would tell would be able to know that
00:22:07
they are satire and do not believe it at
00:22:09
face value the problem with Filipinos is
00:22:11
according to some Scholars is Filipinos
00:22:14
could not tell Sati from the real news
00:22:17
and they thought that Sati is
00:22:20
real in other words it's a form of uh um
00:22:25
critique and not many people are
00:22:27
familiar with it
00:22:29
okay and that's I think a problem of
00:22:31
this particular forms because they said
00:22:32
that Filipino humor is more slapstick
00:22:35
rather than satiric nature I don't know
00:22:38
uh maybe I should ask some Scholars to
00:22:40
have a look at it but uh they said
00:22:43
Philipp is a problem telling what is
00:22:45
Sati what is not Sati
00:22:49
Okay. The problem with fake news or disinformation is also that it sometimes tends to be recognized as legitimate, because it forms part of promotions, advertising, information campaigns, public relations, and propaganda. In other words, these are really information campaigns or programs meant to change the way people believe things to be, or to promote a product, a political view, and so on. They are often done through mass media: a political advertisement during an election, a billboard, a television spot, or a jingle on the radio. The problem there is the mixture of facts and interpretation, and they have got to do with brands. This is a problematic mix of facts, interpretation, and insights, and sometimes the interpretation or insight is not verifiable; it is not something that can easily be assessed or evaluated as true or not. That is the problem with information campaigns: they have a little mix of this and that, and they are meant to target people, to change the way they think and the way they feel about something.
00:24:40
99.9% of germs or something oh how could
00:24:44
you verify
00:24:46
that I mean those are claims and it's a
00:24:49
mix of course when you wash your hands
00:24:50
it's really to remove germs but about
00:24:54
99.9% uh that's incredible right but
00:24:57
some people people would take that as is
00:24:59
and they would really buy that kind of
00:25:00
soap or that kind of handwashing uh
00:25:04
detergent or
00:25:06
um gel and these blend of facts and
00:25:11
interpretation are problematic because
00:25:15
they are difficult to evaluate because
00:25:17
they they are mix of facts and
00:25:20
opinion and um the same in the same
00:25:24
breath okay and it's uh if you're a fact
00:25:27
checker you have to isolate which can be
00:25:29
verifiable and which could not could be
00:25:32
just taken as an opinion we could not
00:25:33
fact check opinion unless it's
00:25:35
inconsistent from the previous statement
00:25:38
which is another flip-flop but to think
00:25:41
that it's
00:25:43
uh not factual is another thing I mean
00:25:46
it's difficult to to say that okay so I
00:25:50
I'm down to my last two slides and I I
00:25:52
would like to uh to make us uh think
00:25:55
through the give you some tips on how uh
00:25:58
what to do with our cyber selves that we
00:26:01
are exposed to so much information and
00:26:03
some of this are incoherent conflicting
00:26:06
information and because of our
00:26:08
engagement online we are so vulnerable
00:26:10
to all sorts of manipulation and agenda
00:26:13
online that sometimes we cannot tell
00:26:16
which is true or not and we always
00:26:19
subject to all sorts of influence and uh
00:26:22
disinformation and information because
00:26:25
of our presence just by our mere
00:26:27
presence online so um how do we evaluate
00:26:32
information online then so these are the
00:26:35
tips I would like to live with you um
00:26:38
hoping that this would form part of the
00:26:41
empowerment that we have as digital uh
00:26:45
natives or as cyber users or cyber
00:26:48
citizens or neens as they say in other
00:26:52
words uh part of our being part of being
00:26:55
empowered is to take hold of relevant
00:26:57
intelligent factual and verifiable
00:27:00
information and before deciding what to
00:27:03
believe in and also before sharing
00:27:06
things with others because we might end
00:27:08
up multiplying this information
00:27:10
information online We Ready add up to
00:27:13
the Clutter or the garbage out there so
00:27:17
these are the tips that I would give
00:27:19
These were given to us last week at the Trusted Media Summit by a guy named Owen Sweeney, who works with First Draft, and of course I am sharing them with you. First, check the URL, the uniform resource locator, the website: see if the story is also found elsewhere, and whether it is from reputable sources. This is a process; if you are too lazy you might not want to perform it, but it is something like basic housekeeping if we want to remove the garbage out there online.
00:28:13
So, you look at the About page. The site appears to be like CNN, but why is the domain cnn.co? It appears to be like the Inquirer, but why is it "inquirer-one" or "inquirer-two" or whatever? It is an imitation site.
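A first pass at this check can even be automated: extract the host from the URL and compare it against domains you already trust, flagging near-misses as likely imitations. A minimal sketch using only the standard library; the trusted-domain list is illustrative, not exhaustive:

from difflib import SequenceMatcher
from urllib.parse import urlparse

# Illustrative allowlist only; a real checker would consult maintained lists,
# such as the fake-news-site lists published by VERA Files and Rappler.
TRUSTED = {"cnn.com", "inquirer.net", "rappler.com", "verafiles.org"}

def check_source(url: str) -> str:
    """Flag hosts that are not trusted, especially near-misses like cnn.co."""
    host = urlparse(url).netloc.lower()
    host = host.removeprefix("www.")
    if host in TRUSTED:
        return f"{host}: trusted"
    for good in TRUSTED:
        if SequenceMatcher(None, host, good).ratio() > 0.8:
            return f"{host}: suspicious lookalike of {good}"
    return f"{host}: unknown source, verify before sharing"

print(check_source("https://www.cnn.co/breaking"))        # lookalike of cnn.com
print(check_source("https://inquirer-one.com/politics"))  # unknown source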
00:28:36
Who are the publishers of the page? And if you go to Rappler and VERA Files, they might already have fact-checked the information. Recently I talked to some fact checkers, and they said there appear to be more and more celebrities "dying" in the Philippines; among them were the son of Kris Aquino and Eddie Garcia. Why is it that a lot of celebrities are dying online in the Philippines? There are a lot of hoaxes: nobody actually died; it's just that somebody claims they have died and posts it online, and because people like celebrities, they click on it and share it without even verifying the information.
00:29:34
Okay. So, check the agenda. You cannot always check the agenda directly, but you can at least infer it; in other words, think: why is this group sharing this, why is that group sharing that, and what do they want to attain by sharing the information? Then look at the writer's byline and profile picture and check whether the person is real; you might have to look at the editorial list or box, and you can always do a Facebook search of the person; there are tools for that. You can look at the photograph and verify whether it was really taken as claimed and not manipulated. You can look at the sources, the links, and so on. These are steps for checking whether the information is truthful or false.
00:30:37
of empowering our or constructing
00:30:41
ourselves um we encounter problematic
00:30:44
and disempowering
00:30:46
information right in our search to
00:30:49
become famous to be available to be uh
00:30:53
to be seen as friendly to be seen as
00:30:55
compassionate to be seen as
00:30:58
uh uh cool we might be exposing
00:31:02
ourselves to all sorts of
00:31:05
information um problems because some
00:31:09
some some groups or some people would
00:31:12
steal our
00:31:13
identity and con reconstruct it from um
00:31:18
our engagement online because they want
00:31:20
to sell us something because want to to
00:31:22
promote certain causes or they just want
00:31:24
to prank
00:31:25
us and uh as empowered users we should
00:31:30
take time to verify the
00:31:33
information make sure that we should not
00:31:35
contribute to the Clutter and the
00:31:37
garbage by ensuring that the misleading
00:31:40
content and inacurate content would be
00:31:43
called out and I think that should be a
00:31:45
very proactive I should even say an
00:31:49
enlightened way of dealing with uh with
00:31:52
the information in the process of
00:31:54
constructing our identity uh we get uh
00:31:57
we are also constructing our culture of
00:32:00
verification um culture of being
00:32:03
truthful um and um a culture
00:32:08
of being uh how do you call it
00:32:11
transparent and not deceive people just
00:32:14
because we want to to be something
00:32:17
else I think that's
00:32:20
my that's my take on the Cyber
00:32:26
s