iHub Research's Umati November 2012 Findings



Umati: Monitoring Online Dangerous Speech

    November 2012 Findings



Goals of Umati

Following the need to define, identify and deal with hate speech, the goals of the Umati project are:

To set a definition of hate/dangerous speech that can be incorporated into the constitution.

To forward incidences of dangerous speech to Uchaguzi to limit further harm.

To define a process for election monitoring that can be replicated elsewhere.

To further civic education on hate speech.

Introduction

Hate speech has garnered growing interest in Kenya, especially since the 2007 Post Election Violence, owing to its realised potential to stir or promote violence against targeted groups of people. The current definition of hate speech, according to the National Cohesion and Integration Commission Act of 2008, is speech which advocates or encourages violent acts against a specific group, and creates a climate of hate or prejudice, which may, in turn, foster the commission of hate crimes.

Due to this vague definition, there has been an escalating demand from peace-building organisations, politicians, government officials and the general public on how to define, identify, mitigate, report and deal with hate speech. This need for less ambiguity in the definition of hate speech motivated the project to rely on a more actionable definition, and hence the monitoring of dangerous speech.

In a nutshell, dangerous speech is hate speech with a potential to cause violence. Professor Susan Benesch of American University (Washington, DC, USA) defines dangerous speech as a subset of hate speech comprising five criteria:

1. A powerful speaker with influence over an audience;

2. An audience vulnerable to incitement;

3. Content of the speech that may be taken as inflammatory;

4. A conducive social and historical context of the speech; and

5. An influential means of disseminating the speech.
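As a purely illustrative aid (not part of Prof. Benesch's framework or the Umati tooling), the five criteria can be pictured as a checklist recorded against each monitored statement; the Python sketch below uses class and field names of our own choosing:

    from dataclasses import dataclass

    @dataclass
    class DangerousSpeechAssessment:
        # Illustrative record of the five Benesch criteria for one monitored statement.
        statement: str
        influential_speaker: bool    # 1. a powerful speaker with influence over an audience
        vulnerable_audience: bool    # 2. an audience vulnerable to incitement
        inflammatory_content: bool   # 3. content that may be taken as inflammatory
        conducive_context: bool      # 4. a conducive social and historical context
        influential_medium: bool     # 5. an influential means of dissemination

        def criteria_met(self) -> int:
            # Count how many of the five criteria this statement satisfies.
            return sum([self.influential_speaker, self.vulnerable_audience,
                        self.inflammatory_content, self.conducive_context,
                        self.influential_medium])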



Why Online?

While most projects related to hate speech have been looking at mainstream media, we are aware of the influence, positive and negative, that New Media such as the blogosphere and online forums had on the 2007 Post Election Violence in Kenya. Therefore, our flagship project seeks to monitor and report, for the first time, the role New Media plays on a Kenyan election. Our project will have citizens at the core and use relevant technology to collect, organise, analyse, and disseminate the information that we receive.

Monitoring Process

Over a period of 10 months, beginning September 2012 and ending June 2013, the Umati project will monitor online content and record incidences of hate and dangerous speech.

This process is being carried out by five monitors, representing the five largest ethnic groups in Kenya, to enable the translation of cited incidences from vernacular to the country's official language, English. Monitors use an online categorisation process that enables them to sort each collected statement into its respective category (see the sketch after the category list below).

The three hate speech categories, in order of severity, are:

Category One: Offensive speech

Category Two: Moderately Dangerous speech

Category Three: Extremely Dangerous speech
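The sketch below illustrates how a monitor's category judgements could be recorded and tallied. It is a simplified assumption of ours, not the actual Umati categorisation platform, and the statements shown are placeholders:

    from collections import Counter
    from enum import Enum

    class Category(Enum):
        OFFENSIVE = 1              # Category One: Offensive speech
        MODERATELY_DANGEROUS = 2   # Category Two: Moderately Dangerous speech
        EXTREMELY_DANGEROUS = 3    # Category Three: Extremely Dangerous speech

    def tally_by_category(reports):
        # Count monitored statements per category; `reports` is an iterable of
        # (statement_text, Category) pairs as entered by the monitors.
        return Counter(category for _, category in reports)

    # Hypothetical usage with placeholder data:
    reports = [("statement A", Category.OFFENSIVE),
               ("statement B", Category.MODERATELY_DANGEROUS),
               ("statement C", Category.MODERATELY_DANGEROUS)]
    print(tally_by_category(reports))
    # e.g. Counter({Category.MODERATELY_DANGEROUS: 2, Category.OFFENSIVE: 1})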

    Impact

We hope that the work of this project will lead to the inclusion of a more elaborate definition of illegal speech in the current constitution of Kenya, and that findings will be used to educate the Kenyan public on what type of speech has the potential to disrupt peace and security in the country. Through this project, we aim to create a process that can be replicated in other countries to monitor dangerous speech leading up to pivotal national events, such as elections and referenda.

    Outputs

The following section presents a consolidation of all incidences of hate speech that have been identified in Kenya's webosphere from three main sources: social media (Facebook and Twitter), online blogs, and comment sections of online newspapers.


Hallmarks of Dangerous Speech

[Pie chart: share of sampled statements containing each hallmark of Dangerous Speech]

* For further information and articles on the hallmarks and on Dangerous Speech generally, see www.voicesthatpoison.org

From studying many examples from a variety of countries and historical periods, our project partner Prof. Benesch has identified tell-tale signs, or hallmarks, of Dangerous Speech.

We adapted these hallmarks to the Kenyan context and searched for them within the statements we collected. The three hallmarks we used are:

Compare a group of people with animals, insects or a derogatory term in mother tongue;

Suggest that the audience faces a serious threat or violence from another group;

Suggest that some people from another group are spoiling the purity or integrity of the speaker's group (e.g. characterise members of the other group as weeds, spots, stains, or rotten apples who may spoil a barrel).

From a total of 772 sampled statements, 226 contained one or more hallmarks of dangerous speech. Out of those, the hallmark contained in most statements was when the speaker suggested to his/her audience that they face a serious threat or violence from another group.

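Although Umati's hallmark identification was done by human monitors, the kind of tagging and counting behind figures such as 226 of 772 can be sketched as below; the keyword patterns are invented placeholders, not the project's lexicon:

    # Illustrative only: Umati relied on human monitors, not automated keyword matching.
    HALLMARKS = {
        "dehumanising comparison": ["cockroach", "weed", "snake"],          # placeholder terms
        "threat of violence from another group": ["they will attack us"],   # placeholder terms
        "spoiling the purity of the group": ["rotten apple", "stain"],      # placeholder terms
    }

    def hallmarks_in(statement):
        # Return the set of hallmark labels whose placeholder patterns appear in the statement.
        text = statement.lower()
        return {label for label, patterns in HALLMARKS.items()
                if any(p in text for p in patterns)}

    def hallmark_prevalence(statements):
        # Share of statements containing at least one hallmark, e.g. 226 / 772 is roughly 29%.
        flagged = sum(1 for s in statements if hallmarks_in(s))
        return flagged / len(statements) if statements else 0.0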


Hallmarks of Dangerous Speech across different speakers

Across the speakers, all three hallmarks were contained in speech that we collected in November.

The third hallmark, suggesting that some people are spoiling the purity or integrity of another group, exhibited the highest prominence from all speakers. It is also the most prominent hallmark in statements made by politicians, identifiable bloggers and identifiable commenters.

[Bar chart: number of statements containing each hallmark, by speaker type]

Speaker | Compare a group of people with animals, insects or a derogatory term in mother tongue | Suggest that the audience faces a serious threat or violence from another group | Suggest that some people are spoiling the purity or integrity of another group
Anonymous blogger | 0 | 0 | 1
Identifiable blogger | 8 | 1 | 13
Journalist | 0 | 0 | 1
Public figure | 0 | 0 | 0
Politician | 1 | 1 | 4
Community Leader | 0 | 0 | 1
Anonymous Commenter | 0 | 0 | 0
Identifiable commenter | 69 | 34 | 79

Total number of reports = 213


[Pie chart: distribution of collected statements across the three categories]
Offensive speech: 27%
Moderately Dangerous speech: 52%
Extremely Dangerous speech: 21%

For the Umati project, three criteria of dangerous speech were used to sort collected statements into the three hate speech categories. The pie chart here shows that in November 2012, most statements fell in the moderately dangerous speech category. Important to note here is that extremely dangerous statements have the highest potential to stir violence.

Examples of statements in November that fall in the three categories are listed below:

Offensive speech:
Even Jesus was a Gay. its a matter of personal choice.
we wont vote 4 a woman shindwa na ushindwe

Moderately dangerous speech:
Show me a [tribe] n i wil show you a waste of space on thz beatiful planet.
Hakuna dini inaitwa [religion]. daily ua changng....ua nt sure of ua books...[religion] ni kwa mungiki.

Extremely dangerous speech:
If you were amazed to discover that cell phones have uses other than setting off roadside bombs, You are a [religion]. If you have nothing against women and think every man should own at least four, You are a [religion]
Msiseme ati hamkuwa equipmnts kwani majambazi walikuwa na helicopers.? Wacha hao waone moto pia ndo waamuke. Si walihongania hiyo job. Walifkiri ni kuchukua hongo tu na kutomba wanawake mtaani.

** In an effort to avoid fuelling hateful speech, we have deliberately omitted the naming of any tribes, political parties or politicians when writing this report. For example, when we quote statements verbatim from our study, we replace the named tribes with the terms [tribe1], [tribe2], etc.

Moderately dangerous speech continues to be the most rampant category of dangerous speech

Total reports = 806


[Pie chart: share of dangerous speech statements by speaker type]
An identifiable commenter: 91%
A blogger: 6%
A politician: 3%
A community leader: 0.2%
An anonymous commenter: 0.2%

Surprisingly, the highest use of dangerous speech in the Kenyan online spaces Umati is monitoring is by identifiable commenters. In October they accounted for 53% of all speakers and this month, they account for 91%.

Identifiable commenters are online users who leave comments in response to a Facebook post, an online news article, a forum or a blog post. They are identifiable in that they use their own name or a pseudonym.

The lack of caution when speaking online suggests that the speakers are not considering the negative impact their statements could have, nor are they worried about being associated with the dangerous statements they make.

Increase in identifiable commenters

Total number of reports = 803



[Pie chart: calls to action contained in dangerous speech statements]
Discriminate: 79%
Kill: 12%
Forcefully evict: 6%
Beat: 2%
Riot: 1%

According to research carried out by Professor Susan Benesch of the American University, speech that contains certain calls to action can be deemed dangerous. Based on the audience and the speaker, these calls to action can generate varying degrees of incitement to violence. The calls are to discriminate, to steal, to riot, to beat, to forcefully evict and finally to kill.

In the month of November, as was the case in October, the most frequent call to action on monitored Kenyan blogs, newspapers, Facebook pages and tweets was to discriminate against members of another group.

We also found that dangerous speech this month was centered mainly around two topics: tribe and religion. There was a noted increase from October in dangerous speech statements centered around religion.

Some examples of extremely dangerous speech are listed below:

Against people of a certain tribe:
I supot tribalism!!I cant vote 4 a [tribe] even at gun point
Ours is simple. Use [tribe1] to get presidency then dump them. Since when did you hear a [tribe2] appointing a [tribe1] to anything. Watachunga ngombe zao after we win the prezzo then we can also take their shambas we have not forgotten.
It will be 39 tribes against two

Against a certain religion:
wetha u lyk or nt churchez zitaisha day by day
Is not that all [religion] are terrorist but all terrorist are [religion]
The best option ni wanyoroshwe tu

Against rival gangs:
#KisumuViolence and TNA operations with #ChinaSquad will be back in the afternoon.... KISUMU IS MARWA, we will finish them like ochungulu

Most noted call to action is the call to discriminate

Total number of reports = 414


Events had an influence on the occurrence of dangerous speech online

In relation to who was being targeted, key events took place in October that increased the frequency of dangerous speech circulated in the online space. Notable events are:

2013 General Elections
Cattle rustlers
Eastleigh grenade attack
Gor Mahia fans
Raila - Ruto rivalry
Political campaigns and alliances


The calls to action rank from the most severe, which is to kill, to the least severe, which is to discriminate. Identifiable commenters were the only group that expressed all the 6 calls to action, with the calls to discriminate being the most frequent and calls to loot being the least.

Identifiable commenters express all 6 calls to action

[Bar chart: calls to action by speaker type]

Speaker | Beat | Discriminate | Forcefully Evict | Kill | Loot | Riot | Total
journalist | 0 | 1 | 0 | 0 | 0 | 0 | 1
politician | 0 | 7 | 0 | 0 | 0 | 0 | 7
anonymous commenter | 0 | 2 | 0 | 0 | 0 | 0 | 2
community leader | 0 | 2 | 0 | 0 | 0 | 0 | 2
identifiable blogger | 0 | 26 | 2 | 0 | 0 | 0 | 28
identifiable commenter | 8 | 281 | 23 | 50 | 1 | 3 | 366
Total | 8 | 319 | 25 | 50 | 1 | 3 | 406
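A cross-tabulation like the one above can be rebuilt from the raw reports with a simple nested tally. The sketch below is an assumed illustration of that bookkeeping, not the tooling the project actually used:

    from collections import defaultdict

    def cross_tab(reports):
        # Build a {speaker_type: {call_to_action: count}} table from an iterable of
        # (speaker_type, call_to_action) pairs, e.g. ("identifiable commenter", "Discriminate").
        table = defaultdict(lambda: defaultdict(int))
        for speaker, call in reports:
            table[speaker][call] += 1
        return table

    def row_totals(table):
        # Total number of recorded calls to action per speaker type
        # (e.g. 366 for identifiable commenters in the table above).
        return {speaker: sum(calls.values()) for speaker, calls in table.items()}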


Given the context, speaker, and audience of the statement, the noted calls to action fell into different categories of dangerous speech.

As a comparison, the three statements below are all calls to discriminate against a particular tribe, yet they fall in the three different categories of dangerous speech.

Category 1: Offensive speech
[PA1] ni mwizi! [PA2] ni inciter! [PA3] ni mwizi na incita! [PA4] ama [PA5] 4 me!

Category 2: Moderately Dangerous Speech
we as [political party] supporters now urge [PA1] to enter structured negotiation with [PA2] to save this country from the jaws of this hyenas

Category 3: Extremely Dangerous Speech
[tribe1] n [tribe2] hav ruled dis country since 1960 to 2000 40 remainng tribe wil vote against them. Watch out.

This reiterates that the extremity of an inflammatory comment, whether it falls under category 1 or 3, relies on a combination of factors: the influence the speaker has over the audience, how inciteful the statement is to the audience, and how harmful it is to the targeted group.

    *PA = Presidential Aspirant

Calls to discriminate across the three categories of dangerous speech

[Bar chart: calls to action across the three categories of dangerous speech]

Call to action | Offensive Speech | Moderately Dangerous Speech | Extremely Dangerous Speech
Discriminate | 72 | 196 | 51
Riot | 0 | 1 | 2
Beat | 2 | 3 | 2
Forcefully evict | 0 | 10 | 15
Kill | 0 | 22 | 28


From the diagram, the following trends can be noted:

Anonymous commenters and identifiable commenters were the most active amongst the speakers.

Anonymous commenters reduced as the severity of hateful speech increased.

Unlike in October, where identifiable commenters increased with the severity of dangerous speech, November saw a clear lead in moderately dangerous speech.

Anonymous commenters reduce as the severity of hateful speech increases

[Bar chart: number of statements per speaker type across the three categories]

Speaker | Offensive Speech | Moderately Dangerous Speech | Extremely Dangerous Speech
identifiable commenter | 341 | 548 | 306
identifiable blogger | 9 | 26 | 10
community leader | 2 | 1 | 4
anonymous | 150 | 110 | 78
politician | 9 | 20 | 15
journalist | 1 | 0 | 5
Blogger | 8 | 20 | 23


From 792 statements collected during the month of November, 78% engaged in dangerous speech dialogue on public Facebook groups and pages. Private Facebook groups were also rampant with activity.

Facebook, the most preferred platform

[Pie chart: platforms on which dangerous speech statements were collected]

A Facebook post in a public group/page: 78%
A Facebook post in a private group/page: 9%
A blog article in a public blog/forum: 5%
A comment in response to a public blog article/forum: 3%
A comment in response to a private blog article/forum: 2%
A comment in response to an online news article or blog: 2%
An online news article: 1%
A tweet: 0%
A blog article in a private blog/forum: 0%
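The platform shares above are simple proportions of the month's reports (e.g. 78% of the 792 collected statements came from public Facebook groups and pages). A minimal sketch of that calculation, using placeholder labels rather than the project's raw data:

    from collections import Counter

    def platform_shares(platform_labels):
        # Percentage of reports per platform type; `platform_labels` holds one label per
        # collected statement, e.g. "A Facebook post in a public group/page".
        counts = Counter(platform_labels)
        total = sum(counts.values())
        return {platform: round(100 * n / total, 1) for platform, n in counts.items()}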


In November, private Facebook groups were spaces where more inflammatory (level 2 and level 3) dangerous speech was discussed.

Private Facebook groups were used for Moderately and Extremely Dangerous Speech

[Bar chart: platform distribution within each category of dangerous speech]

Platform | Offensive Speech | Moderately Dangerous Speech | Extremely Dangerous Speech
A Facebook post in a private group/page | 0% | 12% | 11%
A Facebook post in a public group/page | 85% | 77% | 71%
A blog article in a private blog/forum | 0.5% | 0% | 1%
A blog article in a public blog/forum | 3.5% | 5% | 8%
A comment in response to a private blog article/forum | 2.5% | 1% | 2%
A comment in response to a public blog article/forum | 5% | 2% | 2%
A comment in response to an online news article or blog | 2% | 2% | 3%
A tweet | 0% | 0.4% | 0.5%
An online news article | 0.5% | 0.2% | 1%


DEFINITIONS

Discrimination: Discrimination is understood as any distinction, exclusion or restriction made on the basis of race, colour, descent, national or ethnic origin, nationality, gender, sexual orientation, language, religion, political or other opinion, age, economic position, property, marital status, disability, or any other status, that has the effect or purpose of impairing or nullifying the recognition or exercise, on an equal footing, of all human rights and fundamental freedoms in the political, economic, social, cultural, civil or any other field of public life. Source: La Rue, F. (2012, September 7). Report of the Special Rapporteur on the promotion and protection of the right to freedom of opinion and expression, UN Doc A/67/357, p. 12.

Dangerous speech: This is a term coined by Prof. Susan Benesch to describe incitement to collective violence that has a reasonable chance of succeeding, in other words speech that may help to catalyse violence, due to its content and also the context in which it is made or disseminated. This possibility can be gauged by studying five criteria that may contribute to the dangerousness of speech in context: the speaker (and his/her degree of influence over the audience most likely to react), the audience (and its susceptibility to inflammatory speech), the speech act itself, the historical and social context, and the means of dissemination (which may give greater influence or force to the speech).

Identifiable Commenter: A person who responds to an online article, blog post or Facebook post who can be identified by a name, regardless of whether the name is real or fake.

    SOURCES

Pictures on page 7 sourced from:

http://static1.demotix.com/sites/default/files/imagecache/a_scale_large/100-1/photos/1258405850-cattle-rustlers-kill-11-in-kenya180541_180541.jpg. Accessed 18th December 2012.

http://www.coastweek.com/3546_nairobiblast_02.JPG. Accessed 18th December 2012.

http://www.futaa.com//images/350x300/Gorfans_1.JPG. Accessed 18th December 2012.

http://www.allkisima.com/wp-content/uploads/2012/02/raila+uhuru+ruto+mudavadi+wetangula1.jpg. Accessed 18th December 2012.



For more information on this project, contact

Umati Project Team
iHub Research
Nairobi, Kenya

    [email protected] | Twitter: @iHubResearch