When Citizen Journalism Crosses the Line

Feature
Earlier this year, an online petition called for STOMP, a popular citizen-journalism
portal, to be shut down, citing the damaging potential of misleading content that can
sometimes be found on the website. User-generated online content can constitute
harassment, and the much-talked-about Protection from Harassment Act is timely in
recognising this. Does the Act, however, spell an end to irresponsible cyberspace behaviour?
How effective is it in dealing with content posted on citizen-journalism websites?
This article explores these questions and posits that, while the Harassment Act is a
commendable step forward, the unique nature of citizen-journalism portals presents
a number of challenges to applying and enforcing the Act against objectionable
online conduct.
When Citizen Journalism Crosses the Line:
Does the Harassment Act Have an Online Bite?
Posts like the one above are par for the course on websites
and forums that rely heavily on user-generated content to
drive traffic. Web portals that foster the sharing of user-generated content are rife with photographs of people
caught in various types of “socially-objectionable behaviour”
– young couples publicly displaying affection, people eating
and drinking on trains, slovenly dressed military personnel,
fathers smoking with their children in tow – the examples
are legion.
A combination of cellular cameras, mobile internet and
social networking sites has made the concept of “citizen
journalism” one that has re-defined news gathering and
the value of news. Armed with a mobile phone, any man or
woman on the street has the potential to produce content
that can go viral in a matter of minutes.
In late March 2014, netizens were up in arms over what appeared, at first blush, to be another run-of-the-mill post on a popular local content-sharing website. Over 20,000
people signed an online petition calling for citizen-journalism
website STOMP to be closed down.1 This cyber uproar
stemmed from a photograph portraying a full-time national
serviceman seated on a row of fully-occupied MRT seats
while an elderly woman stood in front of him. The ensuing
captions and comments lambasted the serviceman for
failing to give up his seat to the old lady. It was subsequently
revealed that the photo had been edited before it was
posted onto STOMP: an empty seat at the end of the row (in
which the serviceman was seated) appeared to have been
cropped out, and the truth of the matter was that the elderly
lady had chosen not to take that available seat.
Dan Gillmor, in his seminal work on citizen journalism,
characterised the trend as such: “news reporting and
production will be more of a conversation or a seminar. The
lines will blur between producers and consumers, changing
the role of both in ways we're only beginning to grasp.
The communication network itself will be a medium for
everyone's voice”.2 Naturally, “everyone’s voice” includes
the good, bad, mad and everything in between. At one
end of the spectrum, there is citizen journalism that brings
to light objectionable social conduct, which Government
and society may then address constructively. At the other,
acts of petty voyeurism stake claim to the appellation.
Where does one draw the line between “public-spirited”
citizen journalism and unwarranted internet vigilantism,
where user-generated content paints a false or misleading
picture of actual events (as in the case of the national
serviceman)? Even when an uploaded photo genuinely
depicts objectionable conduct, the comments that follow
can be scathing, sometimes identifying the subject, and in
some cases impugning his or her character. The subject
can end up paying an inordinately high price for what could
have been an extremely minor anti-social “transgression”.
Unwitting subjects could find themselves in the digital
equivalent of the medieval “stocks”, frozen into position in an
online square for all and sundry to take out their frustrations
and pass abusive public judgment on.
When online castigation is taken too far and becomes premised on untruths, falsehoods and malicious name-calling (that are disproportionate to the gravity of the conduct), the subject of a post can turn into a victim of a digital lynch mob. In these circumstances, does the law provide such victims with recourse against pernicious, user-generated online content?
This article considers the imminent Protection from
Harassment Act (“Harassment Act” or “Act”) and discusses
its utility in affording relief against capricious forms of
citizen journalism. This article will also discuss nuances in
enforcing the Act in respect of user-generated content that
is hosted on, and moderated by, a central website or portal,
and questions whether the Act is adequately equipped to
deal with such content.
The Harassment Act and Objectionable Online
Content
On 14 March 2014, Parliament passed the Protection from
Harassment Bill 2014 (which will soon be enacted as the
Harassment Act). Introduced to address a lacuna in our
law in respect of harassment,3 the Act criminalises certain
forms of physical and non-physical harassment. The former
encompasses physical stalking and sexual harassment at
the workplace; the latter includes anti-social behaviour in
the cyber sphere.
Since the introduction of the Harassment Bill, much ink
has been spilt on aspects of the Act, such as its purpose,
feasibility, scope and practicality of enforcement. It is,
therefore, not within the ambit of this article to give an exposition
of the Act and its general operation, save to say that the
broadly-worded definition of what constitutes harassment
affords the Act considerable latitude in targeting a whole
host of undesirable online conduct.
Under s 4 of the Act, any person who makes “threatening,
abusive or insulting communication4 which is heard, seen
or otherwise perceived by any person likely to be caused
harassment, alarm or distress” is guilty of a criminal
offence. This broadens the pre-existing statutory provision
for punishing harassment, which was found in s 13A of the
Miscellaneous Offences (Public Order and Nuisance) Act
(“MOA”).5
A number of observations can be made from a literal reading
of s 4. First, photographs and comments posted on the
internet will incontrovertibly be deemed “communication”
for the purposes of the Act. This is much broader than
s 13A of the MOA, which was more circumscribed and
did not appear to apply to digital communications (only
criminalising “threatening, abusive or insulting words or
behaviour” or “writing, sign or other visible representation
which is threatening, abusive or insulting”).
Second, whether or not a post offends the Act is a matter to
be ascertained purely from the psychological repercussions
suffered by the victim; the intention of the post’s originator
is immaterial.
Third, the victim does not have to suffer actual “harassment,
alarm or distress.” It suffices for the victim to show that,
given the offending post, it is likely that he or she will feel
harassed, alarmed or distressed. This broadens the pre-existing definition under s 13A(1) of the MOA, which requires
that actual harassment, alarm or distress be caused before
the statute bites.
Fourth, unlike an action in defamation, the veracity of the
offending content cannot be put up as a defence. In other
words, it does not matter if a post genuinely depicts the victim
engaged in objectionable conduct. As long as the victim can
show that he or she is likely to be psychologically affected
by the post, an offence under s 4 can be established.
Two statutory defences are, however, open to the originator
(provided under s 4(3) of the Act). First, the originator is not
guilty of an offence if he or she can prove that there was
no reason for him or her to believe that the offending post
would be heard, seen or otherwise perceived by the victim.
It is difficult to imagine how this defence can possibly apply
to a post made on the Internet.
Content, once uploaded onto the web, can be accessed by
anyone from anywhere in the world (except, of course, when
the content is uploaded onto a password-protected personal
blog or website, or when the originator puts up a post on
his or her social networking profile, but restricts access to
selected groups of people). The applicability of this defence
becomes more improbable in the context of posts made on
citizen-journalism platforms, which are popular and openly
viewed by many, some on a very regular basis.
Second, the originator can escape liability if he or she can
prove that, in posting the content in question, his or her
conduct was reasonable. Reasonableness, in the context
of the Act and online content, is not defined. Would posting
a “truthful” (yet damaging) photo or comment qualify as
reasonable conduct? What about content arising out of an
intention to warn the public of a social menace (perhaps in
the case of a flasher caught on camera)? As a preliminary
thought, it is difficult to envisage how harassing comments,
especially those that impugn the character of a victim,
can, in the eyes of the common person, be regarded as
“reasonable”, even when there is indeed some modicum
of truth in the opinions posted. Therefore, absent the
development of case law, one can only surmise (in the
infancy of the Act) what this defence of reasonableness will
encompass.
Self-help Remedies Under the Act
While an originator guilty of an offence under ss 3, 4, 5 and
7 is liable to criminal sanctions,6 of more interest to potential
victims are the possible reliefs (otherwise known as “self-help” remedies available under the Act7). The Harassment
Act recognises that offending online content, if left on the
internet, can have a “continuing”8 effect on the victim and,
therefore, allows a victim to apply for what is known as a
Protection Order (“PO”) under s 12 of the Act. Such an
order can require the originator to remove the offending
content he or she has posted and to refrain from posting
further offending content.9 This is a practical remedy that
seeks to eradicate the source of the harassment at its
roots. In his second reading of the Harassment Bill, Minister
for Law, K Shanmugam, said a PO is essentially focused
on helping laypeople “navigate the court process without
involving lawyers”.10 Applications for POs will, therefore, “be
governed by a set of simplified court procedures and court
forms”, and there will be in place “expedited processes in
the courts which can give this remedy immediately within a
day, two days, sometimes”.11 If these ideals are realised, offending content that causes harassment can be nipped in the bud before it is shared and re-shared by other online users, causing wider damage.
What About Posts that are Not Severe Enough to Constitute Harassment, but are Nonetheless Untrue?

Not every online post directed at a particular individual will cause the individual to feel “harassed, alarmed or distressed”. These are, after all, rather strong emotions that may or may not be invoked, depending on the nature and context of the post, as well as the individual’s psychological resilience. The Harassment Act, therefore, provides something of a halfway-house approach which deals with untrue content even when such content does not amount to harassment (and, therefore, cannot be dealt with under s 4 of the Act).

The example of the national serviceman who was incorrectly accused of not giving up his seat on the train provides an apt illustration of how this approach may work. The serviceman might not have sufficient basis to claim that he has been harassed by the post. He might, however, feel indignant that the photo was posted in a manner that cast him in a bad light, and that such a false depiction of the actual events, if allowed to stay online, may eventually lead to his being flamed. The serviceman may seek recourse under s 15 of the Act. Section 15 allows a subject of an untrue statement of published12 fact to apply to the District Court for an order compelling the originator of the post to cease publication of the untrue post unless the originator of the post can offer to publish a notification bringing attention to the falsehood and the true facts. This has the effect of either eliminating the untrue content from the internet altogether or setting the actual facts straight with the hope of minimising damage that may stem from the falsehoods published.

How do the Reliefs Provided by the Harassment Act Work in the Context of Content Uploaded onto Citizen-journalism Portals?

The provision of self-help remedies under ss 12 and 15 of the Harassment Act (“Remedies”) is arguably the Act’s most practical utility in combating online harassment. While such
Remedies appear to be easily sought and enforced, this
article will suggest that enforcing these Remedies against
content uploaded onto citizen-journalism websites may not
be as straightforward as one would like it to be.
The Harassment Act is certainly progressive in providing a
framework of both civil and criminal remedies for all forms
of harassment that now extends to regulating cyberspace
conduct. Be that as it may, it is unrealistic to expect the Act
to be a panacea for all forms of damaging speech online.
Some limitations are unavoidable in the context of citizen-journalism websites, which garner and put up mostly user-generated content that is lightly moderated. A few of these
limitations are considered in the following section.
Anonymous Contributions
Most websites that rely on user-generated content do not
require a user to register as a member before submitting
a post. A contributor need only furnish his or her name,
e-mail address and contact number together with the photo
or video, title and caption he or she intends to submit.13
For example, STOMP’s Terms and Conditions14 (which
govern use and access of the STOMP portal, including
contributions) do not require the user details accompanying
a contribution to be the contributor’s real name. In fact, from
a perusal of STOMP’s user-generated content, most posts
are attributed to the contributor through pseudonyms, from
which the real identity of a contributor cannot be gleaned.
Without a named individual against whom the Remedies
can be taken out, a victim of objectionable online content
will find considerable difficulty in seeking recourse through
the Harassment Act.
Section 19 of the Act attempts to address the issue of
anonymous contributions. For purposes of seeking the
Remedies, civil procedure rules may be enacted15 to
provide for orders directing an anonymous contributor to be
identified by an Internet location address, a username or
account, an e-mail address or any other unique identifier.
This means that a victim can first apply for the Remedies
even without prior knowledge of an originator’s identity,
then seek an ancillary order for the unknown originator to
be traced.
While theoretically feasible, this approach is not without
practical limitations. First, any tracing would have to be
done with the cooperation of the online portals. Second, it
would be futile to attempt to identify a contributor through
a pseudonym. The e-mail address and mobile number
furnished at the time of contribution would be of little or no
assistance if sham details were given: how does one serve
a PO on #HelloKitty1234?
In such circumstances, a complainant seeking a remedy
might have to apply to Court for pre-action interrogatories
to compel the relevant Internet Service Provider to disclose
the poster’s unique Internet Protocol address, a process that is both time-consuming and costly.
Damage Done: The Viral Nature of Online Content
The greatest stumbling block to the Act’s efficacy has little
to do with the shortcomings of the legislation as drafted per
se. Rather, the viral nature of online content means that any
remedial action to be taken under the Act could be too little
too late.
One Mdm Valerie Sim, through no fault of her own, found
her reputation and livelihood threatened by recent online
postings. Mdm Sim struggled to earn a living by collecting
waste oil from coffee shops in the Jurong area, which
involved siphoning oil from grease traps to be pumped into
oil tanks. The waste oil would then be processed by the
company she worked for and turned into biodiesel. She
was paid S$5 for every barrel of waste oil she managed
to collect. Unfortunately, a photo of her was uploaded onto
citizen journalism website STOMP in February this year as
part of an online scare concerning some mainland Chinese
nationals who were allegedly recycling gutter oil for use in
food preparation.16 The post involving Mdm Sim went viral
on Facebook and Twitter, and before she knew it, Mdm Sim
was advised to stop working by the National Environment
Agency (even though her job had nothing to do with the
gutter oil scare).
Mdm Sim’s situation illustrates how the Act might be a
step behind the realities of the Internet. The Act’s punitive provisions and self-help remedies would provide little succour
to Mdm Sim, who was effectively deprived of her livelihood
temporarily. The Remedies, even if obtained, would be unlikely to be enforced in practice against all the participants in her digital lynching.
Online content is easily reproduced. An offending post can
be picked up by a reader and re-posted on a number of
other platforms such as forums and social media sites by
providing a direct URL link to the original post. This sort
of reproduction can be nipped by simply removing the
original post; the satellite URL links then become dead. A
more pressing concern arises when objectionable content, especially photos and videos, is copied from the original post and posted afresh on another platform. Deleting the
original post, in such circumstances, will not eliminate the
re-posted content. This latter way of reproducing content
can also be more damaging because re-posters can add
and publish their own comments, which may be equally, if
not more, objectionable than the original ones.
Mdm Sim’s story and the above observations point to a need for third-party conduct (such as that of re-posters) to be addressed, in addition to that of the contributor and the content host. It may be possible for a victim to
successfully enforce the Remedies against a contributor
or STOMP, only to find that the offending content, although
removed at the root, has resurfaced by way of reproduction
on other websites. Would the victim then need to apply for
another order against the reproducer?
Fortunately, the answer is no, according to Minister for Law,
K Shanmugam, in his second reading of the Bill.17 A Remedy, once issued by the Court, is good against all publication of the offending content, including subsequent reposts. All a
victim has to do is to inform re-posters of the terms of the
Remedy earlier issued; a re-poster will thereafter be bound
by these same terms and is obliged to remove his or her
reproduction of the offending content.
However, given the viral nature of content on the Internet,
a Remedy, even if enforceable against the world at large,
might come a step too late, especially when the damage has already been done.
The Contributor does Not Have Control Over the
Visibility and Presentation of Submitted Content
Even when a contributor is finally identified, he or she might
not be the appropriate person against whom the Remedies
should be directed. This may appear counterintuitive, but is
so for two reasons:
1. Unlike where a user posts content onto his or her own
social media account (eg Facebook, Twitter, Instagram)
or on conventional web forums, a contributor to local
citizen-journalism portals usually does not have control
over the content he or she submits for publication.
Submitted photos, videos and commentaries are
presumably moderated by editors, and eventually
appear in the form of a “news report” that may not
manifest in the contributor’s own words (although
most content does display some of the contributor’s
own language in the form of direct quotes). While this
means that the portal has discretion over whether or
not submitted content eventually gets published, it
also means that no “delete” button is available to
the contributor with which objectionable content can
be removed at his or her will (unlike how a post to
Facebook can be easily deleted). Accordingly, a Court
Order compelling an identified contributor to remove his
or her submitted post will have no practical effect.
2. The fact that some online citizen-journalism portals
can moderate and edit a contributor’s submission
(especially the commentary accompanying a photo or
a video) points to the fact that objectionable statements
might not necessarily be generated by the contributor.
Except in instances where an objectionable statement
is found in a direct quote, offending statements may
well be a consequence of how certain portals choose to
present or paraphrase user-submitted commentaries.
This is particularly possible when a contributor submits
a photo or video, but the editors of the portals impute
additional opinions or descriptions of the submitted
media (different from or beyond those of the contributor).
In those two circumstances, rather than direct the
Remedies against a contributor, it would appear that the
online citizen-journalism portals, as the content host and
provider, should be made the subject of the Remedies.
The Harassment Act rightly acknowledges this: Protection Orders can be taken out against third parties
such as publishers or website administrators (in addition
to harassers), and these third parties can be compelled to
take down offending content. This is evident from the fact
that s 12(3)(b) allows the District Court to make an order
requiring that “no person shall publish or continue to publish
the offending communication”. The term “person” is broad
enough to encompass not only the primary harasser, but
also third parties with control over the offending content.
Section 15(2) also employs the term “person” in relation to a
takedown order or an order for the clarification or correction
of false information. Given that these portals have sole control over content published on their websites, practicality would call for the Remedies to be directed at them, instead of at the contributor.
The Participative Nature of Online Content
Many content portals allow users to freely comment on
posts. One example is STOMP. To leave a comment, one must first sign up as a member and abide by cl 6 of STOMP’s Terms and Conditions,18 which prohibits users
from making comments that are “inaccurate, misleading,
libelous, defamatory … abusive … false” or those that
would violate any law or rights of any third party. However,
it is questionable whether, first, users refer to these
conditions before posting comments, and second, even if
they do, whether they abide by them. Comments that are
posted appear in real-time; they appear not to be screened
or moderated by STOMP before showing up on the site.
Irresponsible comments that amount to harassment or falsehoods can have an effect as damaging as, if not more damaging than, the original post.
If the original post was not objectionable, but a comment
(left by a third party) it attracts is, it seems likely that a victim
can simply apply to the Courts to have the Remedies issued
against the person leaving the comment. That person will then be obliged either to take down or to clarify his or her comment.
Conclusion: The Act Must be Complemented by a More Holistic Approach

While the Harassment Act is to be commended for bringing the law of harassment in Singapore up to speed with digital realities, a healthy dose of realism about the limitations of the Act will enable lawyers and laypersons to consider how best to vindicate their rights not to be harassed. This article has attempted to flag some of these limitations.

Addressing the bane of online harassment does not end with the provision and enforcement of remedies that eliminate the offending content. In the aftermath of an offending post, victims (and their legal advisors) are left with a need to consider how best to mend damaged reputations. To maximise the utility of the Act in protecting victims of harassment, lawyers looking to advise their clients on the remedies under the Act need to think beyond the limitations of the Act to tailor creative solutions that meet the realities of communications on social media.

For instance, in the landmark UK case of McAlpine v Bercow [2013] EWHC 1342, Lord McAlpine sued the wife of the Speaker of the House of Commons for insinuating on Twitter that he was a paedophile.

Interestingly, Lord McAlpine also proceeded to commence legal action against Twitter users who had “re-tweeted” the allegations, permitting Twitter users with fewer than 500 followers to settle the matter by making a donation of GBP 25 to a BBC charity for children, but pursuing legal action against 20 high-profile re-tweeters of the libel.

What Lord McAlpine’s case demonstrates is that online reputation management and recovery involve much more than the strict enforcement of legal rights: it is also a public relations battle in which your client’s legal rights and remedies are the starting point, not the destination.

Locally, the hapless Mr Anton Casey, who found himself pilloried endlessly and even stalked for his insensitive comments, did not just “lawyer up”; he engaged PR consultants to engage in damage control.19

So, while the Harassment Act provides a good starting point for lawyers to suggest remedies to their clients, lawyers must be open to considering how remedies under the Act can be used in conjunction with creative and calibrated means of reputation management. It is hoped that, with the future development of jurisprudence surrounding the Act, a better-rounded array of measures and solutions will be available to control and regulate online conduct.

► Choo Zheng Xi *
Senior Associate
Peter Low LLC
E-mail: [email protected]

► Fong Wei Li
Associate
Michael Hwang Chambers
E-mail: [email protected]

* The views expressed in this article are the personal views of the authors and do not represent the views of Peter Low LLC and Michael Hwang Chambers
Notes
1. “Close down STOMP.com.sg”; available at: http://www.change.org/en-GB/petitions/sph-stomp-com-sg-close-down-stomp-com-sg (accessed 23 May 2014).
2. Dan Gillmor, “We the Media: Grassroots Journalism by the People, For the People” (O’Reilly Press), at Introduction, p XXIV; available at: http://dl.e-book-free.com/2013/07/we_the_media.pdf (accessed 23 May 2014).
3. See AXA Insurance Singapore Pte Ltd v Chandran s/o Natesan [2013] 4 SLR 545 (“AXA Insurance”), and Yihan Goh, “The Case for Legislating Harassment in Singapore” (2014) 26 SAcLJ 68; cf Nicholas Hugh Bertram Malcomson and Another v Naresh Kumar Mehta [2001] SGHC 309.
4. Defined in s 2 as “any words, image, message, expression, symbol or other representation, that can be heard, seen or otherwise perceived by any person”.
5. (Cap 184, 1997 Rev Ed Sing).
6. And enhanced penalties for subsequent offences under s 8 of the Act.
7. Singapore Parliamentary Speeches and Responses, Second Reading Speech on the Protection from Harassment Bill (13 March 2014) at 165-168 (K Shanmugam, Minister for Law).
8. See ss 12(2)(a) and 12(2)(b) of the Act.
9. Section 12(3)(b) of the Act.
10. Supra note 7.
11. Ibid.
12. Defined in s 2 as making a “communication or statement available in any form such that the communication or statement is or can be heard, seen or otherwise perceived by the public in Singapore or any section of the public in Singapore, and includes cause to be published”.
13. STOMP website registration page: http://singaporeseen.stomp.com.sg/singaporeseen/external/contribute.php (accessed 23 May 2014).
14. Singapore Press Holdings, “Terms and Conditions”; available at: http://sph.com.sg/legal/website_tnc.html (accessed 23 May 2014).
15. By way of the Rules of Court, and by the Rules Committee constituted under s 80(3) of the Supreme Court of Judicature Act (Cap 322, 1985 Rev Ed Sing).
16. Jasmine Lim, “Grease Trapped”, The New Paper (10 March 2014); available at: http://news.asiaone.com/news/singapore/grease-trapped (accessed 23 May 2014).
17. Supra note 7 at 202-204.
18. Supra note 14.
19. See Yeo Sam Jo, “Anton Casey Goes for an Online Makeover”, The Straits Times (18 May 2014); available at: http://news.asiaone.com/news/singapore/anton-casey-goesonline-makeover (accessed 23 May 2014).
Singapore Law Gazette June 2014