Blurred Lines: The Ethics of Digital Media Creation
Digital Media Ethics Defined
According to the University of Wisconsin’s Center for Journalism Ethics, digital media ethics deals with “the distinct ethical problems, practices, and norms of digital news media.” This definition examines digital media ethics mainly from a journalistic point of view, but digital media encompasses much more. It includes anything posted on social media, professional online journalism, citizen journalism, and digital photojournalism, among other forms. As media continues to evolve, unethical media practices are growing alongside it.
Image Ethics
With the growth of cell phones and the increase in mobile usage, it has never been easier to capture images and videos; the ability is literally at anyone’s fingertips. The ease of taking and sharing images is a large part of the problem and demands ethical scrutiny, especially because photos are now so easy to manipulate. Of course, there is a large difference between adjusting the tone of a photo and Photoshopping a dinosaur into the background of the New York City skyline and calling it legitimate. Beyond editing images, image ethics also governs what can and cannot be posted on social media. The guidelines for posting differ by platform: many social media platforms maintain their own sets of “community guidelines,” which are discussed below.
Community Guidelines
Instagram’s Community Guidelines
Instagram states on its website, under its FAQ, that its community guidelines set the policies for what is and is not allowed on Instagram. These guidelines cover seven topics: intellectual property, appropriate imagery, spam, illegal content, hate speech, self-injury, and graphic violence. The appropriate-imagery portion of the guidelines has been contested and revised; currently, Instagram allows photos of post-mastectomy scarring and breastfeeding, both of which used to be flagged for nudity. It is also important to note that the guidelines on self-injury extend to eating disorders; the guidelines do note that content depicting self-injury or eating disorders may be posted only to create awareness or offer support.
TikTok’s Community Guidelines
TikTok’s community guidelines are “principles that help embody our commitment to human rights,” per its website. The three headings of these principles are balance, dignity, and fairness. Apart from the community principles, the overview page includes ten other pages covering youth safety and well-being, safety and civility, mental and behavioral health, sensitive and mature themes, integrity and authenticity, regulated goods and commercial activities, privacy and security, For You feed eligibility standards, accounts and features, and enforcement. TikTok deserves recognition for its stance on youth safety, especially in such a digital age. This part of the guidelines covers exploitation, psychological and physical harm, and child sexual abuse material (CSAM). It also addresses the dangers of online challenges, which young children on the internet often take part in and which can result in serious physical harm.
X’s Community Guidelines
X does not have a specific set of community guidelines; instead, it offers a page dedicated to Rules and Policies. Certain X guidelines and resources are country-specific, which is not the case on either of the previously mentioned platforms. The closest thing X has to community guidelines is its Terms of Service. These terms begin with the age restriction (users must be 13 years or older) and the privacy policy. Essentially, X takes the stance that you are responsible for the media you consume on its platform, illustrating an “at your own risk” approach. Regarding the content that individuals post, X reserves the right to remove any content that does not comply with applicable laws, rules, or regulations, or that may violate the user agreement.
Facebook’s Community Guidelines
Seeing that Facebook, like Instagram, is also owned by Meta, one can expect some similarities between the two. Facebook follows a list of Community Standards that “outline what is and isn’t allowed on Facebook.” Facebook claims that its standards were created based on feedback from users and from experts in technology and public safety. These standards draw upon four main categories: Authenticity, Safety, Privacy, and Dignity. Authenticity and Dignity are the two most easily explained pillars: authenticity concerns users misrepresenting who they are or what they are doing on Facebook, and dignity simply means respecting other users. However, authenticity has an entire subcategory devoted to misinformation, something Facebook is all too familiar with. Facebook can flag the following as misinformation: content that could cause physical harm, content that may interfere with political proceedings, and deceptive media. Facebook’s safety category follows the same guidelines as Instagram’s and also explicitly acknowledges eating disorders as a form of self-injury.