As a society now enthralled with social media, sharing everything from our morning latte to our biggest triumph (though, tellingly, not our failures nearly as often) on platforms such as Instagram, we are obsessed with posting photo and video snippets to craft our image and identity online. In that identity construction, the creator or account holder clearly has dominion over content depicting themselves, but what about the instances when a bystander ends up in someone else's post inadvertently, or even secretly, without the bystander's consent? However harmless it may seem, is this a breach of ethics and personal privacy, or merely par for the course in our new age? Instagram "stories," which are particularly popular, make these questions especially pertinent.
Generally speaking, federal law seems to allow public recording wherever there is no "expectation of privacy," but such broad language was surely adopted before our current moment, when nearly everyone carries a smartphone, and therefore a camera, at all times. Non-consenting bystanders now find themselves recorded constantly, whether incidentally or as the deliberate subject of a post made to create content for the creator or account holder's online following.
Like many aspects of social media, this seems to be an unintended consequence of technology's rapid transformation. Even assuming the technology was created for a good and sound purpose, in good faith, public and corporate policy bear the responsibility to adapt in lockstep with that progression.
I highly doubt policymakers will rise to that challenge, which means we will likely need more technology to fix or counteract the unintended consequences of the technology that came before it. For example, as things advance further, I wonder whether insta-story bystanders could one day be recognized through facial recognition software and then notified when they appear on others' public, and maybe even private, social platforms. And if so, then what?
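To make that speculation a little more concrete, here is a minimal sketch of how such matching might work in principle, assuming the open-source Python face_recognition library. The opt-in registry, file names, and notify step are hypothetical placeholders, not anything Instagram actually offers.

```python
# Hypothetical sketch: match faces found in a story frame against bystanders
# who have opted in to be notified. Uses the open-source face_recognition
# library; the registry, image paths, and notify() are illustrative only.
import face_recognition

# Bystanders who registered a reference photo and want alerts (hypothetical opt-in).
registry = {
    "bystander@example.com": face_recognition.face_encodings(
        face_recognition.load_image_file("reference_photo.jpg")
    )[0],
}

def notify(contact, frame_path):
    # Placeholder for however a real service would reach the person.
    print(f"Heads up, {contact}: you appear in {frame_path}")

def scan_story_frame(frame_path):
    """Flag any registered bystander who appears in a single story frame."""
    frame = face_recognition.load_image_file(frame_path)
    for unknown_encoding in face_recognition.face_encodings(frame):
        for contact, known_encoding in registry.items():
            # compare_faces returns one boolean per known face.
            if face_recognition.compare_faces([known_encoding], unknown_encoding)[0]:
                notify(contact, frame_path)

scan_story_frame("story_frame.jpg")
```

Of course, a system like this would itself require a database of faces, which raises the very privacy questions it is meant to answer.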