Various internet-based companies are protected from liability for the content they deliver to their customers on the grounds that they aren't editing or producing that material. The question that algorithmic delivery and sorting should raise, however, is whether that activity does constitute editing: it selects what to show, when, and to whom, not based on a user-originated request or a neutral ordering such as chronology (putting the most recent items first, regardless of who they are from), but on some opaque metric that only the company really controls. Conceptually, this sounds like an automated version of an editor, even if it is one generated by using pervasive monitoring to watch the customer.
minor update:
Censorship by online companies such as Facebook or any other "distribution platform" should be recognized for what it is: a tacit and explicit admission of their editorial decisions and activities. If they are to be regarded as mere conveyors of information, useful and interesting, to an audience that is also in the business of posting and creating that same information, then there is no reason for the company to interpose itself as anything more than a disinterested platform; why should it be concerned with what its audience posts at all? If they are not engaged in editorial decisions, as they consistently claim, then there should be no issues around censorship or other content-based selection at all.