Photographer Max Marshall discusses the changing nature of authorship and ownership in a networked world where others copy, paste, change, link to, attribute, or misattribute someone else’s work. Rather than “wasting time” seeking out those who misattribute or fail to give credit, Marshall suggests fostering and extending a community ethos of sharing, acknowledgement, and trust. With the loss of one’s individual authority, one gains serendipitous juxtapositions and interesting pingbacks created by the collective curatorship of the blogosphere, with the ultimate result that more people experience one’s art.
A community of faculty and staff at a college or university, working together through a networked data-tracking system that repeats, recontextualizes, and reinscribes content generated by its users, can be compared to Marshall’s notion of allowing one’s content to end up in interesting, unforeseen places. In response to demands from accrediting bodies, boards, and political funders who want data to “prove” the effectiveness of education and the value of a degree, colleges and universities are implementing student tracking systems to measure institutional efforts to ensure student success. These systems allow, and in some cases require, faculty to raise “flags” chosen from a prescriptive list of common academic difficulties and to narrate details of specific concerns about the student. The flag triggers an institutional response in the form of multiple communications with the student, the student’s advisor, the counseling center, success coaches, and the office of academic research. The flag and its contents are tracked by the system, which accumulates data about the student, the faculty member raising the flag, and the responses of others who intervene to assist the student.
Once a flag is raised, the automated system sends an email to the student, attributed to and ostensibly from the faculty member who raised the flag, without human intervention. That is, the system is designed to immediately reblog the information submitted by the faculty member by copying it and pasting it into a new context: that of the canned, concerned email to the student. At the very bottom of this email is the text the faculty member added to the flag. The bulk of the email, however, is generic text procedurally written by the system in the faculty member’s voice. The program uses algorithmic logic to construct the email from a bank of phrases as well as specific student-related information pulled from the flag, such as the student’s name, ID number, and course name. For example, the email to the student states, “I want you to be successful in every course you take” and “I am very concerned about your success in (Insert course name here).” In this case, the reposting by the software has attributed words to someone who did not author them and distributed the message to a variety of audiences. Faculty members often remain unaware that their comments about the student have been repurposed into an email and archived in the system in this format. All this reposting of data creates an archived virtual identity of the student as “someone who struggles.” While the college’s motivation for using such a system was partially based on caring and concern (as well as accountability demands by accrediting bodies), the reposting and archiving of confidential student information (such as reasons for missing class or likelihood of failure) creates a traceable, identifiable, and potentially public reinscription of the student.
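The mail-merge logic described above can be made concrete with a minimal sketch. All names here (the phrase bank, function, and sample data) are hypothetical illustrations of the general technique, not the vendor’s actual implementation: boilerplate phrases are stitched together, flag fields are substituted in, and the faculty member’s comment is appended at the bottom of a message signed in that faculty member’s name.

```python
# Hypothetical sketch of template-driven email assembly from a flag.
# Generic phrases written "as" the faculty member; {course} is a merge field.
PHRASE_BANK = [
    "I want you to be successful in every course you take.",
    "I am very concerned about your success in {course}.",
]

def build_flag_email(student_name, student_id, course, faculty_name, comment):
    """Assemble the automated email attributed to the faculty member."""
    body = [f"Dear {student_name} (ID {student_id}),"]
    # Substitute flag data into the canned phrases.
    body += [phrase.format(course=course) for phrase in PHRASE_BANK]
    # The instructor's own words appear only at the very bottom.
    body.append(f"Instructor comment: {comment}")
    # Attribution without authorship: signed by someone who wrote none of it.
    body.append(f"Sincerely,\n{faculty_name}")
    return "\n\n".join(body)

email = build_flag_email("Jane Doe", "000123", "ENG 101",
                         "Prof. Smith", "Missed three classes in a row.")
print(email)
```

The design choice this sketch highlights is that the only human-authored text is the comment near the bottom; everything else, including the signature, is machine-generated yet attributed to the professor.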
Unlike Marshall’s community of trust and acknowledgement, there is no human curator who makes aesthetic judgments about the value of the content or its applicability in a new context. The machine does not contemplate or judge the merits of the flag or comments and then determine an appropriate action to take, if any. It is an automatic switch that takes predictable, unconsidered actions applied to all inputs of a given type: in the machine-driven text sent to the student, Professor A says exactly the same words as Professor B. Advisors and counselors receive emails about students they have never met, emails containing personally identifying information and narrated details offered by a student to one person, now shared with everyone tagged by the system. While the tracking system itself is protected by a secured sign-on, the emails it generates are one click away from being forwarded to an outside party, thereby opening the professor and student to additional audiences and inscriptions. One must trust one’s colleagues and hope that such violations of FERPA do not occur, but the fact that the uncritical software acts without the knowledge of student or professor raises concerns about the viability of a community of trust when it is governed by algorithmic rhetoric programmed by an external corporation that profits from the program’s installation and use.
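The “automatic switch” behavior can also be sketched briefly. The flag types and recipient lists below are invented for illustration only; the point is structural: every flag of a given type triggers the same fixed fan-out of notifications, with no judgment of the flag’s merits or context.

```python
# Hypothetical routing table: flag type -> fixed list of recipients.
ROUTES = {
    "attendance": ["student", "advisor", "success_coach"],
    "at_risk_of_failing": ["student", "advisor", "counseling_center",
                           "office_of_academic_research"],
}

def dispatch(flag_type):
    """Return the fixed recipient fan-out for a flag type.

    Identical inputs always produce identical actions; the content of the
    flag, the student's circumstances, and the professor's intent play no
    role in who gets notified.
    """
    return ROUTES.get(flag_type, ["student", "advisor"])
```

A human curator might weigh whether a given concern warrants notifying the counseling center at all; the switch cannot, by design.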