“Art Post-Internet” was an exhibition curated by Karen Archey and Robin Peckham for the Ullens Center for Contemporary Art in Beijing in spring 2014. This is the specially designed PDF catalogue, whose front page is generated anew for each download with the IP address and approximate location of the user. It includes tentative definitions of “post-internet” by artists Cory Arcangel, Simon Denny, and Bunny Rogers; art critics Ben Davis and Paddy Johnson; academics Mark Tribe and Esther Choi; and museum professionals Christiane Paul, Raffael Dörig, Jamillah James, Ben Vickers, Omar Kholeif, and Gene McHugh.
The uncertainty about the nature of a written piece, that is, how ‘human’ it looks even when it was generated by a machine, is one of the issues raised by text generation software. The refinement of the algorithms used, and consequently of the results obtained, has brought the Turing test (a test of whether an artificial text can be mistaken for a human one) out of the laboratory and made it possible to administer it directly to social systems. This is exactly what Jeremy Stribling and two of his MIT friends did with Rooter: A Methodology for the Typical Unification of Access Points and Redundancy, a research paper generated by SCIgen, a program designed to automatically produce computer science papers, complete with graphs, figures and quotations, which can be accessed through a web interface and is free to use and experiment with. Sensationally, the paper was accepted at the prestigious WMSCI, the World Multi-conference on Systemics, Cybernetics and Informatics. Even though none of the reviewers explicitly approved it, none of them rejected it either, passing judgement along and tacitly endorsing the plausibility of the paper. After the scandal broke, the paper was of course withdrawn, but its authors are now quietly trying to get in touch with other presenters, offering to reimburse the attendance fee (raised thanks to the donations of 165 anonymous web users) in order to gain access to the conference anyway and complete the stunt by actually presenting the work with all the seriousness required by the context. These textual generations could produce a completely new meta-language, a sort of ‘lexicon of plausibility’ in which the signs of a radical mutation of linguistic structures, now the domain of machines, could be recognized.
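The kind of generation SCIgen performs rests on a simple idea: expanding a context-free grammar at random until only words remain. The sketch below is a toy illustration of that technique, not SCIgen’s actual grammar (which is vastly larger); the rule names and vocabulary here are invented for the example.

```python
import random

# Toy context-free grammar: keys are non-terminals, values are lists of
# possible productions; anything not in the dictionary is a literal word.
GRAMMAR = {
    "SENTENCE": [
        ["We present", "NAME", ", a", "ADJ", "methodology for the",
         "ADJ", "unification of", "NOUN", "and", "NOUN", "."],
    ],
    "NAME": [["Rooter"], ["Plasm"], ["Churn"]],
    "ADJ": [["typical"], ["robust"], ["scalable"], ["flexible"]],
    "NOUN": [["access points"], ["redundancy"], ["e-commerce"],
             ["the Turing machine"]],
}

def expand(symbol):
    """Recursively expand a symbol until only literal words remain."""
    if symbol not in GRAMMAR:
        return [symbol]
    production = random.choice(GRAMMAR[symbol])
    words = []
    for part in production:
        words.extend(expand(part))
    return words

def generate():
    # Join the words and tidy the spacing around punctuation.
    return " ".join(expand("SENTENCE")).replace(" ,", ",").replace(" .", ".")

print(generate())
```

Every run produces a different, grammatically plausible but meaningless sentence, which is precisely the ‘lexicon of plausibility’ the stunt exploited: reviewers skimming for surface form find nothing obviously wrong.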