What Would Robot Poetry Look Like?
Consider the word “table.” Reading it likely elicits thoughts of a place to put things, to eat off of. Maybe it’s wooden, maybe it has legs that are also wooden or made of another material. Maybe, if you’re interested in science, “table” is to you a synonym for “chart,” a means of organization. Or, if you’re prone to debating, you’re familiar with “tabling” discussions, or setting them aside for later. Because the latter definition is a tough thing to illustrate — it describes a series of actions and thoughts rather than a concrete object — it’s unlikely to come up in a Google Image search for “table.”

This is one quality that separates search engines from humans, who are currently more capable of making connections between disparate meanings. While humans use metaphors informed by experiences, search engines — which operate by recalling “tags” that have been given to images and other pieces of information — are necessarily more straightforward. If you search for the word “table,” you’ll see a lot of oak, and not a lot of depictions of productive, businesslike conversations.

Corey Pressman, Director of Strategy at app developer Neologic, is at work on a project that aims to change all that. Called “Poetry for Robots,” it confronts the question, “What if we used poetry and metaphor as metadata? Would a search for ‘eyes’ return images of stars?”

So, what exactly does it mean to “use poetry and metaphor as metadata”? It’s not as complicated as it sounds; basically, “metadata” is data that’s used to describe data. On Flickr, when a picture you stumble on lists the type of camera it was taken with, that’s metadata; at a library, the genre a book belongs to is metadata, too.
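To make the idea concrete, here is a minimal sketch of how metaphorical tags could sit alongside literal ones in a tag-based image search. The filenames and tags are invented for illustration, and this is not the Poetry for Robots system itself, just a toy model of the tagging scheme it proposes:

```python
# Toy tag-based image index (hypothetical data, for illustration only).
# Each image carries both literal tags and poetic/metaphorical ones,
# so a metaphor-aware search can match either sense of a word.
image_index = {
    "starfield.jpg": {"stars", "night", "sky", "eyes"},   # "eyes" added as a metaphorical tag
    "oak_table.jpg": {"table", "wood", "furniture"},      # the literal sense of "table"
    "meeting.jpg":   {"discussion", "business", "table"}, # the "tabled discussion" sense
}

def search(query):
    """Return the names of all images whose tag set contains the query word."""
    return sorted(name for name, tags in image_index.items() if query in tags)

# With metaphorical metadata, a search for "eyes" can surface a star photo,
# and "table" matches both the furniture and the figurative usage.
print(search("eyes"))
print(search("table"))
```

In a plain tag index, only the literal tags would exist and `search("eyes")` would return nothing for the star photo; the whole experiment amounts to enriching that tag set with the associations a poet would make.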