Tech Project

AMORE investigates how humans use language to talk about the world, and works to enable computers to understand us.
From Feb. 1, 2017 to Jan. 31, 2022
  • amore-angelina-gemma.jpg  Credits: PI, Gemma Boleda
  • amore-examined-boy_Tw0b9VS.jpg  Credits: photograph of the boy: Hagerty Ryan, USFWS
  • car-with-gps.jpg  Credits: CC0 Public Domain

Description of the challenges faced by the Tech Project

Imagine your GPS could see. To answer the question "Do I turn there where that big tree is?", a camera is not enough; the GPS needs to connect what you say to the portion of reality that surrounds your car. AMORE builds machines that connect language to reality, and seeks an understanding of how people do it. The main challenges are: 1) identifying which entities ("that big tree") are being talked about, on both the visual and the linguistic sides; 2) tracking the entities as they appear again, adding new information about them as needed; 3) learning these two abilities directly from examples. We present the machine with different tasks that require using language to talk about the world, and the machine progressively learns to represent both the entities and the language that we use to refer to them.
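The second challenge, tracking entities across mentions, can be pictured with a toy sketch. This is not the project's actual model (which learns this behavior from data); it is a hand-written stand-in rule, and the class and method names are hypothetical, chosen only for illustration.

```python
class EntityTracker:
    """Toy sketch of entity tracking: keep a registry of entities and,
    for each new mention, either link it to a known entity of the same
    kind or create a new one, accumulating information over time."""

    def __init__(self):
        self.entities = []  # each entity: {"kind": str, "attrs": set}

    def mention(self, kind, attrs=()):
        attrs = set(attrs)
        # Toy linking rule: reuse the first known entity of this kind.
        # The real system would have to learn when two mentions
        # ("that big tree", "the tree on the corner") corefer.
        for ent in self.entities:
            if ent["kind"] == kind:
                ent["attrs"] |= attrs  # add the new information
                return ent
        ent = {"kind": kind, "attrs": attrs}
        self.entities.append(ent)
        return ent


tracker = EntityTracker()
tracker.mention("tree", {"big"})
tracker.mention("tree", {"on the corner"})  # same tree, new attribute
```

After both mentions the tracker holds a single tree entity carrying both pieces of information, which is the behavior the project wants the machine to learn rather than have hard-coded.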

Brief description of technology

We use Machine Learning to enable computers to understand how we talk about the world. Programs tell computers how to carry out tasks, as if you gave a recipe to your neighbor and he baked a cake by following the instructions. Machine Learning enables computers to learn the recipe themselves by observation, as if your neighbor watched you baking a cake and then made one himself back at home. For instance, in AMORE we give the computer a collection of images depicting different entities (say, two boys and one table), and we tell it that if we ask for "the table" it should retrieve the image of the table, but if we ask for "the boy" it should complain that we didn't get our language right, because there are two boys. By crunching thousands of examples, the model is able to generalize to situations it has never encountered. We will give the artist access to the data we learn from, to the computational model itself (definition and software), to the answers the model gives for new data, and to our analyses of the model's behavior. We are ready to discuss any aspect of the research.
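The desired input-output behavior of that example can be sketched as a small function. This is only an illustration of the task itself, not of the learned model: the model must generalize this behavior from thousands of examples, whereas here the rule is written by hand, and the function name is hypothetical.

```python
def resolve_definite_reference(scene, query):
    """Toy illustration of the task: given a scene (a list of entity
    labels) and a definite description like "the table", return the
    unique matching entity, or signal that the reference fails."""
    noun = query.removeprefix("the ").strip()
    matches = [entity for entity in scene if entity == noun]
    if len(matches) == 1:
        return noun  # exactly one candidate: the reference succeeds
    if not matches:
        return None  # nothing in the scene matches the description
    # more than one candidate: "the boy" is ambiguous with two boys
    raise ValueError(f'"{query}" is ambiguous: {len(matches)} candidates')


scene = ["boy", "table", "boy"]
resolve_definite_reference(scene, "the table")  # returns "table"
# resolve_definite_reference(scene, "the boy") raises ValueError
```

The point of the machine-learning setup is that no such rule is ever programmed: the computer is only shown scenes, queries, and the correct responses, and must work out the regularity itself.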

What the project is looking to gain from the collaboration and what kind of artist would be suitable

AMORE examines how language relates to the world. Art often asks the same question, either about language itself or about other modes of human expression ("Ceci n'est pas une pipe"), including pictorial representation and sculpture. The project would benefit from an interaction with an artist interested in this topic. We expect the collaboration with the artist to enrich our perspective on the following points, by viewing them through the artistic lens: the computational model itself (how it works, how we build it), the tasks that we present it with (task definitions, data), our results, and the overall topic of the project (reference, or how language relates to the world). We also expect the collaboration to greatly expand the dissemination of our research to society, reaching out to the wider public in a way we could not do ourselves.

Resources available to the artist

Our university supports this application. The artist will have access to office space with a computer, internet connection via cable and wifi, and a generous travel budget to present results related to this collaboration (the latter with project funds). The department is also willing to explore other needs the artist may have regarding their working environment. My team (two PhD students, three post-docs, and a senior member) and I look forward to the collaboration and will allocate time and space as needed for the artistic production.