Searching for specific images may become easier thanks to a new tool that generates image queries based on a sketch or description of objects in spatial relationships. The tool, which has been proposed by researchers from KAUST and University College London, makes it easier to search the world’s ever-expanding databases for pictures matching a wider and more powerful range of image queries.
The enormous collections of photographs and pictures now available in online databases represent a remarkable resource for research and the creative arts. Yet however unfathomably rich these databases might be, they are only as useful as a user's ability to search them effectively with a query.
“When searching for images in a database like Flickr, the images need to include a short but informative description,” explained Peter Wonka, the KAUST researcher who led the study. “The description needs to be short to allow the search algorithm to match against millions of possibilities, but also needs to be informative because the correct images need to be found based solely on this description.”
Wonka and his colleagues Paul Guerrero and Niloy Mitra from University College London wanted to add something more powerful to the currently limited repertoire of image search tools without adding extra metadata to existing images.
“Instead of describing just the individual objects occurring in an image, we wanted to describe the relationships between objects—such as ‘riding,’ ‘carrying,’ ‘holding’ or ‘standing on’—in a way that can be computed and searched for efficiently,” noted Wonka.
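The notion of querying by object relationships can be illustrated with a toy sketch. The data layout and function below are hypothetical, invented purely for illustration, and are not the authors' actual system: here each image is assumed to carry a set of (subject, relation, object) triples, and a query matches when every requested triple appears in an image's annotations.

```python
# Illustrative sketch only -- not the researchers' method.
# Assumption: each image is annotated with (subject, relation, object)
# triples such as ("man", "riding", "horse").

def find_images(annotations, query_triples):
    """Return IDs of images whose annotations contain every query triple."""
    return [
        image_id
        for image_id, triples in annotations.items()
        if all(q in triples for q in query_triples)
    ]

# Hypothetical toy database mapping image IDs to relationship triples.
annotations = {
    "img_001": {("man", "riding", "horse"), ("horse", "standing on", "grass")},
    "img_002": {("woman", "holding", "umbrella")},
    "img_003": {("man", "riding", "horse"), ("man", "carrying", "bag")},
}

matches = find_images(annotations, [("man", "riding", "horse")])
```

A query such as "man riding horse" then retrieves only the images whose annotations include that exact relationship, rather than every image that merely contains a man and a horse.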