This Android application was made as part of my Information Technology undergraduate capstone project. To create it, I recorded hundreds of hours of gameplay footage and stored frame-by-frame stills in a dataset of images. When a user took a picture of a video game, an algorithm would search the dataset for recurring color themes and iconography to return a result. Each series of images was associated with a self-written walkthrough, which would then pop up within the application to provide instant, accurate assistance with the game.

At its core, GameSnap was built on a machine learning approach. Through supervised learning, the application would search labeled training data (the footage stills that I tagged) and attempt to deliver the most relevant response to end users.
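Since the recognition itself was handled by the IQ Engines API (now discontinued), the matching idea can only be sketched. The snippet below is an illustrative assumption, not the production code: it represents each labeled still as a coarse color histogram and returns the label of the nearest stored example, which is one simple way to match "recurring color themes" against tagged training data.

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical sketch: nearest-neighbor matching over labeled color histograms.
// Class and method names are illustrative, not from the original application.
public class FrameMatcher {

    // One labeled training example: a normalized color histogram plus its walkthrough tag.
    private static class Example {
        final double[] histogram;
        final String label;
        Example(double[] histogram, String label) {
            this.histogram = histogram;
            this.label = label;
        }
    }

    private final List<Example> trainingSet = new ArrayList<>();

    // Add a tagged still (its histogram) to the labeled training set.
    public void addExample(double[] histogram, String label) {
        trainingSet.add(new Example(histogram, label));
    }

    // Return the label of the closest stored histogram (squared Euclidean distance).
    public String classify(double[] query) {
        String best = null;
        double bestDist = Double.MAX_VALUE;
        for (Example e : trainingSet) {
            double d = 0;
            for (int i = 0; i < query.length; i++) {
                double diff = query[i] - e.histogram[i];
                d += diff * diff;
            }
            if (d < bestDist) {
                bestDist = d;
                best = e.label;
            }
        }
        return best;
    }

    public static void main(String[] args) {
        FrameMatcher m = new FrameMatcher();
        // Toy three-bin (R, G, B) histograms standing in for tagged footage stills.
        m.addExample(new double[]{0.8, 0.1, 0.1}, "desert-level-walkthrough");
        m.addExample(new double[]{0.1, 0.2, 0.7}, "water-level-walkthrough");
        // A reddish query frame matches the desert-level example.
        System.out.println(m.classify(new double[]{0.7, 0.2, 0.1}));
    }
}
```

A production system would use far richer features than a three-bin histogram (the real service also matched iconography), but the supervised-learning shape is the same: tagged examples in, nearest match out.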

The API that this application was built upon has since been discontinued.

Release Date

May 10, 2018

Technology Used

IQ Engines, Computer Vision, Java

Screenshot of the GameSnap application