The system uses a central imaging sensor connected to a processor that runs a machine-learned object recognition program locally.
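
As an illustration, here is a minimal sketch of what the local recognition step could look like, assuming a camera read through OpenCV and a TensorFlow Lite classifier; the model file, label list and input normalisation are placeholders rather than the station's actual implementation.

```python
# Sketch of on-device waste classification (assumed stack: OpenCV for
# the camera, TensorFlow Lite for local inference; model path, labels
# and normalisation are illustrative placeholders).
import cv2
import numpy as np
from tflite_runtime.interpreter import Interpreter

LABELS = ["paper", "plastic", "glass", "organic", "residual"]  # assumed categories

interpreter = Interpreter(model_path="waste_classifier.tflite")
interpreter.allocate_tensors()
input_detail = interpreter.get_input_details()[0]
output_detail = interpreter.get_output_details()[0]

def classify_frame(frame):
    """Run the local model on one camera frame and return (label, confidence)."""
    h, w = (int(v) for v in input_detail["shape"][1:3])
    resized = cv2.resize(frame, (w, h))
    # Assumes a float-input model normalised to [0, 1].
    tensor = np.expand_dims(resized.astype(np.float32) / 255.0, axis=0)
    interpreter.set_tensor(input_detail["index"], tensor)
    interpreter.invoke()
    scores = interpreter.get_tensor(output_detail["index"])[0]
    best = int(np.argmax(scores))
    return LABELS[best], float(scores[best])

cap = cv2.VideoCapture(0)  # the central imaging sensor
ok, frame = cap.read()
if ok:
    label, confidence = classify_frame(frame)
    print(f"Detected {label} ({confidence:.0%})")
cap.release()
```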

The program categorises the waste item and informs the user.

Once the waste item is recognised, the central display informs the user which bin to place it in: an animation plays to indicate the direction, and a coloured ring lights up.
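
In practice this feedback step can be little more than a lookup from the recognised category to a bin, a direction animation and a ring colour. The sketch below assumes hypothetical display and LED ring interfaces, and the bins, directions and colours shown are illustrative only, not the station's actual configuration.

```python
# Sketch of the feedback step: map a recognised category to a bin, an
# arrow animation for the central display, and the colour of that bin's
# LED ring. All names, colours and animation ids are illustrative.
from dataclasses import dataclass

@dataclass
class BinFeedback:
    bin_name: str
    arrow_animation: str          # animation clip shown on the central display
    ring_colour: tuple            # RGB colour for that bin's LED ring

FEEDBACK = {
    "paper":    BinFeedback("paper bin",    "arrow_left",  (0, 0, 255)),
    "plastic":  BinFeedback("plastic bin",  "arrow_right", (255, 200, 0)),
    "organic":  BinFeedback("organic bin",  "arrow_down",  (0, 180, 0)),
    "residual": BinFeedback("residual bin", "arrow_up",    (128, 128, 128)),
}

def inform_user(category, display, led_rings):
    """Play the direction animation and light the matching coloured ring."""
    feedback = FEEDBACK[category]
    display.play(feedback.arrow_animation)                    # hypothetical display API
    led_rings[feedback.bin_name].fill(feedback.ring_colour)   # hypothetical LED ring API
```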

From a technical standpoint, we designed all the animations in a single file, from which we could then extract the parts needed for the display and for each of the LED strips.

This way, we ensured that the timing and speed of the animation stayed in sync across every output.
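
One way to get this behaviour is to have every output sample the same master timeline, so the display and the LED strips cannot drift apart in timing or speed. The sketch below assumes a JSON animation file with one track per output and hypothetical show() calls on the display and strip objects; it is an illustration of the synchronisation idea, not the station's actual player.

```python
# Sketch of one animation file driving both the display and the LED
# strips: all outputs sample the same master clock, so their timing
# and speed stay in sync. The JSON layout and show() calls are assumed.
import json
import time

with open("bin_animation.json") as f:
    animation = json.load(f)      # one file holding a track for every output

def frame_at(track, t):
    """Return the keyframe of a track that is active at time t (seconds)."""
    active = [k for k in track["keyframes"] if k["time"] <= t]
    return active[-1] if active else track["keyframes"][0]

def play(animation, display, led_strips, fps=30):
    start = time.monotonic()
    while (t := time.monotonic() - start) < animation["duration"]:
        display.show(frame_at(animation["display_track"], t))
        for name, strip in led_strips.items():
            strip.show(frame_at(animation["led_tracks"][name], t))
        time.sleep(1 / fps)
```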

The recycling station recognises when a person approaches and wakes up with an animation on its front and rear LED strips to signal to the user that it is ready to help.
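
A simple idle loop could implement this wake-up behaviour. In the sketch below, the presence sensor and strip objects stand in for whatever hardware interface the station actually uses, and the detection threshold is an assumed value.

```python
# Sketch of the wake-up behaviour: poll a presence sensor and, when a
# person comes within range, play the wake animation on the front and
# rear LED strips. Sensor and strip objects are placeholders.
import time

WAKE_DISTANCE_CM = 120   # assumed detection threshold

def run_idle_loop(presence_sensor, front_strip, rear_strip, wake_animation):
    awake = False
    while True:
        near = presence_sensor.distance_cm() < WAKE_DISTANCE_CM  # hypothetical sensor API
        if near and not awake:
            front_strip.play(wake_animation)   # signal readiness on both strips
            rear_strip.play(wake_animation)
            awake = True
        elif not near:
            awake = False
        time.sleep(0.1)
```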