Automatic Waste and Recyclables Sorting

A prototype system to classify and automatically sort recyclables from trash.
By Jonathan Lee.
August 27, 2020

SPARCFIRE

Solar Powered Automatic Recycling Container (For Independent Research Endeavors).

Sparcfire is a smart 2-in-1 recycling and trash bin that automatically sorts recyclables from trash when a user drops an item into the receptacle. The system uses a Raspberry Pi as its onboard processor, with a downward-facing camera under the lid to classify waste. When someone drops an item into the bin, PIR sensors detect the movement and activate the camera. The camera then classifies the item into one of six material types using a convolutional neural network, and the item falls into either the “trash” side or the “recycling” side of the bin.

Diagram of how the system works.
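The sense-classify-sort flow above boils down to a simple routing decision once the classifier has produced a material label. A minimal sketch (the class names come from the dataset described later; the function name is my own illustration, not the project's actual code):

```python
# Which material classes go to the recycling side of the bin.
RECYCLABLE = {"glass", "plastic", "metal", "paper", "cardboard"}

def route(material: str) -> str:
    """Map a predicted material class to a side of the bin."""
    return "recycling" if material in RECYCLABLE else "trash"
```

Everything the classifier recognizes as a recyclable material opens the recycling flap; anything else, including the catch-all "trash" class, goes to the trash side.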

I completed this project over the summer of 2020 as part of the GW Systems Lab. This project was motivated by the need for a city-scale sensing and research platform that was also power and resource constrained. The platform would support parallel research efforts on Sunneed, a framework for tracking and managing the distribution of power on a multi-tenant system. While the project never made it past the prototype phase, the vision was to benefit urban areas by reducing recycling contamination and provide a large-scale sensing platform to collect environmental data.

Platform

The container is built around a Rubbermaid trash container lid with a split-flap design. I built a wooden base and then a plywood housing that would contain the downward-facing webcam and Raspberry Pi.

Mechanical Sorting

A look underneath the lid with the 4 solenoids and 2 servos and belt drives.

Servo and solenoid actuators underneath the lid control the flaps. Four solenoids are fixed on 3D-printed mounts, one pair per side, to control whether each flap can open. This locks the lid closed without drawing any power and prevents a new item from falling into the bins before it has been sorted. Once the camera has identified the item, the corresponding flap is opened by momentarily activating its pair of solenoids, letting the item fall under gravity alone. If the item is very light (like a napkin), its weight is not enough to push the flap open, so a servo-and-belt system rotates the flap’s hinge and lets the item fall through.
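The release sequence can be sketched as a small controller. This is a hardware-free sketch: `actuate` stands in for the real GPIO/servo driver calls, and the channel names and pulse duration are illustrative assumptions, not the project's actual wiring.

```python
import time

class FlapController:
    """Sketch of the flap-release sequence. `actuate(name, state)` is a
    hypothetical stand-in for the real solenoid/servo driver code."""

    def __init__(self, actuate, pulse_s=0.2):
        self.actuate = actuate   # callable driving the hardware
        self.pulse_s = pulse_s   # how long the solenoids stay energized

    def release(self, side, item_is_light=False):
        # Momentarily energize the pair of solenoids locking this flap,
        # so the flap is free to swing open.
        self.actuate(f"{side}_solenoids", True)
        time.sleep(self.pulse_s)
        if item_is_light:
            # A light item can't push the flap open on its own, so
            # drive the servo/belt to rotate the hinge instead.
            self.actuate(f"{side}_servo", True)
            time.sleep(self.pulse_s)
            self.actuate(f"{side}_servo", False)
        # De-energize so the flap re-locks without drawing power.
        self.actuate(f"{side}_solenoids", False)
```

Because the solenoids only need power during the brief release pulse, the bin stays locked at zero actuator power the rest of the time.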

The Raspberry Pi controls the solenoids via its 3.3 V GPIO pins. An optocoupler isolates the 3.3 V logic side from the 12 V supply that drives the solenoids.

Waste Classification

LED lights underneath the wooden housing illuminate the trash and provide even lighting across the surface. The lights also act as an indicator to the user: green if the item was recycled, red if it was trashed.

MobileNet architecture and performance.

I used a MobileNet convolutional neural network for multi-class classification. I trained the model on this image dataset, which has categories for glass, plastic, metal, paper, cardboard, and trash. There were approximately 400–500 images of each type, augmented through random horizontal and vertical flipping. Classifying waste into material types is possible with CNNs, but the accuracy I achieved during testing left room for improvement.
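The flip-based augmentation mentioned above can be sketched in plain NumPy. This is a minimal stand-in for whatever image pipeline the training code actually used; the function name and 50% flip probability are my own assumptions.

```python
import numpy as np

def augment_flips(image, rng):
    """Randomly flip an image horizontally and/or vertically,
    mirroring the augmentation described above. `rng` is any object
    with a random() method returning a float in [0, 1)."""
    if rng.random() < 0.5:
        image = np.flip(image, axis=1)  # horizontal (left-right) flip
    if rng.random() < 0.5:
        image = np.flip(image, axis=0)  # vertical (top-bottom) flip
    return image
```

With only a few hundred images per class, cheap augmentations like this effectively multiply the dataset by the number of distinct flip combinations.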

One difficulty is that almost any item can be trash, so it is impractical to train on every item that might be thrown out. However, the system only needs to determine the type of material, not the name of the item. This leads me to believe that alternative approaches like decision trees and random forests might have more success. For example, a camera could estimate object reflectance, opacity, and color; a metal detector could determine whether an item contains metal; and a flex sensor could measure the item’s weight. The combination of these measurements could be the input to a decision tree or neural network that infers the material type.
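A toy version of that sensor-fusion idea, written as a tiny hand-built decision tree. All feature names and thresholds here are illustrative assumptions; in practice the splits would be learned from labeled sensor data rather than chosen by hand.

```python
def infer_material(metal_detected: bool, opaque: bool, weight_g: float) -> str:
    """Hand-built decision tree over hypothetical sensor readings:
    a metal detector, an opacity estimate from the camera, and a
    weight estimate from a flex sensor."""
    if metal_detected:
        return "metal"
    if not opaque:
        return "glass"      # transparent and non-metallic
    if weight_g < 10:
        return "paper"      # opaque but very light
    return "plastic"
```

Even this crude tree shows the appeal of the approach: each branch asks about a physical property of the material, so it generalizes to items the system has never seen, which is exactly where the image-only CNN struggled.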