My Example Idea: Fate

A smart food bin and shopping app that learns what you waste to help save you money.

With the research covered, read below to see how I developed my solution.

Go back to the research section here.

View my main UXD page here.

Here's how the solution works across the three stages: shopping input, waste learning, and feedback.

First, you provide the user with a hassle-free way of inputting their shopping data.


Scanning receipts to input shopping data.

Then you augment the humble food waste bin to make it context-aware:


A smart food bin for context-awareness.

Machine learning on these two data sources can then help you see what not to buy...


A signal to the user instead of supermarket noise.
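As a rough illustration of that learning step, here is a minimal sketch. The item names, counts, and the 50% threshold are all hypothetical, not the app's actual model; they just show how receipt data and bin data could combine into a "don't buy this" signal:

```python
# Sketch: flag items to avoid based on historical waste rate.
# Purchases would come from scanned receipts; waste events from the smart bin.
purchases = {"salad": 10, "bread": 8, "apples": 6}   # units bought
wasted = {"salad": 7, "bread": 2, "apples": 1}       # units binned uneaten

def items_to_avoid(purchases, wasted, threshold=0.5):
    """Return items whose waste rate (wasted / bought) exceeds the threshold."""
    flagged = []
    for item, bought in purchases.items():
        rate = wasted.get(item, 0) / bought
        if rate > threshold:
            flagged.append(item)
    return flagged

print(items_to_avoid(purchases, wasted))  # salad: 7/10 wasted, so it is flagged
```

In the real app this signal would surface in 'shop' mode, as shown in the sketches below, rather than as a raw list.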

User flows show how simple the interaction would be for each part.

First, scanning a receipt:


User flow for scanning a receipt.

Then throwing something away:


User flow for throwing away into the smart bin.

And finally, learning how to shop more sensibly:


User flow for feedback during shopping.

Some draft sketches of navigation and interfaces show this process in more detail.

Below, the user is in 'shop' mode.  From left to right they:

  • confirm their location,
  • get tips and warnings from the app,
  • snap an offer on salad,
  • get feedback suggesting they avoid the offer.



A sketch of snapping something bad to avoid.

Having declined the offer, the app now provides tips relevant to the previous item submitted:

  • best options for salad items are presented,
  • the user snaps something,
  • it has a better fate so they swipe it to 'in trolley',
  • the weekly food indicator at the top increases to show them how 'fully stocked' their kitchen will be for a week.  Suggestions now change to other salad items.

A sketch of snapping something good to buy.

After sketching and wireframing, some simple mockups can be brought to life with the Marvel prototyping app:


In the video below, I whiz through the first two main scenarios of using the app.

1.) Shop.  The scenario begins as I arrive at my local supermarket.  The app geo-locates me, and a notification asks if I want help shopping.  Through interaction with cards, searching, and snapping, the app provides me with feedback to prevent me from buying food I would never realistically be able to eat.

2.) Scan.  The next scenario takes place after a shop, back at home, when I now need to scan in my shopping receipt.  Two events outside of the ideal 'happy flow' occur, and the app's response is shown.

This next video shows the final scenario of using the app - setting up a filter to winnow from all of the waste data available to specifically the things you want to track.

In other words, filling out and submitting a form - a common UXD challenge.

My attempt here uses natural language rather than drop-down lists, checkboxes and such.
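To make the idea concrete, here is a minimal sketch of how a natural-language filter request could be mapped to tracking settings. The category list, period keywords, and simple word matching are all hypothetical illustrations; a real implementation would need proper language processing:

```python
# Sketch: turn a free-text filter request into tracking settings
# using plain keyword matching (hypothetical categories and periods).
CATEGORIES = ["salad", "bread", "dairy", "fruit"]
PERIODS = {"week": 7, "month": 30}

def parse_filter(text):
    """Extract tracked categories and a reporting period from free text."""
    words = text.lower().split()
    categories = [c for c in CATEGORIES if c in words]
    # Default to a weekly view if no period keyword is found.
    period = next((days for word, days in PERIODS.items() if word in words), 7)
    return {"categories": categories, "period_days": period}

print(parse_filter("track my salad and bread waste each month"))
# {'categories': ['salad', 'bread'], 'period_days': 30}
```

The appeal of this approach is that the one text field replaces a stack of drop-downs and checkboxes, at the cost of having to handle phrasing the parser doesn't recognise.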

The use case is completely hypothetical and serves only to illustrate the range of functions available:

Want to try it for yourself?  

Have a go with the prototype below:

(I made this prototype to fit my Nexus 4 screen size, hence the empty space at the bottom of the Nokia phone screen that Marvel offers.)

You can also access this prototype through this link:


Scan it, track it, snap it.

A simple hands-off solution where reality informs our hopes and dreams, rather than constantly shattering them.

From a vicious circle to a feedback loop for saving the user money:


Summarising the scan-track-snap experience.

An app that fits into the lives of our user personas to effortlessly save them money:


How Fate fits into users' lives.

That's the power of context-awareness, of information feedback, of designing an app that works around the user's reality rather than forcing the user to adapt their lives to the app.

No more confusion, no more succumbing to supermarket noise and tactics, no more endlessly buying what they want you to.  Instead, you have the power to know what you need.

And with that data, you can do more things than just reduce your spending - you have a platform to build upon...


Ideas for more functionalities that could be added later.

But it all begins with the simple approach of a hands-off, interaction-light, context-aware, learning solution that provides great value to the user.


Recap: this was my General Assembly course project from 2015.

To view the breadth of work I have done since then, view my main UXD page here.

Or feel free to get in touch: