I teamed up with trusty friend Mukul once again, entering Cambridge University’s official hackathon – HackCambridge – as a team of two. The goal? To develop a novel product in 24 hours.
We built EcoScan – an app where you scan your food and discover its eco footprint. We were delighted to win the BlackRock prize for the most innovative environmental project. You can find the slideshow here, and the GitHub repository here. Screenshots shown below in all their rookie glory!
(Scan your food, and we’ll display its eco footprint for you!)
What follows is an account of the event, and some behind the scenes material – how we gradually focused on a feasible idea, plus reflections and tips during our development. Intended to be accessible, and somewhat educational (most computer science terms are explained).
As usual, plenty of photos and diagrams to lighten up the read!
Saturday 19th January
The sun shines outside as we make our way to the Corn Exchange, a spacious venue hosting the event. Mukul and I register and pick a seat close to the door (away from the loudspeakers! – and close to the snacks). The atmosphere is bustling with anticipation.
The hall fills with over 300 competitors, but it doesn’t feel too cramped – thanks to the able planning of the committee. The sponsors of the event – including BlackRock, the cohost, Microsoft, Avast and others – give brief talks as we wait for the event to start at noon. All too soon, the countdown is upon us and the 24 hours begin!
Ideas (noon – 4pm)
Mukul and I riffle through the idea brainstorm we made two days before. But nothing stands out immediately, so we go for lunch – a chorizo baguette – then walk about Cambridge for some inspiration. Ideas involving ‘zen’, ‘creativity spots’ and ‘connecting friends’ pop up as we amble our way around the Burrells’ Gardens in Trinity, but we still struggle to flesh out any convincing ideas.
It’s when we return that we start to realise certain restrictions (e.g. the data that we could find and access), and this guides us to focus on our project idea – an app that scans your food and shows you your carbon footprint.
And much like at any hackathon, this proves to be more difficult than expected!
Development Begins (4pm – 1am/7am)
Indeed, experience at past hackathons taught me a valuable lesson – know when you’re being too ambitious. It’s all too easy to underestimate just how long a ‘simple’ (simple, haha) task might take, and it’s better to do a smaller project well than to run out of time with an unfinished product. This key point proves crucial.
Our initial idea is: ‘take a photo of food items, output the carbon footprint’. Simple concept, and sounds rather innocent.
But this proves to be much too complex.
We attend a workshop given by Microsoft demonstrating how to use their Azure AI services, then armed with this knowledge, I start integrating their Computer Vision API into the project’s backend (using Python). But the issue I run into is that the API detects everything, not just the food. It’s not entirely reliable either. We could use some sort of NLP (natural language processing) to filter this, but then we’d be introducing two layers of unreliability.
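To give a flavour of that filtering problem, here’s a minimal sketch in Python. The response shape mirrors the Computer Vision ‘analyze’ output (a `tags` list of names and confidences), but the food whitelist and confidence threshold are illustrative assumptions, not what we actually used:

```python
# Sketch of the tag-filtering step, assuming the Computer Vision API has
# returned a parsed JSON response containing a "tags" list. The whitelist
# and threshold below are illustrative, not from the actual project.
FOOD_WHITELIST = {"banana", "apple", "bread", "tomato", "cheese"}

def extract_food_tags(response, min_confidence=0.5):
    """Keep only tags that look like food and are confident enough."""
    return [
        tag["name"]
        for tag in response.get("tags", [])
        if tag["name"] in FOOD_WHITELIST and tag["confidence"] >= min_confidence
    ]

# Example response shaped like the 'analyze' endpoint's output:
sample = {
    "tags": [
        {"name": "table", "confidence": 0.98},
        {"name": "banana", "confidence": 0.91},
        {"name": "indoor", "confidence": 0.87},
        {"name": "apple", "confidence": 0.42},  # too uncertain - dropped
    ]
}
print(extract_food_tags(sample))  # → ['banana']
```

The whitelist itself is the weak link – it has to anticipate every food name the API might emit, which is exactly the second “layer of unreliability” mentioned above.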
We consider scanning receipts instead, so I also try the OCR API (optical character recognition) – but hours into making it work, I realise that it, too, is not reliable enough. It detects all items as well, and the text is often too small and prone to blurring. Too unreliable!
Thus we pivot our idea and simplify. New iteration: ‘take a photo of specific food items, and output their carbon footprint’. In other words, restrict to a small group of foods for the sake of producing a reliable MVP (a fancy term standing for ‘minimum viable product’) – one that demonstrates the gist of the idea, but is still complex enough to be interesting. Dinner time!
Food is served in the neighbouring Guild Hall – many of the event sponsors set up stalls during the day with ‘free stash’ (as is customary at such events).
The simplification opens up new paths for both of us. Mukul no longer needs to scrape all the data, and can manually obtain it instead. He begins work on the crucial frontend – the design and linking of the browser activity (i.e. all your button presses and food photos) to my backend processing work. He starts by using React JS and uses design ideas from Google’s ‘Material Design‘.
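With only a handful of foods, the footprint data can live in a simple lookup table. A sketch of what that might look like – the figures here are illustrative placeholders, not the project’s real dataset:

```python
# Hand-curated footprint lookup, standing in for the manually collected
# data. The numbers are illustrative placeholders (kg CO2e per kg of
# food), NOT the project's real dataset.
CARBON_FOOTPRINT = {
    "banana": 0.7,
    "apple": 0.4,
    "bread": 1.3,
    "cheese": 13.5,
}

def footprint_for(foods):
    """Total footprint for a list of detected foods, skipping unknowns."""
    return sum(CARBON_FOOTPRINT.get(food, 0.0) for food in foods)

print(round(footprint_for(["banana", "apple"]), 2))  # → 1.1
```

A plain dictionary like this is exactly the kind of thing a restricted MVP buys you: no scraping, no database, just data you can type in by hand.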
I can finally try the Custom Vision API – where I can upload over 80 manually taken photos of bananas, apples etc., tell the Microsoft AI what they are, and then it ‘learns’ to detect them in future photos. It proves to be somewhat fiddly (there is a complication around detecting one food versus multiple foods in an image), but I finally manage to integrate it into the project by midnight. 12 hours to go! I go to get some rest, for fear of further worsening the sore throat that I had been developing.
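That one-versus-multiple complication can be sketched as two different ways of reading the same prediction response. The response shape below mirrors the Custom Vision prediction output (a list of `tagName`/`probability` pairs), but the threshold logic is an illustrative assumption of mine:

```python
# Sketch of interpreting a Custom Vision prediction response. The shape
# mirrors the REST API's "predictions" list (tagName + probability);
# the thresholds are illustrative choices, not the project's values.
def classify(predictions, threshold=0.8):
    """Single-food mode: return the top label if it is confident enough."""
    best = max(predictions, key=lambda p: p["probability"])
    return best["tagName"] if best["probability"] >= threshold else None

def detect_all(predictions, threshold=0.5):
    """Multi-food mode: return every label above the threshold."""
    return [p["tagName"] for p in predictions if p["probability"] >= threshold]

preds = [
    {"tagName": "banana", "probability": 0.93},
    {"tagName": "apple", "probability": 0.61},
]
print(classify(preds))    # → banana
print(detect_all(preds))  # → ['banana', 'apple']
```

The fiddly part is that a classifier trained on single-food photos will happily report one confident label for a photo containing three foods – the two modes need different training data, not just different thresholds.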
Mukul courageously forges on. There is a difficulty linking the frontend with the server – my backend work is in Python whereas the frontend is in React JS, so he needs some way of bridging the difference in languages. He adeptly manages to achieve this using Gatsby JS and GraphQL. Creating a sleek design is by no means a simple task, and he finishes this working through the night – till 7am! (what a team player)
Sunday 20th January (7am – 12pm)
Our foundations are now set, and Mukul passes the baton back to me in the morning so he can get an hour’s rest! Pressed for time, I work out how to deploy the backend onto an Azure Web App – installing it onto a computer in the cloud that will run the image-processing – with the help of a Microsoft mentor present at the event. There are also minor fixes to carry out, concerning the encoding of the image, connection policies and styling.
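I don’t remember the exact encoding bug, but a common shape of this problem is that the frontend sends the photo as a base64 ‘data URL’ string, which the backend must decode back into raw bytes before handing it to the vision API. A minimal, standard-library sketch of that step:

```python
import base64

# Sketch of a typical image-encoding fix: the browser sends the photo as
# a base64 "data URL" string, and the backend strips the prefix and
# decodes the payload back into raw image bytes.
def decode_data_url(data_url):
    """Turn 'data:image/jpeg;base64,<payload>' into raw image bytes."""
    _, _, payload = data_url.partition("base64,")
    return base64.b64decode(payload)

# Round-trip check with some fake JPEG-ish bytes:
fake_image = b"\xff\xd8\xff\xe0fake-jpeg-bytes"
data_url = "data:image/jpeg;base64," + base64.b64encode(fake_image).decode()
print(decode_data_url(data_url) == fake_image)  # → True
```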
Only a few hours are left. Mukul expertly fixes a JSON issue in how the processed data is received, then gets to work improving and adding the final touches to the design. I begin work on the presentation. Another lesson learnt from past experience is how important the design and presentation are. It’s all too easy to get carried away coding, only to leave a product with an unappealing interface; it’s not much good being unable to communicate your idea and its selling points either!
Minor shock ensues with an hour left, when we try our product after leaving it idle for a while and it stops responding. This is probably the cloud computer going to sleep, and thankfully it comes back shortly. We race to complete the final bits of code, then submit our code before the 12pm deadline. And the 24 hours is over!
But the hackathon is far from over. For it’s time for the demonstration (but lunch first!).
Demonstration (1pm – ~3pm)
It’s time for each of the 60 teams to set up their exhibition stall. The chaos then begins as the judges make their way around – the teams present and demonstrate their products.
There are multiple prizes to win – the general competition, and prizes for more specific project types, themed by the sponsors. The finalists for the general competition give further presentations on stage, then the winners are announced! We win the BlackRock prize for environmental projects, and delightedly take home a pair of headphones.
Thanks to my brilliant teammate Mukul for his talent and reliability – a shoutout to his blog here https://mukulrathi.com/. And my sincere thanks also to the fantastic HackCambridge committee for organising yet another wonderful event this year!