HackHarvard'17: What I built at Harvard University's International Hackathon
So! Well, I was offline for quite a while; I'd visited the US to compete at Harvard University's HackHarvard '17. It was absolutely lovely, and I learnt a TON of stuff (none of it algorithmic, but I've got no regrets): new tools and frameworks I'd never thought I'd need to learn, some of which surprised me with their capabilities.
Disclaimer: This article is less algorithmic and more of a digression on the implementation of what my teammates and I built at Harvard: PetPal.
What my team and I built
The project submission details are available at https://devpost.com/software/petpal-0z8dqt
PetPal is a stepping stone towards a world with no homeless pets. Aptly named, it aspires to help owners find their lost pets by crowdsourcing the problem.
The way our application was designed to work is this: a user who has the Android app installed, upon finding a lost or homeless animal, takes a picture of it, and the app uploads information about this picture to a database.
A licensed organization dedicated to saving such animals retrieves this information from the same database and now ideally knows the location of the last sighting of the corresponding animal.
How we built it
Our implementation consisted of an Android app and a browser-based web application.
Here’s how it works: a user of the Android app, upon running into a lost or homeless pet, takes a picture of the animal through the app.
And, well, that’s all the user needs to do. Take a picture of the animal. The rest is handled by the applications.
What follows is that the Android app sends the image, along with the date-time and location details, to a secure Firebase database (thanks, Google!). This bit of the project was implemented entirely in Android and Node.js.
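To give a feel for what that looks like (this is a sketch, not our exact code; the `sightings` node and the field names are placeholders I'm making up for this post), writing one sighting record with the namespaced Firebase JavaScript SDK goes roughly like this:

```javascript
// Sketch: writing one sighting record to the Firebase Realtime Database.
// Node.js shown here; on the web the same namespaced SDK is loaded via a
// script tag instead of require().
var firebase = require('firebase');

firebase.initializeApp({
  apiKey: 'YOUR_API_KEY',                         // placeholder credentials
  databaseURL: 'https://your-project.firebaseio.com'
});

function reportSighting(imageUrl, lat, lng, email) {
  // push() creates a new child node under "sightings" with an auto-generated key
  return firebase.database().ref('sightings').push({
    imageUrl: imageUrl,      // link to the uploaded picture
    latitude: lat,           // from the phone's GPS
    longitude: lng,
    email: email,            // reporter's email ID
    timestamp: Date.now()    // date-time of the sighting
  });
}
```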
Simultaneously, something else is happening too: the same image is sent to Google’s Cloud Vision (again, thanks Google!), which applies state-of-the-art object recognition algorithms to classify the animal in the picture. The results of this classification are also fed back into our Firebase database with JavaScript.
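For the curious, a label-detection call with Google's Node.js Cloud Vision client looks roughly like the sketch below; the helper name `classifyAnimal` and the "take the top label" logic are my own simplifications, not necessarily what we wired up that night:

```javascript
// Sketch: ask Cloud Vision for label annotations on an image and keep the
// highest-confidence label (e.g. "dog", "cat", "bird").
const vision = require('@google-cloud/vision');
const client = new vision.ImageAnnotatorClient();

async function classifyAnimal(imageUri) {
  // imageUri can be a publicly reachable URL or a gs:// URI
  const [result] = await client.labelDetection({
    image: { source: { imageUri: imageUri } }
  });
  const labels = result.labelAnnotations || [];
  return labels.length ? labels[0].description : 'unknown';
}
```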
At this point, our secure Firebase database has the following details:
- Animal Type
- Picture
- Location
- Date-time
- Email ID of the user who sent in the picture
Now, we get to the next bit: representing this information in a neat way. My team and I decided to use Google Maps, and upon reading through the documentation, we ran into a nice little tool called “Marker Clustering”.
This tool was real neat. When there are multiple marked points in close proximity on a map, it’s hard to see them as individual entities once you zoom out. With marker clustering, however, these nearby markers are grouped and shown as a single blip with a number on it indicating how many markers were grouped together. Clicking the blip zooms the map in far enough to distinguish the markers individually.
So basically, we neatened up our soon-to-be populated map.
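Roughly, the wiring looks like the sketch below. It assumes the MarkerClusterer utility library that the Maps documentation points to, plus a `coords` array of coordinates already pulled out of Firebase (both assumptions of mine for this post):

```javascript
// Sketch: turn a list of {lat, lng} objects into markers and let
// MarkerClusterer group nearby ones into numbered blips.
function drawClusteredMarkers(map, coords) {
  var markers = coords.map(function (c) {
    return new google.maps.Marker({ position: c });
  });
  // Clicking a blip zooms in until the markers separate again
  return new MarkerClusterer(map, markers, {
    imagePath: 'https://developers.google.com/maps/documentation/javascript/examples/markerclusterer/m'
  });
}
```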
Then we looked at another tool from Google, “Reverse Geocoding”. This one was pretty cool: it could work backwards from latitude and longitude coordinates to the corresponding street address.
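A reverse-geocoding lookup with the Maps JavaScript API is about this simple; the callback-style helper here is just a sketch of mine:

```javascript
// Sketch: turn a lat/lng pair into a human-readable street address.
function reverseGeocode(lat, lng, callback) {
  var geocoder = new google.maps.Geocoder();
  geocoder.geocode({ location: { lat: lat, lng: lng } }, function (results, status) {
    if (status === 'OK' && results[0]) {
      callback(results[0].formatted_address);   // e.g. "123 Main St, Cambridge, MA"
    } else {
      callback('Address unavailable');
    }
  });
}
```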
Then we decided to add in another small feature: the InfoWindow. At this point, our UI showed the world map with red markers placed on coordinates obtained from the Firebase database (magic of JavaScript), and what we had in mind was a small popup that appears when the mouse hovers over a marker. We wanted an InfoWindow. This InfoWindow would neatly show the corresponding image (taken by a user), the animal type (from Google’s Cloud Vision), the street address (reverse geocoding plus the coordinates from the Android phone), and the date and time of the sighting (also from the Android phone).
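A hover popup like that boils down to something like this sketch; the HTML content and the `sighting` field names are placeholders of mine:

```javascript
// Sketch: attach an InfoWindow to a marker and open it on hover.
function attachInfoWindow(map, marker, sighting) {
  var infoWindow = new google.maps.InfoWindow({
    content:
      '<img src="' + sighting.imageUrl + '" width="150"><br>' +
      '<b>' + sighting.animalType + '</b><br>' +
      sighting.address + '<br>' +
      new Date(sighting.timestamp).toLocaleString()
  });
  marker.addListener('mouseover', function () {
    infoWindow.open(map, marker);
  });
  marker.addListener('mouseout', function () {
    infoWindow.close();
  });
}
```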
And that’s what we did! This map would be available only to the licensed organization, and authentication would be done via Google’s OAuth service.
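One common way to wire that up on the web side is Firebase Authentication with the Google provider; the sketch below is my reconstruction, and the `allowedDomain` check is a made-up example of restricting access to one organization, not necessarily what we had:

```javascript
// Sketch: sign in with Google via Firebase Auth, then keep only accounts
// from a hypothetical organization domain.
function signInWithGoogle(allowedDomain) {
  var provider = new firebase.auth.GoogleAuthProvider();
  return firebase.auth().signInWithPopup(provider).then(function (result) {
    var email = result.user.email;
    if (!email.endsWith('@' + allowedDomain)) {
      return firebase.auth().signOut().then(function () {
        throw new Error('Not an authorized organization account');
      });
    }
    return result.user;
  });
}
```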
Wow, we used a LOT of Google tools and APIs for this, didn’t we?
After all of this, all that was left was a nice, pretty landing page for the app, which I designed using HTML, CSS and some quick help from Bootstrap.
The headaches along the way
Oh, so many, so so many. :)
- Using JavaScript. (This slowed me down for nearly the entirety of the hack, but the good news was that I learnt at an exponential rate, which was nice)
- Getting Google authorization to work locally on my laptop.
- Linking the Firebase database to my JavaScript code to access each node.
- Iterating over the nodes to extract just the coordinates from my Firebase database into arrays in my JavaScript code. (There’s a sketch of roughly what this looks like just after this list.)
- Reading so, so, so much documentation.
- Getting the stupid markers to show on the map. (No kidding, I ended up asking a Google engineer who was at the hackathon to help me out, which he was great at)
- Incorporating reverse geocoding using the extracted coordinates.
- I wanted sleep. ;-;
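Since a couple of those headaches (linking the database and iterating over nodes for coordinates) keep coming up, here’s roughly what that iteration ends up looking like, reusing the placeholder `sightings` node and field names from the earlier sketches:

```javascript
// Sketch: read every sighting once and pull out just the coordinates,
// ready to feed into the marker-drawing code.
function loadCoordinates() {
  return firebase.database().ref('sightings').once('value').then(function (snapshot) {
    var coords = [];
    snapshot.forEach(function (child) {
      var s = child.val();
      coords.push({ lat: s.latitude, lng: s.longitude });
    });
    return coords;
  });
}
```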
What I learnt
A friggin’ ton.
That’s what.
For the implementation, I handled almost the entire web application: I implemented Google’s OAuth authentication for the browser, Google Maps’ marker clustering and reverse geocoding, linked the Firebase database to the on-display map, (tried my damnedest to) showed the markers on the map after reading them from the database, and designed all of the front-end side of the web application. So yeah, I learnt a bunch.
I literally went into this with zero experience in JavaScript, and I honestly can’t believe how much progress I could make in about 20 hours of hacking time. StackOverflow, Google Search, SoundCloud and insomnia candy and cookies became my besties this weekend. (Google Cloud Platform is insane, yo)
A teammate did most of the Android implementation, but I did get to spectate a bit after I finished up my part. For the UI of our Android app, we used a tool called Lottie; it was the first time I’d heard of it. Bottom line: it looks beautiful. I doubt we leveraged all its capabilities, but we had a shiny, pretty Android app in a few minutes.
Anywho, it was a lovely experience. I learnt loads and I got to see the kind of talent there was on an international level. I’ve got quite a way to go.
Side note: FOOOOODD! After joining college, the hackathon might have been the first time I properly filled my stomach (with junk food, sorry mom)
It was a great experience and a wonderful way to spend my Diwali vacations! And now I expect my workload to rise phenomenally so the next post may also be delayed. I’ve given the disclaimer, my job here is done!
Later then!
Aditya