We just wrapped up a great hackathon at University of Washington! Thank you to the amazing DubHacks organizers for putting on a truly memorable event. Our founders, Iddo and Mickey, definitely enjoyed themselves 😝
Overall, we were blown away by some of the projects we saw built in just 24 hours. The theme of the event was “accessibility.” Here’s how some teams brought that mission to life with the power of APIs.
cmpr
The Mission: The cmpr team set out to build a project that helps those with visual impairments access the internet more easily.
The Method: The app used open source machine learning APIs to enrich existing web pages. cmpr added image descriptions and highlighted key words. Those descriptions could then be read back to the end user with an audio plugin.
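The enrichment step can be sketched roughly like this: walk a page's `<img>` tags and attach a generated description wherever alt text is missing. This is a minimal sketch, not cmpr's actual code; the `caption_fn` callback stands in for whichever machine learning captioning API the team called.

```python
import re

def describe_images(html, caption_fn):
    """Add a description to every <img> tag that lacks alt text.

    caption_fn is a placeholder for a call to an image-captioning
    API (cmpr used open source ML APIs for this step).
    """
    def add_alt(match):
        tag = match.group(0)
        if "alt=" in tag:
            return tag  # already described, leave it alone
        src = re.search(r'src="([^"]+)"', tag)
        caption = caption_fn(src.group(1)) if src else ""
        closing = "/>" if tag.endswith("/>") else ">"
        return tag[: -len(closing)] + f' alt="{caption}"{closing}'

    return re.sub(r"<img\b[^>]*>", add_alt, html)

page = '<p>Hello</p><img src="cat.jpg">'
print(describe_images(page, lambda url: "a photo of a cat"))
# <p>Hello</p><img src="cat.jpg" alt="a photo of a cat">
```

An audio plugin can then read the injected alt text back to the user, which is the last step the team described.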
See below for the words and phrases that cmpr recognized from Wikipedia.
We were so impressed with this team that we awarded them our grand prize: a drone! Congrats to Shikib Mehr, Kevin Zhang, and the rest of the cmpr team.
Stop the Bleed
The Mission: Empower bystanders to intervene in emergencies by alerting authorities and giving basic medical instructions.
The Method: Voice commands were a key component of the Stop the Bleed app, as many victims may be too shocked in the moment to think clearly about what to do. The team used the Amazon Alexa Voice Service API to build a decision tree that could alert the authorities, survey the situation, and give relevant instructions. The team also included a messaging component with the Twilio API.
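A decision tree like this is just a nested set of yes/no questions that terminates in an instruction. Here is a minimal sketch of the idea; the questions and instructions below are illustrative placeholders, not the team's actual medical script.

```python
# Hypothetical decision tree; a real skill would use vetted medical guidance.
TREE = {
    "question": "Is the person conscious?",
    "yes": {
        "question": "Is there heavy bleeding?",
        "yes": {"instruction": "Apply firm, direct pressure to the wound."},
        "no": {"instruction": "Keep the person calm and wait for help."},
    },
    "no": {"instruction": "Call emergency services and check breathing."},
}

def next_step(node, answers):
    """Walk the tree with a list of 'yes'/'no' answers.

    Returns the instruction reached, or the next question to ask
    if more answers are needed.
    """
    for answer in answers:
        if "instruction" in node:
            break  # already at a leaf; ignore extra answers
        node = node[answer]
    return node.get("instruction") or node["question"]

print(next_step(TREE, ["yes", "yes"]))
# Apply firm, direct pressure to the wound.
```

In an Alexa skill, each intent handler would feed the user's latest answer into a walk like this and speak the result back.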
Congrats to the Stop the Bleed team on making the finals! Watch the team explain their app and demo it below.
Lightbringer
The Mission: Let people with visual impairments use Alexa to better “see” the world around them.
The Method: A user tells Alexa they’re having a hard time recognizing an object, and Lightbringer prompts them to take a picture of it with their phone. The app then runs image recognition on the photo and reads back a voice response identifying the objects it found.
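The final step, turning recognition labels into a spoken answer, can be sketched as below. This is an assumption-laden sketch: `recognize` stands in for whichever image recognition API the team called, and the 0.6 confidence threshold is our own illustrative choice.

```python
def describe_photo(photo_bytes, recognize):
    """Build a spoken-style sentence from image recognition labels.

    `recognize` is a placeholder for an image recognition API call
    that returns (label, confidence) pairs for the photo.
    """
    labels = [label for label, score in recognize(photo_bytes) if score >= 0.6]
    if not labels:
        return "Sorry, I couldn't recognize anything in that picture."
    return "I can see " + ", ".join(labels) + "."

# Fake API response for illustration, not real recognition output:
fake_api = lambda _photo: [("coffee mug", 0.92), ("table", 0.71), ("cat", 0.2)]
print(describe_photo(b"...", fake_api))  # I can see coffee mug, table.
```

The returned sentence is what Alexa would speak back to the user.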
Great job, Lightbringer team!
Moody Up
The Mission: Cheer users up when they’re feeling sad using their Facebook pictures.
The Method: Moody Up’s app first pulls images from a user’s Facebook profile. Next, it uses Microsoft’s Emotion API and Clarifai’s Image and Video Recognition API to classify the sentiment behind each image. On the user’s side, the app asks how the user is feeling, then returns an image that either matches that mood or nudges it somewhere more positive.
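Once every image carries a sentiment score from those APIs, picking one to show back is a simple selection problem. A minimal sketch, assuming sentiment is normalized to a [-1, 1] scale (the scores, filenames, and "uplift" rule here are our own illustration, not Moody Up's code):

```python
def pick_image(images, user_mood, uplift=False):
    """Choose an image for the user.

    images: list of (url, sentiment) pairs, sentiment in [-1, 1].
    By default return the image closest to the user's mood; with
    uplift=True return the most positive one to cheer them up.
    """
    if uplift:
        return max(images, key=lambda im: im[1])[0]
    return min(images, key=lambda im: abs(im[1] - user_mood))[0]

# Hypothetical scored photos:
photos = [("beach.jpg", 0.9), ("rain.jpg", -0.6), ("cafe.jpg", 0.2)]
print(pick_image(photos, user_mood=-0.5))               # rain.jpg
print(pick_image(photos, user_mood=-0.5, uplift=True))  # beach.jpg
```

Matching a sad mood validates the feeling; the uplift path is the "cheer up" behavior the team described.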
Notifai
The Mission: Find old and embarrassing pics on Facebook you may want to forget.
The Method: Notifai first pulls a user’s Facebook pictures with the Facebook Graph API. Next, the app uses Clarifai’s Image and Video Recognition API’s NSFW filter to identify any images that a user might not want to display on Facebook 😳.
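The filtering step reduces to thresholding a per-image score. A minimal sketch, where `nsfw_score` stands in for a call to Clarifai's NSFW model (which returns a probability per image) and the 0.8 cutoff is our own illustrative choice:

```python
def flag_embarrassing(photos, nsfw_score, threshold=0.8):
    """Return the photos whose NSFW score exceeds the threshold.

    `nsfw_score` is a placeholder for the per-image probability an
    NSFW moderation model would return.
    """
    return [url for url in photos if nsfw_score(url) > threshold]

# Illustrative scores, not real API output:
scores = {"grad.jpg": 0.02, "party_2009.jpg": 0.91}
print(flag_embarrassing(scores, scores.get))  # ['party_2009.jpg']
```

The flagged list is what the app would surface to the user for review.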
beatbud
The Mission: Watch a video together even when you’re not in the same room.
The Method: beatbud combined video syncing built on YouTube’s Data API with a chat interface in one app. As a team with members in the US, Ukraine, and Israel, we thought the use cases here (watching a TV show with a friend across the globe) were particularly compelling.
APIs Used: YouTube’s Data API (through RapidAPI)
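The core of keeping two players in step is that if every client knows when the room pressed play (a shared timestamp), each one can seek its own player to the same offset. This is a minimal sketch of that idea under our own assumptions; the function and field names are hypothetical, not beatbud's actual code:

```python
import time

def playback_position(room_started_at, paused_at=None, now=None):
    """Seconds into the video every client should be showing.

    room_started_at: shared epoch timestamp of when the room hit play.
    paused_at: if set, the room is paused and position is frozen there.
    """
    now = now if now is not None else time.time()
    end = paused_at if paused_at is not None else now
    return max(0.0, end - room_started_at)

# Two clients computing the position 90s after the room hit play:
print(playback_position(room_started_at=1000.0, now=1090.0))  # 90.0
```

Each client would pass the result to its embedded player's seek call, so everyone converges on the same frame regardless of when they joined.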
OpenAid
The Mission: Help developers navigate and contribute to the countless open source projects out there.
The Method: The OpenAid team built a matching platform website where a developer can sign up to be matched with projects. The platform scans GitHub for projects, pulling tags directly from each GitHub repository.
APIs Used: GitHub API (through RapidAPI)
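One plausible way to do the matching, once repository tags are in hand, is to rank repos by how many tags overlap with a developer's declared skills. A minimal sketch under that assumption (the repos, tags, and scoring rule are illustrative, not OpenAid's implementation):

```python
def best_matches(dev_skills, repos, top=3):
    """Rank repositories by tag overlap with a developer's skills.

    repos: mapping of repo name -> set of topic tags, as could be
    fetched per-repository from the GitHub API.
    """
    scored = [(len(dev_skills & tags), name) for name, tags in repos.items()]
    # Keep only repos with at least one shared tag, best first:
    return [name for score, name in sorted(scored, reverse=True) if score][:top]

# Hypothetical repositories and tags:
repos = {
    "ml-toolkit": {"python", "machine-learning"},
    "web-widgets": {"javascript", "css"},
    "data-viz": {"python", "d3"},
}
print(best_matches({"python", "d3"}, repos))  # ['data-viz', 'ml-toolkit']
```

The top-ranked repos are the projects the platform would suggest to that developer.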
Kudoti
The Mission: Connect farmers to people with organic waste to reduce waste and save money.
The Method: The Kudoti team built a matching platform where both organic food consumers and farmers can connect. Using Google’s Geocode API, Kudoti lets both parties find a place to meet and exchange the goods.
APIs Used: Google Geocode API (through RapidAPI)
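Once both parties' addresses have been geocoded to coordinates, one simple way to suggest a meeting place is the great-circle midpoint between them. This uses the standard spherical midpoint formula as an illustration; it is our assumption about how a meeting point could be picked, not Kudoti's actual logic:

```python
import math

def midpoint(lat1, lon1, lat2, lon2):
    """Great-circle midpoint of two (lat, lon) points, in degrees."""
    lat1, lon1, lat2, lon2 = map(math.radians, (lat1, lon1, lat2, lon2))
    bx = math.cos(lat2) * math.cos(lon2 - lon1)
    by = math.cos(lat2) * math.sin(lon2 - lon1)
    lat = math.atan2(
        math.sin(lat1) + math.sin(lat2),
        math.sqrt((math.cos(lat1) + bx) ** 2 + by**2),
    )
    lon = lon1 + math.atan2(by, math.cos(lat1) + bx)
    return math.degrees(lat), math.degrees(lon)

# Hypothetical example: a farmer in Seattle and a donor in Tacoma.
lat, lon = midpoint(47.61, -122.33, 47.25, -122.44)
print(round(lat, 2), round(lon, 2))  # roughly halfway between the two
```

The resulting coordinates could then be reverse-geocoded back into a street address to propose to both parties.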