Time:
November 2019 – One Week

Main Technologies used:
Ruby on Rails, Bootstrap 4, Geocoder gem

GitHub Repositories and other Links

Video Walkthrough

CRWLR is a web app for planning food crawling events. It was built with Ruby on Rails and uses the Geocoder gem to sort locations by proximity as a user adds stops to their crawl. Our location data came from an open dataset published by Yelp. We chose only the Las Vegas locations for the sake of size, but also because it's a food crawling app. Why not choose Las Vegas for our seed data?!
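As a rough sketch of how the Geocoder piece works (the model name and column names below are assumptions for illustration, since the Yelp dataset already ships with coordinates):

```ruby
# app/models/location.rb (illustrative model; assumes latitude/longitude
# columns are already populated from the Yelp seed data, so no external
# geocoding call is needed)
class Location < ApplicationRecord
  reverse_geocoded_by :latitude, :longitude
end

# Geocoder's `near` scope returns records ordered by distance, so once a
# user adds a stop, nearby suggestions fall out of a single query
# (within 5 miles of the last stop in this example).
last_stop = crawl.locations.last
nearby = Location.near([last_stop.latitude, last_stop.longitude], 5)
```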

The project was my Mod 2 project during my time at Flatiron School and was developed with Ken Averbookh. I was responsible for the visual design of the project, user authentication, and the logic for determining whether a user is a guest or a host of a crawl.

Challenges

CRWLR was a fun project to work on. The challenges we ran into felt like good opportunities to grow as developers. One big hurdle came early on, when we had to figure out our User model for the project. The big question was, "How can we distinguish users as either being a host of a crawl or a guest of a crawl?" Ken and I pored over the Rails documentation and found our answer.
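I won't reproduce our exact solution here, but a common Rails way to model the host/guest split is a direct host association on the crawl plus a join model for guests. The sketch below uses illustrative model names, not necessarily our actual schema:

```ruby
# Illustrative sketch of one Rails approach, not necessarily our exact models.
class Crawl < ApplicationRecord
  belongs_to :host, class_name: "User"   # the user who created the crawl
  has_many :invitations
  has_many :guests, through: :invitations, source: :user
end

class Invitation < ApplicationRecord
  belongs_to :crawl
  belongs_to :user
end

class User < ApplicationRecord
  has_many :hosted_crawls, class_name: "Crawl", foreign_key: :host_id
  has_many :invitations
  has_many :crawls_attending, through: :invitations, source: :crawl
end
```

With associations along these lines, checking whether someone is the host is just `crawl.host == current_user`, and everyone coming through the join model is a guest.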

Another challenge for me was learning Bootstrap and the principles of CSS as we went along. This was my first foray into CSS, and I wanted the project to look visually appealing. I'd say it went well!

Takeaways

By the time Ken and I worked on this project, we were about six weeks into Flatiron School. We felt we were still fairly new to being developers, but we wanted to dive right in and see what we could do with only a week and ambitious plans.

I enjoyed the dynamic we had. We took a divide-and-conquer approach to the project and then linked back up to pair program and combine the features we had each built that day. Because of that, we managed to fit a lot of features into the project, like Google Static Maps, cookies that auto-login returning users, and invite/uninvite functionality.
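For the auto-login piece, the usual Rails pattern is a signed cookie that backs up the session; the sketch below is an illustration of that idea, not our exact controller code:

```ruby
# app/controllers/application_controller.rb (illustrative sketch)
class ApplicationController < ActionController::Base
  helper_method :current_user

  # Check the session first, then fall back to a signed cookie so a
  # returning user is logged back in automatically.
  def current_user
    @current_user ||=
      if session[:user_id]
        User.find_by(id: session[:user_id])
      elsif cookies.signed[:user_id]
        User.find_by(id: cookies.signed[:user_id])
      end
  end
end

# In the sessions controller, after a successful login:
# cookies.signed[:user_id] = { value: user.id, expires: 2.weeks.from_now }
```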