Getting help deciding what to order from McDonald’s

Joel Wiersma
4 min read · Dec 15, 2020

I have been coding for around six weeks as a part-time student in Flatiron’s Software Engineering Bootcamp and was tasked with creating a CLI project. I really wanted some Taco Bell but wasn’t sure what to get, and thought that a program to decide for me would both be useful and fulfill the requirements for the CLI project. The company I currently work for uses scrapers a lot, so I wanted to build a program around a scraper to get more experience with them before I need to use them at work.

Scraping and Iterating

Scraping is the process of pulling information out of a web page’s HTML, usually by targeting elements with CSS selectors. The gems nokogiri and open-uri make building a scraper much more convenient: open-uri fetches the page, and nokogiri parses it so you can query it with selectors. An example of using CSS to find something on McDonald’s website is below.

The nice thing about scraping is that websites tend to use the same CSS selectors for all of their menu items, which lets you iterate through them. Iterating is the process of going through each item one by one. Ruby has many iterators; the simplest is #each, which just runs a block for every item, while something like #collect gathers each block’s return value into a new array. McDonald’s happened to use two different CSS selectors, which was a little inconvenient but not that bad; the one method just ended up twice as long because of it.
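The difference between the two iterators, in plain Ruby:

```ruby
items = ["Big Mac", "Fries", "McFlurry"]

# #each runs the block for its side effects and returns the
# original array unchanged.
items.each { |item| puts item }

# #collect (also known as #map) gathers each block's return
# value into a new array.
upcased = items.collect { |item| item.upcase }
# upcased => ["BIG MAC", "FRIES", "MCFLURRY"]
```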

Blockers

There were two times when I felt truly stuck. Above I said I wanted Taco Bell, but I have been talking about McDonald’s all this time. The reason is that after running nokogiri on the page with all of Taco Bell’s menu items, nothing appeared. At first I thought I was using the CSS selectors incorrectly and simply not finding any information. After trying for a bit over an hour, I turned to binding.pry, which is honestly a lifesaver and something I should have been using more often. I tried running the command

and the program would time out and then give errors. That’s when I finally realized that I couldn’t scrape Taco Bell’s website, so I went to the next best choice, McDonald’s. After switching, I immediately tried the same thing to make sure McDonald’s website would be more cooperative, and sure enough, it worked exactly as expected. The second time I struggled was when I learned McDonald’s website didn’t have all of the menu items on one page for easy iteration. At that point it became apparent that I needed to iterate over every menu category, run nokogiri on each category’s link to parse that page, and then iterate over all the menu items on it. It took a while to get the method that fed all the categories through nokogiri and into an array to work, but once it did, there was a rush of relief knowing I was done with it.

The line looks simple enough, but before I started this project there were a few things I didn’t know that would let me grab the relevant information easily. Searching online, every solution seemed extremely complicated and went over my head, until someone suggested putting [“href”] after the node to grab that attribute’s value, which was a lifesaver.

What could be improved

The one thing that I really wanted to do was go into every menu item and collect all the nutrition info. The only problem was that the program would need to run nokogiri on every single menu item page, which by my rough estimate would take around a minute. Running the scraper on 11 pages takes around 8 seconds, so running it on 119 pages would take around 86 seconds, assuming every page loads at the same speed. However cool that would be, it would lead to people not wanting to use the program: it could take so long that they would close it before it got through the full minute. With people accustomed to everything loading in under a second, that seemed out of the question.

Another thing that could be improved would be to have more methods. After writing this program and comparing it to the ones the coding bootcamp walked me through, I realize that having more methods cleans up the code more than I thought it would. Going back in and breaking things into smaller methods would make the code less repetitive and cleaner.
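The timing estimate earlier in this section is just proportional scaling of the measured run:

```ruby
# Scale the measured scrape time (11 pages in ~8 seconds) up to
# every menu item page.
seconds_per_page = 8.0 / 11          # ~0.73 s per page measured
estimate = seconds_per_page * 119    # total for all item pages
puts estimate.round(1)  # => 86.5
```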

All in all, I really enjoyed this project and felt challenged but not overwhelmed. I learned quite a bit about scrapers and am glad for all the failures and successes I had along the way, as they were all learning experiences. I will definitely reach for pry sooner next time; once I started using it after struggling without it, I had a much more enjoyable time coding the rest of the program.
