As we discussed in today's meeting, @danielgoldelman still needs to update the readme on how to set up the crawler. @JoeChampeau and I will review these instructions and update or add to the setup instructions where necessary.
Alright, I was able to get the crawler working. I'm not sure what @jjeancharles and @danielgoldelman had to do, but on my end I also had to:
Install Firefox Nightly and modify the binary path in local-crawler.js to reference the Windows install location. Running the crawler on regular Firefox doesn't throw any errors, but it refuses to install unsigned extensions (which I didn't realize initially). There's a commented-out line that sets the binary to "firefox.Channel.NIGHTLY", which, I assume, gets the program to check the default installation path for Nightly. If we could get that to work instead of needing to set the path manually, that'd be nice.
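For reference, that commented-out line looks like selenium-webdriver's Channel API, which resolves the default Nightly install location for you. A minimal sketch of both approaches (the Windows path and option names here are illustrative assumptions, not copied from local-crawler.js):

```javascript
// Sketch: pointing selenium-webdriver at Firefox Nightly.
// Names and paths are illustrative; local-crawler.js may differ.
const { Builder } = require('selenium-webdriver');
const firefox = require('selenium-webdriver/firefox');

const options = new firefox.Options();

// Option 1: let Selenium look up the default Nightly install path.
options.setBinary(firefox.Channel.NIGHTLY);

// Option 2: hard-code the binary path (assumed Windows location):
// options.setBinary('C:\\Program Files\\Firefox Nightly\\firefox.exe');

async function buildDriver() {
  // Nightly is needed because release Firefox blocks unsigned extensions.
  return new Builder()
    .forBrowser('firefox')
    .setFirefoxOptions(options)
    .build();
}
```

If Option 1 works on everyone's machine, we could drop the manual path entirely.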
Super minor thing, but npm install needs to be run in both the 'selenium-crawler' and 'rest-api' directories. Scripts could probably be added to the parent package.json to streamline this into a single command.
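Something like this in the parent package.json might do it (a sketch — it assumes the subdirectory is named selenium-crawler and that the parent has its own package.json):

```json
{
  "scripts": {
    "postinstall": "npm install --prefix selenium-crawler && npm install --prefix rest-api"
  }
}
```

With a postinstall hook like this, a single npm install at the repo root would install both subprojects as well.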
Regarding the MySQL setup, the user that's created needs to be assigned the 'mysql_native_password' authentication plugin and then granted all privileges on the 'entries' table.
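Concretely, that's something like the following (the user, password, and database name are placeholders — substitute whatever the rest-api config actually expects):

```sql
-- Placeholders throughout: adjust user, password, and database name
-- to match the project's actual configuration.
CREATE USER 'crawler'@'localhost' IDENTIFIED WITH mysql_native_password BY 'changeme';
GRANT ALL PRIVILEGES ON your_database.entries TO 'crawler'@'localhost';
FLUSH PRIVILEGES;
```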