aimacode / aima-javascript

Javascript visualization of algorithms from Russell And Norvig's "Artificial Intelligence - A Modern Approach"
http://aimacode.github.io/aima-javascript/
MIT License

Repository Structure #1

Closed duaraghav8 closed 5 years ago

duaraghav8 commented 8 years ago

I propose the following structure for this repo. It must be divided into 2 directories:

  1. CommonJS
  2. AMD

The former would contain modules following the CJS pattern (server-side JS code, i.e., Node.js) and the latter the AMD pattern (client-side JS code).

For example: if the A* search code needs the argmax() functionality from utils.js, the CJS version of search.js (the file which contains the A* search code) would use the statement:

```javascript
var utils = require("./utils.js");
```

and the CJS utils.js must contain the following lines of code:

```javascript
function argmax(...) {...}
exports.argmax = argmax;
```

This keeps the repo neatly divided and, I believe, will eliminate confusion. However, it might lead to 2 copies of the same code being kept. For example, the argmax() implementation remains the same regardless of the module pattern, but since the method of importing a module differs between the 2 directories, we may have to maintain 2 separate copies of this function: 1 inside the CJS folder and 1 inside the AMD folder.

I could be wrong, I'm still learning. So I'd be happy if more people contribute on deciding the Directory Structure, I believe it is crucial in order to make it easier for the user.

Also for Visualization, I believe D3 provides an excellent set of tools.

norvig commented 8 years ago

I have very little JavaScript experience, but my inclination would be to not use Node.js or any other server-side framework; rather, I would prefer to have the whole thing client side, for example with jQuery. I could be convinced that server-side code is necessary, but for now I don't see it -- we don't need to keep a database; we just need to run demos in the browser.

duaraghav8 commented 8 years ago

I see your point Mr. Norvig. Going by your approach, we will have 2 parts to each algorithm in the book:

  1. The implementation of the algorithm itself (written in core JS)
  2. An HTML document containing the algorithm's visualization (written in jQuery & D3).

This way, the user can see the basic algorithm implementation without any added complexities and visualize it side-by-side.
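
As a sketch of that split (file and function names here are hypothetical, not the repo's), the algorithm file would contain only plain JS with no DOM code, and the page would load it alongside the jQuery/D3 visualization:

```javascript
// minimax.js -- the algorithm file: plain JS, no DOM, jQuery, or D3 code.
// A visualization page (index.html) would include this via a <script> tag
// and call `minimax` to drive its animation.
function minimax(node, maximizing) {
    if (!node.children || node.children.length === 0) {
        return node.value;              // leaf node: static evaluation
    }
    var scores = node.children.map(function (child) {
        return minimax(child, !maximizing);
    });
    return maximizing ? Math.max.apply(null, scores)
                      : Math.min.apply(null, scores);
}
```

The reader can then study minimax.js on its own, while the HTML document next to it shows the same function running step by step.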

Ghost---Shadow commented 8 years ago

Since we are going to see JavaScript in action anyway, I suggest we make a gh-pages branch. It would have the explanation in HTML and some neat, colourful output of the algorithms on an HTML5 canvas.

gh-pages does not support any server-side scripting, so it is best to avoid server-side code if possible.

kepta commented 8 years ago

Hello everyone, First of all, this is an awesome project!! I would like to share my knowledge regarding the architecture for this repository.

Any modern JavaScript app requires Node.js / npm for the development process. Even if you want to build a completely front-end app with no Node.js code, you will still require:

  1. npm init, which initialises an empty package and adds a file called package.json that keeps a record of all the dependencies of this web app.
  2. ES6/ES2015 is out and supports native module loading. We can use JSPM or Webpack, depending on our needs.
  3. A build system, which would minify and compress the code and deploy it to gh-pages.
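
For reference, the package.json produced by step 1 might look roughly like this (the name, scripts, and versions below are purely illustrative, not what this repo actually uses):

```json
{
  "name": "aima-javascript",
  "version": "0.1.0",
  "description": "Visualizations for AIMA algorithms",
  "scripts": {
    "build": "gulp build",
    "deploy": "gulp deploy"
  },
  "devDependencies": {
    "gulp": "^3.9.0"
  }
}
```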

Let me know what you guys think, I would be very happy to work on this project. Here is an awesome article which throws some light on this subject.

Ghost---Shadow commented 8 years ago

I envisioned having an HTML file and a few JS files in each directory. Making a single-page website was not what I was hoping for.

We are going to have hundreds of scripts, each hundreds of lines long. They need not all be loaded into a single page every time; I think that would have the opposite effect and slow the website down.

We can use gulp to merge the JS and CSS that are common to most pages, but we should not shoot for a single-page app.

kepta commented 8 years ago

I don't think even having hundreds of scripts would have any significant impact on load time. Vendor bundles contain thousands of small files and get compressed down.

And if it still concerns you, we can use code splitting and have multiple bundles that load asynchronously.

Ghost---Shadow commented 8 years ago

Also, we do not want to obfuscate the code that we are trying to open source. The JS files implementing the algorithms should be kept away from gulp.

kepta commented 8 years ago

Well, I don't understand what you mean by obfuscating the code. If that means compressing and uglifying, that is a build step used by all modern web applications. To view the source code, you go to the repository. You can right-click and inspect any modern website and find the code 'obfuscated', including the open-source ones. https://engineeredweb.com/blog/why-minify-javascript/

Ghost---Shadow commented 8 years ago

I see your point about minifying code. Now, should we shoot for a single page app or a structured one?

duaraghav8 commented 8 years ago

Aren't we focusing too much on the app? I mean, the goal is to write working implementations of the algorithms and their visualizations. So when I uploaded Minimax, I simply put it under "Chapter 5/Minimax". I didn't spend too much time on site structure.

Now, given that, I completely agree that the user needs to be given a clean interface, so the site is an essential part. But all I'm saying is let's keep a structure which is easy to work with. People contributing code here should not have to spend time understanding the structure: they should fork, upload their code under the right Chapter/Category, and make a PR. No confusion. So I think every algorithm should be self-contained, i.e., 1 directory is dedicated to 1 algorithm. This directory should contain the implementation, support files, and the HTML document showing the visualization, and should not be affected by any other directories. If we look at the Java & Python repos, they contain only files related to the algorithms. Nothing else.
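
As an illustration (directory and file names below are hypothetical), the self-contained layout could look like:

```
Chapter-5/
  Minimax/
    index.html      # the visualization page
    minimax.js      # the algorithm implementation
    helpers.js      # support code used only by this demo
Chapter-3/
  A-Star-Search/
    index.html
    search.js
```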

That's what I think.

Ghost---Shadow commented 8 years ago

I agree with having an easy-to-understand directory structure too. However, the gulp build kind of broke that (just click on any link). The project will definitely be too big for a single-page app.

It is only the 16th of March; I think we have enough time to set some basic rules before we start the grunt work. Once we start, refactoring would be a huge pain.

I really like the idea of compressing code; however, I was planning to embed all the code in the HTML pages for simultaneous reference (just like any coding-tutorial website). That kind of negates the benefits gained through compression.

We can still compress 3rd party libraries like bootstrap, jquery, two.js, etc.

kepta commented 8 years ago

I don't understand what compressing the code and having a proper build step have to do with writing algorithms and visualisations.

Ghost---Shadow commented 8 years ago

We, or at least I, am not used to such web-development procedures. I am trying hard to keep an open mind about it.

Look at this web page, or this one. I have something like this in mind as the finished product. The HTML page would load the JavaScript both as text and as an executable script that animates the HTML5 canvas. So if we compress the executable, it would only give half the benefit.

duaraghav8 commented 8 years ago

@Kepta My point is that we're having too much discussion about the app. The app is not the main thing here. I'm not against compression and build features; I was just suggesting that they are not the focus here. Also, I'm not very well versed with JSPM, so if you can handle the whole build part, go ahead! I'll even get to learn from you (though I won't be able to contribute to that part).

Ghost---Shadow commented 8 years ago

I can't effectively start working before we have finalized the website structure. A lot of code can and will be reused.

I have to admit I am hearing the names gulp, JSPM, and .travis for the first time in my life, but I expected that to happen, so I am trying my best to adopt them.

kepta commented 8 years ago

@Ghost---Shadow, I am still not clear on what the author wants or how he wants the web app to look. I would wait for him to reply. @norvig ...

Meanwhile, why don't you have a look at modern web development? I can help you with that.

Some more links

https://developer.mozilla.org/en-US/docs/Web/Apps/Fundamentals/Modern_web_app_architecture

https://medium.com/@wob/the-sad-state-of-web-development-1603a861d29f#.6abue5k6p

Also, learn a bit more about git merging -- I saw your PR had git merge conflict markers left in the source code itself. Let me know what you think.

Ghost---Shadow commented 8 years ago

If we are making a multi-page app, is there any clever way to move the HTML files from each folder to the build directory, or do I have to do something like this and call gulp copy before every build?

```javascript
// Gulp task: copy every chapter folder's HTML files into build/
gulp.task('copy', function () {
    return gulp.src(['*-*/*.html'])
        .pipe(gulp.dest('build'));
});
```

redblobgames commented 7 years ago

This issue is still open. Now that

  1. the focus of this project has shifted from writing JavaScript code to writing visualizations, and
  2. we are not writing a web application, and
  3. we are not writing a reusable library,

I wanted to revisit this. We're making 28 static web pages. That's all. Given the reduced scope, is there anything we can simplify/eliminate for this year's gsoc?

Ghost---Shadow commented 7 years ago

You don't need to remove anything, just don't update it.

Ghost---Shadow commented 7 years ago

@redblobgames Definitely not, but why fix something that ain't broken? All you need to type is gulp deploy and it is pushed to gh-pages.

redblobgames commented 7 years ago

@Ghost---Shadow I agree for gh-pages, but for local testing, having a build step is slower than not having a build step. To make testing locally faster and simpler, I propose we use the bootstrap cdn instead of our own copy. That way you can test the page by saving the file and reloading, and skipping the build step. Also, some of Chrome's dev tools work only when we work with source files directly; we are unable to use these tools with the current build system.

One seamless way to do this would be to have a development version of ../main.js that loads bootstrap indirectly, while the gh-pages production version of ../main.js would include bootstrap directly.

However, this made me think: why not use the bootstrap CDN on gh-pages too? Lots of web sites use bootstrap and jquery, so these are already in the browser cache. By serving our own copy, people who visit the page have to load a new copy of these (large) libraries instead of reusing the cached copy. Unless we are planning to modify bootstrap, it seems simpler to use the standard minified bootstrap library instead of making a copy of it. We are already using the CDN for jquery and two.js, so doing it for bootstrap would make things more consistent.
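
For the record, swapping to the CDN is just the usual tags in the page head; the version numbers below are illustrative, not necessarily what the repo pins:

```html
<!-- Load Bootstrap and jQuery from CDNs instead of a local bundled copy.
     Versions shown here are examples only. -->
<link rel="stylesheet"
      href="https://maxcdn.bootstrapcdn.com/bootstrap/3.3.7/css/bootstrap.min.css">
<script src="https://code.jquery.com/jquery-3.1.1.min.js"></script>
<script src="https://maxcdn.bootstrapcdn.com/bootstrap/3.3.7/js/bootstrap.min.js"></script>
```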

I tried using the bootstrap CDN for local testing while working on ch 2 and found the development process went much more smoothly.

Ghost---Shadow commented 7 years ago

@redblobgames I should have told you this, but I forgot: the gulpfile has two symmetric tasks, copy and copyback. copy copies the files to the build folder and copyback does the reverse. So I did all my programming inside the build folder, then ran gulp copyback and committed to git. I host the server inside the build folder to make the changes live.

I agree this is a very hacky way of doing stuff so, anything better is much appreciated.

I think the older commits did use the CDN, but it was bundled later. Anyway, sounds great, go ahead.

redblobgames commented 7 years ago

@Ghost---Shadow Oh! I somehow missed the copyback step. I agree, it seems hacky. I'll work on merging the two copies for development so that we don't need copy + copyback while testing our changes.

redblobgames commented 7 years ago

I checked in a change that uses the bootstrap CDN; this removes the need for the copyback step.

Ghost---Shadow commented 7 years ago

Not really; if in the future we use some other scripts which are not already hosted, we would need to use this.

nervgh commented 7 years ago

I don't know exactly what this discussion means, but I have some production experience with front-end and back-end JavaScript development, and I could help with things like bundling (minification/uglification) if this project really needs that.

However, I think this project should focus on the algorithms and their visualization instead of particular production concerns.

norvig commented 7 years ago

I agree that the focus is on understanding algorithms through visualization.


redblobgames commented 7 years ago

@nervgh Yes, the focus is on the visualizations and not on bundling, minification, etc. Algorithms from the textbook are needed only if the visualizations need them (not all of them do), and the problem sizes are typically small, so algorithm optimizations are usually not needed.

redblobgames commented 5 years ago

I'm closing this, as nothing in 2017 or 2018 was written this way, and there is no plan to use any of this in 2019. We can reopen if the maintainer changes the plan.