koshinf / Bloodborne_Lore

Analyzing Bloodborne Lore

Post your analysis sections! #19

Open NADGIT opened 5 years ago

NADGIT commented 5 years ago

Current doc:

Bloodborne Catalog and Analysis

Project lead: Frank Koshinskie
Assisting members:

Purpose: The purpose of this project is to catalog and analyze interactable entities within the video game Bloodborne. The reasons for doing this are multifaceted. The primary purpose is to determine where the developers placed the greatest concentration of entities, and consequently of lore, across the various areas of the game. The secondary purpose is to catalog all of the interactable entities, sorted by location, for quick, user-friendly reference. The tertiary purpose is to apply a variety of coding languages to create a fully functional website that presents meaningful data in a format that is both user-friendly and time-efficient.

Methodology: Various coding languages are used in this project. XML markup is used to catalog and categorize the interactable entities, and a schema was devised to normalize the project's XML format. Regular expressions are used to edit the project's XML documents quickly and efficiently.
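A catalog entry under this methodology might look something like the sketch below. The element and attribute names here are purely illustrative assumptions, not the project's actual schema:

```xml
<!-- Hypothetical shape of one cataloged entity, sorted under a location.
     Names are assumptions for illustration only. -->
<location id="central-yharnam">
  <entity type="note">
    <name>Plea to the stray beast</name>
    <text>Oh, dreaded beast, return from whence you came...</text>
  </entity>
</location>
```

Normalizing every entry to one such shape is what makes the later XSLT, XQuery, and network-analysis steps possible.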

NADGIT commented 5 years ago

My section: (XSLT and Saxon-JS)

XSLT was used to gather information on entities according to their location ID. Because the XML needed corrections at various stages of development, the best course of action was to create a system where the XML could be updated without also updating the HTML files. To keep the HTML and the XML decoupled, it was decided that the XSLT should run on the client, on demand. This was achieved using Saxon-JS, an XSLT processor written entirely in JavaScript, which lets an HTML page apply an XSLT stylesheet to an XML file through a simple scripted API. Saxon-JS proved fast and effective, allowing the HTML pages to draw from the updated XML files. JavaScript was also used to create an interactive SVG map, so that users can click on a location and bring up all of the entities related to it.
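A minimal sketch of what such an on-demand, client-side transform looks like. The file names, destination, and the helper function itself are assumptions for illustration, not the project's actual code:

```javascript
// Build the options object Saxon-JS takes: it describes the source XML,
// the stylesheet, and where in the page to put the result.
// (File names here are placeholders, not the project's real files.)
function transformOptions(xmlUrl, stylesheetUrl) {
  return {
    sourceLocation: xmlUrl,           // the XML catalog to transform
    stylesheetLocation: stylesheetUrl, // Saxon-JS runs a compiled (SEF) stylesheet
    destination: "replaceChildren"     // write the HTML result into the page
  };
}

// In the page, after loading the Saxon-JS script, one would call e.g.:
// SaxonJS.transform(transformOptions("entities.xml", "catalog.sef.json"));
```

Because the stylesheet runs in the browser at view time, any correction to the XML shows up on the site with no HTML regeneration step.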

koshinf commented 5 years ago

(HTML and CSS)

When thinking about how to design the look of the site, I first had to consider what the index page should feature. My first thought was to give a brief explanation of the project goals, along with some nice-looking links to the other pages on the site, even though those links are mostly redundant. I hand-coded the HTML and CSS for most of the site, with some SVG elements made by my team members dropped into containers I had built. Most of the site is essentially a container with a small amount of content inside it. I used inline styling mainly to speed up the process of building the site; it normally wouldn't appear in such numbers. One case where inline styling was genuinely required is the TSV page, since it reuses a container from another page. Overall, the design of the site has been crafted to get users where they need to go as quickly as possible.
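The container-plus-content pattern described above can be sketched roughly as follows. The class names and styles are assumptions for illustration, not the site's actual markup:

```html
<!-- Hypothetical page skeleton: one styled container, a little content,
     and a slot where a teammate's SVG (e.g. the clickable map) is dropped in. -->
<div class="container" style="max-width: 60em; margin: 0 auto;">
  <p>Brief explanation of the project goals.</p>
  <div class="svg-slot">
    <!-- teammate-provided SVG map goes here -->
  </div>
</div>
```

Reusing one container shape across pages is also why the TSV page needed a few inline overrides rather than its own stylesheet.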

ghost commented 5 years ago

My section (XML and network analysis)

In regards to the XML, I did a lot of the grunt work of gathering raw data. Working with the other members, I helped devise a formatting hierarchy, and I normalized our data to fit that hierarchy and the schema it was based upon. I also helped generate some TSV files with XQuery. These files fed into a network analysis from which we drew our conclusions: I was able to map each NPC to the keywords they spoke, creating connections between NPCs, the words they speak, and how they relate to one another. This was only a sample conclusion; many more things can be learned from the generated TSV files. The tools are now there for anyone to play with, and to generate their own useful data with.
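The NPC-to-keyword edge list behind that network analysis can be sketched as below. The record fields, the keyword list, and the function are hypothetical stand-ins for the project's XQuery-generated TSV, shown here in JavaScript for illustration:

```javascript
// Keywords of interest (an assumed, illustrative list, not the project's).
const KEYWORDS = ["blood", "hunter", "dream"];

// Turn NPC dialogue records into TSV edge rows "npc<TAB>keyword",
// one row per keyword an NPC actually speaks. A tool like Gephi or
// Cytoscape can then load these rows as a two-mode network.
function npcKeywordEdges(records) {
  const rows = [];
  for (const { npc, text } of records) {
    for (const kw of KEYWORDS) {
      if (text.toLowerCase().includes(kw)) rows.push(`${npc}\t${kw}`);
    }
  }
  return rows.join("\n");
}
```

Each shared keyword links two NPCs through the same node, which is what lets the analysis show how characters' dialogue relates.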