jakobzhao / geog595

Humanistic GIS @ UW-Seattle

Bot #4

Closed · jakobzhao closed this issue 3 years ago

zwguo95 commented 4 years ago

Thelwall and David Stuart's piece on web crawling ethics caught my eye in many ways. One best practice for scholars is to always check the existence and content of robots.txt before scraping. Doing so helps crawlers avoid inadvertently contributing to impression fraud (having crawlers or crowd workers repeatedly view a site's ad-bearing pages) and click fraud (automated clicks on ads). In addition, scholars need to take several questions into consideration. First, how many websites we should crawl really matters: some prepackaged crawlers have limits on the number of sites and the rate of crawling. Second, how deep into a given website should the crawler go? Are we looking for both content and links, or only the main landing page of interest? However, a recent article I read by Freelon (https://www.tandfonline.com/doi/full/10.1080/10584609.2018.1477506?journalCode=upcp20) raises an interesting point. He argues that scholars should not confuse the potential consequences of violating TOS with those of violating research ethics. The distinction is crucial in the sense that the former aims to protect the company while the latter helps to protect the research participants. I feel this somewhat relieves the pressure on scholars who violate TOS/robots.txt, so long as they can justify that doing so serves the common good or the public in general.
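The robots.txt check described above can be sketched with Python's standard `urllib.robotparser`. This is a minimal illustration, not anything from the readings; the example robots.txt body, the bot name, and the URLs are all hypothetical.

```python
from urllib.robotparser import RobotFileParser

def allowed(robots_txt: str, user_agent: str, url: str) -> bool:
    """Parse a robots.txt body and report whether user_agent may fetch url.

    In a real crawler you would fetch robots.txt from the target site
    (e.g. with RobotFileParser.set_url(...) and .read()); here we parse
    an in-memory string so the sketch is self-contained.
    """
    rp = RobotFileParser()
    rp.parse(robots_txt.splitlines())
    return rp.can_fetch(user_agent, url)

# Hypothetical robots.txt that disallows /private/ for all user agents.
EXAMPLE_ROBOTS = """User-agent: *
Disallow: /private/
"""

print(allowed(EXAMPLE_ROBOTS, "research-bot", "https://example.com/about.html"))
print(allowed(EXAMPLE_ROBOTS, "research-bot", "https://example.com/private/data.html"))
```

A polite research crawler would run a check like this before every request, and also rate-limit itself so that it does not burden the site.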

weixingnie commented 4 years ago

What is the autonomy of the technological phenomenon? Despite several emotional viewpoints, I am impressed by Jacques Ellul's argument. All technology needs regulation, but how can we constrain it? The obvious answer is to rely on the government. But why? Earlier philosophers and tech pioneers pictured the democratization of technology and insisted on a strong, moral, righteous government. Research certainly relies on funding, yet we rarely question the role of social reform or the reliability of government. Furthermore, is the idea that technology regulation should be based on the experienced ethics of the behavior required for the technological system to function well truly profound? Not just the users, and not just the government: we need a righteous system that can endure, adapt, and judge a changing society and its technology. The article on web crawling is also striking. When I first began to learn about web crawling, I was scared and frightened by the idea of someone else going through all of my social media posts and comments. Scholars and their bots need a boundary that draws a clear line between privacy and usable user data. However, personalized services and data analysis cannot avoid this paradox of privacy vs. persona. Where is the boundary? Should the website decide it? The users? Or the government? All the readings are connected through automation, web crawling, agency regulation, and artifacts.

jouho commented 4 years ago

After reading Vincent J. Del Casino Jr's discussion of robots and machines, I was particularly interested in the idea of uneven rights distributed among robot users. Today, some of the hottest topics in the field of artificial intelligence are the development of smart cities and self-driving cars, and companies such as Toyota and Waymo are already planning to deploy them in our lives. As robots become more deeply embedded in society, a large portion of human jobs will be taken over by them. One big concern is the resulting human space of 'uneven rights, compensation, and safety'. Vincent J. Del Casino Jr notes that '[some] computer users can experiment, make, and do-it-themselves, while others must reliably keep the infrastructure humming and accessible'. This implies that the development of AI and robots inevitably leads to inequality of rights between developers and users, companies and customers, and countries and citizens. This is already taking place today: big tech companies such as Microsoft, Amazon, and Apple dominate the tech market and drive much of today's economy. How should we approach this problem? Should we all start learning to code so that we can join these tech companies and avoid being left behind? Or should companies instead distribute these 'rights' to alleviate the expanding gap of inequality?