deholz opened this issue 3 years ago
Martin Rees, in his interview with Beshara Magazine, says that we do not have a coherent and connected theory of physics, and I am wondering to what extent recent developments in the field might change that or grow either end of the field? I am specifically talking about the development in the study of muons. To quote the AP, "Tiny particles called muons aren’t quite doing what is expected of them in two different long-running experiments in the United States and Europe. The confounding results — if proven right — reveal major problems with the rulebook physicists use to describe and understand how the universe works at the subatomic level."
In your book you write that new technologies “may offer new solutions to the crises that threaten our crowded world; on the other hand, they may create vulnerabilities that give us a bumpier ride through the century” (63). I was wondering if you have a specific emerging technology in mind that currently has the potential to go both ways (offer solutions vs. create vulnerabilities)?
In your book, you mentioned that different countries and entities will have different incentives regarding geoengineering. What are some scenarios you foresee that would lead to large-scale geoengineering without a worldwide (albeit presumably desperate) consensus?
In your interview with Beshara Magazine you mention that one of your areas of concern, in terms of social disruption, is that our expectations are high. We have high standards and live in a comfortable world, so we don't cope as well when a crisis happens and there is a breakdown of social order. Do you think there's a way to lower expectations? Will we be forced to? Are there ways in which our high expectations are beneficial?
How must politics transform in order to mitigate existential threats? What form of government is most effective in dealing with said threats? Is it democracy, or an altered form of democracy?
Would the lecturer please compare the US academia with the UK academia in terms of their professional performance and their relationship with the public and politicians in the face of existential threats?
What kind of international body would need to exist to effectively implement solutions to these existential crises on a global scale? What kind of international reorganization would need to happen to make cooperation more feasible and efficient?
Do you think that American democracy as it exists (with maybe a little adjustment) would be strong enough to survive and prepare for future risks? Or do you think that we need some sort of large-scale societal change?
Given that our society is driven by technological advancement, I imagine that the future will be one that's predicated on emerging technologies and research. How can our scientists best assure themselves and the public that the technology that emerges can be safely handled? How concerned should we be with technology running amok and leading to our doom?
Do you think that there could ever be a moral imperative to making certain transhumanist-modifications obligatory?
How do you think a future pandemic will affect the human race? During this pandemic, some people said it was going to be the end of the world. How did you view it?
Is there a way to remove corporate interests from politics? How can we get politicians such as Joe Manchin, whose entire careers have been funded by energy and coal, to take climate change seriously? Voting with our feet or our wallets both seem to be in vain.
What steps do you think need to be taken before humanity could start seriously considering colonization of other planets? As a follow-up, do you think that we should fix all of the problems on Earth before even considering such an endeavor?
Why do you think technology will save our species? It seems like the scale of the problems we create for ourselves dwarfs the investment we put into technology to save ourselves. For example, it seems like we've done irreversible damage to our climate, and even technology that made us instantly carbon neutral would not cool the planet fast enough to prevent a "heat age."
How can you liberate art from capitalism? That's a pretentious way of putting it, but how do we escape from milquetoast critiques of capitalism published by Amazon?
***original post 18 hours ago was deleted by GitHub so reposting*** In chapter 2 you discuss the intractability of ethical questions and "sanctity of life" arguments in biotechnology, specifically work with embryos and gene modification. As these technologies become more common, will society be forced to agree, or evolve to agree, on these questions? Or will our disagreement hamper such progress?
In examining the future of humanity, I believe it is important to also understand how humanity will evolve in tandem with the threats that we create. In terms of forms of aggressive destruction (nuclear, bioterror, cyberattacks, etc.), do you believe that humans will become more hostile or more agreeable to one another? To what extent is aggression towards others built into who we are as a species?
What are your thoughts on the potential 'space hotels' that have been rumored to be developed in coming years? Is this productive, or a waste of time and money? Is this an extension of manifest destiny, or imperialism of recent centuries?
Does globalisation imply that science is working more globally than it has before, or do the global world order and the current geopolitical landscape inhibit globalist thinking in a way that we haven't seen before?
Questions for Martin Rees, inspired by his book On The Future.
Questions: Every week students will post one question here of fewer than 150 words, addressed to our speaker, by Wednesday @ midnight, the day immediately prior to our class session. These questions may take the same angle as the one developed further in your weekly memo. By 2pm Thursday, each student will up-vote (“thumbs up”) the five questions they think are most interesting for that session. Some of the top-voted questions will be asked by students to the speakers during class.