From my point of view, the world is neither getting worse nor better; it has always been the way it is, and it doesn't seem like that's going to change. That's just my opinion.
Edit: In fact, what is getting worse is our economic system, but that is nothing new.
I did not deny climate change. I was referring to social issues, to people in general. Why the fuck do you think I'm a climate denier?