
Every single day, there is something alarming in the news about climate change. Click on any headline about a natural disaster, a forest fire, a flood, a hurricane, and the article will carry dire warnings about how this particular event is worse than ever before because of climate change. Google the words "climate change," and you can learn how it is making poison ivy itchier, glaciers smaller, and the world generally less pleasant to live in. Some researchers have even theorized a connection between earthquakes and climate change.

What I don't understand is why climate change is seen as a bad thing. It is normal for the climate to change. Millions of years before the dinosaurs, the Earth was a solid ball of ice. During the time of the dinosaurs, there was no ice at all. The planet continued to cool and warm, all without human intervention, and when humans did come along, they adapted to changes in the climate. Up until the 20th century, nobody thought that a change in the weather warranted prophesying the end of the world. Today, there is constant alarmism.
