You don't need any "weird" goal like paperclips. You just need the basic goals of survival and expansion that every species (implicitly) possesses to understand why a superintelligence is a danger.
No, we are not madly harvesting the world's resources in a monomaniacal attempt to optimize artificial intelligence. We are, however, harvesting the world's resources in an attempt to optimize artificial intelligence.