Life and Its Hidden Meanings

Does it seem right that we would allow an entire life form to become extinct because we robbed it of its ability to thrive? Or should we instead inhabit the Earth with the desire and love to share it with everything that has been granted the greatest gift of all…life? Do you see yourself as a gift? Or have you allowed the ways of mainstream media to rob you of that special feeling?