How does everyone feel about parents using religion to influence how a child sees the world? On one hand, religion teaches hope and all that cute jazz. On the other hand, some parents make their kids think certain things are wrong (i.e., me, raised in a Catholic family and gay), and those kids then face a childhood struggle to fit in. Also, I thank God for food (partly out of habit), but I'm also thankful for the men and women who make it possible for me to even have food. Idk, those are just my examples and viewpoints. What's everyone else's stance on this divisive issue?