The short answer is no. The long answer is, again, no. Here's what the science says about the question: Do romance novels ruin relationships?
Whether you’re a kid or an adult, a reminder of hope is always welcome. I hope these books about hope, including Don’t Call Us Dead by Danez Smith, give you the warm fuzzies.