So the movie is all over the net and is being talked about everywhere. It makes a pretty powerful point: the US healthcare system is fucked. It also shows a lot of other countries having great free healthcare. It's one-sided, of course, so are there any other good movies or books that show the other half? Or are other countries' healthcare systems too good to be true?
Also, the movie makes me want to live in France, and not only for the healthcare. From what I've seen, they have some gorgeous women too.