Has Obamacare Made Restaurants Partisan?

Politics in the US is discouragingly partisan. National politics has grown increasingly divided since at least the late ’60s, when the passage of civil rights legislation led many conservative southern Democrats to join the Republican Party. State politics has followed suit: even famously nice people in Wisconsin have found themselves battling their neighbors across political divides. Fortunately, most of life does not force us to confront our political differences. We can go out to dinner with family and friends, sharing space with other diners, without fear of being subjected to partisan rhetoric.
Until now.
According to a report on CNN Money, some restaurants in Florida are now making sure their patrons recognize the burdens being placed upon them by the Affordable Care Act. Here’s a copy of a receipt from one of these restaurants…(Read more and view comments at Forbes)

Peter Ubel