Should employers be required to offer health insurance? (McClatchy Newspapers)

Admin

McClatchy Newspapers - WASHINGTON — Requiring employers to offer most workers health insurance has long been seen as a crucial piece of Democratic efforts to overhaul the nation's health care system, but the legislation the Senate is expected to consider soon is unlikely to include any such mandate.
