Should doctors and other healthcare professionals be treated mainly as a social asset, deployed to achieve broad social goals like public health and screening? Or should they be regarded mainly as private practitioners, free to sell their trained expertise to the highest bidder? Consider the investment of money and effort doctors make to educate themselves. Should that investment give them a sense of entitlement, or should they regard themselves as public servants because they provide something vitally needed that not everyone can provide? If we were to develop such socially obligatory expectations of doctors, should society take over the expense of training them? And if we in America move toward universal access and expect doctors to play their role in such a system, how can we provide the incentives necessary to motivate people to undergo the rigorous training involved? Give reasons for holding your view.