Legal Positivism


The term “legal positivism” refers to the legal theory holding that rules of law are valid only because they are enacted by an existing political authority or accepted as binding in a given society, not because they are grounded in morality or in natural law.