(also known as the Wagner Act) a 1935 U.S. labor law that established the right of most private-sector workers to organize and bargain collectively, excluding farm workers and domestic workers.