Monday, May 9, 2011

Do Immigrants Benefit America?

Immigrants have benefited the United States and have not harmed it in any way. Until the 20th century, most immigrants were European; since then, immigration has changed dramatically. Today, immigrants to the U.S. come from all over the world, with many arriving from Mexico, Asia, Latin America, and the Caribbean. I believe we should legalize the illegal immigrants coming into the U.S., because there are so many job opportunities available. Immigrants are willing to take jobs that many Americans will not, such as construction, housekeeping, and yard work.
