So would I. I had my last U.S. History class at a public high school about five years ago, and I'd consider it an accurate representation of U.S. history, without a lot of the politically correct junk that people complain about. (Of course, by now I've had plenty of time to forget parts of it, but that's a completely different issue.)
On a side note, did anyone here actually learn anything about Africa beyond Egypt and the requisite paragraph on the Boer War before college? I didn't know jack about Africa until I took an African history & geography course in my undergrad years.