I'm dissatisfied with the way English is taught in America. Teachers and professors emphasize literature to the detriment of two equally important aspects of English: language and writing.
Because English is a risky career decision, the majority of English majors are women, who culturally worry less about providing for a family and who historically brought about the rise of the novel. Students also choose English because it is an easy major. Your assignments are to read works of fiction that you would probably be reading anyway. You don't learn any practical skills, as you would in a science, engineering, or nursing major. In short, it is an academic degree, not a professional one.
These hordes of English majors usually find themselves teaching English, where their love of literature and ignorance of language and writing are reflected in what new students are taught. This is why, when I taught English overseas, students frequently scrutinized my lessons and questioned whether a principle was British English or American English. To them, Americans were notorious for speaking and writing their own language poorly.
Additionally, my English classes invariably study works of fiction. Much can be learned from these books, but more is left unknown because we never study historical documents, biographies, or any other kind of real text. Instead, we take a piece of fiction and debate endlessly over the existence of phallic symbols in it. Wouldn't it be nice to read what our forefathers said about America, or what Queen Elizabeth said about the Spanish Armada? I care more about the history of the English language than I do about penises.
I propose that English programs be changed to reflect an equal division between language, literature, and writing: language, so you can accurately express yourself and understand others; literature, to learn how others have expressed their thoughts; and writing, so that you can and do express your own. The current focus on literature is crippling our ability to communicate.