Monday, January 17, 2011
Are They Right or Delusional?
There's a big myth in the industry, one that frankly confuses me, so I thought I'd throw it out here and see what everyone else thinks.
What I thought was an old belief, the idea that authors write books while the agents, editors, and publishers do everything else, is somehow making a comeback. The argument today is that if a publisher expects the author to promote their own work, that publisher is no better than a vanity press. Therefore, it's up to the author whether they "feel like" doing book signings, guest appearances, and the like.
What do you think?