It's unlikely I'm part of your target demographic for this question. But you've given me the opportunity to vent; therefore vent I shall!
Gripe 1: Yesterday I spent hours writing a parser that takes raw-text output from Bob's database and chops it up appropriately for insertion into Bill's database. If Bob would just give Bill access to the database, no such parser would be required.
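For a sense of the busywork involved, here's a minimal sketch of that kind of glue parser. Everything here is hypothetical: I'm assuming a pipe-delimited export with a header row, and the field names and table are invented for illustration.

```python
import csv
import io
import sqlite3

# Hypothetical stand-in for Bob's raw-text export: pipe-delimited,
# one header row. Real exports are rarely this tidy.
RAW_EXPORT = """id|name|joined
1|Alice|2009-03-14
2|Bob|2010-07-02
"""

def parse_export(raw: str) -> list[dict]:
    """Chop the raw dump into dict rows keyed by the header fields."""
    reader = csv.DictReader(io.StringIO(raw), delimiter="|")
    return list(reader)

def load_into(db: sqlite3.Connection, rows: list[dict]) -> None:
    """Insert the parsed rows into a table standing in for Bill's database."""
    db.execute(
        "CREATE TABLE IF NOT EXISTS people (id INTEGER, name TEXT, joined TEXT)"
    )
    db.executemany("INSERT INTO people VALUES (:id, :name, :joined)", rows)
    db.commit()

rows = parse_export(RAW_EXPORT)
conn = sqlite3.connect(":memory:")
load_into(conn, rows)
```

All of which is code that exists only because two databases can't talk to each other directly.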
Gripe 2: I spend an inordinate amount of time coding up processes that really accomplish nothing but data entry. The tasks are too small for it to be worthwhile to program spiders to crawl the information sources, but they're too significant to neglect. I'd much rather outsource this stuff, but I spend more time explaining these one-off tasks to people overseas than it takes me to perform them myself!
Gripe 3: Many of my data-collection tasks are massive and repetitive. Does anyone out there know a good source of trainable crawlers/scrapers? My difficulty is that I need them for a series of unpredictable mid-term tasks rather than one consistent long-term task.
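By "trainable" I mean something where retargeting is configuration rather than new code. A bare-bones sketch of the idea, using only the standard library; the tag/class "training" interface and the sample page are invented for illustration:

```python
from html.parser import HTMLParser

class ConfigurableScraper(HTMLParser):
    """Scraper whose target is a (tag, class) setting, not hard-coded logic,
    so the same class can be re-pointed at each new mid-term task."""

    def __init__(self, tag: str, cls: str):
        super().__init__()
        self.tag, self.cls = tag, cls  # the "training": two settings
        self.capture = False
        self.results: list[str] = []

    def handle_starttag(self, tag, attrs):
        # Start capturing when we hit the configured tag + class.
        if tag == self.tag and dict(attrs).get("class") == self.cls:
            self.capture = True

    def handle_data(self, data):
        if self.capture:
            self.results.append(data.strip())
            self.capture = False

# Hypothetical page for demonstration.
PAGE = '<html><span class="price">$9.99</span><span class="name">Widget</span></html>'
scraper = ConfigurableScraper("span", "price")
scraper.feed(PAGE)
```

Something at this level of reusability, but robust enough to survive real-world markup, is exactly what I haven't found.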