Just to add a couple of items to the stack of already great answers.
1) For a large site? Absolutely. I would split the XML sitemaps out by category or page type (assuming we're only talking about static pages, not images/videos). This will help you understand potential indexation problems, alongside investigating your server logs and landing page traffic.
2) Use a delayed job/cron job to update your sitemaps on a regular basis (e.g., have them run at midnight) so that any new pages are added and junk pages removed. You're ultimately working to ensure that the search engines can see all of your content, including the latest pages.
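To make points 1 and 2 concrete, here's a minimal sketch of a nightly regeneration script that groups URLs by page type and writes one sitemap per type plus a sitemap index. The domain and page list are hypothetical, and `build_sitemaps` is just an illustrative name; in practice your CMS or framework may already have a sitemap module that does this for you.

```python
# Hedged sketch: split sitemaps by page type and emit a sitemap index.
# Run it from cron so new pages appear and junk pages drop out nightly,
# e.g.:  0 0 * * * /usr/bin/python3 generate_sitemaps.py
from collections import defaultdict

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemaps(pages, base_url):
    """pages: iterable of (url, page_type); returns {filename: xml_string}."""
    by_type = defaultdict(list)
    for url, page_type in pages:
        by_type[page_type].append(url)

    files = {}
    for page_type, urls in sorted(by_type.items()):
        entries = "\n".join(f"  <url><loc>{u}</loc></url>" for u in urls)
        files[f"sitemap-{page_type}.xml"] = (
            '<?xml version="1.0" encoding="UTF-8"?>\n'
            f'<urlset xmlns="{SITEMAP_NS}">\n{entries}\n</urlset>'
        )

    # The index file is what you submit to the search engines; it points
    # at each per-type sitemap so you can track indexation per section.
    index_entries = "\n".join(
        f"  <sitemap><loc>{base_url}/{name}</loc></sitemap>"
        for name in sorted(files)
    )
    files["sitemap-index.xml"] = (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        f'<sitemapindex xmlns="{SITEMAP_NS}">\n{index_entries}\n</sitemapindex>'
    )
    return files
```

The per-type split is the useful part: when the "products" sitemap shows 40% indexation and the "categories" sitemap shows 95%, you know exactly where to dig.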
3) It's worth paying attention to your XML sitemaps and ensuring, as John rightly pointed out, that they are free of broken URLs (400s and 500s), free of redirects (301s and 302s), and don't exceed 50,000 URLs each.
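A rough sketch of that audit step: filter out anything that isn't a clean 200, then chunk the survivors to stay under the 50,000-URL-per-sitemap limit. The `status_of` lookup here is a stand-in for whatever HEAD-request check you already run against your URL list (it's a plain dict so the example stays offline).

```python
# Hedged sketch: keep only 200s (no 3xx redirects, no 4xx/5xx errors),
# then split into sitemap-sized chunks.
MAX_URLS_PER_SITEMAP = 50_000  # hard limit from the sitemaps.org protocol

def clean_and_chunk(urls, status_of, limit=MAX_URLS_PER_SITEMAP):
    """status_of maps url -> HTTP status code; returns a list of
    URL chunks, each small enough for one sitemap file."""
    clean = [u for u in urls if status_of.get(u) == 200]
    return [clean[i:i + limit] for i in range(0, len(clean), limit)]
```

Redirects in a sitemap aren't fatal, but they waste crawl budget and muddy your indexation reporting, so it pays to list only final, canonical URLs.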
4) And lastly... don't get obsessed with having a MASSIVE site with hundreds of thousands of thin pages and very poor indexation. Instead (and particularly post-Panda), focus on 'making every page count'.
Cut down the bloat. Improve your overall site quality. Consolidate your internal and external link flow, and maximise the effectiveness of your crawl budget.
As your Domain and Page authority improve, you'll be able to handle a larger site...
For a marketplace, this is essential.
Hope this helps.