The problem is I need to do this at scale, quickly, in a whitehat way. I thought about pinging, hiring guys on Fiverr to submit to social networks or using Scrapebox's rapid indexer add-on - but I think that all of these solutions are going to look mighty suspicious to Google and result in penalties.
I agree with some of the recommendations above:
1) Focus on making sure your XML sitemap is accurate, and make sure your priority weightings (the sitemap's priority values) are set correctly from parent to child (category > sub-category > product detail).
2) Submit it through Google Webmaster Tools.
3) Submit partials in increments (again, 50k at a time is a good suggestion) and pay close attention to your index rate to see how fast these are picked up.
4) You can brute-force a lot of these crawls by submitting and re-submitting your sitemap(s), as often as every day.
5) For very important URLs that are not getting picked up, a handy trick is to tweet them out from an account that has a positive share of voice (more followers than following).
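The priority weighting in point 1 can be sketched in Python. This is a minimal illustration, assuming a simple three-level hierarchy; the URLs and priority values are made up for the example:

```python
# Sketch: generate an XML sitemap with descending <priority> values
# (category > sub-category > product). URLs and weights are illustrative.
import xml.etree.ElementTree as ET

PRIORITIES = {"category": "0.8", "subcategory": "0.6", "product": "0.4"}

def build_sitemap(urls):
    """urls: list of (loc, level) tuples, e.g. ("https://example.com/shoes", "category")."""
    urlset = ET.Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
    for loc, level in urls:
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = loc
        ET.SubElement(url, "priority").text = PRIORITIES[level]
    return ET.tostring(urlset, encoding="unicode")

xml = build_sitemap([
    ("https://example.com/shoes", "category"),
    ("https://example.com/shoes/running", "subcategory"),
    ("https://example.com/shoes/running/model-1", "product"),
])
print(xml)
```

Keep in mind priority is only a hint to crawlers, not a command.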
Putting that many new URLs on a new domain into the system may trigger a manual review. I would probably start by submitting 50K at a time over the course of 1-3 months.
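Splitting the URL list into 50K batches is a one-liner; 50,000 URLs also happens to be the sitemap protocol's per-file limit, so the same batches can each become their own sitemap file. The URLs here are made up:

```python
# Sketch: split a large URL list into batches of at most 50,000 URLs
# (the sitemap protocol's per-file cap) so they can be submitted in
# increments over time. Example URLs are hypothetical.
def chunk_urls(urls, limit=50_000):
    return [urls[i:i + limit] for i in range(0, len(urls), limit)]

batches = chunk_urls([f"https://example.com/p/{i}" for i in range(120_000)])
print([len(b) for b in batches])  # [50000, 50000, 20000]
```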
Also, the best ways to get those deeper URLs indexed are:
1) Set up the XML sitemap in Webmaster Tools.
2) Set up easy-to-crawl HTML sitemaps: web pages with ~100 links per page that are easy to crawl and navigate through via pagination.
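Point 2 can be sketched as a simple pagination pass over the URL list. This is a minimal illustration with hypothetical paths; a real implementation would use your site's templates:

```python
# Sketch: split a large URL list into paginated HTML sitemap pages of
# ~100 links each, with next/previous navigation. Paths are hypothetical.
def html_sitemap_pages(urls, per_page=100):
    pages = []
    chunks = [urls[i:i + per_page] for i in range(0, len(urls), per_page)]
    for n, chunk in enumerate(chunks, start=1):
        links = "\n".join(f'<li><a href="{u}">{u}</a></li>' for u in chunk)
        nav = []
        if n > 1:
            nav.append(f'<a href="/sitemap/page-{n - 1}.html">Previous</a>')
        if n < len(chunks):
            nav.append(f'<a href="/sitemap/page-{n + 1}.html">Next</a>')
        pages.append(f"<ul>\n{links}\n</ul>\n<p>{' | '.join(nav)}</p>")
    return pages

pages = html_sitemap_pages([f"https://example.com/p/{i}" for i in range(250)])
print(len(pages))  # 3 pages: 100 + 100 + 50 links
```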
I would recommend not using any of the techniques mentioned; instead, try this:
1) Upload a sitemap to Google Webmaster Tools.
2) Use Fetch as Google in Webmaster Tools (the homepage and some categories).
3) Create social profiles such as Twitter, Pinterest, and Facebook and put your URL there. These sites have very high authority and get indexed all the time. Share the link in posts.
4) Do cheap StumbleUpon Paid Discovery.
5) Do cheap Reddit paid discovery.
6) Bookmark the site on Delicious (just there, nowhere else).
7) Write 10 blog posts, drive traffic to them through PPC, and have them link back to your categories or the site itself.
8) Submit an RSS feed.
9) Set up a small PPC budget and get some clicks for very long-tail, cheap keywords. Even if you get a lot of impressions and no clicks, this is still visibility to Google. So bid just enough to be on the first page for keywords that are cheap. Result: a few clicks, a lot of impressions, and now Google knows about your site.
10) Monitor Webmaster Tools and make sure that robots.txt is not blocking any URLs.
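The robots.txt check in point 10 can be automated with Python's standard library. A minimal sketch, with an inline robots.txt and example URLs for illustration; in practice you would fetch your live robots.txt:

```python
# Sketch: verify robots.txt isn't blocking URLs you want indexed.
# The robots.txt content is supplied inline here; in practice, point
# RobotFileParser at https://yoursite.com/robots.txt instead.
from urllib.robotparser import RobotFileParser

robots_txt = """\
User-agent: *
Disallow: /cart/
Disallow: /search
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

urls_to_check = [
    "https://example.com/shoes/running/model-1",
    "https://example.com/cart/checkout",
]
for url in urls_to_check:
    blocked = not rp.can_fetch("Googlebot", url)
    print(url, "BLOCKED" if blocked else "ok")
```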
I hope this helps.
I would recommend:
1. Make a clear product hierarchy.
2. Categorize products properly.
3. Create separate sitemaps for products, categories, and the blog.
4. Use rel="next" and rel="prev" on paginated product pages.
5. Use rel="canonical" on pages with session IDs to control crawl budget.
6. Try to include more products on main category pages instead of dividing them across multiple pages.
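The separate sitemaps in point 3 are tied together with a sitemap index file. A minimal sketch; the filenames are hypothetical:

```python
# Sketch: a sitemap index pointing at separate sitemaps for products,
# categories, and the blog. Filenames are made up for illustration.
import xml.etree.ElementTree as ET

def build_sitemap_index(sitemap_urls):
    index = ET.Element("sitemapindex",
                       xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
    for loc in sitemap_urls:
        sm = ET.SubElement(index, "sitemap")
        ET.SubElement(sm, "loc").text = loc
    return ET.tostring(index, encoding="unicode")

index_xml = build_sitemap_index([
    "https://example.com/sitemap-products.xml",
    "https://example.com/sitemap-categories.xml",
    "https://example.com/sitemap-blog.xml",
])
print(index_xml)
```

Submitting just the index file to Webmaster Tools also lets you see indexation rates per section, which helps with the monitoring advice above.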
Well, if so, there are steps to do it yourself, but the most important step, in my opinion, is URL inspection. Once you have done that well, delivering your sitemap to Google helps, because the URLs will be queued up for crawling, and all the URLs in it should get indexed quickly if they meet Google's quality standards.
Besides, if you have any questions, give me a call: https://clarity.fm/joy-brotonath