Size matters: depending on your web server there are limits on the number of items you can have in your database and on the size of the product feeds. How big you can go really depends on the server.
Feed import limitations
The import is quite memory efficient: items are parsed one by one, so memory should not be an issue when importing feeds.
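The one-by-one parsing can be sketched in Python (the importer itself is PHP; the feed content and column names below are made up):

```python
import csv
import io

# A tiny in-memory stand-in for a downloaded product feed (made-up columns).
feed = io.StringIO("id,title,price\n1,Hotel A,99.00\n2,Hotel B,149.00\n")

count = 0
for row in csv.DictReader(feed):  # rows are yielded one at a time, not loaded all at once
    count += 1  # a real importer would insert the row into the database here
print(count)  # prints 2
```

Because only the current row is held in memory, the feed can be far larger than the script's memory limit.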
The main issue when importing datafeeds (using the web interface) is time; time limits also apply to the feed configuration. Shared web hosts often have a time limit of 30 seconds for running web scripts. If this limit is exceeded you will get an error message, an error 500 page or even a blank page.
What might help:
- Use CSV files instead of XML.
- Enable 'curl' as the download method: the feeds are fetched first and parsed in a second step. The feeds are cached as well, so if the first import fails the second run might succeed, since no download is needed for the second run. Add define('FORCE_CURL','yes'); to your feeds.php.
- Download the files manually and put them on your server. Then use a path instead of a URL in the feed configuration, for example: /var/www/htdocs/affiliatfeeds.nl/feeds/hotels.csv (the path depends on your server).
- During feed configuration, limit the number of items in the feed; for Zanox, for example, you can use the 'testmode'. After configuration, remove the limitation and run the import from the CLI. If you must run the import from the web, keep the limitation or fetch a queried feed.
- Use ini_set('max_execution_time',120); in your feeds.php or in your server settings (this might be disabled by your host). Increase 120 if necessary.
- Buy or rent a better server.
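Putting the feeds.php tweaks from the list together, the file might start like this (a sketch; whether ini_set is honoured depends on your host):

```php
<?php
// Raise the script time limit to 120 seconds; some hosts disable this.
ini_set('max_execution_time', 120);

// Fetch feeds with curl first, then parse the cached copy in a second step.
define('FORCE_CURL', 'yes');
```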
Item display limitations
Most (shared) hosts are not designed to handle large databases; these servers are tuned to run a lot of small websites. How far you can go really depends on the server, especially its memory and MySQL settings.
For shared hosting it is recommended to:
- Limit the number of items to a few thousand
- Do not use full-text (title/description) search
- Limit the number of modules on a page
- Do not use LIKE or REGEXP queries
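To see why LIKE and REGEXP queries are expensive, the sketch below (Python with SQLite standing in for MySQL; the table and column names are made up) asks the database for its query plan: an exact match can use an index, while a LIKE with a leading wildcard forces a scan of every row.

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE items (id INTEGER PRIMARY KEY, title TEXT)")
con.execute("CREATE INDEX idx_title ON items (title)")

# Exact match: the database can use the index (a cheap SEARCH).
plan_exact = con.execute(
    "EXPLAIN QUERY PLAN SELECT id FROM items WHERE title = 'hotel'"
).fetchall()

# LIKE with a leading wildcard: no index can help, every row is read (a SCAN).
plan_like = con.execute(
    "EXPLAIN QUERY PLAN SELECT id FROM items WHERE title LIKE '%hotel%'"
).fetchall()

print(plan_exact[0][-1])  # e.g. "SEARCH items USING COVERING INDEX idx_title (title=?)"
print(plan_like[0][-1])   # e.g. "SCAN items"
```

With a few thousand items a scan is tolerable; with hundreds of thousands it will hit the time and memory limits described above on every page view.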