How to efficiently load live results from multiple data sources



I'm building a price comparison website (using PHP & MySQL) whose live search results are sourced from XML data feeds supplied by partner sites. Potentially, 20+ feeds could be called per search request.


Currently, a site visitor selects their search options, then a PHP script requests the XML data (based on the given search options) from the partner sites, formats the results and sends them straight to the browser.
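To make the flow concrete, here is a minimal sketch of what that script does per request. The feed schema (`<products>`/`<product>` with `name` and `price`) is an assumption for illustration; the real structure is whatever the partners supply:

```php
<?php
// A sample feed body, standing in for what one partner site returns.
// In the live script this body would come from an HTTP request, e.g.
// file_get_contents($partnerUrl) with a short timeout, run once per
// partner (20+ times per search).
$sampleFeed = <<<XML
<products>
  <product><name>Widget</name><price>9.99</price></product>
  <product><name>Gadget</name><price>4.50</price></product>
</products>
XML;

// Parse one feed's XML into a flat array of results.
function parseFeed(string $xmlBody): array
{
    $results = [];
    $xml = simplexml_load_string($xmlBody);
    if ($xml === false) {
        return $results;               // skip feeds that fail to parse
    }
    foreach ($xml->product as $p) {    // assumed feed schema
        $results[] = ['name' => (string) $p->name, 'price' => (float) $p->price];
    }
    return $results;
}

// Merge results from all feeds, then sort cheapest-first before output.
$all = parseFeed($sampleFeed);
usort($all, fn($a, $b) => $a['price'] <=> $b['price']);
// $all[0]['name'] === 'Gadget' (cheapest first)
```

The point of spelling this out is that every search repeats the fetch-and-parse work for every partner, which is exactly where the cost grows with traffic.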


The system works well in testing with one user, but my concern is that as traffic grows, every search will trigger 20+ live feed requests and slow the site down considerably.


I've calculated that the underlying data needs to be refreshed at most once every 10 minutes.


Can anyone recommend a caching technique using PHP and MySQL, or just some way to make this system more efficient? Perhaps every 10 minutes I should run a script that stores the results of the most common searches in a MySQL summary table?
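For reference, here is roughly what I have in mind, sketched as an on-demand cache rather than a scheduled job: key each search by a hash of its options, store the formatted results with a timestamp in a MySQL table, and only call the partner feeds when the cached row is missing or older than 10 minutes. The table name, columns and `$pdo` connection are placeholders, not an existing schema:

```php
<?php
// Illustrative cache-aside sketch. Assumed table:
//
// CREATE TABLE search_cache (
//   cache_key  CHAR(32) PRIMARY KEY,
//   results    MEDIUMTEXT NOT NULL,
//   fetched_at DATETIME NOT NULL
// );

const CACHE_TTL = 600; // 10 minutes, per the refresh estimate above

// Same search options (in any order) must map to the same cache row.
function cacheKey(array $searchOptions): string
{
    ksort($searchOptions);
    return md5(serialize($searchOptions));
}

// Return cached results if fresh, otherwise fetch live and store them.
function getResults(PDO $pdo, array $options, callable $fetchLive): string
{
    $key = cacheKey($options);

    $stmt = $pdo->prepare(
        'SELECT results FROM search_cache
         WHERE cache_key = ?
           AND fetched_at > (NOW() - INTERVAL ' . CACHE_TTL . ' SECOND)'
    );
    $stmt->execute([$key]);
    $cached = $stmt->fetchColumn();
    if ($cached !== false) {
        return $cached;                // fresh enough: no feed calls made
    }

    $results = $fetchLive($options);   // hit the 20+ partner feeds

    $pdo->prepare(
        'REPLACE INTO search_cache (cache_key, results, fetched_at)
         VALUES (?, ?, NOW())'
    )->execute([$key, $results]);

    return $results;
}
```

This variant only caches searches that people actually run; the scheduled version I describe above (pre-warming the most common searches every 10 minutes) could reuse the same table, with a cron script calling the live fetch and `REPLACE INTO` for each popular option set.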


Many thanks.

