My web program is an RSS reader. It fetches new items from about 20,000 RSS feed URLs and stores them in MariaDB on disk. The index key is the feed item's content URL: if an item's URL is already in the index, the item is not saved again. The problem is that this kind of crawling puts a heavy load on the hard disk, and the server crashes within a few days. How can I crawl with less disk load? System: 4 GB of memory; 60 million crawled and stored URLs; over 60 GB of data. The database engine is MyISAM, not InnoDB.
function detect($link, $url_id) {
    $conn = connect();
    try {
        /* $link is the indexed content URL */
        $result = $conn->query(
            "SELECT link FROM wc_xml_post"
            . " WHERE link='" . realEncode($conn, $link) . "'"
            . " AND url_id='" . $url_id . "'"
        );
        if (!$result) {
            throw new Exception("Database access failed: " . mysqli_error($conn), 16);
        }
    } catch (Exception $e) {
        // was $e->getCode (a typo): getCode() must be called as a method
        writeLog($e->getMessage() . " " . $e->getLine(), $e->getCode());
        return false;
    }
    if (mysqli_num_rows($result)) {
        $row = mysqli_fetch_assoc($result);
        echo "\033[31mCrawled\033[0m " . $row['link'] . "\n";
        return true;
    }
    return false;
}
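For context, here is a minimal sketch of how detect() might be used in the crawl loop described above, so that already-indexed items are skipped before any write happens. fetchFeedItems() and insertPost() are hypothetical helper names for illustration; they are not part of the original code.

```php
<?php
/* Hypothetical crawl loop (sketch, not the original program's code).
   fetchFeedItems() and insertPost() are assumed helpers; detect() is
   the duplicate check shown above. */
foreach (fetchFeedItems($feedUrl) as $item) {
    // detect() returns true when this link is already stored for this feed,
    // so the item is skipped and no disk write occurs
    if (detect($item['link'], $urlId)) {
        continue;
    }
    insertPost($conn, $item, $urlId); // hypothetical insert helper
}
```

Note that this pattern still issues one SELECT per item, which is part of the disk load being asked about; it only avoids redundant writes, not redundant reads.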
This content, along with any associated source code and files, is licensed under The Code Project Open License (CPOL)