AbanteCart Community

Shopping Cart Operations => Support => Topic started by: smallfish on December 29, 2016, 12:32:47 AM

Title: Big data caused memory exhausted
Post by: smallfish on December 29, 2016, 12:32:47 AM
2016-12-29 5:11:55 - error:  AbanteCart core v.1.2.9 Allowed memory size of 536870912 bytes exhausted (tried to allocate 15757865 bytes) in <b>core/lib/cache.php</b> on line <b>221</b>
I have increased the memory_limit to 512MB and even 1024MB, but that did not solve it. I have 380 thousand products (test data) in the database.
This line
      if (!is_null($data) && $this->enabled && $this->cache_driver && $this->cache_driver->isSupported()){
         $data = serialize($data);
causes the problem. If I disable "Use SEO URL's" or disable the cache in the backend settings, this error does not occur.
I have enabled XCache to speed up the loading of the product listing pages, since I have 380 thousand products in the database.
Please try to fix this problem and improve the code.
Thank you!
Title: Re: Big data caused memory exhausted
Post by: eCommerce Core on December 29, 2016, 08:16:37 AM
Can you tell us what operation / request causes this issue? What version of AbanteCart is this?
Title: Re: Big data caused memory exhausted
Post by: smallfish on December 29, 2016, 10:19:55 AM
AbanteCart v.1.2.9, SEO enabled in the backend, XCache enabled, 380 thousand dummy products inserted. Then I tried to open the front store, but it failed with the above error.
If I disable SEO in the backend, there is no problem.
Title: Re: Big data caused memory exhausted
Post by: eCommerce Core on December 29, 2016, 10:45:53 PM
Let's try to pinpoint the problem. Can you try file based cache or memcache with SEO on?

Title: Re: Big data caused memory exhausted
Post by: smallfish on December 30, 2016, 02:37:24 AM
The file-based cache also has this problem. The problem is still caused by serialize(). AbanteCart needs to read the big data (here it should be the table abc_url_aliases) in a few passes, limiting the maximum number of records each time. Otherwise serialize() can't handle it.
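The batching idea described above could look roughly like this. This is only a sketch, not AbanteCart's actual code: fetch_batch() stands in for a real "SELECT ... LIMIT offset, size" query against abc_url_aliases, and the batch size is a hypothetical value.

```php
<?php
// Sketch: instead of loading the whole abc_url_aliases table into one array
// and serialize()-ing it in a single huge allocation, process it in
// fixed-size batches so peak memory stays bounded.
function fetch_batch($offset, $size, $total) {
    // stand-in for: SELECT query, keyword FROM abc_url_aliases LIMIT $offset, $size
    $rows = array();
    for ($i = $offset; $i < min($offset + $size, $total); $i++) {
        $rows[] = array('query' => "product_id=$i", 'keyword' => "Keyword-$i");
    }
    return $rows;
}

$total      = 25000;  // pretend row count for the demo
$batch_size = 10000;  // hypothetical batch size
$processed  = 0;
for ($offset = 0; $offset < $total; $offset += $batch_size) {
    $rows = fetch_batch($offset, $batch_size, $total);
    // each batch is serialized (and could be cached) on its own,
    // so no single serialize() call ever sees all rows at once
    $chunk = serialize($rows);
    $processed += count($rows);
    unset($rows, $chunk); // release the batch before fetching the next
}
echo $processed; // 25000
```

The point of the unset() is that only one batch is resident at a time; with one serialize() over the whole table, the array and its serialized string coexist in memory, roughly doubling the peak.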
Title: Re: Big data caused memory exhausted
Post by: eCommerce Core on December 31, 2016, 08:56:10 AM
Thank you for the update. I will check this and look for a solution.
Title: Re: Big data caused memory exhausted
Post by: eCommerce Core on December 31, 2016, 08:33:43 PM
SEO URLs are not cached. We need to find out which cache in particular is oversized.
Basically, that part will need to be restructured a bit to handle such a large number of products.

Can you please check the size of the files in the cache directory and locate the 3 largest files?
Title: Re: Big data caused memory exhausted
Post by: abantecart on January 01, 2017, 03:36:51 PM
It would be a great exercise to tune AbanteCart for 380K products. Anxious to see it working.

Smallfish, let's work together on this.
Title: Re: Big data caused memory exhausted
Post by: smallfish on January 03, 2017, 02:19:51 AM
The largest files in each cache directory:
/cache/category/ is 13KB
/cache/extension/ is 15KB
/cache/html_cache/ is 15KB
/cache/resource/ is 12KB
/cache/setting/ is 32KB

Here is the script to generate the 380K records (it needs to be run 10 times, because the script times out after 3xxxx records on my server).
You can use it to generate the dummy records on your testing server.
=============================================
<?php
header("Content-Type: text/html; charset=utf-8");

// Database .............
//ini_set('max_execution_time', '3600');
$server = 'localhost:3306';
$database = 'yourdatabasename'; //change this
$datauser = 'databaseusername'; //change this
$password = 'password';         //change this
$connection = mysql_connect($server,$datauser,$password);

if (!$connection) {
die('Could not connect to MySQL database, the server return the error: '.mysql_error());
}

$db = @mysql_select_db($database);
@mysql_query("SET CHARACTER SET UTF8");

if (!mysql_query("SET NAMES 'utf8'", $connection))
return false;



for ($i=1; $i<=50000; $i++) {   // advance the $i range for each run of this script, e.g. for the 2nd run use $i=50001; $i<=100000; I need to run this script 10 times because it times out after generating 3xxxx records
   $d = date("Y-m-d");
   $t = date("Y-m-d H:i:s");
   $end_date = date("Y-m-d", mktime(0, 0, 0, date("m"),   date("d"),   date("Y")+20));
   $itemcode = "Item " . substr('0000000000' . $i, -8);
   $athumb =  "catalog/demo/htc_touch_hd_1.jpg";
   $brand = 15;
   $price = 100;
   $points = 100;
   $cost = 60;
   $weight = 1;
   $weburl = "location A";
   $topcat = 43;
   $maincat = 48;

   mysql_query("INSERT INTO abc_products (location, model, quantity, stock_status_id, manufacturer_id, shipping, price, weight, weight_class_id, status, date_available, date_added, date_modified, cost) VALUES ('$weburl', '$itemcode', '100', '3', '$brand', '1', '$price', '$weight', '1', '1', '$d', '$t', '$t', '$cost') ");
   $product_id = mysql_insert_id();
   $goods_id = $product_id;
   $company =0;
   mysql_query("INSERT INTO abc_products_to_categories (product_id, category_id) VALUES ('$goods_id', '$maincat') ");
   mysql_query("INSERT INTO abc_products_to_categories (product_id, category_id) VALUES ('$goods_id', '$topcat') ");
   mysql_query("INSERT INTO abc_products_to_stores (product_id, store_id) VALUES ('$product_id', '$company') ");

   $title = "Title " . $itemcode;
   $alldesc1 = "商品說明 " . $itemcode;  //Chinese name
   $alldesc2 = "Product Description " . $itemcode;  //English name
   $picpath2 = "18/79/1.jpg";

   mysql_query("INSERT INTO abc_resource_library (type_id, date_added, date_modified) VALUES ('1', '$t', '$t')");
   $resource_id = mysql_insert_id();
   mysql_query("INSERT INTO abc_resource_map (resource_id, object_name, object_id, sort_order, date_added, date_modified) VALUES ('$resource_id', 'products', '$product_id', '2', '$t', '$t') ");

   mysql_query("INSERT INTO abc_resource_descriptions (resource_id, language_id, name, title, description, resource_path, resource_code, date_added, date_modified) VALUES ('$resource_id', '1', '$alldesc1', '$title', '', '$picpath2', '', '$t', '$t')");
   mysql_query("INSERT INTO abc_resource_descriptions (resource_id, language_id, name, title, description, resource_path, resource_code, date_added, date_modified) VALUES ('$resource_id', '2', '$alldesc2', '$title', '', '$picpath2', '', '$t', '$t')");

   $keyword = "Keyword-" . substr('0000000000' . $i, -8);
   $key1 = "product_id=$product_id";
   $keyw1 = $keyword . "-en";
   $keyw2 = $keyword . "-hk";

   mysql_query("INSERT INTO abc_url_aliases  (language_id, query, keyword) VALUES ('1', '$key1', '$keyw1') ");
   mysql_query("INSERT INTO abc_url_aliases  (language_id, query, keyword) VALUES ('2', '$key1', '$keyw2') ");


   mysql_query("INSERT INTO abc_product_descriptions (product_id, language_id, name, meta_keywords, meta_description, description) VALUES ('$product_id', '1', '$title', '$title', '$title', '$alldesc1')" );
   mysql_query("INSERT INTO abc_product_descriptions (product_id, language_id, name, meta_keywords, meta_description, description) VALUES ('$product_id', '2', '$title', '$title', '$title', '$alldesc2')" );


   $mprice = 99;
   $vprice = 98;
   $vvprice = 97;
   mysql_query("INSERT INTO abc_product_discounts (product_id, customer_group_id, quantity, price) VALUES ('$product_id', '8', '1', '$mprice') ");  //reg. member price;
   mysql_query("INSERT INTO abc_product_discounts (product_id, customer_group_id, quantity, price) VALUES ('$product_id', '9', '1', '$vprice') ");  //vip. member price;
   mysql_query("INSERT INTO abc_product_discounts (product_id, customer_group_id, quantity, price) VALUES ('$product_id', '10', '1', '$vvprice') ");  //wholesale. member price;



   mysql_query("INSERT INTO abc_product_options (product_id, status, element_type, required) VALUES ('$product_id', '1', 'S', '1') ");
   $product_option_id = mysql_insert_id();   

   $oname = "color";
   mysql_query("INSERT INTO abc_product_option_descriptions (product_option_id, language_id, product_id, name) VALUES ('$product_option_id', '1', '$product_id', '$oname')");
   mysql_query("INSERT INTO abc_product_option_descriptions (product_option_id, language_id, product_id, name) VALUES ('$product_option_id', '2', '$product_id', '$oname')");

   $optprice =0;
   $weight = 0;

   for ($h = 1; $h <= 10; $h++) {

          mysql_query("INSERT INTO abc_product_option_values (product_option_id, product_id, quantity, subtract, price, prefix, weight, weight_type, attribute_value_id, sort_order)
   VALUES ('$product_option_id', '$product_id', '50', '0', '$optprice', '$', '$weight', 'kg', '0', '0')");
          $product_option_value_id =  mysql_insert_id();       
          $color = 50;
          $optprice = 0;

      $optvalue = "Color " . $h;
      mysql_query("INSERT INTO abc_product_option_value_descriptions (product_option_value_id, language_id, product_id, name, grouped_attribute_names) VALUES ('$product_option_value_id', '1', '$product_id', '$optvalue', '')" );
      mysql_query("INSERT INTO abc_product_option_value_descriptions (product_option_value_id, language_id, product_id, name, grouped_attribute_names) VALUES ('$product_option_value_id', '2', '$product_id', '$optvalue', '')" );   

      
   }

}


echo "end of program...";
die();

?>
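As an aside, the 10 manual runs could probably be avoided: when PHP runs from the command line (php generate.php), max_execution_time defaults to 0 (unlimited), and for a web run the restart offsets can be computed rather than edited by hand each time. A sketch (run_ranges() and the per-run size of 50000 are my own, matching the loop above):

```php
<?php
// When run via the CLI, PHP's max_execution_time is 0 (unlimited) by
// default; this call lifts the cap explicitly for web runs where allowed.
set_time_limit(0);

// If the script must stay web-based and restartable, compute the $i ranges
// for each run instead of editing them by hand:
function run_ranges($total, $per_run) {
    $ranges = array();
    for ($start = 1; $start <= $total; $start += $per_run) {
        $ranges[] = array($start, min($start + $per_run - 1, $total));
    }
    return $ranges;
}

$ranges = run_ranges(380000, 50000);
echo count($ranges);   // 8 runs of up to 50000 products each
echo " ";
echo $ranges[7][1];    // the last run ends at product 380000
```

Each run would then loop from $ranges[$n][0] to $ranges[$n][1] instead of a hard-coded $i range.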

Title: Re: Big data caused memory exhausted
Post by: eCommerce Core on January 03, 2017, 08:11:54 AM
Sizes of the files are not that big. I do not think this is the issue here. I will give your script a try and we will see. Thank you.
Title: Re: Big data caused memory exhausted
Post by: smallfish on January 03, 2017, 10:17:52 AM
If you are going to redesign or restructure the cache handling, please also restructure the loading of the home page and the category / product listing pages. The loading speed is very slow when the product table is big (around 20-30 seconds, and that is with no traffic; once the website is published, pages will take even longer to load).
Title: Re: Big data caused memory exhausted
Post by: eCommerce Core on January 03, 2017, 10:28:40 PM
The main page was well optimized before. See if HTML cache improves your main page.

With 380k+ products my storefront is fast and nothing crashes. I will try to add more products tomorrow.
Title: Re: Big data caused memory exhausted
Post by: smallfish on January 04, 2017, 05:42:42 AM
I installed AbanteCart v.1.2.9 on a Debian 8 server with 32GB RAM, 2 x Xeon L5630 CPUs, 4 x 256GB SSDs (Dell H700 RAID card, RAID 10) and 6 x 4TB hard disks (RAID 10). MySQL was installed on the SSDs.
It runs on PHP 5.6.29-0+deb8u1 (FPM).
I also installed it on another Debian server with 16GB RAM, an Intel i3 and 4 x 3TB hard disks (RAID 10). The results are the same after inserting 380K products: the page shows an HTTP 500 error, and the error log shows the memory-exhausted error.
HTML cache can improve it. But if I clear the cache (because some new products were added) and visit the front page again, the loading speed is very slow.
For the category / product listing pages, loading is even slower. If the page is not cached, the first person to open it waits a long time.
By the way, did you enable "Use SEO URL's" in Settings/System in the AbanteCart backend admin?

Title: Re: Big data caused memory exhausted
Post by: eCommerce Core on January 04, 2017, 09:28:21 AM
I am trying to replicate your issue. Currently I have 415448 products loaded and I do not get any major issues. I see some slower selects in the admin, but that could be expected.
The computer has 4GB of RAM with 128MB for PHP, file cache ON and SEO ON. PHP version 5.5.6.

I will try to investigate more.
Title: Re: Big data caused memory exhausted
Post by: smallfish on January 04, 2017, 09:40:44 AM
I have set the memory_limit to 512M in php.ini again (the previous setting was 256M). This time it works without errors. I think I may have modified the wrong php.ini file previously.
I will test it again. But the load time of the category / product listing pages is too long. I hope the AbanteCart team can improve it.
I put all the dummy products (380K+) in the category "Skincare". When I click "Skincare", it takes a very long time to open the product listing page. However, even the category "Makeup" (only 6 demo products) takes a very long time to open. (With a good design, 6 products should open within 1 second.)
Thank you!
Title: Re: Big data caused memory exhausted
Post by: smallfish on January 04, 2017, 10:33:06 PM
This morning I tested again, now with 450K products. The 1st and 2nd time there was no problem, but after I cleared all the caches a few times and set sorting to Date New -> Old, the HTTP 500 error (memory exhausted) occurred again. After I increased memory_limit to 512M in php.ini, no further errors occurred.
I am running the test on a production server, so the other live websites may be competing for resources.
Last night I also created 1150K products, and there were no errors even with memory_limit set to 256M...
I am confused right now and don't know where the problem is.
Anyway, the loading speed of the home page and category / product listing pages is really slow on the first visit. If cache is enabled, it is fast once the page is cached (2nd visit, no data refresh, no cache expiration). I hope the AbanteCart team can improve the loading speed of the home page and the category / product listing pages (the amount of data in one page is small and should load fast, just like phpMyAdmin, which browses a large table quickly without delay).
I am also worried about the maximum number of concurrent processes. My server has 16GB RAM, which limits the number of visitors, and memory could be exhausted easily with just a few visitors opening pages simultaneously.
Title: Re: Big data caused memory exhausted
Post by: smallfish on January 04, 2017, 11:17:05 PM
I have cleared all the caches again and open the store front home page immediately.
This time, the following error occurs:

aban2/static_pages/index.php
==========================
There has been a critical error processing your request

SQL Error: MySQL server has gone away
Error No: 2006
SQL: DELETE FROM `abc_online_customers` WHERE `date_added`< (NOW() - INTERVAL 1 HOUR) in /aban2/core/database/amysqli.php on line 108
==========================

After waiting 30 seconds, going to /aban2 (the home page) and opening the storefront home page again, there is no problem and the page is cached.
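For reference, a "MySQL server has gone away" (error 2006) during a long cache rebuild is usually a timeout or packet-size issue on the MySQL side rather than an application bug. The usual my.cnf knobs are below; the values are illustrative examples, not recommendations:

```ini
[mysqld]
# how long an idle connection is kept open before the server drops it
wait_timeout       = 600
# largest single query/result packet the server will accept
max_allowed_packet = 64M
```

If a long-running PHP request keeps a connection idle past wait_timeout (for example while rebuilding caches), the next query on that connection fails with exactly this error.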
Title: Re: Big data caused memory exhausted
Post by: eCommerce Core on January 05, 2017, 09:03:54 PM
Looks like your MySQL server is the root of your problems.

FYI, I am working on some improvements for large product databases. I will post an update here soon.
Title: Re: Big data caused memory exhausted
Post by: abantecart on January 05, 2017, 09:15:18 PM
This is great. I wonder if we should consider an eCommerce site with 380 thousand products a large-scale site. What is next? The entire Amazon database? :)
Usually, sites with large catalogs and volumes require special hardware and administration in addition to software tuning. This is a costly exercise.

Our prior performance-testing focus was on fast-growing customers and orders. Half a million products was never tested, and it is great that AbanteCart can handle it.
Title: Re: Big data caused memory exhausted
Post by: smallfish on January 05, 2017, 09:57:04 PM
Maybe it is because I have 6 live websites requesting data from MySQL at the same time, so AbanteCart had problems accessing MySQL at that moment.
I think MySQL should be tuned for performance, but I am not familiar with that...
It is good news to hear that you are improving AbanteCart to handle large product databases.
Thank you!
Title: Re: Big data caused memory exhausted
Post by: smallfish on January 05, 2017, 10:14:11 PM
From my testing, AbanteCart v.1.2.9 is better than v.1.2.7 and earlier: it can handle a large product database, though not ideally. Still, it is better than OC, ZC or even some paid shopping carts. I agree with you that handling large catalogs and volumes requires more resources. But I have used a free Chinese shopping cart that already handles 1000K+ products smoothly, without powerful hardware or special software requirements, and it was developed at least 10 years ago. (It uses the Smarty template engine and applied caching from the beginning, so it handles a large product database efficiently and opens the home page and category / product listing pages very fast. Its database is not heavily normalized, so it generates fewer queries, which means it runs faster and uses fewer resources.) I am still using it, but I want a shopping cart that provides multi-store, multi-language and multi-currency functionality. Now AbanteCart may be the choice, as I have been watching it for a few years already.
Title: Re: Big data caused memory exhausted
Post by: eCommerce Core on January 06, 2017, 10:03:24 PM
There are a number of articles about MySQL performance tuning:
http://dba.stackexchange.com/questions/136349/mysql-performance-tuning-for-myisam

We already have plans to add the InnoDB engine to boost MySQL performance, and also to support PostgreSQL.
The timeframe is not yet clear, as this requires a lot of effort and needs funding.
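Since the stock AbanteCart schema of that era is MyISAM, tuning mostly comes down to a handful of my.cnf settings. A sketch with illustrative values only; they need to be sized against the RAM actually available on the server:

```ini
[mysqld]
# MyISAM index cache; the single most important MyISAM setting
key_buffer_size     = 2G
# keep frequently used tables open instead of reopening them
table_open_cache    = 4000
# in-memory temporary tables for big sorts and GROUP BYs;
# the effective limit is the smaller of these two values
tmp_table_size      = 256M
max_heap_table_size = 256M
```

Unlike InnoDB's buffer pool, key_buffer_size caches only indexes; MyISAM data pages rely on the operating system's file cache, which is why putting the data files on SSD (as smallfish did) helps so much.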
Title: Re: Big data caused memory exhausted
Post by: eCommerce Core on January 09, 2017, 01:26:20 AM
Here are some improvements for the case of a very large product database:

https://github.com/abantecart/abantecart-src/commit/63c32b5215bb0162adf20d472d05795e633cfb9d

Give it a try.
Title: Re: Big data caused memory exhausted
Post by: smallfish on January 09, 2017, 11:38:28 AM
I have tested it with 450K products.
Backend product browsing is improved. It previously took 13-15 seconds to jump to the next page; now only 4-5 seconds.

The storefront home page: it previously took 1-2 minutes to load; now 30 seconds (after clearing all the caches and reloading).
The category / product listing page: it previously took 15-20 seconds to load; now 12-14 seconds.
Browsing page by page in a category (jumping to the next page): previously 15-20 seconds; now also 12-14 seconds.
However, opening the product listing page of a category with only 10-20 products takes the same time as a category with 450K products...
Opening a product detail page: no change, it takes 4-6 seconds.

In fact, the front store failed to load the home page after adding the new indexes to the tables. (I had uploaded the AbanteCart v.2 /admin and /core directories over my AbanteCart v.1.2.9, but forgot to overwrite /storefront.) The error was HTTP 500, with memory exhausted in the error log. After I increased memory_limit to 512M in php.ini, it worked.
It seems the new indexes also use some memory. But after I uploaded the AbanteCart v.2 /storefront over v.1.2.9, it only needs 384M.

Besides, there is a mistake in public_html/install/abantecart_database_upgrade.sql (AbanteCart v.2.0):
CREATE INDEX `ac_products_status_idx` ON `abc_products` (`product_id`, `status`, `date_available`);
should be corrected to:
CREATE INDEX `ac_products_status_idx` ON `ac_products` (`product_id`, `status`, `date_available`);

Title: Re: Big data caused memory exhausted
Post by: smallfish on January 09, 2017, 12:11:59 PM

This is for your reference : https://www.selikoff.net/2008/11/19/why-too-much-database-normalization-can-be-a-bad-thing/
Title: Re: Big data caused memory exhausted
Post by: eCommerce Core on January 10, 2017, 05:37:35 PM
The improvements are there, but I think this is the maximum we can squeeze out of MySQL and MyISAM on the code side for this case.

I would like us to work on Postgres support and some other tools to make AbanteCart EE (Enterprise ECommerce) ready, but this will require some funding.

I fixed the typos. Thank you.
Title: Re: Big data caused memory exhausted
Post by: smallfish on January 11, 2017, 04:14:44 AM

This is for your reference : https://www.digitalocean.com/community/tutorials/sqlite-vs-mysql-vs-postgresql-a-comparison-of-relational-database-management-systems
Title: Re: Big data caused memory exhausted
Post by: eCommerce Core on January 12, 2017, 08:19:29 AM
Thank you for sharing.