Messages - Geoffrey

61
Thanks!

I don't think it is a significant change. 

I just don't know which file to mod, and I don't really speak php. 

I already asked elsewhere...


62
General Support / Re: Problem with canonical urls
« on: September 14, 2017, 04:45:34 PM »
Hello, & thanks. 

I looked through some of the new files on github, but did not recognize any files that modified seo url generation.  Maybe you can point me to a particular file. 

Another approach to my goal:

Can you tell me how to modify whatever php file generates seo urls like "domain/product-seo-key",
so that it generates "domain/category-seo-key/product-seo-key" urls instead?

Shouldn't be too hard, I think.  I just don't know how to do it. 

My site doesn't use Specials, Bestsellers, Featured, New Product, or any other block to access products.  Only Category.
Therefore all product urls are domain/category/product.
If I make all my Category keys very short,
and if I make AC generate domain/category/product seo urls, then I reach my goal of having only one url per product indexed and navigated.
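
Here is the general shape of what I'm after, as a sketch only.  The helper names (getSeoKeyword, getPrimaryCategoryId) are made up by me to show the idea; they are not AbanteCart's actual code.

Code: [Select]
<?php
// Sketch only -- helper names are assumptions, not AbanteCart's API.
function buildProductSeoUrl($product_id)
{
    // seo keyword of the product, e.g. "better-bib-linen-chefs-apron-navy"
    $product_key = getSeoKeyword('product', $product_id);

    // my site assigns exactly one category per product
    $category_id  = getPrimaryCategoryId($product_id);
    $category_key = getSeoKeyword('category', $category_id);

    // domain/category-seo-key/product-seo-key instead of domain/product-seo-key
    return $category_key . '/' . $product_key;
}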

Thanks. 

63
Worked like a charm! 

I can kind of see how it works:  the <a></a> link is only applied when the count criterion is met; for the last crumb, the breadcrumb text is echoed without a link. 

Thanks!  This is great! 

I save all these tweaks in a doc, and sometimes I learn enough from them to make similar changes to other files, so thanks for broadening my horizon another step!  :-). 

I had already slightly modified my breadcrumbs.tpl to remove the little house icon from the Home crumb. 

So here is Sam's mod combined with my house-removal mod:

Code: [Select]
<?php
// show breadcrumbs only when there is more than just Home
if ($breadcrumbs && count($breadcrumbs) > 1) {
    $total = count($breadcrumbs);
    $count = 1;
?>
<section class="breadcrumbs">
<h4 class="hidden">&nbsp;</h4>
    <ul class="breadcrumb">
        <?php foreach ($breadcrumbs as $breadcrumb) {
            $count = $count + 1;
        ?>
        <li>
            <?php if ($count <= $total) { ?>
            <a href="<?php echo $breadcrumb['href']; ?>">
                <?php echo $breadcrumb['text']; ?>
            </a>
            <?php } else { /* last crumb is the current page: no link */ ?>
            <?php echo $breadcrumb['text']; ?>
            <?php } ?>
        </li>
        <?php } ?>
    </ul>
</section>
<?php } ?>

Noobs: because the tpl mod above removes the hotlink from the breadcrumb for the active page, that "cold" crumb will no longer use the hot-link color, so you may not need the css mod originally mentioned above unless you want an entirely different color for the cold crumb. 

So Sam. 
Can you modify the seo-url generation code to include category in the seo product urls?

Example: instead of domain/product-seo-key, the seo url for each product page would be domain/category-seo-key/product-seo-key. 
I have only done a cursory search for the proper file.  I've looked at storefront/model/tool/seo_url.php, but I'm not sure that is the correct file.

I'm aware this will be a core edit.  I currently have 4 core edits on my site, and a detailed list of how to replicate them in the event that they get overwritten by an update.  So I don't mind adding another to the list. 

Many thanks!  This is great!  8)

64
SEO / Re: CSS Management, pageload speed, & BLOAT
« on: September 13, 2017, 10:14:10 PM »
In head.tpl,

Line 48 is a comment: "Set $faster_browser_rendering == true; for loading tuning."

Does that mean set it to true and leave it set to true, or

set it to true for load tuning, but set it back to false for production?


65
General Support / Re: Problem with canonical urls
« on: September 13, 2017, 09:21:33 PM »
Hello.
Let me post a few thoughts.
1. Your products can be assigned to one category, to several categories, or to no category at all.
Valid product urls like www.domain/product_seo_key can be reached from blocks (specials, featured, bestsellers), from search listings, and from breadcrumbs. The other urls, www.domain/category_seo_key/product_seo_key, are also valid.

2. In 1.2.11 (now in beta testing), if seo urls are enabled, a canonical meta tag will appear on every product page and will point google or any bot to www.domain/product_seo_key
So you can grab the code from source to see how it is implemented: https://github.com/abantecart/abantecart-src/tree/1.2.11

This feature must already be in place in 1.2.10, because that is what is happening on my site already. 
Navigate from the homepage to a product page and you will be at domain/category-seo-key/product-seo-key, but the canonical meta on that page will be domain/product-seo-key.
When I asked at GSC, the response was that my page meta was pointing google at a different page than the one I was on, that the different page was identical to the one I was on, and that google considered this to be duplicate content. 
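
To make the mismatch concrete, here is roughly what one of my product pages renders (the apron example from my original post):

Code: [Select]
<?php /* url reached by navigation:
   https://www.inspired-designco.com/better-bib-linen-chefs-apron/better-bib-linen-chefs-apron-navy
   but the head of that same page declares: */ ?>
<link rel="canonical" href="https://www.inspired-designco.com/better-bib-linen-chefs-apron-navy"/>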

3. Free online seo url generators are not the best way to create a sitemap for google

Strange. 
If a noob builds a site with AC, the noob will eventually learn by reading this forum that AC creates an html sitemap for site users but not an xml sitemap for bots.  Various threads and comments (by moderators on this forum) encourage the noob to use any of the available online sitemap builders, because a sitemap is simply a list of the site's urls, so the source is not important.  What is not said: just about any sitemap builder will output a sitemap with at least two different urls for identical content (three in my case, see the red-letter comments above), and this will be a problem for search engine bots.  The noob should therefore consider editing the sitemap to remove all urls to duplicate content before submitting it to GSC; a rough sketch of that cleanup follows. 
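
The cleanup can be scripted rather than done by hand.  This is only a sketch: it assumes you list your category seo keys yourself, so any other single-segment url can be treated as a truncated product url and dropped; file names and the key are examples.

Code: [Select]
<?php
// Sketch: drop truncated single-segment product urls from a sitemap.
// List your category seo keys so their legitimate single-segment
// urls survive the filter.
$category_keys = array('better-bib-linen-chefs-apron'); // example key

$doc = new DOMDocument();
$doc->load('sitemap.xml');

// copy the live node list first, so removals don't upset iteration
$urls = iterator_to_array($doc->getElementsByTagName('url'));
foreach ($urls as $url) {
    $loc      = $url->getElementsByTagName('loc')->item(0)->nodeValue;
    $path     = trim((string)parse_url($loc, PHP_URL_PATH), '/');
    $segments = ($path === '') ? array() : explode('/', $path);

    // exactly one segment that is not a category => truncated product url
    if (count($segments) === 1 && !in_array($segments[0], $category_keys, true)) {
        $url->parentNode->removeChild($url);
    }
}
$doc->save('sitemap.clean.xml');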

This then leads to the decision of whether to keep the "organic / native" urls, or use the seo urls. 

None of this is mentioned in the docs or anywhere on this forum. 

I suspect Nimitz has the optimal solution.  Don't use seo urls. 

Importantly, until discourse like this is published, how will anyone ever know this stuff? 

66
Finally had a chance to look at site again today. 

Sam's recommendation will change the color of the breadcrumb for the current page, but it won't make it "cold".   It's still a hot link. 

It shouldn't be a link at all. 

Any suggestions? 

67
Hey, thanks Sam!  I'll give it a shot. 

68
General Support / Re: Problem with canonical urls
« on: September 09, 2017, 01:39:01 PM »
A partial explanation is that this problem stems from the old duplicate content problem inherited from open cart:

https://isenselabs.com/posts/how-to-solve-the-duplicate-content-issue-in-opencart

It's funny that so many "experts" are running AC sites and there is no mention of this problem on this forum. 
The only person so far that I have seen mention the probable best strategy for AC seo-urls is Nimitz1061: don't use them.

If you use AC seo-urls, AC is creating at least one additional pathway to each product page, which google does not like. 
Two pathways to the same content is known as "duplicate content". 
This lowers google's overall "confidence score" in your site, which lowers your "authority" rank, which lowers your actual index rank. 
Whether you like it or not.

With AC seo-urls enabled, a normal "native" product url should be long: category/list/product, or L1/L2/L3. 
But seo_url.php was written to create a truncated short url: category/product, or L1/L3.
This means the bot will see duplicate content when it crawls your site.  Every Product Page will have two urls: the long one and the short one. 

Here's the crazy part: AC then writes a rel="canonical" into the head of each product page that describes the new short url as the canonical url!

This is wildly counter-intuitive.

It tells search bots to index the non-native short L1/L3 url, which is the same as telling the bot to discard the actual site-created, user-navigated native L1/L2/L3 url because it is a link to duplicate data. 

If a visitor comes to your homepage, clicks a category, and then clicks a product within that category, they will be navigating to a url that is not indexed by google.  Think about that. 

All of your internal navigation takes visitors to un-indexed pages.  This is thanks to the opencart problem of truncated urls, plus AC's canonical habit of telling search bots that the canonical url is not the native url, but a code-generated truncated url that is never produced by any form of navigation. 

Regarding the "red text" weird urls mentioned in OP, I still have no idea where they are coming from. 

Solutions:
1 - AC devs: tell me exactly how (in comprehensive, non-coder language and steps) to rewrite /storefront/model/tool/seo-url.php so that seo-urls are long-form L1/L2/L3 urls that match the natively created product page urls; AND tell me exactly where and how to change the rel="canonical" head element to match the same native long-form url (see the sketch after this list). 
The result of these changes: seo-urls become "longer", but they match the urls created by internal site navigation, and the canonical url declared in the head of each product page also matches them.  AC will no longer confuse google and lower the authority of all AC sites by specifying canonicals that don't match native product urls, and it will no longer confound users who wonder why sitemaps have two sets of urls for each product page, or why google threw out real urls while indexing fake urls that never occur during normal site navigation.

2 - I disable seo-urls. 
The value of seo-urls is debated.  They don't help the bot.  The bot will index every page regardless of seo-urls.  They "look better" in the address bar.  I guess.  Beauty is in the eye of the beholder? 
Is Amazon harmed by its rather non-pretty product urls? 
Theoretically, breadcrumbs offer adequate location-and-page-awareness for site visitors. 
Theoretically, product meta tag keywords and descriptions will capture the eye of search-engine users who find a site in search results, which means that they won't necessarily be put off by the url in the search result, regardless of whether it is a pretty url.
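
Here is the sketch promised in option 1.  It is illustrative only: I have not found the real file, and both function names are assumptions, but it shows the point that the seo url and the canonical must come from the same builder so they can never diverge.

Code: [Select]
<?php
// Sketch only: getCategorySeoKey() and getProductSeoKey() are
// hypothetical names, not AbanteCart functions.
function buildLongSeoPath($product_id)
{
    // L1/L2/L3: category-seo-key/product-seo-key, matching the url
    // a visitor actually navigates to
    return getCategorySeoKey($product_id) . '/' . getProductSeoKey($product_id);
}

$product_id = 42; // example id
$native_url = 'https://www.domain.com/' . buildLongSeoPath($product_id);

// the canonical in the head must be the very same string:
echo '<link rel="canonical" href="' . $native_url . '"/>';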

My personal dilemma is this: I built the site and published it more than a month ago.  Google indexed it all wrong thanks to the AC seo-url flaws, and lowered my site authority because of it.  This took almost a month to occur.  Now I have to start over, and according to google it may take months before all of the bad urls drop off, unless I use the remove-url tool to remove all my pages.  But if I remove all the bad urls, put up a repaired sitemap, and ask for a recrawl, the bot will still struggle to make sense of it all, because I basically just removed and then reinstalled all the same content with different urls, which makes me look like a spammer site and therefore lowers my site authority even more. 

Niiice.

That's my beef Mr. AbanteCart. 

Coding an application that does stuff is never enough, not for any coder. 
The application has to do stuff that actually works in the real world. 
Or at least causes no harm. 

69
General Support / Problem with canonical urls
« on: September 08, 2017, 03:19:37 PM »
www.inspired-designco.com

My site is pretty simple, 4 "levels".

Level 1 = Home page = Category List
Level 2 = Product Listing
Level 3 = Product Page
Level 4  = Cart/Checkout/etc

On Level1 Home page, you choose a Category which takes you to Level2 Product Listing where you choose a Product which takes you to Level3 Product Page. 

So the url for any product page should be sitename/listname/product, or "level1/level2/level3".

There are no Product links on the Level1 Home Category page.  You cannot jump from Level1 to Level3. 

So there should be no such thing as a level1/level3 url.  My AC site code should never generate a level1/level3 url. 

I have just learned that googlebot indexed all of my product pages as level1/level3 urls. 

How did I learn this? 
I asked google support why the googlebot was only indexing thumbnail images but no fullsize product images from my site. 
The support team explored my site and told me that the first problem they found was that none of my product pages were being indexed because all of my product pages were duplicates of pages that the bot already indexed. 

Let's use aprons for example. 

Go to home page.  This will be the level1 url:
https://www.inspired-designco.com/index.php?rt=index/home

Choose the Apron category.  This will take you to the proper level1/level2 url:
https://www.inspired-designco.com/better-bib-linen-chefs-apron

Then choose Navy-colored apron.  This will take you to the proper level1/level2/level3  url:
https://www.inspired-designco.com/better-bib-linen-chefs-apron/better-bib-linen-chefs-apron-navy

Now, go to this link:
https://www.inspired-designco.com/better-bib-linen-chefs-apron-navy
You cannot access this url from any link or page on my site.
The only way to access this url is to manually enter it into the address bar, or click the convenient hot link I provided above. 
This is a level1/level3 url. 
It should not exist. 

Google tells me that it indexed all of my product pages as this type of level1/level3 url. 

Google tells me that the reason I cannot find any of my expected level1/level2/level3 Product Page urls indexed on Google is because the bot considers those L1/L2/L3 urls to be duplicates of the L1/L3 urls that it already indexed, and that this "canonical problem may be contributing to the absence of indexed fullsize images which only appear on your Level3 pages". 

They are correct about the fullsize image location: the only place to see fullsize product images is on a product page.
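
If you want to check this on your own site, a few lines of php will print the canonical each page declares.  This is just a quick diagnostic sketch; the regex is naive and assumes rel comes before href inside the tag.

Code: [Select]
<?php
// Print the canonical url declared in each page's head.
$pages = array(
    'https://www.inspired-designco.com/better-bib-linen-chefs-apron/better-bib-linen-chefs-apron-navy',
);

foreach ($pages as $page) {
    $html = file_get_contents($page);
    // naive match: assumes rel="canonical" appears before href
    if (preg_match('/<link[^>]*rel=["\']canonical["\'][^>]*href=["\']([^"\']+)["\']/i', $html, $m)) {
        echo $page . "\n  canonical: " . $m[1] . "\n";
    } else {
        echo $page . "\n  no canonical tag found\n";
    }
}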

I checked my xml sitemap generated by https://xmlsitemapgenerator.org/sitemap-generator.aspx
To my surprise, all of the product page urls in my sitemap are of the incorrect type of level1/level3 urls. 

So I used a different sitemap generator to create another sitemap: www.xml-sitemaps.com
This generator output three types of Product Page urls!  They are:
Correct type: level1/level2/level3 url.
Incorrect type: level1/level3 url.
Strange new incorrect type: level1/new-weird-level2/level3 url.

The strange new url is https://www.inspired-designco.com/id-inspired-design-co/proper-stuff-pillow-herringbone .

I have no idea how the sitemap crawler came up with the id-inspired-design-co segment (the red part).  But the link actually works!
Even stranger, Google has actually indexed that page!!
Enter this into your google search bar: "site:https://www.inspired-designco.com/id-inspired-design-co/proper-stuff-pillow-herringbone" 
You see?  That url has actually been indexed by Google!

There is no way for the site to create that url, and yet robots find it and index it instead of the desired and predictable L1/L2/L3 url that is created during normal navigation around the site. 

Solutions:

I think it's possible to manually clean up a sitemap so that the map only features the desired L1/L2/L3 type of url for Product Pages, then submit the map, ask google to recrawl the site, and hope that the duplicate content / canonical / weird-url problem vanishes. 
The clean-up will be a labor-intensive and time-consuming process, even for my small site.  Impossible for a large site. 

And there is no guarantee that it will work. 
A sitemap crawler identified and defined strange urls that should not exist.
Even if I clean up the xml sitemap by hand, there is no guarantee that the google bot won't find and index the same strange urls. 

So that leads to my questions.

Why is a bot crawling my basic AC site and coming up with urls that would otherwise never be created? 
There is no way for a site visitor doing normal navigation to create a level1/level3 url.  You have to go through level2 to see a level3 url.
Likewise, there is no way for a site visitor doing normal navigation to create a level1/weird-segment/level3 url.   

Did I do something wrong when I built this AC site? 
Is there some button or feature I need to set to avoid having all these different and undesired pathways to a Product Page?

Has anyone else encountered strange urls discovered by spiders?

What can I do to the back or front end so that bots do not discover strange, undesired urls?

Thx.

70
Customization help / How to change color and state of active-page breadcrumb
« on: September 07, 2017, 02:24:14 PM »
Hi. 

The prevailing opinion on various dev sites is that the breadcrumb for the active page should not be live, and hence not have the same appearance as the preceding breadcrumb links. 

The logic is not challenging.  There is no reason to have a live breadcrumb link to the page you are currently on. 

In AC, the breadcrumb for the active page is hot.

I'm sure this could be a 2 minute fix for the devs here. 

It will take me hours to sort it out if I have to do it on my own. 

I wonder if you guys might post a simple tpl tweak to deactivate the breadcrumb for the active page?

Thanks!

71
SEO / Re: CSS Management, pageload speed, & BLOAT
« on: September 05, 2017, 01:08:41 PM »
Hi, for the sake of future readers who, like me, are not web development specialists, I offer the following comments:

Basara provides advice pertaining to AC features that should be enabled on any site:

1 - If you want fast pageloads, use a quality hosting company that provides reliable and fast server service. 

2 - Rename the oem .htaccess.txt file to .htaccess, then add or uncomment the various RewriteCond and RewriteRule rules within the file to accomplish the following objectives: direct all forms of url to your preferred url (ie www or https or whichever you select as your primary site address), enable AC Retina screen image management, and enable the AC SEO url, compression, and cache settings. 

3 - Use a content delivery network to deliver your site content, instead of having everything served by your site host alone.  If you are Amazon, you need a CDN, but you also buy web development, so none of my posts will ever mean anything to you.  If you are a small mom & pop with 30 products and 6 images per product, my opinion is that a CDN will not make a big difference in pageload speed for most of your customers, provided of course that you have resized your images to site-appropriate dimensions and optimized them with 4:2:0 chroma at 85% compression. 

Basara's final comments relate to my earlier paragraph describing the Firefox DustMe extension.  The AC stylesheet must include rules for various screen-size breakpoints because AC is responsive.  As he says, identifying unused rules must be done over a period of time during which the site is thoroughly evaluated at every conceivable screen size.  This is why DustMe is actually not a very helpful tool for a site such as AC that uses nearly 10,000 lines of css rules (unminified): the hours spent manually deleting css rules over time would outweigh the small gain resulting from reduced bloat. 

Which all brings us back to the central point of my post:
There are scripts available that apparently do a really good job over time of shrinking your css.  I wonder if anyone on the AC forum has made use of these scripts?  I wonder if the AC devs have considered them at all? 

Google webmaster tools include a speed assessment tool that measures pageload speed of your site in both the mobile and desktop universes.  It is very simple to check speed, change a site parameter, and check speed again to see the effect of the change.  Google even goes so far as to make recommendations on how to improve your speed. 
This is how I determined that combining and minifying my css bumped me from the high 70s to the 90th percentile.  My site is almost what google considers to be fast. 

I'm quite sure that if I shrink my css from 10,000 lines to 2,000, I'll probably land at around 94, and I will also have done everything reasonable to optimize my site. 

Sooo, anyone had any success with css management scripts on AC or any other cms?

72
SEO / CSS Management, pageload speed, & BLOAT
« on: September 03, 2017, 05:21:58 PM »
There are two reasons to seek optimal pageload speed:
1 - faster pageloads = more retained visitors
2 - google defines what it considers to be optimal pageload speed, and also provides recommendations for how to achieve it.  It seems unlikely that pageload speed is not a factor in their ranking algorithm. 

Following google guidelines, the biggest and easiest improvement I made to pageload speed was combining and minifying css. 
If you look in your head.tpl, you will see two things:
1 - calls for 5 separate css files.
2 - AC's notes on how to merge and minify css into one file, and even a link to an essay on css management. 

head.tpl calls 5 separate css files.  The bootstrap & font-awesome & style.css files are BIG.  My site uses a third or less of the rules in each of those files.
 
CSS files are render-blocking, meaning that the page cannot start rendering until all css files are loaded and parsed. 
It takes longer to call and parse multiple css stylesheets than it takes to call and parse one.
It takes longer to parse an un-minified css file than to parse a minified css file. 
Obvious solution: use a text editor to create a new css file called "bigboy" or whatever you want.  Copy/paste the entire contents of each of the five oem css files into bigboy, minify it, save it, and upload it to your live stylesheet folder. 
Then edit your live head.tpl file to comment out the calls for the five separate css files and replace them with a single call for the new bigboy file. 
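
If you'd rather script that step than paste by hand, something like this works.  File names are examples only (use whatever five files your head.tpl actually calls), and the minification is deliberately naive:

Code: [Select]
<?php
// Combine css files and naively minify the result.
$files = array(
    'stylesheet/bootstrap.min.css',
    'stylesheet/font-awesome.min.css',
    'stylesheet/style.css',
    'stylesheet/menu.css',
    'stylesheet/custom.css',
);

$css = '';
foreach ($files as $f) {
    $css .= file_get_contents($f) . "\n";
}

$css = preg_replace('!/\*.*?\*/!s', '', $css);   // strip comments
$css = preg_replace('/\s+/', ' ', $css);         // collapse whitespace
$css = str_replace(array(' {', '{ ', '; ', ': ', ' }'),
                   array('{', '{', ';', ':', '}'), $css);

file_put_contents('stylesheet/bigboy.min.css', trim($css));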

This dramatically improves AC pageload speed. 

But the job isn't really done, because most AC sites are like mine - they are only using a quarter or less of the rules contained in the oem css files.

The rest is just bloat. 

There are 2 methods I know of to get rid of this bloat:
1 - The Firefox DustMe extension can identify unused rules in your css file, which you can then manually delete.  This is a labor-intensive process that must be performed carefully over time, but you can shrink your bigboy css file from 9000 lines to 2000 pretty easily.  This will speed up parsing and therefore pageload, and will also make you feel less bloated. 

2 - There are scripts available on github that continually scan the oem css files, identify only those rules that are actually being used, and combine and minify just those rules into one small css file that runs the site.  You keep working within the original separate large css files to make changes; the script runs in the background and folds your modifications into the single small css file the site runs on. 

Method 2 is much better in terms of longterm site maintenance, because the de-bloating and combining and minifying steps are automated, meaning that you don't have to undo and redo this process each time you edit your site within the oem files. 
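
I don't know which github scripts people use, but the background part is easy to picture.  A polling sketch, not production-grade; it only covers the rebuild-on-change part (the used-rule detection is the hard part those scripts do), and it pairs with the combine-and-minify sketch above:

Code: [Select]
<?php
// Sketch: poll the source css files and rebuild the combined file
// whenever one of them changes. File names are examples.
$files = array('stylesheet/bootstrap.min.css', 'stylesheet/style.css');
$last  = 0;

while (true) {
    $newest = 0;
    foreach ($files as $f) {
        clearstatcache(true, $f);
        $newest = max($newest, filemtime($f));
    }
    if ($newest > $last) {
        $last = $newest;
        // rebuild here (see the combine-and-minify sketch above)
        echo "css changed, rebuilding...\n";
    }
    sleep(2);
}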

Which leads to my questions: 
How do you use these kinds of scripts with AC?   
Has anyone done it? 
Has anyone tested any of the various different opensource css management scripts that are available? 
Is there a tutorial available on how to install and run these scripts on AC? 
Would you like to write one? 

I'd do it, but I don't even know step 1.

Thx. 

73
Support / Re: I want to move to another web hosting. What should I do?
« on: September 02, 2017, 03:31:43 PM »
Thank you Geoffrey  :)
Can you please explain what "Home button will take you to a broken homepage" means?
Setting both store urls to https is not recommended, but of course it is allowed.

Yes I can.  AC documentation is lacking in specificity and intuitive workflow.  A simple and comprehensive tutorial for establishing an https site does not exist.

In the absence of adequate documentation, I read what was available and then researched the forum.  My observed problem: navigate to my https site, go to a product page, click the Home button, and land on a non-https homepage with two problems.  It was non-https (which I guess is becoming obsolete), and it was broken, meaning images did not show and the layout was jumbled.  Setting both store urls to https was the only solution I found, and it was recommended in multiple forum threads. 

What do you mean by 'not recommended'?
What does that mean? 
Why is it not recommended? 
What is the harm of doing it? 
What is the alternative to doing it? 
What other approach should i be taking? 
What are the steps necessary to achieve the alternative approach?  All of the steps please. 

I think you should not edit config.php to set up SSL. What server value did you have before?

Why not?  What is the reason that I should not do this? 
Is there a harm associated with this approach?
Why do you say that I should not do this?

Nothing else worked. 
I read and researched. 
This is what I came up with. 
The 1st line in my config.php was this:
Code: [Select]
define('SERVER_NAME', 'www.sitename.com');
I changed it to this:
Code: [Select]
define('SERVER_NAME', 'https://www.sitename.com');

It worked.  It has worked for over a month now.
Why should I not do this?
What is the harm?
What should I do instead?
Why should i do it instead?
What are the exact steps of what i should do instead? 
 
It is not helpful to have an AC developer tell me that I should not do this when the AC documentation does not tell me what I should do, and when the developer does not explain why.   


74
SEO / Re: google is indexing thumbnail images instead of full-size
« on: August 16, 2017, 12:32:29 PM »
An image sitemap won't work as a management tool for improving the image SEO flaw of AbanteCart, because a sitemap has no influence over whether a page or image gets indexed. 

The bots will eventually crawl your entire site regardless of whether certain elements are present in a sitemap. 

Google will index the parts of your site that it thinks are worth indexing.  The sitemap has no bearing on what gets indexed. 




75
SEO / Re: google is indexing thumbnail images instead of full-size
« on: August 15, 2017, 05:27:42 PM »
Oh, one more:

The robots meta tag value noimageindex tells search engines: do not index images on this page.  (It is a meta tag placed in the page head, not a robots.txt directive.)

If you abandon thumbnails and the preview pane with easy zoom, and replace all that with vertically stacked full-size images on each product page, then you can use this tag to block image indexing on your category page and any other pages with small product images (and I guess remove the cart thumbs).  Then the only images google will index are the large images on the product pages. 
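
For reference, this is the exact tag, placed in the head of each page whose images you want excluded:

Code: [Select]
<meta name="robots" content="noimageindex">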

I'm not sure how Bing and other engines look at noimageindex. 

