Canonicalization questions

I have a few questions regarding canonicalization; any help would be greatly appreciated.

If you don't use a canonical tag and you have duplicate content, will Google bother to index any of the pages (i.e. try to work out the correct URL and show that), or will it just not index the pages at all?

What would be seen as an acceptable amount of duplicate content, in percentage terms? Most pages will contain some duplicate content; for example, job boards show jobs by location but use a default search radius, so the same jobs also appear on other location landing pages.

Google recommends using a canonical tag instead of blocking the duplicates with robots.txt. Why do they suggest this? Can using robots.txt create problems, and if so, why?
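For reference, a canonical tag is a single line in the duplicate page's `<head>` pointing at the preferred URL. A minimal sketch, assuming a job-board location page that repeats listings from a main page (both URLs are made up for illustration):

```html
<!-- In the <head> of the duplicate page, e.g. a location landing page
     whose jobs also appear on the main listing page.
     Both URLs below are hypothetical examples: -->
<link rel="canonical" href="https://www.example.com/jobs/london" />
```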

Minifying a JavaScript File

Google PageSpeed Insights recommends that I minify a few of my site's JavaScript files to improve download speed and reduce file transfer size.

Looking through the Google Developers website, I see it recommends Closure Compiler ( https://developers.google.com/closure/compiler/ ), among a few other tools, for minifying JavaScript files.

My question is: once I minify the file using, for example, the web application of Closure Compiler or any similar tool, is it as easy as deleting my original unminified JavaScript file and replacing it with the new minified file on my hosting server?
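To make the question concrete: minification only rewrites the source into a smaller but functionally identical form, so swapping the file (keeping the same filename, or updating the `<script src>` reference) is indeed the whole job. A sketch of what that rewriting looks like, using a made-up function (the minified version is renamed here only so the two can coexist for comparison):

```javascript
// Readable original (hypothetical example function):
function addTax(price, rate) {
  var tax = price * rate;
  return price + tax;
}

// Roughly what a minifier such as Closure Compiler might emit:
// same behaviour, shorter identifiers, no whitespace or comments.
function addTaxMin(a,b){return a+a*b}
```

Because the minified output behaves identically, nothing else on the page needs to change when the file is replaced.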

Google penalty on my website?

My website was created 5 months ago. The number of clicks shown in Google Search Console (google.com/webmasters) is:
2 September 2015 -> 315
3 September 2015 -> 491
4 September 2015 -> 454
5 September 2015 -> 551
6 September 2015 -> 392
7 September 2015 -> 269
8 September 2015 -> 222
...
x October 2015 -> 111
x November 2015 -> 90
x December 2015 -> 50
x January 2016 -> 40
In the first month, all content on my website was copied from original sources (duplicate content), with a link to the original at the end of each piece. After that I fixed the duplicate content and worked hard on link building, but the number of clicks in Google Search Console has not increased.

Has Google penalized my website? How can I detect whether my website has been penalized?

Removing the .html extension from URLs for SEO

Hi, I have this URL structure:

www.xyz.com/solar/solar-panels

Do I need to use solar-panels.html instead of solar-panels in the URL? Otherwise, will solar-panels be treated as a subfolder?

I want the URL pattern that ranks best. Please help and guide me on my URL design.

I heard that clean URLs without an extension get more value and visibility in searches. Is that true?
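For what it's worth, extensionless URLs don't require renaming folders: on an Apache server with mod_rewrite enabled, an internal rewrite can serve /solar/solar-panels from the file solar-panels.html without the extension ever appearing in the address bar. A minimal .htaccess sketch, assuming that setup (the paths are illustrative):

```apache
# Hypothetical .htaccess: serve "/solar/solar-panels" from "solar-panels.html"
RewriteEngine On
# Only rewrite if the request is not an existing directory...
RewriteCond %{REQUEST_FILENAME} !-d
# ...and a matching .html file actually exists on disk
RewriteCond %{REQUEST_FILENAME}.html -f
RewriteRule ^(.*)$ $1.html [L]
```

Nginx and other servers have equivalent rewrite mechanisms; the idea is the same.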

The Story of a Brand That Got Out of Hand [NSFW]

So I found this one while surfing Reddit recently and thought it was pretty interesting/funny. American insurance company Esurance used to have a mascot…and the reason they don’t anymore is because the Internet made too much pr0n of her.

This was back in 2004. Erin Esurance was her name. They designed her to target the 18–24-year-old male market, and the campaign, apparently, did really well. But nude art of her hit the Internet "within 24 hours of the cartoon's television commercial debut," and eventually the dirty stuff began outranking official Esurance depictions of Erin in Google image searches.

In 2010 the campaign was ended. What a bizarre tale.

I guess, in a weird way, the mascot still does advertising for Esurance. Artists are still doodling her and even selling pictures and such. It’s just in a way that Esurance would clearly like to keep unofficial.

Quote:

Originally Posted by The article

In a strange way, Brewe [the creator of the character] sees Erin Esurance's foray into the adult art world as part of a positive feedback loop: "As a brand manager, you're kind of content when weird things happen," she admits. "We thought, 'Wow, we're making a crazy impact on culture if we're being embraced by different people,'" corroborates Alan Lau, one of her animators. "It was an acknowledgement of our artwork."

Paginated Content

We have a new section of our site launching soon but we need to sort out the paginated content.

This section has 5 products on the first page, with an option at the bottom to cycle through the next pages of products ( << 1 2 3 4 5 >> … this kind of standard control).

Each component page (page 2, page 3, etc.) will have a rel=canonical back to the first page or our 'view all' page. This is standard practice.

There will also be rel=next and rel=prev on the component pages, pointing to the next and previous pages.

However, I get stuck when we apply filters to the products, as this creates a query string based on the filters applied. A filter could return 25 results, for example, and therefore 5 pages of results (based on 5 products per page).

How do we implement rel=next and rel=prev on these filtered pages with query strings? Or do we not need rel=next/prev on filtered results, but still use a rel=canonical back to an unfiltered 'view all' or page 1 link?
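To make the markup concrete, the head of page 2 in the unfiltered series would carry something like the following (URLs are hypothetical). One common approach for filtered views, sketched in the second half, is to emit the same rel=prev/rel=next pair but with the filter's query string preserved, so each filtered series is annotated as its own sequence:

```html
<!-- <head> of page 2 of the unfiltered series (hypothetical URLs): -->
<link rel="prev" href="https://www.example.com/products/" />
<link rel="next" href="https://www.example.com/products/?page=3" />

<!-- <head> of page 2 of a filtered series: the filter parameter
     is kept in every prev/next URL so the chain stays consistent. -->
<link rel="prev" href="https://www.example.com/products/?colour=red" />
<link rel="next" href="https://www.example.com/products/?colour=red&amp;page=3" />
```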

Thanks,

HJJ

Problems after moving from HTTP to HTTPS

Hi all,

We have a problem with the indexing of our web pages. We moved from HTTP to HTTPS, and the new pages are not being indexed properly. Google Search Console shows that a robots.txt file (outside our domain) is blocking some resources, such as JavaScript. But this is exactly what we want; we don't want Google to crawl the JavaScript.
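For context, the "blocked resource" reports come from a rule of roughly this shape in the robots.txt on the host that serves the scripts (the path below is hypothetical):

```
# Hypothetical robots.txt on the host serving the script files:
User-agent: *
# Blocks crawling of anything under /js/, including page resources
Disallow: /js/
```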

When we tried to fetch some important landing pages of our website, Google could only do a partial fetch because of the robots file.
Can this affect our SERP results?

Which way of blogging is better for SEO?

I made a website for my cousin; it's a static site with just a few static pages.

I want to add blogging capability to her site so that she can write things when she wants, and so the site has good-quality content, as she is an expert in her field.

Which way is better for SEO: adding a 'Blog' link to her existing static site and putting a WordPress install behind that link, or making the whole site WordPress and then adding the existing static pages back in?