If you don't use a canonical tag and you have duplicate content, will Google bother to index any of the pages (i.e. try to work out the correct URL itself and show that), or will it just not index the pages at all?
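For context, this is the kind of tag I mean (the URLs here are just made-up examples): a duplicate page points at the preferred version from its `<head>`.

```html
<!-- On the duplicate page, e.g. https://example.com/jobs?location=leeds&radius=25 -->
<head>
  <!-- Tells search engines which URL is the preferred (canonical) version -->
  <link rel="canonical" href="https://example.com/jobs/leeds" />
</head>
```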
What would be seen as an acceptable amount of duplicate content, as a percentage? Most pages will have some duplication anyway. For example, job boards show jobs on location landing pages, but a default search radius means the same jobs also appear on neighbouring locations' pages.
Google recommends using a canonical tag for duplicates rather than blocking them in robots.txt. Why do they suggest this? Can blocking with robots.txt create problems, and if so, what's the reason?