Google (and other bots) might not easily find and index all the URLs when crawling the site, since most works and terms are only listed after searching or paging/scrolling, which I doubt the bots will be doing.
Sitemaps can be in XML, RSS or plain text. The plain text format is literally a list of URLs. The other formats can carry metadata, but that doesn't seem very valuable in our case.
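For illustration, a plain text sitemap is just one URL per line (domain and route paths here are placeholders, not our real ones):

```
https://example.org/work/123
https://example.org/work/456
https://example.org/term/abc
```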
The sitemap can be submitted in the Google Search Console or referenced from robots.txt.
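The robots.txt reference is just a `Sitemap:` line (placeholder domain again):

```
User-agent: *
Disallow:

Sitemap: https://example.org/sitemap.xml
```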
We could generate one by querying the API and outputting route paths.
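A sketch of how that could look as a small script. The endpoint paths (`/api/works`, `/api/terms`) and route patterns (`/work/:id`, `/term/:id`) are made up here and would need to be replaced with the real ones:

```ts
// generate-sitemap.ts, a sketch only: endpoints and route patterns are assumptions.
import { writeFile } from "node:fs/promises";

const BASE = "https://example.org"; // placeholder domain

// Fetch a full list of ids from an endpoint (the real API may require paging).
async function fetchIds(url: string): Promise<string[]> {
  const res = await fetch(url);
  const items: { id: string }[] = await res.json();
  return items.map((item) => item.id);
}

async function main() {
  const workIds = await fetchIds(`${BASE}/api/works`); // hypothetical endpoint
  const termIds = await fetchIds(`${BASE}/api/terms`); // hypothetical endpoint

  const urls = [
    ...workIds.map((id) => `${BASE}/work/${id}`), // hypothetical route pattern
    ...termIds.map((id) => `${BASE}/term/${id}`), // hypothetical route pattern
  ];

  // Plain text sitemap: one URL per line.
  await writeFile("sitemap.txt", urls.join("\n") + "\n");
}

main();
```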
I'm sure WordPress can generate a sitemap for the /om/ pages. We can combine the two sitemaps with a sitemap index.
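A sitemap index is a small XML file that points at the other sitemaps, along these lines (file names and domain are placeholders; WordPress core usually exposes its sitemap at wp-sitemap.xml, but that should be verified for our setup):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>https://example.org/sitemap.txt</loc>
  </sitemap>
  <sitemap>
    <loc>https://example.org/om/wp-sitemap.xml</loc>
  </sitemap>
</sitemapindex>
```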
I don't think we should generate the sitemap as a static file at build time, since it would have to be regenerated manually after relevant changes in Libris or the QLIT backend.
Better, perhaps, to do the sitemap generation in a Vue route view. Generating it would take a long time, but hopefully that's not a problem.
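A sketch of what that route view could look like, assuming hypothetical API endpoints and route patterns (the real Libris/QLIT queries would go where the fetch calls are):

```vue
<!-- SitemapView.vue, a sketch: endpoints and route patterns are assumptions -->
<script setup lang="ts">
import { ref, onMounted } from "vue";

const urls = ref<string[]>([]);

onMounted(async () => {
  // Hypothetical endpoints; replace with the real Libris/QLIT queries.
  const works: { id: string }[] = await (await fetch("/api/works")).json();
  const terms: { id: string }[] = await (await fetch("/api/terms")).json();
  urls.value = [
    ...works.map((w) => `https://example.org/work/${w.id}`),
    ...terms.map((t) => `https://example.org/term/${t.id}`),
  ];
});
</script>

<template>
  <!-- One URL per line, so the page reads as a plain list of links -->
  <pre>{{ urls.join("\n") }}</pre>
</template>
```

The view would also need to be registered in the router (e.g. at /sitemap), and since it is rendered client-side, it's worth verifying that whatever fetches the sitemap actually executes the JavaScript.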
Another approach that would probably benefit search bot indexing is #57 "Search query as url parameters", especially including the page number. If I understand correctly, the bot could then navigate through the result list and reach all the work URLs by itself.
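For example (parameter names here are only illustrative), if each result page had its own address and the pager rendered real links, URLs like these would let a crawler walk the whole result list and reach every work page:

```
/search?q=<query>&page=1
/search?q=<query>&page=2
/search?q=<query>&page=3
```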