Search Engine Optimisation (SEO) parameters
    This functionality requires additional commissioning and a fee applies. Please contact your Civica Account Manager for more details.

    The SEO Parameters feature lets you specify which pages and paths in the OPAC should be explicitly indexed or ignored by search engines (e.g. Google, Bing).

    By default, Spydus is configured so that relevant OPAC pages are indexed for web searching, while temporary content (such as a search result set) is ignored by crawler bots and left unindexed. This prevents temporary content from appearing in web search results for the library.

    However, this can mean that custom content created by the library (e.g. Page Generator content) does not get crawled or indexed. To ensure that it does, staff can manually add pages to the Sitemap and Robots.txt tables.

    To allow a custom page to be crawled:

    1. Navigate to Maintenance > OPAC & Enquiry > SEO Parameters.
    2. On the Sitemap tab, click Add.
      • Enter the URL, excluding the domain name (e.g. for the URL https://libraryname.spydus.com/cgi-bin/spydus.exe/MSGTRN/WPAC/PAGENAME, enter the path /cgi-bin/spydus.exe/MSGTRN/WPAC/PAGENAME).
    3. Click the Robots.txt tab and click Add.
      • From the Rule dropdown, select ALLOW.
      • In the URL field, enter the same URL from step 2.
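
    For illustration, the entries above correspond to conventional robots.txt and sitemap content along the following lines. This is a sketch of the standard file formats, not necessarily the exact output Spydus generates:

        User-agent: *
        Allow: /cgi-bin/spydus.exe/MSGTRN/WPAC/PAGENAME

        <?xml version="1.0" encoding="UTF-8"?>
        <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
          <url>
            <loc>https://libraryname.spydus.com/cgi-bin/spydus.exe/MSGTRN/WPAC/PAGENAME</loc>
          </url>
        </urlset>

    Note that the sitemap protocol requires absolute URLs, so the generated sitemap entry includes the domain even though only the path is entered in the Sitemap tab.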

    Allowing content to be crawled and indexed makes it eligible to appear in search results; it does not guarantee a high ranking.

    Alternatively, pages can be explicitly excluded from crawling and indexing by adding a DISALLOW rule to robots.txt.

    To disallow a page:

    1. Navigate to Maintenance > OPAC & Enquiry > SEO Parameters.
    2. Click the Robots.txt tab and click Add.
      • From the Rule dropdown, select DISALLOW.
      • Enter the URL, excluding the domain name (e.g. for the URL https://libraryname.spydus.com/cgi-bin/spydus.exe/MSGTRN/WPAC/PAGENAME, enter the path /cgi-bin/spydus.exe/MSGTRN/WPAC/PAGENAME).
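
    As with ALLOW rules, a DISALLOW rule corresponds to a conventional robots.txt entry of this form (again a sketch of the standard format, not necessarily the exact file Spydus produces):

        User-agent: *
        Disallow: /cgi-bin/spydus.exe/MSGTRN/WPAC/PAGENAME
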
    The SEO Parameters interface modifies the sitemap and robots.txt files in a way that conforms with web crawling standards. Be aware that not all crawlers behave identically, and some are deliberately written to ignore these standards.
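
    Once your changes are saved, the generated robots.txt can usually be checked directly in a browser. The robots exclusion standard requires the file to be served from the site root, so (assuming Spydus follows the standard location) it should be visible at:

        https://libraryname.spydus.com/robots.txt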