Maximum File Size for Crawling Search Services

     

    February 16th, 2007

    By default, Search Services crawls and filters at most the first 16 megabytes (MB) of any file. Once this limit is reached, SharePoint Portal Server writes a warning to the gatherer log: "The file reached the maximum download limit. Check that the full text of the document can be meaningfully crawled."
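
    If you are not sure whether a custom limit has already been applied, you can check for the MaxDownloadSize value under the Gathering Manager key described below. The following is a minimal, read-only Python sketch using the standard winreg module; it assumes it runs on the index server, and the key path and MB unit are taken from the steps later in this post.

        # Check whether a custom MaxDownloadSize is set (read-only sketch).
        # Assumes this runs on the index server; key path taken from the steps below.
        import winreg

        GATHERER_KEY = r"SOFTWARE\Microsoft\Office Server\12.0\Search\Global\Gathering Manager"

        try:
            with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, GATHERER_KEY) as key:
                size_mb, _ = winreg.QueryValueEx(key, "MaxDownloadSize")
                print(f"MaxDownloadSize is set to {size_mb} MB")
        except FileNotFoundError:
            # Value (or key) not present: the default 16 MB download limit applies.
            print("MaxDownloadSize not set; the default 16 MB limit applies")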

    To increase the 16 MB limit, add a new registry entry named MaxDownloadSize. To do this, follow these steps (a scripted sketch follows the list):

    1. Start Registry Editor (Regedit.exe).
    2. Locate the following key in the registry:
       HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Office Server\12.0\Search\Global\Gathering Manager
    3. On the Edit menu, point to New, click DWORD Value, and name the new value MaxDownloadSize.
    4. Double-click the new value, select Decimal, and type the maximum size (in MB) for files that the gatherer downloads.
    5. Restart the server.
    6. Start a full crawl.
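
    For repeatable deployments, the same change can be scripted. Here is a minimal Python sketch using the standard winreg module, assuming it runs with administrative rights on the index server; the key path, value name, and MB unit come from the steps above, and the 64 MB figure is only an example. Restarting the server and starting a full crawl are still required afterwards.

        # Set MaxDownloadSize (in MB) under the Gathering Manager key.
        # Sketch only: requires administrative rights on the index server.
        import winreg

        GATHERER_KEY = r"SOFTWARE\Microsoft\Office Server\12.0\Search\Global\Gathering Manager"
        NEW_LIMIT_MB = 64  # example value; use the limit you actually need

        # Open (or create) the key with write access and store the DWORD value.
        with winreg.CreateKeyEx(winreg.HKEY_LOCAL_MACHINE, GATHERER_KEY, 0,
                                winreg.KEY_SET_VALUE) as key:
            winreg.SetValueEx(key, "MaxDownloadSize", 0, winreg.REG_DWORD, NEW_LIMIT_MB)

        print(f"MaxDownloadSize set to {NEW_LIMIT_MB} MB; now restart the server and start a full crawl")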

    NOTE: Increasing the file size limit may cause a timeout exception, because the crawler can time out if a large file takes too long to crawl and index. To increase the timeout value, follow these steps… For the full post, check out Maximum File Size for Crawling.

    You can also review search usage data by enabling search query logging and then following these steps:

    1. On the home page, on the Site Actions menu, point to Site Settings, and then click Modify All Site Settings.
    2. On the Site Settings page, under Site Collection Administration, click Site collection usage reports.
    3. On the Site Collection Usage Summary page, on the Quick Launch, click Search queries or Search results.


    Original post: https://www.cnblogs.com/ceci/p/1406664.html