Stop Transfer by HTTP Error 206, Partial Content?

  • schittli
    Junior Member
    • Apr 2004
    • 19

    Stop Transfer by HTTP Error 206, Partial Content?

    Good evening

    We see a lot of bandwidth usage from HTTP 206 Partial Content responses, and it
    seems to be impossible to see who is transferring which files.

    Is it possible to stop this kind of transfer?
    Our biggest file is a ~2.5 MByte PDF, which can hopefully be downloaded without partial content transfers. (?)

    Thanks a lot in advance,
    kind regards,
    Thomas
  • ZYV
    Senior Member
    • Sep 2005
    • 315

    #2
    You can stop this by having Apache deny the request whenever HTTP_RANGE is set and non-empty, returning 501 Not Implemented, 403 Forbidden, or 406 Not Acceptable. There are several ways of achieving this; see in particular the examples in the Apache Cookbook, pp. 161-162 (available here: http://z.askapache.com/pdf/ , though I'm not sure of the legal status of those downloads).
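    One way to sketch this with mod_rewrite (a minimal example, not the Cookbook's exact recipe; adapt paths and the response code to taste):

    ```apache
    <IfModule mod_rewrite.c>
        RewriteEngine On
        # Match only when the client sent a non-empty Range header
        RewriteCond %{HTTP:Range} !^$
        # Refuse the request with 403 Forbidden instead of serving partial content
        RewriteRule ^ - [F]
    </IfModule>
    ```

    Clients that retry without the Range header will then get the whole file in one response.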

    But you know, whenever I see a website that doesn't support HTTP_RANGE, I'm put off by it. You should think about a better way of counting downloads (Google Analytics, a mod_rewrite redirection, whatever).


    • schittli
      Junior Member
      • Apr 2004
      • 19

      #3
      THANK YOU VERY MUCH!

      Hello ZYV,

      thank you very much for your fast and spot-on answer with a solution -
      it works perfectly.

      My website is very small, usually has only a few visitors per week,
      and I don't even have any objects that need partial downloads.

      Kind regards,
      Thomas


      • ZYV
        Senior Member
        • Sep 2005
        • 315

        #4
        Hello, Thomas,

        Glad it helped.

        I just realized that one probable reason for this might be robot abuse. Most robots nowadays implement PDF indexing, so they might be trying to download your files in order to index them. If this is undesired, you might try fiddling with robots.txt.
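        A minimal robots.txt along these lines (the /files/ path is just an illustration; well-behaved crawlers will honor it, abusive ones may not):

        ```
        User-agent: *
        Disallow: /files/
        ```

        Place it at the web root (e.g. http://example.com/robots.txt) so crawlers can find it.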

        Another reason might be online reading via the Acrobat plug-in. It usually only fetches pages it is about to display, so people reading your files inside the browser might generate lots of HTTP_RANGE requests.
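        If inline viewing itself is unwanted, one option (a sketch, assuming mod_headers is loaded) is to send a Content-Disposition header so browsers download PDFs in full instead of streaming them page by page:

        ```apache
        <IfModule mod_headers.c>
            # Serve PDFs as downloads rather than inline documents,
            # so the Acrobat plug-in never issues page-by-page Range requests.
            <FilesMatch "\.pdf$">
                Header set Content-Disposition "attachment"
            </FilesMatch>
        </IfModule>
        ```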


        • schittli
          Junior Member
          • Apr 2004
          • 19

          #5
          … thank you! Once more...

           Thank you very much for this additional information! I changed robots.txt,
           and now it works perfectly - more than 99% is now "viewed traffic".

          Thank you and kind regards,
          Thomas


          • ZYV
            Senior Member
            • Sep 2005
            • 315

            #6
            Hi Thomas,

            Glad it helped

            @Z

