Uname : Linux premium36.web-hosting.com 4.18.0-553.44.1.lve.el8.x86_64 #1 SMP Thu Mar 13 14:29:12 UTC 2025 x86_64
Soft : LiteSpeed
Ip : 198.54.115.237
Port : 443
Path : /opt/alt/python37/lib/python3.7/site-packages/future/backports/urllib/__pycache__
File Name : robotparser.cpython-37.pyc
Recovered Source : /opt/alt/python37/lib/python3.7/site-packages/future/backports/urllib/robotparser.py

from __future__ import absolute_import, division, unicode_literals

from future.builtins import str
from future.backports import urllib
from future.backports.urllib import parse, request

# Expose the backported submodules as urllib.parse / urllib.request
urllib.parse = parse
urllib.request = request

__all__ = ["RobotFileParser"]


class RobotFileParser(object):
    """This class provides a set of methods to read, parse and answer
    questions about a single robots.txt file.
    """

    def __init__(self, url=''):
        self.entries = []
        self.default_entry = None
        self.disallow_all = False
        self.allow_all = False
        self.set_url(url)
        self.last_checked = 0

    def mtime(self):
        """Returns the time the robots.txt file was last fetched.

        This is useful for long-running web spiders that need to
        check for new robots.txt files periodically.
        """
        return self.last_checked
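For reference, a minimal usage sketch of the RobotFileParser class recovered above. It assumes the python-future package is installed and that the backport mirrors the standard urllib.robotparser interface (set_url, read, can_fetch, mtime); the robots.txt URL and user-agent string are hypothetical placeholders, not values taken from this dump.

# Minimal usage sketch, assuming the python-future package is available.
# The URL and user-agent below are placeholders for illustration only.
from future.backports.urllib.robotparser import RobotFileParser

rp = RobotFileParser()
rp.set_url("https://example.com/robots.txt")   # point at the site's robots.txt
rp.read()                                       # fetch and parse the file

# Check whether a given user agent may fetch a given path
allowed = rp.can_fetch("ExampleBot/1.0", "https://example.com/private/page")
print(allowed)

# mtime() reports when robots.txt was last fetched (useful for re-checking)
print(rp.mtime())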