Server: Apache
System: Linux pdx1-shared-a1-38 6.6.104-grsec-jammy+ #3 SMP Tue Sep 16 00:28:11 UTC 2025 x86_64
User: mmickelson (3396398)
PHP: 8.1.31
Disabled: NONE
File: //usr/lib/python3/dist-packages/sqlparse/__pycache__/lexer.cpython-310.pyc
# Reconstructed source for sqlparse/lexer.py, recovered from the CPython 3.10
# bytecode in this .pyc (docstrings and error strings are verbatim from the dump).
"""SQL Lexer"""

from io import TextIOBase

from sqlparse import tokens
from sqlparse.keywords import SQL_REGEX
from sqlparse.utils import consume


class Lexer:
    """Lexer
    Empty class. Leaving for backwards-compatibility
    """

    @staticmethod
    def get_tokens(text, encoding=None):
        """
        Return an iterable of (tokentype, value) pairs generated from
        `text`. If `unfiltered` is set to `True`, the filtering mechanism
        is bypassed even if filters are defined.

        Also preprocess the text, i.e. expand tabs and strip it if
        wanted and applies registered filters.

        Split ``text`` into (tokentype, text) pairs.

        ``stack`` is the initial stack (default: ``['root']``)
        """
        if isinstance(text, TextIOBase):
            text = text.read()

        if isinstance(text, str):
            pass
        elif isinstance(text, bytes):
            if encoding:
                text = text.decode(encoding)
            else:
                try:
                    text = text.decode('utf-8')
                except UnicodeDecodeError:
                    text = text.decode('unicode-escape')
        else:
            raise TypeError("Expected text or file-like object, "
                            "got {!r}".format(type(text)))

        iterable = enumerate(text)
        for pos, char in iterable:
            for rexmatch, action in SQL_REGEX:
                m = rexmatch(text, pos)

                if not m:
                    continue
                elif isinstance(action, tokens._TokenType):
                    yield action, m.group()
                elif callable(action):
                    yield action(m.group())

                consume(iterable, m.end() - pos - 1)
                break
            else:
                yield tokens.Error, char


def tokenize(sql, encoding=None):
    """Tokenize sql.

    Tokenize *sql* using the :class:`Lexer` and return a 2-tuple stream
    of ``(token type, value)`` items.
    """
    return Lexer().get_tokens(sql, encoding)
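The compiled module above implements sqlparse's position-by-position regex scan: at each character offset it tries an ordered list of (pattern, action) rules (`SQL_REGEX`), yields a `(tokentype, value)` pair for the first match, skips past the matched text, and emits an `Error` token for any character no rule matches. A minimal, stdlib-only sketch of that technique follows; the rule table here is a hypothetical simplified one for illustration, not sqlparse's actual `SQL_REGEX`.

```python
import re

# Hypothetical, simplified rule table in the spirit of sqlparse's SQL_REGEX:
# ordered (pattern, tokentype) pairs, tried at each position, first match wins.
SQL_RULES = [
    (re.compile(r'\s+'), 'Whitespace'),
    (re.compile(r'(?i)\b(SELECT|FROM|WHERE|AND|OR)\b'), 'Keyword'),
    (re.compile(r'\d+'), 'Number'),
    (re.compile(r"'[^']*'"), 'String'),
    (re.compile(r'[A-Za-z_][A-Za-z0-9_]*'), 'Name'),
    (re.compile(r'[=<>!*+\-/,;().]'), 'Punctuation'),
]


def get_tokens(text):
    """Yield (tokentype, value) pairs by matching rules at each position;
    characters no rule matches come out as ('Error', char)."""
    pos = 0
    while pos < len(text):
        for rex, ttype in SQL_RULES:
            m = rex.match(text, pos)   # anchored match starting at pos
            if m:
                yield ttype, m.group()
                pos = m.end()          # skip past the matched span
                break
        else:
            yield 'Error', text[pos]
            pos += 1


toks = list(get_tokens("SELECT id FROM users WHERE age > 30"))
# first pair: ('Keyword', 'SELECT')
```

The real module advances with `consume(iterable, m.end() - pos - 1)` over an `enumerate` iterator instead of a `pos` counter, but the control flow, rule ordering, and the `for`/`else` fallback to an error token mirror the reconstructed `Lexer.get_tokens` above.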