File: //usr/lib/python3/dist-packages/sqlparse/__pycache__/lexer.cpython-310.pyc
"""SQL Lexer"""
from io import TextIOBase

from sqlparse import tokens
from sqlparse.keywords import SQL_REGEX
from sqlparse.utils import consume


class Lexer:
    """Lexer
    Empty class. Leaving for backwards-compatibility
    """

    @staticmethod
    def get_tokens(text, encoding=None):
        """
        Return an iterable of (tokentype, value) pairs generated from
        `text`. If `unfiltered` is set to `True`, the filtering mechanism
        is bypassed even if filters are defined.
        Also preprocess the text, i.e. expand tabs and strip it if
        wanted and applies registered filters.
        Split ``text`` into (tokentype, text) pairs.
        ``stack`` is the initial stack (default: ``['root']``)
        """
        if isinstance(text, TextIOBase):
            text = text.read()
        if isinstance(text, str):
            pass
        elif isinstance(text, bytes):
            if encoding:
                text = text.decode(encoding)
            else:
                try:
                    text = text.decode('utf-8')
                except UnicodeDecodeError:
                    text = text.decode('unicode-escape')
        else:
            raise TypeError("Expected text or file-like object, got {!r}"
                            .format(type(text)))
        iterable = enumerate(text)
        for pos, char in iterable:
            for rexmatch, action in SQL_REGEX:
                m = rexmatch(text, pos)
                if not m:
                    continue
                elif isinstance(action, tokens._TokenType):
                    yield action, m.group()
                elif callable(action):
                    yield action(m.group())
                consume(iterable, m.end() - pos - 1)  # skip the matched characters
                break
            else:
                yield tokens.Error, char  # no pattern matched this character
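
# --- Usage sketch (illustrative addition, not part of the packaged module) ---
# Assuming sqlparse is installed, Lexer.get_tokens yields (tokentype, value)
# pairs for a SQL string; characters matching no pattern come back as
# tokens.Error. The example SQL string below is arbitrary.
if __name__ == "__main__":
    for ttype, value in Lexer.get_tokens("SELECT id FROM users WHERE id = 1"):
        print(ttype, repr(value))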