odigos / etc / odigos-vmagent / instrumentations / python / packaging / __pycache__ / _tokenizer.cpython-311.pyc

# packaging/_tokenizer.py (source recovered from the compiled bytecode above)
from __future__ import annotations

import contextlib
import re
from dataclasses import dataclass
from typing import Iterator, NoReturn

from .specifiers import Specifier


@dataclass
class Token:
    name: str
    text: str
    position: int


class ParserSyntaxError(Exception):
    """The provided source text could not be parsed correctly."""

    def __init__(self, message: str, *, source: str, span: tuple[int, int]) -> None:
        self.span = span
        self.message = message
        self.source = source
        super().__init__()

    def __str__(self) -> str:
        marker = " " * self.span[0] + "~" * (self.span[1] - self.span[0]) + "^"
        return "\n    ".join([self.message, self.source, marker])


DEFAULT_RULES: dict[str, str | re.Pattern[str]] = {
    "LEFT_PARENTHESIS": r"\(",
    "RIGHT_PARENTHESIS": r"\)",
    "LEFT_BRACKET": r"\[",
    "RIGHT_BRACKET": r"\]",
    "SEMICOLON": r";",
    "COMMA": r",",
    "QUOTED_STRING": re.compile(
        r"""
            (
                ('[^']*')
                |
                ("[^"]*")
            )
        """,
        re.VERBOSE,
    ),
    "OP": r"(===|==|~=|!=|<=|>=|<|>)",
    "BOOLOP": r"\b(or|and)\b",
    "IN": r"\bin\b",
    "NOT": r"\bnot\b",
    "VARIABLE": re.compile(
        r"""
            \b(
                python_version
                |python_full_version
                |os[._]name
                |sys[._]platform
                |platform_(release|system)
                |platform[._](version|machine|python_implementation)
                |python_implementation
                |implementation_(name|version)
                |extra
            )\b
        """,
        re.VERBOSE,
    ),
    "SPECIFIER": re.compile(
        Specifier._operator_regex_str + Specifier._version_regex_str,
        re.VERBOSE | re.IGNORECASE,
    ),
    "AT": r"\@",
    "URL": r"[^ \t]+",
    "IDENTIFIER": r"\b[a-zA-Z0-9][a-zA-Z0-9._-]*\b",
    "VERSION_PREFIX_TRAIL": r"\.\*",
    "VERSION_LOCAL_LABEL_TRAIL": r"\+[a-z0-9]+(?:[-_\.][a-z0-9]+)*",
    "WS": r"[ \t]+",
    "END": r"$",
}


class Tokenizer:
    """Context-sensitive token parsing.

    Provides methods to examine the input stream to check whether the next token
    matches.
    """

    def __init__(self, source: str, *, rules: dict[str, str | re.Pattern[str]]) -> None:
        self.source = source
        self.rules: dict[str, re.Pattern[str]] = {
            name: re.compile(pattern) for name, pattern in rules.items()
        }
        self.next_token: Token | None = None
        self.position = 0

    def consume(self, name: str) -> None:
        """Move beyond provided token name, if at current position."""
        if self.check(name):
            self.read()

    def check(self, name: str, *, peek: bool = False) -> bool:
        """Check whether the next token has the provided name.

        By default, if the check succeeds, the token *must* be read before
        another check. If `peek` is set to `True`, the token is not loaded and
        would need to be checked again.
        """
        assert (
            self.next_token is None
        ), f"Cannot check for {name!r}, already have {self.next_token!r}"
        assert name in self.rules, f"Unknown token name: {name!r}"

        expression = self.rules[name]
        match = expression.match(self.source, self.position)
        if match is None:
            return False
        if not peek:
            self.next_token = Token(name, match[0], self.position)
        return True

    def expect(self, name: str, *, expected: str) -> Token:
        """Expect a certain token name next, failing with a syntax error otherwise.

        The token is *not* read.
        """
        if not self.check(name):
            self.raise_syntax_error(f"Expected {expected}")
        return self.read()

    def read(self) -> Token:
        """Consume the next token and return it."""
        token = self.next_token
        assert token is not None

        self.position += len(token.text)
        self.next_token = None
        return token

    def raise_syntax_error(
        self,
        message: str,
        *,
        span_start: int | None = None,
        span_end: int | None = None,
    ) -> NoReturn:
        """Raise ParserSyntaxError at the given position."""
        span = (
            self.position if span_start is None else span_start,
            self.position if span_end is None else span_end,
        )
        raise ParserSyntaxError(
            message,
            source=self.source,
            span=span,
        )

    @contextlib.contextmanager
    def enclosing_tokens(
        self, open_token: str, close_token: str, *, around: str
    ) -> Iterator[None]:
        if self.check(open_token):
            open_position = self.position
            self.read()
        else:
            open_position = None

        yield

        if open_position is None:
            return

        if not self.check(close_token):
            self.raise_syntax_error(
                f"Expected matching {close_token} for {open_token}, after {around}",
                span_start=open_position,
            )

        self.read()
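The check()/read() protocol above — try each named rule's regex at the current position, and advance past the matched text when a token is taken — can be sketched standalone. This is not the packaging module itself: `RULES` and `tokenize` below are a hypothetical, reduced reimplementation for illustration only.

```python
import re

# Reduced rule set mirroring a few entries of DEFAULT_RULES (illustrative only).
RULES = {
    "VARIABLE": re.compile(r"\b(python_version|extra)\b"),
    "OP": re.compile(r"(===|==|~=|!=|<=|>=|<|>)"),
    "QUOTED_STRING": re.compile(r"('[^']*')|(\"[^\"]*\")"),
    "WS": re.compile(r"[ \t]+"),
}


def tokenize(source: str) -> list[tuple[str, str]]:
    """Greedy first-match scan, analogous to repeated check()/read() calls."""
    position = 0
    out: list[tuple[str, str]] = []
    while position < len(source):
        for name, pattern in RULES.items():
            # check(): anchor the rule's regex at the current position.
            match = pattern.match(source, position)
            if match and match[0]:
                if name != "WS":  # whitespace tokens are matched but dropped
                    out.append((name, match[0]))
                position += len(match[0])  # read(): advance past the token text
                break
        else:
            raise SyntaxError(f"Unexpected input at position {position}")
    return out


print(tokenize("python_version >= '3.8'"))
# → [('VARIABLE', 'python_version'), ('OP', '>='), ('QUOTED_STRING', "'3.8'")]
```

The real Tokenizer differs in that it is context-sensitive: callers choose which single rule to try next, rather than scanning all rules in order.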
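The caret-marker arithmetic in `ParserSyntaxError.__str__` can also be shown in isolation: pad to the span start, draw `~` across the span, and finish with `^`. Here `render_error` is a hypothetical standalone helper mirroring that method, not part of the module.

```python
def render_error(message: str, source: str, span: tuple[int, int]) -> str:
    # Same layout as ParserSyntaxError.__str__: message, then the source line,
    # then a marker underlining span[0]..span[1] and ending in a caret.
    marker = " " * span[0] + "~" * (span[1] - span[0]) + "^"
    return "\n    ".join([message, source, marker])


print(render_error("Expected closing quote", "extra == 'dev", (9, 13)))
```

which prints the message, the offending source, and a `~~~~^` marker positioned under the unterminated string.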