run update

commit c6f20c5f92 (parent e85b9478a7)

515 changed files with 22459 additions and 12734 deletions
@@ -1,137 +0,0 @@
SQLAlchemy
==========

The Python SQL Toolkit and Object Relational Mapper

Introduction
-------------

SQLAlchemy is the Python SQL toolkit and Object Relational Mapper
that gives application developers the full power and
flexibility of SQL. SQLAlchemy provides a full suite
of well-known enterprise-level persistence patterns,
designed for efficient and high-performing database
access, adapted into a simple and Pythonic domain
language.
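The toolkit in a minimal sketch — an in-memory SQLite database and a hypothetical `users` table (table and column names here are invented for illustration):

```python
from sqlalchemy import create_engine, Column, Integer, MetaData, String, Table

# In-memory SQLite database; any supported database URL works the same way.
engine = create_engine("sqlite://")

metadata = MetaData()
users = Table(
    "users", metadata,
    Column("id", Integer, primary_key=True),
    Column("name", String(50)),
)
metadata.create_all(engine)  # emits CREATE TABLE statements

with engine.connect() as conn:
    conn.execute(users.insert(), {"id": 1, "name": "ed"})
    rows = conn.execute(users.select()).fetchall()

print(rows)  # [(1, 'ed')]
```

Swapping the URL for another backend changes the dialect and DBAPI in use, not the application code.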

Major SQLAlchemy features include:

* An industrial-strength ORM, built
  from the core on the identity map, unit of work,
  and data mapper patterns. These patterns
  allow transparent persistence of objects
  using a declarative configuration system.
  Domain models
  can be constructed and manipulated naturally,
  and changes are synchronized with the
  current transaction automatically.

* A relationally-oriented query system, exposing
  the full range of SQL's capabilities
  explicitly, including joins, subqueries,
  correlation, and almost everything else,
  in terms of the object model.
  Writing queries with the ORM uses the same
  techniques of relational composition you use
  when writing SQL. While you can drop into
  literal SQL at any time, it's virtually never
  needed.

* A comprehensive and flexible system
  of eager loading for related collections and objects.
  Collections are cached within a session,
  and can be loaded on individual access, all
  at once using joins, or by query per collection
  across the full result set.

* A Core SQL construction system and DBAPI
  interaction layer. The SQLAlchemy Core is
  separate from the ORM and is a full database
  abstraction layer in its own right; it includes
  an extensible Python-based SQL expression
  language, schema metadata, connection pooling,
  type coercion, and custom types.

* All primary and foreign key constraints are
  assumed to be composite and natural. Surrogate
  integer primary keys are of course still the
  norm, but SQLAlchemy never assumes or hardcodes
  to this model.

* Database introspection and generation. Database
  schemas can be "reflected" in one step into
  Python structures representing database metadata;
  those same structures can then generate
  CREATE statements right back out - all within
  the Core, independent of the ORM.
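The first three features above can be sketched together — declarative configuration, unit-of-work persistence, and eager loading — using invented `User`/`Address` models against in-memory SQLite. The example uses the 1.0-era `declarative_base()` import from `sqlalchemy.ext.declarative` (later releases move it to `sqlalchemy.orm`):

```python
from sqlalchemy import Column, ForeignKey, Integer, String, create_engine
from sqlalchemy.ext.declarative import declarative_base
from sqlalchemy.orm import joinedload, relationship, sessionmaker

Base = declarative_base()

class User(Base):
    __tablename__ = "users"
    id = Column(Integer, primary_key=True)
    name = Column(String(50))
    addresses = relationship("Address", backref="user")

class Address(Base):
    __tablename__ = "addresses"
    id = Column(Integer, primary_key=True)
    email = Column(String(100))
    user_id = Column(Integer, ForeignKey("users.id"))

engine = create_engine("sqlite://")
Base.metadata.create_all(engine)
session = sessionmaker(bind=engine)()

# The unit of work tracks changes; nothing is persisted until commit().
session.add(User(name="ed", addresses=[Address(email="ed@example.com")]))
session.commit()

# Eager-load the related collection with a JOIN rather than one lazy
# load per attribute access.
user = (
    session.query(User)
    .options(joinedload(User.addresses))
    .filter(User.name == "ed")
    .one()
)
print(user.name, [a.email for a in user.addresses])
```

Because of the identity map, loading the same row again within this session returns the same `User` object rather than a fresh copy.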

SQLAlchemy's philosophy:

* SQL databases behave less and less like object
  collections the more size and performance start to
  matter; object collections behave less and less like
  tables and rows the more abstraction starts to matter.
  SQLAlchemy aims to accommodate both of these
  principles.

* An ORM doesn't need to hide the "R". A relational
  database provides rich, set-based functionality
  that should be fully exposed. SQLAlchemy's
  ORM provides an open-ended set of patterns
  that allow a developer to construct a custom
  mediation layer between a domain model and
  a relational schema, turning the so-called
  "object relational impedance" issue into
  a distant memory.

* The developer, in all cases, makes all decisions
  regarding the design, structure, and naming conventions
  of both the object model and the relational
  schema. SQLAlchemy only provides the means
  to automate the execution of these decisions.

* With SQLAlchemy, there's no such thing as
  "the ORM generated a bad query" - you
  retain full control over the structure of
  queries, including how joins are organized,
  how subqueries and correlation are used, and which
  columns are requested. Everything SQLAlchemy
  does is ultimately the result of a
  developer-initiated decision.

* Don't use an ORM if the problem doesn't need one.
  SQLAlchemy consists of a Core and separate ORM
  component. The Core offers a full SQL expression
  language that allows Pythonic construction
  of SQL constructs that render directly to SQL
  strings for a target database, returning
  result sets that are essentially enhanced DBAPI
  cursors.

* Transactions should be the norm. With SQLAlchemy's
  ORM, nothing goes to permanent storage until
  commit() is called. SQLAlchemy encourages applications
  to create a consistent means of delineating
  the start and end of a series of operations.

* Never render a literal value in a SQL statement.
  Bound parameters are used to the greatest degree
  possible, allowing query optimizers to cache
  query plans effectively and making SQL injection
  attacks a non-issue.
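The bound-parameter point can be seen directly by compiling a Core expression: the literal value never appears in the statement text (the `users` table below is an invented example):

```python
from sqlalchemy import Column, Integer, MetaData, String, Table

metadata = MetaData()
users = Table(
    "users", metadata,
    Column("id", Integer, primary_key=True),
    Column("name", String(50)),
)

stmt = users.select().where(users.c.name == "ed")
compiled = stmt.compile()

# The SQL text carries a placeholder, not the literal 'ed' ...
print(compiled)
# SELECT users.id, users.name
# FROM users
# WHERE users.name = :name_1

# ... while the value travels separately as a bound parameter.
print(compiled.params)  # {'name_1': 'ed'}
```

At execution time the dialect converts `:name_1` to the DBAPI's own paramstyle (`?`, `%s`, etc.) and passes the value out-of-band.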

Documentation
-------------

Latest documentation is at:

http://www.sqlalchemy.org/docs/

Installation / Requirements
---------------------------

Full documentation for installation is at
`Installation <http://www.sqlalchemy.org/docs/intro.html#installation>`_.

Getting Help / Development / Bug reporting
------------------------------------------

Please refer to the `SQLAlchemy Community Guide <http://www.sqlalchemy.org/support.html>`_.

License
-------

SQLAlchemy is distributed under the `MIT license
<http://www.opensource.org/licenses/mit-license.php>`_.
@@ -1,158 +0,0 @@
Metadata-Version: 2.0
Name: SQLAlchemy
Version: 1.0.12
Summary: Database Abstraction Library
Home-page: http://www.sqlalchemy.org
Author: Mike Bayer
Author-email: mike_mp@zzzcomputing.com
License: MIT License
Description-Content-Type: UNKNOWN
Platform: UNKNOWN
Classifier: Development Status :: 5 - Production/Stable
Classifier: Intended Audience :: Developers
Classifier: License :: OSI Approved :: MIT License
Classifier: Programming Language :: Python
Classifier: Programming Language :: Python :: 3
Classifier: Programming Language :: Python :: Implementation :: CPython
Classifier: Programming Language :: Python :: Implementation :: Jython
Classifier: Programming Language :: Python :: Implementation :: PyPy
Classifier: Topic :: Database :: Front-Ends
Classifier: Operating System :: OS Independent
@@ -1,376 +0,0 @@
|
||||||
SQLAlchemy-1.0.12.dist-info/DESCRIPTION.rst,sha256=ZN8fj2owI_rw0Emr3_RXqoNfTFkThjiZy7xcCzg1W_g,5013
|
|
||||||
SQLAlchemy-1.0.12.dist-info/METADATA,sha256=xCBLJSNub29eg_Bm-fHTUT_al-Sr8jh38ztUF4_s1so,5820
|
|
||||||
SQLAlchemy-1.0.12.dist-info/RECORD,,
|
|
||||||
SQLAlchemy-1.0.12.dist-info/WHEEL,sha256=AEztX7vHDtcgysb-4-5-DyIKMLIPg6NMxY9dXTRdoXQ,104
|
|
||||||
SQLAlchemy-1.0.12.dist-info/metadata.json,sha256=QT7EcApgL9QrRqR1YIngngveBNd13H8h-oNK9fsxj0U,1004
|
|
||||||
SQLAlchemy-1.0.12.dist-info/top_level.txt,sha256=rp-ZgB7D8G11ivXON5VGPjupT1voYmWqkciDt5Uaw_Q,11
|
|
||||||
sqlalchemy/__init__.py,sha256=fTurvwmGkoRt_zdwxoZNWTHg6VdzvBpeHyPmUnexOK4,2112
|
|
||||||
sqlalchemy/cprocessors.cpython-34m.so,sha256=hvG3A0r4VO9gevdsLGZYRdqNfG2rahDIFUqJ-fUxAB4,52136
|
|
||||||
sqlalchemy/cresultproxy.cpython-34m.so,sha256=piAFu3JE3mOaKpNSg6vcu8jGTl_-X6elUDWS2h_YOfQ,61504
|
|
||||||
sqlalchemy/cutils.cpython-34m.so,sha256=-ARQsTXx0XDzghnRNCwdaxm2eeIn2TuEqoU_Wb18h6E,34312
|
|
||||||
sqlalchemy/events.py,sha256=j8yref-XfuJxkPKbvnZmB4jeUAIujPcbLAzD2cKV4f4,43944
|
|
||||||
sqlalchemy/exc.py,sha256=NhA5R5nDdducWkp0MXtlQ0-Q6iF_rhqkHWblIfuSYGk,11706
|
|
||||||
sqlalchemy/inspection.py,sha256=zMa-2nt-OQ0Op1dqq0Z2XCnpdAFSTkqif5Kdi8Wz8AU,3093
|
|
||||||
sqlalchemy/interfaces.py,sha256=XSx5y-HittAzc79lU4C7rPbTtSW_Hc2c89NqCy50tsQ,10967
|
|
||||||
sqlalchemy/log.py,sha256=opX7UORq5N6_jWxN9aHX9OpiirwAcRA0qq-u5m4SMkQ,6712
|
|
||||||
sqlalchemy/pool.py,sha256=-F51TIJYl0XGTV2_sdpV8C1m0jTTQaq0nAezdmSgr84,47220
|
|
||||||
sqlalchemy/processors.py,sha256=Li1kdC-I0v03JxeOz4V7u4HAevK6LledyCPvaL06mYc,5220
|
|
||||||
sqlalchemy/schema.py,sha256=rZzZJJ8dT9trLSYknFpHm0N1kRERYwhqHH3QD31SJjc,1182
|
|
||||||
sqlalchemy/types.py,sha256=qcoy5xKaurDV4kaXr489GL2sz8FKkWX21Us3ZCqeasg,1650
|
|
||||||
sqlalchemy/connectors/__init__.py,sha256=97YbriYu5mcljh7opc1JOScRlf3Tk8ldbn5urBVm4WY,278
|
|
||||||
sqlalchemy/connectors/mxodbc.py,sha256=-0iqw2k8e-o3OkAKzoCWuAaEPxlEjslvfRM9hnVXENM,5348
|
|
||||||
sqlalchemy/connectors/pyodbc.py,sha256=pG2yf3cEDtTr-w_m4to6jF5l8hZk6MJv69K3cg84NfY,6264
|
|
||||||
sqlalchemy/connectors/zxJDBC.py,sha256=2KK_sVSgMsdW0ufZqAwgXjd1FsMb4hqbiUQRAkM0RYg,1868
|
|
||||||
sqlalchemy/databases/__init__.py,sha256=BaQyAuMjXNpZYV47hCseHrDtPzTfSw-iqUQYxMWJddw,817
|
|
||||||
sqlalchemy/dialects/__init__.py,sha256=7SMul8PL3gkbJRUwAwovHLae5qBBApRF-VcRwU-VtdU,1012
|
|
||||||
sqlalchemy/dialects/postgres.py,sha256=heNVHys6E91DIBepXT3ls_4_6N8HTTahrZ49W5IR3M0,614
|
|
||||||
sqlalchemy/dialects/firebird/__init__.py,sha256=QYmQ0SaGfq3YjDraCV9ALwqVW5A3KDUF0F6air_qp3Q,664
|
|
||||||
sqlalchemy/dialects/firebird/base.py,sha256=IT0prWkh1TFSTke-BqGdVMGdof53zmWWk6zbJZ_TuuI,28170
|
|
||||||
sqlalchemy/dialects/firebird/fdb.py,sha256=l4s6_8Z0HvqxgqGz0LNcKWP1qUmEc3M2XM718_drN34,4325
|
|
||||||
sqlalchemy/dialects/firebird/kinterbasdb.py,sha256=kCsn2ed4u9fyjcyfEI3rXQdKvL05z9wtf5YjW9-NrvI,6299
|
|
||||||
sqlalchemy/dialects/mssql/__init__.py,sha256=G12xmirGZgMzfUKZCA8BFfaCmqUDuYca9Fu2VP_eaks,1081
|
|
||||||
sqlalchemy/dialects/mssql/adodbapi.py,sha256=dHZgS3pEDX39ixhlDfTtDcjCq6rdjF85VS7rIZ1TfYo,2493
|
|
||||||
sqlalchemy/dialects/mssql/base.py,sha256=xqRmK_npoyH5gl626EjazVnu9TEArmrBIFme_avYFUg,66855
|
|
||||||
sqlalchemy/dialects/mssql/information_schema.py,sha256=pwuTsgOCY5eSBW9w-g-pyJDRfyuZ_rOEXXNYRuAroCE,6418
|
|
||||||
sqlalchemy/dialects/mssql/mxodbc.py,sha256=G9LypIeEizgxeShtDu2M7Vwm8NopnzaTmnZMD49mYeg,3856
|
|
||||||
sqlalchemy/dialects/mssql/pymssql.py,sha256=w92w4YQzXdHb53AjCrBcIRHsf6jmie1iN9H7gJNGX4k,3079
|
|
||||||
sqlalchemy/dialects/mssql/pyodbc.py,sha256=KRke1Hizrg3r5iYqxdBI0axXVQ_pZR_UPxLaAdF0mKk,9473
|
|
||||||
sqlalchemy/dialects/mssql/zxjdbc.py,sha256=u4uBgwk0LbI7_I5CIvM3C4bBb0pmrw2_DqRh_ehJTkI,2282
|
|
||||||
sqlalchemy/dialects/mysql/__init__.py,sha256=3cQ2juPT8LsZTicPa2J-0rCQjQIQaPgyBzxjV3O_7xs,1171
|
|
||||||
sqlalchemy/dialects/mysql/base.py,sha256=rwC8fnhGZaAnsPB1Jhg4sTcrWE2hjxrZJ5deCS0rAOc,122869
|
|
||||||
sqlalchemy/dialects/mysql/cymysql.py,sha256=nqsdQA8LBLIc6eilgX6qwkjm7szsUoqMTVYwK9kkfsE,2349
|
|
||||||
sqlalchemy/dialects/mysql/gaerdbms.py,sha256=2MxtTsIqlpq_J32HHqDzz-5vu-mC51Lb7PvyGkJa73M,3387
|
|
||||||
sqlalchemy/dialects/mysql/mysqlconnector.py,sha256=DMDm684Shk-ijVo7w-yidopYw7EC6EiOmJY56EPawok,5323
|
|
||||||
sqlalchemy/dialects/mysql/mysqldb.py,sha256=McqROngxAknbLOXoUAG9o9mP9FQBLs-ouD-JqqI2Ses,6564
|
|
||||||
sqlalchemy/dialects/mysql/oursql.py,sha256=rmdr-r66iJ2amqFeGvCohvE8WCl_i6R9KcgVG0uXOQs,8124
|
|
||||||
sqlalchemy/dialects/mysql/pymysql.py,sha256=e-qehI-sASmAjEa0ajHqjZjlyJYWsb3RPQY4iBR5pz0,1504
|
|
||||||
sqlalchemy/dialects/mysql/pyodbc.py,sha256=Ze9IOKw6ANVQj25IlmSGR8aaJhM0pMuRtbzKF7UsZCY,2665
|
|
||||||
sqlalchemy/dialects/mysql/zxjdbc.py,sha256=LIhe2mHSRVgi8I7qmiTMVBRSpuWJVnuDtpHTUivIx0M,3942
|
|
||||||
sqlalchemy/dialects/oracle/__init__.py,sha256=UhF2ZyPfT3EFAnP8ZjGng6GnWSzmAkjMax0Lucpn0Bg,797
|
|
||||||
sqlalchemy/dialects/oracle/base.py,sha256=2KJO-sU2CVKK1rij6bAQ5ZFJv203_NmzT8dE5qor9wc,55961
|
|
||||||
sqlalchemy/dialects/oracle/cx_oracle.py,sha256=-d5tHbNcCyjbgVtAvWfHgSY2yA8C9bvCzxhwkdWFNe0,38635
|
|
||||||
sqlalchemy/dialects/oracle/zxjdbc.py,sha256=nC7XOCY3NdTLrEyIacNTnLDCaeVjWn59q8UYssJL8Wo,8112
|
|
||||||
sqlalchemy/dialects/postgresql/__init__.py,sha256=SjCtM5b3EaGyRaTyg_i82sh_qjkLEIVUXW91XDihiCM,1299
|
|
||||||
sqlalchemy/dialects/postgresql/base.py,sha256=xhdLeHuWioTv9LYW41pcIPsEjD2fyeh7JflkLKmZMB8,104230
|
|
||||||
sqlalchemy/dialects/postgresql/constraints.py,sha256=8UDx_2TNQgqIUSRETZPhgninJigQ6rMfdRNI6vIt3Is,3119
|
|
||||||
sqlalchemy/dialects/postgresql/hstore.py,sha256=n8Wsd7Uldk3bbg66tTa0NKjVqjhJUbF1mVeUsM7keXA,11402
|
|
||||||
sqlalchemy/dialects/postgresql/json.py,sha256=MTlIGinMDa8iaVbZMOzYnremo0xL4tn2wyGTPwnvX6U,12215
|
|
||||||
sqlalchemy/dialects/postgresql/pg8000.py,sha256=x6o3P8Ad0wKsuF9qeyip39BKc5ORJZ4nWxv-8qOdj0E,8375
|
|
||||||
sqlalchemy/dialects/postgresql/psycopg2.py,sha256=4ac0upErNRJz6YWJYNbATCU3ncWFvat5kal_Cuq-Jhw,26953
|
|
||||||
sqlalchemy/dialects/postgresql/psycopg2cffi.py,sha256=8R3POkJH8z8a2DxwKNmfmQOsxFqsg4tU_OnjGj3OfDA,1651
|
|
||||||
sqlalchemy/dialects/postgresql/pypostgresql.py,sha256=raQRfZb8T9-c-jmq1w86Wci5QyiXgf_9_71OInT_sAw,2655
|
|
||||||
sqlalchemy/dialects/postgresql/ranges.py,sha256=MihdGXMdmCM6ToIlrj7OJx9Qh_8BX8bv5PSaAepHmII,4814
|
|
||||||
sqlalchemy/dialects/postgresql/zxjdbc.py,sha256=AhEGRiAy8q-GM0BStFcsLBgSwjxHkkwy2-BSroIoADo,1397
|
|
||||||
sqlalchemy/dialects/sqlite/__init__.py,sha256=0wW0VOhE_RtFDpRcbwvvo3XtD6Y2-SDgG4K7468eh_w,736
|
|
||||||
sqlalchemy/dialects/sqlite/base.py,sha256=_L9-854ITf8Fl2BgUymF9fKjDFvXSo7Pb2yuz1CMkDo,55007
|
|
||||||
sqlalchemy/dialects/sqlite/pysqlcipher.py,sha256=sgXCqn8ZtNIeTDwyo253Kj5mn4TPlIW3AZCNNmURi2A,4129
|
|
||||||
sqlalchemy/dialects/sqlite/pysqlite.py,sha256=G-Cg-iI-ErYsVjOH4UlQTEY9pLnLOLV89ik8q0-reuY,14980
|
|
||||||
sqlalchemy/dialects/sybase/__init__.py,sha256=gwCgFR_C_hoj0Re7PiaW3zmKSWaLpsd96UVXdM7EnTM,894
|
|
||||||
sqlalchemy/dialects/sybase/base.py,sha256=Xpl3vEd5VDyvoIRMg0DZa48Or--yBSrhaZ2CbTSCt0w,28853
|
|
||||||
sqlalchemy/dialects/sybase/mxodbc.py,sha256=E_ask6yFSjyhNPvv7gQsvA41WmyxbBvRGWjCyPVr9Gs,901
|
|
||||||
sqlalchemy/dialects/sybase/pyodbc.py,sha256=0a_gKwrIweJGcz3ZRYuQZb5BIvwjGmFEYBo9wGk66kI,2102
|
|
||||||
sqlalchemy/dialects/sybase/pysybase.py,sha256=tu2V_EbtgxWYOvt-ybo5_lLiBQzsIFaAtF8e7S1_-rk,3208
|
|
||||||
sqlalchemy/engine/__init__.py,sha256=fyIFw2R5wfLQzSbfE9Jz-28ZDP5RyB-5elNH92uTZYM,18803
|
|
||||||
sqlalchemy/engine/base.py,sha256=cRqbbG0QuUG-NGs3GOPVQsU0WLsw5bLT0Y07Yf8OOfU,79399
|
|
||||||
sqlalchemy/engine/default.py,sha256=U_yaliCazUHp6cfk_NVzhB4F_zOJSyy959rHyk40J4M,36548
|
|
||||||
sqlalchemy/engine/interfaces.py,sha256=CmPYM_oDp1zAPH13sKmufO4Tuha6KA-fXRQq-K_3YTE,35908
|
|
||||||
sqlalchemy/engine/reflection.py,sha256=jly5YN-cyjoBDxHs9qO6Mlgm1OZSb2NBNFALwZMEGxE,28590
|
|
||||||
sqlalchemy/engine/result.py,sha256=ot5RQxa6kjoScXRUR-DTl0iJJISBhmyNTj1JZkZiNsk,44027
|
|
||||||
sqlalchemy/engine/strategies.py,sha256=mwy-CTrnXzyaIA1TRQBQ_Z2O8wN0lnTNZwDefEWCR9A,8929
|
|
||||||
sqlalchemy/engine/threadlocal.py,sha256=y4wOLjtbeY-dvp2GcJDtos6F2jzfP11JVAaSFwZ0zRM,4191
|
|
||||||
sqlalchemy/engine/url.py,sha256=ZhS_Iqiu6V1kfIM2pcv3ud9fOPXkFOHBv8wiLOqbJhc,8228
|
|
||||||
sqlalchemy/engine/util.py,sha256=Tvb9sIkyd6qOwIA-RsBmo5j877UXa5x-jQmhqnhHWRA,2338
|
|
||||||
sqlalchemy/event/__init__.py,sha256=KnUVp-NVX6k276ntGffxgkjVmIWR22FSlzrbAKqQ6S4,419
|
|
||||||
sqlalchemy/event/api.py,sha256=O2udbj5D7HdXcvsGBQk6-dK9CAFfePTypWOrUdqmhYY,5990
|
|
||||||
sqlalchemy/event/attr.py,sha256=VfRJJl4RD24mQaIoDwArWL2hsGOX6ISSU6vKusVMNO0,12053
|
|
||||||
sqlalchemy/event/base.py,sha256=DWDKZV19fFsLavu2cXOxXV8NhO3XuCbKcKamBKyXuME,9540
|
|
||||||
sqlalchemy/event/legacy.py,sha256=ACnVeBUt8uwVfh1GNRu22cWCADC3CWZdrsBKzAd6UQQ,5814
|
|
||||||
sqlalchemy/event/registry.py,sha256=13wx1qdEmcQeCoAmgf_WQEMuR43h3v7iyd2Re54QdOE,7786
|
|
||||||
sqlalchemy/ext/__init__.py,sha256=smCZIGgjJprT4ddhuYSLZ8PrTn4NdXPP3j03a038SdE,322
|
|
||||||
sqlalchemy/ext/associationproxy.py,sha256=y61Y4UIZNBit5lqk2WzdHTCXIWRrBg3hHbRVsqXjnqE,33422
|
|
||||||
sqlalchemy/ext/automap.py,sha256=Aet-3zk2vbsJVLqigwZJYau0hB1D6Y21K65QVWeB5pc,41567
|
|
||||||
sqlalchemy/ext/baked.py,sha256=BnVaB4pkQxHk-Fyz4nUw225vCxO_zrDuVC6t5cSF9x8,16967
|
|
||||||
sqlalchemy/ext/compiler.py,sha256=aSSlySoTsqN-JkACWFIhv3pq2CuZwxKm6pSDfQoc10Q,16257
|
|
||||||
sqlalchemy/ext/horizontal_shard.py,sha256=XEBYIfs0YrTt_2vRuaBY6C33ZOZMUHQb2E4X2s3Szns,4814
|
|
||||||
sqlalchemy/ext/hybrid.py,sha256=wNXvuYEEmKy-Nc6z7fu1c2gNWCMOiQA0N14Y3FCq5lo,27989
|
|
||||||
sqlalchemy/ext/instrumentation.py,sha256=HRgNiuYJ90_uSKC1iDwsEl8_KXscMQkEb9KeElk-yLE,14856
|
|
||||||
sqlalchemy/ext/mutable.py,sha256=lx7b_ewFVe7O6I4gTXdi9M6C6TqxWCFiViqCM2VwUac,25444
|
|
||||||
sqlalchemy/ext/orderinglist.py,sha256=UCkuZxTWAQ0num-b5oNm8zNJAmVuIFcbFXt5e7JPx-U,13816
|
|
||||||
sqlalchemy/ext/serializer.py,sha256=fK3N1miYF16PSIZDjLFS2zI7y-scZ9qtmopXIfzPqrA,5586
|
|
||||||
sqlalchemy/ext/declarative/__init__.py,sha256=Jpwf2EukqwNe4RzDfCmX1p-hQ6pPhJEIL_xunaER3tw,756
|
|
||||||
sqlalchemy/ext/declarative/api.py,sha256=PdoO_jh50TWaMvXqnjNh-vX42VqB75ZyliluilphvsU,23317
|
|
||||||
sqlalchemy/ext/declarative/base.py,sha256=96SJBOfxpTMsU2jAHrvuXbsjUUJ7TvbLm11R8Hy2Irc,25231
|
|
||||||
sqlalchemy/ext/declarative/clsregistry.py,sha256=jaLLSr-66XvLnA1Z9kxjKatH_XHxWchqEXMKwvjKAXk,10817
|
|
||||||
sqlalchemy/orm/__init__.py,sha256=UzDockQEVMaWvr-FE4y1rptrMb5uX5k8v_UNQs82qFY,8033
|
|
||||||
sqlalchemy/orm/attributes.py,sha256=OmXkppJEZxRGc0acZZZkSbUhdfDl8ry3Skmvzl3OtLQ,56510
|
|
||||||
sqlalchemy/orm/base.py,sha256=F0aRZGK2_1F8phwBHnVYaChkAb-nnTRoFE1VKSvmAwA,14634
|
|
||||||
sqlalchemy/orm/collections.py,sha256=TFutWIn_c07DI48FDOKMsFMnAoQB3BG2FnEMGzEF3iI,53549
|
|
||||||
sqlalchemy/orm/dependency.py,sha256=phB8nS1788FSd4dWa2j9d4uj6QFlRL7nzcXvh3Bb7Zo,46192
|
|
||||||
sqlalchemy/orm/deprecated_interfaces.py,sha256=A63t6ivbZB3Wq8vWgL8I05uTRR6whcWnIPkquuTIPXU,18254
|
|
||||||
sqlalchemy/orm/descriptor_props.py,sha256=uk5r77w1VUWVgn0bkgOItkAlMh9FRgeT6OCgOHz3_bM,25141
|
|
||||||
sqlalchemy/orm/dynamic.py,sha256=I_YP7X-H9HLjeFHmYgsOas6JPdqg0Aqe0kaltt4HVzA,13283
|
|
||||||
sqlalchemy/orm/evaluator.py,sha256=Hozggsd_Fi0YyqHrr9-tldtOA9NLX0MVBF4e2vSM6GY,4731
|
|
||||||
sqlalchemy/orm/events.py,sha256=yRaoXlBL78b3l11itTrAy42UhLu42-7cgXKCFUGNXSg,69410
|
|
||||||
sqlalchemy/orm/exc.py,sha256=P5lxi5RMFokiHL136VBK0AP3UmAlJcSDHtzgo-M6Kgs,5439
|
|
||||||
sqlalchemy/orm/identity.py,sha256=zsb8xOZaPYKvs4sGhyxW21mILQDrtdSuzD4sTyeKdJs,9021
|
|
||||||
sqlalchemy/orm/instrumentation.py,sha256=xtq9soM3mpMws7xqNJIFYXqKw65p2nnxCTfmMpuvpeI,17510
|
|
||||||
sqlalchemy/orm/interfaces.py,sha256=AqitvZ_BBkB6L503uhdH55nxHplleJ2kQMwM7xKq9Sc,21552
|
|
||||||
sqlalchemy/orm/loading.py,sha256=cjC8DQ5g8_rMxroYrYHfW5s35Z5OFSNBUu0-LpxW7hI,22878
|
|
||||||
sqlalchemy/orm/mapper.py,sha256=sfooeslzwWAKN7WNIQoZ2Y3u_mCyIxd0tebp4yEUu8k,115074
|
|
||||||
sqlalchemy/orm/path_registry.py,sha256=8Pah0P8yPVUyRjoET7DvIMGtM5PC8HZJC4GtxAyqVAs,8370
|
|
||||||
sqlalchemy/orm/persistence.py,sha256=WzUUNm1UGm5mGxbv94hLTQowEDNoXfU1VoyGnoKeN_g,51028
|
|
||||||
sqlalchemy/orm/properties.py,sha256=HR3eoY3Ze3FUPPNCXM_FruWz4pEMWrGlqtCGiK2G1qE,10426
|
|
||||||
sqlalchemy/orm/query.py,sha256=2q2XprzbZhIlAbs0vihIr9dgqfJtcbrjNewgE9q26gE,147616
|
|
||||||
sqlalchemy/orm/relationships.py,sha256=79LRGGz8MxsKsAlv0vuZ6MYZXzDXXtfiOCZg-IQ9hiU,116992
|
|
||||||
sqlalchemy/orm/scoping.py,sha256=Ao-K4iqg4pBp7Si5JOAlro5zUL_r500TC3lVLcFMLDs,6421
|
|
||||||
sqlalchemy/orm/session.py,sha256=yctpvCsLUcFv9Sy8keT1SElZ2VH5DNScYtO7Z77ptYI,111314
|
|
||||||
sqlalchemy/orm/state.py,sha256=4LwwftOtPQldH12SKZV2UFgzqPOCj40QfQ08knZs0_E,22984
|
|
||||||
sqlalchemy/orm/strategies.py,sha256=rdLEs2pPrF8nqcQqezyG-fGdmE11r22fUva4ES3KGOE,58529
|
|
||||||
sqlalchemy/orm/strategy_options.py,sha256=_z7ZblWCnXh8bZpGSOXDoUwtdUqnXdCaWfKXYDgCuH0,34973
|
|
||||||
sqlalchemy/orm/sync.py,sha256=B-d-H1Gzw1TkflpvgJeQghwTzqObzhZCQdvEdSPyDeE,5451
|
|
||||||
sqlalchemy/orm/unitofwork.py,sha256=EQvZ7RZ-u5wJT51BWTeMJJi-tt22YRnmqywGUCn0Qrc,23343
|
|
||||||
sqlalchemy/orm/util.py,sha256=Mj3NXDd8Mwp4O5Vr5zvRGFUZRlB65WpExdDBFJp04wQ,38092
|
|
||||||
sqlalchemy/sql/__init__.py,sha256=IFCJYIilmmAQRnSDhv9Y6LQUSpx6pUU5zp9VT7sOx0c,1737
|
|
||||||
sqlalchemy/sql/annotation.py,sha256=8ncgAVUo5QCoinApKjREi8esWNMFklcBqie8Q42KsaQ,6136
|
|
||||||
sqlalchemy/sql/base.py,sha256=TuXOp7z0Q30qKAjhgcsts6WGvRbvg6F7OBojMQAxjX0,20990
|
|
||||||
sqlalchemy/sql/compiler.py,sha256=G0Ft_Dmq1AousO66eagPhI0g9Vkqui_c_LjqY0AbImU,100710
|
|
||||||
sqlalchemy/sql/crud.py,sha256=X86dyvzEnbj0-oeJO5ufi6zXxbSKBtDeu5JHlNg-BJU,19837
|
|
||||||
sqlalchemy/sql/ddl.py,sha256=nkjd_B4lKwC2GeyPjE0ZtRB9RKXccQL1g1XoZ4p69sM,37540
|
|
||||||
sqlalchemy/sql/default_comparator.py,sha256=QaowWtW4apULq_aohDvmj97j0sDtHQQjMRdNxXm83vk,10447
|
|
||||||
sqlalchemy/sql/dml.py,sha256=7846H52IMJfMYi5Jd-Cv6Hy9hZM4dkonXbjfBjl5ED4,33330
|
|
||||||
sqlalchemy/sql/elements.py,sha256=MLeecC5dMqeekZmFbPn0J-ODKJj5DBDE5v6kuSkq66I,132898
|
|
||||||
sqlalchemy/sql/expression.py,sha256=vFZ9MmBlC9Fg8IYzLMAwXgcsnXZhkZbUstY6dO8BFGY,5833
|
|
||||||
sqlalchemy/sql/functions.py,sha256=ZYKyvPnVKZMtHyyjyNwK0M5UWPrZmFz3vtTqHN-8658,18533
|
|
||||||
sqlalchemy/sql/naming.py,sha256=foE2lAzngLCFXCeHrpv0S4zT23GCnZLCiata2MPo0kE,4662
|
|
||||||
sqlalchemy/sql/operators.py,sha256=UeZgb7eRhWd4H7OfJZkx0ZWOjvo5chIUXQsBAIeeTDY,23013
|
|
||||||
sqlalchemy/sql/schema.py,sha256=awhLY5YjUBah8ZYxW9FBfe6lH0v4fW0UJLTNApnx7E0,145511
|
|
||||||
sqlalchemy/sql/selectable.py,sha256=o1Hom00WGHjI21Mdb5fkX-f0k2nksQNb_txT0KWK1zQ,118995
|
|
||||||
sqlalchemy/sql/sqltypes.py,sha256=JGxizqIjO1WFuZpppWj1Yi5cvCyBczb1JqUQeuhQn8s,54879
|
|
||||||
sqlalchemy/sql/type_api.py,sha256=Xe6yH4slgdLA8HRjT19GBOou51SS9o4oUhyK0xfn04c,42846
|
|
||||||
sqlalchemy/sql/util.py,sha256=7AsOsyhIq2eSLMWtwvqfTLc2MdCotGzEKQKFE3wk5sk,20382
|
|
||||||
sqlalchemy/sql/visitors.py,sha256=4ipGvAkqFaSAWgyNuKjx5x_ms8GIy9aq-wC5pj4-Z3g,10271
|
|
||||||
sqlalchemy/testing/__init__.py,sha256=MwKimX0atzs_SmG2j74GXLiyI8O56e3DLq96tcoL0TM,1095
|
|
||||||
sqlalchemy/testing/assertions.py,sha256=r1I2nHC599VZcY-5g0JYRQl8bl9kjkf6WFOooOmJ2eE,16112
|
|
||||||
sqlalchemy/testing/assertsql.py,sha256=-fP9Iuhdu52BJoT1lEj_KED8jy5ay_XiJu7i4Ry9eWA,12335
|
|
||||||
sqlalchemy/testing/config.py,sha256=nqvVm55Vk0BVNjk1Wj3aYR65j_EEEepfB-W9QSFLU-k,2469
|
|
||||||
sqlalchemy/testing/distutils_run.py,sha256=tkURrZRwgFiSwseKm1iJRkSjKf2Rtsb3pOXRWtACTHI,247
|
|
||||||
sqlalchemy/testing/engines.py,sha256=u6GlDMXt0FKqVTQe_QJ5JXAnkA6W-xdw6Fe_5gMAQhg,9359
|
|
||||||
sqlalchemy/testing/entities.py,sha256=IXqTgAihV-1TZyxL0MWdZzu4rFtxdbWKWFetIJWNGM4,2992
|
|
||||||
sqlalchemy/testing/exclusions.py,sha256=WuH_tVK5fZJWe8Hu2LzNB4HNQMa_iAUaGC-_6mHUdIM,12570
|
|
||||||
sqlalchemy/testing/fixtures.py,sha256=q4nK-81z2EWs17TjeJtPmnaJUCtDdoUiIU7jgLq3l_w,10721
|
|
||||||
sqlalchemy/testing/mock.py,sha256=vj5q-GzJrLW6mMVDLqsppxBu_p7K49VvjfiVt5tn0o8,630
|
|
||||||
sqlalchemy/testing/pickleable.py,sha256=8I8M4H1XN29pZPMxZdYkmpKWfwzPsUn6WK5FX4UP9L4,2641
|
|
||||||
sqlalchemy/testing/profiling.py,sha256=Q_wOTS5JtcGBcs2eCYIvoRoDS_FW_HcfEW3hXWB87Zg,8392
|
|
||||||
sqlalchemy/testing/provision.py,sha256=mU9g6JZEHIshqUkE6PWu-t61FVPs_cUJtEtVFRavj9g,9377
sqlalchemy/testing/replay_fixture.py,sha256=iAxg7XsFkKSCcJnrNPQNJfjMxOgeBAa-ShOkywWPJ4w,5429
sqlalchemy/testing/requirements.py,sha256=aIdvbfugMzrlVdldEbpcwretX-zjiukPhPUSZgulrzU,19949
sqlalchemy/testing/runner.py,sha256=hpNH6MNTif4TnBRySxpm92KgFwDK0mOa8eF7wZXumTI,1607
sqlalchemy/testing/schema.py,sha256=agOzrIMvmuUCeVZY5mYjJ1eJmOP69-wa0gZALtNtJBk,3446
sqlalchemy/testing/util.py,sha256=IJ688AWzichtXVwWgYf_A4BUbcXPGsK6BQP5fvY3h-U,7544
sqlalchemy/testing/warnings.py,sha256=-KskRAh1RkJ_69UIY_WR7i15u21U3gDLQ6nKlnJT7_w,987
sqlalchemy/testing/plugin/__init__.py,sha256=47DEQpj8HBSa-_TImW-5JCeuQeRkm5NMpJWZG3hSuFU,0
sqlalchemy/testing/plugin/bootstrap.py,sha256=Iw8R-d1gqoz_NKFtPyGfdX56QPcQHny_9Lvwov65aVY,1634
sqlalchemy/testing/plugin/noseplugin.py,sha256=In79x6zs9DOngfoYpaHojihWlSd4PeS7Nwzh3M_KNM4,2847
sqlalchemy/testing/plugin/plugin_base.py,sha256=h4RI4nPNdNq9kYABp6IP89Eknm29q8usgO-nWb8Eobc,17120
sqlalchemy/testing/plugin/pytestplugin.py,sha256=Pbc62y7Km0PHXd4M9dm5ThBwrlXkM4WtIX-W1pOaM84,5812
sqlalchemy/testing/suite/__init__.py,sha256=wqCTrb28i5FwhQZOyXVlnz3mA94iQOUBio7lszkFq-g,471
sqlalchemy/testing/suite/test_ddl.py,sha256=Baw0ou9nKdADmrRuXgWzF1FZx0rvkkw3JHc6yw5BN0M,1838
sqlalchemy/testing/suite/test_dialect.py,sha256=ORQPXUt53XtO-5ENlWgs8BpsSdPBDjyMRl4W2UjXLI4,1165
sqlalchemy/testing/suite/test_insert.py,sha256=nP0mgVpsVs72MHMADmihB1oXLbFBpsYsLGO3BlQ7RLU,8132
sqlalchemy/testing/suite/test_reflection.py,sha256=HtJRsJ_vuNMrOhnPTvuIvRg66OakSaSpeCU36zhaSPg,24616
sqlalchemy/testing/suite/test_results.py,sha256=oAcO1tD0I7c9ErMeSvSZBZfz1IBDMJHJTf64Y1pBodk,6685
sqlalchemy/testing/suite/test_select.py,sha256=u0wAz1g-GrAFdZpG4zwSrVckVtjULvjlbd0Z1U1jHAA,5729
sqlalchemy/testing/suite/test_sequence.py,sha256=fmBR4Pc5tOLSkXFxfcqwGx1z3xaxeJeUyqDnTakKTBU,3831
sqlalchemy/testing/suite/test_types.py,sha256=UKa-ZPdpz16mVKvT-9ISRAfqdrqiKaE7IA-_phQQuxo,17088
sqlalchemy/testing/suite/test_update_delete.py,sha256=r5p467r-EUsjEcWGfUE0VPIfN4LLXZpLRnnyBLyyjl4,1582
sqlalchemy/util/__init__.py,sha256=G06a5vBxg27RtWzY6dPZHt1FO8qtOiy_2C9PHTTMblI,2520
sqlalchemy/util/_collections.py,sha256=JZkeYK4GcIE1A5s6MAvHhmUp_X4wp6r7vMGT-iMftZ8,27842
sqlalchemy/util/compat.py,sha256=80OXp3D-F_R-pLf7s-zITPlfCqG1s_5o6KTlY1g2p0Q,6821
sqlalchemy/util/deprecations.py,sha256=D_LTsfb9jHokJtPEWNDRMJOc372xRGNjputAiTIysRU,4403
sqlalchemy/util/langhelpers.py,sha256=Nhe3Y9ieK6JaFYejjYosVOjOSSIBT2V385Hu6HGcyZk,41607
sqlalchemy/util/queue.py,sha256=rs3W0LDhKt7M_dlQEjYpI9KS-bzQmmwN38LE_-RRVvU,6548
sqlalchemy/util/topological.py,sha256=xKsYjjAat4p8cdqRHKwibLzr6WONbPTC0X8Mqg7jYno,2794
SQLAlchemy-1.0.12.dist-info/INSTALLER,sha256=zuuue4knoyJ-UwPPXg8fezS7VCrXJQrAP7zeNuwvFQg,4
sqlalchemy/orm/__pycache__/path_registry.cpython-34.pyc,,
sqlalchemy/ext/__pycache__/associationproxy.cpython-34.pyc,,
sqlalchemy/__pycache__/__init__.cpython-34.pyc,,
sqlalchemy/orm/__pycache__/deprecated_interfaces.cpython-34.pyc,,
sqlalchemy/orm/__pycache__/dynamic.cpython-34.pyc,,
sqlalchemy/event/__pycache__/legacy.cpython-34.pyc,,
sqlalchemy/event/__pycache__/api.cpython-34.pyc,,
sqlalchemy/dialects/__pycache__/postgres.cpython-34.pyc,,
sqlalchemy/testing/__pycache__/profiling.cpython-34.pyc,,
sqlalchemy/dialects/firebird/__pycache__/__init__.cpython-34.pyc,,
sqlalchemy/dialects/mysql/__pycache__/gaerdbms.cpython-34.pyc,,
sqlalchemy/dialects/mysql/__pycache__/oursql.cpython-34.pyc,,
sqlalchemy/testing/suite/__pycache__/test_types.cpython-34.pyc,,
sqlalchemy/event/__pycache__/registry.cpython-34.pyc,,
sqlalchemy/testing/__pycache__/distutils_run.cpython-34.pyc,,
sqlalchemy/dialects/mssql/__pycache__/mxodbc.cpython-34.pyc,,
sqlalchemy/engine/__pycache__/base.cpython-34.pyc,,
sqlalchemy/dialects/postgresql/__pycache__/pypostgresql.cpython-34.pyc,,
sqlalchemy/util/__pycache__/__init__.cpython-34.pyc,,
sqlalchemy/util/__pycache__/topological.cpython-34.pyc,,
sqlalchemy/dialects/firebird/__pycache__/kinterbasdb.cpython-34.pyc,,
sqlalchemy/orm/__pycache__/persistence.cpython-34.pyc,,
sqlalchemy/connectors/__pycache__/mxodbc.cpython-34.pyc,,
sqlalchemy/dialects/mssql/__pycache__/base.cpython-34.pyc,,
sqlalchemy/connectors/__pycache__/__init__.cpython-34.pyc,,
sqlalchemy/sql/__pycache__/elements.cpython-34.pyc,,
sqlalchemy/testing/suite/__pycache__/__init__.cpython-34.pyc,,
sqlalchemy/dialects/__pycache__/__init__.cpython-34.pyc,,
sqlalchemy/util/__pycache__/langhelpers.cpython-34.pyc,,
sqlalchemy/testing/__pycache__/entities.cpython-34.pyc,,
sqlalchemy/engine/__pycache__/interfaces.cpython-34.pyc,,
sqlalchemy/sql/__pycache__/schema.cpython-34.pyc,,
sqlalchemy/ext/__pycache__/baked.cpython-34.pyc,,
sqlalchemy/dialects/postgresql/__pycache__/zxjdbc.cpython-34.pyc,,
sqlalchemy/orm/__pycache__/base.cpython-34.pyc,,
sqlalchemy/connectors/__pycache__/pyodbc.cpython-34.pyc,,
sqlalchemy/sql/__pycache__/annotation.cpython-34.pyc,,
sqlalchemy/dialects/oracle/__pycache__/zxjdbc.cpython-34.pyc,,
sqlalchemy/testing/__pycache__/runner.cpython-34.pyc,,
sqlalchemy/testing/__pycache__/schema.cpython-34.pyc,,
sqlalchemy/orm/__pycache__/relationships.cpython-34.pyc,,
sqlalchemy/__pycache__/pool.cpython-34.pyc,,
sqlalchemy/testing/suite/__pycache__/test_sequence.cpython-34.pyc,,
sqlalchemy/dialects/sqlite/__pycache__/__init__.cpython-34.pyc,,
sqlalchemy/sql/__pycache__/ddl.cpython-34.pyc,,
sqlalchemy/dialects/sybase/__pycache__/pyodbc.cpython-34.pyc,,
sqlalchemy/orm/__pycache__/dependency.cpython-34.pyc,,
sqlalchemy/sql/__pycache__/visitors.cpython-34.pyc,,
sqlalchemy/testing/__pycache__/provision.cpython-34.pyc,,
sqlalchemy/dialects/postgresql/__pycache__/json.cpython-34.pyc,,
sqlalchemy/sql/__pycache__/selectable.cpython-34.pyc,,
sqlalchemy/orm/__pycache__/exc.cpython-34.pyc,,
sqlalchemy/ext/declarative/__pycache__/clsregistry.cpython-34.pyc,,
sqlalchemy/orm/__pycache__/interfaces.cpython-34.pyc,,
sqlalchemy/testing/__pycache__/assertions.cpython-34.pyc,,
sqlalchemy/ext/__pycache__/compiler.cpython-34.pyc,,
sqlalchemy/dialects/oracle/__pycache__/cx_oracle.cpython-34.pyc,,
sqlalchemy/testing/suite/__pycache__/test_select.cpython-34.pyc,,
sqlalchemy/dialects/firebird/__pycache__/fdb.cpython-34.pyc,,
sqlalchemy/orm/__pycache__/unitofwork.cpython-34.pyc,,
sqlalchemy/testing/__pycache__/util.cpython-34.pyc,,
sqlalchemy/dialects/postgresql/__pycache__/psycopg2cffi.cpython-34.pyc,,
sqlalchemy/__pycache__/interfaces.cpython-34.pyc,,
sqlalchemy/engine/__pycache__/util.cpython-34.pyc,,
sqlalchemy/dialects/postgresql/__pycache__/__init__.cpython-34.pyc,,
sqlalchemy/__pycache__/schema.cpython-34.pyc,,
sqlalchemy/orm/__pycache__/sync.cpython-34.pyc,,
sqlalchemy/__pycache__/processors.cpython-34.pyc,,
sqlalchemy/dialects/firebird/__pycache__/base.cpython-34.pyc,,
sqlalchemy/dialects/postgresql/__pycache__/psycopg2.cpython-34.pyc,,
sqlalchemy/databases/__pycache__/__init__.cpython-34.pyc,,
sqlalchemy/sql/__pycache__/sqltypes.cpython-34.pyc,,
sqlalchemy/dialects/oracle/__pycache__/base.cpython-34.pyc,,
sqlalchemy/sql/__pycache__/functions.cpython-34.pyc,,
sqlalchemy/dialects/sqlite/__pycache__/pysqlcipher.cpython-34.pyc,,
sqlalchemy/testing/suite/__pycache__/test_dialect.cpython-34.pyc,,
sqlalchemy/ext/__pycache__/automap.cpython-34.pyc,,
sqlalchemy/testing/__pycache__/mock.cpython-34.pyc,,
sqlalchemy/testing/__pycache__/requirements.cpython-34.pyc,,
sqlalchemy/testing/suite/__pycache__/test_results.cpython-34.pyc,,
sqlalchemy/orm/__pycache__/__init__.cpython-34.pyc,,
sqlalchemy/dialects/postgresql/__pycache__/base.cpython-34.pyc,,
sqlalchemy/util/__pycache__/deprecations.cpython-34.pyc,,
sqlalchemy/dialects/mssql/__pycache__/pyodbc.cpython-34.pyc,,
sqlalchemy/orm/__pycache__/state.cpython-34.pyc,,
sqlalchemy/event/__pycache__/base.cpython-34.pyc,,
sqlalchemy/__pycache__/log.cpython-34.pyc,,
sqlalchemy/connectors/__pycache__/zxJDBC.cpython-34.pyc,,
sqlalchemy/testing/plugin/__pycache__/plugin_base.cpython-34.pyc,,
sqlalchemy/orm/__pycache__/identity.cpython-34.pyc,,
sqlalchemy/dialects/mysql/__pycache__/mysqlconnector.cpython-34.pyc,,
sqlalchemy/orm/__pycache__/attributes.cpython-34.pyc,,
sqlalchemy/ext/declarative/__pycache__/__init__.cpython-34.pyc,,
sqlalchemy/dialects/sqlite/__pycache__/base.cpython-34.pyc,,
sqlalchemy/ext/__pycache__/serializer.cpython-34.pyc,,
sqlalchemy/testing/plugin/__pycache__/pytestplugin.cpython-34.pyc,,
sqlalchemy/orm/__pycache__/properties.cpython-34.pyc,,
sqlalchemy/orm/__pycache__/mapper.cpython-34.pyc,,
sqlalchemy/testing/__pycache__/fixtures.cpython-34.pyc,,
sqlalchemy/sql/__pycache__/base.cpython-34.pyc,,
sqlalchemy/orm/__pycache__/events.cpython-34.pyc,,
sqlalchemy/dialects/mysql/__pycache__/zxjdbc.cpython-34.pyc,,
sqlalchemy/dialects/mysql/__pycache__/__init__.cpython-34.pyc,,
sqlalchemy/orm/__pycache__/strategy_options.cpython-34.pyc,,
sqlalchemy/dialects/sybase/__pycache__/mxodbc.cpython-34.pyc,,
sqlalchemy/util/__pycache__/compat.cpython-34.pyc,,
sqlalchemy/testing/plugin/__pycache__/bootstrap.cpython-34.pyc,,
sqlalchemy/sql/__pycache__/compiler.cpython-34.pyc,,
sqlalchemy/dialects/mysql/__pycache__/mysqldb.cpython-34.pyc,,
sqlalchemy/__pycache__/inspection.cpython-34.pyc,,
sqlalchemy/testing/plugin/__pycache__/__init__.cpython-34.pyc,,
sqlalchemy/dialects/mssql/__pycache__/adodbapi.cpython-34.pyc,,
sqlalchemy/engine/__pycache__/url.cpython-34.pyc,,
sqlalchemy/dialects/oracle/__pycache__/__init__.cpython-34.pyc,,
sqlalchemy/engine/__pycache__/result.cpython-34.pyc,,
sqlalchemy/testing/suite/__pycache__/test_insert.cpython-34.pyc,,
sqlalchemy/event/__pycache__/__init__.cpython-34.pyc,,
sqlalchemy/orm/__pycache__/scoping.cpython-34.pyc,,
sqlalchemy/orm/__pycache__/instrumentation.cpython-34.pyc,,
sqlalchemy/dialects/sybase/__pycache__/base.cpython-34.pyc,,
sqlalchemy/dialects/mysql/__pycache__/pyodbc.cpython-34.pyc,,
sqlalchemy/testing/plugin/__pycache__/noseplugin.cpython-34.pyc,,
sqlalchemy/dialects/mysql/__pycache__/cymysql.cpython-34.pyc,,
sqlalchemy/testing/__pycache__/exclusions.cpython-34.pyc,,
sqlalchemy/ext/__pycache__/mutable.cpython-34.pyc,,
sqlalchemy/sql/__pycache__/default_comparator.cpython-34.pyc,,
sqlalchemy/engine/__pycache__/default.cpython-34.pyc,,
sqlalchemy/__pycache__/types.cpython-34.pyc,,
sqlalchemy/orm/__pycache__/session.cpython-34.pyc,,
sqlalchemy/util/__pycache__/_collections.cpython-34.pyc,,
sqlalchemy/engine/__pycache__/reflection.cpython-34.pyc,,
sqlalchemy/dialects/sybase/__pycache__/pysybase.cpython-34.pyc,,
sqlalchemy/testing/__pycache__/assertsql.cpython-34.pyc,,
sqlalchemy/testing/__pycache__/replay_fixture.cpython-34.pyc,,
sqlalchemy/dialects/mysql/__pycache__/pymysql.cpython-34.pyc,,
sqlalchemy/testing/__pycache__/config.cpython-34.pyc,,
sqlalchemy/orm/__pycache__/strategies.cpython-34.pyc,,
sqlalchemy/dialects/sqlite/__pycache__/pysqlite.cpython-34.pyc,,
sqlalchemy/orm/__pycache__/util.cpython-34.pyc,,
sqlalchemy/dialects/mysql/__pycache__/base.cpython-34.pyc,,
sqlalchemy/sql/__pycache__/crud.cpython-34.pyc,,
sqlalchemy/dialects/postgresql/__pycache__/pg8000.cpython-34.pyc,,
sqlalchemy/engine/__pycache__/__init__.cpython-34.pyc,,
sqlalchemy/orm/__pycache__/loading.cpython-34.pyc,,
sqlalchemy/testing/__pycache__/__init__.cpython-34.pyc,,
sqlalchemy/dialects/postgresql/__pycache__/ranges.cpython-34.pyc,,
sqlalchemy/sql/__pycache__/operators.cpython-34.pyc,,
sqlalchemy/dialects/mssql/__pycache__/__init__.cpython-34.pyc,,
sqlalchemy/testing/__pycache__/pickleable.cpython-34.pyc,,
sqlalchemy/sql/__pycache__/expression.cpython-34.pyc,,
sqlalchemy/dialects/mssql/__pycache__/pymssql.cpython-34.pyc,,
sqlalchemy/sql/__pycache__/naming.cpython-34.pyc,,
sqlalchemy/ext/__pycache__/horizontal_shard.cpython-34.pyc,,
sqlalchemy/dialects/sybase/__pycache__/__init__.cpython-34.pyc,,
sqlalchemy/engine/__pycache__/threadlocal.cpython-34.pyc,,
sqlalchemy/ext/declarative/__pycache__/api.cpython-34.pyc,,
sqlalchemy/testing/__pycache__/warnings.cpython-34.pyc,,
sqlalchemy/sql/__pycache__/util.cpython-34.pyc,,
sqlalchemy/sql/__pycache__/dml.cpython-34.pyc,,
sqlalchemy/ext/__pycache__/__init__.cpython-34.pyc,,
sqlalchemy/dialects/postgresql/__pycache__/hstore.cpython-34.pyc,,
sqlalchemy/orm/__pycache__/collections.cpython-34.pyc,,
sqlalchemy/sql/__pycache__/__init__.cpython-34.pyc,,
sqlalchemy/testing/suite/__pycache__/test_ddl.cpython-34.pyc,,
sqlalchemy/ext/__pycache__/orderinglist.cpython-34.pyc,,
sqlalchemy/dialects/postgresql/__pycache__/constraints.cpython-34.pyc,,
sqlalchemy/__pycache__/exc.cpython-34.pyc,,
sqlalchemy/testing/suite/__pycache__/test_update_delete.cpython-34.pyc,,
sqlalchemy/engine/__pycache__/strategies.cpython-34.pyc,,
sqlalchemy/ext/declarative/__pycache__/base.cpython-34.pyc,,
sqlalchemy/orm/__pycache__/evaluator.cpython-34.pyc,,
sqlalchemy/orm/__pycache__/query.cpython-34.pyc,,
sqlalchemy/dialects/mssql/__pycache__/zxjdbc.cpython-34.pyc,,
sqlalchemy/orm/__pycache__/descriptor_props.cpython-34.pyc,,
sqlalchemy/__pycache__/events.cpython-34.pyc,,
sqlalchemy/sql/__pycache__/type_api.cpython-34.pyc,,
sqlalchemy/util/__pycache__/queue.cpython-34.pyc,,
sqlalchemy/ext/__pycache__/hybrid.cpython-34.pyc,,
sqlalchemy/event/__pycache__/attr.cpython-34.pyc,,
sqlalchemy/testing/suite/__pycache__/test_reflection.cpython-34.pyc,,
sqlalchemy/dialects/mssql/__pycache__/information_schema.cpython-34.pyc,,
sqlalchemy/ext/__pycache__/instrumentation.cpython-34.pyc,,
sqlalchemy/testing/__pycache__/engines.cpython-34.pyc,,
@ -1,5 +0,0 @@
Wheel-Version: 1.0
Generator: bdist_wheel (0.30.0)
Root-Is-Purelib: false
Tag: cp34-cp34m-linux_x86_64
@ -1 +0,0 @@
{"classifiers": ["Development Status :: 5 - Production/Stable", "Intended Audience :: Developers", "License :: OSI Approved :: MIT License", "Programming Language :: Python", "Programming Language :: Python :: 3", "Programming Language :: Python :: Implementation :: CPython", "Programming Language :: Python :: Implementation :: Jython", "Programming Language :: Python :: Implementation :: PyPy", "Topic :: Database :: Front-Ends", "Operating System :: OS Independent"], "description_content_type": "UNKNOWN", "extensions": {"python.details": {"contacts": [{"email": "mike_mp@zzzcomputing.com", "name": "Mike Bayer", "role": "author"}], "document_names": {"description": "DESCRIPTION.rst"}, "project_urls": {"Home": "http://www.sqlalchemy.org"}}}, "generator": "bdist_wheel (0.30.0)", "license": "MIT License", "metadata_version": "2.0", "name": "SQLAlchemy", "summary": "Database Abstraction Library", "test_requires": [{"requires": ["mock", "pytest (>=2.5.2)", "pytest-xdist"]}], "version": "1.0.12"}
@ -1,11 +0,0 @@
Python bindings to the Ed25519 public-key signature system.

This offers a comfortable python interface to a C implementation of the
Ed25519 public-key signature system (http://ed25519.cr.yp.to/), using the
portable 'ref' code from the 'SUPERCOP' benchmarking suite.

This system provides high (128-bit) security, short (32-byte) keys, short
(64-byte) signatures, and fast (2-6ms) operation. Please see the README for
more details.
@ -1,17 +0,0 @@
ed25519/__init__.py,sha256=0AicD1xQAforRdrUWwmmURJkZ3Gi1lqaifukwZNYJos,401
ed25519/_ed25519.cpython-34m.so,sha256=-qvpNKMbtiJoFhWHlvH83lGmJEntE9ISrt8hYZE4zig,262968
ed25519/_version.py,sha256=yb119RosJrH_RO02_o3o12GWQvkxx3xD4X7UrJW9vTY,469
ed25519/keys.py,sha256=AbMFsbxn0qbwmQ6HntpNURsOGq_y4puwFxs6U7Of2eo,7123
ed25519/test_ed25519.py,sha256=IG8ot-yARHi6PoyJY6ixS1l2L23hE1lCXbSH-XQPCCM,12389
../../../bin/edsig,sha256=SA1mUUWCjAAaSEe6MKSpVWg-2qXwuiuK3PodCAUwCN0,2853
ed25519-1.4.dist-info/DESCRIPTION.rst,sha256=8UWGEqjPrB7zPyxLA5Ep6JL58ANbe0Wybqth188exdc,434
ed25519-1.4.dist-info/METADATA,sha256=8xAIfsJS4nw5H1ui1jHsVntmwcMjIzm4j_LHEaW3wNQ,1148
ed25519-1.4.dist-info/RECORD,,
ed25519-1.4.dist-info/WHEEL,sha256=AEztX7vHDtcgysb-4-5-DyIKMLIPg6NMxY9dXTRdoXQ,104
ed25519-1.4.dist-info/metadata.json,sha256=6X6ChTS1aIj99pNHtLNerEBCuO-F-P2Z1GgSMt2svQw,841
ed25519-1.4.dist-info/top_level.txt,sha256=U3-N9ZJMBO9MUuZLwoiMbsWSkxsd0TfkNSuzO6O_gYY,8
ed25519-1.4.dist-info/INSTALLER,sha256=zuuue4knoyJ-UwPPXg8fezS7VCrXJQrAP7zeNuwvFQg,4
ed25519/__pycache__/keys.cpython-34.pyc,,
ed25519/__pycache__/_version.cpython-34.pyc,,
ed25519/__pycache__/__init__.cpython-34.pyc,,
ed25519/__pycache__/test_ed25519.cpython-34.pyc,,
@ -1,5 +0,0 @@
Wheel-Version: 1.0
Generator: bdist_wheel (0.30.0)
Root-Is-Purelib: false
Tag: cp34-cp34m-linux_x86_64
@ -1 +0,0 @@
{"classifiers": ["Development Status :: 5 - Production/Stable", "Intended Audience :: Developers", "License :: OSI Approved :: MIT License", "Programming Language :: Python", "Programming Language :: Python :: 2.6", "Programming Language :: Python :: 2.7", "Programming Language :: Python :: 3.3", "Programming Language :: Python :: 3.4", "Topic :: Security :: Cryptography"], "description_content_type": "UNKNOWN", "extensions": {"python.details": {"contacts": [{"email": "warner-python-ed25519@lothar.com", "name": "Brian Warner", "role": "author"}], "document_names": {"description": "DESCRIPTION.rst"}, "project_urls": {"Home": "https://github.com/warner/python-ed25519"}}}, "generator": "bdist_wheel (0.30.0)", "license": "MIT", "metadata_version": "2.0", "name": "ed25519", "summary": "Ed25519 public-key signatures", "version": "1.4"}
Binary file not shown.
Binary file not shown.
Binary file not shown.
@ -1,199 +0,0 @@
netifaces 0.10.6
================

.. image:: https://drone.io/bitbucket.org/al45tair/netifaces/status.png
   :target: https://drone.io/bitbucket.org/al45tair/netifaces/latest
   :alt: Build Status

1. What is this?
----------------

It's been annoying me for some time that there's no easy way to get the
address(es) of the machine's network interfaces from Python. There is a good
reason for this difficulty, which is that it is virtually impossible to do so
in a portable manner. However, it seems to me that there should be a package
you can easy_install that will take care of working out the details of doing
so on the machine you're using, then you can get on with writing Python code
without concerning yourself with the nitty gritty of system-dependent
low-level networking APIs.

This package attempts to solve that problem.

2. How do I use it?
-------------------

First you need to install it, which you can do by typing::

    tar xvzf netifaces-0.10.6.tar.gz
    cd netifaces-0.10.6
    python setup.py install

**Note that you will need the relevant developer tools for your platform**,
as netifaces is written in C and installing this way will compile the
extension.

Once that's done, you'll need to start Python and do something like the
following::

    >>> import netifaces

Then if you enter

    >>> netifaces.interfaces()
    ['lo0', 'gif0', 'stf0', 'en0', 'en1', 'fw0']

you'll see the list of interface identifiers for your machine.

You can ask for the addresses of a particular interface by doing

    >>> netifaces.ifaddresses('lo0')
    {18: [{'addr': ''}], 2: [{'peer': '127.0.0.1', 'netmask': '255.0.0.0', 'addr': '127.0.0.1'}], 30: [{'peer': '::1', 'netmask': 'ffff:ffff:ffff:ffff:ffff:ffff:ffff:ffff', 'addr': '::1'}, {'peer': '', 'netmask': 'ffff:ffff:ffff:ffff::', 'addr': 'fe80::1%lo0'}]}

Hmmmm. That result looks a bit cryptic; let's break it apart and explain
what each piece means. It returned a dictionary, so let's look there first::

    { 18: [...], 2: [...], 30: [...] }

Each of the numbers refers to a particular address family. In this case, we
have three address families listed; on my system, 18 is ``AF_LINK`` (which
means the link layer interface, e.g. Ethernet), 2 is ``AF_INET`` (normal
Internet addresses), and 30 is ``AF_INET6`` (IPv6).

But wait! Don't use these numbers in your code. The numeric values here are
system dependent; fortunately, I thought of that when writing netifaces, so
the module declares a range of values that you might need. e.g.

    >>> netifaces.AF_LINK
    18

Again, on your system, the number may be different.

So, what we've established is that the dictionary that's returned has one
entry for each address family for which this interface has an address. Let's
take a look at the ``AF_INET`` addresses now:

    >>> addrs = netifaces.ifaddresses('lo0')
    >>> addrs[netifaces.AF_INET]
    [{'peer': '127.0.0.1', 'netmask': '255.0.0.0', 'addr': '127.0.0.1'}]

You might be wondering why this value is a list. The reason is that it's
possible for an interface to have more than one address, even within the
same family. I'll say that again: *you can have more than one address of
the same type associated with each interface*.

*Asking for "the" address of a particular interface doesn't make sense.*

Right, so, we can see that this particular interface only has one address,
and, because it's a loopback interface, it's point-to-point and therefore
has a *peer* address rather than a broadcast address.

Let's look at a more interesting interface.

    >>> addrs = netifaces.ifaddresses('en0')
    >>> addrs[netifaces.AF_INET]
    [{'broadcast': '10.15.255.255', 'netmask': '255.240.0.0', 'addr': '10.0.1.4'}, {'broadcast': '192.168.0.255', 'addr': '192.168.0.47'}]

This interface has two addresses (see, I told you...) Both of them are
regular IPv4 addresses, although in one case the netmask has been changed
from its default. The netmask *may not* appear on your system if it's set
to the default for the address range.

Because this interface isn't point-to-point, it also has broadcast addresses.

Now, say we want, instead of the IP addresses, to get the MAC address; that
is, the hardware address of the Ethernet adapter running this interface. We
can do

    >>> addrs[netifaces.AF_LINK]
    [{'addr': '00:12:34:56:78:9a'}]

Note that this may not be available on platforms without getifaddrs(), unless
they happen to implement ``SIOCGIFHWADDR``. Note also that you just get the
address; it's unlikely that you'll see anything else with an ``AF_LINK``
address. Oh, and don't assume that all ``AF_LINK`` addresses are Ethernet;
you might, for instance, be on a Mac, in which case:

    >>> addrs = netifaces.ifaddresses('fw0')
    >>> addrs[netifaces.AF_LINK]
    [{'addr': '00:12:34:56:78:9a:bc:de'}]

No, that isn't an exceptionally long Ethernet MAC address; it's a FireWire
address.

As of version 0.10.0, you can also obtain a list of gateways on your
machine:

    >>> netifaces.gateways()
    {2: [('10.0.1.1', 'en0', True), ('10.2.1.1', 'en1', False)], 30: [('fe80::1', 'en0', True)], 'default': { 2: ('10.0.1.1', 'en0'), 30: ('fe80::1', 'en0') }}

This dictionary is keyed on address family, in this case ``AF_INET``, and
each entry is a list of gateways as ``(address, interface, is_default)``
tuples. Notice that here we have two separate gateways for IPv4
(``AF_INET``); some operating systems support configurations like this and
can either route packets based on their source, or based on administratively
configured routing tables.

For convenience, we also allow you to index the dictionary with the special
value ``'default'``, which returns a dictionary mapping address families to
the default gateway in each case. Thus you can get the default IPv4 gateway
with

    >>> gws = netifaces.gateways()
    >>> gws['default'][netifaces.AF_INET]
    ('10.0.1.1', 'en0')

Do note that there may be no default gateway for any given address family;
this is currently very common for IPv6 and much less common for IPv4, but it
can happen even for ``AF_INET``.

BTW, if you're trying to configure your machine to have multiple gateways
for the same address family, it's a very good idea to check the
documentation for your operating system *very* carefully, as some systems
become extremely confused or route packets in a non-obvious manner.

I'm very interested in hearing from anyone (on any platform) for whom the
``gateways()`` method doesn't produce the expected results. It's quite
complicated extracting this information from the operating system (whichever
operating system we're talking about), and so I expect there's at least one
system out there where this just won't work.

3. This is great! What platforms does it work on?
--------------------------------------------------

It gets regular testing on OS X, Linux and Windows. It has also been used
successfully on Solaris, and it's expected to work properly on other
UNIX-like systems as well. If you are running something that is not
supported, and wish to contribute a patch, please use BitBucket to send a
pull request.

4. What license is this under?
------------------------------

It's an MIT-style license. Here goes:

    Copyright (c) 2007-2017 Alastair Houghton

    Permission is hereby granted, free of charge, to any person obtaining a
    copy of this software and associated documentation files (the
    "Software"), to deal in the Software without restriction, including
    without limitation the rights to use, copy, modify, merge, publish,
    distribute, sublicense, and/or sell copies of the Software, and to
    permit persons to whom the Software is furnished to do so, subject to
    the following conditions:

    The above copyright notice and this permission notice shall be included
    in all copies or substantial portions of the Software.

    THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS
    OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF
    MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT.
    IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY
    CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT,
    TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE
    SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.

5. Why the jump to 0.10.0?
--------------------------

Because someone released a fork of netifaces with the version 0.9.0.
Hopefully skipping the version number should remove any confusion. In
addition, starting with 0.10.0, Python 3 is now supported and other
features/bugfixes have been included as well. See the CHANGELOG for a
more complete list of changes.
@ -1,220 +0,0 @@
Metadata-Version: 2.0
Name: netifaces
Version: 0.10.6
Summary: Portable network interface information.
Home-page: https://bitbucket.org/al45tair/netifaces
Author: Alastair Houghton
Author-email: alastair@alastairs-place.net
License: MIT License
Description-Content-Type: UNKNOWN
Platform: UNKNOWN
Classifier: Development Status :: 4 - Beta
Classifier: Intended Audience :: Developers
Classifier: License :: OSI Approved :: MIT License
Classifier: Topic :: System :: Networking
Classifier: Programming Language :: Python
Classifier: Programming Language :: Python :: 2
Classifier: Programming Language :: Python :: 2.5
Classifier: Programming Language :: Python :: 2.6
Classifier: Programming Language :: Python :: 2.7
Classifier: Programming Language :: Python :: 3
|
|
||||||
netifaces 0.10.6
|
|
||||||
================
|
|
||||||
|
|
||||||
.. image:: https://drone.io/bitbucket.org/al45tair/netifaces/status.png
|
|
||||||
:target: https://drone.io/bitbucket.org/al45tair/netifaces/latest
|
|
||||||
:alt: Build Status
|
|
||||||
|
|
||||||
1. What is this?
|
|
||||||
----------------
|
|
||||||
|
|
||||||
It's been annoying me for some time that there's no easy way to get the
|
|
||||||
address(es) of the machine's network interfaces from Python. There is
|
|
||||||
a good reason for this difficulty, which is that it is virtually impossible
|
|
||||||
to do so in a portable manner. However, it seems to me that there should
|
|
||||||
be a package you can easy_install that will take care of working out the
|
|
||||||
details of doing so on the machine you're using, then you can get on with
|
|
||||||
writing Python code without concerning yourself with the nitty gritty of
|
|
||||||
system-dependent low-level networking APIs.
|
|
||||||
|
|
||||||
This package attempts to solve that problem.
|
|
||||||
|
|
||||||
2. How do I use it?
|
|
||||||
-------------------
|
|
||||||
|
|
||||||
First you need to install it, which you can do by typing::
|
|
||||||
|
|
||||||
tar xvzf netifaces-0.10.6.tar.gz
|
|
||||||
cd netifaces-0.10.6
|
|
||||||
python setup.py install
|
|
||||||
|
|
||||||
**Note that you will need the relevant developer tools for your platform**,
|
|
||||||
as netifaces is written in C and installing this way will compile the extension.
|
|
||||||
|
|
||||||
Once that's done, you'll need to start Python and do something like the
|
|
||||||
following::
|
|
||||||
|
|
||||||
>>> import netifaces
|
|
||||||
|
|
||||||
Then if you enter
|
|
||||||
|
|
||||||
>>> netifaces.interfaces()
|
|
||||||
['lo0', 'gif0', 'stf0', 'en0', 'en1', 'fw0']
|
|
||||||
|
|
||||||
you'll see the list of interface identifiers for your machine.

You can ask for the addresses of a particular interface by doing

>>> netifaces.ifaddresses('lo0')
{18: [{'addr': ''}], 2: [{'peer': '127.0.0.1', 'netmask': '255.0.0.0', 'addr': '127.0.0.1'}], 30: [{'peer': '::1', 'netmask': 'ffff:ffff:ffff:ffff:ffff:ffff:ffff:ffff', 'addr': '::1'}, {'peer': '', 'netmask': 'ffff:ffff:ffff:ffff::', 'addr': 'fe80::1%lo0'}]}

Hmmmm. That result looks a bit cryptic; let's break it apart and explain
what each piece means. It returned a dictionary, so let's look there first::

  { 18: [...], 2: [...], 30: [...] }

Each of the numbers refers to a particular address family. In this case, we
have three address families listed; on my system, 18 is ``AF_LINK`` (which means
the link layer interface, e.g. Ethernet), 2 is ``AF_INET`` (normal Internet
addresses), and 30 is ``AF_INET6`` (IPv6).

But wait! Don't use these numbers in your code. The numeric values here are
system dependent; fortunately, I thought of that when writing netifaces, so
the module declares a range of values that you might need. e.g.

>>> netifaces.AF_LINK
18

Again, on your system, the number may be different.
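
Since the numeric family values vary from platform to platform, it can be handy to map them back to readable names before printing results. A minimal sketch, assuming only that the module exposes its family constants as ``AF_*`` attributes; a stand-in namespace (with the numbers quoted in this README) is used here instead of the real module, so the snippet runs anywhere:

```python
from types import SimpleNamespace

def family_names(module):
    """Build a {number: name} map from a module's AF_* constants."""
    return {getattr(module, name): name
            for name in dir(module) if name.startswith('AF_')}

# Stand-in for the real netifaces module; in real code pass netifaces itself.
fake_netifaces = SimpleNamespace(AF_LINK=18, AF_INET=2, AF_INET6=30)

print(family_names(fake_netifaces)[18])  # -> AF_LINK
```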

So, what we've established is that the dictionary that's returned has one
entry for each address family for which this interface has an address. Let's
take a look at the ``AF_INET`` addresses now:

>>> addrs = netifaces.ifaddresses('lo0')
>>> addrs[netifaces.AF_INET]
[{'peer': '127.0.0.1', 'netmask': '255.0.0.0', 'addr': '127.0.0.1'}]

You might be wondering why this value is a list. The reason is that it's
possible for an interface to have more than one address, even within the
same family. I'll say that again: *you can have more than one address of
the same type associated with each interface*.

*Asking for "the" address of a particular interface doesn't make sense.*

Right, so, we can see that this particular interface only has one address,
and, because it's a loopback interface, it's point-to-point and therefore
has a *peer* address rather than a broadcast address.
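
Because every family maps to a *list*, code that consumes these results should iterate rather than grab element zero. A small helper over the dictionary shape shown above; the sample data is the ``lo0`` result quoted earlier (with 2 and 30 standing in for ``AF_INET``/``AF_INET6``), not a live query:

```python
def addresses(ifaddrs, family):
    """Return every 'addr' string an interface has in one address family."""
    return [entry['addr'] for entry in ifaddrs.get(family, [])]

# The lo0 result quoted above (abbreviated), keyed by the numbers from this README.
lo0 = {
    18: [{'addr': ''}],
    2: [{'peer': '127.0.0.1', 'netmask': '255.0.0.0', 'addr': '127.0.0.1'}],
    30: [{'addr': '::1'}, {'addr': 'fe80::1%lo0'}],
}

print(addresses(lo0, 2))   # -> ['127.0.0.1']
print(addresses(lo0, 30))  # -> ['::1', 'fe80::1%lo0']
```

Using ``.get(family, [])`` also covers interfaces that have no address at all in the family you ask about.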

Let's look at a more interesting interface.

>>> addrs = netifaces.ifaddresses('en0')
>>> addrs[netifaces.AF_INET]
[{'broadcast': '10.15.255.255', 'netmask': '255.240.0.0', 'addr': '10.0.1.4'}, {'broadcast': '192.168.0.255', 'addr': '192.168.0.47'}]

This interface has two addresses (see, I told you...) Both of them are
regular IPv4 addresses, although in one case the netmask has been changed
from its default. The netmask *may not* appear on your system if it's set
to the default for the address range.
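
For the dotted-quad netmasks shown here, it is often handier to work with a CIDR prefix length. A minimal sketch of the conversion (the standard library's ``ipaddress`` module can do the same and more; this is just to show the arithmetic on the README's sample values):

```python
def prefix_length(netmask):
    """Count the one-bits in a dotted-quad netmask (valid masks are contiguous)."""
    return sum(bin(int(octet)).count('1') for octet in netmask.split('.'))

print(prefix_length('255.240.0.0'))  # -> 12  (the non-default en0 netmask above)
print(prefix_length('255.0.0.0'))    # -> 8   (the lo0 netmask above)
```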

Because this interface isn't point-to-point, it also has broadcast addresses.

Now, say we want, instead of the IP addresses, to get the MAC address; that
is, the hardware address of the Ethernet adapter running this interface. We
can do

>>> addrs[netifaces.AF_LINK]
[{'addr': '00:12:34:56:78:9a'}]

Note that this may not be available on platforms without getifaddrs(), unless
they happen to implement ``SIOCGIFHWADDR``. Note also that you just get the
address; it's unlikely that you'll see anything else with an ``AF_LINK`` address.
Oh, and don't assume that all ``AF_LINK`` addresses are Ethernet; you might, for
instance, be on a Mac, in which case:

>>> addrs = netifaces.ifaddresses('fw0')
>>> addrs[netifaces.AF_LINK]
[{'addr': '00:12:34:56:78:9a:bc:de'}]

No, that isn't an exceptionally long Ethernet MAC address---it's a FireWire
address.
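
A common task is collecting the hardware address of every interface at once. The sketch below takes a plain mapping of interface name to an ``ifaddresses()``-style result, so it can be demonstrated with this README's sample values; in real code you would build that mapping from ``netifaces.interfaces()`` and ``netifaces.ifaddresses()``, and use ``netifaces.AF_LINK`` rather than a hard-coded number:

```python
AF_LINK = 18  # assumption: the value quoted in this README; use netifaces.AF_LINK in real code

def hardware_addresses(per_interface):
    """Map interface name -> list of AF_LINK 'addr' strings, skipping blanks."""
    return {name: [e['addr'] for e in fams.get(AF_LINK, []) if e.get('addr')]
            for name, fams in per_interface.items()}

# Sample data from this README (en0 is Ethernet, fw0 is FireWire, lo0 is blank).
sample = {
    'en0': {AF_LINK: [{'addr': '00:12:34:56:78:9a'}]},
    'fw0': {AF_LINK: [{'addr': '00:12:34:56:78:9a:bc:de'}]},
    'lo0': {AF_LINK: [{'addr': ''}]},
}

print(hardware_addresses(sample))
```

Note that this deliberately yields an empty list, not an error, for interfaces like ``lo0`` whose ``AF_LINK`` entry carries no usable address.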

As of version 0.10.0, you can also obtain a list of gateways on your
machine:

>>> netifaces.gateways()
{2: [('10.0.1.1', 'en0', True), ('10.2.1.1', 'en1', False)], 30: [('fe80::1', 'en0', True)], 'default': { 2: ('10.0.1.1', 'en0'), 30: ('fe80::1', 'en0') }}

This dictionary is keyed on address family---in this case, ``AF_INET``---and
each entry is a list of gateways as ``(address, interface, is_default)`` tuples.
Notice that here we have two separate gateways for IPv4 (``AF_INET``); some
operating systems support configurations like this and can either route packets
based on their source, or based on administratively configured routing tables.

For convenience, we also allow you to index the dictionary with the special
value ``'default'``, which returns a dictionary mapping address families to the
default gateway in each case. Thus you can get the default IPv4 gateway with

>>> gws = netifaces.gateways()
>>> gws['default'][netifaces.AF_INET]
('10.0.1.1', 'en0')

Do note that there may be no default gateway for any given address family;
this is currently very common for IPv6 and much less common for IPv4, but it
can happen even for ``AF_INET``.
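
Because a family may simply be missing from the ``'default'`` sub-dictionary, it is safer to look it up with ``.get`` than to index directly. A small sketch over the ``gateways()`` result quoted above (again with literal data rather than a live query, so 2 and 30 stand in for ``AF_INET``/``AF_INET6``):

```python
def default_gateway(gateways, family):
    """Return the (address, interface) default for a family, or None if absent."""
    return gateways.get('default', {}).get(family)

# The gateways() result quoted above.
gws = {
    2: [('10.0.1.1', 'en0', True), ('10.2.1.1', 'en1', False)],
    30: [('fe80::1', 'en0', True)],
    'default': {2: ('10.0.1.1', 'en0'), 30: ('fe80::1', 'en0')},
}

print(default_gateway(gws, 2))   # -> ('10.0.1.1', 'en0')
print(default_gateway(gws, 99))  # -> None  (no default for that family)
```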

BTW, if you're trying to configure your machine to have multiple gateways for
the same address family, it's a very good idea to check the documentation for
your operating system *very* carefully, as some systems become extremely
confused or route packets in a non-obvious manner.

I'm very interested in hearing from anyone (on any platform) for whom the
``gateways()`` method doesn't produce the expected results. It's quite
complicated extracting this information from the operating system (whichever
operating system we're talking about), and so I expect there's at least one
system out there where this just won't work.

3. This is great! What platforms does it work on?
--------------------------------------------------

It gets regular testing on OS X, Linux and Windows. It has also been used
successfully on Solaris, and it's expected to work properly on other UNIX-like
systems as well. If you are running something that is not supported, and
wish to contribute a patch, please use BitBucket to send a pull request.

4. What license is this under?
------------------------------

It's an MIT-style license. Here goes:

Copyright (c) 2007-2017 Alastair Houghton

Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.

5. Why the jump to 0.10.0?
--------------------------

Because someone released a fork of netifaces with the version 0.9.0.
Hopefully skipping the version number should remove any confusion. In
addition, starting with 0.10.0, Python 3 is now supported and other
features/bugfixes have been included as well. See the CHANGELOG for a
more complete list of changes.

@@ -1,9 +0,0 @@
netifaces.cpython-34m.so,sha256=KiLZHMhvo_x40-9D0bLqZoVzQsGbimZY_33SUPowm9E,72976
netifaces-0.10.6.dist-info/DESCRIPTION.rst,sha256=WCNR0xdB7g_1r_U6WwIedMlurGlPeDjvJX-NBElPoII,8555
netifaces-0.10.6.dist-info/METADATA,sha256=InwXovYI_sgETAChE4hBUFbkSwYlZ_gWeKcNvyX8KOA,9322
netifaces-0.10.6.dist-info/RECORD,,
netifaces-0.10.6.dist-info/WHEEL,sha256=AEztX7vHDtcgysb-4-5-DyIKMLIPg6NMxY9dXTRdoXQ,104
netifaces-0.10.6.dist-info/metadata.json,sha256=W-IHSrO0Ma846gdBr18QTsvc9GjGN0SgAnZha0vW9tU,885
netifaces-0.10.6.dist-info/top_level.txt,sha256=PqMTaIuWtSjkdQHX6lH1Lmpv2aqBUYAGqATB8z3A6TQ,10
netifaces-0.10.6.dist-info/zip-safe,sha256=AbpHGcgLb-kRsJGnwFEktk7uzpZOCcBY74-YBdrKVGs,1
netifaces-0.10.6.dist-info/INSTALLER,sha256=zuuue4knoyJ-UwPPXg8fezS7VCrXJQrAP7zeNuwvFQg,4
@@ -1,5 +0,0 @@
Wheel-Version: 1.0
Generator: bdist_wheel (0.30.0)
Root-Is-Purelib: false
Tag: cp34-cp34m-linux_x86_64

@@ -1 +0,0 @@
{"classifiers": ["Development Status :: 4 - Beta", "Intended Audience :: Developers", "License :: OSI Approved :: MIT License", "Topic :: System :: Networking", "Programming Language :: Python", "Programming Language :: Python :: 2", "Programming Language :: Python :: 2.5", "Programming Language :: Python :: 2.6", "Programming Language :: Python :: 2.7", "Programming Language :: Python :: 3"], "description_content_type": "UNKNOWN", "extensions": {"python.details": {"contacts": [{"email": "alastair@alastairs-place.net", "name": "Alastair Houghton", "role": "author"}], "document_names": {"description": "DESCRIPTION.rst"}, "project_urls": {"Home": "https://bitbucket.org/al45tair/netifaces"}}}, "generator": "bdist_wheel (0.30.0)", "license": "MIT License", "metadata_version": "2.0", "name": "netifaces", "summary": "Portable network interface information.", "version": "0.10.6"}
Binary file not shown.
Binary file not shown.
Binary file not shown.

@@ -1,39 +0,0 @@
pip
===

The `PyPA recommended
<https://packaging.python.org/en/latest/current/>`_
tool for installing Python packages.

* `Installation <https://pip.pypa.io/en/stable/installing.html>`_
* `Documentation <https://pip.pypa.io/>`_
* `Changelog <https://pip.pypa.io/en/stable/news.html>`_
* `Github Page <https://github.com/pypa/pip>`_
* `Issue Tracking <https://github.com/pypa/pip/issues>`_
* `User mailing list <http://groups.google.com/group/python-virtualenv>`_
* `Dev mailing list <http://groups.google.com/group/pypa-dev>`_
* User IRC: #pypa on Freenode.
* Dev IRC: #pypa-dev on Freenode.


.. image:: https://img.shields.io/pypi/v/pip.svg
   :target: https://pypi.python.org/pypi/pip

.. image:: https://img.shields.io/travis/pypa/pip/master.svg
   :target: http://travis-ci.org/pypa/pip

.. image:: https://img.shields.io/appveyor/ci/pypa/pip.svg
   :target: https://ci.appveyor.com/project/pypa/pip/history

.. image:: https://readthedocs.org/projects/pip/badge/?version=stable
   :target: https://pip.pypa.io/en/stable

Code of Conduct
---------------

Everyone interacting in the pip project's codebases, issue trackers, chat
rooms, and mailing lists is expected to follow the `PyPA Code of Conduct`_.

.. _PyPA Code of Conduct: https://www.pypa.io/en/latest/code-of-conduct/
@@ -1,69 +0,0 @@
Metadata-Version: 2.0
Name: pip
Version: 9.0.1
Summary: The PyPA recommended tool for installing Python packages.
Home-page: https://pip.pypa.io/
Author: The pip developers
Author-email: python-virtualenv@groups.google.com
License: MIT
Keywords: easy_install distutils setuptools egg virtualenv
Platform: UNKNOWN
Classifier: Development Status :: 5 - Production/Stable
Classifier: Intended Audience :: Developers
Classifier: License :: OSI Approved :: MIT License
Classifier: Topic :: Software Development :: Build Tools
Classifier: Programming Language :: Python :: 2
Classifier: Programming Language :: Python :: 2.6
Classifier: Programming Language :: Python :: 2.7
Classifier: Programming Language :: Python :: 3
Classifier: Programming Language :: Python :: 3.3
Classifier: Programming Language :: Python :: 3.4
Classifier: Programming Language :: Python :: 3.5
Classifier: Programming Language :: Python :: Implementation :: PyPy
Requires-Python: >=2.6,!=3.0.*,!=3.1.*,!=3.2.*
Provides-Extra: testing
Requires-Dist: mock; extra == 'testing'
Requires-Dist: pretend; extra == 'testing'
Requires-Dist: pytest; extra == 'testing'
Requires-Dist: scripttest (>=1.3); extra == 'testing'
Requires-Dist: virtualenv (>=1.10); extra == 'testing'

pip
===

The `PyPA recommended
<https://packaging.python.org/en/latest/current/>`_
tool for installing Python packages.

* `Installation <https://pip.pypa.io/en/stable/installing.html>`_
* `Documentation <https://pip.pypa.io/>`_
* `Changelog <https://pip.pypa.io/en/stable/news.html>`_
* `Github Page <https://github.com/pypa/pip>`_
* `Issue Tracking <https://github.com/pypa/pip/issues>`_
* `User mailing list <http://groups.google.com/group/python-virtualenv>`_
* `Dev mailing list <http://groups.google.com/group/pypa-dev>`_
* User IRC: #pypa on Freenode.
* Dev IRC: #pypa-dev on Freenode.


.. image:: https://img.shields.io/pypi/v/pip.svg
   :target: https://pypi.python.org/pypi/pip

.. image:: https://img.shields.io/travis/pypa/pip/master.svg
   :target: http://travis-ci.org/pypa/pip

.. image:: https://img.shields.io/appveyor/ci/pypa/pip.svg
   :target: https://ci.appveyor.com/project/pypa/pip/history

.. image:: https://readthedocs.org/projects/pip/badge/?version=stable
   :target: https://pip.pypa.io/en/stable

Code of Conduct
---------------

Everyone interacting in the pip project's codebases, issue trackers, chat
rooms, and mailing lists is expected to follow the `PyPA Code of Conduct`_.

.. _PyPA Code of Conduct: https://www.pypa.io/en/latest/code-of-conduct/

@@ -1,123 +0,0 @@
pip/__init__.py,sha256=00QWSreEBjb8Y8sPs8HeqgLXSB-3UrONJxo4J5APxEc,11348
pip/__main__.py,sha256=V6Kh-IEDEFpt1cahRE6MajUF_14qJR_Qsvn4MjWZXzE,584
pip/basecommand.py,sha256=TTlmZesQ4Vuxcto2KqwZGmgmN5ioHEl_DeFev9ie_SA,11910
pip/baseparser.py,sha256=AKMOeF3fTrRroiv0DmTQbdiLW0DQux2KqGC_dJJB9d0,10465
pip/cmdoptions.py,sha256=pRptFz05iFEfSW4Flg3x1_P92sYlFvq7elhnwujikNY,16473
pip/download.py,sha256=rA0wbmqC2n9ejX481YJSidmKgQqQDjdaxkHkHlAN68k,32171
pip/exceptions.py,sha256=BvqH-Jw3tP2b-2IJ2kjrQemOAPMqKrQMLRIZHZQpJXk,8121
pip/index.py,sha256=L6UhtAEZc2qw7BqfQrkPQcw2gCgEw3GukLRSA95BNyI,39950
pip/locations.py,sha256=9rJRlgonC6QC2zGDIn_7mXaoZ9_tF_IHM2BQhWVRgbo,5626
pip/pep425tags.py,sha256=q3kec4f6NHszuGYIhGIbVvs896D06uJAnKFgJ_wce44,10980
pip/status_codes.py,sha256=F6uDG6Gj7RNKQJUDnd87QKqI16Us-t-B0wPF_4QMpWc,156
pip/wheel.py,sha256=QSWmGs2ui-n4UMWm0JUY6aMCcwNKungVzbWsxI9KlJQ,32010
pip/_vendor/__init__.py,sha256=L-0x9jj0HSZen1Fm2U0GUbxfjfwQPIXc4XJ4IAxy8D8,4804
pip/commands/__init__.py,sha256=2Uq3HCdjchJD9FL1LB7rd5v6UySVAVizX0W3EX3hIoE,2244
pip/commands/check.py,sha256=-A7GI1-WZBh9a4P6UoH_aR-J7I8Lz8ly7m3wnCjmevs,1382
pip/commands/completion.py,sha256=kkPgVX7SUcJ_8Juw5GkgWaxHN9_45wmAr9mGs1zXEEs,2453
pip/commands/download.py,sha256=8RuuPmSYgAq3iEDTqZY_1PDXRqREdUULHNjWJeAv7Mo,7810
pip/commands/freeze.py,sha256=h6-yFMpjCjbNj8-gOm5UuoF6cg14N5rPV4TCi3_CeuI,2835
pip/commands/hash.py,sha256=MCt4jEFyfoce0lVeNEz1x49uaTY-VDkKiBvvxrVcHkw,1597
pip/commands/help.py,sha256=84HWkEdnGP_AEBHnn8gJP2Te0XTXRKFoXqXopbOZTNo,982
pip/commands/install.py,sha256=o-CR1TKf-b1qaFv47nNlawqsIfDjXyIzv_iJUw1Trag,18069
pip/commands/list.py,sha256=93bCiFyt2Qut_YHkYHJMZHpXladmxsjS-yOtZeb3uqI,11369
pip/commands/search.py,sha256=oTs9QNdefnrmCV_JeftG0PGiMuYVmiEDF1OUaYsmDao,4502
pip/commands/show.py,sha256=ZYM57_7U8KP9MQIIyHKQdZxmiEZByy-DRzB697VFoTY,5891
pip/commands/uninstall.py,sha256=tz8cXz4WdpUdnt3RvpdQwH6_SNMB50egBIZWa1dwfcc,2884
pip/commands/wheel.py,sha256=z5SEhws2YRMb0Ml1IEkg6jFZMLRpLl86bHCrQbYt5zo,7729
pip/compat/__init__.py,sha256=2Xs_IpsmdRgHbQgQO0c8_lPvHJnQXHyGWxPbLbYJL4c,4672
pip/compat/dictconfig.py,sha256=dRrelPDWrceDSzFT51RTEVY2GuM7UDyc5Igh_tn4Fvk,23096
pip/models/__init__.py,sha256=0Rs7_RA4DxeOkWT5Cq4CQzDrSEhvYcN3TH2cazr72PE,71
pip/models/index.py,sha256=pUfbO__v3mD9j-2n_ClwPS8pVyx4l2wIwyvWt8GMCRA,487
pip/operations/__init__.py,sha256=47DEQpj8HBSa-_TImW-5JCeuQeRkm5NMpJWZG3hSuFU,0
pip/operations/check.py,sha256=uwUN9cs1sPo7c0Sj6pRrSv7b22Pk29SXUImTelVchMQ,1590
pip/operations/freeze.py,sha256=k-7w7LsM-RpPv7ERBzHiPpYkH-GuYfHLyR-Cp_1VPL0,5194
pip/req/__init__.py,sha256=vFwZY8_Vc1WU1zFAespg1My_r_AT3n7cN0W9eX0EFqk,276
pip/req/req_file.py,sha256=fG9MDsXUNPhmGwxUiwrIXEynyD8Q7s3L47-hLZPDXq0,11926
pip/req/req_install.py,sha256=gYrH-lwQMmt55VVbav_EtRIPu94cQbHFHm_Kq6AeHbg,46487
pip/req/req_set.py,sha256=jHspXqcA2FxcF05dgUIAZ5huYPv6bn0wRUX0Z7PKmaA,34462
pip/req/req_uninstall.py,sha256=fdH2VgCjEC8NRYDS7fRu3ZJaBBUEy-N5muwxDX5MBNM,6897
pip/utils/__init__.py,sha256=zk1vF2EzHZX1ZKPwgeC9I6yKvs8IJ6NZEfXgp2IP8hI,27912
pip/utils/appdirs.py,sha256=kj2LK-I2fC5QnEh_A_v-ev_IQMcXaWWF5DE39sNvCLQ,8811
pip/utils/build.py,sha256=4smLRrfSCmXmjEnVnMFh2tBEpNcSLRe6J0ejZJ-wWJE,1312
pip/utils/deprecation.py,sha256=X_FMjtDbMJqfqEkdRrki-mYyIdPB6I6DHUTCA_ChY6M,2232
pip/utils/encoding.py,sha256=NQxGiFS5GbeAveLZTnx92t5r0PYqvt0iRnP2u9SGG1w,971
pip/utils/filesystem.py,sha256=ZEVBuYM3fqr2_lgOESh4Y7fPFszGD474zVm_M3Mb5Tk,899
pip/utils/glibc.py,sha256=jcQYjt_oJLPKVZB28Kauy4Sw70zS-wawxoU1HHX36_0,2939
pip/utils/hashes.py,sha256=oMk7cd3PbJgzpSQyXq1MytMud5f6H5Oa2YY5hYuCq6I,2866
pip/utils/logging.py,sha256=7yWu4gZw-Qclj7X80QVdpGWkdTWGKT4LiUVKcE04pro,3327
pip/utils/outdated.py,sha256=fNwOCL5r2EftPGhgCYGMKu032HC8cV-JAr9lp0HmToM,5455
pip/utils/packaging.py,sha256=qhmli14odw6DIhWJgQYS2Q0RrSbr8nXNcG48f5yTRms,2080
pip/utils/setuptools_build.py,sha256=0blfscmNJW_iZ5DcswJeDB_PbtTEjfK9RL1R1WEDW2E,278
pip/utils/ui.py,sha256=pbDkSAeumZ6jdZcOJ2yAbx8iBgeP2zfpqNnLJK1gskQ,11597
pip/vcs/__init__.py,sha256=WafFliUTHMmsSISV8PHp1M5EXDNSWyJr78zKaQmPLdY,12374
pip/vcs/bazaar.py,sha256=tYTwc4b4off8mr0O2o8SiGejqBDJxcbDBMSMd9-ISYc,3803
pip/vcs/git.py,sha256=5LfWryi78A-2ULjEZJvCTarJ_3l8venwXASlwm8hiug,11197
pip/vcs/mercurial.py,sha256=xG6rDiwHCRytJEs23SIHBXl_SwQo2jkkdD_6rVVP5h4,3472
pip/vcs/subversion.py,sha256=GAuX2Sk7IZvJyEzENKcVld_wGBrQ3fpXDlXjapZEYdI,9350
pip-9.0.1.dist-info/DESCRIPTION.rst,sha256=Va8Wj1XBpTbVQ2Z41mZRJdALEeziiS_ZewWn1H2ecY4,1287
pip-9.0.1.dist-info/METADATA,sha256=mvs_tLoKAbECXY_6QHiVWQsagSL-1UjolQTpScT8JSk,2529
pip-9.0.1.dist-info/RECORD,,
pip-9.0.1.dist-info/WHEEL,sha256=o2k-Qa-RMNIJmUdIc7KU6VWR_ErNRbWNlxDIpl7lm34,110
pip-9.0.1.dist-info/entry_points.txt,sha256=GWc-Wb9WUKZ1EuVWNz-G0l3BeIpbNJLx0OJbZ61AAV0,68
pip-9.0.1.dist-info/metadata.json,sha256=aqvkETDy4mHUBob-2Fn5WWlXORi_M2OSfQ2HQCUU_Fk,1565
pip-9.0.1.dist-info/top_level.txt,sha256=zuuue4knoyJ-UwPPXg8fezS7VCrXJQrAP7zeNuwvFQg,4
../../../bin/pip,sha256=kUtfTrIe4CRluRco6nKs-hUx0Eir2ABPF8Rr_1zK534,272
../../../bin/pip3,sha256=kUtfTrIe4CRluRco6nKs-hUx0Eir2ABPF8Rr_1zK534,272
../../../bin/pip3.4,sha256=kUtfTrIe4CRluRco6nKs-hUx0Eir2ABPF8Rr_1zK534,272
pip-9.0.1.dist-info/INSTALLER,sha256=zuuue4knoyJ-UwPPXg8fezS7VCrXJQrAP7zeNuwvFQg,4
pip/__pycache__/exceptions.cpython-34.pyc,,
pip/utils/__pycache__/ui.cpython-34.pyc,,
pip/__pycache__/basecommand.cpython-34.pyc,,
pip/commands/__pycache__/check.cpython-34.pyc,,
pip/utils/__pycache__/packaging.cpython-34.pyc,,
pip/utils/__pycache__/build.cpython-34.pyc,,
pip/vcs/__pycache__/__init__.cpython-34.pyc,,
pip/__pycache__/download.cpython-34.pyc,,
pip/utils/__pycache__/setuptools_build.cpython-34.pyc,,
pip/req/__pycache__/req_uninstall.cpython-34.pyc,,
pip/utils/__pycache__/deprecation.cpython-34.pyc,,
pip/operations/__pycache__/check.cpython-34.pyc,,
pip/_vendor/__pycache__/__init__.cpython-34.pyc,,
pip/utils/__pycache__/outdated.cpython-34.pyc,,
pip/commands/__pycache__/install.cpython-34.pyc,,
pip/operations/__pycache__/__init__.cpython-34.pyc,,
pip/commands/__pycache__/freeze.cpython-34.pyc,,
pip/req/__pycache__/req_set.cpython-34.pyc,,
pip/operations/__pycache__/freeze.cpython-34.pyc,,
pip/__pycache__/baseparser.cpython-34.pyc,,
pip/commands/__pycache__/hash.cpython-34.pyc,,
pip/commands/__pycache__/download.cpython-34.pyc,,
pip/commands/__pycache__/wheel.cpython-34.pyc,,
pip/commands/__pycache__/help.cpython-34.pyc,,
pip/utils/__pycache__/glibc.cpython-34.pyc,,
pip/__pycache__/locations.cpython-34.pyc,,
pip/commands/__pycache__/list.cpython-34.pyc,,
pip/compat/__pycache__/dictconfig.cpython-34.pyc,,
pip/__pycache__/__init__.cpython-34.pyc,,
pip/utils/__pycache__/hashes.cpython-34.pyc,,
pip/compat/__pycache__/__init__.cpython-34.pyc,,
pip/vcs/__pycache__/git.cpython-34.pyc,,
pip/req/__pycache__/__init__.cpython-34.pyc,,
pip/__pycache__/__main__.cpython-34.pyc,,
pip/__pycache__/status_codes.cpython-34.pyc,,
pip/models/__pycache__/index.cpython-34.pyc,,
pip/__pycache__/pep425tags.cpython-34.pyc,,
pip/commands/__pycache__/uninstall.cpython-34.pyc,,
pip/vcs/__pycache__/bazaar.cpython-34.pyc,,
pip/req/__pycache__/req_install.cpython-34.pyc,,
pip/vcs/__pycache__/mercurial.cpython-34.pyc,,
pip/commands/__pycache__/__init__.cpython-34.pyc,,
pip/commands/__pycache__/show.cpython-34.pyc,,
pip/__pycache__/index.cpython-34.pyc,,
pip/commands/__pycache__/completion.cpython-34.pyc,,
pip/req/__pycache__/req_file.cpython-34.pyc,,
pip/__pycache__/cmdoptions.cpython-34.pyc,,
pip/utils/__pycache__/filesystem.cpython-34.pyc,,
pip/__pycache__/wheel.cpython-34.pyc,,
pip/utils/__pycache__/appdirs.cpython-34.pyc,,
pip/utils/__pycache__/__init__.cpython-34.pyc,,
pip/vcs/__pycache__/subversion.cpython-34.pyc,,
pip/utils/__pycache__/logging.cpython-34.pyc,,
pip/commands/__pycache__/search.cpython-34.pyc,,
pip/utils/__pycache__/encoding.cpython-34.pyc,,
pip/models/__pycache__/__init__.cpython-34.pyc,,
@@ -1,5 +0,0 @@
[console_scripts]
pip = pip:main
pip3 = pip:main
pip3.5 = pip:main

@@ -1 +0,0 @@
{"classifiers": ["Development Status :: 5 - Production/Stable", "Intended Audience :: Developers", "License :: OSI Approved :: MIT License", "Topic :: Software Development :: Build Tools", "Programming Language :: Python :: 2", "Programming Language :: Python :: 2.6", "Programming Language :: Python :: 2.7", "Programming Language :: Python :: 3", "Programming Language :: Python :: 3.3", "Programming Language :: Python :: 3.4", "Programming Language :: Python :: 3.5", "Programming Language :: Python :: Implementation :: PyPy"], "extensions": {"python.commands": {"wrap_console": {"pip": "pip:main", "pip3": "pip:main", "pip3.5": "pip:main"}}, "python.details": {"contacts": [{"email": "python-virtualenv@groups.google.com", "name": "The pip developers", "role": "author"}], "document_names": {"description": "DESCRIPTION.rst"}, "project_urls": {"Home": "https://pip.pypa.io/"}}, "python.exports": {"console_scripts": {"pip": "pip:main", "pip3": "pip:main", "pip3.5": "pip:main"}}}, "extras": ["testing"], "generator": "bdist_wheel (0.29.0)", "keywords": ["easy_install", "distutils", "setuptools", "egg", "virtualenv"], "license": "MIT", "metadata_version": "2.0", "name": "pip", "requires_python": ">=2.6,!=3.0.*,!=3.1.*,!=3.2.*", "run_requires": [{"extra": "testing", "requires": ["mock", "pretend", "pytest", "scripttest (>=1.3)", "virtualenv (>=1.10)"]}], "summary": "The PyPA recommended tool for installing Python packages.", "test_requires": [{"requires": ["mock", "pretend", "pytest", "scripttest (>=1.3)", "virtualenv (>=1.10)"]}], "version": "9.0.1"}
@@ -1,331 +0,0 @@
#!/usr/bin/env python
from __future__ import absolute_import

import locale
import logging
import os
import optparse
import warnings

import sys
import re

# 2016-06-17 barry@debian.org: urllib3 1.14 added optional support for socks,
# but if invoked (i.e. imported), it will issue a warning to stderr if socks
# isn't available. requests unconditionally imports urllib3's socks contrib
# module, triggering this warning. The warning breaks DEP-8 tests (because of
# the stderr output) and is just plain annoying in normal usage. I don't want
# to add socks as yet another dependency for pip, nor do I want to allow-stderr
# in the DEP-8 tests, so just suppress the warning. pdb tells me this has to
# be done before the import of pip.vcs.
from pip._vendor.requests.packages.urllib3.exceptions import DependencyWarning
warnings.filterwarnings("ignore", category=DependencyWarning)  # noqa


from pip.exceptions import InstallationError, CommandError, PipError
from pip.utils import get_installed_distributions, get_prog
from pip.utils import deprecation, dist_is_editable
from pip.vcs import git, mercurial, subversion, bazaar  # noqa
from pip.baseparser import ConfigOptionParser, UpdatingDefaultsHelpFormatter
from pip.commands import get_summaries, get_similar_commands
from pip.commands import commands_dict
from pip._vendor.requests.packages.urllib3.exceptions import (
    InsecureRequestWarning,
)


# assignment for flake8 to be happy

# This fixes a peculiarity when importing via __import__ - as we are
# initialising the pip module, "from pip import cmdoptions" is recursive
# and appears not to work properly in that situation.
import pip.cmdoptions
cmdoptions = pip.cmdoptions

# The version as used in the setup.py and the docs conf.py
__version__ = "9.0.1"


logger = logging.getLogger(__name__)

# Hide the InsecureRequestWarning from urllib3
warnings.filterwarnings("ignore", category=InsecureRequestWarning)


def autocomplete():
    """Command and option completion for the main option parser (and options)
    and its subcommands (and options).

    Enable by sourcing one of the completion shell scripts (bash, zsh or fish).
    """
    # Don't complete if user hasn't sourced bash_completion file.
    if 'PIP_AUTO_COMPLETE' not in os.environ:
        return
    cwords = os.environ['COMP_WORDS'].split()[1:]
    cword = int(os.environ['COMP_CWORD'])
    try:
        current = cwords[cword - 1]
    except IndexError:
        current = ''

    subcommands = [cmd for cmd, summary in get_summaries()]
    options = []
    # subcommand
    try:
        subcommand_name = [w for w in cwords if w in subcommands][0]
    except IndexError:
        subcommand_name = None

    parser = create_main_parser()
    # subcommand options
    if subcommand_name:
        # special case: 'help' subcommand has no options
        if subcommand_name == 'help':
            sys.exit(1)
        # special case: list locally installed dists for uninstall command
        if subcommand_name == 'uninstall' and not current.startswith('-'):
            installed = []
            lc = current.lower()
            for dist in get_installed_distributions(local_only=True):
                if dist.key.startswith(lc) and dist.key not in cwords[1:]:
                    installed.append(dist.key)
            # if there are no dists installed, fall back to option completion
            if installed:
                for dist in installed:
                    print(dist)
                sys.exit(1)

        subcommand = commands_dict[subcommand_name]()
        options += [(opt.get_opt_string(), opt.nargs)
                    for opt in subcommand.parser.option_list_all
                    if opt.help != optparse.SUPPRESS_HELP]

        # filter out previously specified options from available options
        prev_opts = [x.split('=')[0] for x in cwords[1:cword - 1]]
        options = [(x, v) for (x, v) in options if x not in prev_opts]
        # filter options by current input
        options = [(k, v) for k, v in options if k.startswith(current)]
        for option in options:
            opt_label = option[0]
            # append '=' to options which require args
            if option[1]:
                opt_label += '='
            print(opt_label)
    else:
        # show main parser options only when necessary
        if current.startswith('-') or current.startswith('--'):
            opts = [i.option_list for i in parser.option_groups]
            opts.append(parser.option_list)
            opts = (o for it in opts for o in it)

            subcommands += [i.get_opt_string() for i in opts
                            if i.help != optparse.SUPPRESS_HELP]

        print(' '.join([x for x in subcommands if x.startswith(current)]))
    sys.exit(1)


def create_main_parser():
    parser_kw = {
        'usage': '\n%prog <command> [options]',
        'add_help_option': False,
        'formatter': UpdatingDefaultsHelpFormatter(),
        'name': 'global',
        'prog': get_prog(),
    }

    parser = ConfigOptionParser(**parser_kw)
    parser.disable_interspersed_args()

    pip_pkg_dir = os.path.dirname(os.path.dirname(os.path.abspath(__file__)))
    parser.version = 'pip %s from %s (python %s)' % (
        __version__, pip_pkg_dir, sys.version[:3])

    # add the general options
    gen_opts = cmdoptions.make_option_group(cmdoptions.general_group, parser)
    parser.add_option_group(gen_opts)

    parser.main = True  # so the help formatter knows

    # create command listing for description
    command_summaries = get_summaries()
    description = [''] + ['%-27s %s' % (i, j) for i, j in command_summaries]
    parser.description = '\n'.join(description)

    return parser


def parseopts(args):
    parser = create_main_parser()

    # Note: parser calls disable_interspersed_args(), so the result of this
    # call is to split the initial args into the general options before the
    # subcommand and everything else.
    # For example:
    #  args: ['--timeout=5', 'install', '--user', 'INITools']
    #  general_options: ['--timeout=5']
    #  args_else: ['install', '--user', 'INITools']
    general_options, args_else = parser.parse_args(args)

    # --version
    if general_options.version:
        sys.stdout.write(parser.version)
        sys.stdout.write(os.linesep)
        sys.exit()

    # pip || pip help -> print_help()
    if not args_else or (args_else[0] == 'help' and len(args_else) == 1):
        parser.print_help()
        sys.exit()

    # the subcommand name
    cmd_name = args_else[0]

    if cmd_name not in commands_dict:
|
|
||||||
guess = get_similar_commands(cmd_name)
|
|
||||||
|
|
||||||
msg = ['unknown command "%s"' % cmd_name]
|
|
||||||
if guess:
|
|
||||||
msg.append('maybe you meant "%s"' % guess)
|
|
||||||
|
|
||||||
raise CommandError(' - '.join(msg))
|
|
||||||
|
|
||||||
# all the args without the subcommand
|
|
||||||
cmd_args = args[:]
|
|
||||||
cmd_args.remove(cmd_name)
|
|
||||||
|
|
||||||
return cmd_name, cmd_args
|
|
||||||
|
|
||||||
|
|
||||||
def check_isolated(args):
|
|
||||||
isolated = False
|
|
||||||
|
|
||||||
if "--isolated" in args:
|
|
||||||
isolated = True
|
|
||||||
|
|
||||||
return isolated
|
|
||||||
|
|
||||||
|
|
||||||
def main(args=None):
|
|
||||||
if args is None:
|
|
||||||
args = sys.argv[1:]
|
|
||||||
|
|
||||||
# Configure our deprecation warnings to be sent through loggers
|
|
||||||
deprecation.install_warning_logger()
|
|
||||||
|
|
||||||
autocomplete()
|
|
||||||
|
|
||||||
try:
|
|
||||||
cmd_name, cmd_args = parseopts(args)
|
|
||||||
except PipError as exc:
|
|
||||||
sys.stderr.write("ERROR: %s" % exc)
|
|
||||||
sys.stderr.write(os.linesep)
|
|
||||||
sys.exit(1)
|
|
||||||
|
|
||||||
# Needed for locale.getpreferredencoding(False) to work
|
|
||||||
# in pip.utils.encoding.auto_decode
|
|
||||||
try:
|
|
||||||
locale.setlocale(locale.LC_ALL, '')
|
|
||||||
except locale.Error as e:
|
|
||||||
# setlocale can apparently crash if locale are uninitialized
|
|
||||||
logger.debug("Ignoring error %s when setting locale", e)
|
|
||||||
command = commands_dict[cmd_name](isolated=check_isolated(cmd_args))
|
|
||||||
return command.main(cmd_args)
|
|
||||||
|
|
||||||
|
|
||||||
# ###########################################################
|
|
||||||
# # Writing freeze files
|
|
||||||
|
|
||||||
class FrozenRequirement(object):
|
|
||||||
|
|
||||||
def __init__(self, name, req, editable, comments=()):
|
|
||||||
self.name = name
|
|
||||||
self.req = req
|
|
||||||
self.editable = editable
|
|
||||||
self.comments = comments
|
|
||||||
|
|
||||||
_rev_re = re.compile(r'-r(\d+)$')
|
|
||||||
_date_re = re.compile(r'-(20\d\d\d\d\d\d)$')
|
|
||||||
|
|
||||||
@classmethod
|
|
||||||
def from_dist(cls, dist, dependency_links):
|
|
||||||
location = os.path.normcase(os.path.abspath(dist.location))
|
|
||||||
comments = []
|
|
||||||
from pip.vcs import vcs, get_src_requirement
|
|
||||||
if dist_is_editable(dist) and vcs.get_backend_name(location):
|
|
||||||
editable = True
|
|
||||||
try:
|
|
||||||
req = get_src_requirement(dist, location)
|
|
||||||
except InstallationError as exc:
|
|
||||||
logger.warning(
|
|
||||||
"Error when trying to get requirement for VCS system %s, "
|
|
||||||
"falling back to uneditable format", exc
|
|
||||||
)
|
|
||||||
req = None
|
|
||||||
if req is None:
|
|
||||||
logger.warning(
|
|
||||||
'Could not determine repository location of %s', location
|
|
||||||
)
|
|
||||||
comments.append(
|
|
||||||
'## !! Could not determine repository location'
|
|
||||||
)
|
|
||||||
req = dist.as_requirement()
|
|
||||||
editable = False
|
|
||||||
else:
|
|
||||||
editable = False
|
|
||||||
req = dist.as_requirement()
|
|
||||||
specs = req.specs
|
|
||||||
assert len(specs) == 1 and specs[0][0] in ["==", "==="], \
|
|
||||||
'Expected 1 spec with == or ===; specs = %r; dist = %r' % \
|
|
||||||
(specs, dist)
|
|
||||||
version = specs[0][1]
|
|
||||||
ver_match = cls._rev_re.search(version)
|
|
||||||
date_match = cls._date_re.search(version)
|
|
||||||
if ver_match or date_match:
|
|
||||||
svn_backend = vcs.get_backend('svn')
|
|
||||||
if svn_backend:
|
|
||||||
svn_location = svn_backend().get_location(
|
|
||||||
dist,
|
|
||||||
dependency_links,
|
|
||||||
)
|
|
||||||
if not svn_location:
|
|
||||||
logger.warning(
|
|
||||||
'Warning: cannot find svn location for %s', req)
|
|
||||||
comments.append(
|
|
||||||
'## FIXME: could not find svn URL in dependency_links '
|
|
||||||
'for this package:'
|
|
||||||
)
|
|
||||||
else:
|
|
||||||
comments.append(
|
|
||||||
'# Installing as editable to satisfy requirement %s:' %
|
|
||||||
req
|
|
||||||
)
|
|
||||||
if ver_match:
|
|
||||||
rev = ver_match.group(1)
|
|
||||||
else:
|
|
||||||
rev = '{%s}' % date_match.group(1)
|
|
||||||
editable = True
|
|
||||||
req = '%s@%s#egg=%s' % (
|
|
||||||
svn_location,
|
|
||||||
rev,
|
|
||||||
cls.egg_name(dist)
|
|
||||||
)
|
|
||||||
return cls(dist.project_name, req, editable, comments)
|
|
||||||
|
|
||||||
@staticmethod
|
|
||||||
def egg_name(dist):
|
|
||||||
name = dist.egg_name()
|
|
||||||
match = re.search(r'-py\d\.\d$', name)
|
|
||||||
if match:
|
|
||||||
name = name[:match.start()]
|
|
||||||
return name
|
|
||||||
|
|
||||||
def __str__(self):
|
|
||||||
req = self.req
|
|
||||||
if self.editable:
|
|
||||||
req = '-e %s' % req
|
|
||||||
return '\n'.join(list(self.comments) + [str(req)]) + '\n'
|
|
||||||
|
|
||||||
|
|
||||||
if __name__ == '__main__':
|
|
||||||
sys.exit(main())
|
|
|
@@ -1,39 +0,0 @@
import logging

from pip.basecommand import Command
from pip.operations.check import check_requirements
from pip.utils import get_installed_distributions


logger = logging.getLogger(__name__)


class CheckCommand(Command):
    """Verify installed packages have compatible dependencies."""
    name = 'check'
    usage = """
      %prog [options]"""
    summary = 'Verify installed packages have compatible dependencies.'

    def run(self, options, args):
        dists = get_installed_distributions(local_only=False, skip=())
        missing_reqs_dict, incompatible_reqs_dict = check_requirements(dists)

        for dist in dists:
            key = '%s==%s' % (dist.project_name, dist.version)

            for requirement in missing_reqs_dict.get(key, []):
                logger.info(
                    "%s %s requires %s, which is not installed.",
                    dist.project_name, dist.version, requirement.project_name)

            for requirement, actual in incompatible_reqs_dict.get(key, []):
                logger.info(
                    "%s %s has requirement %s, but you have %s %s.",
                    dist.project_name, dist.version, requirement,
                    actual.project_name, actual.version)

        if missing_reqs_dict or incompatible_reqs_dict:
            return 1
        else:
            logger.info("No broken requirements found.")
@@ -1,455 +0,0 @@
from __future__ import absolute_import

import logging
import operator
import os
import tempfile
import shutil
import warnings
try:
    import wheel
except ImportError:
    wheel = None

from pip.req import RequirementSet
from pip.basecommand import RequirementCommand
from pip.locations import virtualenv_no_global, distutils_scheme
from pip.exceptions import (
    InstallationError, CommandError, PreviousBuildDirError,
)
from pip import cmdoptions
from pip.utils import ensure_dir, get_installed_version
from pip.utils.build import BuildDirectory
from pip.utils.deprecation import RemovedInPip10Warning
from pip.utils.filesystem import check_path_owner
from pip.wheel import WheelCache, WheelBuilder

from pip.locations import running_under_virtualenv

logger = logging.getLogger(__name__)


class InstallCommand(RequirementCommand):
    """
    Install packages from:

    - PyPI (and other indexes) using requirement specifiers.
    - VCS project urls.
    - Local project directories.
    - Local or remote source archives.

    pip also supports installing from "requirements files", which provide
    an easy way to specify a whole environment to be installed.
    """
    name = 'install'

    usage = """
      %prog [options] <requirement specifier> [package-index-options] ...
      %prog [options] -r <requirements file> [package-index-options] ...
      %prog [options] [-e] <vcs project url> ...
      %prog [options] [-e] <local project path> ...
      %prog [options] <archive url/path> ..."""

    summary = 'Install packages.'

    def __init__(self, *args, **kw):
        super(InstallCommand, self).__init__(*args, **kw)

        default_user = True
        if running_under_virtualenv():
            default_user = False
        if os.geteuid() == 0:
            default_user = False

        cmd_opts = self.cmd_opts

        cmd_opts.add_option(cmdoptions.constraints())
        cmd_opts.add_option(cmdoptions.editable())
        cmd_opts.add_option(cmdoptions.requirements())
        cmd_opts.add_option(cmdoptions.build_dir())

        cmd_opts.add_option(
            '-t', '--target',
            dest='target_dir',
            metavar='dir',
            default=None,
            help='Install packages into <dir>. '
                 'By default this will not replace existing files/folders in '
                 '<dir>. Use --upgrade to replace existing packages in <dir> '
                 'with new versions.'
        )

        cmd_opts.add_option(
            '-d', '--download', '--download-dir', '--download-directory',
            dest='download_dir',
            metavar='dir',
            default=None,
            help=("Download packages into <dir> instead of installing them, "
                  "regardless of what's already installed."),
        )

        cmd_opts.add_option(cmdoptions.src())

        cmd_opts.add_option(
            '-U', '--upgrade',
            dest='upgrade',
            action='store_true',
            help='Upgrade all specified packages to the newest available '
                 'version. The handling of dependencies depends on the '
                 'upgrade-strategy used.'
        )

        cmd_opts.add_option(
            '--upgrade-strategy',
            dest='upgrade_strategy',
            default='eager',
            choices=['only-if-needed', 'eager'],
            help='Determines how dependency upgrading should be handled. '
                 '"eager" - dependencies are upgraded regardless of '
                 'whether the currently installed version satisfies the '
                 'requirements of the upgraded package(s). '
                 '"only-if-needed" - are upgraded only when they do not '
                 'satisfy the requirements of the upgraded package(s).'
        )

        cmd_opts.add_option(
            '--force-reinstall',
            dest='force_reinstall',
            action='store_true',
            help='When upgrading, reinstall all packages even if they are '
                 'already up-to-date.')

        cmd_opts.add_option(
            '-I', '--ignore-installed',
            dest='ignore_installed',
            action='store_true',
            default=default_user,
            help='Ignore the installed packages (reinstalling instead).')

        cmd_opts.add_option(cmdoptions.ignore_requires_python())
        cmd_opts.add_option(cmdoptions.no_deps())

        cmd_opts.add_option(cmdoptions.install_options())
        cmd_opts.add_option(cmdoptions.global_options())

        cmd_opts.add_option(
            '--user',
            dest='use_user_site',
            action='store_true',
            default=default_user,
            help="Install to the Python user install directory for your "
                 "platform. Typically ~/.local/, or %APPDATA%\Python on "
                 "Windows. (See the Python documentation for site.USER_BASE "
                 "for full details.) On Debian systems, this is the "
                 "default when running outside of a virtual environment "
                 "and not as root.")

        cmd_opts.add_option(
            '--system',
            dest='use_user_site',
            action='store_false',
            help="Install using the system scheme (overrides --user on "
                 "Debian systems)")

        cmd_opts.add_option(
            '--egg',
            dest='as_egg',
            action='store_true',
            help="Install packages as eggs, not 'flat', like pip normally "
                 "does. This option is not about installing *from* eggs. "
                 "(WARNING: Because this option overrides pip's normal install"
                 " logic, requirements files may not behave as expected.)")

        cmd_opts.add_option(
            '--root',
            dest='root_path',
            metavar='dir',
            default=None,
            help="Install everything relative to this alternate root "
                 "directory.")

        cmd_opts.add_option(
            '--prefix',
            dest='prefix_path',
            metavar='dir',
            default=None,
            help="Installation prefix where lib, bin and other top-level "
                 "folders are placed")

        cmd_opts.add_option(
            "--compile",
            action="store_true",
            dest="compile",
            default=True,
            help="Compile py files to pyc",
        )

        cmd_opts.add_option(
            "--no-compile",
            action="store_false",
            dest="compile",
            help="Do not compile py files to pyc",
        )

        cmd_opts.add_option(cmdoptions.use_wheel())
        cmd_opts.add_option(cmdoptions.no_use_wheel())
        cmd_opts.add_option(cmdoptions.no_binary())
        cmd_opts.add_option(cmdoptions.only_binary())
        cmd_opts.add_option(cmdoptions.pre())
        cmd_opts.add_option(cmdoptions.no_clean())
        cmd_opts.add_option(cmdoptions.require_hashes())

        index_opts = cmdoptions.make_option_group(
            cmdoptions.index_group,
            self.parser,
        )

        self.parser.insert_option_group(0, index_opts)
        self.parser.insert_option_group(0, cmd_opts)

    def run(self, options, args):
        cmdoptions.resolve_wheel_no_use_binary(options)
        cmdoptions.check_install_build_global(options)

        if options.as_egg:
            warnings.warn(
                "--egg has been deprecated and will be removed in the future. "
                "This flag is mutually exclusive with large parts of pip, and "
                "actually using it invalidates pip's ability to manage the "
                "installation process.",
                RemovedInPip10Warning,
            )

        if options.allow_external:
            warnings.warn(
                "--allow-external has been deprecated and will be removed in "
                "the future. Due to changes in the repository protocol, it no "
                "longer has any effect.",
                RemovedInPip10Warning,
            )

        if options.allow_all_external:
            warnings.warn(
                "--allow-all-external has been deprecated and will be removed "
                "in the future. Due to changes in the repository protocol, it "
                "no longer has any effect.",
                RemovedInPip10Warning,
            )

        if options.allow_unverified:
            warnings.warn(
                "--allow-unverified has been deprecated and will be removed "
                "in the future. Due to changes in the repository protocol, it "
                "no longer has any effect.",
                RemovedInPip10Warning,
            )

        if options.download_dir:
            warnings.warn(
                "pip install --download has been deprecated and will be "
                "removed in the future. Pip now has a download command that "
                "should be used instead.",
                RemovedInPip10Warning,
            )
            options.ignore_installed = True

        if options.build_dir:
            options.build_dir = os.path.abspath(options.build_dir)

        options.src_dir = os.path.abspath(options.src_dir)
        install_options = options.install_options or []
        if options.use_user_site:
            if options.prefix_path:
                raise CommandError(
                    "Can not combine '--user' and '--prefix' as they imply "
                    "different installation locations"
                )
            if virtualenv_no_global():
                raise InstallationError(
                    "Can not perform a '--user' install. User site-packages "
                    "are not visible in this virtualenv."
                )
            install_options.append('--user')
            install_options.append('--prefix=')

        temp_target_dir = None
        if options.target_dir:
            options.ignore_installed = True
            temp_target_dir = tempfile.mkdtemp()
            options.target_dir = os.path.abspath(options.target_dir)
            if (os.path.exists(options.target_dir) and not
                    os.path.isdir(options.target_dir)):
                raise CommandError(
                    "Target path exists but is not a directory, will not "
                    "continue."
                )
            install_options.append('--home=' + temp_target_dir)

        global_options = options.global_options or []

        with self._build_session(options) as session:

            finder = self._build_package_finder(options, session)
            build_delete = (not (options.no_clean or options.build_dir))
            wheel_cache = WheelCache(options.cache_dir, options.format_control)
            if options.cache_dir and not check_path_owner(options.cache_dir):
                logger.warning(
                    "The directory '%s' or its parent directory is not owned "
                    "by the current user and caching wheels has been "
                    "disabled. check the permissions and owner of that "
                    "directory. If executing pip with sudo, you may want "
                    "sudo's -H flag.",
                    options.cache_dir,
                )
                options.cache_dir = None

            with BuildDirectory(options.build_dir,
                                delete=build_delete) as build_dir:
                requirement_set = RequirementSet(
                    build_dir=build_dir,
                    src_dir=options.src_dir,
                    download_dir=options.download_dir,
                    upgrade=options.upgrade,
                    upgrade_strategy=options.upgrade_strategy,
                    as_egg=options.as_egg,
                    ignore_installed=options.ignore_installed,
                    ignore_dependencies=options.ignore_dependencies,
                    ignore_requires_python=options.ignore_requires_python,
                    force_reinstall=options.force_reinstall,
                    use_user_site=options.use_user_site,
                    target_dir=temp_target_dir,
                    session=session,
                    pycompile=options.compile,
                    isolated=options.isolated_mode,
                    wheel_cache=wheel_cache,
                    require_hashes=options.require_hashes,
                )

                self.populate_requirement_set(
                    requirement_set, args, options, finder, session, self.name,
                    wheel_cache
                )

                if not requirement_set.has_requirements:
                    return

                try:
                    if (options.download_dir or not wheel or not
                            options.cache_dir):
                        # on -d don't do complex things like building
                        # wheels, and don't try to build wheels when wheel is
                        # not installed.
                        requirement_set.prepare_files(finder)
                    else:
                        # build wheels before install.
                        wb = WheelBuilder(
                            requirement_set,
                            finder,
                            build_options=[],
                            global_options=[],
                        )
                        # Ignore the result: a failed wheel will be
                        # installed from the sdist/vcs whatever.
                        wb.build(autobuilding=True)

                    if not options.download_dir:
                        requirement_set.install(
                            install_options,
                            global_options,
                            root=options.root_path,
                            prefix=options.prefix_path,
                        )

                        possible_lib_locations = get_lib_location_guesses(
                            user=options.use_user_site,
                            home=temp_target_dir,
                            root=options.root_path,
                            prefix=options.prefix_path,
                            isolated=options.isolated_mode,
                        )
                        reqs = sorted(
                            requirement_set.successfully_installed,
                            key=operator.attrgetter('name'))
                        items = []
                        for req in reqs:
                            item = req.name
                            try:
                                installed_version = get_installed_version(
                                    req.name, possible_lib_locations
                                )
                                if installed_version:
                                    item += '-' + installed_version
                            except Exception:
                                pass
                            items.append(item)
                        installed = ' '.join(items)
                        if installed:
                            logger.info('Successfully installed %s', installed)
                    else:
                        downloaded = ' '.join([
                            req.name
                            for req in requirement_set.successfully_downloaded
                        ])
                        if downloaded:
                            logger.info(
                                'Successfully downloaded %s', downloaded
                            )
                except PreviousBuildDirError:
                    options.no_clean = True
                    raise
                finally:
                    # Clean up
                    if not options.no_clean:
                        requirement_set.cleanup_files()

        if options.target_dir:
            ensure_dir(options.target_dir)

            # Checking both purelib and platlib directories for installed
            # packages to be moved to target directory
            lib_dir_list = []

            purelib_dir = distutils_scheme('', home=temp_target_dir)['purelib']
            platlib_dir = distutils_scheme('', home=temp_target_dir)['platlib']

            if os.path.exists(purelib_dir):
                lib_dir_list.append(purelib_dir)
            if os.path.exists(platlib_dir) and platlib_dir != purelib_dir:
                lib_dir_list.append(platlib_dir)

            for lib_dir in lib_dir_list:
                for item in os.listdir(lib_dir):
                    target_item_dir = os.path.join(options.target_dir, item)
                    if os.path.exists(target_item_dir):
                        if not options.upgrade:
                            logger.warning(
                                'Target directory %s already exists. Specify '
                                '--upgrade to force replacement.',
                                target_item_dir
                            )
                            continue
                        if os.path.islink(target_item_dir):
                            logger.warning(
                                'Target directory %s already exists and is '
                                'a link. Pip will not automatically replace '
                                'links, please remove if replacement is '
                                'desired.',
                                target_item_dir
                            )
                            continue
                        if os.path.isdir(target_item_dir):
                            shutil.rmtree(target_item_dir)
                        else:
                            os.remove(target_item_dir)

                    shutil.move(
                        os.path.join(lib_dir, item),
                        target_item_dir
                    )
            shutil.rmtree(temp_target_dir)
        return requirement_set


def get_lib_location_guesses(*args, **kwargs):
    scheme = distutils_scheme('', *args, **kwargs)
    return [scheme['purelib'], scheme['platlib']]
@@ -1,164 +0,0 @@
"""Stuff that differs in different Python versions and platform
distributions."""
from __future__ import absolute_import, division

import os
import sys

from pip._vendor.six import text_type

try:
    from logging.config import dictConfig as logging_dictConfig
except ImportError:
    from pip.compat.dictconfig import dictConfig as logging_dictConfig

try:
    from collections import OrderedDict
except ImportError:
    from pip._vendor.ordereddict import OrderedDict

try:
    import ipaddress
except ImportError:
    try:
        from pip._vendor import ipaddress
    except ImportError:
        import ipaddr as ipaddress
        ipaddress.ip_address = ipaddress.IPAddress
        ipaddress.ip_network = ipaddress.IPNetwork


try:
    import sysconfig

    def get_stdlib():
        paths = [
            sysconfig.get_path("stdlib"),
            sysconfig.get_path("platstdlib"),
        ]
        return set(filter(bool, paths))
except ImportError:
    from distutils import sysconfig

    def get_stdlib():
        paths = [
            sysconfig.get_python_lib(standard_lib=True),
            sysconfig.get_python_lib(standard_lib=True, plat_specific=True),
        ]
        return set(filter(bool, paths))


__all__ = [
    "logging_dictConfig", "ipaddress", "uses_pycache", "console_to_str",
    "native_str", "get_path_uid", "stdlib_pkgs", "WINDOWS", "samefile",
    "OrderedDict",
]


if sys.version_info >= (3, 4):
    uses_pycache = True
    from importlib.util import cache_from_source
else:
    import imp
    uses_pycache = hasattr(imp, 'cache_from_source')
    if uses_pycache:
        cache_from_source = imp.cache_from_source
    else:
        cache_from_source = None


if sys.version_info >= (3,):
    def console_to_str(s):
        try:
            return s.decode(sys.__stdout__.encoding)
        except UnicodeDecodeError:
            return s.decode('utf_8')

    def native_str(s, replace=False):
        if isinstance(s, bytes):
            return s.decode('utf-8', 'replace' if replace else 'strict')
        return s

else:
    def console_to_str(s):
        return s

    def native_str(s, replace=False):
        # Replace is ignored -- unicode to UTF-8 can't fail
        if isinstance(s, text_type):
            return s.encode('utf-8')
        return s


def total_seconds(td):
    if hasattr(td, "total_seconds"):
        return td.total_seconds()
    else:
        val = td.microseconds + (td.seconds + td.days * 24 * 3600) * 10 ** 6
        return val / 10 ** 6


def get_path_uid(path):
    """
    Return path's uid.

    Does not follow symlinks:
        https://github.com/pypa/pip/pull/935#discussion_r5307003

    Placed this function in compat due to differences on AIX and
    Jython, that should eventually go away.

    :raises OSError: When path is a symlink or can't be read.
    """
    if hasattr(os, 'O_NOFOLLOW'):
        fd = os.open(path, os.O_RDONLY | os.O_NOFOLLOW)
        file_uid = os.fstat(fd).st_uid
        os.close(fd)
    else:  # AIX and Jython
        # WARNING: time of check vulnerability, but best we can do w/o NOFOLLOW
        if not os.path.islink(path):
            # older versions of Jython don't have `os.fstat`
            file_uid = os.stat(path).st_uid
        else:
            # raise OSError for parity with os.O_NOFOLLOW above
            raise OSError(
                "%s is a symlink; Will not return uid for symlinks" % path
            )
    return file_uid


def expanduser(path):
    """
    Expand ~ and ~user constructions.

    Includes a workaround for http://bugs.python.org/issue14768
    """
    expanded = os.path.expanduser(path)
    if path.startswith('~/') and expanded.startswith('//'):
        expanded = expanded[1:]
    return expanded


# packages in the stdlib that may have installation metadata, but should not be
# considered 'installed'. this theoretically could be determined based on
# dist.location (py27:`sysconfig.get_paths()['stdlib']`,
# py26:sysconfig.get_config_vars('LIBDEST')), but fear platform variation may
# make this ineffective, so hard-coding
stdlib_pkgs = ('python', 'wsgiref')
if sys.version_info >= (2, 7):
    stdlib_pkgs += ('argparse',)


# windows detection, covers cpython and ironpython
WINDOWS = (sys.platform.startswith("win") or
           (sys.platform == 'cli' and os.name == 'nt'))


def samefile(file1, file2):
    """Provide an alternative for os.path.samefile on Windows/Python2"""
    if hasattr(os.path, 'samefile'):
        return os.path.samefile(file1, file2)
    else:
        path1 = os.path.normcase(os.path.abspath(file1))
        path2 = os.path.normcase(os.path.abspath(file2))
        return path1 == path2
@@ -1,565 +0,0 @@
# This is a copy of the Python logging.config.dictconfig module,
# reproduced with permission. It is provided here for backwards
# compatibility for Python versions prior to 2.7.
#
# Copyright 2009-2010 by Vinay Sajip. All Rights Reserved.
#
# Permission to use, copy, modify, and distribute this software and its
# documentation for any purpose and without fee is hereby granted,
# provided that the above copyright notice appear in all copies and that
# both that copyright notice and this permission notice appear in
# supporting documentation, and that the name of Vinay Sajip
# not be used in advertising or publicity pertaining to distribution
# of the software without specific, written prior permission.
# VINAY SAJIP DISCLAIMS ALL WARRANTIES WITH REGARD TO THIS SOFTWARE, INCLUDING
# ALL IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS. IN NO EVENT SHALL
# VINAY SAJIP BE LIABLE FOR ANY SPECIAL, INDIRECT OR CONSEQUENTIAL DAMAGES OR
# ANY DAMAGES WHATSOEVER RESULTING FROM LOSS OF USE, DATA OR PROFITS, WHETHER
# IN AN ACTION OF CONTRACT, NEGLIGENCE OR OTHER TORTIOUS ACTION, ARISING OUT
# OF OR IN CONNECTION WITH THE USE OR PERFORMANCE OF THIS SOFTWARE.
from __future__ import absolute_import

import logging.handlers
import re
import sys
import types

from pip._vendor import six

# flake8: noqa

IDENTIFIER = re.compile('^[a-z_][a-z0-9_]*$', re.I)


def valid_ident(s):
    m = IDENTIFIER.match(s)
    if not m:
        raise ValueError('Not a valid Python identifier: %r' % s)
    return True

#
# This function is defined in logging only in recent versions of Python
#
try:
    from logging import _checkLevel
except ImportError:
    def _checkLevel(level):
        if isinstance(level, int):
            rv = level
        elif str(level) == level:
            if level not in logging._levelNames:
                raise ValueError('Unknown level: %r' % level)
            rv = logging._levelNames[level]
        else:
            raise TypeError('Level not an integer or a '
                            'valid string: %r' % level)
        return rv

# The ConvertingXXX classes are wrappers around standard Python containers,
# and they serve to convert any suitable values in the container. The
# conversion converts base dicts, lists and tuples to their wrapped
# equivalents, whereas strings which match a conversion format are converted
# appropriately.
#
# Each wrapper should have a configurator attribute holding the actual
# configurator to use for conversion.


class ConvertingDict(dict):
    """A converting dictionary wrapper."""

    def __getitem__(self, key):
        value = dict.__getitem__(self, key)
        result = self.configurator.convert(value)
        # If the converted value is different, save for next time
        if value is not result:
            self[key] = result
            if type(result) in (ConvertingDict, ConvertingList,
                                ConvertingTuple):
                result.parent = self
                result.key = key
        return result

    def get(self, key, default=None):
        value = dict.get(self, key, default)
        result = self.configurator.convert(value)
        # If the converted value is different, save for next time
        if value is not result:
            self[key] = result
            if type(result) in (ConvertingDict, ConvertingList,
                                ConvertingTuple):
                result.parent = self
                result.key = key
        return result

    def pop(self, key, default=None):
        value = dict.pop(self, key, default)
        result = self.configurator.convert(value)
        if value is not result:
            if type(result) in (ConvertingDict, ConvertingList,
                                ConvertingTuple):
                result.parent = self
                result.key = key
        return result


class ConvertingList(list):
    """A converting list wrapper."""
    def __getitem__(self, key):
        value = list.__getitem__(self, key)
        result = self.configurator.convert(value)
        # If the converted value is different, save for next time
        if value is not result:
            self[key] = result
            if type(result) in (ConvertingDict, ConvertingList,
                                ConvertingTuple):
                result.parent = self
                result.key = key
        return result

    def pop(self, idx=-1):
        value = list.pop(self, idx)
        result = self.configurator.convert(value)
        if value is not result:
            if type(result) in (ConvertingDict, ConvertingList,
                                ConvertingTuple):
                result.parent = self
        return result


class ConvertingTuple(tuple):
    """A converting tuple wrapper."""
    def __getitem__(self, key):
        value = tuple.__getitem__(self, key)
        result = self.configurator.convert(value)
        if value is not result:
            if type(result) in (ConvertingDict, ConvertingList,
                                ConvertingTuple):
                result.parent = self
                result.key = key
        return result


class BaseConfigurator(object):
    """
    The configurator base class which defines some useful defaults.
    """

    CONVERT_PATTERN = re.compile(r'^(?P<prefix>[a-z]+)://(?P<suffix>.*)$')

    WORD_PATTERN = re.compile(r'^\s*(\w+)\s*')
    DOT_PATTERN = re.compile(r'^\.\s*(\w+)\s*')
    INDEX_PATTERN = re.compile(r'^\[\s*(\w+)\s*\]\s*')
    DIGIT_PATTERN = re.compile(r'^\d+$')

    value_converters = {
        'ext' : 'ext_convert',
        'cfg' : 'cfg_convert',
    }

    # We might want to use a different one, e.g. importlib
    importer = __import__

    def __init__(self, config):
        self.config = ConvertingDict(config)
        self.config.configurator = self

    def resolve(self, s):
        """
        Resolve strings to objects using standard import and attribute
        syntax.
        """
        name = s.split('.')
        used = name.pop(0)
        try:
            found = self.importer(used)
            for frag in name:
                used += '.' + frag
                try:
                    found = getattr(found, frag)
                except AttributeError:
                    self.importer(used)
                    found = getattr(found, frag)
            return found
        except ImportError:
            e, tb = sys.exc_info()[1:]
            v = ValueError('Cannot resolve %r: %s' % (s, e))
            v.__cause__, v.__traceback__ = e, tb
            raise v

    def ext_convert(self, value):
        """Default converter for the ext:// protocol."""
        return self.resolve(value)

    def cfg_convert(self, value):
        """Default converter for the cfg:// protocol."""
        rest = value
        m = self.WORD_PATTERN.match(rest)
        if m is None:
            raise ValueError("Unable to convert %r" % value)
        else:
            rest = rest[m.end():]
            d = self.config[m.groups()[0]]
            # print d, rest
            while rest:
                m = self.DOT_PATTERN.match(rest)
                if m:
                    d = d[m.groups()[0]]
                else:
                    m = self.INDEX_PATTERN.match(rest)
                    if m:
                        idx = m.groups()[0]
                        if not self.DIGIT_PATTERN.match(idx):
                            d = d[idx]
                        else:
                            try:
                                n = int(idx) # try as number first (most likely)
                                d = d[n]
                            except TypeError:
                                d = d[idx]
                if m:
                    rest = rest[m.end():]
                else:
                    raise ValueError('Unable to convert '
                                     '%r at %r' % (value, rest))
        # rest should be empty
        return d

    def convert(self, value):
        """
        Convert values to an appropriate type. dicts, lists and tuples are
        replaced by their converting alternatives. Strings are checked to
        see if they have a conversion format and are converted if they do.
        """
        if not isinstance(value, ConvertingDict) and isinstance(value, dict):
            value = ConvertingDict(value)
            value.configurator = self
        elif not isinstance(value, ConvertingList) and isinstance(value, list):
            value = ConvertingList(value)
            value.configurator = self
        elif not isinstance(value, ConvertingTuple) and\
                isinstance(value, tuple):
            value = ConvertingTuple(value)
            value.configurator = self
        elif isinstance(value, six.string_types): # str for py3k
            m = self.CONVERT_PATTERN.match(value)
            if m:
                d = m.groupdict()
                prefix = d['prefix']
                converter = self.value_converters.get(prefix, None)
                if converter:
                    suffix = d['suffix']
                    converter = getattr(self, converter)
                    value = converter(suffix)
        return value

    def configure_custom(self, config):
        """Configure an object with a user-supplied factory."""
        c = config.pop('()')
        if not hasattr(c, '__call__') and hasattr(types, 'ClassType') and type(c) != types.ClassType:
            c = self.resolve(c)
        props = config.pop('.', None)
        # Check for valid identifiers
        kwargs = dict((k, config[k]) for k in config if valid_ident(k))
        result = c(**kwargs)
        if props:
            for name, value in props.items():
                setattr(result, name, value)
        return result

    def as_tuple(self, value):
        """Utility function which converts lists to tuples."""
        if isinstance(value, list):
            value = tuple(value)
        return value


class DictConfigurator(BaseConfigurator):
    """
    Configure logging using a dictionary-like object to describe the
    configuration.
    """

    def configure(self):
        """Do the configuration."""

        config = self.config
        if 'version' not in config:
            raise ValueError("dictionary doesn't specify a version")
        if config['version'] != 1:
            raise ValueError("Unsupported version: %s" % config['version'])
        incremental = config.pop('incremental', False)
        EMPTY_DICT = {}
        logging._acquireLock()
        try:
            if incremental:
                handlers = config.get('handlers', EMPTY_DICT)
                # incremental handler config only if handler name
                # ties in to logging._handlers (Python 2.7)
                if sys.version_info[:2] == (2, 7):
                    for name in handlers:
                        if name not in logging._handlers:
                            raise ValueError('No handler found with '
                                             'name %r' % name)
                        else:
                            try:
                                handler = logging._handlers[name]
                                handler_config = handlers[name]
                                level = handler_config.get('level', None)
                                if level:
                                    handler.setLevel(_checkLevel(level))
                            except StandardError as e:
                                raise ValueError('Unable to configure handler '
                                                 '%r: %s' % (name, e))
                loggers = config.get('loggers', EMPTY_DICT)
                for name in loggers:
                    try:
                        self.configure_logger(name, loggers[name], True)
                    except StandardError as e:
                        raise ValueError('Unable to configure logger '
                                         '%r: %s' % (name, e))
                root = config.get('root', None)
                if root:
                    try:
                        self.configure_root(root, True)
                    except StandardError as e:
                        raise ValueError('Unable to configure root '
                                         'logger: %s' % e)
            else:
                disable_existing = config.pop('disable_existing_loggers', True)

                logging._handlers.clear()
                del logging._handlerList[:]

                # Do formatters first - they don't refer to anything else
                formatters = config.get('formatters', EMPTY_DICT)
                for name in formatters:
                    try:
                        formatters[name] = self.configure_formatter(
                            formatters[name])
                    except StandardError as e:
                        raise ValueError('Unable to configure '
                                         'formatter %r: %s' % (name, e))
                # Next, do filters - they don't refer to anything else, either
                filters = config.get('filters', EMPTY_DICT)
                for name in filters:
                    try:
                        filters[name] = self.configure_filter(filters[name])
                    except StandardError as e:
                        raise ValueError('Unable to configure '
                                         'filter %r: %s' % (name, e))

                # Next, do handlers - they refer to formatters and filters
                # As handlers can refer to other handlers, sort the keys
                # to allow a deterministic order of configuration
                handlers = config.get('handlers', EMPTY_DICT)
                for name in sorted(handlers):
                    try:
                        handler = self.configure_handler(handlers[name])
                        handler.name = name
                        handlers[name] = handler
                    except StandardError as e:
                        raise ValueError('Unable to configure handler '
                                         '%r: %s' % (name, e))
                # Next, do loggers - they refer to handlers and filters

                # we don't want to lose the existing loggers,
                # since other threads may have pointers to them.
                # existing is set to contain all existing loggers,
                # and as we go through the new configuration we
                # remove any which are configured. At the end,
                # what's left in existing is the set of loggers
                # which were in the previous configuration but
                # which are not in the new configuration.
                root = logging.root
                existing = list(root.manager.loggerDict)
                # The list needs to be sorted so that we can
                # avoid disabling child loggers of explicitly
                # named loggers. With a sorted list it is easier
                # to find the child loggers.
                existing.sort()
                # We'll keep the list of existing loggers
                # which are children of named loggers here...
                child_loggers = []
                # now set up the new ones...
                loggers = config.get('loggers', EMPTY_DICT)
                for name in loggers:
                    if name in existing:
                        i = existing.index(name)
                        prefixed = name + "."
                        pflen = len(prefixed)
                        num_existing = len(existing)
                        i = i + 1 # look at the entry after name
                        while (i < num_existing) and\
                              (existing[i][:pflen] == prefixed):
                            child_loggers.append(existing[i])
                            i = i + 1
                        existing.remove(name)
                    try:
                        self.configure_logger(name, loggers[name])
                    except StandardError as e:
                        raise ValueError('Unable to configure logger '
                                         '%r: %s' % (name, e))

                # Disable any old loggers. There's no point deleting
                # them as other threads may continue to hold references
                # and by disabling them, you stop them doing any logging.
                # However, don't disable children of named loggers, as that's
                # probably not what was intended by the user.
                for log in existing:
                    logger = root.manager.loggerDict[log]
                    if log in child_loggers:
                        logger.level = logging.NOTSET
                        logger.handlers = []
                        logger.propagate = True
                    elif disable_existing:
                        logger.disabled = True

                # And finally, do the root logger
                root = config.get('root', None)
                if root:
                    try:
                        self.configure_root(root)
                    except StandardError as e:
                        raise ValueError('Unable to configure root '
                                         'logger: %s' % e)
        finally:
            logging._releaseLock()

    def configure_formatter(self, config):
        """Configure a formatter from a dictionary."""
        if '()' in config:
            factory = config['()'] # for use in exception handler
            try:
                result = self.configure_custom(config)
            except TypeError as te:
                if "'format'" not in str(te):
                    raise
                # Name of parameter changed from fmt to format.
                # Retry with old name.
                # This is so that code can be used with older Python versions
                #(e.g. by Django)
                config['fmt'] = config.pop('format')
                config['()'] = factory
                result = self.configure_custom(config)
        else:
            fmt = config.get('format', None)
            dfmt = config.get('datefmt', None)
            result = logging.Formatter(fmt, dfmt)
        return result

    def configure_filter(self, config):
        """Configure a filter from a dictionary."""
        if '()' in config:
            result = self.configure_custom(config)
        else:
            name = config.get('name', '')
            result = logging.Filter(name)
        return result

    def add_filters(self, filterer, filters):
        """Add filters to a filterer from a list of names."""
        for f in filters:
            try:
                filterer.addFilter(self.config['filters'][f])
            except StandardError as e:
                raise ValueError('Unable to add filter %r: %s' % (f, e))

    def configure_handler(self, config):
        """Configure a handler from a dictionary."""
        formatter = config.pop('formatter', None)
        if formatter:
            try:
                formatter = self.config['formatters'][formatter]
            except StandardError as e:
                raise ValueError('Unable to set formatter '
                                 '%r: %s' % (formatter, e))
        level = config.pop('level', None)
        filters = config.pop('filters', None)
        if '()' in config:
            c = config.pop('()')
            if not hasattr(c, '__call__') and hasattr(types, 'ClassType') and type(c) != types.ClassType:
                c = self.resolve(c)
            factory = c
        else:
            klass = self.resolve(config.pop('class'))
            # Special case for handler which refers to another handler
            if issubclass(klass, logging.handlers.MemoryHandler) and\
                    'target' in config:
                try:
                    config['target'] = self.config['handlers'][config['target']]
                except StandardError as e:
                    raise ValueError('Unable to set target handler '
                                     '%r: %s' % (config['target'], e))
            elif issubclass(klass, logging.handlers.SMTPHandler) and\
                    'mailhost' in config:
                config['mailhost'] = self.as_tuple(config['mailhost'])
            elif issubclass(klass, logging.handlers.SysLogHandler) and\
                    'address' in config:
                config['address'] = self.as_tuple(config['address'])
            factory = klass
        kwargs = dict((k, config[k]) for k in config if valid_ident(k))
        try:
            result = factory(**kwargs)
        except TypeError as te:
            if "'stream'" not in str(te):
                raise
            # The argument name changed from strm to stream
            # Retry with old name.
            # This is so that code can be used with older Python versions
            #(e.g. by Django)
            kwargs['strm'] = kwargs.pop('stream')
            result = factory(**kwargs)
        if formatter:
            result.setFormatter(formatter)
        if level is not None:
            result.setLevel(_checkLevel(level))
        if filters:
            self.add_filters(result, filters)
        return result

    def add_handlers(self, logger, handlers):
        """Add handlers to a logger from a list of names."""
        for h in handlers:
            try:
                logger.addHandler(self.config['handlers'][h])
            except StandardError as e:
                raise ValueError('Unable to add handler %r: %s' % (h, e))

    def common_logger_config(self, logger, config, incremental=False):
        """
        Perform configuration which is common to root and non-root loggers.
        """
        level = config.get('level', None)
        if level is not None:
            logger.setLevel(_checkLevel(level))
        if not incremental:
            # Remove any existing handlers
            for h in logger.handlers[:]:
                logger.removeHandler(h)
            handlers = config.get('handlers', None)
            if handlers:
                self.add_handlers(logger, handlers)
            filters = config.get('filters', None)
            if filters:
                self.add_filters(logger, filters)

    def configure_logger(self, name, config, incremental=False):
        """Configure a non-root logger from a dictionary."""
        logger = logging.getLogger(name)
        self.common_logger_config(logger, config, incremental)
        propagate = config.get('propagate', None)
        if propagate is not None:
            logger.propagate = propagate

    def configure_root(self, config, incremental=False):
        """Configure a root logger from a dictionary."""
        root = logging.getLogger()
        self.common_logger_config(root, config, incremental)

dictConfigClass = DictConfigurator


def dictConfig(config):
    """Configure logging using a dictionary."""
    dictConfigClass(config).configure()
@@ -1,4 +0,0 @@
from pip.models.index import Index, PyPI


__all__ = ["Index", "PyPI"]
@@ -1,16 +0,0 @@
from pip._vendor.six.moves.urllib import parse as urllib_parse


class Index(object):
    def __init__(self, url):
        self.url = url
        self.netloc = urllib_parse.urlsplit(url).netloc
        self.simple_url = self.url_to_path('simple')
        self.pypi_url = self.url_to_path('pypi')
        self.pip_json_url = self.url_to_path('pypi/pip/json')

    def url_to_path(self, path):
        return urllib_parse.urljoin(self.url, path)


PyPI = Index('https://pypi.python.org/')
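The deleted `Index` model is a thin wrapper over URL joining; an equivalent self-contained sketch using the stdlib `urllib.parse` directly (rather than pip's vendored `six` shim):

```python
from urllib.parse import urljoin, urlsplit


class Index(object):
    # Mirrors the deleted pip.models.index.Index using the stdlib directly.
    def __init__(self, url):
        self.url = url
        self.netloc = urlsplit(url).netloc
        self.simple_url = self.url_to_path('simple')
        self.pypi_url = self.url_to_path('pypi')

    def url_to_path(self, path):
        # urljoin resolves the relative path against the index base URL.
        return urljoin(self.url, path)


pypi = Index('https://pypi.python.org/')
print(pypi.netloc)      # pypi.python.org
print(pypi.simple_url)  # https://pypi.python.org/simple
```

Because the base URL ends with a slash, `urljoin` appends the relative path rather than replacing the last path segment.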
@@ -1,49 +0,0 @@
def check_requirements(installed_dists):
    missing_reqs_dict = {}
    incompatible_reqs_dict = {}

    for dist in installed_dists:
        key = '%s==%s' % (dist.project_name, dist.version)

        missing_reqs = list(get_missing_reqs(dist, installed_dists))
        if missing_reqs:
            missing_reqs_dict[key] = missing_reqs

        incompatible_reqs = list(get_incompatible_reqs(
            dist, installed_dists))
        if incompatible_reqs:
            incompatible_reqs_dict[key] = incompatible_reqs

    return (missing_reqs_dict, incompatible_reqs_dict)


def get_missing_reqs(dist, installed_dists):
    """Return all of the requirements of `dist` that aren't present in
    `installed_dists`.

    """
    installed_names = set(d.project_name.lower() for d in installed_dists)
    missing_requirements = set()

    for requirement in dist.requires():
        if requirement.project_name.lower() not in installed_names:
            missing_requirements.add(requirement)
            yield requirement


def get_incompatible_reqs(dist, installed_dists):
    """Return all of the requirements of `dist` that are present in
    `installed_dists`, but have incompatible versions.

    """
    installed_dists_by_name = {}
    for installed_dist in installed_dists:
        installed_dists_by_name[installed_dist.project_name] = installed_dist

    for requirement in dist.requires():
        present_dist = installed_dists_by_name.get(requirement.project_name)

        if present_dist and present_dist not in requirement:
            yield (requirement, present_dist)
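The core of `get_missing_reqs` is a set-membership pass over installed distribution names. A stripped-down, self-contained analog using plain dicts instead of `pkg_resources` objects (the distribution names here are hypothetical):

```python
def missing_requirements(dists):
    # Analog of check_requirements()/get_missing_reqs() above: map each
    # 'name==version' key to the required names that are not installed.
    installed = set(d['name'] for d in dists)
    out = {}
    for d in dists:
        missing = [r for r in d['requires'] if r not in installed]
        if missing:
            out['%s==%s' % (d['name'], d['version'])] = missing
    return out


dists = [
    {'name': 'app', 'version': '1.0', 'requires': ['lib', 'absent']},
    {'name': 'lib', 'version': '2.0', 'requires': []},
]
print(missing_requirements(dists))  # {'app==1.0': ['absent']}
```

The real code additionally checks version compatibility via `pkg_resources`' `dist in requirement` membership test, which this sketch does not model.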
@@ -1,132 +0,0 @@
from __future__ import absolute_import

import logging
import re

import pip
from pip.req import InstallRequirement
from pip.req.req_file import COMMENT_RE
from pip.utils import get_installed_distributions
from pip._vendor import pkg_resources
from pip._vendor.packaging.utils import canonicalize_name
from pip._vendor.pkg_resources import RequirementParseError


logger = logging.getLogger(__name__)


def freeze(
        requirement=None,
        find_links=None, local_only=None, user_only=None, skip_regex=None,
        default_vcs=None,
        isolated=False,
        wheel_cache=None,
        skip=()):
    find_links = find_links or []
    skip_match = None

    if skip_regex:
        skip_match = re.compile(skip_regex).search

    dependency_links = []

    for dist in pkg_resources.working_set:
        if dist.has_metadata('dependency_links.txt'):
            dependency_links.extend(
                dist.get_metadata_lines('dependency_links.txt')
            )
    for link in find_links:
        if '#egg=' in link:
            dependency_links.append(link)
    for link in find_links:
        yield '-f %s' % link
    installations = {}
    for dist in get_installed_distributions(local_only=local_only,
                                            skip=(),
                                            user_only=user_only):
        try:
            req = pip.FrozenRequirement.from_dist(
                dist,
                dependency_links
            )
        except RequirementParseError:
            logger.warning(
                "Could not parse requirement: %s",
                dist.project_name
            )
            continue
        installations[req.name] = req

    if requirement:
        # the options that don't get turned into an InstallRequirement
        # should only be emitted once, even if the same option is in multiple
        # requirements files, so we need to keep track of what has been emitted
        # so that we don't emit it again if it's seen again
        emitted_options = set()
        for req_file_path in requirement:
            with open(req_file_path) as req_file:
                for line in req_file:
                    if (not line.strip() or
                            line.strip().startswith('#') or
                            (skip_match and skip_match(line)) or
                            line.startswith((
                                '-r', '--requirement',
                                '-Z', '--always-unzip',
                                '-f', '--find-links',
                                '-i', '--index-url',
                                '--pre',
                                '--trusted-host',
                                '--process-dependency-links',
                                '--extra-index-url'))):
                        line = line.rstrip()
                        if line not in emitted_options:
                            emitted_options.add(line)
                            yield line
                        continue

                    if line.startswith('-e') or line.startswith('--editable'):
                        if line.startswith('-e'):
                            line = line[2:].strip()
                        else:
                            line = line[len('--editable'):].strip().lstrip('=')
                        line_req = InstallRequirement.from_editable(
                            line,
                            default_vcs=default_vcs,
                            isolated=isolated,
                            wheel_cache=wheel_cache,
                        )
                    else:
                        line_req = InstallRequirement.from_line(
                            COMMENT_RE.sub('', line).strip(),
                            isolated=isolated,
                            wheel_cache=wheel_cache,
                        )

                    if not line_req.name:
                        logger.info(
                            "Skipping line in requirement file [%s] because "
                            "it's not clear what it would install: %s",
                            req_file_path, line.strip(),
                        )
                        logger.info(
                            " (add #egg=PackageName to the URL to avoid"
                            " this warning)"
                        )
                    elif line_req.name not in installations:
                        logger.warning(
                            "Requirement file [%s] contains %s, but that "
                            "package is not installed",
                            req_file_path, COMMENT_RE.sub('', line).strip(),
                        )
                    else:
                        yield str(installations[line_req.name]).rstrip()
                        del installations[line_req.name]

    yield(
        '## The following requirements were added by '
        'pip freeze:'
    )
    for installation in sorted(
            installations.values(), key=lambda x: x.name.lower()):
        if canonicalize_name(installation.name) not in skip:
            yield str(installation).rstrip()
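The requirements-file loop in `freeze()` emits global option lines (`-f`, `-i`, and friends) at most once via the `emitted_options` set, even when several requirements files repeat them. A stripped-down sketch of that dedup behaviour (the URLs are illustrative placeholders):

```python
def emit_option_lines(lines):
    # Yield option lines in input order, skipping duplicates - mirroring
    # the emitted_options bookkeeping inside freeze() above.
    emitted = set()
    for line in lines:
        line = line.rstrip()
        if line.startswith(('-f', '-i', '--extra-index-url')):
            if line not in emitted:
                emitted.add(line)
                yield line


lines = [
    '-f https://example.invalid/links',
    '-i https://example.invalid/simple',
    '-f https://example.invalid/links',  # repeated option, emitted once
]
print(list(emit_option_lines(lines)))
```

Because membership is tested on the stripped line text, only byte-identical repeats are collapsed; the real code behaves the same way.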
@@ -1,10 +0,0 @@
from __future__ import absolute_import

from .req_install import InstallRequirement
from .req_set import RequirementSet, Requirements
from .req_file import parse_requirements

__all__ = [
    "RequirementSet", "Requirements", "InstallRequirement",
    "parse_requirements",
]
@@ -1,798 +0,0 @@
|
||||||
from __future__ import absolute_import
|
|
||||||
|
|
||||||
from collections import defaultdict
|
|
||||||
from itertools import chain
|
|
||||||
import logging
|
|
||||||
import os
|
|
||||||
|
|
||||||
from pip._vendor import pkg_resources
|
|
||||||
from pip._vendor import requests
|
|
||||||
|
|
||||||
from pip.compat import expanduser
|
|
||||||
from pip.download import (is_file_url, is_dir_url, is_vcs_url, url_to_path,
|
|
||||||
unpack_url)
|
|
||||||
from pip.exceptions import (InstallationError, BestVersionAlreadyInstalled,
|
|
||||||
DistributionNotFound, PreviousBuildDirError,
|
|
||||||
HashError, HashErrors, HashUnpinned,
|
|
||||||
DirectoryUrlHashUnsupported, VcsHashUnsupported,
|
|
||||||
UnsupportedPythonVersion)
|
|
||||||
from pip.req.req_install import InstallRequirement
|
|
||||||
from pip.utils import (
|
|
||||||
display_path, dist_in_usersite, ensure_dir, normalize_path)
|
|
||||||
from pip.utils.hashes import MissingHashes
|
|
||||||
from pip.utils.logging import indent_log
|
|
||||||
from pip.utils.packaging import check_dist_requires_python
|
|
||||||
from pip.vcs import vcs
|
|
||||||
from pip.wheel import Wheel
|
|
||||||
|
|
||||||
logger = logging.getLogger(__name__)
|
|
||||||
|
|
||||||
|
|
||||||
class Requirements(object):
|
|
||||||
|
|
||||||
def __init__(self):
|
|
||||||
self._keys = []
|
|
||||||
self._dict = {}
|
|
||||||
|
|
||||||
def keys(self):
|
|
||||||
return self._keys
|
|
||||||
|
|
||||||
def values(self):
|
|
||||||
return [self._dict[key] for key in self._keys]
|
|
||||||
|
|
||||||
def __contains__(self, item):
|
|
||||||
return item in self._keys
|
|
||||||
|
|
||||||
def __setitem__(self, key, value):
|
|
||||||
if key not in self._keys:
|
|
||||||
self._keys.append(key)
|
|
||||||
self._dict[key] = value
|
|
||||||
|
|
||||||
def __getitem__(self, key):
|
|
||||||
return self._dict[key]
|
|
||||||
|
|
||||||
def __repr__(self):
|
|
||||||
values = ['%s: %s' % (repr(k), repr(self[k])) for k in self.keys()]
|
|
||||||
return 'Requirements({%s})' % ', '.join(values)


class DistAbstraction(object):
    """Abstracts out the wheel vs non-wheel prepare_files logic.

    The requirements for anything installable are as follows:
     - we must be able to determine the requirement name
       (or we can't correctly handle the non-upgrade case).
     - we must be able to generate a list of run-time dependencies
       without installing any additional packages (or we would
       have to either burn time by doing temporary isolated installs
       or alternatively violate pip's 'don't start installing unless
       all requirements are available' rule - neither of which are
       desirable).
     - for packages with setup requirements, we must also be able
       to determine their requirements without installing additional
       packages (for the same reason as run-time dependencies)
     - we must be able to create a Distribution object exposing the
       above metadata.
    """

    def __init__(self, req_to_install):
        self.req_to_install = req_to_install

    def dist(self, finder):
        """Return a setuptools Dist object."""
        raise NotImplementedError(self.dist)

    def prep_for_dist(self):
        """Ensure that we can get a Dist for this requirement."""
        raise NotImplementedError(self.prep_for_dist)


def make_abstract_dist(req_to_install):
    """Factory to make an abstract dist object.

    Preconditions: Either an editable req with a source_dir, or satisfied_by or
    a wheel link, or a non-editable req with a source_dir.

    :return: A concrete DistAbstraction.
    """
    if req_to_install.editable:
        return IsSDist(req_to_install)
    elif req_to_install.link and req_to_install.link.is_wheel:
        return IsWheel(req_to_install)
    else:
        return IsSDist(req_to_install)


class IsWheel(DistAbstraction):

    def dist(self, finder):
        return list(pkg_resources.find_distributions(
            self.req_to_install.source_dir))[0]

    def prep_for_dist(self):
        # FIXME: https://github.com/pypa/pip/issues/1112
        pass


class IsSDist(DistAbstraction):

    def dist(self, finder):
        dist = self.req_to_install.get_dist()
        # FIXME: shouldn't be globally added:
        if dist.has_metadata('dependency_links.txt'):
            finder.add_dependency_links(
                dist.get_metadata_lines('dependency_links.txt')
            )
        return dist

    def prep_for_dist(self):
        self.req_to_install.run_egg_info()
        self.req_to_install.assert_source_matches_version()


class Installed(DistAbstraction):

    def dist(self, finder):
        return self.req_to_install.satisfied_by

    def prep_for_dist(self):
        pass


class RequirementSet(object):

    def __init__(self, build_dir, src_dir, download_dir, upgrade=False,
                 upgrade_strategy=None, ignore_installed=False, as_egg=False,
                 target_dir=None, ignore_dependencies=False,
                 force_reinstall=False, use_user_site=False, session=None,
                 pycompile=True, isolated=False, wheel_download_dir=None,
                 wheel_cache=None, require_hashes=False,
                 ignore_requires_python=False):
        """Create a RequirementSet.

        :param wheel_download_dir: Where still-packed .whl files should be
            written to. If None, they are written to the download_dir
            parameter. Separate from download_dir to permit keeping wheel
            archives only for pip wheel.
        :param download_dir: Where still-packed archives should be written to.
            If None, they are not saved, and are deleted immediately after
            unpacking.
        :param wheel_cache: The pip wheel cache, for passing to
            InstallRequirement.
        """
        if session is None:
            raise TypeError(
                "RequirementSet() missing 1 required keyword argument: "
                "'session'"
            )

        self.build_dir = build_dir
        self.src_dir = src_dir
        # XXX: download_dir and wheel_download_dir overlap semantically and may
        # be combined if we're willing to have non-wheel archives present in
        # the wheelhouse output by 'pip wheel'.
        self.download_dir = download_dir
        self.upgrade = upgrade
        self.upgrade_strategy = upgrade_strategy
        self.ignore_installed = ignore_installed
        self.force_reinstall = force_reinstall
        self.requirements = Requirements()
        # Mapping of alias: real_name
        self.requirement_aliases = {}
        self.unnamed_requirements = []
        self.ignore_dependencies = ignore_dependencies
        self.ignore_requires_python = ignore_requires_python
        self.successfully_downloaded = []
        self.successfully_installed = []
        self.reqs_to_cleanup = []
        self.as_egg = as_egg
        self.use_user_site = use_user_site
        self.target_dir = target_dir  # set from --target option
        self.session = session
        self.pycompile = pycompile
        self.isolated = isolated
        if wheel_download_dir:
            wheel_download_dir = normalize_path(wheel_download_dir)
        self.wheel_download_dir = wheel_download_dir
        self._wheel_cache = wheel_cache
        self.require_hashes = require_hashes
        # Maps from install_req -> dependencies_of_install_req
        self._dependencies = defaultdict(list)

    def __str__(self):
        reqs = [req for req in self.requirements.values()
                if not req.comes_from]
        reqs.sort(key=lambda req: req.name.lower())
        return ' '.join([str(req.req) for req in reqs])

    def __repr__(self):
        reqs = [req for req in self.requirements.values()]
        reqs.sort(key=lambda req: req.name.lower())
        reqs_str = ', '.join([str(req.req) for req in reqs])
        return ('<%s object; %d requirement(s): %s>'
                % (self.__class__.__name__, len(reqs), reqs_str))

    def add_requirement(self, install_req, parent_req_name=None,
                        extras_requested=None):
        """Add install_req as a requirement to install.

        :param parent_req_name: The name of the requirement that needed this
            added. The name is used because when multiple unnamed requirements
            resolve to the same name, we could otherwise end up with dependency
            links that point outside the Requirements set. parent_req must
            already be added. Note that None implies that this is a user
            supplied requirement, vs an inferred one.
        :param extras_requested: an iterable of extras used to evaluate the
            environment markers.
        :return: Additional requirements to scan. That is either [] if
            the requirement is not applicable, or [install_req] if the
            requirement is applicable and has just been added.
        """
        name = install_req.name
        if not install_req.match_markers(extras_requested):
            logger.warning("Ignoring %s: markers '%s' don't match your "
                           "environment", install_req.name,
                           install_req.markers)
            return []

        # This check has to come after we filter requirements with the
        # environment markers.
        if install_req.link and install_req.link.is_wheel:
            wheel = Wheel(install_req.link.filename)
            if not wheel.supported():
                raise InstallationError(
                    "%s is not a supported wheel on this platform." %
                    wheel.filename
                )

        install_req.as_egg = self.as_egg
        install_req.use_user_site = self.use_user_site
        install_req.target_dir = self.target_dir
        install_req.pycompile = self.pycompile
        install_req.is_direct = (parent_req_name is None)

        if not name:
            # url or path requirement w/o an egg fragment
            self.unnamed_requirements.append(install_req)
            return [install_req]
        else:
            try:
                existing_req = self.get_requirement(name)
            except KeyError:
                existing_req = None
            if (parent_req_name is None and existing_req and not
                    existing_req.constraint and
                    existing_req.extras == install_req.extras and not
                    existing_req.req.specifier == install_req.req.specifier):
                raise InstallationError(
                    'Double requirement given: %s (already in %s, name=%r)'
                    % (install_req, existing_req, name))
            if not existing_req:
                # Add requirement
                self.requirements[name] = install_req
                # FIXME: what about other normalizations? E.g., _ vs. -?
                if name.lower() != name:
                    self.requirement_aliases[name.lower()] = name
                result = [install_req]
            else:
                # Assume there's no need to scan, and that we've already
                # encountered this for scanning.
                result = []
                if not install_req.constraint and existing_req.constraint:
                    if (install_req.link and not (existing_req.link and
                            install_req.link.path == existing_req.link.path)):
                        self.reqs_to_cleanup.append(install_req)
                        raise InstallationError(
                            "Could not satisfy constraints for '%s': "
                            "installation from path or url cannot be "
                            "constrained to a version" % name)
                    # If we're now installing a constraint, mark the existing
                    # object for real installation.
                    existing_req.constraint = False
                    existing_req.extras = tuple(
                        sorted(set(existing_req.extras).union(
                            set(install_req.extras))))
                    logger.debug("Setting %s extras to: %s",
                                 existing_req, existing_req.extras)
                    # And now we need to scan this.
                    result = [existing_req]
                # Canonicalise to the already-added object for the backref
                # check below.
                install_req = existing_req
            if parent_req_name:
                parent_req = self.get_requirement(parent_req_name)
                self._dependencies[parent_req].append(install_req)
            return result

    def has_requirement(self, project_name):
        name = project_name.lower()
        if (name in self.requirements and
                not self.requirements[name].constraint or
                name in self.requirement_aliases and
                not self.requirements[
                    self.requirement_aliases[name]].constraint):
            return True
        return False

    @property
    def has_requirements(self):
        return list(req for req in self.requirements.values() if not
                    req.constraint) or self.unnamed_requirements

    @property
    def is_download(self):
        if self.download_dir:
            self.download_dir = expanduser(self.download_dir)
            if os.path.exists(self.download_dir):
                return True
            else:
                logger.critical('Could not find download directory')
                raise InstallationError(
                    "Could not find or access download directory '%s'"
                    % display_path(self.download_dir))
        return False

    def get_requirement(self, project_name):
        for name in project_name, project_name.lower():
            if name in self.requirements:
                return self.requirements[name]
            if name in self.requirement_aliases:
                return self.requirements[self.requirement_aliases[name]]
        raise KeyError("No project with the name %r" % project_name)
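A standalone sketch of the two-step lookup above (exact name first, then the lower-cased alias table); the table contents here are invented for illustration:

```python
# Hypothetical stand-ins for self.requirements / self.requirement_aliases.
requirements = {'Django': '<InstallRequirement Django>'}
requirement_aliases = {'django': 'Django'}


def get_requirement(project_name):
    # Same lookup order as RequirementSet.get_requirement above: try the
    # name as given, then its lower-cased form, checking the direct table
    # before the alias table at each step.
    for name in project_name, project_name.lower():
        if name in requirements:
            return requirements[name]
        if name in requirement_aliases:
            return requirements[requirement_aliases[name]]
    raise KeyError("No project with the name %r" % project_name)


print(get_requirement('DJANGO'))  # found via the 'django' alias
```

Note the asymmetry this resolves: `requirements` is keyed by the name as first given (e.g. `Django`), while lookups may arrive in any casing.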

    def uninstall(self, auto_confirm=False):
        for req in self.requirements.values():
            if req.constraint:
                continue
            req.uninstall(auto_confirm=auto_confirm)
            req.commit_uninstall()

    def prepare_files(self, finder):
        """
        Prepare process. Create temp directories, download and/or unpack files.
        """
        # make the wheelhouse
        if self.wheel_download_dir:
            ensure_dir(self.wheel_download_dir)

        # If any top-level requirement has a hash specified, enter
        # hash-checking mode, which requires hashes from all.
        root_reqs = self.unnamed_requirements + self.requirements.values()
        require_hashes = (self.require_hashes or
                          any(req.has_hash_options for req in root_reqs))
        if require_hashes and self.as_egg:
            raise InstallationError(
                '--egg is not allowed with --require-hashes mode, since it '
                'delegates dependency resolution to setuptools and could thus '
                'result in installation of unhashed packages.')

        # Actually prepare the files, and collect any exceptions. Most hash
        # exceptions cannot be checked ahead of time, because
        # req.populate_link() needs to be called before we can make decisions
        # based on link type.
        discovered_reqs = []
        hash_errors = HashErrors()
        for req in chain(root_reqs, discovered_reqs):
            try:
                discovered_reqs.extend(self._prepare_file(
                    finder,
                    req,
                    require_hashes=require_hashes,
                    ignore_dependencies=self.ignore_dependencies))
            except HashError as exc:
                exc.req = req
                hash_errors.append(exc)

        if hash_errors:
            raise hash_errors

    def _is_upgrade_allowed(self, req):
        return self.upgrade and (
            self.upgrade_strategy == "eager" or (
                self.upgrade_strategy == "only-if-needed" and req.is_direct
            )
        )
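The upgrade gate above reduces to a small truth table. A standalone restatement (as a free function rather than a method, for illustration):

```python
def is_upgrade_allowed(upgrade, strategy, is_direct):
    # Mirrors RequirementSet._is_upgrade_allowed above: "eager" upgrades
    # everything; "only-if-needed" upgrades only user-requested (direct)
    # requirements; without --upgrade nothing is upgraded.
    return upgrade and (
        strategy == "eager" or (
            strategy == "only-if-needed" and is_direct
        )
    )


assert is_upgrade_allowed(True, "eager", False)
assert is_upgrade_allowed(True, "only-if-needed", True)
assert not is_upgrade_allowed(True, "only-if-needed", False)
assert not is_upgrade_allowed(False, "eager", True)
```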

    def _check_skip_installed(self, req_to_install, finder):
        """Check if req_to_install should be skipped.

        This will check if the req is installed, and whether we should upgrade
        or reinstall it, taking into account all the relevant user options.

        After calling this, req_to_install will only have satisfied_by set to
        None if the req_to_install is to be upgraded/reinstalled etc. Any
        other value will be a dist recording the current thing installed that
        satisfies the requirement.

        Note that for vcs urls and the like we can't assess skipping in this
        routine - we simply identify that we need to pull the thing down,
        then later on it is pulled down and introspected to assess upgrade/
        reinstalls etc.

        :return: A text reason for why it was skipped, or None.
        """
        # Check whether to upgrade/reinstall this req or not.
        req_to_install.check_if_exists()
        if req_to_install.satisfied_by:
            upgrade_allowed = self._is_upgrade_allowed(req_to_install)

            # Is the best version already installed?
            best_installed = False

            if upgrade_allowed:
                # For link-based requirements we have to pull the
                # tree down and inspect to assess the version, so
                # it's handled way down.
                if not (self.force_reinstall or req_to_install.link):
                    try:
                        finder.find_requirement(
                            req_to_install, upgrade_allowed)
                    except BestVersionAlreadyInstalled:
                        best_installed = True
                    except DistributionNotFound:
                        # No distribution found, so we squash the
                        # error - it will be raised later when we
                        # re-try later to do the install.
                        # Why don't we just raise here?
                        pass

                if not best_installed:
                    # don't uninstall conflict if user install and
                    # conflict is not user install
                    if not (self.use_user_site and not
                            dist_in_usersite(req_to_install.satisfied_by)):
                        req_to_install.conflicts_with = \
                            req_to_install.satisfied_by
                    req_to_install.satisfied_by = None

            # Figure out a nice message to say why we're skipping this.
            if best_installed:
                skip_reason = 'already up-to-date'
            elif self.upgrade_strategy == "only-if-needed":
                skip_reason = 'not upgraded as not directly required'
            else:
                skip_reason = 'already satisfied'

            return skip_reason
        else:
            return None

    def _prepare_file(self,
                      finder,
                      req_to_install,
                      require_hashes=False,
                      ignore_dependencies=False):
        """Prepare a single requirements file.

        :return: A list of additional InstallRequirements to also install.
        """
        # Tell user what we are doing for this requirement:
        # obtain (editable), skipping, processing (local url), collecting
        # (remote url or package name)
        if req_to_install.constraint or req_to_install.prepared:
            return []

        req_to_install.prepared = True

        # ###################### #
        # # print log messages # #
        # ###################### #
        if req_to_install.editable:
            logger.info('Obtaining %s', req_to_install)
        else:
            # satisfied_by is only evaluated by calling _check_skip_installed,
            # so it must be None here.
            assert req_to_install.satisfied_by is None
            if not self.ignore_installed:
                skip_reason = self._check_skip_installed(
                    req_to_install, finder)

            if req_to_install.satisfied_by:
                assert skip_reason is not None, (
                    '_check_skip_installed returned None but '
                    'req_to_install.satisfied_by is set to %r'
                    % (req_to_install.satisfied_by,))
                logger.info(
                    'Requirement %s: %s', skip_reason,
                    req_to_install)
            else:
                if (req_to_install.link and
                        req_to_install.link.scheme == 'file'):
                    path = url_to_path(req_to_install.link.url)
                    logger.info('Processing %s', display_path(path))
                else:
                    logger.info('Collecting %s', req_to_install)

        with indent_log():
            # ################################ #
            # # vcs update or unpack archive # #
            # ################################ #
            if req_to_install.editable:
                if require_hashes:
                    raise InstallationError(
                        'The editable requirement %s cannot be installed when '
                        'requiring hashes, because there is no single file to '
                        'hash.' % req_to_install)
                req_to_install.ensure_has_source_dir(self.src_dir)
                req_to_install.update_editable(not self.is_download)
                abstract_dist = make_abstract_dist(req_to_install)
                abstract_dist.prep_for_dist()
                if self.is_download:
                    req_to_install.archive(self.download_dir)
                req_to_install.check_if_exists()
            elif req_to_install.satisfied_by:
                if require_hashes:
                    logger.debug(
                        'Since it is already installed, we are trusting this '
                        'package without checking its hash. To ensure a '
                        'completely repeatable environment, install into an '
                        'empty virtualenv.')
                abstract_dist = Installed(req_to_install)
            else:
                # @@ if filesystem packages are not marked
                # editable in a req, a non deterministic error
                # occurs when the script attempts to unpack the
                # build directory
                req_to_install.ensure_has_source_dir(self.build_dir)
                # If a checkout exists, it's unwise to keep going. version
                # inconsistencies are logged later, but do not fail the
                # installation.
                # FIXME: this won't upgrade when there's an existing
                # package unpacked in `req_to_install.source_dir`
                if os.path.exists(
                        os.path.join(req_to_install.source_dir, 'setup.py')):
                    raise PreviousBuildDirError(
                        "pip can't proceed with requirements '%s' due to a"
                        " pre-existing build directory (%s). This is "
                        "likely due to a previous installation that failed"
                        ". pip is being responsible and not assuming it "
                        "can delete this. Please delete it and try again."
                        % (req_to_install, req_to_install.source_dir)
                    )
                req_to_install.populate_link(
                    finder,
                    self._is_upgrade_allowed(req_to_install),
                    require_hashes
                )
                # We can't hit this spot and have populate_link return None.
                # req_to_install.satisfied_by is None here (because we're
                # guarded) and upgrade has no impact except when satisfied_by
                # is not None.
                # Then inside find_requirement existing_applicable -> False
                # If no new versions are found, DistributionNotFound is raised,
                # otherwise a result is guaranteed.
                assert req_to_install.link
                link = req_to_install.link

                # Now that we have the real link, we can tell what kind of
                # requirements we have and raise some more informative errors
                # than otherwise. (For example, we can raise VcsHashUnsupported
                # for a VCS URL rather than HashMissing.)
                if require_hashes:
                    # We could check these first 2 conditions inside
                    # unpack_url and save repetition of conditions, but then
                    # we would report less-useful error messages for
                    # unhashable requirements, complaining that there's no
                    # hash provided.
                    if is_vcs_url(link):
                        raise VcsHashUnsupported()
                    elif is_file_url(link) and is_dir_url(link):
                        raise DirectoryUrlHashUnsupported()
                    if (not req_to_install.original_link and
                            not req_to_install.is_pinned):
                        # Unpinned packages are asking for trouble when a new
                        # version is uploaded. This isn't a security check, but
                        # it saves users a surprising hash mismatch in the
                        # future.
                        #
                        # file:/// URLs aren't pinnable, so don't complain
                        # about them not being pinned.
                        raise HashUnpinned()
                hashes = req_to_install.hashes(
                    trust_internet=not require_hashes)
                if require_hashes and not hashes:
                    # Known-good hashes are missing for this requirement, so
                    # shim it with a facade object that will provoke hash
                    # computation and then raise a HashMissing exception
                    # showing the user what the hash should be.
                    hashes = MissingHashes()

                try:
                    download_dir = self.download_dir
                    # We always delete unpacked sdists after pip runs.
                    autodelete_unpacked = True
                    if req_to_install.link.is_wheel \
                            and self.wheel_download_dir:
                        # when doing 'pip wheel' we download wheels to a
                        # dedicated dir.
                        download_dir = self.wheel_download_dir
                    if req_to_install.link.is_wheel:
                        if download_dir:
                            # When downloading, we only unpack wheels to get
                            # metadata.
                            autodelete_unpacked = True
                        else:
                            # When installing a wheel, we use the unpacked
                            # wheel.
                            autodelete_unpacked = False
                    unpack_url(
                        req_to_install.link, req_to_install.source_dir,
                        download_dir, autodelete_unpacked,
                        session=self.session, hashes=hashes)
                except requests.HTTPError as exc:
                    logger.critical(
                        'Could not install requirement %s because '
                        'of error %s',
                        req_to_install,
                        exc,
                    )
                    raise InstallationError(
                        'Could not install requirement %s because '
                        'of HTTP error %s for URL %s' %
                        (req_to_install, exc, req_to_install.link)
                    )
                abstract_dist = make_abstract_dist(req_to_install)
                abstract_dist.prep_for_dist()
                if self.is_download:
                    # Make a .zip of the source_dir we already created.
                    if req_to_install.link.scheme in vcs.all_schemes:
                        req_to_install.archive(self.download_dir)
                # req_to_install.req is only avail after unpack for URL
                # pkgs repeat check_if_exists to uninstall-on-upgrade
                # (#14)
                if not self.ignore_installed:
                    req_to_install.check_if_exists()
                if req_to_install.satisfied_by:
                    if self.upgrade or self.ignore_installed:
                        # don't uninstall conflict if user install and
                        # conflict is not user install
                        if not (self.use_user_site and not
                                dist_in_usersite(
                                    req_to_install.satisfied_by)):
                            req_to_install.conflicts_with = \
                                req_to_install.satisfied_by
                        req_to_install.satisfied_by = None
                    else:
                        logger.info(
                            'Requirement already satisfied (use '
                            '--upgrade to upgrade): %s',
                            req_to_install,
                        )

            # ###################### #
            # # parse dependencies # #
            # ###################### #
            dist = abstract_dist.dist(finder)
            try:
                check_dist_requires_python(dist)
            except UnsupportedPythonVersion as e:
                if self.ignore_requires_python:
                    logger.warning(e.args[0])
                else:
                    req_to_install.remove_temporary_source()
                    raise
            more_reqs = []

            def add_req(subreq, extras_requested):
                sub_install_req = InstallRequirement(
                    str(subreq),
                    req_to_install,
                    isolated=self.isolated,
                    wheel_cache=self._wheel_cache,
                )
                more_reqs.extend(self.add_requirement(
                    sub_install_req, req_to_install.name,
                    extras_requested=extras_requested))

            # We add req_to_install before its dependencies, so that we
            # can refer to it when adding dependencies.
            if not self.has_requirement(req_to_install.name):
                # 'unnamed' requirements will get added here
                self.add_requirement(req_to_install, None)

            if not ignore_dependencies:
                if req_to_install.extras:
                    logger.debug(
                        "Installing extra requirements: %r",
                        ','.join(req_to_install.extras),
                    )
                missing_requested = sorted(
                    set(req_to_install.extras) - set(dist.extras)
                )
                for missing in missing_requested:
                    logger.warning(
                        "%s does not provide the extra '%s'",
                        dist, missing
                    )

                available_requested = sorted(
                    set(dist.extras) & set(req_to_install.extras)
                )
                for subreq in dist.requires(available_requested):
                    add_req(subreq, extras_requested=available_requested)

            # cleanup tmp src
            self.reqs_to_cleanup.append(req_to_install)

            if not req_to_install.editable and not req_to_install.satisfied_by:
                # XXX: --no-install leads this to report 'Successfully
                # downloaded' for only non-editable reqs, even though we took
                # action on them.
                self.successfully_downloaded.append(req_to_install)

        return more_reqs

    def cleanup_files(self):
        """Clean up files, remove builds."""
        logger.debug('Cleaning up...')
        with indent_log():
            for req in self.reqs_to_cleanup:
                req.remove_temporary_source()

    def _to_install(self):
        """Create the installation order.

        The installation order is topological - requirements are installed
        before the requiring thing. We break cycles at an arbitrary point,
        and make no other guarantees.
        """
        # The current implementation, which we may change at any point,
        # installs the user-specified things in the order given, except when
        # dependencies must come earlier to achieve topological order.
        order = []
        ordered_reqs = set()

        def schedule(req):
            if req.satisfied_by or req in ordered_reqs:
                return
            if req.constraint:
                return
            ordered_reqs.add(req)
            for dep in self._dependencies[req]:
                schedule(dep)
            order.append(req)

        for install_req in self.requirements.values():
            schedule(install_req)
        return order
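The scheduling above is a post-order depth-first walk, so each requirement is emitted after its dependencies. A standalone sketch with an invented dependency mapping:

```python
from collections import defaultdict

# Hypothetical stand-in for self._dependencies: 'app' depends on 'lib',
# which depends on 'base'.
dependencies = defaultdict(list, {'app': ['lib'], 'lib': ['base']})

order = []
scheduled = set()


def schedule(name):
    # Post-order DFS, as in _to_install above; a cycle would be broken
    # at whatever point the `scheduled` check re-encounters a node.
    if name in scheduled:
        return
    scheduled.add(name)
    for dep in dependencies[name]:
        schedule(dep)
    order.append(name)


for name in ['app', 'lib']:
    schedule(name)
print(order)  # ['base', 'lib', 'app']
```

Dependencies come out before their dependents, while already-visited nodes ('lib' on the second pass) are skipped.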

    def install(self, install_options, global_options=(), *args, **kwargs):
        """
        Install everything in this set (after having downloaded and unpacked
        the packages)
        """
        to_install = self._to_install()

        if to_install:
            logger.info(
                'Installing collected packages: %s',
                ', '.join([req.name for req in to_install]),
            )

        with indent_log():
            for requirement in to_install:
                if requirement.conflicts_with:
                    logger.info(
                        'Found existing installation: %s',
                        requirement.conflicts_with,
                    )
                    with indent_log():
                        requirement.uninstall(auto_confirm=True)
                try:
                    requirement.install(
                        install_options,
                        global_options,
                        *args,
                        **kwargs
                    )
                except:
                    # if install did not succeed, rollback previous uninstall
                    if (requirement.conflicts_with and not
                            requirement.install_succeeded):
                        requirement.rollback_uninstall()
                    raise
                else:
                    if (requirement.conflicts_with and
                            requirement.install_succeeded):
                        requirement.commit_uninstall()
                requirement.remove_temporary_source()

        self.successfully_installed = to_install
@@ -1,195 +0,0 @@
from __future__ import absolute_import

import logging
import os
import tempfile

from pip.compat import uses_pycache, WINDOWS, cache_from_source
from pip.exceptions import UninstallationError
from pip.utils import rmtree, ask, is_local, renames, normalize_path
from pip.utils.logging import indent_log


logger = logging.getLogger(__name__)


class UninstallPathSet(object):
    """A set of file paths to be removed in the uninstallation of a
    requirement."""

    def __init__(self, dist):
        self.paths = set()
        self._refuse = set()
        self.pth = {}
        self.dist = dist
        self.save_dir = None
        self._moved_paths = []

    def _permitted(self, path):
        """
        Return True if the given path is one we are permitted to
        remove/modify, False otherwise.
        """
        return is_local(path)

    def add(self, path):
        head, tail = os.path.split(path)

        # we normalize the head to resolve parent directory symlinks, but not
        # the tail, since we only want to uninstall symlinks, not their targets
        path = os.path.join(normalize_path(head), os.path.normcase(tail))

        if not os.path.exists(path):
            return
        if self._permitted(path):
            self.paths.add(path)
        else:
            self._refuse.add(path)

        # __pycache__ files can show up after 'installed-files.txt' is created,
        # due to imports
        if os.path.splitext(path)[1] == '.py' and uses_pycache:
            self.add(cache_from_source(path))

    def add_pth(self, pth_file, entry):
        pth_file = normalize_path(pth_file)
        if self._permitted(pth_file):
            if pth_file not in self.pth:
                self.pth[pth_file] = UninstallPthEntries(pth_file)
            self.pth[pth_file].add(entry)
        else:
            self._refuse.add(pth_file)

    def compact(self, paths):
        """Compact a path set to contain the minimal number of paths
        necessary to contain all paths in the set. If /a/path/ and
        /a/path/to/a/file.txt are both in the set, leave only the
        shorter path."""
        short_paths = set()
        for path in sorted(paths, key=len):
            if not any([
                    (path.startswith(shortpath) and
                     path[len(shortpath.rstrip(os.path.sep))] == os.path.sep)
                    for shortpath in short_paths]):
                short_paths.add(path)
        return short_paths
|
|
||||||
|
|
||||||
def _stash(self, path):
|
|
||||||
return os.path.join(
|
|
||||||
self.save_dir, os.path.splitdrive(path)[1].lstrip(os.path.sep))
|
|
||||||
|
|
||||||
def remove(self, auto_confirm=False):
|
|
||||||
"""Remove paths in ``self.paths`` with confirmation (unless
|
|
||||||
``auto_confirm`` is True)."""
|
|
||||||
if not self.paths:
|
|
||||||
logger.info(
|
|
||||||
"Can't uninstall '%s'. No files were found to uninstall.",
|
|
||||||
self.dist.project_name,
|
|
||||||
)
|
|
||||||
return
|
|
||||||
logger.info(
|
|
||||||
'Uninstalling %s-%s:',
|
|
||||||
self.dist.project_name, self.dist.version
|
|
||||||
)
|
|
||||||
|
|
||||||
with indent_log():
|
|
||||||
paths = sorted(self.compact(self.paths))
|
|
||||||
|
|
||||||
if auto_confirm:
|
|
||||||
response = 'y'
|
|
||||||
else:
|
|
||||||
for path in paths:
|
|
||||||
logger.info(path)
|
|
||||||
response = ask('Proceed (y/n)? ', ('y', 'n'))
|
|
||||||
if self._refuse:
|
|
||||||
logger.info('Not removing or modifying (outside of prefix):')
|
|
||||||
for path in self.compact(self._refuse):
|
|
||||||
logger.info(path)
|
|
||||||
if response == 'y':
|
|
||||||
self.save_dir = tempfile.mkdtemp(suffix='-uninstall',
|
|
||||||
prefix='pip-')
|
|
||||||
for path in paths:
|
|
||||||
new_path = self._stash(path)
|
|
||||||
logger.debug('Removing file or directory %s', path)
|
|
||||||
self._moved_paths.append(path)
|
|
||||||
renames(path, new_path)
|
|
||||||
for pth in self.pth.values():
|
|
||||||
pth.remove()
|
|
||||||
logger.info(
|
|
||||||
'Successfully uninstalled %s-%s',
|
|
||||||
self.dist.project_name, self.dist.version
|
|
||||||
)
|
|
||||||
|
|
||||||
def rollback(self):
|
|
||||||
"""Rollback the changes previously made by remove()."""
|
|
||||||
if self.save_dir is None:
|
|
||||||
logger.error(
|
|
||||||
"Can't roll back %s; was not uninstalled",
|
|
||||||
self.dist.project_name,
|
|
||||||
)
|
|
||||||
return False
|
|
||||||
logger.info('Rolling back uninstall of %s', self.dist.project_name)
|
|
||||||
for path in self._moved_paths:
|
|
||||||
tmp_path = self._stash(path)
|
|
||||||
logger.debug('Replacing %s', path)
|
|
||||||
renames(tmp_path, path)
|
|
||||||
for pth in self.pth.values():
|
|
||||||
pth.rollback()
|
|
||||||
|
|
||||||
def commit(self):
|
|
||||||
"""Remove temporary save dir: rollback will no longer be possible."""
|
|
||||||
if self.save_dir is not None:
|
|
||||||
rmtree(self.save_dir)
|
|
||||||
self.save_dir = None
|
|
||||||
self._moved_paths = []
|
|
||||||
|
|
||||||
|
|
||||||
class UninstallPthEntries(object):
|
|
||||||
def __init__(self, pth_file):
|
|
||||||
if not os.path.isfile(pth_file):
|
|
||||||
raise UninstallationError(
|
|
||||||
"Cannot remove entries from nonexistent file %s" % pth_file
|
|
||||||
)
|
|
||||||
self.file = pth_file
|
|
||||||
self.entries = set()
|
|
||||||
self._saved_lines = None
|
|
||||||
|
|
||||||
def add(self, entry):
|
|
||||||
entry = os.path.normcase(entry)
|
|
||||||
# On Windows, os.path.normcase converts the entry to use
|
|
||||||
# backslashes. This is correct for entries that describe absolute
|
|
||||||
# paths outside of site-packages, but all the others use forward
|
|
||||||
# slashes.
|
|
||||||
if WINDOWS and not os.path.splitdrive(entry)[0]:
|
|
||||||
entry = entry.replace('\\', '/')
|
|
||||||
self.entries.add(entry)
|
|
||||||
|
|
||||||
def remove(self):
|
|
||||||
logger.debug('Removing pth entries from %s:', self.file)
|
|
||||||
with open(self.file, 'rb') as fh:
|
|
||||||
# windows uses '\r\n' with py3k, but uses '\n' with py2.x
|
|
||||||
lines = fh.readlines()
|
|
||||||
self._saved_lines = lines
|
|
||||||
if any(b'\r\n' in line for line in lines):
|
|
||||||
endline = '\r\n'
|
|
||||||
else:
|
|
||||||
endline = '\n'
|
|
||||||
for entry in self.entries:
|
|
||||||
try:
|
|
||||||
logger.debug('Removing entry: %s', entry)
|
|
||||||
lines.remove((entry + endline).encode("utf-8"))
|
|
||||||
except ValueError:
|
|
||||||
pass
|
|
||||||
with open(self.file, 'wb') as fh:
|
|
||||||
fh.writelines(lines)
|
|
||||||
|
|
||||||
def rollback(self):
|
|
||||||
if self._saved_lines is None:
|
|
||||||
logger.error(
|
|
||||||
'Cannot roll back changes to %s, none were made', self.file
|
|
||||||
)
|
|
||||||
return False
|
|
||||||
logger.debug('Rolling %s back to previous state', self.file)
|
|
||||||
with open(self.file, 'wb') as fh:
|
|
||||||
fh.writelines(self._saved_lines)
|
|
||||||
return True
|
|
|
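The `compact()` method above keeps only the shortest paths that cover every path in the set, so a directory entry subsumes the files beneath it while avoiding false prefix matches like `/a/path` vs `/a/pathological`. A standalone sketch of the same logic:

```python
import os

def compact(paths):
    """Keep only the minimal set of paths covering all inputs.

    A path is dropped if some already-kept shorter path is a true
    directory prefix of it (prefix match followed by a separator).
    """
    short_paths = set()
    for path in sorted(paths, key=len):
        if not any(
            path.startswith(shortpath) and
            path[len(shortpath.rstrip(os.path.sep))] == os.path.sep
            for shortpath in short_paths
        ):
            short_paths.add(path)
    return short_paths
```

The separator check after the prefix is what prevents `/a/path` from swallowing the sibling `/a/pathological`.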
@@ -1,42 +0,0 @@
from __future__ import absolute_import

import os.path
import tempfile

from pip.utils import rmtree


class BuildDirectory(object):

    def __init__(self, name=None, delete=None):
        # If we were not given an explicit directory, and we were not given an
        # explicit delete option, then we'll default to deleting.
        if name is None and delete is None:
            delete = True

        if name is None:
            # We realpath here because some systems have their default tmpdir
            # symlinked to another directory. This tends to confuse build
            # scripts, so we canonicalize the path by traversing potential
            # symlinks here.
            name = os.path.realpath(tempfile.mkdtemp(prefix="pip-build-"))
            # If we were not given an explicit directory, and we were not given
            # an explicit delete option, then we'll default to deleting.
            if delete is None:
                delete = True

        self.name = name
        self.delete = delete

    def __repr__(self):
        return "<{} {!r}>".format(self.__class__.__name__, self.name)

    def __enter__(self):
        return self.name

    def __exit__(self, exc, value, tb):
        self.cleanup()

    def cleanup(self):
        if self.delete:
            rmtree(self.name)
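`BuildDirectory` is a context manager: with no explicit name it creates a temporary build directory and removes it on exit, while an explicitly named directory is kept unless deletion is requested. A condensed, self-contained sketch of that behavior (using stdlib `shutil.rmtree` in place of `pip.utils.rmtree`):

```python
import os.path
import shutil
import tempfile


class BuildDirectory(object):
    """Condensed sketch of pip's build-directory context manager."""

    def __init__(self, name=None, delete=None):
        if name is None:
            # Canonicalize in case the default tmpdir is a symlink.
            name = os.path.realpath(tempfile.mkdtemp(prefix="pip-build-"))
            # Auto-created directories default to being deleted.
            if delete is None:
                delete = True
        self.name = name
        self.delete = delete

    def __enter__(self):
        return self.name

    def __exit__(self, exc, value, tb):
        if self.delete:
            shutil.rmtree(self.name)
```

Usage: `with BuildDirectory() as path:` yields a fresh directory that disappears when the block exits; passing an explicit `name` leaves `delete` falsy so the directory survives.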
@@ -1,76 +0,0 @@
"""
A module that implements tooling to enable easy warnings about deprecations.
"""
from __future__ import absolute_import

import logging
import warnings


class PipDeprecationWarning(Warning):
    pass


class Pending(object):
    pass


class RemovedInPip10Warning(PipDeprecationWarning):
    pass


class RemovedInPip11Warning(PipDeprecationWarning, Pending):
    pass


class Python26DeprecationWarning(PipDeprecationWarning):
    pass


# Warnings <-> Logging Integration


_warnings_showwarning = None


def _showwarning(message, category, filename, lineno, file=None, line=None):
    if file is not None:
        if _warnings_showwarning is not None:
            _warnings_showwarning(
                message, category, filename, lineno, file, line,
            )
    else:
        if issubclass(category, PipDeprecationWarning):
            # We use a specially named logger which will handle all of the
            # deprecation messages for pip.
            logger = logging.getLogger("pip.deprecations")

            # This is purposely using the % formatter here instead of letting
            # the logging module handle the interpolation. This is because we
            # want it to appear as if someone typed this entire message out.
            log_message = "DEPRECATION: %s" % message

            # PipDeprecationWarnings that are Pending still have at least 2
            # versions to go until they are removed so they can just be
            # warnings. Otherwise, they will be removed in the very next
            # version of pip. We want these to be more obvious so we use the
            # ERROR logging level.
            if issubclass(category, Pending):
                logger.warning(log_message)
            else:
                logger.error(log_message)
        else:
            _warnings_showwarning(
                message, category, filename, lineno, file, line,
            )


def install_warning_logger():
    # Enable our Deprecation Warnings
    warnings.simplefilter("default", PipDeprecationWarning, append=True)

    global _warnings_showwarning

    if _warnings_showwarning is None:
        _warnings_showwarning = warnings.showwarning
        warnings.showwarning = _showwarning
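The module above bridges the `warnings` system into `logging` by replacing `warnings.showwarning`: warnings of the special category are routed to a dedicated logger, everything else falls through to the saved original hook. A minimal self-contained sketch of the same bridge (the category, logger name, and list handler are illustrative, not pip's):

```python
import logging
import warnings


class MyDeprecationWarning(Warning):
    """Illustrative stand-in for PipDeprecationWarning."""


captured = []


class ListHandler(logging.Handler):
    # Collects formatted log messages so we can inspect them.
    def emit(self, record):
        captured.append(record.getMessage())


def install_bridge():
    logger = logging.getLogger("example.deprecations")
    logger.addHandler(ListHandler())
    logger.setLevel(logging.DEBUG)
    original = warnings.showwarning

    def _showwarning(message, category, filename, lineno,
                     file=None, line=None):
        if file is None and issubclass(category, MyDeprecationWarning):
            # Route our category to logging; % keeps the message verbatim.
            logger.error("DEPRECATION: %s" % message)
        else:
            original(message, category, filename, lineno, file, line)

    # Ensure our category is always shown, then swap in the hook.
    warnings.simplefilter("default", MyDeprecationWarning, append=True)
    warnings.showwarning = _showwarning


install_bridge()
warnings.warn("old API", MyDeprecationWarning)
```

Saving the original hook is what lets unrelated warnings keep their normal stderr behavior.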
@@ -1,130 +0,0 @@
from __future__ import absolute_import

import contextlib
import logging
import logging.handlers
import os

try:
    import threading
except ImportError:
    import dummy_threading as threading

from pip.compat import WINDOWS
from pip.utils import ensure_dir

try:
    from pip._vendor import colorama
# Lots of different errors can come from this, including SystemError and
# ImportError.
except Exception:
    colorama = None


_log_state = threading.local()
_log_state.indentation = 0


@contextlib.contextmanager
def indent_log(num=2):
    """
    A context manager which will cause the log output to be indented for any
    log messages emitted inside it.
    """
    _log_state.indentation += num
    try:
        yield
    finally:
        _log_state.indentation -= num


def get_indentation():
    return getattr(_log_state, 'indentation', 0)


class IndentingFormatter(logging.Formatter):

    def format(self, record):
        """
        Calls the standard formatter, but will indent all of the log messages
        by our current indentation level.
        """
        formatted = logging.Formatter.format(self, record)
        formatted = "".join([
            (" " * get_indentation()) + line
            for line in formatted.splitlines(True)
        ])
        return formatted


def _color_wrap(*colors):
    def wrapped(inp):
        return "".join(list(colors) + [inp, colorama.Style.RESET_ALL])
    return wrapped


class ColorizedStreamHandler(logging.StreamHandler):

    # Don't build up a list of colors if we don't have colorama
    if colorama:
        COLORS = [
            # This needs to be in order from highest logging level to lowest.
            (logging.ERROR, _color_wrap(colorama.Fore.RED)),
            (logging.WARNING, _color_wrap(colorama.Fore.YELLOW)),
        ]
    else:
        COLORS = []

    def __init__(self, stream=None):
        logging.StreamHandler.__init__(self, stream)

        if WINDOWS and colorama:
            self.stream = colorama.AnsiToWin32(self.stream)

    def should_color(self):
        # Don't colorize things if we do not have colorama
        if not colorama:
            return False

        real_stream = (
            self.stream if not isinstance(self.stream, colorama.AnsiToWin32)
            else self.stream.wrapped
        )

        # If the stream is a tty we should color it
        if hasattr(real_stream, "isatty") and real_stream.isatty():
            return True

        # If we have an ANSI term we should color it
        if os.environ.get("TERM") == "ANSI":
            return True

        # If anything else we should not color it
        return False

    def format(self, record):
        msg = logging.StreamHandler.format(self, record)

        if self.should_color():
            for level, color in self.COLORS:
                if record.levelno >= level:
                    msg = color(msg)
                    break

        return msg


class BetterRotatingFileHandler(logging.handlers.RotatingFileHandler):

    def _open(self):
        ensure_dir(os.path.dirname(self.baseFilename))
        return logging.handlers.RotatingFileHandler._open(self)


class MaxLevelFilter(logging.Filter):

    def __init__(self, level):
        self.level = level

    def filter(self, record):
        return record.levelno < self.level
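The indentation machinery above (used throughout the install and uninstall code via `with indent_log():`) is just a thread-local counter bumped by a context manager and applied by a `logging.Formatter` subclass. A self-contained sketch:

```python
import contextlib
import logging
import threading

_log_state = threading.local()
_log_state.indentation = 0


@contextlib.contextmanager
def indent_log(num=2):
    # Bump the thread-local indentation for the duration of the block.
    _log_state.indentation += num
    try:
        yield
    finally:
        _log_state.indentation -= num


def get_indentation():
    return getattr(_log_state, 'indentation', 0)


class IndentingFormatter(logging.Formatter):
    def format(self, record):
        # Standard formatting, then prefix every line with the current indent.
        formatted = logging.Formatter.format(self, record)
        return "".join(
            (" " * get_indentation()) + line
            for line in formatted.splitlines(True)
        )


fmt = IndentingFormatter("%(message)s")
rec = logging.LogRecord("x", logging.INFO, "f", 1, "hello", None, None)
with indent_log():
    indented = fmt.format(rec)
plain = fmt.format(rec)
```

Because the counter is thread-local and applied at format time, nested `indent_log()` blocks compound naturally and other threads are unaffected.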
@@ -1,366 +0,0 @@
"""Handles all VCS (version control) support"""
from __future__ import absolute_import

import errno
import logging
import os
import shutil
import sys

from pip._vendor.six.moves.urllib import parse as urllib_parse

from pip.exceptions import BadCommand
from pip.utils import (display_path, backup_dir, call_subprocess,
                       rmtree, ask_path_exists)


__all__ = ['vcs', 'get_src_requirement']


logger = logging.getLogger(__name__)


class VcsSupport(object):
    _registry = {}
    schemes = ['ssh', 'git', 'hg', 'bzr', 'sftp', 'svn']

    def __init__(self):
        # Register more schemes with urlparse for various version control
        # systems
        urllib_parse.uses_netloc.extend(self.schemes)
        # Python >= 2.7.4, 3.3 doesn't have uses_fragment
        if getattr(urllib_parse, 'uses_fragment', None):
            urllib_parse.uses_fragment.extend(self.schemes)
        super(VcsSupport, self).__init__()

    def __iter__(self):
        return self._registry.__iter__()

    @property
    def backends(self):
        return list(self._registry.values())

    @property
    def dirnames(self):
        return [backend.dirname for backend in self.backends]

    @property
    def all_schemes(self):
        schemes = []
        for backend in self.backends:
            schemes.extend(backend.schemes)
        return schemes

    def register(self, cls):
        if not hasattr(cls, 'name'):
            logger.warning('Cannot register VCS %s', cls.__name__)
            return
        if cls.name not in self._registry:
            self._registry[cls.name] = cls
            logger.debug('Registered VCS backend: %s', cls.name)

    def unregister(self, cls=None, name=None):
        if name in self._registry:
            del self._registry[name]
        elif cls in self._registry.values():
            del self._registry[cls.name]
        else:
            logger.warning('Cannot unregister because no class or name given')

    def get_backend_name(self, location):
        """
        Return the name of the version control backend if found at given
        location, e.g. vcs.get_backend_name('/path/to/vcs/checkout')
        """
        for vc_type in self._registry.values():
            if vc_type.controls_location(location):
                logger.debug('Determine that %s uses VCS: %s',
                             location, vc_type.name)
                return vc_type.name
        return None

    def get_backend(self, name):
        name = name.lower()
        if name in self._registry:
            return self._registry[name]

    def get_backend_from_location(self, location):
        vc_type = self.get_backend_name(location)
        if vc_type:
            return self.get_backend(vc_type)
        return None


vcs = VcsSupport()


class VersionControl(object):
    name = ''
    dirname = ''
    # List of supported schemes for this Version Control
    schemes = ()

    def __init__(self, url=None, *args, **kwargs):
        self.url = url
        super(VersionControl, self).__init__(*args, **kwargs)

    def _is_local_repository(self, repo):
        """
        posix absolute paths start with os.path.sep,
        win32 ones start with drive (like c:\\folder)
        """
        drive, tail = os.path.splitdrive(repo)
        return repo.startswith(os.path.sep) or drive

    # See issue #1083 for why this method was introduced:
    # https://github.com/pypa/pip/issues/1083
    def translate_egg_surname(self, surname):
        # For example, Django has branches of the form "stable/1.7.x".
        return surname.replace('/', '_')

    def export(self, location):
        """
        Export the repository at the url to the destination location
        i.e. only download the files, without vcs information
        """
        raise NotImplementedError

    def get_url_rev(self):
        """
        Returns the correct repository URL and revision by parsing the given
        repository URL
        """
        error_message = (
            "Sorry, '%s' is a malformed VCS url. "
            "The format is <vcs>+<protocol>://<url>, "
            "e.g. svn+http://myrepo/svn/MyApp#egg=MyApp"
        )
        assert '+' in self.url, error_message % self.url
        url = self.url.split('+', 1)[1]
        scheme, netloc, path, query, frag = urllib_parse.urlsplit(url)
        rev = None
        if '@' in path:
            path, rev = path.rsplit('@', 1)
        url = urllib_parse.urlunsplit((scheme, netloc, path, query, ''))
        return url, rev

    def get_info(self, location):
        """
        Returns (url, revision), where both are strings
        """
        assert not location.rstrip('/').endswith(self.dirname), \
            'Bad directory: %s' % location
        return self.get_url(location), self.get_revision(location)

    def normalize_url(self, url):
        """
        Normalize a URL for comparison by unquoting it and removing any
        trailing slash.
        """
        return urllib_parse.unquote(url).rstrip('/')

    def compare_urls(self, url1, url2):
        """
        Compare two repo URLs for identity, ignoring incidental differences.
        """
        return (self.normalize_url(url1) == self.normalize_url(url2))

    def obtain(self, dest):
        """
        Called when installing or updating an editable package, takes the
        source path of the checkout.
        """
        raise NotImplementedError

    def switch(self, dest, url, rev_options):
        """
        Switch the repo at ``dest`` to point to ``URL``.
        """
        raise NotImplementedError

    def update(self, dest, rev_options):
        """
        Update an already-existing repo to the given ``rev_options``.
        """
        raise NotImplementedError

    def check_version(self, dest, rev_options):
        """
        Return True if the version is identical to what exists and
        doesn't need to be updated.
        """
        raise NotImplementedError

    def check_destination(self, dest, url, rev_options, rev_display):
        """
        Prepare a location to receive a checkout/clone.

        Return True if the location is ready for (and requires) a
        checkout/clone, False otherwise.
        """
        checkout = True
        prompt = False
        if os.path.exists(dest):
            checkout = False
            if os.path.exists(os.path.join(dest, self.dirname)):
                existing_url = self.get_url(dest)
                if self.compare_urls(existing_url, url):
                    logger.debug(
                        '%s in %s exists, and has correct URL (%s)',
                        self.repo_name.title(),
                        display_path(dest),
                        url,
                    )
                    if not self.check_version(dest, rev_options):
                        logger.info(
                            'Updating %s %s%s',
                            display_path(dest),
                            self.repo_name,
                            rev_display,
                        )
                        self.update(dest, rev_options)
                    else:
                        logger.info(
                            'Skipping because already up-to-date.')
                else:
                    logger.warning(
                        '%s %s in %s exists with URL %s',
                        self.name,
                        self.repo_name,
                        display_path(dest),
                        existing_url,
                    )
                    prompt = ('(s)witch, (i)gnore, (w)ipe, (b)ackup ',
                              ('s', 'i', 'w', 'b'))
            else:
                logger.warning(
                    'Directory %s already exists, and is not a %s %s.',
                    dest,
                    self.name,
                    self.repo_name,
                )
                prompt = ('(i)gnore, (w)ipe, (b)ackup ', ('i', 'w', 'b'))
        if prompt:
            logger.warning(
                'The plan is to install the %s repository %s',
                self.name,
                url,
            )
            response = ask_path_exists('What to do? %s' % prompt[0],
                                       prompt[1])

            if response == 's':
                logger.info(
                    'Switching %s %s to %s%s',
                    self.repo_name,
                    display_path(dest),
                    url,
                    rev_display,
                )
                self.switch(dest, url, rev_options)
            elif response == 'i':
                # do nothing
                pass
            elif response == 'w':
                logger.warning('Deleting %s', display_path(dest))
                rmtree(dest)
                checkout = True
            elif response == 'b':
                dest_dir = backup_dir(dest)
                logger.warning(
                    'Backing up %s to %s', display_path(dest), dest_dir,
                )
                shutil.move(dest, dest_dir)
                checkout = True
            elif response == 'a':
                sys.exit(-1)
        return checkout

    def unpack(self, location):
        """
        Clean up current location and download the url repository
        (and vcs info) into location
        """
        if os.path.exists(location):
            rmtree(location)
        self.obtain(location)

    def get_src_requirement(self, dist, location):
        """
        Return a string representing the requirement needed to
        redownload the files currently present in location, something
        like:
          {repository_url}@{revision}#egg={project_name}-{version_identifier}
        """
        raise NotImplementedError

    def get_url(self, location):
        """
        Return the url used at location
        Used in get_info or check_destination
        """
        raise NotImplementedError

    def get_revision(self, location):
        """
        Return the current revision of the files at location
        Used in get_info
        """
        raise NotImplementedError

    def run_command(self, cmd, show_stdout=True, cwd=None,
                    on_returncode='raise',
                    command_desc=None,
                    extra_environ=None, spinner=None):
        """
        Run a VCS subcommand
        This is simply a wrapper around call_subprocess that adds the VCS
        command name, and checks that the VCS is available
        """
        cmd = [self.name] + cmd
        try:
            return call_subprocess(cmd, show_stdout, cwd,
                                   on_returncode,
                                   command_desc, extra_environ,
                                   spinner)
        except OSError as e:
            # errno.ENOENT = no such file or directory
            # In other words, the VCS executable isn't available
            if e.errno == errno.ENOENT:
                raise BadCommand('Cannot find command %r' % self.name)
            else:
                raise  # re-raise exception if a different error occurred

    @classmethod
    def controls_location(cls, location):
        """
        Check if a location is controlled by the vcs.
        It is meant to be overridden to implement smarter detection
        mechanisms for specific vcs.
        """
        logger.debug('Checking in %s for %s (%s)...',
                     location, cls.dirname, cls.name)
        path = os.path.join(location, cls.dirname)
        return os.path.exists(path)


def get_src_requirement(dist, location):
    version_control = vcs.get_backend_from_location(location)
    if version_control:
        try:
            return version_control().get_src_requirement(dist,
                                                         location)
        except BadCommand:
            logger.warning(
                'cannot determine version of editable source in %s '
                '(%s command not found in path)',
                location,
                version_control.name,
            )
            return dist.as_requirement()
    logger.warning(
        'cannot determine version of editable source in %s (is not SVN '
        'checkout, Git clone, Mercurial clone or Bazaar branch)',
        location,
    )
    return dist.as_requirement()
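`VersionControl.get_url_rev()` above parses a `<vcs>+<protocol>://<url>[@rev]` URL by splitting off the `<vcs>+` prefix and then peeling an optional `@<rev>` suffix off the path component. A standalone sketch using Python 3's `urllib.parse` in place of the vendored `six.moves` shim:

```python
from urllib.parse import urlsplit, urlunsplit


def get_url_rev(url):
    """Split '<vcs>+<protocol>://<url>[@rev]' into (clean_url, rev)."""
    assert '+' in url, "expected <vcs>+<protocol>://<url>"
    url = url.split('+', 1)[1]          # drop the '<vcs>+' prefix
    scheme, netloc, path, query, frag = urlsplit(url)
    rev = None
    if '@' in path:
        # The revision rides on the path; 'user@host' stays in netloc.
        path, rev = path.rsplit('@', 1)
    url = urlunsplit((scheme, netloc, path, query, ''))  # drop fragment
    return url, rev
```

Note the `#egg=...` fragment is discarded here, as in the original, because the fragment is handled separately from the checkout URL.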
@ -1,300 +0,0 @@
|
||||||
from __future__ import absolute_import
|
|
||||||
|
|
||||||
import logging
|
|
||||||
import tempfile
|
|
||||||
import os.path
|
|
||||||
|
|
||||||
from pip.compat import samefile
|
|
||||||
from pip.exceptions import BadCommand
|
|
||||||
from pip._vendor.six.moves.urllib import parse as urllib_parse
|
|
||||||
from pip._vendor.six.moves.urllib import request as urllib_request
|
|
||||||
from pip._vendor.packaging.version import parse as parse_version
|
|
||||||
|
|
||||||
from pip.utils import display_path, rmtree
|
|
||||||
from pip.vcs import vcs, VersionControl
|
|
||||||
|
|
||||||
|
|
||||||
urlsplit = urllib_parse.urlsplit
|
|
||||||
urlunsplit = urllib_parse.urlunsplit
|
|
||||||
|
|
||||||
|
|
||||||
logger = logging.getLogger(__name__)
|
|
||||||
|
|
||||||
|
|
||||||
class Git(VersionControl):
|
|
||||||
name = 'git'
|
|
||||||
dirname = '.git'
|
|
||||||
repo_name = 'clone'
|
|
||||||
schemes = (
|
|
||||||
'git', 'git+http', 'git+https', 'git+ssh', 'git+git', 'git+file',
|
|
||||||
)
|
|
||||||
|
|
||||||
def __init__(self, url=None, *args, **kwargs):
|
|
||||||
|
|
||||||
# Works around an apparent Git bug
|
|
||||||
# (see http://article.gmane.org/gmane.comp.version-control.git/146500)
|
|
||||||
if url:
|
|
||||||
scheme, netloc, path, query, fragment = urlsplit(url)
|
|
||||||
if scheme.endswith('file'):
|
|
||||||
initial_slashes = path[:-len(path.lstrip('/'))]
|
|
||||||
newpath = (
|
|
||||||
initial_slashes +
|
|
||||||
urllib_request.url2pathname(path)
|
|
||||||
.replace('\\', '/').lstrip('/')
|
|
||||||
)
|
|
||||||
url = urlunsplit((scheme, netloc, newpath, query, fragment))
|
|
||||||
after_plus = scheme.find('+') + 1
|
|
||||||
url = scheme[:after_plus] + urlunsplit(
|
|
||||||
(scheme[after_plus:], netloc, newpath, query, fragment),
|
|
||||||
)
|
|
||||||
|
|
||||||
super(Git, self).__init__(url, *args, **kwargs)
|
|
||||||
|
|
||||||
def get_git_version(self):
|
|
||||||
VERSION_PFX = 'git version '
|
|
||||||
version = self.run_command(['version'], show_stdout=False)
|
|
||||||
if version.startswith(VERSION_PFX):
|
|
||||||
version = version[len(VERSION_PFX):]
|
|
||||||
else:
|
|
||||||
version = ''
|
|
||||||
# get first 3 positions of the git version becasue
|
|
||||||
# on windows it is x.y.z.windows.t, and this parses as
|
|
||||||
# LegacyVersion which always smaller than a Version.
|
|
||||||
version = '.'.join(version.split('.')[:3])
|
|
||||||
return parse_version(version)
|
|
||||||
|
|
||||||
def export(self, location):
|
|
||||||
"""Export the Git repository at the url to the destination location"""
|
|
||||||
temp_dir = tempfile.mkdtemp('-export', 'pip-')
|
|
||||||
self.unpack(temp_dir)
|
|
||||||
try:
|
|
||||||
if not location.endswith('/'):
|
|
||||||
location = location + '/'
|
|
||||||
self.run_command(
|
|
||||||
['checkout-index', '-a', '-f', '--prefix', location],
|
|
||||||
show_stdout=False, cwd=temp_dir)
|
|
||||||
finally:
|
|
||||||
rmtree(temp_dir)
|
|
||||||
|
|
||||||
def check_rev_options(self, rev, dest, rev_options):
|
|
||||||
"""Check the revision options before checkout to compensate that tags
|
|
||||||
and branches may need origin/ as a prefix.
|
|
||||||
Returns the SHA1 of the branch or tag if found.
|
|
||||||
"""
|
|
||||||
revisions = self.get_short_refs(dest)
|
|
||||||
|
|
||||||
origin_rev = 'origin/%s' % rev
|
|
||||||
if origin_rev in revisions:
|
|
||||||
# remote branch
|
|
||||||
return [revisions[origin_rev]]
|
|
||||||
elif rev in revisions:
|
|
||||||
# a local tag or branch name
|
|
||||||
return [revisions[rev]]
|
|
||||||
else:
|
|
||||||
logger.warning(
|
|
||||||
"Could not find a tag or branch '%s', assuming commit.", rev,
|
|
||||||
)
|
|
||||||
return rev_options
|
|
||||||
|
|
||||||
def check_version(self, dest, rev_options):
    """
    Compare the current sha to the ref. ref may be a branch or tag name,
    but current rev will always point to a sha. This means that a branch
    or tag will never compare as True. So this ultimately only matches
    against exact shas.
    """
    return self.get_revision(dest).startswith(rev_options[0])

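A minimal sketch of the comparison `check_version` performs (hypothetical function name; the current revision always comes from `git rev-parse HEAD`):

```python
def is_checked_out_at(current_sha, rev):
    # current_sha is always a full sha, so a branch or tag name never
    # matches; only an exact (possibly abbreviated) sha does.
    return current_sha.startswith(rev)
```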
def switch(self, dest, url, rev_options):
    self.run_command(['config', 'remote.origin.url', url], cwd=dest)
    self.run_command(['checkout', '-q'] + rev_options, cwd=dest)

    self.update_submodules(dest)

def update(self, dest, rev_options):
    # First fetch changes from the default remote
    if self.get_git_version() >= parse_version('1.9.0'):
        # fetch tags in addition to everything else
        self.run_command(['fetch', '-q', '--tags'], cwd=dest)
    else:
        self.run_command(['fetch', '-q'], cwd=dest)
    # Then reset to wanted revision (maybe even origin/master)
    if rev_options:
        rev_options = self.check_rev_options(
            rev_options[0], dest, rev_options,
        )
    self.run_command(['reset', '--hard', '-q'] + rev_options, cwd=dest)
    # update submodules
    self.update_submodules(dest)

def obtain(self, dest):
    url, rev = self.get_url_rev()
    if rev:
        rev_options = [rev]
        rev_display = ' (to %s)' % rev
    else:
        rev_options = ['origin/master']
        rev_display = ''
    if self.check_destination(dest, url, rev_options, rev_display):
        logger.info(
            'Cloning %s%s to %s', url, rev_display, display_path(dest),
        )
        self.run_command(['clone', '-q', url, dest])

        if rev:
            rev_options = self.check_rev_options(rev, dest, rev_options)
            # Only do a checkout if rev_options differs from HEAD
            if not self.check_version(dest, rev_options):
                self.run_command(
                    ['checkout', '-q'] + rev_options,
                    cwd=dest,
                )
            # repo may contain submodules
            self.update_submodules(dest)

def get_url(self, location):
    """Return the URL of the first remote encountered."""
    remotes = self.run_command(
        ['config', '--get-regexp', r'remote\..*\.url'],
        show_stdout=False, cwd=location)
    remotes = remotes.splitlines()
    found_remote = remotes[0]
    for remote in remotes:
        if remote.startswith('remote.origin.url '):
            found_remote = remote
            break
    url = found_remote.split(' ')[1]
    return url.strip()

def get_revision(self, location):
    current_rev = self.run_command(
        ['rev-parse', 'HEAD'], show_stdout=False, cwd=location)
    return current_rev.strip()

def get_full_refs(self, location):
    """Yields tuples of (commit, ref) for branches and tags"""
    output = self.run_command(['show-ref'],
                              show_stdout=False, cwd=location)
    for line in output.strip().splitlines():
        commit, ref = line.split(' ', 1)
        yield commit.strip(), ref.strip()

def is_ref_remote(self, ref):
    return ref.startswith('refs/remotes/')

def is_ref_branch(self, ref):
    return ref.startswith('refs/heads/')

def is_ref_tag(self, ref):
    return ref.startswith('refs/tags/')

def is_ref_commit(self, ref):
    """A ref is a commit sha if it is not anything else"""
    return not any((
        self.is_ref_remote(ref),
        self.is_ref_branch(ref),
        self.is_ref_tag(ref),
    ))

# Should deprecate `get_refs` since it's ambiguous
def get_refs(self, location):
    return self.get_short_refs(location)

def get_short_refs(self, location):
    """Return map of named refs (branches or tags) to commit hashes."""
    rv = {}
    for commit, ref in self.get_full_refs(location):
        ref_name = None
        if self.is_ref_remote(ref):
            ref_name = ref[len('refs/remotes/'):]
        elif self.is_ref_branch(ref):
            ref_name = ref[len('refs/heads/'):]
        elif self.is_ref_tag(ref):
            ref_name = ref[len('refs/tags/'):]
        if ref_name is not None:
            rv[ref_name] = commit
    return rv

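Taken together, `get_full_refs` and `get_short_refs` boil down to parsing `git show-ref` output into a name-to-commit map; a self-contained sketch with the same prefix handling (`short_refs_from_show_ref` is a hypothetical helper, not part of the class):

```python
def short_refs_from_show_ref(output):
    # each show-ref line is "<sha> <full ref>", e.g.
    # "aaa refs/heads/master"; strip the known prefixes to get short names
    prefixes = ('refs/remotes/', 'refs/heads/', 'refs/tags/')
    rv = {}
    for line in output.strip().splitlines():
        commit, ref = line.split(' ', 1)
        for prefix in prefixes:
            if ref.startswith(prefix):
                rv[ref[len(prefix):].strip()] = commit.strip()
                break
    return rv
```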
def _get_subdirectory(self, location):
    """Return the relative path of setup.py to the git repo root."""
    # find the repo root
    git_dir = self.run_command(['rev-parse', '--git-dir'],
                               show_stdout=False, cwd=location).strip()
    if not os.path.isabs(git_dir):
        git_dir = os.path.join(location, git_dir)
    root_dir = os.path.join(git_dir, '..')
    # find setup.py
    orig_location = location
    while not os.path.exists(os.path.join(location, 'setup.py')):
        last_location = location
        location = os.path.dirname(location)
        if location == last_location:
            # We've traversed up to the root of the filesystem without
            # finding setup.py
            logger.warning(
                "Could not find setup.py for directory %s (tried all "
                "parent directories)",
                orig_location,
            )
            return None
    # relative path of setup.py to repo root
    if samefile(root_dir, location):
        return None
    return os.path.relpath(location, root_dir)

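The upward walk can be sketched on its own (a hypothetical `find_setup_py_dir` helper; the repo-root comparison via `samefile` and the warning are omitted):

```python
import os

def find_setup_py_dir(location):
    # walk upward from `location` until a directory containing setup.py
    # is found; give up at the filesystem root, where dirname() is a
    # fixed point
    while not os.path.exists(os.path.join(location, 'setup.py')):
        parent = os.path.dirname(location)
        if parent == location:   # reached the root
            return None
        location = parent
    return location
```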
def get_src_requirement(self, dist, location):
    repo = self.get_url(location)
    if not repo:
        return None
    if not repo.lower().startswith('git:'):
        repo = 'git+' + repo
    egg_project_name = dist.egg_name().split('-', 1)[0]
    current_rev = self.get_revision(location)
    req = '%s@%s#egg=%s' % (repo, current_rev, egg_project_name)
    subdirectory = self._get_subdirectory(location)
    if subdirectory:
        req += '&subdirectory=' + subdirectory
    return req

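The requirement string built here has the shape `git+URL@sha#egg=name[&subdirectory=path]`; a sketch of just the formatting (`build_src_requirement` is a hypothetical helper mirroring the method above):

```python
def build_src_requirement(repo_url, sha, project, subdirectory=None):
    # prefix the URL with "git+" unless it already uses the git: scheme
    if not repo_url.lower().startswith('git:'):
        repo_url = 'git+' + repo_url
    req = '%s@%s#egg=%s' % (repo_url, sha, project)
    if subdirectory:
        req += '&subdirectory=' + subdirectory
    return req
```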
def get_url_rev(self):
    """
    Prefixes stub URLs like 'user@hostname:user/repo.git' with 'ssh://'.
    That's required because although they use SSH they sometimes don't
    work with an ssh:// scheme (e.g. GitHub). But we need a scheme for
    parsing. Hence we remove it again afterwards and return it as a stub.
    """
    if '://' not in self.url:
        assert 'file:' not in self.url
        self.url = self.url.replace('git+', 'git+ssh://')
        url, rev = super(Git, self).get_url_rev()
        url = url.replace('ssh://', '')
    else:
        url, rev = super(Git, self).get_url_rev()

    return url, rev

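The temporary-scheme round trip can be sketched as a pair of pure functions (hypothetical names, mirroring the two `replace` calls above):

```python
def make_parseable(url):
    # give scp-style stub URLs ('git+user@host:path') a scheme so a
    # generic URL parser can handle them; scheme-bearing URLs pass through
    if '://' not in url:
        url = url.replace('git+', 'git+ssh://')
    return url

def restore_stub(url):
    # undo the temporary scheme after parsing
    return url.replace('ssh://', '')
```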
def update_submodules(self, location):
    if not os.path.exists(os.path.join(location, '.gitmodules')):
        return
    self.run_command(
        ['submodule', 'update', '--init', '--recursive', '-q'],
        cwd=location,
    )

@classmethod
def controls_location(cls, location):
    if super(Git, cls).controls_location(location):
        return True
    try:
        r = cls().run_command(['rev-parse'],
                              cwd=location,
                              show_stdout=False,
                              on_returncode='ignore')
        return not r
    except BadCommand:
        logger.debug("could not determine if %s is under git control "
                     "because git is not available", location)
        return False


vcs.register(Git)

@ -1,3 +0,0 @@
UNKNOWN


@ -1 +0,0 @@
pip

@ -1,36 +0,0 @@
pkg_resources/__init__.py,sha256=qasrGUKwGQ8dGJP5SOEhLJoWRizj5HinbD2bXfrOH28,103308
pkg_resources/_vendor/__init__.py,sha256=47DEQpj8HBSa-_TImW-5JCeuQeRkm5NMpJWZG3hSuFU,0
pkg_resources/_vendor/appdirs.py,sha256=tgGaL0m4Jo2VeuGfoOOifLv7a7oUEJu2n1vRkqoPw-0,22374
pkg_resources/_vendor/pyparsing.py,sha256=PifeLY3-WhIcBVzLtv0U4T_pwDtPruBhBCkg5vLqa28,229867
pkg_resources/_vendor/six.py,sha256=A6hdJZVjI3t_geebZ9BzUvwRrIXo0lfwzQlM2LcKyas,30098
pkg_resources/_vendor/packaging/__about__.py,sha256=zkcCPTN_6TcLW0Nrlg0176-R1QQ_WVPTm8sz1R4-HjM,720
pkg_resources/_vendor/packaging/__init__.py,sha256=_vNac5TrzwsrzbOFIbF-5cHqc_Y2aPT2D7zrIR06BOo,513
pkg_resources/_vendor/packaging/_compat.py,sha256=Vi_A0rAQeHbU-a9X0tt1yQm9RqkgQbDSxzRw8WlU9kA,860
pkg_resources/_vendor/packaging/_structures.py,sha256=RImECJ4c_wTlaTYYwZYLHEiebDMaAJmK1oPARhw1T5o,1416
pkg_resources/_vendor/packaging/markers.py,sha256=uEcBBtGvzqltgnArqb9c4RrcInXezDLos14zbBHhWJo,8248
pkg_resources/_vendor/packaging/requirements.py,sha256=SikL2UynbsT0qtY9ltqngndha_sfo0w6XGFhAhoSoaQ,4355
pkg_resources/_vendor/packaging/specifiers.py,sha256=SAMRerzO3fK2IkFZCaZkuwZaL_EGqHNOz4pni4vhnN0,28025
pkg_resources/_vendor/packaging/utils.py,sha256=3m6WvPm6NNxE8rkTGmn0r75B_GZSGg7ikafxHsBN1WA,421
pkg_resources/_vendor/packaging/version.py,sha256=OwGnxYfr2ghNzYx59qWIBkrK3SnB6n-Zfd1XaLpnnM0,11556
pkg_resources/extern/__init__.py,sha256=JUtlHHvlxHSNuB4pWqNjcx7n6kG-fwXg7qmJ2zNJlIY,2487
pkg_resources-0.0.0.dist-info/DESCRIPTION.rst,sha256=OCTuuN6LcWulhHS3d5rfjdsQtW22n7HENFRh6jC6ego,10
pkg_resources-0.0.0.dist-info/METADATA,sha256=FOYDX6cmnDUkWo-yhqWQYtjKIMZR2IW2G1GFZhA6gUQ,177
pkg_resources-0.0.0.dist-info/RECORD,,
pkg_resources-0.0.0.dist-info/WHEEL,sha256=o2k-Qa-RMNIJmUdIc7KU6VWR_ErNRbWNlxDIpl7lm34,110
pkg_resources-0.0.0.dist-info/metadata.json,sha256=8ZVRFU96pY_wnWouockCkvXw981Y0iDB5nQFFGq8ZiY,221
pkg_resources-0.0.0.dist-info/INSTALLER,sha256=zuuue4knoyJ-UwPPXg8fezS7VCrXJQrAP7zeNuwvFQg,4
pkg_resources/_vendor/packaging/__pycache__/version.cpython-34.pyc,,
pkg_resources/__pycache__/__init__.cpython-34.pyc,,
pkg_resources/_vendor/__pycache__/appdirs.cpython-34.pyc,,
pkg_resources/_vendor/packaging/__pycache__/_structures.cpython-34.pyc,,
pkg_resources/_vendor/packaging/__pycache__/__init__.cpython-34.pyc,,
pkg_resources/_vendor/packaging/__pycache__/specifiers.cpython-34.pyc,,
pkg_resources/_vendor/__pycache__/__init__.cpython-34.pyc,,
pkg_resources/_vendor/packaging/__pycache__/utils.cpython-34.pyc,,
pkg_resources/_vendor/__pycache__/six.cpython-34.pyc,,
pkg_resources/_vendor/__pycache__/pyparsing.cpython-34.pyc,,
pkg_resources/_vendor/packaging/__pycache__/markers.cpython-34.pyc,,
pkg_resources/_vendor/packaging/__pycache__/__about__.cpython-34.pyc,,
pkg_resources/_vendor/packaging/__pycache__/_compat.cpython-34.pyc,,
pkg_resources/_vendor/packaging/__pycache__/requirements.cpython-34.pyc,,
pkg_resources/extern/__pycache__/__init__.cpython-34.pyc,,

@ -1 +0,0 @@
{"extensions": {"python.details": {"document_names": {"description": "DESCRIPTION.rst"}}}, "generator": "bdist_wheel (0.29.0)", "metadata_version": "2.0", "name": "pkg_resources", "summary": "UNKNOWN", "version": "0.0.0"}

@ -1,36 +0,0 @@
.. image:: https://img.shields.io/pypi/v/setuptools.svg
   :target: https://pypi.org/project/setuptools

.. image:: https://readthedocs.org/projects/setuptools/badge/?version=latest
   :target: https://setuptools.readthedocs.io

.. image:: https://img.shields.io/travis/pypa/setuptools/master.svg?label=Linux%20build%20%40%20Travis%20CI
   :target: http://travis-ci.org/pypa/setuptools

.. image:: https://img.shields.io/appveyor/ci/jaraco/setuptools/master.svg?label=Windows%20build%20%40%20Appveyor
   :target: https://ci.appveyor.com/project/jaraco/setuptools/branch/master

.. image:: https://img.shields.io/pypi/pyversions/setuptools.svg

See the `Installation Instructions
<https://packaging.python.org/installing/>`_ in the Python Packaging
User's Guide for instructions on installing, upgrading, and uninstalling
Setuptools.

The project is `maintained at GitHub <https://github.com/pypa/setuptools>`_.

Questions and comments should be directed to the `distutils-sig
mailing list <http://mail.python.org/pipermail/distutils-sig/>`_.
Bug reports and especially tested patches may be
submitted directly to the `bug tracker
<https://github.com/pypa/setuptools/issues>`_.


Code of Conduct
---------------

Everyone interacting in the setuptools project's codebases, issue trackers,
chat rooms, and mailing lists is expected to follow the
`PyPA Code of Conduct <https://www.pypa.io/en/latest/code-of-conduct/>`_.


@ -1 +0,0 @@
pip

@ -1,159 +0,0 @@
easy_install.py,sha256=MDC9vt5AxDsXX5qcKlBz2TnW6Tpuv_AobnfhCJ9X3PM,126
pkg_resources/__init__.py,sha256=0q4Rx1CSzw9caT4ewfrQmAAC60NZCjSQU-9vQjP34yo,106202
pkg_resources/py31compat.py,sha256=-ysVqoxLetAnL94uM0kHkomKQTC1JZLN2ZUjqUhMeKE,600
pkg_resources/_vendor/__init__.py,sha256=47DEQpj8HBSa-_TImW-5JCeuQeRkm5NMpJWZG3hSuFU,0
pkg_resources/_vendor/appdirs.py,sha256=tgGaL0m4Jo2VeuGfoOOifLv7a7oUEJu2n1vRkqoPw-0,22374
pkg_resources/_vendor/pyparsing.py,sha256=PifeLY3-WhIcBVzLtv0U4T_pwDtPruBhBCkg5vLqa28,229867
pkg_resources/_vendor/six.py,sha256=A6hdJZVjI3t_geebZ9BzUvwRrIXo0lfwzQlM2LcKyas,30098
pkg_resources/_vendor/packaging/__about__.py,sha256=zkcCPTN_6TcLW0Nrlg0176-R1QQ_WVPTm8sz1R4-HjM,720
pkg_resources/_vendor/packaging/__init__.py,sha256=_vNac5TrzwsrzbOFIbF-5cHqc_Y2aPT2D7zrIR06BOo,513
pkg_resources/_vendor/packaging/_compat.py,sha256=Vi_A0rAQeHbU-a9X0tt1yQm9RqkgQbDSxzRw8WlU9kA,860
pkg_resources/_vendor/packaging/_structures.py,sha256=RImECJ4c_wTlaTYYwZYLHEiebDMaAJmK1oPARhw1T5o,1416
pkg_resources/_vendor/packaging/markers.py,sha256=uEcBBtGvzqltgnArqb9c4RrcInXezDLos14zbBHhWJo,8248
pkg_resources/_vendor/packaging/requirements.py,sha256=SikL2UynbsT0qtY9ltqngndha_sfo0w6XGFhAhoSoaQ,4355
pkg_resources/_vendor/packaging/specifiers.py,sha256=SAMRerzO3fK2IkFZCaZkuwZaL_EGqHNOz4pni4vhnN0,28025
pkg_resources/_vendor/packaging/utils.py,sha256=3m6WvPm6NNxE8rkTGmn0r75B_GZSGg7ikafxHsBN1WA,421
pkg_resources/_vendor/packaging/version.py,sha256=OwGnxYfr2ghNzYx59qWIBkrK3SnB6n-Zfd1XaLpnnM0,11556
pkg_resources/extern/__init__.py,sha256=JUtlHHvlxHSNuB4pWqNjcx7n6kG-fwXg7qmJ2zNJlIY,2487
setuptools/__init__.py,sha256=MsRcLyrl8E49pBeFZ-PSwST-I2adqjvkfCC1h9gl0TQ,5037
setuptools/archive_util.py,sha256=Z58-gbZQ0j92UJy7X7uZevwI28JTVEXd__AjKy4aw78,6613
setuptools/build_meta.py,sha256=Z8fCFFJooVDcBuSUlVBWgwV41B9raH1sINpOP5-4o2Y,4756
setuptools/cli-32.exe,sha256=dfEuovMNnA2HLa3jRfMPVi5tk4R7alCbpTvuxtCyw0Y,65536
setuptools/cli-64.exe,sha256=KLABu5pyrnokJCv6skjXZ6GsXeyYHGcqOUT3oHI3Xpo,74752
setuptools/cli.exe,sha256=dfEuovMNnA2HLa3jRfMPVi5tk4R7alCbpTvuxtCyw0Y,65536
setuptools/config.py,sha256=ms8JAS3aHsOun-OO-jyvrQq3txyRE2AwKOiZP1aTan8,16317
setuptools/dep_util.py,sha256=fgixvC1R7sH3r13ktyf7N0FALoqEXL1cBarmNpSEoWg,935
setuptools/depends.py,sha256=hC8QIDcM3VDpRXvRVA6OfL9AaQfxvhxHcN_w6sAyNq8,5837
setuptools/dist.py,sha256=PZjofGBK1ZzA-VpbwuTlxf9XMkvwmGYPSIqUl8FpE2k,40364
setuptools/extension.py,sha256=uc6nHI-MxwmNCNPbUiBnybSyqhpJqjbhvOQ-emdvt_E,1729
setuptools/glob.py,sha256=Y-fpv8wdHZzv9DPCaGACpMSBWJ6amq_1e0R_i8_el4w,5207
setuptools/gui-32.exe,sha256=XBr0bHMA6Hpz2s9s9Bzjl-PwXfa9nH4ie0rFn4V2kWA,65536
setuptools/gui-64.exe,sha256=aYKMhX1IJLn4ULHgWX0sE0yREUt6B3TEHf_jOw6yNyE,75264
setuptools/gui.exe,sha256=XBr0bHMA6Hpz2s9s9Bzjl-PwXfa9nH4ie0rFn4V2kWA,65536
setuptools/launch.py,sha256=sd7ejwhBocCDx_wG9rIs0OaZ8HtmmFU8ZC6IR_S0Lvg,787
setuptools/lib2to3_ex.py,sha256=t5e12hbR2pi9V4ezWDTB4JM-AISUnGOkmcnYHek3xjg,2013
setuptools/monkey.py,sha256=s-yH6vfMFxXMrfVInT9_3gnEyAn-TYMHtXVNUOVI4T8,5791
setuptools/msvc.py,sha256=AEbWNLJ0pTuHJSkQuBZET6wr_d2-yGGPkdHCMdIKWB4,40884
setuptools/namespaces.py,sha256=F0Nrbv8KCT2OrO7rwa03om4N4GZKAlnce-rr-cgDQa8,3199
setuptools/package_index.py,sha256=ELInXIlJZqNbeAKAHYZVDLbwOkYZt-o-vyaFK_eS_N0,39970
setuptools/py26compat.py,sha256=VRGHC7z2gliR4_uICJsQNodUcNUzybpus3BrJkWbnK4,679
setuptools/py27compat.py,sha256=3mwxRMDk5Q5O1rSXOERbQDXhFqwDJhhUitfMW_qpUCo,536
setuptools/py31compat.py,sha256=qGRk3tefux8HbhNzhM0laR3mD8vhAZtffZgzLkBMXJs,1645
setuptools/py33compat.py,sha256=W8_JFZr8WQbJT_7-JFWjc_6lHGtoMK-4pCrHIwk5JN0,998
setuptools/py36compat.py,sha256=VUDWxmu5rt4QHlGTRtAFu6W5jvfL6WBjeDAzeoBy0OM,2891
setuptools/sandbox.py,sha256=hkGRod5_yt3EBHkGnRBf7uK1YceoqFpTT4b__9ZZ5UU,14549
setuptools/script (dev).tmpl,sha256=f7MR17dTkzaqkCMSVseyOCMVrPVSMdmTQsaB8cZzfuI,201
setuptools/script.tmpl,sha256=WGTt5piezO27c-Dbx6l5Q4T3Ff20A5z7872hv3aAhYY,138
setuptools/site-patch.py,sha256=BVt6yIrDMXJoflA5J6DJIcsJUfW_XEeVhOzelTTFDP4,2307
setuptools/ssl_support.py,sha256=Axo1QtiAtsvuENZq_BvhW5PeWw2nrX39-4qoSiVpB6w,8220
setuptools/unicode_utils.py,sha256=NOiZ_5hD72A6w-4wVj8awHFM3n51Kmw1Ic_vx15XFqw,996
setuptools/version.py,sha256=og_cuZQb0QI6ukKZFfZWPlr1HgJBPPn2vO2m_bI9ZTE,144
setuptools/windows_support.py,sha256=5GrfqSP2-dLGJoZTq2g6dCKkyQxxa2n5IQiXlJCoYEE,714
setuptools/command/__init__.py,sha256=-X7tSQahlz8sbGu_Xq9bqumFE117jU56E96tDDufNqw,590
setuptools/command/alias.py,sha256=KjpE0sz_SDIHv3fpZcIQK-sCkJz-SrC6Gmug6b9Nkc8,2426
setuptools/command/bdist_egg.py,sha256=TGN1XVQb9V8Rf-msDKaIZWmeGQf81HT83oqXJ_3M0gg,17441
setuptools/command/bdist_rpm.py,sha256=B7l0TnzCGb-0nLlm6rS00jWLkojASwVmdhW2w5Qz_Ak,1508
setuptools/command/bdist_wininst.py,sha256=_6dz3lpB1tY200LxKPLM7qgwTCceOMgaWFF-jW2-pm0,637
setuptools/command/build_clib.py,sha256=bQ9aBr-5ZSO-9fGsGsDLz0mnnFteHUZnftVLkhvHDq0,4484
setuptools/command/build_ext.py,sha256=dO89j-IC0dAjSty1sSZxvi0LSdkPGR_ZPXFuAAFDZj4,13049
setuptools/command/build_py.py,sha256=yWyYaaS9F3o9JbIczn064A5g1C5_UiKRDxGaTqYbtLE,9596
setuptools/command/develop.py,sha256=PuVOjmGWGfvHZmOBMj_bdeU087kl0jhnMHqKcDODBDE,8024
setuptools/command/dist_info.py,sha256=7Ewmog46orGjzME5UA_GQvqewRd1s25aCLxsfHCKqq8,924
setuptools/command/easy_install.py,sha256=eruE4R4JfOTx0_0hDYMMElpup33Qkn0P44lclgP8dA0,85973
setuptools/command/egg_info.py,sha256=HNUt2tQAAp8dULFS_6Qk9vflESI7jdqlCqq-VVQi7AA,25016
setuptools/command/install.py,sha256=a0EZpL_A866KEdhicTGbuyD_TYl1sykfzdrri-zazT4,4683
setuptools/command/install_egg_info.py,sha256=bMgeIeRiXzQ4DAGPV1328kcjwQjHjOWU4FngAWLV78Q,2203
setuptools/command/install_lib.py,sha256=11mxf0Ch12NsuYwS8PHwXBRvyh671QAM4cTRh7epzG0,3840
setuptools/command/install_scripts.py,sha256=UD0rEZ6861mTYhIdzcsqKnUl8PozocXWl9VBQ1VTWnc,2439
setuptools/command/launcher manifest.xml,sha256=xlLbjWrB01tKC0-hlVkOKkiSPbzMml2eOPtJ_ucCnbE,628
setuptools/command/py36compat.py,sha256=SzjZcOxF7zdFUT47Zv2n7AM3H8koDys_0OpS-n9gIfc,4986
setuptools/command/register.py,sha256=bHlMm1qmBbSdahTOT8w6UhA-EgeQIz7p6cD-qOauaiI,270
setuptools/command/rotate.py,sha256=co5C1EkI7P0GGT6Tqz-T2SIj2LBJTZXYELpmao6d4KQ,2164
setuptools/command/saveopts.py,sha256=za7QCBcQimKKriWcoCcbhxPjUz30gSB74zuTL47xpP4,658
setuptools/command/sdist.py,sha256=VldpcHRSlDrvvK2uV9O6HjQA2OtHCUa4QaMkYCYwTrA,6919
setuptools/command/setopt.py,sha256=NTWDyx-gjDF-txf4dO577s7LOzHVoKR0Mq33rFxaRr8,5085
setuptools/command/test.py,sha256=koi5lqjhXHlt0B3egYb98qRVETzKXKhWDD5OQY-AKuA,9044
setuptools/command/upload.py,sha256=i1gfItZ3nQOn5FKXb8tLC2Kd7eKC8lWO4bdE6NqGpE4,1172
setuptools/command/upload_docs.py,sha256=oXiGplM_cUKLwE4CWWw98RzCufAu8tBhMC97GegFcms,7311
setuptools/extern/__init__.py,sha256=ZtCLYQ8JTtOtm7SYoxekZw-UzY3TR50SRIUaeqr2ROk,131
setuptools-36.6.0.dist-info/DESCRIPTION.rst,sha256=1sSNG6a5L3fSMo1x9uE3jvumlEODgeqBUtSaYp_VVLw,1421
setuptools-36.6.0.dist-info/METADATA,sha256=GLuJ3zbtJdt_nwgq9UIpUoXOis1Ub4tWeOTKQIZHT1s,2847
setuptools-36.6.0.dist-info/RECORD,,
setuptools-36.6.0.dist-info/WHEEL,sha256=kdsN-5OJAZIiHN-iO4Rhl82KyS0bDWf4uBwMbkNafr8,110
setuptools-36.6.0.dist-info/dependency_links.txt,sha256=HlkCFkoK5TbZ5EMLbLKYhLcY_E31kBWD8TqW2EgmatQ,239
setuptools-36.6.0.dist-info/entry_points.txt,sha256=jBqCYDlVjl__sjYFGXo1JQGIMAYFJE-prYWUtnMZEew,2990
setuptools-36.6.0.dist-info/metadata.json,sha256=4yqt7_oaFRn8AA20H0H5W2AByP8z-0HuDpwGyiQH6UU,4916
setuptools-36.6.0.dist-info/top_level.txt,sha256=2HUXVVwA4Pff1xgTFr3GsTXXKaPaO6vlG6oNJ_4u4Tg,38
setuptools-36.6.0.dist-info/zip-safe,sha256=AbpHGcgLb-kRsJGnwFEktk7uzpZOCcBY74-YBdrKVGs,1
../../../bin/easy_install,sha256=tsci1id0sS7h2uWc2NQJYflZoKSI8AR-W02mXYmf7Es,300
../../../bin/easy_install-3.4,sha256=tsci1id0sS7h2uWc2NQJYflZoKSI8AR-W02mXYmf7Es,300
setuptools-36.6.0.dist-info/INSTALLER,sha256=zuuue4knoyJ-UwPPXg8fezS7VCrXJQrAP7zeNuwvFQg,4
setuptools/__pycache__/py31compat.cpython-34.pyc,,
setuptools/command/__pycache__/bdist_egg.cpython-34.pyc,,
setuptools/__pycache__/sandbox.cpython-34.pyc,,
setuptools/__pycache__/__init__.cpython-34.pyc,,
setuptools/extern/__pycache__/__init__.cpython-34.pyc,,
setuptools/__pycache__/site-patch.cpython-34.pyc,,
setuptools/__pycache__/config.cpython-34.pyc,,
setuptools/__pycache__/py26compat.cpython-34.pyc,,
setuptools/command/__pycache__/bdist_rpm.cpython-34.pyc,,
setuptools/command/__pycache__/test.cpython-34.pyc,,
pkg_resources/extern/__pycache__/__init__.cpython-34.pyc,,
setuptools/command/__pycache__/upload_docs.cpython-34.pyc,,
setuptools/__pycache__/ssl_support.cpython-34.pyc,,
setuptools/command/__pycache__/alias.cpython-34.pyc,,
pkg_resources/_vendor/__pycache__/pyparsing.cpython-34.pyc,,
pkg_resources/_vendor/packaging/__pycache__/__about__.cpython-34.pyc,,
setuptools/__pycache__/py33compat.cpython-34.pyc,,
setuptools/command/__pycache__/build_py.cpython-34.pyc,,
pkg_resources/_vendor/packaging/__pycache__/_compat.cpython-34.pyc,,
setuptools/command/__pycache__/install_lib.cpython-34.pyc,,
setuptools/command/__pycache__/dist_info.cpython-34.pyc,,
setuptools/__pycache__/build_meta.cpython-34.pyc,,
setuptools/command/__pycache__/bdist_wininst.cpython-34.pyc,,
setuptools/__pycache__/extension.cpython-34.pyc,,
pkg_resources/_vendor/packaging/__pycache__/_structures.cpython-34.pyc,,
pkg_resources/_vendor/packaging/__pycache__/__init__.cpython-34.pyc,,
setuptools/command/__pycache__/install_scripts.cpython-34.pyc,,
setuptools/command/__pycache__/install.cpython-34.pyc,,
setuptools/__pycache__/py27compat.cpython-34.pyc,,
setuptools/__pycache__/py36compat.cpython-34.pyc,,
setuptools/command/__pycache__/sdist.cpython-34.pyc,,
setuptools/__pycache__/package_index.cpython-34.pyc,,
setuptools/__pycache__/msvc.cpython-34.pyc,,
setuptools/__pycache__/archive_util.cpython-34.pyc,,
setuptools/command/__pycache__/egg_info.cpython-34.pyc,,
pkg_resources/_vendor/packaging/__pycache__/requirements.cpython-34.pyc,,
setuptools/__pycache__/lib2to3_ex.cpython-34.pyc,,
setuptools/command/__pycache__/install_egg_info.cpython-34.pyc,,
setuptools/command/__pycache__/upload.cpython-34.pyc,,
setuptools/command/__pycache__/build_ext.cpython-34.pyc,,
pkg_resources/__pycache__/__init__.cpython-34.pyc,,
pkg_resources/_vendor/__pycache__/appdirs.cpython-34.pyc,,
setuptools/__pycache__/namespaces.cpython-34.pyc,,
setuptools/__pycache__/monkey.cpython-34.pyc,,
setuptools/command/__pycache__/build_clib.cpython-34.pyc,,
pkg_resources/_vendor/__pycache__/six.cpython-34.pyc,,
pkg_resources/__pycache__/py31compat.cpython-34.pyc,,
pkg_resources/_vendor/packaging/__pycache__/specifiers.cpython-34.pyc,,
setuptools/__pycache__/dist.cpython-34.pyc,,
setuptools/__pycache__/depends.cpython-34.pyc,,
__pycache__/easy_install.cpython-34.pyc,,
setuptools/__pycache__/dep_util.cpython-34.pyc,,
setuptools/command/__pycache__/setopt.cpython-34.pyc,,
setuptools/__pycache__/version.cpython-34.pyc,,
setuptools/__pycache__/windows_support.cpython-34.pyc,,
pkg_resources/_vendor/packaging/__pycache__/utils.cpython-34.pyc,,
setuptools/__pycache__/glob.cpython-34.pyc,,
setuptools/command/__pycache__/develop.cpython-34.pyc,,
pkg_resources/_vendor/packaging/__pycache__/markers.cpython-34.pyc,,
setuptools/__pycache__/launch.cpython-34.pyc,,
setuptools/command/__pycache__/rotate.cpython-34.pyc,,
pkg_resources/_vendor/packaging/__pycache__/version.cpython-34.pyc,,
setuptools/command/__pycache__/py36compat.cpython-34.pyc,,
setuptools/command/__pycache__/__init__.cpython-34.pyc,,
setuptools/command/__pycache__/register.cpython-34.pyc,,
setuptools/__pycache__/unicode_utils.cpython-34.pyc,,
pkg_resources/_vendor/__pycache__/__init__.cpython-34.pyc,,
setuptools/command/__pycache__/easy_install.cpython-34.pyc,,
setuptools/command/__pycache__/saveopts.cpython-34.pyc,,

@ -1 +0,0 @@
{"classifiers": ["Development Status :: 5 - Production/Stable", "Intended Audience :: Developers", "License :: OSI Approved :: MIT License", "Operating System :: OS Independent", "Programming Language :: Python :: 2", "Programming Language :: Python :: 2.6", "Programming Language :: Python :: 2.7", "Programming Language :: Python :: 3", "Programming Language :: Python :: 3.3", "Programming Language :: Python :: 3.4", "Programming Language :: Python :: 3.5", "Programming Language :: Python :: 3.6", "Topic :: Software Development :: Libraries :: Python Modules", "Topic :: System :: Archiving :: Packaging", "Topic :: System :: Systems Administration", "Topic :: Utilities"], "description_content_type": "text/x-rst; charset=UTF-8", "extensions": {"python.commands": {"wrap_console": {"easy_install": "setuptools.command.easy_install:main", "easy_install-3.6": "setuptools.command.easy_install:main"}}, "python.details": {"contacts": [{"email": "distutils-sig@python.org", "name": "Python Packaging Authority", "role": "author"}], "document_names": {"description": "DESCRIPTION.rst"}, "project_urls": {"Home": "https://github.com/pypa/setuptools"}}, "python.exports": {"console_scripts": {"easy_install": "setuptools.command.easy_install:main", "easy_install-3.6": "setuptools.command.easy_install:main"}, "distutils.commands": {"alias": "setuptools.command.alias:alias", "bdist_egg": "setuptools.command.bdist_egg:bdist_egg", "bdist_rpm": "setuptools.command.bdist_rpm:bdist_rpm", "bdist_wininst": "setuptools.command.bdist_wininst:bdist_wininst", "build_clib": "setuptools.command.build_clib:build_clib", "build_ext": "setuptools.command.build_ext:build_ext", "build_py": "setuptools.command.build_py:build_py", "develop": "setuptools.command.develop:develop", "dist_info": "setuptools.command.dist_info:dist_info", "easy_install": "setuptools.command.easy_install:easy_install", "egg_info": "setuptools.command.egg_info:egg_info", "install": "setuptools.command.install:install", 
"install_egg_info": "setuptools.command.install_egg_info:install_egg_info", "install_lib": "setuptools.command.install_lib:install_lib", "install_scripts": "setuptools.command.install_scripts:install_scripts", "register": "setuptools.command.register:register", "rotate": "setuptools.command.rotate:rotate", "saveopts": "setuptools.command.saveopts:saveopts", "sdist": "setuptools.command.sdist:sdist", "setopt": "setuptools.command.setopt:setopt", "test": "setuptools.command.test:test", "upload": "setuptools.command.upload:upload", "upload_docs": "setuptools.command.upload_docs:upload_docs"}, "distutils.setup_keywords": {"convert_2to3_doctests": "setuptools.dist:assert_string_list", "dependency_links": "setuptools.dist:assert_string_list", "eager_resources": "setuptools.dist:assert_string_list", "entry_points": "setuptools.dist:check_entry_points", "exclude_package_data": "setuptools.dist:check_package_data", "extras_require": "setuptools.dist:check_extras", "include_package_data": "setuptools.dist:assert_bool", "install_requires": "setuptools.dist:check_requirements", "namespace_packages": "setuptools.dist:check_nsp", "package_data": "setuptools.dist:check_package_data", "packages": "setuptools.dist:check_packages", "python_requires": "setuptools.dist:check_specifier", "setup_requires": "setuptools.dist:check_requirements", "test_loader": "setuptools.dist:check_importable", "test_runner": "setuptools.dist:check_importable", "test_suite": "setuptools.dist:check_test_suite", "tests_require": "setuptools.dist:check_requirements", "use_2to3": "setuptools.dist:assert_bool", "use_2to3_exclude_fixers": "setuptools.dist:assert_string_list", "use_2to3_fixers": "setuptools.dist:assert_string_list", "zip_safe": "setuptools.dist:assert_bool"}, "egg_info.writers": {"PKG-INFO": "setuptools.command.egg_info:write_pkg_info", "dependency_links.txt": "setuptools.command.egg_info:overwrite_arg", "depends.txt": "setuptools.command.egg_info:warn_depends_obsolete", "eager_resources.txt": 
"setuptools.command.egg_info:overwrite_arg", "entry_points.txt": "setuptools.command.egg_info:write_entries", "namespace_packages.txt": "setuptools.command.egg_info:overwrite_arg", "requires.txt": "setuptools.command.egg_info:write_requirements", "top_level.txt": "setuptools.command.egg_info:write_toplevel_names"}, "setuptools.installation": {"eggsecutable": "setuptools.command.easy_install:bootstrap"}}}, "extras": ["certs", "ssl"], "generator": "bdist_wheel (0.30.0)", "keywords": ["CPAN", "PyPI", "distutils", "eggs", "package", "management"], "metadata_version": "2.0", "name": "setuptools", "requires_python": ">=2.6,!=3.0.*,!=3.1.*,!=3.2.*", "run_requires": [{"extra": "certs", "requires": ["certifi (==2016.9.26)"]}, {"environment": "sys_platform=='win32'", "extra": "ssl", "requires": ["wincertstore (==0.2)"]}], "summary": "Easily download, build, install, upgrade, and uninstall Python packages", "version": "36.6.0"}
@ -1,10 +0,0 @@
import distutils.command.register as orig


class register(orig.register):
    __doc__ = orig.register.__doc__

    def run(self):
        # Make sure that we are using valid current name/version info
        self.run_command('egg_info')
        orig.register.run(self)

@ -1,4 +0,0 @@
from pkg_resources.extern import VendorImporter

names = 'six',
VendorImporter(__name__, names, 'pkg_resources._vendor').install()

@@ -1,31 +0,0 @@
"""
Compatibility Support for Python 2.6 and earlier
"""

import sys

try:
    from urllib.parse import splittag
except ImportError:
    from urllib import splittag


def strip_fragment(url):
    """
    In `Python 8280 <http://bugs.python.org/issue8280>`_, Python 2.7 and
    later was patched to disregard the fragment when making URL requests.
    Do the same for Python 2.6 and earlier.
    """
    url, fragment = splittag(url)
    return url


if sys.version_info >= (2, 7):
    strip_fragment = lambda x: x

try:
    from importlib import import_module
except ImportError:

    def import_module(module_name):
        return __import__(module_name, fromlist=['__name__'])
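On Python 3, the fragment-stripping behaviour that the deleted shim back-ported is available via `urllib.parse.urldefrag`; a minimal sketch of the equivalent (the example URL is illustrative):

```python
from urllib.parse import urldefrag


def strip_fragment(url):
    # Drop any "#fragment" before making a request, mirroring the
    # behaviour the shim above provided for Python 2.6.
    url, _fragment = urldefrag(url)
    return url


assert strip_fragment('http://example.com/pkg#md5=abc') == 'http://example.com/pkg'
assert strip_fragment('http://example.com/pkg') == 'http://example.com/pkg'
```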
@@ -1,56 +0,0 @@
import sys
import unittest

__all__ = ['get_config_vars', 'get_path']

try:
    # Python 2.7 or >=3.2
    from sysconfig import get_config_vars, get_path
except ImportError:
    from distutils.sysconfig import get_config_vars, get_python_lib

    def get_path(name):
        if name not in ('platlib', 'purelib'):
            raise ValueError("Name must be purelib or platlib")
        return get_python_lib(name == 'platlib')


try:
    # Python >=3.2
    from tempfile import TemporaryDirectory
except ImportError:
    import shutil
    import tempfile

    class TemporaryDirectory(object):
        """
        Very simple temporary directory context manager.
        Will try to delete afterward, but will also ignore OS and similar
        errors on deletion.
        """

        def __init__(self):
            self.name = None  # Handle mkdtemp raising an exception
            self.name = tempfile.mkdtemp()

        def __enter__(self):
            return self.name

        def __exit__(self, exctype, excvalue, exctrace):
            try:
                shutil.rmtree(self.name, True)
            except OSError:  # removal errors are not the only possible
                pass
            self.name = None


unittest_main = unittest.main

_PY31 = (3, 1) <= sys.version_info[:2] < (3, 2)
if _PY31:
    # on Python 3.1, translate testRunner==None to TextTestRunner
    # for compatibility with Python 2.6, 2.7, and 3.2+
    def unittest_main(*args, **kwargs):
        if 'testRunner' in kwargs and kwargs['testRunner'] is None:
            kwargs['testRunner'] = unittest.TextTestRunner
        return unittest.main(*args, **kwargs)
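The `TemporaryDirectory` fallback in the deleted file mirrors the class that ships in the stdlib since Python 3.2; a minimal usage sketch against the stdlib version, showing the same context-manager contract (enter yields the path, exit removes the tree):

```python
import os
from tempfile import TemporaryDirectory  # stdlib since Python 3.2

with TemporaryDirectory() as name:
    path = os.path.join(name, 'example.txt')
    with open(path, 'w') as f:
        f.write('hello')
    # contents exist while the context is open
    assert os.path.exists(path)

# directory and its contents are removed on exit
assert not os.path.exists(name)
```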
@@ -1,27 +0,0 @@
.. image:: http://img.shields.io/pypi/v/six.svg
   :target: https://pypi.python.org/pypi/six

.. image:: https://travis-ci.org/benjaminp/six.svg?branch=master
   :target: https://travis-ci.org/benjaminp/six

.. image:: http://img.shields.io/badge/license-MIT-green.svg
   :target: https://github.com/benjaminp/six/blob/master/LICENSE

Six is a Python 2 and 3 compatibility library. It provides utility functions
for smoothing over the differences between the Python versions with the goal of
writing Python code that is compatible on both Python versions. See the
documentation for more information on what is provided.

Six supports every Python version since 2.6. It is contained in only one Python
file, so it can be easily copied into your project. (The copyright and license
notice must be retained.)

Online documentation is at http://six.rtfd.org.

Bugs can be reported to https://github.com/benjaminp/six. The code can also
be found there.

For questions about six or porting in general, email the python-porting mailing
list: https://mail.python.org/mailman/listinfo/python-porting
@@ -1 +0,0 @@
pip
@@ -1,9 +0,0 @@
six.py,sha256=A08MPb-Gi9FfInI3IW7HimXFmEH2T2IPzHgDvdhZPRA,30888
six-1.11.0.dist-info/DESCRIPTION.rst,sha256=gPBoq1Ruc1QDWyLeXPlieL3F-XZz1_WXB-5gctCfg-A,1098
six-1.11.0.dist-info/METADATA,sha256=06nZXaDYN3vnC-pmUjhkECYFH_a--ywvcPIpUdNeH1o,1607
six-1.11.0.dist-info/RECORD,,
six-1.11.0.dist-info/WHEEL,sha256=o2k-Qa-RMNIJmUdIc7KU6VWR_ErNRbWNlxDIpl7lm34,110
six-1.11.0.dist-info/metadata.json,sha256=ac3f4f7MpSHSnZ1SqhHCwsL7FGWMG0gBEb0hhS2eSSM,703
six-1.11.0.dist-info/top_level.txt,sha256=_iVH_iYEtEXnD8nYGQYpYFUvkUW9sEO1GYbkeKSAais,4
six-1.11.0.dist-info/INSTALLER,sha256=zuuue4knoyJ-UwPPXg8fezS7VCrXJQrAP7zeNuwvFQg,4
__pycache__/six.cpython-34.pyc,,
@@ -1 +0,0 @@
{"classifiers": ["Programming Language :: Python :: 2", "Programming Language :: Python :: 3", "Intended Audience :: Developers", "License :: OSI Approved :: MIT License", "Topic :: Software Development :: Libraries", "Topic :: Utilities"], "extensions": {"python.details": {"contacts": [{"email": "benjamin@python.org", "name": "Benjamin Peterson", "role": "author"}], "document_names": {"description": "DESCRIPTION.rst"}, "project_urls": {"Home": "http://pypi.python.org/pypi/six/"}}}, "generator": "bdist_wheel (0.29.0)", "license": "MIT", "metadata_version": "2.0", "name": "six", "summary": "Python 2 and 3 compatibility utilities", "test_requires": [{"requires": ["pytest"]}], "version": "1.11.0"}
Binary file not shown.
Binary file not shown.
Binary file not shown.
Binary file not shown.
Binary file not shown.
Binary file not shown.
Binary file not shown.
Binary file not shown.
Binary file not shown.
@@ -1,340 +0,0 @@
Wheel
=====

A built-package format for Python.

A wheel is a ZIP-format archive with a specially formatted filename
and the .whl extension. It is designed to contain all the files for a
PEP 376 compatible install in a way that is very close to the on-disk
format. Many packages will be properly installed with only the "Unpack"
step (simply extracting the file onto sys.path), and the unpacked archive
preserves enough information to "Spread" (copy data and scripts to their
final locations) at any later time.

The wheel project provides a `bdist_wheel` command for setuptools
(requires setuptools >= 0.8.0). Wheel files can be installed with a
newer `pip` from https://github.com/pypa/pip or with wheel's own command
line utility.

The wheel documentation is at http://wheel.rtfd.org/. The file format
is documented in PEP 427 (http://www.python.org/dev/peps/pep-0427/).

The reference implementation is at https://github.com/pypa/wheel

Why not egg?
------------

Python's egg format predates the packaging related standards we have
today, the most important being PEP 376 "Database of Installed Python
Distributions" which specifies the .dist-info directory (instead of
.egg-info) and PEP 426 "Metadata for Python Software Packages 2.0"
which specifies how to express dependencies (instead of requires.txt
in .egg-info).

Wheel implements these things. It also provides a richer file naming
convention that communicates the Python implementation and ABI as well
as simply the language version used in a particular package.

Unlike .egg, wheel will be a fully-documented standard at the binary
level that is truly easy to install even if you do not want to use the
reference implementation.


Code of Conduct
---------------

Everyone interacting in the wheel project's codebases, issue trackers, chat
rooms, and mailing lists is expected to follow the `PyPA Code of Conduct`_.

.. _PyPA Code of Conduct: https://www.pypa.io/en/latest/code-of-conduct/


0.30.0
======
- Added py-limited-api {cp32|cp33|cp34|...} flag to produce cpNN.abi3.{arch}
  tags on CPython 3.
- Documented the ``license_file`` metadata key
- Improved Python, abi tagging for `wheel convert`. Thanks Ales Erjavec.
- Fixed `>` being prepended to lines starting with "From" in the long description
- Added support for specifying a build number (as per PEP 427).
  Thanks Ian Cordasco.
- Made the order of files in generated ZIP files deterministic.
  Thanks Matthias Bach.
- Made the order of requirements in metadata deterministic. Thanks Chris Lamb.
- Fixed `wheel install` clobbering existing files
- Improved the error message when trying to verify an unsigned wheel file
- Removed support for Python 2.6, 3.2 and 3.3.

0.29.0
======
- Fix compression type of files in archive (Issue #155, Pull Request #62,
  thanks Xavier Fernandez)

0.28.0
======
- Fix file modes in archive (Issue #154)

0.27.0
======
- Support forcing a platform tag using `--plat-name` on pure-Python wheels, as
  well as nonstandard platform tags on non-pure wheels (Pull Request #60, Issue
  #144, thanks Andrés Díaz)
- Add SOABI tags to platform-specific wheels built for Python 2.X (Pull Request
  #55, Issue #63, Issue #101)
- Support reproducible wheel files, wheels that can be rebuilt and will hash to
  the same values as previous builds (Pull Request #52, Issue #143, thanks
  Barry Warsaw)
- Support for changes in keyring >= 8.0 (Pull Request #61, thanks Jason R.
  Coombs)
- Use the file context manager when checking if dependency_links.txt is empty,
  fixes problems building wheels under PyPy on Windows (Issue #150, thanks
  Cosimo Lupo)
- Don't attempt to (recursively) create a build directory ending with `..`
  (invalid on all platforms, but code was only executed on Windows) (Issue #91)
- Added the PyPA Code of Conduct (Pull Request #56)

0.26.0
======
- Fix multiple entrypoint comparison failure on Python 3 (Issue #148)

0.25.0
======
- Add Python 3.5 to tox configuration
- Deterministic (sorted) metadata
- Fix tagging for Python 3.5 compatibility
- Support py2-none-'arch' and py3-none-'arch' tags
- Treat data-only wheels as pure
- Write to temporary file and rename when using wheel install --force

0.24.0
======
- The python tag used for pure-python packages is now .pyN (major version
  only). This change actually occurred in 0.23.0 when the --python-tag
  option was added, but was not explicitly mentioned in the changelog then.
- wininst2wheel and egg2wheel removed. Use "wheel convert [archive]"
  instead.
- Wheel now supports setuptools style conditional requirements via the
  extras_require={} syntax. Separate 'extra' names from conditions using
  the : character. Wheel's own setup.py does this. (The empty-string
  extra is the same as install_requires.) These conditional requirements
  should work the same whether the package is installed by wheel or
  by setup.py.

0.23.0
======
- Compatibility tag flags added to the bdist_wheel command
- sdist should include files necessary for tests
- 'wheel convert' can now also convert unpacked eggs to wheel
- Rename pydist.json to metadata.json to avoid stepping on the PEP
- The --skip-scripts option has been removed, and not generating scripts is now
  the default. The option was a temporary approach until installers could
  generate scripts themselves. That is now the case with pip 1.5 and later.
  Note that using pip 1.4 to install a wheel without scripts will leave the
  installation without entry-point wrappers. The "wheel install-scripts"
  command can be used to generate the scripts in such cases.
- Thank you contributors

0.22.0
======
- Include entry_points.txt, scripts a.k.a. commands, in experimental
  pydist.json
- Improved test_requires parsing
- Python 2.6 fixes, "wheel version" command courtesy pombredanne

0.21.0
======
- Pregenerated scripts are the default again.
- "setup.py bdist_wheel --skip-scripts" turns them off.
- setuptools is no longer a listed requirement for the 'wheel'
  package. It is of course still required in order for bdist_wheel
  to work.
- "python -m wheel" avoids importing pkg_resources until it's necessary.

0.20.0
======
- No longer include console_scripts in wheels. Ordinary scripts (shell files,
  standalone Python files) are included as usual.
- Include new command "python -m wheel install-scripts [distribution
  [distribution ...]]" to install the console_scripts (setuptools-style
  scripts using pkg_resources) for a distribution.

0.19.0
======
- pymeta.json becomes pydist.json

0.18.0
======
- Python 3 Unicode improvements

0.17.0
======
- Support latest PEP-426 "pymeta.json" (json-format metadata)

0.16.0
======
- Python 2.6 compatibility bugfix (thanks John McFarlane)
- Non-prerelease version number

1.0.0a2
=======
- Bugfix for C-extension tags for CPython 3.3 (using SOABI)

1.0.0a1
=======
- Bugfix for bdist_wininst converter "wheel convert"
- Bugfix for dists where "is pure" is None instead of True or False

1.0.0a0
=======
- Update for version 1.0 of Wheel (PEP accepted).
- Python 3 fix for moving Unicode Description to metadata body
- Include rudimentary API documentation in Sphinx (thanks Kevin Horn)

0.15.0
======
- Various improvements

0.14.0
======
- Changed the signature format to better comply with the current JWS spec.
  Breaks all existing signatures.
- Include ``wheel unsign`` command to remove RECORD.jws from an archive.
- Put the description in the newly allowed payload section of PKG-INFO
  (METADATA) files.

0.13.0
======
- Use distutils instead of sysconfig to get installation paths; can install
  headers.
- Improve WheelFile() sort.
- Allow bootstrap installs without any pkg_resources.

0.12.0
======
- Unit test for wheel.tool.install

0.11.0
======
- API cleanup

0.10.3
======
- Scripts fixer fix

0.10.2
======
- Fix keygen

0.10.1
======
- Preserve attributes on install.

0.10.0
======
- Include a copy of pkg_resources. Wheel can now install into a virtualenv
  that does not have distribute (though most packages still require
  pkg_resources to actually work; wheel install distribute)
- Define a new setup.cfg section [wheel]. universal=1 will
  apply the py2.py3-none-any tag for pure python wheels.

0.9.7
=====
- Only import dirspec when needed. dirspec is only needed to find the
  configuration for keygen/signing operations.

0.9.6
=====
- requires-dist from setup.cfg overwrites any requirements from setup.py
  Care must be taken that the requirements are the same in both cases,
  or just always install from wheel.
- drop dirspec requirement on win32
- improved command line utility, adds 'wheel convert [egg or wininst]' to
  convert legacy binary formats to wheel

0.9.5
=====
- Wheel's own wheel file can be executed by Python, and can install itself:
  ``python wheel-0.9.5-py27-none-any/wheel install ...``
- Use argparse; basic ``wheel install`` command should run with only stdlib
  dependencies.
- Allow requires_dist in setup.cfg's [metadata] section. In addition to
  dependencies in setup.py, but will only be interpreted when installing
  from wheel, not from sdist. Can be qualified with environment markers.

0.9.4
=====
- Fix wheel.signatures in sdist

0.9.3
=====
- Integrated digital signatures support without C extensions.
- Integrated "wheel install" command (single package, no dependency
  resolution) including compatibility check.
- Support Python 3.3
- Use Metadata 1.3 (PEP 426)

0.9.2
=====
- Automatic signing if WHEEL_TOOL points to the wheel binary
- Even more Python 3 fixes

0.9.1
=====
- 'wheel sign' uses the keys generated by 'wheel keygen' (instead of generating
  a new key at random each time)
- Python 2/3 encoding/decoding fixes
- Run tests on Python 2.6 (without signature verification)

0.9
===
- Updated digital signatures scheme
- Python 3 support for digital signatures
- Always verify RECORD hashes on extract
- "wheel" command line tool to sign, verify, unpack wheel files

0.8
===
- none/any draft pep tags update
- improved wininst2wheel script
- doc changes and other improvements

0.7
===
- sort .dist-info at end of wheel archive
- Windows & Python 3 fixes from Paul Moore
- pep8
- scripts to convert wininst & egg to wheel

0.6
===
- require distribute >= 0.6.28
- stop using verlib

0.5
===
- working pretty well

0.4.2
=====
- hyphenated name fix

0.4
===
- improve test coverage
- improve Windows compatibility
- include tox.ini courtesy of Marc Abramowitz
- draft hmac sha-256 signing function

0.3
===
- prototype egg2wheel conversion script

0.2
===
- Python 3 compatibility

0.1
===
- Initial version
@@ -1 +0,0 @@
pip
@@ -1,22 +0,0 @@
"wheel" copyright (c) 2012-2014 Daniel Holth <dholth@fastmail.fm> and
contributors.

The MIT License

Permission is hereby granted, free of charge, to any person obtaining a
copy of this software and associated documentation files (the "Software"),
to deal in the Software without restriction, including without limitation
the rights to use, copy, modify, merge, publish, distribute, sublicense,
and/or sell copies of the Software, and to permit persons to whom the
Software is furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included
in all copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL
THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR
OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE,
ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR
OTHER DEALINGS IN THE SOFTWARE.
@@ -1,374 +0,0 @@
Metadata-Version: 2.0
Name: wheel
Version: 0.30.0
Summary: A built-package format for Python.
Home-page: https://github.com/pypa/wheel
Author: Alex Grönholm
Author-email: alex.gronholm@nextday.fi
License: MIT
Description-Content-Type: UNKNOWN
Keywords: wheel,packaging
Platform: UNKNOWN
Classifier: Development Status :: 5 - Production/Stable
Classifier: Intended Audience :: Developers
Classifier: License :: OSI Approved :: MIT License
Classifier: Programming Language :: Python
Classifier: Programming Language :: Python :: 2
Classifier: Programming Language :: Python :: 2.7
Classifier: Programming Language :: Python :: 3
Classifier: Programming Language :: Python :: 3.4
Classifier: Programming Language :: Python :: 3.5
Classifier: Programming Language :: Python :: 3.6
Provides-Extra: faster-signatures
Requires-Dist: ed25519ll; extra == 'faster-signatures'
Provides-Extra: signatures
Requires-Dist: keyring; extra == 'signatures'
Requires-Dist: keyrings.alt; extra == 'signatures'
Provides-Extra: signatures
Requires-Dist: pyxdg; sys_platform!="win32" and extra == 'signatures'
Provides-Extra: test
Requires-Dist: jsonschema; extra == 'test'
Requires-Dist: pytest (>=3.0.0); extra == 'test'
Requires-Dist: pytest-cov; extra == 'test'
Provides-Extra: tool

Wheel
=====

A built-package format for Python.

A wheel is a ZIP-format archive with a specially formatted filename
and the .whl extension. It is designed to contain all the files for a
PEP 376 compatible install in a way that is very close to the on-disk
format. Many packages will be properly installed with only the "Unpack"
step (simply extracting the file onto sys.path), and the unpacked archive
preserves enough information to "Spread" (copy data and scripts to their
final locations) at any later time.

The wheel project provides a `bdist_wheel` command for setuptools
(requires setuptools >= 0.8.0). Wheel files can be installed with a
newer `pip` from https://github.com/pypa/pip or with wheel's own command
line utility.

The wheel documentation is at http://wheel.rtfd.org/. The file format
is documented in PEP 427 (http://www.python.org/dev/peps/pep-0427/).

The reference implementation is at https://github.com/pypa/wheel

Why not egg?
------------

Python's egg format predates the packaging related standards we have
today, the most important being PEP 376 "Database of Installed Python
Distributions" which specifies the .dist-info directory (instead of
.egg-info) and PEP 426 "Metadata for Python Software Packages 2.0"
which specifies how to express dependencies (instead of requires.txt
in .egg-info).

Wheel implements these things. It also provides a richer file naming
convention that communicates the Python implementation and ABI as well
as simply the language version used in a particular package.

Unlike .egg, wheel will be a fully-documented standard at the binary
level that is truly easy to install even if you do not want to use the
reference implementation.


Code of Conduct
---------------

Everyone interacting in the wheel project's codebases, issue trackers, chat
rooms, and mailing lists is expected to follow the `PyPA Code of Conduct`_.

.. _PyPA Code of Conduct: https://www.pypa.io/en/latest/code-of-conduct/


0.30.0
======
- Added py-limited-api {cp32|cp33|cp34|...} flag to produce cpNN.abi3.{arch}
  tags on CPython 3.
- Documented the ``license_file`` metadata key
- Improved Python, abi tagging for `wheel convert`. Thanks Ales Erjavec.
- Fixed `>` being prepended to lines starting with "From" in the long description
- Added support for specifying a build number (as per PEP 427).
  Thanks Ian Cordasco.
- Made the order of files in generated ZIP files deterministic.
  Thanks Matthias Bach.
- Made the order of requirements in metadata deterministic. Thanks Chris Lamb.
- Fixed `wheel install` clobbering existing files
- Improved the error message when trying to verify an unsigned wheel file
- Removed support for Python 2.6, 3.2 and 3.3.

0.29.0
======
- Fix compression type of files in archive (Issue #155, Pull Request #62,
  thanks Xavier Fernandez)

0.28.0
======
- Fix file modes in archive (Issue #154)

0.27.0
======
- Support forcing a platform tag using `--plat-name` on pure-Python wheels, as
  well as nonstandard platform tags on non-pure wheels (Pull Request #60, Issue
  #144, thanks Andrés Díaz)
- Add SOABI tags to platform-specific wheels built for Python 2.X (Pull Request
  #55, Issue #63, Issue #101)
- Support reproducible wheel files, wheels that can be rebuilt and will hash to
  the same values as previous builds (Pull Request #52, Issue #143, thanks
  Barry Warsaw)
- Support for changes in keyring >= 8.0 (Pull Request #61, thanks Jason R.
  Coombs)
- Use the file context manager when checking if dependency_links.txt is empty,
  fixes problems building wheels under PyPy on Windows (Issue #150, thanks
  Cosimo Lupo)
- Don't attempt to (recursively) create a build directory ending with `..`
  (invalid on all platforms, but code was only executed on Windows) (Issue #91)
- Added the PyPA Code of Conduct (Pull Request #56)

0.26.0
======
- Fix multiple entrypoint comparison failure on Python 3 (Issue #148)

0.25.0
======
- Add Python 3.5 to tox configuration
- Deterministic (sorted) metadata
- Fix tagging for Python 3.5 compatibility
- Support py2-none-'arch' and py3-none-'arch' tags
- Treat data-only wheels as pure
- Write to temporary file and rename when using wheel install --force

0.24.0
======
- The python tag used for pure-python packages is now .pyN (major version
  only). This change actually occurred in 0.23.0 when the --python-tag
  option was added, but was not explicitly mentioned in the changelog then.
- wininst2wheel and egg2wheel removed. Use "wheel convert [archive]"
  instead.
- Wheel now supports setuptools style conditional requirements via the
  extras_require={} syntax. Separate 'extra' names from conditions using
  the : character. Wheel's own setup.py does this. (The empty-string
  extra is the same as install_requires.) These conditional requirements
  should work the same whether the package is installed by wheel or
  by setup.py.

0.23.0
======
- Compatibility tag flags added to the bdist_wheel command
- sdist should include files necessary for tests
- 'wheel convert' can now also convert unpacked eggs to wheel
- Rename pydist.json to metadata.json to avoid stepping on the PEP
- The --skip-scripts option has been removed, and not generating scripts is now
  the default. The option was a temporary approach until installers could
  generate scripts themselves. That is now the case with pip 1.5 and later.
  Note that using pip 1.4 to install a wheel without scripts will leave the
  installation without entry-point wrappers. The "wheel install-scripts"
  command can be used to generate the scripts in such cases.
- Thank you contributors

0.22.0
======
- Include entry_points.txt, scripts a.k.a. commands, in experimental
  pydist.json
- Improved test_requires parsing
- Python 2.6 fixes, "wheel version" command courtesy pombredanne

0.21.0
======
- Pregenerated scripts are the default again.
- "setup.py bdist_wheel --skip-scripts" turns them off.
- setuptools is no longer a listed requirement for the 'wheel'
  package. It is of course still required in order for bdist_wheel
  to work.
- "python -m wheel" avoids importing pkg_resources until it's necessary.

0.20.0
======
- No longer include console_scripts in wheels. Ordinary scripts (shell files,
  standalone Python files) are included as usual.
- Include new command "python -m wheel install-scripts [distribution
  [distribution ...]]" to install the console_scripts (setuptools-style
  scripts using pkg_resources) for a distribution.

0.19.0
======
- pymeta.json becomes pydist.json

0.18.0
======
- Python 3 Unicode improvements

0.17.0
======
- Support latest PEP-426 "pymeta.json" (json-format metadata)

0.16.0
======
- Python 2.6 compatibility bugfix (thanks John McFarlane)
- Non-prerelease version number

1.0.0a2
=======
- Bugfix for C-extension tags for CPython 3.3 (using SOABI)

1.0.0a1
=======
- Bugfix for bdist_wininst converter "wheel convert"
- Bugfix for dists where "is pure" is None instead of True or False

1.0.0a0
=======
- Update for version 1.0 of Wheel (PEP accepted).
- Python 3 fix for moving Unicode Description to metadata body
- Include rudimentary API documentation in Sphinx (thanks Kevin Horn)

0.15.0
======
- Various improvements

0.14.0
======
- Changed the signature format to better comply with the current JWS spec.
  Breaks all existing signatures.
- Include ``wheel unsign`` command to remove RECORD.jws from an archive.
- Put the description in the newly allowed payload section of PKG-INFO
  (METADATA) files.

0.13.0
======
- Use distutils instead of sysconfig to get installation paths; can install
  headers.
- Improve WheelFile() sort.
- Allow bootstrap installs without any pkg_resources.

0.12.0
======
- Unit test for wheel.tool.install

0.11.0
======
- API cleanup

0.10.3
======
- Scripts fixer fix

0.10.2
======
- Fix keygen

0.10.1
======
- Preserve attributes on install.

0.10.0
======
- Include a copy of pkg_resources. Wheel can now install into a virtualenv
  that does not have distribute (though most packages still require
  pkg_resources to actually work; wheel install distribute)
- Define a new setup.cfg section [wheel]. universal=1 will
|
|
||||||
apply the py2.py3-none-any tag for pure python wheels.
|
|
||||||
|
|
||||||
0.9.7
|
|
||||||
=====
|
|
||||||
- Only import dirspec when needed. dirspec is only needed to find the
|
|
||||||
configuration for keygen/signing operations.
|
|
||||||
|
|
||||||
0.9.6
|
|
||||||
=====
|
|
||||||
- requires-dist from setup.cfg overwrites any requirements from setup.py
|
|
||||||
Care must be taken that the requirements are the same in both cases,
|
|
||||||
or just always install from wheel.
|
|
||||||
- drop dirspec requirement on win32
|
|
||||||
- improved command line utility, adds 'wheel convert [egg or wininst]' to
|
|
||||||
convert legacy binary formats to wheel
|
|
||||||
|
|
||||||
0.9.5
|
|
||||||
=====
|
|
||||||
- Wheel's own wheel file can be executed by Python, and can install itself:
|
|
||||||
``python wheel-0.9.5-py27-none-any/wheel install ...``
|
|
||||||
- Use argparse; basic ``wheel install`` command should run with only stdlib
|
|
||||||
dependencies.
|
|
||||||
- Allow requires_dist in setup.cfg's [metadata] section. In addition to
|
|
||||||
dependencies in setup.py, but will only be interpreted when installing
|
|
||||||
from wheel, not from sdist. Can be qualified with environment markers.
|
|
||||||
|
|
||||||
0.9.4
|
|
||||||
=====
|
|
||||||
- Fix wheel.signatures in sdist
|
|
||||||
|
|
||||||
0.9.3
|
|
||||||
=====
|
|
||||||
- Integrated digital signatures support without C extensions.
|
|
||||||
- Integrated "wheel install" command (single package, no dependency
|
|
||||||
resolution) including compatibility check.
|
|
||||||
- Support Python 3.3
|
|
||||||
- Use Metadata 1.3 (PEP 426)
|
|
||||||
|
|
||||||
0.9.2
|
|
||||||
=====
|
|
||||||
- Automatic signing if WHEEL_TOOL points to the wheel binary
|
|
||||||
- Even more Python 3 fixes
|
|
||||||
|
|
||||||
0.9.1
|
|
||||||
=====
|
|
||||||
- 'wheel sign' uses the keys generated by 'wheel keygen' (instead of generating
|
|
||||||
a new key at random each time)
|
|
||||||
- Python 2/3 encoding/decoding fixes
|
|
||||||
- Run tests on Python 2.6 (without signature verification)
|
|
||||||
|
|
||||||
0.9
|
|
||||||
===
|
|
||||||
- Updated digital signatures scheme
|
|
||||||
- Python 3 support for digital signatures
|
|
||||||
- Always verify RECORD hashes on extract
|
|
||||||
- "wheel" command line tool to sign, verify, unpack wheel files
|
|
||||||
|
|
||||||
0.8
|
|
||||||
===
|
|
||||||
- none/any draft pep tags update
|
|
||||||
- improved wininst2wheel script
|
|
||||||
- doc changes and other improvements
|
|
||||||
|
|
||||||
0.7
|
|
||||||
===
|
|
||||||
- sort .dist-info at end of wheel archive
|
|
||||||
- Windows & Python 3 fixes from Paul Moore
|
|
||||||
- pep8
|
|
||||||
- scripts to convert wininst & egg to wheel
|
|
||||||
|
|
||||||
0.6
|
|
||||||
===
|
|
||||||
- require distribute >= 0.6.28
|
|
||||||
- stop using verlib
|
|
||||||
|
|
||||||
0.5
|
|
||||||
===
|
|
||||||
- working pretty well
|
|
||||||
|
|
||||||
0.4.2
|
|
||||||
=====
|
|
||||||
- hyphenated name fix
|
|
||||||
|
|
||||||
0.4
|
|
||||||
===
|
|
||||||
- improve test coverage
|
|
||||||
- improve Windows compatibility
|
|
||||||
- include tox.ini courtesy of Marc Abramowitz
|
|
||||||
- draft hmac sha-256 signing function
|
|
||||||
|
|
||||||
0.3
|
|
||||||
===
|
|
||||||
- prototype egg2wheel conversion script
|
|
||||||
|
|
||||||
0.2
|
|
||||||
===
|
|
||||||
- Python 3 compatibility
|
|
||||||
|
|
||||||
0.1
|
|
||||||
===
|
|
||||||
- Initial version
|
|
||||||
|
|
||||||
|
|
|
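The ``[wheel]`` section mentioned in the 0.10.0 entry lives in the project's ``setup.cfg``. A minimal sketch (the surrounding project is hypothetical; later tooling prefers an equivalent ``[bdist_wheel]`` section):

```ini
# setup.cfg
[wheel]
universal = 1
```

With this flag set, a pure-Python project gets the ``py2.py3-none-any`` tag instead of a tag tied to the building interpreter.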
@@ -1,46 +0,0 @@
wheel/__init__.py,sha256=ja92NKda3sstt4uKroYgFATu736whcI33p3GJNdslLQ,96
wheel/__main__.py,sha256=K--m7mq-27NO0fm-a8KlthkucCe0w_-0hVxL3uDujkU,419
wheel/archive.py,sha256=oEv42UnpxkoFMKcLXQ9RD8a8oic4X3oe2_H5FAgJ7_M,2376
wheel/bdist_wheel.py,sha256=qKWdyvpkdmuLB4_GGIZsjmlcMLZuZDd8tRvaQI0w_eo,18852
wheel/decorator.py,sha256=U2K77ZZ8x3x5vSIGCcEeh8GAxB6rABB7AlDwRukaoCk,541
wheel/egg2wheel.py,sha256=me4Iaz4idCvS-xjfAzfb2dXXlXx_w6AgLjH6hi1Bt1A,3043
wheel/install.py,sha256=zYQ-A8uQi-R2PwMvOh64YMlQDplqYpcBVM0EmbxZu8Y,18417
wheel/metadata.py,sha256=SzI1MtzITZJuAJuvUVzEWi60VhgDbXSV_hapyiX0rlw,11561
wheel/paths.py,sha256=OAtaJgCivlKvJKw1qC3YbJypvp2d38Eka8GQWdBWNZw,1129
wheel/pep425tags.py,sha256=Lk9zYm1rrHG1X3RKlf9plcwpsoSZT8UR7fG3jhaoZrQ,5760
wheel/pkginfo.py,sha256=GR76kupQzn1x9sKDaXuE6B6FsZ4OkfRtG7pndlXPvQ4,1257
wheel/util.py,sha256=eJB-mrhMAaCGcoKhTLDYdpCf5N8BMLtX4usW_7qeZBg,4732
wheel/wininst2wheel.py,sha256=afPAHWwa7FY0IkpG-BuuuY-dlB93VmFPrXff511NkBk,7772
wheel/signatures/__init__.py,sha256=O7kZICZvXxN5YRkCYrPmAEr1LpGaZKJh5sLPWIRIoYE,3766
wheel/signatures/djbec.py,sha256=jnfWxdS7dwLjiO6n0hy-4jLa_71SPrKWL0-7ocDrSHc,7035
wheel/signatures/ed25519py.py,sha256=nFKDMq4LW2iJKk4IZKMxY46GyZNYPKxuWha9xYHk9lE,1669
wheel/signatures/keys.py,sha256=k4j4yGZL31Dt2pa5TneIEeq6qkVIXEPExmFxiZxpE1Y,3299
wheel/tool/__init__.py,sha256=rOy5VFvj-gTKgMwi_u2_iNu_Pq6aqw4rEfaciDTbmwg,13421
wheel-0.30.0.dist-info/DESCRIPTION.rst,sha256=Alb3Ol--LhPgmWuBBPfzu54xzQ8J2skWNV34XCjhe0k,10549
wheel-0.30.0.dist-info/LICENSE.txt,sha256=zKniDGrx_Pv2lAjzd3aShsvuvN7TNhAMm0o_NfvmNeQ,1125
wheel-0.30.0.dist-info/METADATA,sha256=fYLxr6baQD-wDn4Yu8t-8fF7PJuiBTcThsl2UKBE7kg,11815
wheel-0.30.0.dist-info/RECORD,,
wheel-0.30.0.dist-info/WHEEL,sha256=kdsN-5OJAZIiHN-iO4Rhl82KyS0bDWf4uBwMbkNafr8,110
wheel-0.30.0.dist-info/entry_points.txt,sha256=pTyeGVsucyfr_BXe5OQKuA1Bp5YKaIAWy5pejkq4Qx0,109
wheel-0.30.0.dist-info/metadata.json,sha256=neXQocJnVqPTjr4zpuOVdxBGCmjrTsOs76AvP8ngyJY,1522
wheel-0.30.0.dist-info/top_level.txt,sha256=HxSBIbgEstMPe4eFawhA66Mq-QYHMopXVoAncfjb_1c,6
../../../bin/wheel,sha256=sjtPVJ0ZS5WdGK7UXcnQrN6MG_czYyrsndkMrC0qluw,279
wheel-0.30.0.dist-info/INSTALLER,sha256=zuuue4knoyJ-UwPPXg8fezS7VCrXJQrAP7zeNuwvFQg,4
wheel/signatures/__pycache__/__init__.cpython-34.pyc,,
wheel/__pycache__/decorator.cpython-34.pyc,,
wheel/__pycache__/__main__.cpython-34.pyc,,
wheel/signatures/__pycache__/ed25519py.cpython-34.pyc,,
wheel/__pycache__/util.cpython-34.pyc,,
wheel/__pycache__/wininst2wheel.cpython-34.pyc,,
wheel/__pycache__/pkginfo.cpython-34.pyc,,
wheel/__pycache__/__init__.cpython-34.pyc,,
wheel/signatures/__pycache__/djbec.cpython-34.pyc,,
wheel/__pycache__/metadata.cpython-34.pyc,,
wheel/__pycache__/egg2wheel.cpython-34.pyc,,
wheel/signatures/__pycache__/keys.cpython-34.pyc,,
wheel/__pycache__/archive.cpython-34.pyc,,
wheel/__pycache__/bdist_wheel.cpython-34.pyc,,
wheel/tool/__pycache__/__init__.cpython-34.pyc,,
wheel/__pycache__/install.cpython-34.pyc,,
wheel/__pycache__/pep425tags.cpython-34.pyc,,
wheel/__pycache__/paths.cpython-34.pyc,,
@@ -1,6 +0,0 @@
Wheel-Version: 1.0
Generator: bdist_wheel (0.30.0)
Root-Is-Purelib: true
Tag: py2-none-any
Tag: py3-none-any
@@ -1,6 +0,0 @@
[console_scripts]
wheel = wheel.tool:main

[distutils.commands]
bdist_wheel = wheel.bdist_wheel:bdist_wheel
@@ -1 +0,0 @@
{"classifiers": ["Development Status :: 5 - Production/Stable", "Intended Audience :: Developers", "License :: OSI Approved :: MIT License", "Programming Language :: Python", "Programming Language :: Python :: 2", "Programming Language :: Python :: 2.7", "Programming Language :: Python :: 3", "Programming Language :: Python :: 3.4", "Programming Language :: Python :: 3.5", "Programming Language :: Python :: 3.6"], "description_content_type": "UNKNOWN", "extensions": {"python.commands": {"wrap_console": {"wheel": "wheel.tool:main"}}, "python.details": {"contacts": [{"email": "alex.gronholm@nextday.fi", "name": "Alex Gr\u00f6nholm", "role": "author"}], "document_names": {"description": "DESCRIPTION.rst", "license": "LICENSE.txt"}, "project_urls": {"Home": "https://github.com/pypa/wheel"}}, "python.exports": {"console_scripts": {"wheel": "wheel.tool:main"}, "distutils.commands": {"bdist_wheel": "wheel.bdist_wheel:bdist_wheel"}}}, "extras": ["faster-signatures", "signatures", "test", "tool"], "generator": "bdist_wheel (0.30.0)", "keywords": ["wheel", "packaging"], "license": "MIT", "metadata_version": "2.0", "name": "wheel", "run_requires": [{"extra": "faster-signatures", "requires": ["ed25519ll"]}, {"extra": "test", "requires": ["jsonschema", "pytest (>=3.0.0)", "pytest-cov"]}, {"extra": "signatures", "requires": ["keyring", "keyrings.alt"]}, {"environment": "sys_platform!=\"win32\"", "extra": "signatures", "requires": ["pyxdg"]}], "summary": "A built-package format for Python.", "version": "0.30.0"}
@@ -1 +0,0 @@
wheel
@@ -1,2 +0,0 @@
# __variables__ with double-quoted values will be available in setup.py:
__version__ = "0.30.0"
@@ -1,19 +0,0 @@
"""
Wheel command line tool (enable python -m wheel syntax)
"""

import sys


def main():  # needed for console script
    if __package__ == '':
        # To be able to run 'python wheel-0.9.whl/wheel':
        import os.path
        path = os.path.dirname(os.path.dirname(__file__))
        sys.path[0:0] = [path]
    import wheel.tool
    sys.exit(wheel.tool.main())


if __name__ == "__main__":
    sys.exit(main())
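The bootstrap trick in `__main__.py` above works because a wheel is a plain zip archive, and Python can import packages directly from a zip placed on `sys.path`. A self-contained sketch of that mechanism (the archive and package names here are made up for the demo; this is not code from the wheel project):

```python
import os
import sys
import zipfile

# Build a tiny importable zip -- structurally what a wheel is.
zip_path = "demo_pkg.zip"
with zipfile.ZipFile(zip_path, "w") as zf:
    zf.writestr("demo/__init__.py", "ANSWER = 42\n")

# Putting the archive on sys.path is the same mechanism that lets
# 'python wheel-0.9.whl/wheel' find the wheel package inside the .whl.
sys.path.insert(0, zip_path)
import demo

print(demo.ANSWER)  # -> 42
os.remove(zip_path)
```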
@@ -1,80 +0,0 @@
"""
Archive tools for wheel.
"""

import os
import os.path
import time
import zipfile
from distutils import log


def archive_wheelfile(base_name, base_dir):
    """Archive all files under `base_dir` in a whl file and name it like
    `base_name`.
    """
    olddir = os.path.abspath(os.curdir)
    base_name = os.path.abspath(base_name)
    try:
        os.chdir(base_dir)
        return make_wheelfile_inner(base_name)
    finally:
        os.chdir(olddir)


def make_wheelfile_inner(base_name, base_dir='.'):
    """Create a whl file from all the files under 'base_dir'.

    Places .dist-info at the end of the archive."""

    zip_filename = base_name + ".whl"

    log.info("creating '%s' and adding '%s' to it", zip_filename, base_dir)

    # Some applications need reproducible .whl files, but they can't do this
    # without forcing the timestamp of the individual ZipInfo objects. See
    # issue #143.
    timestamp = os.environ.get('SOURCE_DATE_EPOCH')
    if timestamp is None:
        date_time = None
    else:
        date_time = time.gmtime(int(timestamp))[0:6]

    # XXX support bz2, xz when available
    zip = zipfile.ZipFile(zip_filename, "w", compression=zipfile.ZIP_DEFLATED)

    score = {'WHEEL': 1, 'METADATA': 2, 'RECORD': 3}
    deferred = []

    def writefile(path, date_time):
        st = os.stat(path)
        if date_time is None:
            mtime = time.gmtime(st.st_mtime)
            date_time = mtime[0:6]
        zinfo = zipfile.ZipInfo(path, date_time)
        zinfo.external_attr = st.st_mode << 16
        zinfo.compress_type = zipfile.ZIP_DEFLATED
        with open(path, 'rb') as fp:
            zip.writestr(zinfo, fp.read())
        log.info("adding '%s'" % path)

    for dirpath, dirnames, filenames in os.walk(base_dir):
        # Sort the directory names so that `os.walk` will walk them in a
        # defined order on the next iteration.
        dirnames.sort()
        for name in sorted(filenames):
            path = os.path.normpath(os.path.join(dirpath, name))

            if os.path.isfile(path):
                if dirpath.endswith('.dist-info'):
                    deferred.append((score.get(name, 0), path))
                else:
                    writefile(path, date_time)

    deferred.sort()
    for score, path in deferred:
        writefile(path, date_time)

    zip.close()

    return zip_filename
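The `SOURCE_DATE_EPOCH` handling in `make_wheelfile_inner` exists so that two builds of the same tree produce byte-identical archives. A minimal standalone sketch of that idea, pinning every member's `ZipInfo` timestamp (the file name and content are illustrative):

```python
import io
import time
import zipfile


def build(date_time):
    # Write one member with an explicitly pinned timestamp, as the
    # archiver above does when SOURCE_DATE_EPOCH is set.
    buf = io.BytesIO()
    with zipfile.ZipFile(buf, "w", zipfile.ZIP_DEFLATED) as zf:
        zinfo = zipfile.ZipInfo("hello.txt", date_time)
        zinfo.compress_type = zipfile.ZIP_DEFLATED
        zf.writestr(zinfo, b"hello wheel")
    return buf.getvalue()


# e.g. date_time derived from $SOURCE_DATE_EPOCH, exactly as in the code above
pinned = time.gmtime(1500000000)[0:6]
assert build(pinned) == build(pinned)  # byte-for-byte reproducible
```

Without the pinned timestamp, each member would carry its file's mtime and the archive bytes would differ between builds.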
@@ -1,482 +0,0 @@
"""
Create a wheel (.whl) distribution.

A wheel is a built archive format.
"""

import csv
import hashlib
import os
import subprocess
import warnings
import shutil
import json
import sys
import re
from email.generator import Generator
from distutils.core import Command
from distutils.sysconfig import get_python_version
from distutils import log as logger
from shutil import rmtree

import pkg_resources

from .pep425tags import get_abbr_impl, get_impl_ver, get_abi_tag, get_platform
from .util import native, open_for_csv
from .archive import archive_wheelfile
from .pkginfo import read_pkg_info, write_pkg_info
from .metadata import pkginfo_to_dict
from . import pep425tags, metadata
from . import __version__ as wheel_version


safe_name = pkg_resources.safe_name
safe_version = pkg_resources.safe_version

PY_LIMITED_API_PATTERN = r'cp3\d'


def safer_name(name):
    return safe_name(name).replace('-', '_')


def safer_version(version):
    return safe_version(version).replace('-', '_')


class bdist_wheel(Command):

    description = 'create a wheel distribution'

    user_options = [('bdist-dir=', 'b',
                     "temporary directory for creating the distribution"),
                    ('plat-name=', 'p',
                     "platform name to embed in generated filenames "
                     "(default: %s)" % get_platform()),
                    ('keep-temp', 'k',
                     "keep the pseudo-installation tree around after " +
                     "creating the distribution archive"),
                    ('dist-dir=', 'd',
                     "directory to put final built distributions in"),
                    ('skip-build', None,
                     "skip rebuilding everything (for testing/debugging)"),
                    ('relative', None,
                     "build the archive using relative paths"
                     "(default: false)"),
                    ('owner=', 'u',
                     "Owner name used when creating a tar file"
                     " [default: current user]"),
                    ('group=', 'g',
                     "Group name used when creating a tar file"
                     " [default: current group]"),
                    ('universal', None,
                     "make a universal wheel"
                     " (default: false)"),
                    ('python-tag=', None,
                     "Python implementation compatibility tag"
                     " (default: py%s)" % get_impl_ver()[0]),
                    ('build-number=', None,
                     "Build number for this particular version. "
                     "As specified in PEP-0427, this must start with a digit. "
                     "[default: None]"),
                    ('py-limited-api=', None,
                     "Python tag (cp32|cp33|cpNN) for abi3 wheel tag"
                     " (default: false)"),
                    ]

    boolean_options = ['keep-temp', 'skip-build', 'relative', 'universal']

    def initialize_options(self):
        self.bdist_dir = None
        self.data_dir = None
        self.plat_name = None
        self.plat_tag = None
        self.format = 'zip'
        self.keep_temp = False
        self.dist_dir = None
        self.distinfo_dir = None
        self.egginfo_dir = None
        self.root_is_pure = None
        self.skip_build = None
        self.relative = False
        self.owner = None
        self.group = None
        self.universal = False
        self.python_tag = 'py' + get_impl_ver()[0]
        self.build_number = None
        self.py_limited_api = False
        self.plat_name_supplied = False

    def finalize_options(self):
        if self.bdist_dir is None:
            bdist_base = self.get_finalized_command('bdist').bdist_base
            self.bdist_dir = os.path.join(bdist_base, 'wheel')

        self.data_dir = self.wheel_dist_name + '.data'
        self.plat_name_supplied = self.plat_name is not None

        need_options = ('dist_dir', 'plat_name', 'skip_build')

        self.set_undefined_options('bdist',
                                   *zip(need_options, need_options))

        self.root_is_pure = not (self.distribution.has_ext_modules()
                                 or self.distribution.has_c_libraries())

        if self.py_limited_api and not re.match(PY_LIMITED_API_PATTERN, self.py_limited_api):
            raise ValueError("py-limited-api must match '%s'" % PY_LIMITED_API_PATTERN)

        # Support legacy [wheel] section for setting universal
        wheel = self.distribution.get_option_dict('wheel')
        if 'universal' in wheel:
            # please don't define this in your global configs
            val = wheel['universal'][1].strip()
            if val.lower() in ('1', 'true', 'yes'):
                self.universal = True

        if self.build_number is not None and not self.build_number[:1].isdigit():
            raise ValueError("Build tag (build-number) must start with a digit.")

    @property
    def wheel_dist_name(self):
        """Return distribution full name with - replaced with _"""
        components = (safer_name(self.distribution.get_name()),
                      safer_version(self.distribution.get_version()))
        if self.build_number:
            components += (self.build_number,)
        return '-'.join(components)

    def get_tag(self):
        # bdist sets self.plat_name if unset, we should only use it for purepy
        # wheels if the user supplied it.
        if self.plat_name_supplied:
            plat_name = self.plat_name
        elif self.root_is_pure:
            plat_name = 'any'
        else:
            plat_name = self.plat_name or get_platform()
            if plat_name in ('linux-x86_64', 'linux_x86_64') and sys.maxsize == 2147483647:
                plat_name = 'linux_i686'
        plat_name = plat_name.replace('-', '_').replace('.', '_')

        if self.root_is_pure:
            if self.universal:
                impl = 'py2.py3'
            else:
                impl = self.python_tag
            tag = (impl, 'none', plat_name)
        else:
            impl_name = get_abbr_impl()
            impl_ver = get_impl_ver()
            impl = impl_name + impl_ver
            # We don't work on CPython 3.1, 3.0.
            if self.py_limited_api and (impl_name + impl_ver).startswith('cp3'):
                impl = self.py_limited_api
                abi_tag = 'abi3'
            else:
                abi_tag = str(get_abi_tag()).lower()
            tag = (impl, abi_tag, plat_name)
            supported_tags = pep425tags.get_supported(
                supplied_platform=plat_name if self.plat_name_supplied else None)
            # XXX switch to this alternate implementation for non-pure:
            if not self.py_limited_api:
                assert tag == supported_tags[0], "%s != %s" % (tag, supported_tags[0])
            assert tag in supported_tags, "would build wheel with unsupported tag {}".format(tag)
        return tag

    def get_archive_basename(self):
        """Return archive name without extension"""

        impl_tag, abi_tag, plat_tag = self.get_tag()

        archive_basename = "%s-%s-%s-%s" % (
            self.wheel_dist_name,
            impl_tag,
            abi_tag,
            plat_tag)
        return archive_basename
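`get_archive_basename` joins `wheel_dist_name` with the `(impl, abi, plat)` triple from `get_tag` to form the PEP 427 file name. A standalone sketch of that naming scheme, independent of the command class (the escaping helper and the example project name are illustrative, not the module's actual code):

```python
import re


def wheel_filename(name, version, impl, abi, plat, build=None):
    # PEP 427: within each field, runs of characters other than
    # alphanumerics and '.' are replaced by a single underscore,
    # so that '-' can safely separate the fields.
    def esc(field):
        return re.sub(r"[^\w\d.]+", "_", field)

    fields = [esc(name), esc(version)]
    if build:
        fields.append(build)
    fields += [impl, abi, plat]
    return "-".join(fields) + ".whl"


print(wheel_filename("my-package", "1.0", "py2.py3", "none", "any"))
# -> my_package-1.0-py2.py3-none-any.whl
```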
    def run(self):
        build_scripts = self.reinitialize_command('build_scripts')
        build_scripts.executable = 'python'

        if not self.skip_build:
            self.run_command('build')

        install = self.reinitialize_command('install',
                                            reinit_subcommands=True)
        install.root = self.bdist_dir
        install.compile = False
        install.skip_build = self.skip_build
        install.warn_dir = False

        # A wheel without setuptools scripts is more cross-platform.
        # Use the (undocumented) `no_ep` option to setuptools'
        # install_scripts command to avoid creating entry point scripts.
        install_scripts = self.reinitialize_command('install_scripts')
        install_scripts.no_ep = True

        # Use a custom scheme for the archive, because we have to decide
        # at installation time which scheme to use.
        for key in ('headers', 'scripts', 'data', 'purelib', 'platlib'):
            setattr(install,
                    'install_' + key,
                    os.path.join(self.data_dir, key))

        basedir_observed = ''

        if os.name == 'nt':
            # win32 barfs if any of these are ''; could be '.'?
            # (distutils.command.install:change_roots bug)
            basedir_observed = os.path.normpath(os.path.join(self.data_dir, '..'))
            self.install_libbase = self.install_lib = basedir_observed

        setattr(install,
                'install_purelib' if self.root_is_pure else 'install_platlib',
                basedir_observed)

        logger.info("installing to %s", self.bdist_dir)

        self.run_command('install')

        archive_basename = self.get_archive_basename()

        pseudoinstall_root = os.path.join(self.dist_dir, archive_basename)
        if not self.relative:
            archive_root = self.bdist_dir
        else:
            archive_root = os.path.join(
                self.bdist_dir,
                self._ensure_relative(install.install_base))

        self.set_undefined_options(
            'install_egg_info', ('target', 'egginfo_dir'))
        self.distinfo_dir = os.path.join(self.bdist_dir,
                                         '%s.dist-info' % self.wheel_dist_name)
        self.egg2dist(self.egginfo_dir,
                      self.distinfo_dir)

        self.write_wheelfile(self.distinfo_dir)

        self.write_record(self.bdist_dir, self.distinfo_dir)

        # Make the archive
        if not os.path.exists(self.dist_dir):
            os.makedirs(self.dist_dir)
        wheel_name = archive_wheelfile(pseudoinstall_root, archive_root)

        # Sign the archive
        if 'WHEEL_TOOL' in os.environ:
            subprocess.call([os.environ['WHEEL_TOOL'], 'sign', wheel_name])

        # Add to 'Distribution.dist_files' so that the "upload" command works
        getattr(self.distribution, 'dist_files', []).append(
            ('bdist_wheel', get_python_version(), wheel_name))

        if not self.keep_temp:
            if self.dry_run:
                logger.info('removing %s', self.bdist_dir)
            else:
                rmtree(self.bdist_dir)

    def write_wheelfile(self, wheelfile_base, generator='bdist_wheel (' + wheel_version + ')'):
        from email.message import Message
        msg = Message()
        msg['Wheel-Version'] = '1.0'  # of the spec
        msg['Generator'] = generator
        msg['Root-Is-Purelib'] = str(self.root_is_pure).lower()
        if self.build_number is not None:
            msg['Build'] = self.build_number

        # Doesn't work for bdist_wininst
        impl_tag, abi_tag, plat_tag = self.get_tag()
        for impl in impl_tag.split('.'):
            for abi in abi_tag.split('.'):
                for plat in plat_tag.split('.'):
                    msg['Tag'] = '-'.join((impl, abi, plat))

        wheelfile_path = os.path.join(wheelfile_base, 'WHEEL')
        logger.info('creating %s', wheelfile_path)
        with open(wheelfile_path, 'w') as f:
            Generator(f, maxheaderlen=0).flatten(msg)

    def _ensure_relative(self, path):
        # copied from dir_util, deleted
        drive, path = os.path.splitdrive(path)
        if path[0:1] == os.sep:
            path = drive + path[1:]
        return path

    def _pkginfo_to_metadata(self, egg_info_path, pkginfo_path):
        return metadata.pkginfo_to_metadata(egg_info_path, pkginfo_path)

    def license_file(self):
        """Return license filename from a license-file key in setup.cfg, or None."""
        metadata = self.distribution.get_option_dict('metadata')
        if 'license_file' not in metadata:
            return None
        return metadata['license_file'][1]

    def setupcfg_requirements(self):
        """Generate requirements from setup.cfg as
        ('Requires-Dist', 'requirement; qualifier') tuples. From a metadata
        section in setup.cfg:

            [metadata]
            provides-extra = extra1
                extra2
            requires-dist = requirement; qualifier
                another; qualifier2
                unqualified

        Yields

            ('Provides-Extra', 'extra1'),
            ('Provides-Extra', 'extra2'),
            ('Requires-Dist', 'requirement; qualifier'),
            ('Requires-Dist', 'another; qualifier2'),
            ('Requires-Dist', 'unqualified')
        """
        metadata = self.distribution.get_option_dict('metadata')

        # our .ini parser folds - to _ in key names:
        for key, title in (('provides_extra', 'Provides-Extra'),
                           ('requires_dist', 'Requires-Dist')):
            if key not in metadata:
                continue
            field = metadata[key]
            for line in field[1].splitlines():
                line = line.strip()
                if not line:
                    continue
                yield (title, line)

    def add_requirements(self, metadata_path):
        """Add additional requirements from setup.cfg to file metadata_path"""
        additional = list(self.setupcfg_requirements())
        if not additional:
            return

        pkg_info = read_pkg_info(metadata_path)
        if 'Provides-Extra' in pkg_info or 'Requires-Dist' in pkg_info:
            warnings.warn('setup.cfg requirements overwrite values from setup.py')
            del pkg_info['Provides-Extra']
            del pkg_info['Requires-Dist']
        for k, v in additional:
            pkg_info[k] = v
        write_pkg_info(metadata_path, pkg_info)

    def egg2dist(self, egginfo_path, distinfo_path):
        """Convert an .egg-info directory into a .dist-info directory"""
        def adios(p):
            """Appropriately delete directory, file or link."""
            if os.path.exists(p) and not os.path.islink(p) and os.path.isdir(p):
                shutil.rmtree(p)
            elif os.path.exists(p):
                os.unlink(p)

        adios(distinfo_path)

        if not os.path.exists(egginfo_path):
            # There is no egg-info. This is probably because the egg-info
            # file/directory is not named matching the distribution name used
            # to name the archive file. Check for this case and report
            # accordingly.
            import glob
            pat = os.path.join(os.path.dirname(egginfo_path), '*.egg-info')
            possible = glob.glob(pat)
            err = "Egg metadata expected at %s but not found" % (egginfo_path,)
            if possible:
                alt = os.path.basename(possible[0])
                err += " (%s found - possible misnamed archive file?)" % (alt,)

            raise ValueError(err)

        if os.path.isfile(egginfo_path):
            # .egg-info is a single file
            pkginfo_path = egginfo_path
            pkg_info = self._pkginfo_to_metadata(egginfo_path, egginfo_path)
            os.mkdir(distinfo_path)
        else:
            # .egg-info is a directory
            pkginfo_path = os.path.join(egginfo_path, 'PKG-INFO')
            pkg_info = self._pkginfo_to_metadata(egginfo_path, pkginfo_path)

            # ignore common egg metadata that is useless to wheel
            shutil.copytree(egginfo_path, distinfo_path,
                            ignore=lambda x, y: {'PKG-INFO', 'requires.txt', 'SOURCES.txt',
                                                 'not-zip-safe'}
                            )

            # delete dependency_links if it is only whitespace
            dependency_links_path = os.path.join(distinfo_path, 'dependency_links.txt')
            with open(dependency_links_path, 'r') as dependency_links_file:
                dependency_links = dependency_links_file.read().strip()
            if not dependency_links:
                adios(dependency_links_path)

        write_pkg_info(os.path.join(distinfo_path, 'METADATA'), pkg_info)

        # XXX deprecated. Still useful for current distribute/setuptools.
        metadata_path = os.path.join(distinfo_path, 'METADATA')
        self.add_requirements(metadata_path)

        # XXX intentionally a different path than the PEP.
        metadata_json_path = os.path.join(distinfo_path, 'metadata.json')
        pymeta = pkginfo_to_dict(metadata_path,
                                 distribution=self.distribution)

        if 'description' in pymeta:
            description_filename = 'DESCRIPTION.rst'
            description_text = pymeta.pop('description')
            description_path = os.path.join(distinfo_path,
                                            description_filename)
            with open(description_path, "wb") as description_file:
                description_file.write(description_text.encode('utf-8'))
            pymeta['extensions']['python.details']['document_names']['description'] = \
                description_filename

        # XXX heuristically copy any LICENSE/LICENSE.txt?
        license = self.license_file()
        if license:
            license_filename = 'LICENSE.txt'
            shutil.copy(license, os.path.join(self.distinfo_dir, license_filename))
            pymeta['extensions']['python.details']['document_names']['license'] = license_filename

        with open(metadata_json_path, "w") as metadata_json:
            json.dump(pymeta, metadata_json, sort_keys=True)

        adios(egginfo_path)

    def write_record(self, bdist_dir, distinfo_dir):
        from .util import urlsafe_b64encode

        record_path = os.path.join(distinfo_dir, 'RECORD')
        record_relpath = os.path.relpath(record_path, bdist_dir)

        def walk():
            for dir, dirs, files in os.walk(bdist_dir):
                dirs.sort()
                for f in sorted(files):
                    yield os.path.join(dir, f)

        def skip(path):
            """Wheel hashes every possible file."""
            return (path == record_relpath)

        with open_for_csv(record_path, 'w+') as record_file:
            writer = csv.writer(record_file)
            for path in walk():
                relpath = os.path.relpath(path, bdist_dir)
                if skip(relpath):
                    hash = ''
                    size = ''
                else:
                    with open(path, 'rb') as f:
data = f.read()
|
|
||||||
digest = hashlib.sha256(data).digest()
|
|
||||||
hash = 'sha256=' + native(urlsafe_b64encode(digest))
|
|
||||||
size = len(data)
|
|
||||||
record_path = os.path.relpath(
|
|
||||||
path, bdist_dir).replace(os.path.sep, '/')
|
|
||||||
writer.writerow((record_path, hash, size))
|
|
|
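`write_record` above hashes every file with SHA-256 and stores the digest urlsafe-base64-encoded with the trailing `=` padding stripped (that is what wheel's `util.urlsafe_b64encode` helper does). A standalone sketch of that encoding, using only the standard library; the function name `record_hash` is illustrative:

```python
import base64
import hashlib


def record_hash(data):
    # Hash a file's bytes the way write_record does: SHA-256,
    # urlsafe base64, with the trailing '=' padding removed.
    digest = hashlib.sha256(data).digest()
    return 'sha256=' + base64.urlsafe_b64encode(digest).rstrip(b'=').decode('ascii')
```

For an empty file this yields the well-known `sha256=47DEQpj8HBSa-_TImW-5JCeuQeRkm5NMpJWZG3hSuFU` RECORD entry.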
@@ -1,19 +0,0 @@
# from Pyramid


class reify(object):
    """Put the result of a method which uses this (non-data)
    descriptor decorator in the instance dict after the first call,
    effectively replacing the decorator with an instance variable.
    """

    def __init__(self, wrapped):
        self.wrapped = wrapped
        self.__doc__ = wrapped.__doc__

    def __get__(self, inst, objtype=None):
        if inst is None:
            return self
        val = self.wrapped(inst)
        setattr(inst, self.wrapped.__name__, val)
        return val
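A minimal usage sketch of the descriptor above. The `Expensive` class and its call counter are illustrative only; `reify` is copied verbatim so the snippet is self-contained:

```python
class reify(object):
    """Cache a method's result in the instance dict on first access."""

    def __init__(self, wrapped):
        self.wrapped = wrapped
        self.__doc__ = wrapped.__doc__

    def __get__(self, inst, objtype=None):
        if inst is None:
            return self
        val = self.wrapped(inst)
        # The instance attribute now shadows the (non-data) descriptor,
        # so later accesses never reach __get__ again.
        setattr(inst, self.wrapped.__name__, val)
        return val


class Expensive(object):
    calls = 0

    @reify
    def value(self):
        Expensive.calls += 1
        return 42


obj = Expensive()
assert obj.value == 42       # computed on first access
assert obj.value == 42       # served straight from obj.__dict__
assert Expensive.calls == 1  # the wrapped method ran exactly once
```

Because `reify` defines no `__set__`, the cached instance attribute takes precedence over the class-level descriptor on every subsequent lookup.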
@@ -1,90 +0,0 @@
#!/usr/bin/env python
import distutils.dist
import os.path
import re
import shutil
import sys
import tempfile
import zipfile
from argparse import ArgumentParser
from distutils.archive_util import make_archive
from glob import iglob

import wheel.bdist_wheel
from wheel.wininst2wheel import _bdist_wheel_tag

egg_info_re = re.compile(r'''(?P<name>.+?)-(?P<ver>.+?)
    (-(?P<pyver>.+?))?(-(?P<arch>.+?))?.egg''', re.VERBOSE)


def egg2wheel(egg_path, dest_dir):
    egg_info = egg_info_re.match(os.path.basename(egg_path)).groupdict()
    dir = tempfile.mkdtemp(suffix="_e2w")
    if os.path.isfile(egg_path):
        # assume we have a bdist_egg otherwise
        egg = zipfile.ZipFile(egg_path)
        egg.extractall(dir)
    else:
        # support buildout-style installed eggs directories
        for pth in os.listdir(egg_path):
            src = os.path.join(egg_path, pth)
            if os.path.isfile(src):
                shutil.copy2(src, dir)
            else:
                shutil.copytree(src, os.path.join(dir, pth))

    dist_info = "%s-%s" % (egg_info['name'], egg_info['ver'])
    abi = 'none'
    pyver = egg_info['pyver'].replace('.', '')
    arch = (egg_info['arch'] or 'any').replace('.', '_').replace('-', '_')
    if arch != 'any':
        # assume all binary eggs are for CPython
        pyver = 'cp' + pyver[2:]
    wheel_name = '-'.join((
        dist_info,
        pyver,
        abi,
        arch
    ))
    root_is_purelib = egg_info['arch'] is None
    if root_is_purelib:
        bw = wheel.bdist_wheel.bdist_wheel(distutils.dist.Distribution())
    else:
        bw = _bdist_wheel_tag(distutils.dist.Distribution())

    bw.root_is_pure = root_is_purelib
    bw.python_tag = pyver
    bw.plat_name_supplied = True
    bw.plat_name = egg_info['arch'] or 'any'
    if not root_is_purelib:
        bw.full_tag_supplied = True
        bw.full_tag = (pyver, abi, arch)

    dist_info_dir = os.path.join(dir, '%s.dist-info' % dist_info)
    bw.egg2dist(os.path.join(dir, 'EGG-INFO'),
                dist_info_dir)
    bw.write_wheelfile(dist_info_dir, generator='egg2wheel')
    bw.write_record(dir, dist_info_dir)
    filename = make_archive(os.path.join(dest_dir, wheel_name), 'zip', root_dir=dir)
    os.rename(filename, filename[:-3] + 'whl')
    shutil.rmtree(dir)


def main():
    parser = ArgumentParser()
    parser.add_argument('eggs', nargs='*', help="Eggs to convert")
    parser.add_argument('--dest-dir', '-d', default=os.path.curdir,
                        help="Directory to store wheels (default %(default)s)")
    parser.add_argument('--verbose', '-v', action='store_true')
    args = parser.parse_args()
    for pat in args.eggs:
        for egg in iglob(pat):
            if args.verbose:
                sys.stdout.write("{0}... ".format(egg))
            egg2wheel(egg, args.dest_dir)
            if args.verbose:
                sys.stdout.write("OK\n")


if __name__ == "__main__":
    main()
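The tag logic in `egg2wheel` above can be sketched standalone: parse an egg basename with `egg_info_re`, then apply the same pyver/abi/arch rewriting to predict the resulting wheel name. The helper name `egg_to_wheel_name` is illustrative; the regex is copied verbatim so the snippet is self-contained:

```python
import re

# Standalone copy of egg_info_re from egg2wheel.
egg_info_re = re.compile(r'''(?P<name>.+?)-(?P<ver>.+?)
    (-(?P<pyver>.+?))?(-(?P<arch>.+?))?.egg''', re.VERBOSE)


def egg_to_wheel_name(egg_basename):
    """Mirror egg2wheel's naming logic to predict the output wheel name."""
    info = egg_info_re.match(egg_basename).groupdict()
    pyver = info['pyver'].replace('.', '')
    arch = (info['arch'] or 'any').replace('.', '_').replace('-', '_')
    if arch != 'any':
        # binary eggs are assumed to be CPython
        pyver = 'cp' + pyver[2:]
    return '-'.join(('%s-%s' % (info['name'], info['ver']), pyver, 'none', arch))
```

So a pure egg like `six-1.4.1-py2.6.egg` becomes `six-1.4.1-py26-none-any.whl`, while a binary egg gets a `cpXY` tag and a platform suffix.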
@@ -1,494 +0,0 @@
"""
Operations on existing wheel files, including basic installation.
"""
# XXX see patched pip to install

import csv
import hashlib
import os.path
import re
import shutil
import sys
import warnings
import zipfile

from . import signatures
from .decorator import reify
from .paths import get_install_paths
from .pep425tags import get_supported
from .pkginfo import read_pkg_info_bytes
from .util import (
    urlsafe_b64encode, from_json, urlsafe_b64decode, native, binary, HashingFile,
    open_for_csv)

try:
    _big_number = sys.maxsize
except NameError:
    _big_number = sys.maxint

# The next major version after this version of the 'wheel' tool:
VERSION_TOO_HIGH = (1, 0)

# Non-greedy matching of an optional build number may be too clever (more
# invalid wheel filenames will match). Separate regex for .dist-info?
WHEEL_INFO_RE = re.compile(
    r"""^(?P<namever>(?P<name>.+?)(-(?P<ver>\d.+?))?)
    ((-(?P<build>\d.*?))?-(?P<pyver>.+?)-(?P<abi>.+?)-(?P<plat>.+?)
    \.whl|\.dist-info)$""",
    re.VERBOSE).match


def parse_version(version):
    """Use parse_version from pkg_resources or distutils as available."""
    global parse_version
    try:
        from pkg_resources import parse_version
    except ImportError:
        from distutils.version import LooseVersion as parse_version
    return parse_version(version)


class BadWheelFile(ValueError):
    pass


class WheelFile(object):
    """Parse wheel-specific attributes from a wheel (.whl) file and offer
    basic installation and verification support.

    WheelFile can be used to simply parse a wheel filename by avoiding the
    methods that require the actual file contents."""

    WHEEL_INFO = "WHEEL"
    RECORD = "RECORD"

    def __init__(self,
                 filename,
                 fp=None,
                 append=False,
                 context=get_supported):
        """
        :param fp: A seekable file-like object or None to open(filename).
        :param append: Open archive in append mode.
        :param context: Function returning list of supported tags. Wheels
            must have the same context to be sortable.
        """
        self.filename = filename
        self.fp = fp
        self.append = append
        self.context = context
        basename = os.path.basename(filename)
        self.parsed_filename = WHEEL_INFO_RE(basename)
        if not basename.endswith('.whl') or self.parsed_filename is None:
            raise BadWheelFile("Bad filename '%s'" % filename)

    def __repr__(self):
        return self.filename

    @property
    def distinfo_name(self):
        return "%s.dist-info" % self.parsed_filename.group('namever')

    @property
    def datadir_name(self):
        return "%s.data" % self.parsed_filename.group('namever')

    @property
    def record_name(self):
        return "%s/%s" % (self.distinfo_name, self.RECORD)

    @property
    def wheelinfo_name(self):
        return "%s/%s" % (self.distinfo_name, self.WHEEL_INFO)

    @property
    def tags(self):
        """A wheel file is compatible with the Cartesian product of the
        period-delimited tags in its filename.
        To choose a wheel file among several candidates having the same
        distribution version 'ver', an installer ranks each triple of
        (pyver, abi, plat) that its Python installation can run, sorting
        the wheels by the best-ranked tag it supports and then by their
        arity, which is just len(list(compatibility_tags)).
        """
        tags = self.parsed_filename.groupdict()
        for pyver in tags['pyver'].split('.'):
            for abi in tags['abi'].split('.'):
                for plat in tags['plat'].split('.'):
                    yield (pyver, abi, plat)

    compatibility_tags = tags

    @property
    def arity(self):
        """The number of compatibility tags the wheel declares."""
        return len(list(self.compatibility_tags))

    @property
    def rank(self):
        """
        Lowest index of any of this wheel's tags in self.context(), and the
        arity, e.g. (0, 1)
        """
        return self.compatibility_rank(self.context())

    @property
    def compatible(self):
        return self.rank[0] != _big_number  # bad API!

    # deprecated:
    def compatibility_rank(self, supported):
        """Rank the wheel against the supported tags. Smaller ranks are more
        compatible!

        :param supported: A list of compatibility tags that the current
            Python implementation can run.
        """
        preferences = []
        for tag in self.compatibility_tags:
            try:
                preferences.append(supported.index(tag))
            # Tag not present
            except ValueError:
                pass
        if len(preferences):
            return (min(preferences), self.arity)
        return (_big_number, 0)
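The tag expansion described in the `tags` docstring above is just the Cartesian product of the period-delimited tag components. A self-contained sketch, with an illustrative helper name:

```python
import itertools


def expand_tags(pyver, abi, plat):
    # A wheel tagged e.g. 'py2.py3-none-any' is compatible with the
    # Cartesian product of its period-delimited tag components.
    return list(itertools.product(pyver.split('.'),
                                  abi.split('.'),
                                  plat.split('.')))
```

The length of this list is the wheel's "arity", used as a tiebreaker when ranking candidate wheels.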

    # deprecated
    def supports_current_python(self, x):
        assert self.context == x, 'context mismatch'
        return self.compatible

    # Comparability.
    # Wheels are equal if they refer to the same file.
    # If two wheels are not equal, compare based on (in this order):
    #   1. Name
    #   2. Version
    #   3. Compatibility rank
    #   4. Filename (as a tiebreaker)
    @property
    def _sort_key(self):
        return (self.parsed_filename.group('name'),
                parse_version(self.parsed_filename.group('ver')),
                tuple(-x for x in self.rank),
                self.filename)

    def __eq__(self, other):
        return self.filename == other.filename

    def __ne__(self, other):
        return self.filename != other.filename

    def __lt__(self, other):
        if self.context != other.context:
            raise TypeError("{0}.context != {1}.context".format(self, other))

        return self._sort_key < other._sort_key

        # XXX prune

        sn = self.parsed_filename.group('name')
        on = other.parsed_filename.group('name')
        if sn != on:
            return sn < on
        sv = parse_version(self.parsed_filename.group('ver'))
        ov = parse_version(other.parsed_filename.group('ver'))
        if sv != ov:
            return sv < ov
        # Compatibility
        if self.context != other.context:
            raise TypeError("{0}.context != {1}.context".format(self, other))
        sc = self.rank
        oc = other.rank
        if sc is not None and oc is not None and sc != oc:
            # Smaller compatibility ranks are "better" than larger ones,
            # so we have to reverse the sense of the comparison here!
            return sc > oc
        elif sc is None and oc is not None:
            return False
        return self.filename < other.filename

    def __gt__(self, other):
        return other < self

    def __le__(self, other):
        return self == other or self < other

    def __ge__(self, other):
        return self == other or other < self

    #
    # Methods using the file's contents:
    #

    @reify
    def zipfile(self):
        mode = "r"
        if self.append:
            mode = "a"
        vzf = VerifyingZipFile(self.fp if self.fp else self.filename, mode)
        if not self.append:
            self.verify(vzf)
        return vzf

    @reify
    def parsed_wheel_info(self):
        """Parse wheel metadata (the .data/WHEEL file)"""
        return read_pkg_info_bytes(self.zipfile.read(self.wheelinfo_name))

    def check_version(self):
        version = self.parsed_wheel_info['Wheel-Version']
        if tuple(map(int, version.split('.'))) >= VERSION_TOO_HIGH:
            raise ValueError("Wheel version is too high")

    @reify
    def install_paths(self):
        """
        Consult distutils to get the install paths for our dist. A dict with
        ('purelib', 'platlib', 'headers', 'scripts', 'data').

        We use the name from our filename as the dist name, which means headers
        could be installed in the wrong place if the filesystem-escaped name
        is different than the Name. Who cares?
        """
        name = self.parsed_filename.group('name')
        return get_install_paths(name)

    def install(self, force=False, overrides={}):
        """
        Install the wheel into site-packages.
        """

        # Utility to get the target directory for a particular key
        def get_path(key):
            return overrides.get(key) or self.install_paths[key]

        # The base target location is either purelib or platlib
        if self.parsed_wheel_info['Root-Is-Purelib'] == 'true':
            root = get_path('purelib')
        else:
            root = get_path('platlib')

        # Parse all the names in the archive
        name_trans = {}
        for info in self.zipfile.infolist():
            name = info.filename
            # Zip files can contain entries representing directories.
            # These end in a '/'.
            # We ignore these, as we create directories on demand.
            if name.endswith('/'):
                continue

            # Pathnames in a zipfile namelist are always /-separated.
            # In theory, paths could start with ./ or have other oddities
            # but this won't happen in practical cases of well-formed wheels.
            # We'll cover the simple case of an initial './' as it's both easy
            # to do and more common than most other oddities.
            if name.startswith('./'):
                name = name[2:]

            # Split off the base directory to identify files that are to be
            # installed in non-root locations
            basedir, sep, filename = name.partition('/')
            if sep and basedir == self.datadir_name:
                # Data file. Target destination is elsewhere
                key, sep, filename = filename.partition('/')
                if not sep:
                    raise ValueError("Invalid filename in wheel: {0}".format(name))
                target = get_path(key)
            else:
                # Normal file. Target destination is root
                key = ''
                target = root
                filename = name

            # Map the actual filename from the zipfile to its intended target
            # directory and the pathname relative to that directory.
            dest = os.path.normpath(os.path.join(target, filename))
            name_trans[info] = (key, target, filename, dest)

        # We're now ready to start processing the actual install. The process
        # is as follows:
        #   1. Prechecks - is the wheel valid, is its declared architecture
        #      OK, etc. [[Responsibility of the caller]]
        #   2. Overwrite check - do any of the files to be installed already
        #      exist?
        #   3. Actual install - put the files in their target locations.
        #   4. Update RECORD - write a suitably modified RECORD file to
        #      reflect the actual installed paths.

        if not force:
            for info, v in name_trans.items():
                k = info.filename
                key, target, filename, dest = v
                if os.path.exists(dest):
                    raise ValueError(
                        "Wheel file {0} would overwrite {1}. Use force if this is intended".format(
                            k, dest))

        # Get the name of our executable, for use when replacing script
        # wrapper hashbang lines.
        # We encode it using getfilesystemencoding, as that is "the name of
        # the encoding used to convert Unicode filenames into system file
        # names".
        exename = sys.executable.encode(sys.getfilesystemencoding())
        record_data = []
        record_name = self.distinfo_name + '/RECORD'
        for info, (key, target, filename, dest) in name_trans.items():
            name = info.filename
            source = self.zipfile.open(info)
            # Skip the RECORD file
            if name == record_name:
                continue
            ddir = os.path.dirname(dest)
            if not os.path.isdir(ddir):
                os.makedirs(ddir)

            temp_filename = dest + '.part'
            try:
                with HashingFile(temp_filename, 'wb') as destination:
                    if key == 'scripts':
                        hashbang = source.readline()
                        if hashbang.startswith(b'#!python'):
                            hashbang = b'#!' + exename + binary(os.linesep)
                        destination.write(hashbang)

                    shutil.copyfileobj(source, destination)
            except:
                if os.path.exists(temp_filename):
                    os.unlink(temp_filename)

                raise

            os.rename(temp_filename, dest)
            reldest = os.path.relpath(dest, root)
            reldest = reldest.replace(os.sep, '/')
            record_data.append((reldest, destination.digest(), destination.length))
            destination.close()
            source.close()
            # preserve attributes (especially +x bit for scripts)
            attrs = info.external_attr >> 16
            if attrs:  # tends to be 0 if Windows.
                os.chmod(dest, attrs)

        record_name = os.path.join(root, self.record_name)
        with open_for_csv(record_name, 'w+') as record_file:
            writer = csv.writer(record_file)
            for reldest, digest, length in sorted(record_data):
                writer.writerow((reldest, digest, length))
            writer.writerow((self.record_name, '', ''))

    def verify(self, zipfile=None):
        """Configure the VerifyingZipFile `zipfile` by verifying its signature
        and setting expected hashes for every hash in RECORD.
        Caller must complete the verification process by completely reading
        every file in the archive (e.g. with extractall)."""
        sig = None
        if zipfile is None:
            zipfile = self.zipfile
        zipfile.strict = True

        record_name = '/'.join((self.distinfo_name, 'RECORD'))
        sig_name = '/'.join((self.distinfo_name, 'RECORD.jws'))
        # tolerate s/mime signatures:
        smime_sig_name = '/'.join((self.distinfo_name, 'RECORD.p7s'))
        zipfile.set_expected_hash(record_name, None)
        zipfile.set_expected_hash(sig_name, None)
        zipfile.set_expected_hash(smime_sig_name, None)
        record = zipfile.read(record_name)

        record_digest = urlsafe_b64encode(hashlib.sha256(record).digest())
        try:
            sig = from_json(native(zipfile.read(sig_name)))
        except KeyError:  # no signature
            pass
        if sig:
            headers, payload = signatures.verify(sig)
            if payload['hash'] != "sha256=" + native(record_digest):
                msg = "RECORD.sig claimed RECORD hash {0} != computed hash {1}."
                raise BadWheelFile(msg.format(payload['hash'],
                                              native(record_digest)))

        reader = csv.reader((native(r) for r in record.splitlines()))

        for row in reader:
            filename = row[0]
            hash = row[1]
            if not hash:
                if filename not in (record_name, sig_name):
                    sys.stderr.write("%s has no hash!\n" % filename)
                continue
            algo, data = row[1].split('=', 1)
            assert algo == "sha256", "Unsupported hash algorithm"
            zipfile.set_expected_hash(filename, urlsafe_b64decode(binary(data)))


class VerifyingZipFile(zipfile.ZipFile):
    """ZipFile that can assert that each of its extracted contents matches
    an expected sha256 hash. Note that each file must be completely read in
    order for its hash to be checked."""

    def __init__(self, file, mode="r",
                 compression=zipfile.ZIP_STORED,
                 allowZip64=False):
        zipfile.ZipFile.__init__(self, file, mode, compression, allowZip64)

        self.strict = False
        self._expected_hashes = {}
        self._hash_algorithm = hashlib.sha256

    def set_expected_hash(self, name, hash):
        """
        :param name: name of zip entry
        :param hash: bytes of hash (or None for "don't care")
        """
        self._expected_hashes[name] = hash

    def open(self, name_or_info, mode="r", pwd=None):
        """Return file-like object for 'name'."""
        # A non-monkey-patched version would contain most of zipfile.py
        ef = zipfile.ZipFile.open(self, name_or_info, mode, pwd)
        if isinstance(name_or_info, zipfile.ZipInfo):
            name = name_or_info.filename
        else:
            name = name_or_info

        if name in self._expected_hashes and self._expected_hashes[name] is not None:
            expected_hash = self._expected_hashes[name]
            try:
                _update_crc_orig = ef._update_crc
            except AttributeError:
                warnings.warn('Need ZipExtFile._update_crc to implement '
                              'file hash verification (in Python >= 2.7)')
                return ef
            running_hash = self._hash_algorithm()
            if hasattr(ef, '_eof'):  # py33
                def _update_crc(data):
                    _update_crc_orig(data)
                    running_hash.update(data)
                    if ef._eof and running_hash.digest() != expected_hash:
                        raise BadWheelFile("Bad hash for file %r" % ef.name)
            else:
                def _update_crc(data, eof=None):
                    _update_crc_orig(data, eof=eof)
                    running_hash.update(data)
                    if eof and running_hash.digest() != expected_hash:
                        raise BadWheelFile("Bad hash for file %r" % ef.name)
            ef._update_crc = _update_crc
        elif self.strict and name not in self._expected_hashes:
            raise BadWheelFile("No expected hash for file %r" % ef.name)
        return ef

    def pop(self):
        """Truncate the last file off this zipfile.
        Assumes infolist() is in the same order as the files (true for
        ordinary zip files created by Python)"""
        if not self.fp:
            raise RuntimeError(
                "Attempt to pop from ZIP archive that was already closed")
        last = self.infolist().pop()
        del self.NameToInfo[last.filename]
        self.fp.seek(last.header_offset, os.SEEK_SET)
        self.fp.truncate()
        self._didModify = True
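The ranking performed by `WheelFile.compatibility_rank` above can be sketched without the class machinery. The supported-tags list here is illustrative, and the sentinel constant stands in for the module's `_big_number` (`sys.maxsize`):

```python
def compatibility_rank(wheel_tags, supported):
    # Best (lowest) index of any of the wheel's tags in the installer's
    # ordered supported-tags list, plus the wheel's arity. Smaller ranks
    # are more compatible.
    big = 10 ** 9  # stand-in for sys.maxsize
    preferences = [supported.index(t) for t in wheel_tags if t in supported]
    if preferences:
        return (min(preferences), len(wheel_tags))
    return (big, 0)
```

An installer sorts candidate wheels by this rank, so a wheel matching an earlier (more specific) supported tag wins over one that only matches a generic tag.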
@@ -1,338 +0,0 @@
"""
Tools for converting old- to new-style metadata.
"""

import email.parser
import os.path
import re
import textwrap
from collections import namedtuple, OrderedDict

import pkg_resources

from . import __version__ as wheel_version
from .pkginfo import read_pkg_info
from .util import OrderedDefaultDict

METADATA_VERSION = "2.0"

PLURAL_FIELDS = {"classifier": "classifiers",
                 "provides_dist": "provides",
                 "provides_extra": "extras"}

SKIP_FIELDS = set()

CONTACT_FIELDS = (({"email": "author_email", "name": "author"},
                   "author"),
                  ({"email": "maintainer_email", "name": "maintainer"},
                   "maintainer"))

# commonly filled out as "UNKNOWN" by distutils:
UNKNOWN_FIELDS = {"author", "author_email", "platform", "home_page", "license"}

# Wheel itself is probably the only program that uses non-extras markers
# in METADATA/PKG-INFO. Support its syntax with the extra at the end only.
EXTRA_RE = re.compile("""^(?P<package>.*?)(;\s*(?P<condition>.*?)(extra == '(?P<extra>.*?)')?)$""")
KEYWORDS_RE = re.compile("[\0-,]+")

MayRequiresKey = namedtuple('MayRequiresKey', ('condition', 'extra'))
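The constants above drive requirement parsing in `handle_requires` below. A self-contained sketch of how `EXTRA_RE` splits a `Requires-Dist` value into package, environment marker, and extra; the helper name `split_requirement` is hypothetical, and the regex is copied verbatim:

```python
import re

# Standalone copy of EXTRA_RE from the metadata converter.
EXTRA_RE = re.compile(
    r"""^(?P<package>.*?)(;\s*(?P<condition>.*?)(extra == '(?P<extra>.*?)')?)$""")


def split_requirement(value):
    """Split a Requires-Dist value into (package, environment, extra)."""
    match = EXTRA_RE.search(value)
    if not match:
        # No ';' marker section at all: a plain requirement.
        return value, None, None
    d = match.groupdict()
    condition = d['condition']
    if condition and condition.endswith(' and '):
        # Drop the trailing "and" that joined the marker to the extra clause.
        condition = condition[:-5]
    return d['package'], condition or None, d['extra']
```

This mirrors the branch structure of `handle_requires`: the condition becomes the `environment` key and the extra becomes the `extra` key of a `run_requires` entry.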


def unique(iterable):
    """
    Yield unique values in iterable, preserving order.
    """
    seen = set()
    for value in iterable:
        if value not in seen:
            seen.add(value)
            yield value


def handle_requires(metadata, pkg_info, key):
    """
    Place the runtime requirements from pkg_info into metadata.
    """
    may_requires = OrderedDefaultDict(list)
    for value in sorted(pkg_info.get_all(key)):
        extra_match = EXTRA_RE.search(value)
        if extra_match:
            groupdict = extra_match.groupdict()
            condition = groupdict['condition']
            extra = groupdict['extra']
            package = groupdict['package']
            if condition.endswith(' and '):
                condition = condition[:-5]
        else:
            condition, extra = None, None
            package = value
        key = MayRequiresKey(condition, extra)
        may_requires[key].append(package)

    if may_requires:
        metadata['run_requires'] = []

        def sort_key(item):
            # Both condition and extra could be None, which can't be compared
            # against strings in Python 3.
            key, value = item
            if key.condition is None:
                return ''
            return key.condition

        for key, value in sorted(may_requires.items(), key=sort_key):
            may_requirement = OrderedDict((('requires', value),))
            if key.extra:
                may_requirement['extra'] = key.extra
            if key.condition:
                may_requirement['environment'] = key.condition
            metadata['run_requires'].append(may_requirement)

        if 'extras' not in metadata:
            metadata['extras'] = []
        metadata['extras'].extend([key.extra for key in may_requires.keys() if key.extra])


def pkginfo_to_dict(path, distribution=None):
    """
    Convert PKG-INFO to a prototype Metadata 2.0 (PEP 426) dict.

    The description is included under the key ['description'] rather than
    being written to a separate file.

    path: path to PKG-INFO file
    distribution: optional distutils Distribution()
    """

    metadata = OrderedDefaultDict(
        lambda: OrderedDefaultDict(lambda: OrderedDefaultDict(OrderedDict)))
    metadata["generator"] = "bdist_wheel (" + wheel_version + ")"
    try:
        unicode
        pkg_info = read_pkg_info(path)
    except NameError:
        with open(path, 'rb') as pkg_info_file:
            pkg_info = email.parser.Parser().parsestr(pkg_info_file.read().decode('utf-8'))
    description = None

    if pkg_info['Summary']:
        metadata['summary'] = pkginfo_unicode(pkg_info, 'Summary')
        del pkg_info['Summary']

    if pkg_info['Description']:
        description = dedent_description(pkg_info)
        del pkg_info['Description']
    else:
        payload = pkg_info.get_payload()
        if isinstance(payload, bytes):
            # Avoid a Python 2 Unicode error.
            # We still suffer ? glyphs on Python 3.
            payload = payload.decode('utf-8')
        if payload:
            description = payload

    if description:
        pkg_info['description'] = description

    for key in sorted(unique(k.lower() for k in pkg_info.keys())):
        low_key = key.replace('-', '_')

        if low_key in SKIP_FIELDS:
            continue

        if low_key in UNKNOWN_FIELDS and pkg_info.get(key) == 'UNKNOWN':
            continue

        if low_key in sorted(PLURAL_FIELDS):
            metadata[PLURAL_FIELDS[low_key]] = pkg_info.get_all(key)

        elif low_key == "requires_dist":
            handle_requires(metadata, pkg_info, key)

        elif low_key == 'provides_extra':
            if 'extras' not in metadata:
                metadata['extras'] = []
            metadata['extras'].extend(pkg_info.get_all(key))

        elif low_key == 'home_page':
            metadata['extensions']['python.details']['project_urls'] = {'Home': pkg_info[key]}

        elif low_key == 'keywords':
            metadata['keywords'] = KEYWORDS_RE.split(pkg_info[key])

        else:
            metadata[low_key] = pkg_info[key]

    metadata['metadata_version'] = METADATA_VERSION

    if 'extras' in metadata:
        metadata['extras'] = sorted(set(metadata['extras']))

    # include more information if distribution is available
    if distribution:
        for requires, attr in (('test_requires', 'tests_require'),):
            try:
                requirements = getattr(distribution, attr)
                if isinstance(requirements, list):
                    new_requirements = sorted(convert_requirements(requirements))
                    metadata[requires] = [{'requires': new_requirements}]
            except AttributeError:
                pass

    # handle contacts
    contacts = []
    for contact_type, role in CONTACT_FIELDS:
        contact = OrderedDict()
        for key in sorted(contact_type):
            if contact_type[key] in metadata:
                contact[key] = metadata.pop(contact_type[key])
        if contact:
            contact['role'] = role
            contacts.append(contact)
    if contacts:
        metadata['extensions']['python.details']['contacts'] = contacts

    # convert entry points to exports
    try:
        with open(os.path.join(os.path.dirname(path), "entry_points.txt"), "r") as ep_file:
            ep_map = pkg_resources.EntryPoint.parse_map(ep_file.read())
            exports = OrderedDict()
            for group, items in sorted(ep_map.items()):
|
|
||||||
exports[group] = OrderedDict()
|
|
||||||
for item in sorted(map(str, items.values())):
|
|
||||||
name, export = item.split(' = ', 1)
|
|
||||||
exports[group][name] = export
|
|
||||||
if exports:
|
|
||||||
metadata['extensions']['python.exports'] = exports
|
|
||||||
except IOError:
|
|
||||||
pass
|
|
||||||
|
|
||||||
# copy console_scripts entry points to commands
|
|
||||||
if 'python.exports' in metadata['extensions']:
|
|
||||||
for (ep_script, wrap_script) in (('console_scripts', 'wrap_console'),
|
|
||||||
('gui_scripts', 'wrap_gui')):
|
|
||||||
if ep_script in metadata['extensions']['python.exports']:
|
|
||||||
metadata['extensions']['python.commands'][wrap_script] = \
|
|
||||||
metadata['extensions']['python.exports'][ep_script]
|
|
||||||
|
|
||||||
return metadata
|
|
||||||
|
|
||||||
|
|
||||||
def requires_to_requires_dist(requirement):
|
|
||||||
"""Compose the version predicates for requirement in PEP 345 fashion."""
|
|
||||||
requires_dist = []
|
|
||||||
for op, ver in requirement.specs:
|
|
||||||
requires_dist.append(op + ver)
|
|
||||||
if not requires_dist:
|
|
||||||
return ''
|
|
||||||
return " (%s)" % ','.join(sorted(requires_dist))
|
|
||||||
|
|
||||||
|
|
||||||
def convert_requirements(requirements):
|
|
||||||
"""Yield Requires-Dist: strings for parsed requirements strings."""
|
|
||||||
for req in requirements:
|
|
||||||
parsed_requirement = pkg_resources.Requirement.parse(req)
|
|
||||||
spec = requires_to_requires_dist(parsed_requirement)
|
|
||||||
extras = ",".join(parsed_requirement.extras)
|
|
||||||
if extras:
|
|
||||||
extras = "[%s]" % extras
|
|
||||||
yield (parsed_requirement.project_name + extras + spec)
|
|
||||||
|
|
||||||
|
|
||||||
def generate_requirements(extras_require):
|
|
||||||
"""
|
|
||||||
Convert requirements from a setup()-style dictionary to ('Requires-Dist', 'requirement')
|
|
||||||
and ('Provides-Extra', 'extra') tuples.
|
|
||||||
|
|
||||||
extras_require is a dictionary of {extra: [requirements]} as passed to setup(),
|
|
||||||
using the empty extra {'': [requirements]} to hold install_requires.
|
|
||||||
"""
|
|
||||||
for extra, depends in extras_require.items():
|
|
||||||
condition = ''
|
|
||||||
if extra and ':' in extra: # setuptools extra:condition syntax
|
|
||||||
extra, condition = extra.split(':', 1)
|
|
||||||
extra = pkg_resources.safe_extra(extra)
|
|
||||||
if extra:
|
|
||||||
yield ('Provides-Extra', extra)
|
|
||||||
if condition:
|
|
||||||
condition += " and "
|
|
||||||
condition += "extra == '%s'" % extra
|
|
||||||
if condition:
|
|
||||||
condition = '; ' + condition
|
|
||||||
for new_req in convert_requirements(depends):
|
|
||||||
yield ('Requires-Dist', new_req + condition)
|
|
||||||
|
|
||||||
|
|
||||||
def pkginfo_to_metadata(egg_info_path, pkginfo_path):
|
|
||||||
"""
|
|
||||||
Convert .egg-info directory with PKG-INFO to the Metadata 1.3 aka
|
|
||||||
old-draft Metadata 2.0 format.
|
|
||||||
"""
|
|
||||||
pkg_info = read_pkg_info(pkginfo_path)
|
|
||||||
pkg_info.replace_header('Metadata-Version', '2.0')
|
|
||||||
requires_path = os.path.join(egg_info_path, 'requires.txt')
|
|
||||||
if os.path.exists(requires_path):
|
|
||||||
with open(requires_path) as requires_file:
|
|
||||||
requires = requires_file.read()
|
|
||||||
for extra, reqs in sorted(pkg_resources.split_sections(requires),
|
|
||||||
key=lambda x: x[0] or ''):
|
|
||||||
for item in generate_requirements({extra: reqs}):
|
|
||||||
pkg_info[item[0]] = item[1]
|
|
||||||
|
|
||||||
description = pkg_info['Description']
|
|
||||||
if description:
|
|
||||||
pkg_info.set_payload(dedent_description(pkg_info))
|
|
||||||
del pkg_info['Description']
|
|
||||||
|
|
||||||
return pkg_info
|
|
||||||
|
|
||||||
|
|
||||||
def pkginfo_unicode(pkg_info, field):
|
|
||||||
"""Hack to coax Unicode out of an email Message() - Python 3.3+"""
|
|
||||||
text = pkg_info[field]
|
|
||||||
field = field.lower()
|
|
||||||
if not isinstance(text, str):
|
|
||||||
if not hasattr(pkg_info, 'raw_items'): # Python 3.2
|
|
||||||
return str(text)
|
|
||||||
for item in pkg_info.raw_items():
|
|
||||||
if item[0].lower() == field:
|
|
||||||
text = item[1].encode('ascii', 'surrogateescape') \
|
|
||||||
.decode('utf-8')
|
|
||||||
break
|
|
||||||
|
|
||||||
return text
|
|
||||||
|
|
||||||
|
|
||||||
def dedent_description(pkg_info):
|
|
||||||
"""
|
|
||||||
Dedent and convert pkg_info['Description'] to Unicode.
|
|
||||||
"""
|
|
||||||
description = pkg_info['Description']
|
|
||||||
|
|
||||||
# Python 3 Unicode handling, sorta.
|
|
||||||
surrogates = False
|
|
||||||
if not isinstance(description, str):
|
|
||||||
surrogates = True
|
|
||||||
description = pkginfo_unicode(pkg_info, 'Description')
|
|
||||||
|
|
||||||
description_lines = description.splitlines()
|
|
||||||
description_dedent = '\n'.join(
|
|
||||||
# if the first line of long_description is blank,
|
|
||||||
# the first line here will be indented.
|
|
||||||
(description_lines[0].lstrip(),
|
|
||||||
textwrap.dedent('\n'.join(description_lines[1:])),
|
|
||||||
'\n'))
|
|
||||||
|
|
||||||
if surrogates:
|
|
||||||
description_dedent = description_dedent \
|
|
||||||
.encode("utf8") \
|
|
||||||
.decode("ascii", "surrogateescape")
|
|
||||||
|
|
||||||
return description_dedent
|
|
||||||
|
|
||||||
|
|
||||||
if __name__ == "__main__":
|
|
||||||
import sys
|
|
||||||
import pprint
|
|
||||||
|
|
||||||
pprint.pprint(pkginfo_to_dict(sys.argv[1]))
|
|
|
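The extras handling in `generate_requirements` above can be exercised standalone. This sketch inlines a simplified stand-in for `pkg_resources.safe_extra` (the regex normalization here is an assumption, not the real implementation) and skips the version-spec parsing, to show just how a setuptools `extra:condition` key becomes a `Provides-Extra` header plus a combined environment marker:

```python
import re


def _safe_extra(extra):
    # simplified stand-in for pkg_resources.safe_extra (hypothetical)
    return re.sub(r'[^A-Za-z0-9.-]+', '_', extra).lower()


def generate_requirements(extras_require):
    # mirrors the setup()-dict to (header, value) conversion shown above
    for extra, depends in extras_require.items():
        condition = ''
        if extra and ':' in extra:  # setuptools extra:condition syntax
            extra, condition = extra.split(':', 1)
        extra = _safe_extra(extra)
        if extra:
            yield ('Provides-Extra', extra)
            if condition:
                condition += " and "
            condition += "extra == '%s'" % extra
        if condition:
            condition = '; ' + condition
        for dep in depends:
            yield ('Requires-Dist', dep + condition)


pairs = list(generate_requirements(
    {'': ['requests'], 'ssl:python_version < "2.7.9"': ['pyOpenSSL']}))
print(pairs)
```

The empty extra carries `install_requires` and emits a bare `Requires-Dist`; the `ssl:...` key splits into a `Provides-Extra: ssl` header and a dependency whose marker combines the original condition with `extra == 'ssl'`.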
@@ -1,43 +0,0 @@
"""
Installation paths.

Map the .data/ subdirectory names to install paths.
"""

import distutils.command.install as install
import distutils.dist as dist
import os.path
import sys


def get_install_command(name):
    # late binding due to potential monkeypatching
    d = dist.Distribution({'name': name})
    i = install.install(d)
    i.finalize_options()
    return i


def get_install_paths(name):
    """
    Return the (distutils) install paths for the named dist.

    A dict with ('purelib', 'platlib', 'headers', 'scripts', 'data') keys.
    """
    paths = {}

    i = get_install_command(name)

    for key in install.SCHEME_KEYS:
        paths[key] = getattr(i, 'install_' + key)

    # pip uses a similar path as an alternative to the system's (read-only)
    # include directory:
    if hasattr(sys, 'real_prefix'):  # virtualenv
        paths['headers'] = os.path.join(sys.prefix,
                                        'include',
                                        'site',
                                        'python' + sys.version[:3],
                                        name)

    return paths
@@ -1,180 +0,0 @@
"""Generate and work with PEP 425 Compatibility Tags."""

import distutils.util
import platform
import sys
import sysconfig
import warnings


def get_config_var(var):
    try:
        return sysconfig.get_config_var(var)
    except IOError as e:  # pip Issue #1074
        warnings.warn("{0}".format(e), RuntimeWarning)
        return None


def get_abbr_impl():
    """Return abbreviated implementation name."""
    impl = platform.python_implementation()
    if impl == 'PyPy':
        return 'pp'
    elif impl == 'Jython':
        return 'jy'
    elif impl == 'IronPython':
        return 'ip'
    elif impl == 'CPython':
        return 'cp'

    raise LookupError('Unknown Python implementation: ' + impl)


def get_impl_ver():
    """Return implementation version."""
    impl_ver = get_config_var("py_version_nodot")
    if not impl_ver or get_abbr_impl() == 'pp':
        impl_ver = ''.join(map(str, get_impl_version_info()))
    return impl_ver


def get_impl_version_info():
    """Return sys.version_info-like tuple for use in decrementing the minor
    version."""
    if get_abbr_impl() == 'pp':
        # as per https://github.com/pypa/pip/issues/2882
        return (sys.version_info[0], sys.pypy_version_info.major,
                sys.pypy_version_info.minor)
    else:
        return sys.version_info[0], sys.version_info[1]


def get_flag(var, fallback, expected=True, warn=True):
    """Use a fallback method for determining SOABI flags if the needed config
    var is unset or unavailable."""
    val = get_config_var(var)
    if val is None:
        if warn:
            warnings.warn("Config variable '{0}' is unset, Python ABI tag may "
                          "be incorrect".format(var), RuntimeWarning, 2)
        return fallback()
    return val == expected


def get_abi_tag():
    """Return the ABI tag based on SOABI (if available) or emulate SOABI
    (CPython 2, PyPy)."""
    soabi = get_config_var('SOABI')
    impl = get_abbr_impl()
    if not soabi and impl in ('cp', 'pp') and hasattr(sys, 'maxunicode'):
        d = ''
        m = ''
        u = ''
        if get_flag('Py_DEBUG',
                    lambda: hasattr(sys, 'gettotalrefcount'),
                    warn=(impl == 'cp')):
            d = 'd'
        if get_flag('WITH_PYMALLOC',
                    lambda: impl == 'cp',
                    warn=(impl == 'cp')):
            m = 'm'
        if get_flag('Py_UNICODE_SIZE',
                    lambda: sys.maxunicode == 0x10ffff,
                    expected=4,
                    warn=(impl == 'cp' and
                          sys.version_info < (3, 3))) \
                and sys.version_info < (3, 3):
            u = 'u'
        abi = '%s%s%s%s%s' % (impl, get_impl_ver(), d, m, u)
    elif soabi and soabi.startswith('cpython-'):
        abi = 'cp' + soabi.split('-')[1]
    elif soabi:
        abi = soabi.replace('.', '_').replace('-', '_')
    else:
        abi = None
    return abi


def get_platform():
    """Return our platform name 'win32', 'linux_x86_64'"""
    # XXX remove distutils dependency
    result = distutils.util.get_platform().replace('.', '_').replace('-', '_')
    if result == "linux_x86_64" and sys.maxsize == 2147483647:
        # pip pull request #3497
        result = "linux_i686"
    return result


def get_supported(versions=None, supplied_platform=None):
    """Return a list of supported tags for each version specified in
    `versions`.

    :param versions: a list of string versions, of the form ["33", "32"],
        or None. The first version will be assumed to support our ABI.
    """
    supported = []

    # Versions must be given with respect to the preference
    if versions is None:
        versions = []
        version_info = get_impl_version_info()
        major = version_info[:-1]
        # Support all previous minor Python versions.
        for minor in range(version_info[-1], -1, -1):
            versions.append(''.join(map(str, major + (minor,))))

    impl = get_abbr_impl()

    abis = []

    abi = get_abi_tag()
    if abi:
        abis[0:0] = [abi]

    abi3s = set()
    import imp
    for suffix in imp.get_suffixes():
        if suffix[0].startswith('.abi'):
            abi3s.add(suffix[0].split('.', 2)[1])

    abis.extend(sorted(list(abi3s)))

    abis.append('none')

    platforms = []
    if supplied_platform:
        platforms.append(supplied_platform)
    platforms.append(get_platform())

    # Current version, current API (built specifically for our Python):
    for abi in abis:
        for arch in platforms:
            supported.append(('%s%s' % (impl, versions[0]), abi, arch))

    # abi3 modules compatible with older version of Python
    for version in versions[1:]:
        # abi3 was introduced in Python 3.2
        if version in ('31', '30'):
            break
        for abi in abi3s:  # empty set if not Python 3
            for arch in platforms:
                supported.append(("%s%s" % (impl, version), abi, arch))

    # No abi / arch, but requires our implementation:
    for i, version in enumerate(versions):
        supported.append(('%s%s' % (impl, version), 'none', 'any'))
        if i == 0:
            # Tagged specifically as being cross-version compatible
            # (with just the major version specified)
            supported.append(('%s%s' % (impl, versions[0][0]), 'none', 'any'))

    # Major Python version + platform; e.g. binaries not using the Python API
    supported.append(('py%s' % (versions[0][0]), 'none', arch))

    # No abi / arch, generic Python
    for i, version in enumerate(versions):
        supported.append(('py%s' % (version,), 'none', 'any'))
        if i == 0:
            supported.append(('py%s' % (version[0]), 'none', 'any'))

    return supported
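The final loop of `get_supported` above, which appends the generic `pyXY-none-any` fallback tags, can be isolated as a small sketch with assumed inputs (the real function derives `versions` from the running interpreter):

```python
def generic_tags(versions):
    # the "no abi / arch, generic Python" fallback loop from get_supported()
    tags = []
    for i, version in enumerate(versions):
        tags.append(('py%s' % version, 'none', 'any'))
        if i == 0:
            # major-version-only tag right after the preferred version
            tags.append(('py%s' % version[0], 'none', 'any'))
    return tags


print(generic_tags(['35', '34', '33']))
```

Ordering matters to installers: the preferred version's tag comes first, the bare major-version tag (`py3`) immediately after it, then the older minors in decreasing preference.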
@@ -1,43 +0,0 @@
"""Tools for reading and writing PKG-INFO / METADATA without caring
about the encoding."""

from email.parser import Parser

try:
    unicode
    _PY3 = False
except NameError:
    _PY3 = True

if not _PY3:
    from email.generator import Generator

    def read_pkg_info_bytes(bytestr):
        return Parser().parsestr(bytestr)

    def read_pkg_info(path):
        with open(path, "r") as headers:
            message = Parser().parse(headers)
        return message

    def write_pkg_info(path, message):
        with open(path, 'w') as metadata:
            Generator(metadata, mangle_from_=False, maxheaderlen=0).flatten(message)
else:
    from email.generator import BytesGenerator

    def read_pkg_info_bytes(bytestr):
        headers = bytestr.decode(encoding="ascii", errors="surrogateescape")
        message = Parser().parsestr(headers)
        return message

    def read_pkg_info(path):
        with open(path, "r",
                  encoding="ascii",
                  errors="surrogateescape") as headers:
            message = Parser().parse(headers)
        return message

    def write_pkg_info(path, message):
        with open(path, "wb") as out:
            BytesGenerator(out, mangle_from_=False, maxheaderlen=0).flatten(message)
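The `surrogateescape` error handler used by the Python 3 branch above is what lets these helpers "not care about the encoding": headers are decoded as ASCII, and any stray non-ASCII byte is smuggled through as a surrogate code point that re-encodes losslessly. A standalone demonstration:

```python
from email.parser import Parser

raw = b"Metadata-Version: 1.1\nName: demo\nAuthor: Caf\xe9\n\n"

# decode as ASCII, but carry the undecodable 0xE9 byte as a surrogate
headers = raw.decode("ascii", errors="surrogateescape")
message = Parser().parsestr(headers)

print(message["Name"])

# re-encoding with the same error handler restores the original bytes exactly
assert headers.encode("ascii", "surrogateescape") == raw
```

The parsed message is fully usable even though one header held a non-ASCII byte, and the round trip back to bytes is exact.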
@@ -1,110 +0,0 @@
"""
Create and verify jws-js format Ed25519 signatures.
"""

import json

from ..util import urlsafe_b64decode, urlsafe_b64encode, native, binary

__all__ = ['sign', 'verify']

ed25519ll = None

ALG = "Ed25519"


def get_ed25519ll():
    """Lazy import-and-test of ed25519 module"""
    global ed25519ll

    if not ed25519ll:
        try:
            import ed25519ll  # fast (thousands / s)
        except (ImportError, OSError):  # pragma nocover
            from . import ed25519py as ed25519ll  # pure Python (hundreds / s)
        test()

    return ed25519ll


def sign(payload, keypair):
    """Return a JWS-JS format signature given a JSON-serializable payload and
    an Ed25519 keypair."""
    get_ed25519ll()
    #
    header = {
        "alg": ALG,
        "jwk": {
            "kty": ALG,  # alg -> kty in jwk-08.
            "vk": native(urlsafe_b64encode(keypair.vk))
        }
    }

    encoded_header = urlsafe_b64encode(binary(json.dumps(header, sort_keys=True)))
    encoded_payload = urlsafe_b64encode(binary(json.dumps(payload, sort_keys=True)))
    secured_input = b".".join((encoded_header, encoded_payload))
    sig_msg = ed25519ll.crypto_sign(secured_input, keypair.sk)
    signature = sig_msg[:ed25519ll.SIGNATUREBYTES]
    encoded_signature = urlsafe_b64encode(signature)

    return {"recipients":
            [{"header": native(encoded_header),
              "signature": native(encoded_signature)}],
            "payload": native(encoded_payload)}


def assertTrue(condition, message=""):
    if not condition:
        raise ValueError(message)


def verify(jwsjs):
    """Return (decoded headers, payload) if all signatures in jwsjs are
    consistent, else raise ValueError.

    Caller must decide whether the keys are actually trusted."""
    get_ed25519ll()
    # XXX forbid duplicate keys in JSON input using object_pairs_hook (2.7+)
    recipients = jwsjs["recipients"]
    encoded_payload = binary(jwsjs["payload"])
    headers = []
    for recipient in recipients:
        assertTrue(len(recipient) == 2, "Unknown recipient key {0}".format(recipient))
        h = binary(recipient["header"])
        s = binary(recipient["signature"])
        header = json.loads(native(urlsafe_b64decode(h)))
        assertTrue(header["alg"] == ALG,
                   "Unexpected algorithm {0}".format(header["alg"]))
        if "alg" in header["jwk"] and "kty" not in header["jwk"]:
            header["jwk"]["kty"] = header["jwk"]["alg"]  # b/w for JWK < -08
        assertTrue(header["jwk"]["kty"] == ALG,  # true for Ed25519
                   "Unexpected key type {0}".format(header["jwk"]["kty"]))
        vk = urlsafe_b64decode(binary(header["jwk"]["vk"]))
        secured_input = b".".join((h, encoded_payload))
        sig = urlsafe_b64decode(s)
        sig_msg = sig + secured_input
        verified_input = native(ed25519ll.crypto_sign_open(sig_msg, vk))
        verified_header, verified_payload = verified_input.split('.')
        verified_header = binary(verified_header)
        decoded_header = native(urlsafe_b64decode(verified_header))
        headers.append(json.loads(decoded_header))

    verified_payload = binary(verified_payload)

    # only return header, payload that have passed through the crypto library.
    payload = json.loads(native(urlsafe_b64decode(verified_payload)))

    return headers, payload


def test():
    kp = ed25519ll.crypto_sign_keypair()
    payload = {'test': 'onstartup'}
    jwsjs = json.loads(json.dumps(sign(payload, kp)))
    verify(jwsjs)
    jwsjs['payload'] += 'x'
    try:
        verify(jwsjs)
    except ValueError:
        pass
    else:  # pragma no cover
        raise RuntimeError("No error from bad wheel.signatures payload.")
@@ -1,323 +0,0 @@
# Ed25519 digital signatures
# Based on http://ed25519.cr.yp.to/python/ed25519.py
# See also http://ed25519.cr.yp.to/software.html
# Adapted by Ron Garret
# Sped up considerably using coordinate transforms found on:
# http://www.hyperelliptic.org/EFD/g1p/auto-twisted-extended-1.html
# Specifically add-2008-hwcd-4 and dbl-2008-hwcd

import hashlib
import random

try:  # pragma nocover
    unicode
    PY3 = False

    def asbytes(b):
        """Convert array of integers to byte string"""
        return ''.join(chr(x) for x in b)

    def joinbytes(b):
        """Convert array of bytes to byte string"""
        return ''.join(b)

    def bit(h, i):
        """Return i'th bit of bytestring h"""
        return (ord(h[i // 8]) >> (i % 8)) & 1
except NameError:  # pragma nocover
    PY3 = True
    asbytes = bytes
    joinbytes = bytes

    def bit(h, i):
        return (h[i // 8] >> (i % 8)) & 1

b = 256
q = 2 ** 255 - 19
l = 2 ** 252 + 27742317777372353535851937790883648493


def H(m):
    return hashlib.sha512(m).digest()


def expmod(b, e, m):
    if e == 0:
        return 1

    t = expmod(b, e // 2, m) ** 2 % m
    if e & 1:
        t = (t * b) % m

    return t


# Can probably get some extra speedup here by replacing this with
# an extended-euclidean, but performance seems OK without that
def inv(x):
    return expmod(x, q - 2, q)


d = -121665 * inv(121666)
I = expmod(2, (q - 1) // 4, q)


def xrecover(y):
    xx = (y * y - 1) * inv(d * y * y + 1)
    x = expmod(xx, (q + 3) // 8, q)
    if (x * x - xx) % q != 0:
        x = (x * I) % q

    if x % 2 != 0:
        x = q - x

    return x


By = 4 * inv(5)
Bx = xrecover(By)
B = [Bx % q, By % q]


# def edwards(P,Q):
#     x1 = P[0]
#     y1 = P[1]
#     x2 = Q[0]
#     y2 = Q[1]
#     x3 = (x1*y2+x2*y1) * inv(1+d*x1*x2*y1*y2)
#     y3 = (y1*y2+x1*x2) * inv(1-d*x1*x2*y1*y2)
#     return (x3 % q,y3 % q)

# def scalarmult(P,e):
#     if e == 0: return [0,1]
#     Q = scalarmult(P,e/2)
#     Q = edwards(Q,Q)
#     if e & 1: Q = edwards(Q,P)
#     return Q

# Faster (!) version based on:
# http://www.hyperelliptic.org/EFD/g1p/auto-twisted-extended-1.html

def xpt_add(pt1, pt2):
    (X1, Y1, Z1, T1) = pt1
    (X2, Y2, Z2, T2) = pt2
    A = ((Y1 - X1) * (Y2 + X2)) % q
    B = ((Y1 + X1) * (Y2 - X2)) % q
    C = (Z1 * 2 * T2) % q
    D = (T1 * 2 * Z2) % q
    E = (D + C) % q
    F = (B - A) % q
    G = (B + A) % q
    H = (D - C) % q
    X3 = (E * F) % q
    Y3 = (G * H) % q
    Z3 = (F * G) % q
    T3 = (E * H) % q
    return (X3, Y3, Z3, T3)


def xpt_double(pt):
    (X1, Y1, Z1, _) = pt
    A = (X1 * X1)
    B = (Y1 * Y1)
    C = (2 * Z1 * Z1)
    D = (-A) % q
    J = (X1 + Y1) % q
    E = (J * J - A - B) % q
    G = (D + B) % q
    F = (G - C) % q
    H = (D - B) % q
    X3 = (E * F) % q
    Y3 = (G * H) % q
    Z3 = (F * G) % q
    T3 = (E * H) % q
    return X3, Y3, Z3, T3


def pt_xform(pt):
    (x, y) = pt
    return x, y, 1, (x * y) % q


def pt_unxform(pt):
    (x, y, z, _) = pt
    return (x * inv(z)) % q, (y * inv(z)) % q


def xpt_mult(pt, n):
    if n == 0:
        return pt_xform((0, 1))

    _ = xpt_double(xpt_mult(pt, n >> 1))
    return xpt_add(_, pt) if n & 1 else _


def scalarmult(pt, e):
    return pt_unxform(xpt_mult(pt_xform(pt), e))


def encodeint(y):
    bits = [(y >> i) & 1 for i in range(b)]
    e = [(sum([bits[i * 8 + j] << j for j in range(8)]))
         for i in range(b // 8)]
    return asbytes(e)


def encodepoint(P):
    x = P[0]
    y = P[1]
    bits = [(y >> i) & 1 for i in range(b - 1)] + [x & 1]
    e = [(sum([bits[i * 8 + j] << j for j in range(8)]))
         for i in range(b // 8)]
    return asbytes(e)


def publickey(sk):
    h = H(sk)
    a = 2 ** (b - 2) + sum(2 ** i * bit(h, i) for i in range(3, b - 2))
    A = scalarmult(B, a)
    return encodepoint(A)


def Hint(m):
    h = H(m)
    return sum(2 ** i * bit(h, i) for i in range(2 * b))


def signature(m, sk, pk):
    h = H(sk)
    a = 2 ** (b - 2) + sum(2 ** i * bit(h, i) for i in range(3, b - 2))
    inter = joinbytes([h[i] for i in range(b // 8, b // 4)])
    r = Hint(inter + m)
    R = scalarmult(B, r)
    S = (r + Hint(encodepoint(R) + pk + m) * a) % l
    return encodepoint(R) + encodeint(S)


def isoncurve(P):
    x = P[0]
    y = P[1]
    return (-x * x + y * y - 1 - d * x * x * y * y) % q == 0


def decodeint(s):
    return sum(2 ** i * bit(s, i) for i in range(0, b))


def decodepoint(s):
    y = sum(2 ** i * bit(s, i) for i in range(0, b - 1))
    x = xrecover(y)
    if x & 1 != bit(s, b - 1):
        x = q - x

    P = [x, y]
    if not isoncurve(P):
        raise Exception("decoding point that is not on curve")

    return P


def checkvalid(s, m, pk):
    if len(s) != b // 4:
        raise Exception("signature length is wrong")
    if len(pk) != b // 8:
        raise Exception("public-key length is wrong")

    R = decodepoint(s[0:b // 8])
    A = decodepoint(pk)
    S = decodeint(s[b // 8:b // 4])
    h = Hint(encodepoint(R) + pk + m)
    v1 = scalarmult(B, S)
    # v2 = edwards(R,scalarmult(A,h))
    v2 = pt_unxform(xpt_add(pt_xform(R), pt_xform(scalarmult(A, h))))
    return v1 == v2


##########################################################
#
# Curve25519 reference implementation by Matthew Dempsky, from:
# http://cr.yp.to/highspeed/naclcrypto-20090310.pdf

# P = 2 ** 255 - 19
P = q
A = 486662


# def expmod(b, e, m):
#     if e == 0: return 1
#     t = expmod(b, e / 2, m) ** 2 % m
#     if e & 1: t = (t * b) % m
#     return t

# def inv(x): return expmod(x, P - 2, P)


def add(n, m, d):
    (xn, zn) = n
    (xm, zm) = m
    (xd, zd) = d
    x = 4 * (xm * xn - zm * zn) ** 2 * zd
    z = 4 * (xm * zn - zm * xn) ** 2 * xd
    return (x % P, z % P)


def double(n):
    (xn, zn) = n
    x = (xn ** 2 - zn ** 2) ** 2
    z = 4 * xn * zn * (xn ** 2 + A * xn * zn + zn ** 2)
    return (x % P, z % P)


def curve25519(n, base=9):
    one = (base, 1)
    two = double(one)

    # f(m) evaluates to a tuple
    # containing the mth multiple and the
    # (m+1)th multiple of base.
    def f(m):
        if m == 1:
            return (one, two)

        (pm, pm1) = f(m // 2)
        if m & 1:
            return (add(pm, pm1, one), double(pm1))

        return (double(pm), add(pm, pm1, one))

    ((x, z), _) = f(n)
    return (x * inv(z)) % P


def genkey(n=0):
    n = n or random.randint(0, P)
    n &= ~7
    n &= ~(128 << 8 * 31)
    n |= 64 << 8 * 31
    return n


# def str2int(s):
#     return int(hexlify(s), 16)
#     # return sum(ord(s[i]) << (8 * i) for i in range(32))
#
# def int2str(n):
#     return unhexlify("%x" % n)
#     # return ''.join([chr((n >> (8 * i)) & 255) for i in range(32)])

#################################################


def dsa_test():
    import os
    msg = str(random.randint(q, q + q)).encode('utf-8')
    sk = os.urandom(32)
    pk = publickey(sk)
    sig = signature(msg, sk, pk)
    return checkvalid(sig, msg, pk)


def dh_test():
    sk1 = genkey()
    sk2 = genkey()
    return curve25519(sk1, curve25519(sk2)) == curve25519(sk2, curve25519(sk1))
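The recursive `expmod` in the reference code above is plain square-and-multiply, and `inv` relies on Fermat's little theorem (for prime `q`, `x**(q-2) ≡ x**-1 mod q`). A quick standalone check against Python's built-in three-argument `pow`:

```python
def expmod(base, e, m):
    # recursive square-and-multiply, as in the reference code above
    if e == 0:
        return 1
    t = expmod(base, e // 2, m) ** 2 % m
    if e & 1:
        t = (t * base) % m
    return t


q = 2 ** 255 - 19  # the Ed25519 field prime

# agrees with the built-in modular exponentiation...
assert expmod(3, q - 2, q) == pow(3, q - 2, q)
# ...and since q is prime, base**(q-2) is the modular inverse (Fermat)
assert (3 * expmod(3, q - 2, q)) % q == 1
```

The recursion depth is one level per bit of the exponent (about 255 here), comfortably within Python's default recursion limit; the built-in `pow(b, e, m)` does the same job iteratively and much faster, which is why the file's own comment calls the pure-Python path "hundreds / s".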
@@ -1,50 +0,0 @@
|
||||||
import os
|
|
||||||
import warnings
|
|
||||||
from collections import namedtuple
|
|
||||||
|
|
||||||
from . import djbec
|
|
||||||
|
|
||||||
__all__ = ['crypto_sign', 'crypto_sign_open', 'crypto_sign_keypair', 'Keypair',
|
|
||||||
'PUBLICKEYBYTES', 'SECRETKEYBYTES', 'SIGNATUREBYTES']
|
|
||||||
|
|
||||||
PUBLICKEYBYTES = 32
|
|
||||||
SECRETKEYBYTES = 64
|
|
||||||
SIGNATUREBYTES = 64
|
|
||||||
|
|
||||||
Keypair = namedtuple('Keypair', ('vk', 'sk')) # verifying key, secret key
|
|
||||||
|
|
||||||
|
|
||||||
def crypto_sign_keypair(seed=None):
|
|
||||||
"""Return (verifying, secret) key from a given seed, or os.urandom(32)"""
|
|
||||||
if seed is None:
|
|
||||||
seed = os.urandom(PUBLICKEYBYTES)
|
|
||||||
else:
|
|
||||||
warnings.warn("ed25519ll should choose random seed.",
|
|
||||||
RuntimeWarning)
|
|
||||||
if len(seed) != 32:
|
|
||||||
raise ValueError("seed must be 32 random bytes or None.")
|
|
||||||
skbytes = seed
|
|
||||||
vkbytes = djbec.publickey(skbytes)
|
|
||||||
return Keypair(vkbytes, skbytes+vkbytes)
|
|
||||||
|
|
||||||
|
|
||||||
def crypto_sign(msg, sk):
|
|
||||||
"""Return signature+message given message and secret key.
|
|
||||||
The signature is the first SIGNATUREBYTES bytes of the return value.
|
|
||||||
A copy of msg is in the remainder."""
|
|
||||||
if len(sk) != SECRETKEYBYTES:
|
|
||||||
raise ValueError("Bad signing key length %d" % len(sk))
|
|
||||||
vkbytes = sk[PUBLICKEYBYTES:]
|
|
||||||
skbytes = sk[:PUBLICKEYBYTES]
|
|
||||||
sig = djbec.signature(msg, skbytes, vkbytes)
|
|
||||||
return sig + msg
|
|
||||||
|
|
||||||
|
|
||||||
def crypto_sign_open(signed, vk):
|
|
||||||
"""Return message given signature+message and the verifying key."""
|
|
||||||
if len(vk) != PUBLICKEYBYTES:
|
|
||||||
raise ValueError("Bad verifying key length %d" % len(vk))
|
|
||||||
rc = djbec.checkvalid(signed[:SIGNATUREBYTES], signed[SIGNATUREBYTES:], vk)
|
|
||||||
if not rc:
|
|
||||||
raise ValueError("rc != True", rc)
|
|
||||||
return signed[SIGNATUREBYTES:]
|
|
|
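The 64-byte secret key handled by `crypto_sign` above is just the 32-byte seed concatenated with the 32-byte verifying key. A toy sketch of that layout, using a hash as a stand-in for `djbec.publickey` (an assumption for illustration only, not the real key derivation):

```python
import hashlib
from collections import namedtuple

PUBLICKEYBYTES = 32
SECRETKEYBYTES = 64

Keypair = namedtuple('Keypair', ('vk', 'sk'))  # verifying key, secret key

def fake_publickey(seed):
    # Stand-in for djbec.publickey: any deterministic 32-byte value suffices
    # to demonstrate the sk = seed || vk layout.
    return hashlib.sha256(seed).digest()

seed = b'\x01' * 32
vk = fake_publickey(seed)
kp = Keypair(vk, seed + vk)

assert len(kp.sk) == SECRETKEYBYTES
assert kp.sk[:PUBLICKEYBYTES] == seed    # what crypto_sign slices as skbytes
assert kp.sk[PUBLICKEYBYTES:] == kp.vk   # what crypto_sign slices as vkbytes
```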
@@ -1,101 +0,0 @@
"""Store and retrieve wheel signing / verifying keys.

Given a scope (a package name, + meaning "all packages", or - meaning
"no packages"), return a list of verifying keys that are trusted for that
scope.

Given a package name, return a list of (scope, key) suggested keys to sign
that package (only the verifying keys; the private signing key is stored
elsewhere).

Keys here are represented as urlsafe_b64encoded strings with no padding.

Tentative command line interface:

# list trusts
wheel trust
# trust a particular key for all
wheel trust + key
# trust key for beaglevote
wheel trust beaglevote key
# stop trusting a key for all
wheel untrust + key

# generate a key pair
wheel keygen

# import a signing key from a file
wheel import keyfile

# export a signing key
wheel export key
"""

import json
import os.path

from ..util import native, load_config_paths, save_config_path


class WheelKeys(object):
    SCHEMA = 1
    CONFIG_NAME = 'wheel.json'

    def __init__(self):
        self.data = {'signers': [], 'verifiers': []}

    def load(self):
        # XXX JSON is not a great database
        for path in load_config_paths('wheel'):
            conf = os.path.join(native(path), self.CONFIG_NAME)
            if os.path.exists(conf):
                with open(conf, 'r') as infile:
                    self.data = json.load(infile)
                    for x in ('signers', 'verifiers'):
                        if x not in self.data:
                            self.data[x] = []
                    if 'schema' not in self.data:
                        self.data['schema'] = self.SCHEMA
                    elif self.data['schema'] != self.SCHEMA:
                        raise ValueError(
                            "Bad wheel.json version {0}, expected {1}".format(
                                self.data['schema'], self.SCHEMA))
                break
        return self

    def save(self):
        # Try not to call this a very long time after load()
        path = save_config_path('wheel')
        conf = os.path.join(native(path), self.CONFIG_NAME)
        with open(conf, 'w+') as out:
            json.dump(self.data, out, indent=2)
        return self

    def trust(self, scope, vk):
        """Start trusting a particular key for given scope."""
        self.data['verifiers'].append({'scope': scope, 'vk': vk})
        return self

    def untrust(self, scope, vk):
        """Stop trusting a particular key for given scope."""
        self.data['verifiers'].remove({'scope': scope, 'vk': vk})
        return self

    def trusted(self, scope=None):
        """Return list of [(scope, trusted key), ...] for given scope."""
        trust = [(x['scope'], x['vk']) for x in self.data['verifiers']
                 if x['scope'] in (scope, '+')]
        trust.sort(key=lambda x: x[0])
        trust.reverse()
        return trust

    def signers(self, scope):
        """Return list of signing key(s)."""
        sign = [(x['scope'], x['vk']) for x in self.data['signers'] if x['scope'] in (scope, '+')]
        sign.sort(key=lambda x: x[0])
        sign.reverse()
        return sign

    def add_signer(self, scope, vk):
        """Remember verifying key vk as being valid for signing in scope."""
        self.data['signers'].append({'scope': scope, 'vk': vk})
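The scope matching in `trusted()` and `signers()` above can be exercised in isolation; this sketch reproduces the filter, sort, and reverse steps on an in-memory list (the key strings are made up):

```python
verifiers = [
    {'scope': '+', 'vk': 'global-key'},
    {'scope': 'beaglevote', 'vk': 'pkg-key'},
]

def trusted(scope=None):
    # Keep entries whose scope is the package name or the '+' wildcard,
    # then sort descending so package-specific keys come before '+'.
    trust = [(x['scope'], x['vk']) for x in verifiers
             if x['scope'] in (scope, '+')]
    trust.sort(key=lambda x: x[0])
    trust.reverse()
    return trust

assert trusted('beaglevote') == [('beaglevote', 'pkg-key'), ('+', 'global-key')]
assert trusted('otherpkg') == [('+', 'global-key')]
```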
@@ -1,376 +0,0 @@
"""
Wheel command-line utility.
"""

import argparse
import hashlib
import json
import os
import sys
from glob import iglob

from .. import signatures
from ..install import WheelFile, VerifyingZipFile
from ..paths import get_install_command
from ..util import urlsafe_b64decode, urlsafe_b64encode, native, binary, matches_requirement


def require_pkgresources(name):
    try:
        import pkg_resources  # noqa: F401
    except ImportError:
        raise RuntimeError("'{0}' needs pkg_resources (part of setuptools).".format(name))


class WheelError(Exception):
    pass


# For testability
def get_keyring():
    try:
        from ..signatures import keys
        import keyring
        assert keyring.get_keyring().priority
    except (ImportError, AssertionError):
        raise WheelError(
            "Install wheel[signatures] (requires keyring, keyrings.alt, pyxdg) for signatures.")

    return keys.WheelKeys, keyring


def keygen(get_keyring=get_keyring):
    """Generate a public/private key pair."""
    WheelKeys, keyring = get_keyring()

    ed25519ll = signatures.get_ed25519ll()

    wk = WheelKeys().load()

    keypair = ed25519ll.crypto_sign_keypair()
    vk = native(urlsafe_b64encode(keypair.vk))
    sk = native(urlsafe_b64encode(keypair.sk))
    kr = keyring.get_keyring()
    kr.set_password("wheel", vk, sk)
    sys.stdout.write("Created Ed25519 keypair with vk={0}\n".format(vk))
    sys.stdout.write("in {0!r}\n".format(kr))

    sk2 = kr.get_password('wheel', vk)
    if sk2 != sk:
        raise WheelError("Keyring is broken. Could not retrieve secret key.")

    sys.stdout.write("Trusting {0} to sign and verify all packages.\n".format(vk))
    wk.add_signer('+', vk)
    wk.trust('+', vk)
    wk.save()


def sign(wheelfile, replace=False, get_keyring=get_keyring):
    """Sign a wheel"""
    WheelKeys, keyring = get_keyring()

    ed25519ll = signatures.get_ed25519ll()

    wf = WheelFile(wheelfile, append=True)
    wk = WheelKeys().load()

    name = wf.parsed_filename.group('name')
    sign_with = wk.signers(name)[0]
    sys.stdout.write("Signing {0} with {1}\n".format(name, sign_with[1]))

    vk = sign_with[1]
    kr = keyring.get_keyring()
    sk = kr.get_password('wheel', vk)
    keypair = ed25519ll.Keypair(urlsafe_b64decode(binary(vk)),
                                urlsafe_b64decode(binary(sk)))

    record_name = wf.distinfo_name + '/RECORD'
    sig_name = wf.distinfo_name + '/RECORD.jws'
    if sig_name in wf.zipfile.namelist():
        raise WheelError("Wheel is already signed.")
    record_data = wf.zipfile.read(record_name)
    payload = {"hash": "sha256=" + native(urlsafe_b64encode(hashlib.sha256(record_data).digest()))}
    sig = signatures.sign(payload, keypair)
    wf.zipfile.writestr(sig_name, json.dumps(sig, sort_keys=True))
    wf.zipfile.close()


def unsign(wheelfile):
    """
    Remove RECORD.jws from a wheel by truncating the zip file.

    RECORD.jws must be at the end of the archive. The zip file must be an
    ordinary archive, with the compressed files and the directory in the same
    order, and without any non-zip content after the truncation point.
    """
    vzf = VerifyingZipFile(wheelfile, "a")
    info = vzf.infolist()
    if not (len(info) and info[-1].filename.endswith('/RECORD.jws')):
        raise WheelError('The wheel is not signed (RECORD.jws not found at end of the archive).')
    vzf.pop()
    vzf.close()


def verify(wheelfile):
    """Verify a wheel.

    The signature will be verified for internal consistency ONLY and printed.
    Wheel's own unpack/install commands verify the manifest against the
    signature and file contents.
    """
    wf = WheelFile(wheelfile)
    sig_name = wf.distinfo_name + '/RECORD.jws'
    try:
        sig = json.loads(native(wf.zipfile.open(sig_name).read()))
    except KeyError:
        raise WheelError('The wheel is not signed (RECORD.jws not found at end of the archive).')

    verified = signatures.verify(sig)
    sys.stderr.write("Signatures are internally consistent.\n")
    sys.stdout.write(json.dumps(verified, indent=2))
    sys.stdout.write('\n')


def unpack(wheelfile, dest='.'):
    """Unpack a wheel.

    Wheel content will be unpacked to {dest}/{name}-{ver}, where {name}
    is the package name and {ver} its version.

    :param wheelfile: The path to the wheel.
    :param dest: Destination directory (default to current directory).
    """
    wf = WheelFile(wheelfile)
    namever = wf.parsed_filename.group('namever')
    destination = os.path.join(dest, namever)
    sys.stderr.write("Unpacking to: %s\n" % (destination))
    wf.zipfile.extractall(destination)
    wf.zipfile.close()


def install(requirements, requirements_file=None,
            wheel_dirs=None, force=False, list_files=False,
            dry_run=False):
    """Install wheels.

    :param requirements: A list of requirements or wheel files to install.
    :param requirements_file: A file containing requirements to install.
    :param wheel_dirs: A list of directories to search for wheels.
    :param force: Install a wheel file even if it is not compatible.
    :param list_files: Only list the files to install, don't install them.
    :param dry_run: Do everything but the actual install.
    """

    # If no wheel directories specified, use the WHEELPATH environment
    # variable, or the current directory if that is not set.
    if not wheel_dirs:
        wheelpath = os.getenv("WHEELPATH")
        if wheelpath:
            wheel_dirs = wheelpath.split(os.pathsep)
        else:
            wheel_dirs = [os.path.curdir]

    # Get a list of all valid wheels in wheel_dirs
    all_wheels = []
    for d in wheel_dirs:
        for w in os.listdir(d):
            if w.endswith('.whl'):
                wf = WheelFile(os.path.join(d, w))
                if wf.compatible:
                    all_wheels.append(wf)

    # If there is a requirements file, add it to the list of requirements
    if requirements_file:
        # If the file doesn't exist, search for it in wheel_dirs
        # This allows standard requirements files to be stored with the
        # wheels.
        if not os.path.exists(requirements_file):
            for d in wheel_dirs:
                name = os.path.join(d, requirements_file)
                if os.path.exists(name):
                    requirements_file = name
                    break

        with open(requirements_file) as fd:
            requirements.extend(fd)

    to_install = []
    for req in requirements:
        if req.endswith('.whl'):
            # Explicitly specified wheel filename
            if os.path.exists(req):
                wf = WheelFile(req)
                if wf.compatible or force:
                    to_install.append(wf)
                else:
                    msg = ("{0} is not compatible with this Python. "
                           "--force to install anyway.".format(req))
                    raise WheelError(msg)
            else:
                # We could search on wheel_dirs, but it's probably OK to
                # assume the user has made an error.
                raise WheelError("No such wheel file: {}".format(req))
            continue

        # We have a requirement spec
        # If we don't have pkg_resources, this will raise an exception
        matches = matches_requirement(req, all_wheels)
        if not matches:
            raise WheelError("No match for requirement {}".format(req))
        to_install.append(max(matches))

    # We now have a list of wheels to install
    if list_files:
        sys.stdout.write("Installing:\n")

    if dry_run:
        return

    for wf in to_install:
        if list_files:
            sys.stdout.write("    {0}\n".format(wf.filename))
            continue
        wf.install(force=force)
        wf.zipfile.close()


def install_scripts(distributions):
    """
    Regenerate the entry_points console_scripts for the named distribution.
    """
    try:
        from setuptools.command import easy_install
        import pkg_resources
    except ImportError:
        raise RuntimeError("'wheel install_scripts' needs setuptools.")

    for dist in distributions:
        pkg_resources_dist = pkg_resources.get_distribution(dist)
        install = get_install_command(dist)
        command = easy_install.easy_install(install.distribution)
        command.args = ['wheel']  # dummy argument
        command.finalize_options()
        command.install_egg_scripts(pkg_resources_dist)


def convert(installers, dest_dir, verbose):
    require_pkgresources('wheel convert')

    # Only support wheel convert if pkg_resources is present
    from ..wininst2wheel import bdist_wininst2wheel
    from ..egg2wheel import egg2wheel

    for pat in installers:
        for installer in iglob(pat):
            if os.path.splitext(installer)[1] == '.egg':
                conv = egg2wheel
            else:
                conv = bdist_wininst2wheel
            if verbose:
                sys.stdout.write("{0}... ".format(installer))
                sys.stdout.flush()
            conv(installer, dest_dir)
            if verbose:
                sys.stdout.write("OK\n")


def parser():
    p = argparse.ArgumentParser()
    s = p.add_subparsers(help="commands")

    def keygen_f(args):
        keygen()
    keygen_parser = s.add_parser('keygen', help='Generate signing key')
    keygen_parser.set_defaults(func=keygen_f)

    def sign_f(args):
        sign(args.wheelfile)
    sign_parser = s.add_parser('sign', help='Sign wheel')
    sign_parser.add_argument('wheelfile', help='Wheel file')
    sign_parser.set_defaults(func=sign_f)

    def unsign_f(args):
        unsign(args.wheelfile)
    unsign_parser = s.add_parser('unsign', help=unsign.__doc__)
    unsign_parser.add_argument('wheelfile', help='Wheel file')
    unsign_parser.set_defaults(func=unsign_f)

    def verify_f(args):
        verify(args.wheelfile)
    verify_parser = s.add_parser('verify', help=verify.__doc__)
    verify_parser.add_argument('wheelfile', help='Wheel file')
    verify_parser.set_defaults(func=verify_f)

    def unpack_f(args):
        unpack(args.wheelfile, args.dest)
    unpack_parser = s.add_parser('unpack', help='Unpack wheel')
    unpack_parser.add_argument('--dest', '-d', help='Destination directory',
                               default='.')
    unpack_parser.add_argument('wheelfile', help='Wheel file')
    unpack_parser.set_defaults(func=unpack_f)

    def install_f(args):
        install(args.requirements, args.requirements_file,
                args.wheel_dirs, args.force, args.list_files)
    install_parser = s.add_parser('install', help='Install wheels')
    install_parser.add_argument('requirements', nargs='*',
                                help='Requirements to install.')
    install_parser.add_argument('--force', default=False,
                                action='store_true',
                                help='Install incompatible wheel files.')
    install_parser.add_argument('--wheel-dir', '-d', action='append',
                                dest='wheel_dirs',
                                help='Directories containing wheels.')
    install_parser.add_argument('--requirements-file', '-r',
                                help="A file containing requirements to "
                                     "install.")
    install_parser.add_argument('--list', '-l', default=False,
                                dest='list_files',
                                action='store_true',
                                help="List wheels which would be installed, "
                                     "but don't actually install anything.")
    install_parser.set_defaults(func=install_f)

    def install_scripts_f(args):
        install_scripts(args.distributions)
    install_scripts_parser = s.add_parser('install-scripts', help='Install console_scripts')
    install_scripts_parser.add_argument('distributions', nargs='*',
                                        help='Regenerate console_scripts for these distributions')
    install_scripts_parser.set_defaults(func=install_scripts_f)

    def convert_f(args):
        convert(args.installers, args.dest_dir, args.verbose)
    convert_parser = s.add_parser('convert', help='Convert egg or wininst to wheel')
    convert_parser.add_argument('installers', nargs='*', help='Installers to convert')
    convert_parser.add_argument('--dest-dir', '-d', default=os.path.curdir,
                                help="Directory to store wheels (default %(default)s)")
    convert_parser.add_argument('--verbose', '-v', action='store_true')
    convert_parser.set_defaults(func=convert_f)

    def version_f(args):
        from .. import __version__
        sys.stdout.write("wheel %s\n" % __version__)
    version_parser = s.add_parser('version', help='Print version and exit')
    version_parser.set_defaults(func=version_f)

    def help_f(args):
        p.print_help()
    help_parser = s.add_parser('help', help='Show this help')
    help_parser.set_defaults(func=help_f)

    return p


def main():
    p = parser()
    args = p.parse_args()
    if not hasattr(args, 'func'):
        p.print_help()
    else:
        # XXX on Python 3.3 we get 'args has no func' rather than short help.
        try:
            args.func(args)
            return 0
        except WheelError as e:
            sys.stderr.write(e.message + "\n")
    return 1
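The `parser()`/`main()` pair above uses the common argparse pattern of binding a handler to each subcommand via `set_defaults(func=...)` and dispatching through `args.func`. A minimal runnable sketch of the same idea (the version string is a placeholder):

```python
import argparse

def build_parser():
    p = argparse.ArgumentParser(prog='wheel')
    s = p.add_subparsers(help="commands")

    def version_f(args):
        return "wheel 0.0"  # placeholder, stands in for printing __version__

    version_parser = s.add_parser('version', help='Print version and exit')
    version_parser.set_defaults(func=version_f)
    return p

p = build_parser()
args = p.parse_args(['version'])
# The chosen subparser attached its handler to the namespace as args.func.
assert args.func(args) == "wheel 0.0"
# On Python 3, a missing subcommand leaves no 'func', hence the hasattr
# check in main() before dispatching.
assert not hasattr(p.parse_args([]), 'func')
```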
@@ -1,176 +0,0 @@
"""Utility functions."""

import base64
import hashlib
import json
import os
import sys
from collections import OrderedDict

__all__ = ['urlsafe_b64encode', 'urlsafe_b64decode', 'utf8',
           'to_json', 'from_json', 'matches_requirement']


# For encoding ascii back and forth between bytestrings, as is repeatedly
# necessary in JSON-based crypto under Python 3
if sys.version_info[0] < 3:
    text_type = unicode  # noqa: F821

    def native(s):
        return s
else:
    text_type = str

    def native(s):
        if isinstance(s, bytes):
            return s.decode('ascii')
        return s


def urlsafe_b64encode(data):
    """urlsafe_b64encode without padding"""
    return base64.urlsafe_b64encode(data).rstrip(binary('='))


def urlsafe_b64decode(data):
    """urlsafe_b64decode without padding"""
    pad = b'=' * (4 - (len(data) & 3))
    return base64.urlsafe_b64decode(data + pad)


def to_json(o):
    """Convert given data to JSON."""
    return json.dumps(o, sort_keys=True)


def from_json(j):
    """Decode a JSON payload."""
    return json.loads(j)


def open_for_csv(name, mode):
    if sys.version_info[0] < 3:
        nl = {}
        bin = 'b'
    else:
        nl = {'newline': ''}
        bin = ''

    return open(name, mode + bin, **nl)


def utf8(data):
    """Utf-8 encode data."""
    if isinstance(data, text_type):
        return data.encode('utf-8')
    return data


def binary(s):
    if isinstance(s, text_type):
        return s.encode('ascii')
    return s


class HashingFile(object):
    def __init__(self, path, mode, hashtype='sha256'):
        self.fd = open(path, mode)
        self.hashtype = hashtype
        self.hash = hashlib.new(hashtype)
        self.length = 0

    def write(self, data):
        self.hash.update(data)
        self.length += len(data)
        self.fd.write(data)

    def close(self):
        self.fd.close()

    def digest(self):
        if self.hashtype == 'md5':
            return self.hash.hexdigest()
        digest = self.hash.digest()
        return self.hashtype + '=' + native(urlsafe_b64encode(digest))

    def __enter__(self):
        return self

    def __exit__(self, exc_type, exc_val, exc_tb):
        self.fd.close()


class OrderedDefaultDict(OrderedDict):
    def __init__(self, *args, **kwargs):
        if not args:
            self.default_factory = None
        else:
            if not (args[0] is None or callable(args[0])):
                raise TypeError('first argument must be callable or None')
            self.default_factory = args[0]
            args = args[1:]
        super(OrderedDefaultDict, self).__init__(*args, **kwargs)

    def __missing__(self, key):
        if self.default_factory is None:
            raise KeyError(key)
        self[key] = default = self.default_factory()
        return default


if sys.platform == 'win32':
    import ctypes.wintypes
    # CSIDL_APPDATA for reference - not used here for compatibility with
    # dirspec, which uses LOCAL_APPDATA and COMMON_APPDATA in that order
    csidl = dict(CSIDL_APPDATA=26, CSIDL_LOCAL_APPDATA=28, CSIDL_COMMON_APPDATA=35)

    def get_path(name):
        SHGFP_TYPE_CURRENT = 0
        buf = ctypes.create_unicode_buffer(ctypes.wintypes.MAX_PATH)
        ctypes.windll.shell32.SHGetFolderPathW(0, csidl[name], 0, SHGFP_TYPE_CURRENT, buf)
        return buf.value

    def save_config_path(*resource):
        appdata = get_path("CSIDL_LOCAL_APPDATA")
        path = os.path.join(appdata, *resource)
        if not os.path.isdir(path):
            os.makedirs(path)
        return path

    def load_config_paths(*resource):
        ids = ["CSIDL_LOCAL_APPDATA", "CSIDL_COMMON_APPDATA"]
        for id in ids:
            base = get_path(id)
            path = os.path.join(base, *resource)
            if os.path.exists(path):
                yield path
else:
    def save_config_path(*resource):
        import xdg.BaseDirectory
        return xdg.BaseDirectory.save_config_path(*resource)

    def load_config_paths(*resource):
        import xdg.BaseDirectory
        return xdg.BaseDirectory.load_config_paths(*resource)


def matches_requirement(req, wheels):
    """List of wheels matching a requirement.

    :param req: The requirement to satisfy
    :param wheels: List of wheels to search.
    """
    try:
        from pkg_resources import Distribution, Requirement
    except ImportError:
        raise RuntimeError("Cannot use requirements without pkg_resources")

    req = Requirement.parse(req)

    selected = []
    for wf in wheels:
        f = wf.parsed_filename
        dist = Distribution(project_name=f.group("name"), version=f.group("ver"))
        if dist in req:
            selected.append(wf)
    return selected
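The unpadded base64 helpers above are used throughout the signing code; stripping the `=` padding is reversible because the missing pad length is recoverable from `len(data) % 4`. A round-trip check over a few byte strings:

```python
import base64

def urlsafe_b64encode(data):
    # Encode, then drop the trailing '=' padding.
    return base64.urlsafe_b64encode(data).rstrip(b'=')

def urlsafe_b64decode(data):
    # Re-add enough '=' to reach a multiple of 4 before decoding.
    pad = b'=' * (4 - (len(data) & 3))
    return base64.urlsafe_b64decode(data + pad)

for blob in (b'a', b'ab', b'abcd', b'\x00\xff' * 7):
    enc = urlsafe_b64encode(blob)
    assert b'=' not in enc
    assert urlsafe_b64decode(enc) == blob
```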
@@ -1,217 +0,0 @@
#!/usr/bin/env python
import distutils.dist
import os.path
import re
import sys
import tempfile
import zipfile
from argparse import ArgumentParser
from glob import iglob
from shutil import rmtree

import wheel.bdist_wheel
from wheel.archive import archive_wheelfile

egg_info_re = re.compile(r'''(^|/)(?P<name>[^/]+?)-(?P<ver>.+?)
    (-(?P<pyver>.+?))?(-(?P<arch>.+?))?.egg-info(/|$)''', re.VERBOSE)


def parse_info(wininfo_name, egginfo_name):
    """Extract metadata from filenames.

    Extracts the 4 metadata items needed (name, version, pyversion, arch)
    from the installer filename and the name of the egg-info directory
    embedded in the zipfile (if any).

    The egginfo filename has the format::

        name-ver(-pyver)(-arch).egg-info

    The installer filename has the format::

        name-ver.arch(-pyver).exe

    Some things to note:

    1. The installer filename is not definitive. An installer can be renamed
       and work perfectly well as an installer. So more reliable data should
       be used whenever possible.
    2. The egg-info data should be preferred for the name and version, because
       these come straight from the distutils metadata, and are mandatory.
    3. The pyver from the egg-info data should be ignored, as it is
       constructed from the version of Python used to build the installer,
       which is irrelevant - the installer filename is correct here (even to
       the point that when it's not there, any version is implied).
    4. The architecture must be taken from the installer filename, as it is
       not included in the egg-info data.
    5. Architecture-neutral installers still have an architecture because the
       installer format itself (being executable) is architecture-specific. We
       should therefore ignore the architecture if the content is pure-python.
    """

    egginfo = None
    if egginfo_name:
        egginfo = egg_info_re.search(egginfo_name)
        if not egginfo:
            raise ValueError("Egg info filename %s is not valid" % (egginfo_name,))

    # Parse the wininst filename
    # 1. Distribution name (up to the first '-')
    w_name, sep, rest = wininfo_name.partition('-')
    if not sep:
        raise ValueError("Installer filename %s is not valid" % (wininfo_name,))

    # Strip '.exe'
    rest = rest[:-4]
    # 2. Python version (from the last '-', must start with 'py')
    rest2, sep, w_pyver = rest.rpartition('-')
    if sep and w_pyver.startswith('py'):
        rest = rest2
        w_pyver = w_pyver.replace('.', '')
    else:
        # Not version specific - use py2.py3. While it is possible that
        # pure-Python code is not compatible with both Python 2 and 3, there
        # is no way of knowing from the wininst format, so we assume the best
        # here (the user can always manually rename the wheel to be more
        # restrictive if needed).
        w_pyver = 'py2.py3'
    # 3. Version and architecture
    w_ver, sep, w_arch = rest.rpartition('.')
    if not sep:
        raise ValueError("Installer filename %s is not valid" % (wininfo_name,))

    if egginfo:
        w_name = egginfo.group('name')
        w_ver = egginfo.group('ver')

    return dict(name=w_name, ver=w_ver, arch=w_arch, pyver=w_pyver)
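The wininst filename rules spelled out in the docstring above boil down to three string partitions. This trimmed, self-contained version keeps just the filename branch (no egg-info override); the sample filenames are hypothetical:

```python
def parse_wininst_name(wininfo_name):
    # 1. Distribution name: everything up to the first '-'.
    w_name, sep, rest = wininfo_name.partition('-')
    assert sep, "not a valid installer name"
    rest = rest[:-4]  # strip '.exe'
    # 2. Optional Python version: taken from the last '-', must start 'py'.
    rest2, sep, w_pyver = rest.rpartition('-')
    if sep and w_pyver.startswith('py'):
        rest = rest2
        w_pyver = w_pyver.replace('.', '')
    else:
        w_pyver = 'py2.py3'  # not version specific
    # 3. Version and architecture split on the last '.'.
    w_ver, sep, w_arch = rest.rpartition('.')
    return dict(name=w_name, ver=w_ver, arch=w_arch, pyver=w_pyver)

info = parse_wininst_name('beaglevote-1.0.0.win32-py2.7.exe')
assert info == dict(name='beaglevote', ver='1.0.0', arch='win32', pyver='py27')
assert parse_wininst_name('beaglevote-1.0.0.win-amd64.exe')['pyver'] == 'py2.py3'
```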
|
|
||||||
def bdist_wininst2wheel(path, dest_dir=os.path.curdir):
    bdw = zipfile.ZipFile(path)

    # Search for egg-info in the archive
    egginfo_name = None
    for filename in bdw.namelist():
        if '.egg-info' in filename:
            egginfo_name = filename
            break

    info = parse_info(os.path.basename(path), egginfo_name)

    root_is_purelib = True
    for zipinfo in bdw.infolist():
        if zipinfo.filename.startswith('PLATLIB'):
            root_is_purelib = False
            break
    if root_is_purelib:
        paths = {'purelib': ''}
    else:
        paths = {'platlib': ''}

    dist_info = "%(name)s-%(ver)s" % info
    datadir = "%s.data/" % dist_info

    # rewrite paths to trick ZipFile into extracting an egg
    # XXX grab wininst .ini - between .exe, padding, and first zip file.
    members = []
    egginfo_name = ''
    for zipinfo in bdw.infolist():
        key, basename = zipinfo.filename.split('/', 1)
        key = key.lower()
        basepath = paths.get(key, None)
        if basepath is None:
            basepath = datadir + key.lower() + '/'
        oldname = zipinfo.filename
        newname = basepath + basename
        zipinfo.filename = newname
        del bdw.NameToInfo[oldname]
        bdw.NameToInfo[newname] = zipinfo
        # Collect member names, but omit '' (from an entry like "PLATLIB/")
        if newname:
            members.append(newname)
        # Remember egg-info name for the egg2dist call below
        if not egginfo_name:
            if newname.endswith('.egg-info'):
                egginfo_name = newname
            elif '.egg-info/' in newname:
                egginfo_name, sep, _ = newname.rpartition('/')
    dir = tempfile.mkdtemp(suffix="_b2w")
    bdw.extractall(dir, members)

    # egg2wheel
    abi = 'none'
    pyver = info['pyver']
    arch = (info['arch'] or 'any').replace('.', '_').replace('-', '_')
    # Wininst installers always have arch even if they are not
    # architecture-specific (because the format itself is).
    # So, assume the content is architecture-neutral if root is purelib.
    if root_is_purelib:
        arch = 'any'
    # If the installer is architecture-specific, it's almost certainly also
    # CPython-specific.
    if arch != 'any':
        pyver = pyver.replace('py', 'cp')
    wheel_name = '-'.join((
        dist_info,
        pyver,
        abi,
        arch
    ))
    if root_is_purelib:
        bw = wheel.bdist_wheel.bdist_wheel(distutils.dist.Distribution())
    else:
        bw = _bdist_wheel_tag(distutils.dist.Distribution())

    bw.root_is_pure = root_is_purelib
    bw.python_tag = pyver
    bw.plat_name_supplied = True
    bw.plat_name = info['arch'] or 'any'

    if not root_is_purelib:
        bw.full_tag_supplied = True
        bw.full_tag = (pyver, abi, arch)

    dist_info_dir = os.path.join(dir, '%s.dist-info' % dist_info)
    bw.egg2dist(os.path.join(dir, egginfo_name), dist_info_dir)
    bw.write_wheelfile(dist_info_dir, generator='wininst2wheel')
    bw.write_record(dir, dist_info_dir)

    archive_wheelfile(os.path.join(dest_dir, wheel_name), dir)
    rmtree(dir)

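The tag normalization performed in bdist_wininst2wheel above is worth isolating: pure distributions are forced to the `any` platform tag, and architecture-specific installers are assumed to be CPython-specific, so their `py` python tag becomes `cp`. A standalone sketch of that logic (a hypothetical helper for illustration, not part of this module):

```python
def wheel_tags(pyver, arch, root_is_purelib):
    """Sketch of the tag normalization in bdist_wininst2wheel()."""
    abi = 'none'
    # normalize the platform string for use in a wheel filename
    arch = (arch or 'any').replace('.', '_').replace('-', '_')
    if root_is_purelib:
        # wininst installers always carry an arch; ignore it for pure dists
        arch = 'any'
    if arch != 'any':
        # an architecture-specific installer is almost certainly CPython-only
        pyver = pyver.replace('py', 'cp')
    return pyver, abi, arch
```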
class _bdist_wheel_tag(wheel.bdist_wheel.bdist_wheel):
    # allow the client to override the default generated wheel tag
    # The default bdist_wheel implementation uses python and abi tags
    # of the running python process. This is not suitable for
    # generating/repackaging prebuilt binaries.

    full_tag_supplied = False
    full_tag = None  # None or a (pytag, soabitag, plattag) triple

    def get_tag(self):
        if self.full_tag_supplied and self.full_tag is not None:
            return self.full_tag
        else:
            return super(_bdist_wheel_tag, self).get_tag()


def main():
    parser = ArgumentParser()
    parser.add_argument('installers', nargs='*', help="Installers to convert")
    parser.add_argument('--dest-dir', '-d', default=os.path.curdir,
                        help="Directory to store wheels (default %(default)s)")
    parser.add_argument('--verbose', '-v', action='store_true')
    args = parser.parse_args()
    for pat in args.installers:
        for installer in iglob(pat):
            if args.verbose:
                sys.stdout.write("{0}... ".format(installer))
            bdist_wininst2wheel(installer, args.dest_dir)
            if args.verbose:
                sys.stdout.write("OK\n")


if __name__ == "__main__":
    main()
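The member-renaming loop in bdist_wininst2wheel maps the wininst archive roots (PURELIB, PLATLIB, SCRIPTS, DATA, ...) onto the wheel layout: the purelib/platlib root becomes the package root, and everything else moves under the `<name>-<ver>.data/` directory. A standalone sketch of that mapping (hypothetical helper and sample paths, not part of this module):

```python
def remap_member(arcname, dist_info, root_is_purelib):
    """Sketch of the archive-path remapping in bdist_wininst2wheel()."""
    # the purelib/platlib root maps to the top of the wheel
    paths = {'purelib': ''} if root_is_purelib else {'platlib': ''}
    datadir = '%s.data/' % dist_info
    key, _, basename = arcname.partition('/')
    key = key.lower()
    basepath = paths.get(key)
    if basepath is None:
        # anything else (scripts, headers, data...) lands under .data/
        basepath = datadir + key + '/'
    return basepath + basename
```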
155 lib/python3.7/site-packages/SQLAlchemy-1.0.12.egg-info/PKG-INFO (Normal file)
@@ -0,0 +1,155 @@
Metadata-Version: 1.1
Name: SQLAlchemy
Version: 1.0.12
Summary: Database Abstraction Library
Home-page: http://www.sqlalchemy.org
Author: Mike Bayer
Author-email: mike_mp@zzzcomputing.com
License: MIT License
Description: SQLAlchemy
        ==========

        The Python SQL Toolkit and Object Relational Mapper

        Introduction
        -------------

        SQLAlchemy is the Python SQL toolkit and Object Relational Mapper
        that gives application developers the full power and
        flexibility of SQL. SQLAlchemy provides a full suite
        of well known enterprise-level persistence patterns,
        designed for efficient and high-performing database
        access, adapted into a simple and Pythonic domain
        language.

        Major SQLAlchemy features include:

        * An industrial strength ORM, built
          from the core on the identity map, unit of work,
          and data mapper patterns. These patterns
          allow transparent persistence of objects
          using a declarative configuration system.
          Domain models
          can be constructed and manipulated naturally,
          and changes are synchronized with the
          current transaction automatically.
        * A relationally-oriented query system, exposing
          the full range of SQL's capabilities
          explicitly, including joins, subqueries,
          correlation, and most everything else,
          in terms of the object model.
          Writing queries with the ORM uses the same
          techniques of relational composition you use
          when writing SQL. While you can drop into
          literal SQL at any time, it's virtually never
          needed.
        * A comprehensive and flexible system
          of eager loading for related collections and objects.
          Collections are cached within a session,
          and can be loaded on individual access, all
          at once using joins, or by query per collection
          across the full result set.
        * A Core SQL construction system and DBAPI
          interaction layer. The SQLAlchemy Core is
          separate from the ORM and is a full database
          abstraction layer in its own right, and includes
          an extensible Python-based SQL expression
          language, schema metadata, connection pooling,
          type coercion, and custom types.
        * All primary and foreign key constraints are
          assumed to be composite and natural. Surrogate
          integer primary keys are of course still the
          norm, but SQLAlchemy never assumes or hardcodes
          to this model.
        * Database introspection and generation. Database
          schemas can be "reflected" in one step into
          Python structures representing database metadata;
          those same structures can then generate
          CREATE statements right back out - all within
          the Core, independent of the ORM.

        SQLAlchemy's philosophy:

        * SQL databases behave less and less like object
          collections the more size and performance start to
          matter; object collections behave less and less like
          tables and rows the more abstraction starts to matter.
          SQLAlchemy aims to accommodate both of these
          principles.
        * An ORM doesn't need to hide the "R". A relational
          database provides rich, set-based functionality
          that should be fully exposed. SQLAlchemy's
          ORM provides an open-ended set of patterns
          that allow a developer to construct a custom
          mediation layer between a domain model and
          a relational schema, turning the so-called
          "object relational impedance" issue into
          a distant memory.
        * The developer, in all cases, makes all decisions
          regarding the design, structure, and naming conventions
          of both the object model as well as the relational
          schema. SQLAlchemy only provides the means
          to automate the execution of these decisions.
        * With SQLAlchemy, there's no such thing as
          "the ORM generated a bad query" - you
          retain full control over the structure of
          queries, including how joins are organized,
          how subqueries and correlation is used, what
          columns are requested. Everything SQLAlchemy
          does is ultimately the result of a developer-
          initiated decision.
        * Don't use an ORM if the problem doesn't need one.
          SQLAlchemy consists of a Core and separate ORM
          component. The Core offers a full SQL expression
          language that allows Pythonic construction
          of SQL constructs that render directly to SQL
          strings for a target database, returning
          result sets that are essentially enhanced DBAPI
          cursors.
        * Transactions should be the norm. With SQLAlchemy's
          ORM, nothing goes to permanent storage until
          commit() is called. SQLAlchemy encourages applications
          to create a consistent means of delineating
          the start and end of a series of operations.
        * Never render a literal value in a SQL statement.
          Bound parameters are used to the greatest degree
          possible, allowing query optimizers to cache
          query plans effectively and making SQL injection
          attacks a non-issue.

        Documentation
        -------------

        Latest documentation is at:

        http://www.sqlalchemy.org/docs/

        Installation / Requirements
        ---------------------------

        Full documentation for installation is at
        `Installation <http://www.sqlalchemy.org/docs/intro.html#installation>`_.

        Getting Help / Development / Bug reporting
        ------------------------------------------

        Please refer to the `SQLAlchemy Community Guide <http://www.sqlalchemy.org/support.html>`_.

        License
        -------

        SQLAlchemy is distributed under the `MIT license
        <http://www.opensource.org/licenses/mit-license.php>`_.

Platform: UNKNOWN
Classifier: Development Status :: 5 - Production/Stable
Classifier: Intended Audience :: Developers
Classifier: License :: OSI Approved :: MIT License
Classifier: Programming Language :: Python
Classifier: Programming Language :: Python :: 3
Classifier: Programming Language :: Python :: Implementation :: CPython
Classifier: Programming Language :: Python :: Implementation :: Jython
Classifier: Programming Language :: Python :: Implementation :: PyPy
Classifier: Topic :: Database :: Front-Ends
Classifier: Operating System :: OS Independent
@@ -0,0 +1,786 @@
AUTHORS
CHANGES
LICENSE
MANIFEST.in
README.dialects.rst
README.rst
README.unittests.rst
setup.cfg
setup.py
sqla_nose.py
tox.ini
doc/contents.html
doc/copyright.html
doc/genindex.html
doc/glossary.html
doc/index.html
doc/intro.html
doc/search.html
doc/searchindex.js
doc/_images/sqla_arch_small.png
doc/_images/sqla_engine_arch.png
doc/_modules/index.html
doc/_modules/examples/adjacency_list/adjacency_list.html
doc/_modules/examples/association/basic_association.html
doc/_modules/examples/association/dict_of_sets_with_default.html
doc/_modules/examples/association/proxied_association.html
doc/_modules/examples/custom_attributes/custom_management.html
doc/_modules/examples/custom_attributes/listen_for_events.html
doc/_modules/examples/dogpile_caching/advanced.html
doc/_modules/examples/dogpile_caching/caching_query.html
doc/_modules/examples/dogpile_caching/environment.html
doc/_modules/examples/dogpile_caching/fixture_data.html
doc/_modules/examples/dogpile_caching/helloworld.html
doc/_modules/examples/dogpile_caching/local_session_caching.html
doc/_modules/examples/dogpile_caching/model.html
doc/_modules/examples/dogpile_caching/relationship_caching.html
doc/_modules/examples/dynamic_dict/dynamic_dict.html
doc/_modules/examples/elementtree/adjacency_list.html
doc/_modules/examples/elementtree/optimized_al.html
doc/_modules/examples/elementtree/pickle.html
doc/_modules/examples/generic_associations/discriminator_on_association.html
doc/_modules/examples/generic_associations/generic_fk.html
doc/_modules/examples/generic_associations/table_per_association.html
doc/_modules/examples/generic_associations/table_per_related.html
doc/_modules/examples/graphs/directed_graph.html
doc/_modules/examples/inheritance/concrete.html
doc/_modules/examples/inheritance/joined.html
doc/_modules/examples/inheritance/single.html
doc/_modules/examples/join_conditions/cast.html
doc/_modules/examples/join_conditions/threeway.html
doc/_modules/examples/large_collection/large_collection.html
doc/_modules/examples/materialized_paths/materialized_paths.html
doc/_modules/examples/nested_sets/nested_sets.html
doc/_modules/examples/performance/__main__.html
doc/_modules/examples/performance/bulk_inserts.html
doc/_modules/examples/performance/bulk_updates.html
doc/_modules/examples/performance/large_resultsets.html
doc/_modules/examples/performance/short_selects.html
doc/_modules/examples/performance/single_inserts.html
doc/_modules/examples/postgis/postgis.html
doc/_modules/examples/sharding/attribute_shard.html
doc/_modules/examples/versioned_history/history_meta.html
doc/_modules/examples/versioned_history/test_versioning.html
doc/_modules/examples/versioned_rows/versioned_map.html
doc/_modules/examples/versioned_rows/versioned_rows.html
doc/_modules/examples/vertical/dictlike-polymorphic.html
doc/_modules/examples/vertical/dictlike.html
doc/_static/basic.css
doc/_static/changelog.css
doc/_static/comment-bright.png
doc/_static/comment-close.png
doc/_static/comment.png
doc/_static/detectmobile.js
doc/_static/docs.css
doc/_static/doctools.js
doc/_static/down-pressed.png
doc/_static/down.png
doc/_static/file.png
doc/_static/init.js
doc/_static/jquery-1.11.1.js
doc/_static/jquery.js
doc/_static/minus.png
doc/_static/plus.png
doc/_static/pygments.css
doc/_static/searchtools.js
doc/_static/sphinx_paramlinks.css
doc/_static/underscore-1.3.1.js
doc/_static/underscore.js
doc/_static/up-pressed.png
doc/_static/up.png
doc/_static/websupport.js
doc/build/Makefile
doc/build/conf.py
doc/build/contents.rst
doc/build/copyright.rst
doc/build/corrections.py
doc/build/glossary.rst
doc/build/index.rst
doc/build/intro.rst
doc/build/requirements.txt
doc/build/sqla_arch_small.png
doc/build/changelog/changelog_01.rst
doc/build/changelog/changelog_02.rst
doc/build/changelog/changelog_03.rst
doc/build/changelog/changelog_04.rst
doc/build/changelog/changelog_05.rst
doc/build/changelog/changelog_06.rst
doc/build/changelog/changelog_07.rst
doc/build/changelog/changelog_08.rst
doc/build/changelog/changelog_09.rst
doc/build/changelog/changelog_10.rst
doc/build/changelog/index.rst
doc/build/changelog/migration_04.rst
doc/build/changelog/migration_05.rst
doc/build/changelog/migration_06.rst
doc/build/changelog/migration_07.rst
doc/build/changelog/migration_08.rst
doc/build/changelog/migration_09.rst
doc/build/changelog/migration_10.rst
doc/build/core/api_basics.rst
doc/build/core/compiler.rst
doc/build/core/connections.rst
doc/build/core/constraints.rst
doc/build/core/custom_types.rst
doc/build/core/ddl.rst
doc/build/core/defaults.rst
doc/build/core/dml.rst
doc/build/core/engines.rst
doc/build/core/engines_connections.rst
doc/build/core/event.rst
doc/build/core/events.rst
doc/build/core/exceptions.rst
doc/build/core/expression_api.rst
doc/build/core/functions.rst
doc/build/core/index.rst
doc/build/core/inspection.rst
doc/build/core/interfaces.rst
doc/build/core/internals.rst
doc/build/core/metadata.rst
doc/build/core/pooling.rst
doc/build/core/reflection.rst
doc/build/core/schema.rst
doc/build/core/selectable.rst
doc/build/core/serializer.rst
doc/build/core/sqla_engine_arch.png
doc/build/core/sqlelement.rst
doc/build/core/tutorial.rst
doc/build/core/type_api.rst
doc/build/core/type_basics.rst
doc/build/core/types.rst
doc/build/dialects/firebird.rst
doc/build/dialects/index.rst
doc/build/dialects/mssql.rst
doc/build/dialects/mysql.rst
doc/build/dialects/oracle.rst
doc/build/dialects/postgresql.rst
doc/build/dialects/sqlite.rst
doc/build/dialects/sybase.rst
doc/build/faq/connections.rst
doc/build/faq/index.rst
doc/build/faq/metadata_schema.rst
doc/build/faq/ormconfiguration.rst
doc/build/faq/performance.rst
doc/build/faq/sessions.rst
doc/build/faq/sqlexpressions.rst
doc/build/orm/backref.rst
doc/build/orm/basic_relationships.rst
doc/build/orm/cascades.rst
doc/build/orm/classical.rst
doc/build/orm/collections.rst
doc/build/orm/composites.rst
doc/build/orm/constructors.rst
doc/build/orm/contextual.rst
doc/build/orm/deprecated.rst
doc/build/orm/events.rst
doc/build/orm/examples.rst
doc/build/orm/exceptions.rst
doc/build/orm/extending.rst
doc/build/orm/index.rst
doc/build/orm/inheritance.rst
doc/build/orm/internals.rst
doc/build/orm/join_conditions.rst
doc/build/orm/loading.rst
doc/build/orm/loading_columns.rst
doc/build/orm/loading_objects.rst
doc/build/orm/loading_relationships.rst
doc/build/orm/mapped_attributes.rst
doc/build/orm/mapped_sql_expr.rst
doc/build/orm/mapper_config.rst
doc/build/orm/mapping_api.rst
doc/build/orm/mapping_columns.rst
doc/build/orm/mapping_styles.rst
doc/build/orm/nonstandard_mappings.rst
doc/build/orm/persistence_techniques.rst
doc/build/orm/query.rst
doc/build/orm/relationship_api.rst
doc/build/orm/relationship_persistence.rst
doc/build/orm/relationships.rst
doc/build/orm/scalar_mapping.rst
doc/build/orm/self_referential.rst
doc/build/orm/session.rst
doc/build/orm/session_api.rst
doc/build/orm/session_basics.rst
doc/build/orm/session_events.rst
doc/build/orm/session_state_management.rst
doc/build/orm/session_transaction.rst
doc/build/orm/tutorial.rst
doc/build/orm/versioning.rst
doc/build/orm/extensions/associationproxy.rst
doc/build/orm/extensions/automap.rst
doc/build/orm/extensions/baked.rst
doc/build/orm/extensions/horizontal_shard.rst
doc/build/orm/extensions/hybrid.rst
doc/build/orm/extensions/index.rst
doc/build/orm/extensions/instrumentation.rst
doc/build/orm/extensions/mutable.rst
doc/build/orm/extensions/orderinglist.rst
doc/build/orm/extensions/declarative/api.rst
doc/build/orm/extensions/declarative/basic_use.rst
doc/build/orm/extensions/declarative/index.rst
doc/build/orm/extensions/declarative/inheritance.rst
doc/build/orm/extensions/declarative/mixins.rst
doc/build/orm/extensions/declarative/relationships.rst
doc/build/orm/extensions/declarative/table_config.rst
doc/build/texinputs/Makefile
doc/build/texinputs/sphinx.sty
doc/changelog/changelog_01.html
doc/changelog/changelog_02.html
doc/changelog/changelog_03.html
doc/changelog/changelog_04.html
doc/changelog/changelog_05.html
doc/changelog/changelog_06.html
doc/changelog/changelog_07.html
doc/changelog/changelog_08.html
doc/changelog/changelog_09.html
doc/changelog/changelog_10.html
doc/changelog/index.html
doc/changelog/migration_04.html
doc/changelog/migration_05.html
doc/changelog/migration_06.html
doc/changelog/migration_07.html
doc/changelog/migration_08.html
doc/changelog/migration_09.html
doc/changelog/migration_10.html
doc/core/api_basics.html
doc/core/compiler.html
doc/core/connections.html
doc/core/constraints.html
doc/core/custom_types.html
doc/core/ddl.html
doc/core/defaults.html
doc/core/dml.html
doc/core/engines.html
doc/core/engines_connections.html
doc/core/event.html
doc/core/events.html
doc/core/exceptions.html
doc/core/expression_api.html
doc/core/functions.html
doc/core/index.html
doc/core/inspection.html
doc/core/interfaces.html
doc/core/internals.html
doc/core/metadata.html
doc/core/pooling.html
doc/core/reflection.html
doc/core/schema.html
doc/core/selectable.html
doc/core/serializer.html
doc/core/sqlelement.html
doc/core/tutorial.html
doc/core/type_api.html
doc/core/type_basics.html
doc/core/types.html
doc/dialects/firebird.html
doc/dialects/index.html
doc/dialects/mssql.html
doc/dialects/mysql.html
doc/dialects/oracle.html
doc/dialects/postgresql.html
doc/dialects/sqlite.html
doc/dialects/sybase.html
doc/faq/connections.html
doc/faq/index.html
doc/faq/metadata_schema.html
doc/faq/ormconfiguration.html
doc/faq/performance.html
doc/faq/sessions.html
doc/faq/sqlexpressions.html
doc/orm/backref.html
doc/orm/basic_relationships.html
doc/orm/cascades.html
doc/orm/classical.html
doc/orm/collections.html
doc/orm/composites.html
doc/orm/constructors.html
doc/orm/contextual.html
doc/orm/deprecated.html
doc/orm/events.html
doc/orm/examples.html
doc/orm/exceptions.html
doc/orm/extending.html
doc/orm/index.html
doc/orm/inheritance.html
doc/orm/internals.html
doc/orm/join_conditions.html
doc/orm/loading.html
doc/orm/loading_columns.html
doc/orm/loading_objects.html
doc/orm/loading_relationships.html
doc/orm/mapped_attributes.html
doc/orm/mapped_sql_expr.html
doc/orm/mapper_config.html
doc/orm/mapping_api.html
doc/orm/mapping_columns.html
doc/orm/mapping_styles.html
doc/orm/nonstandard_mappings.html
doc/orm/persistence_techniques.html
doc/orm/query.html
doc/orm/relationship_api.html
doc/orm/relationship_persistence.html
doc/orm/relationships.html
doc/orm/scalar_mapping.html
doc/orm/self_referential.html
doc/orm/session.html
doc/orm/session_api.html
doc/orm/session_basics.html
doc/orm/session_events.html
doc/orm/session_state_management.html
doc/orm/session_transaction.html
doc/orm/tutorial.html
doc/orm/versioning.html
doc/orm/extensions/associationproxy.html
doc/orm/extensions/automap.html
doc/orm/extensions/baked.html
doc/orm/extensions/horizontal_shard.html
doc/orm/extensions/hybrid.html
doc/orm/extensions/index.html
doc/orm/extensions/instrumentation.html
doc/orm/extensions/mutable.html
doc/orm/extensions/orderinglist.html
doc/orm/extensions/declarative/api.html
doc/orm/extensions/declarative/basic_use.html
doc/orm/extensions/declarative/index.html
doc/orm/extensions/declarative/inheritance.html
doc/orm/extensions/declarative/mixins.html
doc/orm/extensions/declarative/relationships.html
doc/orm/extensions/declarative/table_config.html
examples/__init__.py
examples/adjacency_list/__init__.py
examples/adjacency_list/adjacency_list.py
examples/association/__init__.py
examples/association/basic_association.py
examples/association/dict_of_sets_with_default.py
examples/association/proxied_association.py
examples/custom_attributes/__init__.py
examples/custom_attributes/custom_management.py
examples/custom_attributes/listen_for_events.py
examples/dogpile_caching/__init__.py
examples/dogpile_caching/advanced.py
examples/dogpile_caching/caching_query.py
examples/dogpile_caching/environment.py
examples/dogpile_caching/fixture_data.py
examples/dogpile_caching/helloworld.py
examples/dogpile_caching/local_session_caching.py
examples/dogpile_caching/model.py
examples/dogpile_caching/relationship_caching.py
examples/dynamic_dict/__init__.py
examples/dynamic_dict/dynamic_dict.py
examples/elementtree/__init__.py
examples/elementtree/adjacency_list.py
examples/elementtree/optimized_al.py
examples/elementtree/pickle.py
examples/elementtree/test.xml
examples/elementtree/test2.xml
examples/elementtree/test3.xml
examples/generic_associations/__init__.py
examples/generic_associations/discriminator_on_association.py
examples/generic_associations/generic_fk.py
examples/generic_associations/table_per_association.py
examples/generic_associations/table_per_related.py
examples/graphs/__init__.py
examples/graphs/directed_graph.py
examples/inheritance/__init__.py
examples/inheritance/concrete.py
examples/inheritance/joined.py
examples/inheritance/single.py
examples/join_conditions/__init__.py
examples/join_conditions/cast.py
examples/join_conditions/threeway.py
examples/large_collection/__init__.py
examples/large_collection/large_collection.py
examples/materialized_paths/__init__.py
examples/materialized_paths/materialized_paths.py
examples/nested_sets/__init__.py
examples/nested_sets/nested_sets.py
examples/performance/__init__.py
examples/performance/__main__.py
examples/performance/bulk_inserts.py
examples/performance/bulk_updates.py
examples/performance/large_resultsets.py
examples/performance/short_selects.py
examples/performance/single_inserts.py
examples/postgis/__init__.py
examples/postgis/postgis.py
examples/sharding/__init__.py
examples/sharding/attribute_shard.py
examples/versioned_history/__init__.py
examples/versioned_history/history_meta.py
examples/versioned_history/test_versioning.py
examples/versioned_rows/__init__.py
examples/versioned_rows/versioned_map.py
examples/versioned_rows/versioned_rows.py
examples/vertical/__init__.py
examples/vertical/dictlike-polymorphic.py
examples/vertical/dictlike.py
lib/SQLAlchemy.egg-info/PKG-INFO
lib/SQLAlchemy.egg-info/SOURCES.txt
lib/SQLAlchemy.egg-info/dependency_links.txt
lib/SQLAlchemy.egg-info/top_level.txt
lib/sqlalchemy/__init__.py
lib/sqlalchemy/events.py
lib/sqlalchemy/exc.py
lib/sqlalchemy/inspection.py
lib/sqlalchemy/interfaces.py
lib/sqlalchemy/log.py
lib/sqlalchemy/pool.py
lib/sqlalchemy/processors.py
lib/sqlalchemy/schema.py
lib/sqlalchemy/types.py
lib/sqlalchemy/cextension/processors.c
lib/sqlalchemy/cextension/resultproxy.c
lib/sqlalchemy/cextension/utils.c
lib/sqlalchemy/connectors/__init__.py
lib/sqlalchemy/connectors/mxodbc.py
lib/sqlalchemy/connectors/pyodbc.py
lib/sqlalchemy/connectors/zxJDBC.py
lib/sqlalchemy/databases/__init__.py
lib/sqlalchemy/dialects/__init__.py
lib/sqlalchemy/dialects/postgres.py
lib/sqlalchemy/dialects/type_migration_guidelines.txt
lib/sqlalchemy/dialects/firebird/__init__.py
lib/sqlalchemy/dialects/firebird/base.py
lib/sqlalchemy/dialects/firebird/fdb.py
lib/sqlalchemy/dialects/firebird/kinterbasdb.py
lib/sqlalchemy/dialects/mssql/__init__.py
lib/sqlalchemy/dialects/mssql/adodbapi.py
lib/sqlalchemy/dialects/mssql/base.py
lib/sqlalchemy/dialects/mssql/information_schema.py
lib/sqlalchemy/dialects/mssql/mxodbc.py
lib/sqlalchemy/dialects/mssql/pymssql.py
lib/sqlalchemy/dialects/mssql/pyodbc.py
lib/sqlalchemy/dialects/mssql/zxjdbc.py
lib/sqlalchemy/dialects/mysql/__init__.py
lib/sqlalchemy/dialects/mysql/base.py
lib/sqlalchemy/dialects/mysql/cymysql.py
lib/sqlalchemy/dialects/mysql/gaerdbms.py
lib/sqlalchemy/dialects/mysql/mysqlconnector.py
lib/sqlalchemy/dialects/mysql/mysqldb.py
lib/sqlalchemy/dialects/mysql/oursql.py
lib/sqlalchemy/dialects/mysql/pymysql.py
lib/sqlalchemy/dialects/mysql/pyodbc.py
lib/sqlalchemy/dialects/mysql/zxjdbc.py
lib/sqlalchemy/dialects/oracle/__init__.py
lib/sqlalchemy/dialects/oracle/base.py
lib/sqlalchemy/dialects/oracle/cx_oracle.py
lib/sqlalchemy/dialects/oracle/zxjdbc.py
lib/sqlalchemy/dialects/postgresql/__init__.py
lib/sqlalchemy/dialects/postgresql/base.py
lib/sqlalchemy/dialects/postgresql/constraints.py
lib/sqlalchemy/dialects/postgresql/hstore.py
lib/sqlalchemy/dialects/postgresql/json.py
lib/sqlalchemy/dialects/postgresql/pg8000.py
lib/sqlalchemy/dialects/postgresql/psycopg2.py
lib/sqlalchemy/dialects/postgresql/psycopg2cffi.py
lib/sqlalchemy/dialects/postgresql/pypostgresql.py
lib/sqlalchemy/dialects/postgresql/ranges.py
lib/sqlalchemy/dialects/postgresql/zxjdbc.py
lib/sqlalchemy/dialects/sqlite/__init__.py
lib/sqlalchemy/dialects/sqlite/base.py
lib/sqlalchemy/dialects/sqlite/pysqlcipher.py
lib/sqlalchemy/dialects/sqlite/pysqlite.py
lib/sqlalchemy/dialects/sybase/__init__.py
lib/sqlalchemy/dialects/sybase/base.py
lib/sqlalchemy/dialects/sybase/mxodbc.py
lib/sqlalchemy/dialects/sybase/pyodbc.py
lib/sqlalchemy/dialects/sybase/pysybase.py
lib/sqlalchemy/engine/__init__.py
lib/sqlalchemy/engine/base.py
lib/sqlalchemy/engine/default.py
lib/sqlalchemy/engine/interfaces.py
lib/sqlalchemy/engine/reflection.py
lib/sqlalchemy/engine/result.py
lib/sqlalchemy/engine/strategies.py
lib/sqlalchemy/engine/threadlocal.py
lib/sqlalchemy/engine/url.py
lib/sqlalchemy/engine/util.py
lib/sqlalchemy/event/__init__.py
lib/sqlalchemy/event/api.py
lib/sqlalchemy/event/attr.py
lib/sqlalchemy/event/base.py
lib/sqlalchemy/event/legacy.py
lib/sqlalchemy/event/registry.py
lib/sqlalchemy/ext/__init__.py
lib/sqlalchemy/ext/associationproxy.py
lib/sqlalchemy/ext/automap.py
lib/sqlalchemy/ext/baked.py
lib/sqlalchemy/ext/compiler.py
lib/sqlalchemy/ext/horizontal_shard.py
lib/sqlalchemy/ext/hybrid.py
lib/sqlalchemy/ext/instrumentation.py
lib/sqlalchemy/ext/mutable.py
lib/sqlalchemy/ext/orderinglist.py
lib/sqlalchemy/ext/serializer.py
|
||||||
|
lib/sqlalchemy/ext/declarative/__init__.py
|
||||||
|
lib/sqlalchemy/ext/declarative/api.py
|
||||||
|
lib/sqlalchemy/ext/declarative/base.py
|
||||||
|
lib/sqlalchemy/ext/declarative/clsregistry.py
|
||||||
|
lib/sqlalchemy/orm/__init__.py
|
||||||
|
lib/sqlalchemy/orm/attributes.py
|
||||||
|
lib/sqlalchemy/orm/base.py
|
||||||
|
lib/sqlalchemy/orm/collections.py
|
||||||
|
lib/sqlalchemy/orm/dependency.py
|
||||||
|
lib/sqlalchemy/orm/deprecated_interfaces.py
|
||||||
|
lib/sqlalchemy/orm/descriptor_props.py
|
||||||
|
lib/sqlalchemy/orm/dynamic.py
|
||||||
|
lib/sqlalchemy/orm/evaluator.py
|
||||||
|
lib/sqlalchemy/orm/events.py
|
||||||
|
lib/sqlalchemy/orm/exc.py
|
||||||
|
lib/sqlalchemy/orm/identity.py
|
||||||
|
lib/sqlalchemy/orm/instrumentation.py
|
||||||
|
lib/sqlalchemy/orm/interfaces.py
|
||||||
|
lib/sqlalchemy/orm/loading.py
|
||||||
|
lib/sqlalchemy/orm/mapper.py
|
||||||
|
lib/sqlalchemy/orm/path_registry.py
|
||||||
|
lib/sqlalchemy/orm/persistence.py
|
||||||
|
lib/sqlalchemy/orm/properties.py
|
||||||
|
lib/sqlalchemy/orm/query.py
|
||||||
|
lib/sqlalchemy/orm/relationships.py
|
||||||
|
lib/sqlalchemy/orm/scoping.py
|
||||||
|
lib/sqlalchemy/orm/session.py
|
||||||
|
lib/sqlalchemy/orm/state.py
|
||||||
|
lib/sqlalchemy/orm/strategies.py
|
||||||
|
lib/sqlalchemy/orm/strategy_options.py
|
||||||
|
lib/sqlalchemy/orm/sync.py
|
||||||
|
lib/sqlalchemy/orm/unitofwork.py
|
||||||
|
lib/sqlalchemy/orm/util.py
|
||||||
|
lib/sqlalchemy/sql/__init__.py
|
||||||
|
lib/sqlalchemy/sql/annotation.py
|
||||||
|
lib/sqlalchemy/sql/base.py
|
||||||
|
lib/sqlalchemy/sql/compiler.py
|
||||||
|
lib/sqlalchemy/sql/crud.py
|
||||||
|
lib/sqlalchemy/sql/ddl.py
|
||||||
|
lib/sqlalchemy/sql/default_comparator.py
|
||||||
|
lib/sqlalchemy/sql/dml.py
|
||||||
|
lib/sqlalchemy/sql/elements.py
|
||||||
|
lib/sqlalchemy/sql/expression.py
|
||||||
|
lib/sqlalchemy/sql/functions.py
|
||||||
|
lib/sqlalchemy/sql/naming.py
|
||||||
|
lib/sqlalchemy/sql/operators.py
|
||||||
|
lib/sqlalchemy/sql/schema.py
|
||||||
|
lib/sqlalchemy/sql/selectable.py
|
||||||
|
lib/sqlalchemy/sql/sqltypes.py
|
||||||
|
lib/sqlalchemy/sql/type_api.py
|
||||||
|
lib/sqlalchemy/sql/util.py
|
||||||
|
lib/sqlalchemy/sql/visitors.py
|
||||||
|
lib/sqlalchemy/testing/__init__.py
|
||||||
|
lib/sqlalchemy/testing/assertions.py
|
||||||
|
lib/sqlalchemy/testing/assertsql.py
|
||||||
|
lib/sqlalchemy/testing/config.py
|
||||||
|
lib/sqlalchemy/testing/distutils_run.py
|
||||||
|
lib/sqlalchemy/testing/engines.py
|
||||||
|
lib/sqlalchemy/testing/entities.py
|
||||||
|
lib/sqlalchemy/testing/exclusions.py
|
||||||
|
lib/sqlalchemy/testing/fixtures.py
|
||||||
|
lib/sqlalchemy/testing/mock.py
|
||||||
|
lib/sqlalchemy/testing/pickleable.py
|
||||||
|
lib/sqlalchemy/testing/profiling.py
|
||||||
|
lib/sqlalchemy/testing/provision.py
|
||||||
|
lib/sqlalchemy/testing/replay_fixture.py
|
||||||
|
lib/sqlalchemy/testing/requirements.py
|
||||||
|
lib/sqlalchemy/testing/runner.py
|
||||||
|
lib/sqlalchemy/testing/schema.py
|
||||||
|
lib/sqlalchemy/testing/util.py
|
||||||
|
lib/sqlalchemy/testing/warnings.py
|
||||||
|
lib/sqlalchemy/testing/plugin/__init__.py
|
||||||
|
lib/sqlalchemy/testing/plugin/bootstrap.py
|
||||||
|
lib/sqlalchemy/testing/plugin/noseplugin.py
|
||||||
|
lib/sqlalchemy/testing/plugin/plugin_base.py
|
||||||
|
lib/sqlalchemy/testing/plugin/pytestplugin.py
|
||||||
|
lib/sqlalchemy/testing/suite/__init__.py
|
||||||
|
lib/sqlalchemy/testing/suite/test_ddl.py
|
||||||
|
lib/sqlalchemy/testing/suite/test_dialect.py
|
||||||
|
lib/sqlalchemy/testing/suite/test_insert.py
|
||||||
|
lib/sqlalchemy/testing/suite/test_reflection.py
|
||||||
|
lib/sqlalchemy/testing/suite/test_results.py
|
||||||
|
lib/sqlalchemy/testing/suite/test_select.py
|
||||||
|
lib/sqlalchemy/testing/suite/test_sequence.py
|
||||||
|
lib/sqlalchemy/testing/suite/test_types.py
|
||||||
|
lib/sqlalchemy/testing/suite/test_update_delete.py
|
||||||
|
lib/sqlalchemy/util/__init__.py
|
||||||
|
lib/sqlalchemy/util/_collections.py
|
||||||
|
lib/sqlalchemy/util/compat.py
|
||||||
|
lib/sqlalchemy/util/deprecations.py
|
||||||
|
lib/sqlalchemy/util/langhelpers.py
|
||||||
|
lib/sqlalchemy/util/queue.py
|
||||||
|
lib/sqlalchemy/util/topological.py
|
||||||
|
test/__init__.py
|
||||||
|
test/binary_data_one.dat
|
||||||
|
test/binary_data_two.dat
|
||||||
|
test/conftest.py
|
||||||
|
test/requirements.py
|
||||||
|
test/aaa_profiling/__init__.py
|
||||||
|
test/aaa_profiling/test_compiler.py
|
||||||
|
test/aaa_profiling/test_memusage.py
|
||||||
|
test/aaa_profiling/test_orm.py
|
||||||
|
test/aaa_profiling/test_pool.py
|
||||||
|
test/aaa_profiling/test_resultset.py
|
||||||
|
test/aaa_profiling/test_zoomark.py
|
||||||
|
test/aaa_profiling/test_zoomark_orm.py
|
||||||
|
test/base/__init__.py
|
||||||
|
test/base/test_dependency.py
|
||||||
|
test/base/test_events.py
|
||||||
|
test/base/test_except.py
|
||||||
|
test/base/test_inspect.py
|
||||||
|
test/base/test_tutorials.py
|
||||||
|
test/base/test_utils.py
|
||||||
|
test/dialect/__init__.py
|
||||||
|
test/dialect/test_firebird.py
|
||||||
|
test/dialect/test_mxodbc.py
|
||||||
|
test/dialect/test_oracle.py
|
||||||
|
test/dialect/test_pyodbc.py
|
||||||
|
test/dialect/test_sqlite.py
|
||||||
|
test/dialect/test_suite.py
|
||||||
|
test/dialect/test_sybase.py
|
||||||
|
test/dialect/mssql/__init__.py
|
||||||
|
test/dialect/mssql/test_compiler.py
|
||||||
|
test/dialect/mssql/test_engine.py
|
||||||
|
test/dialect/mssql/test_query.py
|
||||||
|
test/dialect/mssql/test_reflection.py
|
||||||
|
test/dialect/mssql/test_types.py
|
||||||
|
test/dialect/mysql/__init__.py
|
||||||
|
test/dialect/mysql/test_compiler.py
|
||||||
|
test/dialect/mysql/test_dialect.py
|
||||||
|
test/dialect/mysql/test_query.py
|
||||||
|
test/dialect/mysql/test_reflection.py
|
||||||
|
test/dialect/mysql/test_types.py
|
||||||
|
test/dialect/postgresql/__init__.py
|
||||||
|
test/dialect/postgresql/test_compiler.py
|
||||||
|
test/dialect/postgresql/test_dialect.py
|
||||||
|
test/dialect/postgresql/test_query.py
|
||||||
|
test/dialect/postgresql/test_reflection.py
|
||||||
|
test/dialect/postgresql/test_types.py
|
||||||
|
test/engine/__init__.py
|
||||||
|
test/engine/test_bind.py
|
||||||
|
test/engine/test_ddlevents.py
|
||||||
|
test/engine/test_execute.py
|
||||||
|
test/engine/test_logging.py
|
||||||
|
test/engine/test_parseconnect.py
|
||||||
|
test/engine/test_pool.py
|
||||||
|
test/engine/test_processors.py
|
||||||
|
test/engine/test_reconnect.py
|
||||||
|
test/engine/test_reflection.py
|
||||||
|
test/engine/test_transaction.py
|
||||||
|
test/ext/__init__.py
|
||||||
|
test/ext/test_associationproxy.py
|
||||||
|
test/ext/test_automap.py
|
||||||
|
test/ext/test_baked.py
|
||||||
|
test/ext/test_compiler.py
|
||||||
|
test/ext/test_extendedattr.py
|
||||||
|
test/ext/test_horizontal_shard.py
|
||||||
|
test/ext/test_hybrid.py
|
||||||
|
test/ext/test_mutable.py
|
||||||
|
test/ext/test_orderinglist.py
|
||||||
|
test/ext/test_serializer.py
|
||||||
|
test/ext/declarative/__init__.py
|
||||||
|
test/ext/declarative/test_basic.py
|
||||||
|
test/ext/declarative/test_clsregistry.py
|
||||||
|
test/ext/declarative/test_inheritance.py
|
||||||
|
test/ext/declarative/test_mixin.py
|
||||||
|
test/ext/declarative/test_reflection.py
|
||||||
|
test/orm/__init__.py
|
||||||
|
test/orm/_fixtures.py
|
||||||
|
test/orm/test_association.py
|
||||||
|
test/orm/test_assorted_eager.py
|
||||||
|
test/orm/test_attributes.py
|
||||||
|
test/orm/test_backref_mutations.py
|
||||||
|
test/orm/test_bind.py
|
||||||
|
test/orm/test_bulk.py
|
||||||
|
test/orm/test_bundle.py
|
||||||
|
test/orm/test_cascade.py
|
||||||
|
test/orm/test_collection.py
|
||||||
|
test/orm/test_compile.py
|
||||||
|
test/orm/test_composites.py
|
||||||
|
test/orm/test_cycles.py
|
||||||
|
test/orm/test_default_strategies.py
|
||||||
|
test/orm/test_defaults.py
|
||||||
|
test/orm/test_deferred.py
|
||||||
|
test/orm/test_deprecations.py
|
||||||
|
test/orm/test_descriptor.py
|
||||||
|
test/orm/test_dynamic.py
|
||||||
|
test/orm/test_eager_relations.py
|
||||||
|
test/orm/test_evaluator.py
|
||||||
|
test/orm/test_events.py
|
||||||
|
test/orm/test_expire.py
|
||||||
|
test/orm/test_froms.py
|
||||||
|
test/orm/test_generative.py
|
||||||
|
test/orm/test_hasparent.py
|
||||||
|
test/orm/test_immediate_load.py
|
||||||
|
test/orm/test_inspect.py
|
||||||
|
test/orm/test_instrumentation.py
|
||||||
|
test/orm/test_joins.py
|
||||||
|
test/orm/test_lazy_relations.py
|
||||||
|
test/orm/test_load_on_fks.py
|
||||||
|
test/orm/test_loading.py
|
||||||
|
test/orm/test_lockmode.py
|
||||||
|
test/orm/test_manytomany.py
|
||||||
|
test/orm/test_mapper.py
|
||||||
|
test/orm/test_merge.py
|
||||||
|
test/orm/test_naturalpks.py
|
||||||
|
test/orm/test_of_type.py
|
||||||
|
test/orm/test_onetoone.py
|
||||||
|
test/orm/test_options.py
|
||||||
|
test/orm/test_pickled.py
|
||||||
|
test/orm/test_query.py
|
||||||
|
test/orm/test_rel_fn.py
|
||||||
|
test/orm/test_relationships.py
|
||||||
|
test/orm/test_scoping.py
|
||||||
|
test/orm/test_selectable.py
|
||||||
|
test/orm/test_session.py
|
||||||
|
test/orm/test_subquery_relations.py
|
||||||
|
test/orm/test_sync.py
|
||||||
|
test/orm/test_transaction.py
|
||||||
|
test/orm/test_unitofwork.py
|
||||||
|
test/orm/test_unitofworkv2.py
|
||||||
|
test/orm/test_update_delete.py
|
||||||
|
test/orm/test_utils.py
|
||||||
|
test/orm/test_validators.py
|
||||||
|
test/orm/test_versioning.py
|
||||||
|
test/orm/inheritance/__init__.py
|
||||||
|
test/orm/inheritance/_poly_fixtures.py
|
||||||
|
test/orm/inheritance/test_abc_inheritance.py
|
||||||
|
test/orm/inheritance/test_abc_polymorphic.py
|
||||||
|
test/orm/inheritance/test_assorted_poly.py
|
||||||
|
test/orm/inheritance/test_basic.py
|
||||||
|
test/orm/inheritance/test_concrete.py
|
||||||
|
test/orm/inheritance/test_magazine.py
|
||||||
|
test/orm/inheritance/test_manytomany.py
|
||||||
|
test/orm/inheritance/test_poly_linked_list.py
|
||||||
|
test/orm/inheritance/test_poly_persistence.py
|
||||||
|
test/orm/inheritance/test_polymorphic_rel.py
|
||||||
|
test/orm/inheritance/test_productspec.py
|
||||||
|
test/orm/inheritance/test_relationship.py
|
||||||
|
test/orm/inheritance/test_selects.py
|
||||||
|
test/orm/inheritance/test_single.py
|
||||||
|
test/orm/inheritance/test_with_poly.py
|
||||||
|
test/perf/invalidate_stresstest.py
|
||||||
|
test/perf/orm2010.py
|
||||||
|
test/sql/__init__.py
|
||||||
|
test/sql/test_case_statement.py
|
||||||
|
test/sql/test_compiler.py
|
||||||
|
test/sql/test_constraints.py
|
||||||
|
test/sql/test_cte.py
|
||||||
|
test/sql/test_ddlemit.py
|
||||||
|
test/sql/test_defaults.py
|
||||||
|
test/sql/test_delete.py
|
||||||
|
test/sql/test_functions.py
|
||||||
|
test/sql/test_generative.py
|
||||||
|
test/sql/test_insert.py
|
||||||
|
test/sql/test_insert_exec.py
|
||||||
|
test/sql/test_inspect.py
|
||||||
|
test/sql/test_join_rewriting.py
|
||||||
|
test/sql/test_labels.py
|
||||||
|
test/sql/test_metadata.py
|
||||||
|
test/sql/test_operators.py
|
||||||
|
test/sql/test_query.py
|
||||||
|
test/sql/test_quote.py
|
||||||
|
test/sql/test_resultset.py
|
||||||
|
test/sql/test_returning.py
|
||||||
|
test/sql/test_rowcount.py
|
||||||
|
test/sql/test_selectable.py
|
||||||
|
test/sql/test_text.py
|
||||||
|
test/sql/test_type_expressions.py
|
||||||
|
test/sql/test_types.py
|
||||||
|
test/sql/test_unicode.py
|
||||||
|
test/sql/test_update.py
|
|
@@ -0,0 +1,373 @@
../sqlalchemy/__init__.py
../sqlalchemy/__pycache__/__init__.cpython-37.pyc
../sqlalchemy/__pycache__/events.cpython-37.pyc
../sqlalchemy/__pycache__/exc.cpython-37.pyc
../sqlalchemy/__pycache__/inspection.cpython-37.pyc
../sqlalchemy/__pycache__/interfaces.cpython-37.pyc
../sqlalchemy/__pycache__/log.cpython-37.pyc
../sqlalchemy/__pycache__/pool.cpython-37.pyc
../sqlalchemy/__pycache__/processors.cpython-37.pyc
../sqlalchemy/__pycache__/schema.cpython-37.pyc
../sqlalchemy/__pycache__/types.cpython-37.pyc
../sqlalchemy/connectors/__init__.py
../sqlalchemy/connectors/__pycache__/__init__.cpython-37.pyc
../sqlalchemy/connectors/__pycache__/mxodbc.cpython-37.pyc
../sqlalchemy/connectors/__pycache__/pyodbc.cpython-37.pyc
../sqlalchemy/connectors/__pycache__/zxJDBC.cpython-37.pyc
../sqlalchemy/connectors/mxodbc.py
../sqlalchemy/connectors/pyodbc.py
../sqlalchemy/connectors/zxJDBC.py
../sqlalchemy/cprocessors.cpython-37m-x86_64-linux-gnu.so
../sqlalchemy/cresultproxy.cpython-37m-x86_64-linux-gnu.so
../sqlalchemy/cutils.cpython-37m-x86_64-linux-gnu.so
../sqlalchemy/databases/__init__.py
../sqlalchemy/databases/__pycache__/__init__.cpython-37.pyc
../sqlalchemy/dialects/__init__.py
../sqlalchemy/dialects/__pycache__/__init__.cpython-37.pyc
../sqlalchemy/dialects/__pycache__/postgres.cpython-37.pyc
../sqlalchemy/dialects/firebird/__init__.py
../sqlalchemy/dialects/firebird/__pycache__/__init__.cpython-37.pyc
../sqlalchemy/dialects/firebird/__pycache__/base.cpython-37.pyc
../sqlalchemy/dialects/firebird/__pycache__/fdb.cpython-37.pyc
../sqlalchemy/dialects/firebird/__pycache__/kinterbasdb.cpython-37.pyc
../sqlalchemy/dialects/firebird/base.py
../sqlalchemy/dialects/firebird/fdb.py
../sqlalchemy/dialects/firebird/kinterbasdb.py
../sqlalchemy/dialects/mssql/__init__.py
../sqlalchemy/dialects/mssql/__pycache__/__init__.cpython-37.pyc
../sqlalchemy/dialects/mssql/__pycache__/adodbapi.cpython-37.pyc
../sqlalchemy/dialects/mssql/__pycache__/base.cpython-37.pyc
../sqlalchemy/dialects/mssql/__pycache__/information_schema.cpython-37.pyc
../sqlalchemy/dialects/mssql/__pycache__/mxodbc.cpython-37.pyc
../sqlalchemy/dialects/mssql/__pycache__/pymssql.cpython-37.pyc
../sqlalchemy/dialects/mssql/__pycache__/pyodbc.cpython-37.pyc
../sqlalchemy/dialects/mssql/__pycache__/zxjdbc.cpython-37.pyc
../sqlalchemy/dialects/mssql/adodbapi.py
../sqlalchemy/dialects/mssql/base.py
../sqlalchemy/dialects/mssql/information_schema.py
../sqlalchemy/dialects/mssql/mxodbc.py
../sqlalchemy/dialects/mssql/pymssql.py
../sqlalchemy/dialects/mssql/pyodbc.py
../sqlalchemy/dialects/mssql/zxjdbc.py
../sqlalchemy/dialects/mysql/__init__.py
../sqlalchemy/dialects/mysql/__pycache__/__init__.cpython-37.pyc
../sqlalchemy/dialects/mysql/__pycache__/base.cpython-37.pyc
../sqlalchemy/dialects/mysql/__pycache__/cymysql.cpython-37.pyc
../sqlalchemy/dialects/mysql/__pycache__/gaerdbms.cpython-37.pyc
../sqlalchemy/dialects/mysql/__pycache__/mysqlconnector.cpython-37.pyc
../sqlalchemy/dialects/mysql/__pycache__/mysqldb.cpython-37.pyc
../sqlalchemy/dialects/mysql/__pycache__/oursql.cpython-37.pyc
../sqlalchemy/dialects/mysql/__pycache__/pymysql.cpython-37.pyc
../sqlalchemy/dialects/mysql/__pycache__/pyodbc.cpython-37.pyc
../sqlalchemy/dialects/mysql/__pycache__/zxjdbc.cpython-37.pyc
../sqlalchemy/dialects/mysql/base.py
../sqlalchemy/dialects/mysql/cymysql.py
../sqlalchemy/dialects/mysql/gaerdbms.py
../sqlalchemy/dialects/mysql/mysqlconnector.py
../sqlalchemy/dialects/mysql/mysqldb.py
../sqlalchemy/dialects/mysql/oursql.py
../sqlalchemy/dialects/mysql/pymysql.py
../sqlalchemy/dialects/mysql/pyodbc.py
../sqlalchemy/dialects/mysql/zxjdbc.py
../sqlalchemy/dialects/oracle/__init__.py
../sqlalchemy/dialects/oracle/__pycache__/__init__.cpython-37.pyc
../sqlalchemy/dialects/oracle/__pycache__/base.cpython-37.pyc
../sqlalchemy/dialects/oracle/__pycache__/cx_oracle.cpython-37.pyc
../sqlalchemy/dialects/oracle/__pycache__/zxjdbc.cpython-37.pyc
../sqlalchemy/dialects/oracle/base.py
../sqlalchemy/dialects/oracle/cx_oracle.py
../sqlalchemy/dialects/oracle/zxjdbc.py
../sqlalchemy/dialects/postgres.py
../sqlalchemy/dialects/postgresql/__init__.py
../sqlalchemy/dialects/postgresql/__pycache__/__init__.cpython-37.pyc
../sqlalchemy/dialects/postgresql/__pycache__/base.cpython-37.pyc
../sqlalchemy/dialects/postgresql/__pycache__/constraints.cpython-37.pyc
../sqlalchemy/dialects/postgresql/__pycache__/hstore.cpython-37.pyc
../sqlalchemy/dialects/postgresql/__pycache__/json.cpython-37.pyc
../sqlalchemy/dialects/postgresql/__pycache__/pg8000.cpython-37.pyc
../sqlalchemy/dialects/postgresql/__pycache__/psycopg2.cpython-37.pyc
../sqlalchemy/dialects/postgresql/__pycache__/psycopg2cffi.cpython-37.pyc
../sqlalchemy/dialects/postgresql/__pycache__/pypostgresql.cpython-37.pyc
../sqlalchemy/dialects/postgresql/__pycache__/ranges.cpython-37.pyc
../sqlalchemy/dialects/postgresql/__pycache__/zxjdbc.cpython-37.pyc
../sqlalchemy/dialects/postgresql/base.py
../sqlalchemy/dialects/postgresql/constraints.py
../sqlalchemy/dialects/postgresql/hstore.py
../sqlalchemy/dialects/postgresql/json.py
../sqlalchemy/dialects/postgresql/pg8000.py
../sqlalchemy/dialects/postgresql/psycopg2.py
../sqlalchemy/dialects/postgresql/psycopg2cffi.py
../sqlalchemy/dialects/postgresql/pypostgresql.py
../sqlalchemy/dialects/postgresql/ranges.py
../sqlalchemy/dialects/postgresql/zxjdbc.py
../sqlalchemy/dialects/sqlite/__init__.py
../sqlalchemy/dialects/sqlite/__pycache__/__init__.cpython-37.pyc
../sqlalchemy/dialects/sqlite/__pycache__/base.cpython-37.pyc
../sqlalchemy/dialects/sqlite/__pycache__/pysqlcipher.cpython-37.pyc
../sqlalchemy/dialects/sqlite/__pycache__/pysqlite.cpython-37.pyc
../sqlalchemy/dialects/sqlite/base.py
../sqlalchemy/dialects/sqlite/pysqlcipher.py
../sqlalchemy/dialects/sqlite/pysqlite.py
../sqlalchemy/dialects/sybase/__init__.py
../sqlalchemy/dialects/sybase/__pycache__/__init__.cpython-37.pyc
../sqlalchemy/dialects/sybase/__pycache__/base.cpython-37.pyc
../sqlalchemy/dialects/sybase/__pycache__/mxodbc.cpython-37.pyc
../sqlalchemy/dialects/sybase/__pycache__/pyodbc.cpython-37.pyc
../sqlalchemy/dialects/sybase/__pycache__/pysybase.cpython-37.pyc
../sqlalchemy/dialects/sybase/base.py
../sqlalchemy/dialects/sybase/mxodbc.py
../sqlalchemy/dialects/sybase/pyodbc.py
../sqlalchemy/dialects/sybase/pysybase.py
../sqlalchemy/engine/__init__.py
../sqlalchemy/engine/__pycache__/__init__.cpython-37.pyc
../sqlalchemy/engine/__pycache__/base.cpython-37.pyc
../sqlalchemy/engine/__pycache__/default.cpython-37.pyc
../sqlalchemy/engine/__pycache__/interfaces.cpython-37.pyc
../sqlalchemy/engine/__pycache__/reflection.cpython-37.pyc
../sqlalchemy/engine/__pycache__/result.cpython-37.pyc
../sqlalchemy/engine/__pycache__/strategies.cpython-37.pyc
../sqlalchemy/engine/__pycache__/threadlocal.cpython-37.pyc
../sqlalchemy/engine/__pycache__/url.cpython-37.pyc
../sqlalchemy/engine/__pycache__/util.cpython-37.pyc
../sqlalchemy/engine/base.py
../sqlalchemy/engine/default.py
../sqlalchemy/engine/interfaces.py
../sqlalchemy/engine/reflection.py
../sqlalchemy/engine/result.py
../sqlalchemy/engine/strategies.py
../sqlalchemy/engine/threadlocal.py
../sqlalchemy/engine/url.py
../sqlalchemy/engine/util.py
../sqlalchemy/event/__init__.py
../sqlalchemy/event/__pycache__/__init__.cpython-37.pyc
../sqlalchemy/event/__pycache__/api.cpython-37.pyc
../sqlalchemy/event/__pycache__/attr.cpython-37.pyc
../sqlalchemy/event/__pycache__/base.cpython-37.pyc
../sqlalchemy/event/__pycache__/legacy.cpython-37.pyc
../sqlalchemy/event/__pycache__/registry.cpython-37.pyc
../sqlalchemy/event/api.py
../sqlalchemy/event/attr.py
../sqlalchemy/event/base.py
../sqlalchemy/event/legacy.py
../sqlalchemy/event/registry.py
../sqlalchemy/events.py
../sqlalchemy/exc.py
../sqlalchemy/ext/__init__.py
../sqlalchemy/ext/__pycache__/__init__.cpython-37.pyc
../sqlalchemy/ext/__pycache__/associationproxy.cpython-37.pyc
../sqlalchemy/ext/__pycache__/automap.cpython-37.pyc
../sqlalchemy/ext/__pycache__/baked.cpython-37.pyc
../sqlalchemy/ext/__pycache__/compiler.cpython-37.pyc
../sqlalchemy/ext/__pycache__/horizontal_shard.cpython-37.pyc
../sqlalchemy/ext/__pycache__/hybrid.cpython-37.pyc
../sqlalchemy/ext/__pycache__/instrumentation.cpython-37.pyc
../sqlalchemy/ext/__pycache__/mutable.cpython-37.pyc
../sqlalchemy/ext/__pycache__/orderinglist.cpython-37.pyc
../sqlalchemy/ext/__pycache__/serializer.cpython-37.pyc
../sqlalchemy/ext/associationproxy.py
../sqlalchemy/ext/automap.py
../sqlalchemy/ext/baked.py
../sqlalchemy/ext/compiler.py
../sqlalchemy/ext/declarative/__init__.py
../sqlalchemy/ext/declarative/__pycache__/__init__.cpython-37.pyc
../sqlalchemy/ext/declarative/__pycache__/api.cpython-37.pyc
../sqlalchemy/ext/declarative/__pycache__/base.cpython-37.pyc
../sqlalchemy/ext/declarative/__pycache__/clsregistry.cpython-37.pyc
../sqlalchemy/ext/declarative/api.py
../sqlalchemy/ext/declarative/base.py
../sqlalchemy/ext/declarative/clsregistry.py
../sqlalchemy/ext/horizontal_shard.py
../sqlalchemy/ext/hybrid.py
../sqlalchemy/ext/instrumentation.py
../sqlalchemy/ext/mutable.py
../sqlalchemy/ext/orderinglist.py
../sqlalchemy/ext/serializer.py
../sqlalchemy/inspection.py
../sqlalchemy/interfaces.py
../sqlalchemy/log.py
../sqlalchemy/orm/__init__.py
../sqlalchemy/orm/__pycache__/__init__.cpython-37.pyc
../sqlalchemy/orm/__pycache__/attributes.cpython-37.pyc
../sqlalchemy/orm/__pycache__/base.cpython-37.pyc
../sqlalchemy/orm/__pycache__/collections.cpython-37.pyc
../sqlalchemy/orm/__pycache__/dependency.cpython-37.pyc
../sqlalchemy/orm/__pycache__/deprecated_interfaces.cpython-37.pyc
../sqlalchemy/orm/__pycache__/descriptor_props.cpython-37.pyc
../sqlalchemy/orm/__pycache__/dynamic.cpython-37.pyc
../sqlalchemy/orm/__pycache__/evaluator.cpython-37.pyc
../sqlalchemy/orm/__pycache__/events.cpython-37.pyc
../sqlalchemy/orm/__pycache__/exc.cpython-37.pyc
../sqlalchemy/orm/__pycache__/identity.cpython-37.pyc
../sqlalchemy/orm/__pycache__/instrumentation.cpython-37.pyc
../sqlalchemy/orm/__pycache__/interfaces.cpython-37.pyc
../sqlalchemy/orm/__pycache__/loading.cpython-37.pyc
../sqlalchemy/orm/__pycache__/mapper.cpython-37.pyc
../sqlalchemy/orm/__pycache__/path_registry.cpython-37.pyc
../sqlalchemy/orm/__pycache__/persistence.cpython-37.pyc
../sqlalchemy/orm/__pycache__/properties.cpython-37.pyc
../sqlalchemy/orm/__pycache__/query.cpython-37.pyc
../sqlalchemy/orm/__pycache__/relationships.cpython-37.pyc
../sqlalchemy/orm/__pycache__/scoping.cpython-37.pyc
../sqlalchemy/orm/__pycache__/session.cpython-37.pyc
../sqlalchemy/orm/__pycache__/state.cpython-37.pyc
../sqlalchemy/orm/__pycache__/strategies.cpython-37.pyc
../sqlalchemy/orm/__pycache__/strategy_options.cpython-37.pyc
../sqlalchemy/orm/__pycache__/sync.cpython-37.pyc
../sqlalchemy/orm/__pycache__/unitofwork.cpython-37.pyc
../sqlalchemy/orm/__pycache__/util.cpython-37.pyc
../sqlalchemy/orm/attributes.py
../sqlalchemy/orm/base.py
../sqlalchemy/orm/collections.py
../sqlalchemy/orm/dependency.py
../sqlalchemy/orm/deprecated_interfaces.py
../sqlalchemy/orm/descriptor_props.py
../sqlalchemy/orm/dynamic.py
../sqlalchemy/orm/evaluator.py
../sqlalchemy/orm/events.py
../sqlalchemy/orm/exc.py
../sqlalchemy/orm/identity.py
../sqlalchemy/orm/instrumentation.py
../sqlalchemy/orm/interfaces.py
../sqlalchemy/orm/loading.py
../sqlalchemy/orm/mapper.py
../sqlalchemy/orm/path_registry.py
../sqlalchemy/orm/persistence.py
../sqlalchemy/orm/properties.py
../sqlalchemy/orm/query.py
../sqlalchemy/orm/relationships.py
../sqlalchemy/orm/scoping.py
../sqlalchemy/orm/session.py
../sqlalchemy/orm/state.py
../sqlalchemy/orm/strategies.py
../sqlalchemy/orm/strategy_options.py
../sqlalchemy/orm/sync.py
../sqlalchemy/orm/unitofwork.py
../sqlalchemy/orm/util.py
../sqlalchemy/pool.py
../sqlalchemy/processors.py
../sqlalchemy/schema.py
../sqlalchemy/sql/__init__.py
../sqlalchemy/sql/__pycache__/__init__.cpython-37.pyc
../sqlalchemy/sql/__pycache__/annotation.cpython-37.pyc
../sqlalchemy/sql/__pycache__/base.cpython-37.pyc
../sqlalchemy/sql/__pycache__/compiler.cpython-37.pyc
../sqlalchemy/sql/__pycache__/crud.cpython-37.pyc
../sqlalchemy/sql/__pycache__/ddl.cpython-37.pyc
../sqlalchemy/sql/__pycache__/default_comparator.cpython-37.pyc
../sqlalchemy/sql/__pycache__/dml.cpython-37.pyc
../sqlalchemy/sql/__pycache__/elements.cpython-37.pyc
../sqlalchemy/sql/__pycache__/expression.cpython-37.pyc
../sqlalchemy/sql/__pycache__/functions.cpython-37.pyc
../sqlalchemy/sql/__pycache__/naming.cpython-37.pyc
../sqlalchemy/sql/__pycache__/operators.cpython-37.pyc
../sqlalchemy/sql/__pycache__/schema.cpython-37.pyc
../sqlalchemy/sql/__pycache__/selectable.cpython-37.pyc
../sqlalchemy/sql/__pycache__/sqltypes.cpython-37.pyc
../sqlalchemy/sql/__pycache__/type_api.cpython-37.pyc
../sqlalchemy/sql/__pycache__/util.cpython-37.pyc
../sqlalchemy/sql/__pycache__/visitors.cpython-37.pyc
../sqlalchemy/sql/annotation.py
../sqlalchemy/sql/base.py
../sqlalchemy/sql/compiler.py
../sqlalchemy/sql/crud.py
../sqlalchemy/sql/ddl.py
../sqlalchemy/sql/default_comparator.py
../sqlalchemy/sql/dml.py
../sqlalchemy/sql/elements.py
../sqlalchemy/sql/expression.py
../sqlalchemy/sql/functions.py
../sqlalchemy/sql/naming.py
../sqlalchemy/sql/operators.py
../sqlalchemy/sql/schema.py
../sqlalchemy/sql/selectable.py
../sqlalchemy/sql/sqltypes.py
../sqlalchemy/sql/type_api.py
../sqlalchemy/sql/util.py
../sqlalchemy/sql/visitors.py
../sqlalchemy/testing/__init__.py
../sqlalchemy/testing/__pycache__/__init__.cpython-37.pyc
../sqlalchemy/testing/__pycache__/assertions.cpython-37.pyc
../sqlalchemy/testing/__pycache__/assertsql.cpython-37.pyc
../sqlalchemy/testing/__pycache__/config.cpython-37.pyc
../sqlalchemy/testing/__pycache__/distutils_run.cpython-37.pyc
../sqlalchemy/testing/__pycache__/engines.cpython-37.pyc
../sqlalchemy/testing/__pycache__/entities.cpython-37.pyc
../sqlalchemy/testing/__pycache__/exclusions.cpython-37.pyc
../sqlalchemy/testing/__pycache__/fixtures.cpython-37.pyc
../sqlalchemy/testing/__pycache__/mock.cpython-37.pyc
../sqlalchemy/testing/__pycache__/pickleable.cpython-37.pyc
../sqlalchemy/testing/__pycache__/profiling.cpython-37.pyc
../sqlalchemy/testing/__pycache__/provision.cpython-37.pyc
../sqlalchemy/testing/__pycache__/replay_fixture.cpython-37.pyc
../sqlalchemy/testing/__pycache__/requirements.cpython-37.pyc
../sqlalchemy/testing/__pycache__/runner.cpython-37.pyc
../sqlalchemy/testing/__pycache__/schema.cpython-37.pyc
../sqlalchemy/testing/__pycache__/util.cpython-37.pyc
../sqlalchemy/testing/__pycache__/warnings.cpython-37.pyc
../sqlalchemy/testing/assertions.py
../sqlalchemy/testing/assertsql.py
../sqlalchemy/testing/config.py
../sqlalchemy/testing/distutils_run.py
../sqlalchemy/testing/engines.py
../sqlalchemy/testing/entities.py
../sqlalchemy/testing/exclusions.py
../sqlalchemy/testing/fixtures.py
../sqlalchemy/testing/mock.py
../sqlalchemy/testing/pickleable.py
../sqlalchemy/testing/plugin/__init__.py
../sqlalchemy/testing/plugin/__pycache__/__init__.cpython-37.pyc
../sqlalchemy/testing/plugin/__pycache__/bootstrap.cpython-37.pyc
../sqlalchemy/testing/plugin/__pycache__/noseplugin.cpython-37.pyc
../sqlalchemy/testing/plugin/__pycache__/plugin_base.cpython-37.pyc
../sqlalchemy/testing/plugin/__pycache__/pytestplugin.cpython-37.pyc
../sqlalchemy/testing/plugin/bootstrap.py
../sqlalchemy/testing/plugin/noseplugin.py
../sqlalchemy/testing/plugin/plugin_base.py
../sqlalchemy/testing/plugin/pytestplugin.py
../sqlalchemy/testing/profiling.py
../sqlalchemy/testing/provision.py
../sqlalchemy/testing/replay_fixture.py
../sqlalchemy/testing/requirements.py
../sqlalchemy/testing/runner.py
../sqlalchemy/testing/schema.py
../sqlalchemy/testing/suite/__init__.py
../sqlalchemy/testing/suite/__pycache__/__init__.cpython-37.pyc
../sqlalchemy/testing/suite/__pycache__/test_ddl.cpython-37.pyc
../sqlalchemy/testing/suite/__pycache__/test_dialect.cpython-37.pyc
../sqlalchemy/testing/suite/__pycache__/test_insert.cpython-37.pyc
../sqlalchemy/testing/suite/__pycache__/test_reflection.cpython-37.pyc
../sqlalchemy/testing/suite/__pycache__/test_results.cpython-37.pyc
../sqlalchemy/testing/suite/__pycache__/test_select.cpython-37.pyc
../sqlalchemy/testing/suite/__pycache__/test_sequence.cpython-37.pyc
../sqlalchemy/testing/suite/__pycache__/test_types.cpython-37.pyc
../sqlalchemy/testing/suite/__pycache__/test_update_delete.cpython-37.pyc
../sqlalchemy/testing/suite/test_ddl.py
../sqlalchemy/testing/suite/test_dialect.py
../sqlalchemy/testing/suite/test_insert.py
../sqlalchemy/testing/suite/test_reflection.py
../sqlalchemy/testing/suite/test_results.py
../sqlalchemy/testing/suite/test_select.py
../sqlalchemy/testing/suite/test_sequence.py
|
||||||
|
../sqlalchemy/testing/suite/test_types.py
|
||||||
|
../sqlalchemy/testing/suite/test_update_delete.py
|
||||||
|
../sqlalchemy/testing/util.py
|
||||||
|
../sqlalchemy/testing/warnings.py
|
||||||
|
../sqlalchemy/types.py
|
||||||
|
../sqlalchemy/util/__init__.py
|
||||||
|
../sqlalchemy/util/__pycache__/__init__.cpython-37.pyc
|
||||||
|
../sqlalchemy/util/__pycache__/_collections.cpython-37.pyc
|
||||||
|
../sqlalchemy/util/__pycache__/compat.cpython-37.pyc
|
||||||
|
../sqlalchemy/util/__pycache__/deprecations.cpython-37.pyc
|
||||||
|
../sqlalchemy/util/__pycache__/langhelpers.cpython-37.pyc
|
||||||
|
../sqlalchemy/util/__pycache__/queue.cpython-37.pyc
|
||||||
|
../sqlalchemy/util/__pycache__/topological.cpython-37.pyc
|
||||||
|
../sqlalchemy/util/_collections.py
|
||||||
|
../sqlalchemy/util/compat.py
|
||||||
|
../sqlalchemy/util/deprecations.py
|
||||||
|
../sqlalchemy/util/langhelpers.py
|
||||||
|
../sqlalchemy/util/queue.py
|
||||||
|
../sqlalchemy/util/topological.py
|
||||||
|
PKG-INFO
|
||||||
|
SOURCES.txt
|
||||||
|
dependency_links.txt
|
||||||
|
top_level.txt
|
Some files were not shown because too many files have changed in this diff.