Compare commits
No commits in common. "384f9a7e08c16d7044371b1cd779e02ea6088070" and "e9dacd2bf392f04b5634f7611def55c074d76817" have entirely different histories.
384f9a7e08 ... e9dacd2bf3
524 changed files with 12746 additions and 22485 deletions
README.md (16 changes)
@@ -1,17 +1,3 @@
to update you need:

apt install python3-pip python3-venv python3.5 python3.5-dev python3.6 python3.6-dev

# TOR update

wget https://www.torproject.org/dist/torbrowser/8.0.4/tor-browser-linux64-8.0.4_en-US.tar.xz
tar xvf tor-browser-linux64-8.0.4_en-US.tar.xz

rm -r tor
mv tor-browser_en-US/Browser/TorBrowser/Tor tor
mv tor/libstdc++/libstdc++.so.6 tor/
rmdir tor/libstdc++/
rm -r tor/PluggableTransports
rm -r tor-browser_en-US

apt install python3-pip python3.4 python3.5 python3.4-dev python3.5-dev python3.6 python3.6-dev
lib/python3.4/site-packages/SQLAlchemy-1.0.12.dist-info/DESCRIPTION.rst (new file, 137 lines)
@@ -0,0 +1,137 @@
SQLAlchemy
==========

The Python SQL Toolkit and Object Relational Mapper

Introduction
------------

SQLAlchemy is the Python SQL toolkit and Object Relational Mapper that gives application developers the full power and flexibility of SQL. SQLAlchemy provides a full suite of well known enterprise-level persistence patterns, designed for efficient and high-performing database access, adapted into a simple and Pythonic domain language.

Major SQLAlchemy features include:

* An industrial strength ORM, built from the core on the identity map, unit of work, and data mapper patterns. These patterns allow transparent persistence of objects using a declarative configuration system. Domain models can be constructed and manipulated naturally, and changes are synchronized with the current transaction automatically.
* A relationally-oriented query system, exposing the full range of SQL's capabilities explicitly, including joins, subqueries, correlation, and most everything else, in terms of the object model. Writing queries with the ORM uses the same techniques of relational composition you use when writing SQL. While you can drop into literal SQL at any time, it's virtually never needed.
* A comprehensive and flexible system of eager loading for related collections and objects. Collections are cached within a session, and can be loaded on individual access, all at once using joins, or by query per collection across the full result set.
* A Core SQL construction system and DBAPI interaction layer. The SQLAlchemy Core is separate from the ORM and is a full database abstraction layer in its own right, and includes an extensible Python-based SQL expression language, schema metadata, connection pooling, type coercion, and custom types.
* All primary and foreign key constraints are assumed to be composite and natural. Surrogate integer primary keys are of course still the norm, but SQLAlchemy never assumes or hardcodes to this model.
* Database introspection and generation. Database schemas can be "reflected" in one step into Python structures representing database metadata; those same structures can then generate CREATE statements right back out - all within the Core, independent of the ORM.
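A minimal sketch of the declarative ORM style described in the first bullet (an editorial illustration, not part of the upstream README; the ``User`` model and the in-memory SQLite URL are invented for the example)::

  # Sketch only: a declarative model, a unit-of-work session, and a query,
  # written against the SQLAlchemy 1.0-era API vendored here.
  from sqlalchemy import create_engine, Column, Integer, String
  from sqlalchemy.ext.declarative import declarative_base
  from sqlalchemy.orm import sessionmaker

  Base = declarative_base()

  class User(Base):
      __tablename__ = 'users'
      id = Column(Integer, primary_key=True)
      name = Column(String(50))

  engine = create_engine('sqlite:///:memory:')
  Base.metadata.create_all(engine)     # generate CREATE statements from metadata
  Session = sessionmaker(bind=engine)
  session = Session()

  session.add(User(name='alice'))      # tracked by the unit of work
  session.commit()                     # flushed and committed in one transaction

  print(session.query(User).filter(User.name == 'alice').count())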
SQLAlchemy's philosophy:

* SQL databases behave less and less like object collections the more size and performance start to matter; object collections behave less and less like tables and rows the more abstraction starts to matter. SQLAlchemy aims to accommodate both of these principles.
* An ORM doesn't need to hide the "R". A relational database provides rich, set-based functionality that should be fully exposed. SQLAlchemy's ORM provides an open-ended set of patterns that allow a developer to construct a custom mediation layer between a domain model and a relational schema, turning the so-called "object relational impedance" issue into a distant memory.
* The developer, in all cases, makes all decisions regarding the design, structure, and naming conventions of both the object model as well as the relational schema. SQLAlchemy only provides the means to automate the execution of these decisions.
* With SQLAlchemy, there's no such thing as "the ORM generated a bad query" - you retain full control over the structure of queries, including how joins are organized, how subqueries and correlation is used, what columns are requested. Everything SQLAlchemy does is ultimately the result of a developer-initiated decision.
* Don't use an ORM if the problem doesn't need one. SQLAlchemy consists of a Core and separate ORM component. The Core offers a full SQL expression language that allows Pythonic construction of SQL constructs that render directly to SQL strings for a target database, returning result sets that are essentially enhanced DBAPI cursors.
* Transactions should be the norm. With SQLAlchemy's ORM, nothing goes to permanent storage until commit() is called. SQLAlchemy encourages applications to create a consistent means of delineating the start and end of a series of operations.
* Never render a literal value in a SQL statement. Bound parameters are used to the greatest degree possible, allowing query optimizers to cache query plans effectively and making SQL injection attacks a non-issue.
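The bound-parameter rule in the last point is easy to see in the Core; this short sketch (editorial, with an invented table) prints the placeholder form of a statement rather than any literal value::

  # Sketch only: values travel as bound parameters, never rendered into SQL.
  from sqlalchemy import (create_engine, MetaData, Table, Column,
                          Integer, String, select)

  engine = create_engine('sqlite:///:memory:')
  metadata = MetaData()
  users = Table('users', metadata,
                Column('id', Integer, primary_key=True),
                Column('name', String(50)))
  metadata.create_all(engine)

  stmt = users.insert().values(name='alice')
  print(stmt)             # INSERT INTO users (name) VALUES (:name)
  engine.execute(stmt)    # 'alice' is passed separately to the DBAPI
  print(engine.execute(select([users])).fetchall())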
Documentation
-------------

Latest documentation is at:

http://www.sqlalchemy.org/docs/

Installation / Requirements
---------------------------

Full documentation for installation is at `Installation <http://www.sqlalchemy.org/docs/intro.html#installation>`_.

Getting Help / Development / Bug reporting
------------------------------------------

Please refer to the `SQLAlchemy Community Guide <http://www.sqlalchemy.org/support.html>`_.

License
-------

SQLAlchemy is distributed under the `MIT license <http://www.opensource.org/licenses/mit-license.php>`_.
lib/python3.4/site-packages/SQLAlchemy-1.0.12.dist-info/METADATA (new file, 158 lines)
@@ -0,0 +1,158 @@
Metadata-Version: 2.0
Name: SQLAlchemy
Version: 1.0.12
Summary: Database Abstraction Library
Home-page: http://www.sqlalchemy.org
Author: Mike Bayer
Author-email: mike_mp@zzzcomputing.com
License: MIT License
Description-Content-Type: UNKNOWN
Platform: UNKNOWN
Classifier: Development Status :: 5 - Production/Stable
Classifier: Intended Audience :: Developers
Classifier: License :: OSI Approved :: MIT License
Classifier: Programming Language :: Python
Classifier: Programming Language :: Python :: 3
Classifier: Programming Language :: Python :: Implementation :: CPython
Classifier: Programming Language :: Python :: Implementation :: Jython
Classifier: Programming Language :: Python :: Implementation :: PyPy
Classifier: Topic :: Database :: Front-Ends
Classifier: Operating System :: OS Independent
(The remainder of METADATA repeats the DESCRIPTION.rst text reproduced above.)
lib/python3.4/site-packages/SQLAlchemy-1.0.12.dist-info/RECORD (new file, 376 lines)
@@ -0,0 +1,376 @@
SQLAlchemy-1.0.12.dist-info/DESCRIPTION.rst,sha256=ZN8fj2owI_rw0Emr3_RXqoNfTFkThjiZy7xcCzg1W_g,5013
SQLAlchemy-1.0.12.dist-info/METADATA,sha256=xCBLJSNub29eg_Bm-fHTUT_al-Sr8jh38ztUF4_s1so,5820
SQLAlchemy-1.0.12.dist-info/RECORD,,
SQLAlchemy-1.0.12.dist-info/WHEEL,sha256=AEztX7vHDtcgysb-4-5-DyIKMLIPg6NMxY9dXTRdoXQ,104
SQLAlchemy-1.0.12.dist-info/metadata.json,sha256=QT7EcApgL9QrRqR1YIngngveBNd13H8h-oNK9fsxj0U,1004
SQLAlchemy-1.0.12.dist-info/top_level.txt,sha256=rp-ZgB7D8G11ivXON5VGPjupT1voYmWqkciDt5Uaw_Q,11
sqlalchemy/__init__.py,sha256=fTurvwmGkoRt_zdwxoZNWTHg6VdzvBpeHyPmUnexOK4,2112
sqlalchemy/cprocessors.cpython-34m.so,sha256=hvG3A0r4VO9gevdsLGZYRdqNfG2rahDIFUqJ-fUxAB4,52136
sqlalchemy/cresultproxy.cpython-34m.so,sha256=piAFu3JE3mOaKpNSg6vcu8jGTl_-X6elUDWS2h_YOfQ,61504
sqlalchemy/cutils.cpython-34m.so,sha256=-ARQsTXx0XDzghnRNCwdaxm2eeIn2TuEqoU_Wb18h6E,34312
sqlalchemy/events.py,sha256=j8yref-XfuJxkPKbvnZmB4jeUAIujPcbLAzD2cKV4f4,43944
sqlalchemy/exc.py,sha256=NhA5R5nDdducWkp0MXtlQ0-Q6iF_rhqkHWblIfuSYGk,11706
sqlalchemy/inspection.py,sha256=zMa-2nt-OQ0Op1dqq0Z2XCnpdAFSTkqif5Kdi8Wz8AU,3093
sqlalchemy/interfaces.py,sha256=XSx5y-HittAzc79lU4C7rPbTtSW_Hc2c89NqCy50tsQ,10967
sqlalchemy/log.py,sha256=opX7UORq5N6_jWxN9aHX9OpiirwAcRA0qq-u5m4SMkQ,6712
sqlalchemy/pool.py,sha256=-F51TIJYl0XGTV2_sdpV8C1m0jTTQaq0nAezdmSgr84,47220
sqlalchemy/processors.py,sha256=Li1kdC-I0v03JxeOz4V7u4HAevK6LledyCPvaL06mYc,5220
sqlalchemy/schema.py,sha256=rZzZJJ8dT9trLSYknFpHm0N1kRERYwhqHH3QD31SJjc,1182
sqlalchemy/types.py,sha256=qcoy5xKaurDV4kaXr489GL2sz8FKkWX21Us3ZCqeasg,1650
sqlalchemy/connectors/__init__.py,sha256=97YbriYu5mcljh7opc1JOScRlf3Tk8ldbn5urBVm4WY,278
sqlalchemy/connectors/mxodbc.py,sha256=-0iqw2k8e-o3OkAKzoCWuAaEPxlEjslvfRM9hnVXENM,5348
sqlalchemy/connectors/pyodbc.py,sha256=pG2yf3cEDtTr-w_m4to6jF5l8hZk6MJv69K3cg84NfY,6264
sqlalchemy/connectors/zxJDBC.py,sha256=2KK_sVSgMsdW0ufZqAwgXjd1FsMb4hqbiUQRAkM0RYg,1868
sqlalchemy/databases/__init__.py,sha256=BaQyAuMjXNpZYV47hCseHrDtPzTfSw-iqUQYxMWJddw,817
sqlalchemy/dialects/__init__.py,sha256=7SMul8PL3gkbJRUwAwovHLae5qBBApRF-VcRwU-VtdU,1012
sqlalchemy/dialects/postgres.py,sha256=heNVHys6E91DIBepXT3ls_4_6N8HTTahrZ49W5IR3M0,614
sqlalchemy/dialects/firebird/__init__.py,sha256=QYmQ0SaGfq3YjDraCV9ALwqVW5A3KDUF0F6air_qp3Q,664
sqlalchemy/dialects/firebird/base.py,sha256=IT0prWkh1TFSTke-BqGdVMGdof53zmWWk6zbJZ_TuuI,28170
sqlalchemy/dialects/firebird/fdb.py,sha256=l4s6_8Z0HvqxgqGz0LNcKWP1qUmEc3M2XM718_drN34,4325
sqlalchemy/dialects/firebird/kinterbasdb.py,sha256=kCsn2ed4u9fyjcyfEI3rXQdKvL05z9wtf5YjW9-NrvI,6299
sqlalchemy/dialects/mssql/__init__.py,sha256=G12xmirGZgMzfUKZCA8BFfaCmqUDuYca9Fu2VP_eaks,1081
sqlalchemy/dialects/mssql/adodbapi.py,sha256=dHZgS3pEDX39ixhlDfTtDcjCq6rdjF85VS7rIZ1TfYo,2493
sqlalchemy/dialects/mssql/base.py,sha256=xqRmK_npoyH5gl626EjazVnu9TEArmrBIFme_avYFUg,66855
sqlalchemy/dialects/mssql/information_schema.py,sha256=pwuTsgOCY5eSBW9w-g-pyJDRfyuZ_rOEXXNYRuAroCE,6418
sqlalchemy/dialects/mssql/mxodbc.py,sha256=G9LypIeEizgxeShtDu2M7Vwm8NopnzaTmnZMD49mYeg,3856
sqlalchemy/dialects/mssql/pymssql.py,sha256=w92w4YQzXdHb53AjCrBcIRHsf6jmie1iN9H7gJNGX4k,3079
sqlalchemy/dialects/mssql/pyodbc.py,sha256=KRke1Hizrg3r5iYqxdBI0axXVQ_pZR_UPxLaAdF0mKk,9473
sqlalchemy/dialects/mssql/zxjdbc.py,sha256=u4uBgwk0LbI7_I5CIvM3C4bBb0pmrw2_DqRh_ehJTkI,2282
sqlalchemy/dialects/mysql/__init__.py,sha256=3cQ2juPT8LsZTicPa2J-0rCQjQIQaPgyBzxjV3O_7xs,1171
sqlalchemy/dialects/mysql/base.py,sha256=rwC8fnhGZaAnsPB1Jhg4sTcrWE2hjxrZJ5deCS0rAOc,122869
sqlalchemy/dialects/mysql/cymysql.py,sha256=nqsdQA8LBLIc6eilgX6qwkjm7szsUoqMTVYwK9kkfsE,2349
sqlalchemy/dialects/mysql/gaerdbms.py,sha256=2MxtTsIqlpq_J32HHqDzz-5vu-mC51Lb7PvyGkJa73M,3387
sqlalchemy/dialects/mysql/mysqlconnector.py,sha256=DMDm684Shk-ijVo7w-yidopYw7EC6EiOmJY56EPawok,5323
sqlalchemy/dialects/mysql/mysqldb.py,sha256=McqROngxAknbLOXoUAG9o9mP9FQBLs-ouD-JqqI2Ses,6564
sqlalchemy/dialects/mysql/oursql.py,sha256=rmdr-r66iJ2amqFeGvCohvE8WCl_i6R9KcgVG0uXOQs,8124
sqlalchemy/dialects/mysql/pymysql.py,sha256=e-qehI-sASmAjEa0ajHqjZjlyJYWsb3RPQY4iBR5pz0,1504
sqlalchemy/dialects/mysql/pyodbc.py,sha256=Ze9IOKw6ANVQj25IlmSGR8aaJhM0pMuRtbzKF7UsZCY,2665
sqlalchemy/dialects/mysql/zxjdbc.py,sha256=LIhe2mHSRVgi8I7qmiTMVBRSpuWJVnuDtpHTUivIx0M,3942
sqlalchemy/dialects/oracle/__init__.py,sha256=UhF2ZyPfT3EFAnP8ZjGng6GnWSzmAkjMax0Lucpn0Bg,797
sqlalchemy/dialects/oracle/base.py,sha256=2KJO-sU2CVKK1rij6bAQ5ZFJv203_NmzT8dE5qor9wc,55961
sqlalchemy/dialects/oracle/cx_oracle.py,sha256=-d5tHbNcCyjbgVtAvWfHgSY2yA8C9bvCzxhwkdWFNe0,38635
sqlalchemy/dialects/oracle/zxjdbc.py,sha256=nC7XOCY3NdTLrEyIacNTnLDCaeVjWn59q8UYssJL8Wo,8112
sqlalchemy/dialects/postgresql/__init__.py,sha256=SjCtM5b3EaGyRaTyg_i82sh_qjkLEIVUXW91XDihiCM,1299
sqlalchemy/dialects/postgresql/base.py,sha256=xhdLeHuWioTv9LYW41pcIPsEjD2fyeh7JflkLKmZMB8,104230
sqlalchemy/dialects/postgresql/constraints.py,sha256=8UDx_2TNQgqIUSRETZPhgninJigQ6rMfdRNI6vIt3Is,3119
sqlalchemy/dialects/postgresql/hstore.py,sha256=n8Wsd7Uldk3bbg66tTa0NKjVqjhJUbF1mVeUsM7keXA,11402
sqlalchemy/dialects/postgresql/json.py,sha256=MTlIGinMDa8iaVbZMOzYnremo0xL4tn2wyGTPwnvX6U,12215
sqlalchemy/dialects/postgresql/pg8000.py,sha256=x6o3P8Ad0wKsuF9qeyip39BKc5ORJZ4nWxv-8qOdj0E,8375
sqlalchemy/dialects/postgresql/psycopg2.py,sha256=4ac0upErNRJz6YWJYNbATCU3ncWFvat5kal_Cuq-Jhw,26953
sqlalchemy/dialects/postgresql/psycopg2cffi.py,sha256=8R3POkJH8z8a2DxwKNmfmQOsxFqsg4tU_OnjGj3OfDA,1651
sqlalchemy/dialects/postgresql/pypostgresql.py,sha256=raQRfZb8T9-c-jmq1w86Wci5QyiXgf_9_71OInT_sAw,2655
sqlalchemy/dialects/postgresql/ranges.py,sha256=MihdGXMdmCM6ToIlrj7OJx9Qh_8BX8bv5PSaAepHmII,4814
sqlalchemy/dialects/postgresql/zxjdbc.py,sha256=AhEGRiAy8q-GM0BStFcsLBgSwjxHkkwy2-BSroIoADo,1397
sqlalchemy/dialects/sqlite/__init__.py,sha256=0wW0VOhE_RtFDpRcbwvvo3XtD6Y2-SDgG4K7468eh_w,736
sqlalchemy/dialects/sqlite/base.py,sha256=_L9-854ITf8Fl2BgUymF9fKjDFvXSo7Pb2yuz1CMkDo,55007
sqlalchemy/dialects/sqlite/pysqlcipher.py,sha256=sgXCqn8ZtNIeTDwyo253Kj5mn4TPlIW3AZCNNmURi2A,4129
sqlalchemy/dialects/sqlite/pysqlite.py,sha256=G-Cg-iI-ErYsVjOH4UlQTEY9pLnLOLV89ik8q0-reuY,14980
sqlalchemy/dialects/sybase/__init__.py,sha256=gwCgFR_C_hoj0Re7PiaW3zmKSWaLpsd96UVXdM7EnTM,894
sqlalchemy/dialects/sybase/base.py,sha256=Xpl3vEd5VDyvoIRMg0DZa48Or--yBSrhaZ2CbTSCt0w,28853
sqlalchemy/dialects/sybase/mxodbc.py,sha256=E_ask6yFSjyhNPvv7gQsvA41WmyxbBvRGWjCyPVr9Gs,901
sqlalchemy/dialects/sybase/pyodbc.py,sha256=0a_gKwrIweJGcz3ZRYuQZb5BIvwjGmFEYBo9wGk66kI,2102
sqlalchemy/dialects/sybase/pysybase.py,sha256=tu2V_EbtgxWYOvt-ybo5_lLiBQzsIFaAtF8e7S1_-rk,3208
sqlalchemy/engine/__init__.py,sha256=fyIFw2R5wfLQzSbfE9Jz-28ZDP5RyB-5elNH92uTZYM,18803
sqlalchemy/engine/base.py,sha256=cRqbbG0QuUG-NGs3GOPVQsU0WLsw5bLT0Y07Yf8OOfU,79399
sqlalchemy/engine/default.py,sha256=U_yaliCazUHp6cfk_NVzhB4F_zOJSyy959rHyk40J4M,36548
sqlalchemy/engine/interfaces.py,sha256=CmPYM_oDp1zAPH13sKmufO4Tuha6KA-fXRQq-K_3YTE,35908
sqlalchemy/engine/reflection.py,sha256=jly5YN-cyjoBDxHs9qO6Mlgm1OZSb2NBNFALwZMEGxE,28590
sqlalchemy/engine/result.py,sha256=ot5RQxa6kjoScXRUR-DTl0iJJISBhmyNTj1JZkZiNsk,44027
sqlalchemy/engine/strategies.py,sha256=mwy-CTrnXzyaIA1TRQBQ_Z2O8wN0lnTNZwDefEWCR9A,8929
sqlalchemy/engine/threadlocal.py,sha256=y4wOLjtbeY-dvp2GcJDtos6F2jzfP11JVAaSFwZ0zRM,4191
sqlalchemy/engine/url.py,sha256=ZhS_Iqiu6V1kfIM2pcv3ud9fOPXkFOHBv8wiLOqbJhc,8228
sqlalchemy/engine/util.py,sha256=Tvb9sIkyd6qOwIA-RsBmo5j877UXa5x-jQmhqnhHWRA,2338
sqlalchemy/event/__init__.py,sha256=KnUVp-NVX6k276ntGffxgkjVmIWR22FSlzrbAKqQ6S4,419
sqlalchemy/event/api.py,sha256=O2udbj5D7HdXcvsGBQk6-dK9CAFfePTypWOrUdqmhYY,5990
sqlalchemy/event/attr.py,sha256=VfRJJl4RD24mQaIoDwArWL2hsGOX6ISSU6vKusVMNO0,12053
sqlalchemy/event/base.py,sha256=DWDKZV19fFsLavu2cXOxXV8NhO3XuCbKcKamBKyXuME,9540
sqlalchemy/event/legacy.py,sha256=ACnVeBUt8uwVfh1GNRu22cWCADC3CWZdrsBKzAd6UQQ,5814
sqlalchemy/event/registry.py,sha256=13wx1qdEmcQeCoAmgf_WQEMuR43h3v7iyd2Re54QdOE,7786
sqlalchemy/ext/__init__.py,sha256=smCZIGgjJprT4ddhuYSLZ8PrTn4NdXPP3j03a038SdE,322
sqlalchemy/ext/associationproxy.py,sha256=y61Y4UIZNBit5lqk2WzdHTCXIWRrBg3hHbRVsqXjnqE,33422
sqlalchemy/ext/automap.py,sha256=Aet-3zk2vbsJVLqigwZJYau0hB1D6Y21K65QVWeB5pc,41567
sqlalchemy/ext/baked.py,sha256=BnVaB4pkQxHk-Fyz4nUw225vCxO_zrDuVC6t5cSF9x8,16967
sqlalchemy/ext/compiler.py,sha256=aSSlySoTsqN-JkACWFIhv3pq2CuZwxKm6pSDfQoc10Q,16257
sqlalchemy/ext/horizontal_shard.py,sha256=XEBYIfs0YrTt_2vRuaBY6C33ZOZMUHQb2E4X2s3Szns,4814
sqlalchemy/ext/hybrid.py,sha256=wNXvuYEEmKy-Nc6z7fu1c2gNWCMOiQA0N14Y3FCq5lo,27989
sqlalchemy/ext/instrumentation.py,sha256=HRgNiuYJ90_uSKC1iDwsEl8_KXscMQkEb9KeElk-yLE,14856
sqlalchemy/ext/mutable.py,sha256=lx7b_ewFVe7O6I4gTXdi9M6C6TqxWCFiViqCM2VwUac,25444
sqlalchemy/ext/orderinglist.py,sha256=UCkuZxTWAQ0num-b5oNm8zNJAmVuIFcbFXt5e7JPx-U,13816
sqlalchemy/ext/serializer.py,sha256=fK3N1miYF16PSIZDjLFS2zI7y-scZ9qtmopXIfzPqrA,5586
sqlalchemy/ext/declarative/__init__.py,sha256=Jpwf2EukqwNe4RzDfCmX1p-hQ6pPhJEIL_xunaER3tw,756
sqlalchemy/ext/declarative/api.py,sha256=PdoO_jh50TWaMvXqnjNh-vX42VqB75ZyliluilphvsU,23317
sqlalchemy/ext/declarative/base.py,sha256=96SJBOfxpTMsU2jAHrvuXbsjUUJ7TvbLm11R8Hy2Irc,25231
sqlalchemy/ext/declarative/clsregistry.py,sha256=jaLLSr-66XvLnA1Z9kxjKatH_XHxWchqEXMKwvjKAXk,10817
sqlalchemy/orm/__init__.py,sha256=UzDockQEVMaWvr-FE4y1rptrMb5uX5k8v_UNQs82qFY,8033
sqlalchemy/orm/attributes.py,sha256=OmXkppJEZxRGc0acZZZkSbUhdfDl8ry3Skmvzl3OtLQ,56510
sqlalchemy/orm/base.py,sha256=F0aRZGK2_1F8phwBHnVYaChkAb-nnTRoFE1VKSvmAwA,14634
sqlalchemy/orm/collections.py,sha256=TFutWIn_c07DI48FDOKMsFMnAoQB3BG2FnEMGzEF3iI,53549
sqlalchemy/orm/dependency.py,sha256=phB8nS1788FSd4dWa2j9d4uj6QFlRL7nzcXvh3Bb7Zo,46192
sqlalchemy/orm/deprecated_interfaces.py,sha256=A63t6ivbZB3Wq8vWgL8I05uTRR6whcWnIPkquuTIPXU,18254
sqlalchemy/orm/descriptor_props.py,sha256=uk5r77w1VUWVgn0bkgOItkAlMh9FRgeT6OCgOHz3_bM,25141
sqlalchemy/orm/dynamic.py,sha256=I_YP7X-H9HLjeFHmYgsOas6JPdqg0Aqe0kaltt4HVzA,13283
sqlalchemy/orm/evaluator.py,sha256=Hozggsd_Fi0YyqHrr9-tldtOA9NLX0MVBF4e2vSM6GY,4731
sqlalchemy/orm/events.py,sha256=yRaoXlBL78b3l11itTrAy42UhLu42-7cgXKCFUGNXSg,69410
sqlalchemy/orm/exc.py,sha256=P5lxi5RMFokiHL136VBK0AP3UmAlJcSDHtzgo-M6Kgs,5439
sqlalchemy/orm/identity.py,sha256=zsb8xOZaPYKvs4sGhyxW21mILQDrtdSuzD4sTyeKdJs,9021
sqlalchemy/orm/instrumentation.py,sha256=xtq9soM3mpMws7xqNJIFYXqKw65p2nnxCTfmMpuvpeI,17510
sqlalchemy/orm/interfaces.py,sha256=AqitvZ_BBkB6L503uhdH55nxHplleJ2kQMwM7xKq9Sc,21552
sqlalchemy/orm/loading.py,sha256=cjC8DQ5g8_rMxroYrYHfW5s35Z5OFSNBUu0-LpxW7hI,22878
sqlalchemy/orm/mapper.py,sha256=sfooeslzwWAKN7WNIQoZ2Y3u_mCyIxd0tebp4yEUu8k,115074
sqlalchemy/orm/path_registry.py,sha256=8Pah0P8yPVUyRjoET7DvIMGtM5PC8HZJC4GtxAyqVAs,8370
sqlalchemy/orm/persistence.py,sha256=WzUUNm1UGm5mGxbv94hLTQowEDNoXfU1VoyGnoKeN_g,51028
sqlalchemy/orm/properties.py,sha256=HR3eoY3Ze3FUPPNCXM_FruWz4pEMWrGlqtCGiK2G1qE,10426
sqlalchemy/orm/query.py,sha256=2q2XprzbZhIlAbs0vihIr9dgqfJtcbrjNewgE9q26gE,147616
sqlalchemy/orm/relationships.py,sha256=79LRGGz8MxsKsAlv0vuZ6MYZXzDXXtfiOCZg-IQ9hiU,116992
sqlalchemy/orm/scoping.py,sha256=Ao-K4iqg4pBp7Si5JOAlro5zUL_r500TC3lVLcFMLDs,6421
sqlalchemy/orm/session.py,sha256=yctpvCsLUcFv9Sy8keT1SElZ2VH5DNScYtO7Z77ptYI,111314
sqlalchemy/orm/state.py,sha256=4LwwftOtPQldH12SKZV2UFgzqPOCj40QfQ08knZs0_E,22984
sqlalchemy/orm/strategies.py,sha256=rdLEs2pPrF8nqcQqezyG-fGdmE11r22fUva4ES3KGOE,58529
sqlalchemy/orm/strategy_options.py,sha256=_z7ZblWCnXh8bZpGSOXDoUwtdUqnXdCaWfKXYDgCuH0,34973
sqlalchemy/orm/sync.py,sha256=B-d-H1Gzw1TkflpvgJeQghwTzqObzhZCQdvEdSPyDeE,5451
sqlalchemy/orm/unitofwork.py,sha256=EQvZ7RZ-u5wJT51BWTeMJJi-tt22YRnmqywGUCn0Qrc,23343
sqlalchemy/orm/util.py,sha256=Mj3NXDd8Mwp4O5Vr5zvRGFUZRlB65WpExdDBFJp04wQ,38092
sqlalchemy/sql/__init__.py,sha256=IFCJYIilmmAQRnSDhv9Y6LQUSpx6pUU5zp9VT7sOx0c,1737
sqlalchemy/sql/annotation.py,sha256=8ncgAVUo5QCoinApKjREi8esWNMFklcBqie8Q42KsaQ,6136
sqlalchemy/sql/base.py,sha256=TuXOp7z0Q30qKAjhgcsts6WGvRbvg6F7OBojMQAxjX0,20990
sqlalchemy/sql/compiler.py,sha256=G0Ft_Dmq1AousO66eagPhI0g9Vkqui_c_LjqY0AbImU,100710
sqlalchemy/sql/crud.py,sha256=X86dyvzEnbj0-oeJO5ufi6zXxbSKBtDeu5JHlNg-BJU,19837
sqlalchemy/sql/ddl.py,sha256=nkjd_B4lKwC2GeyPjE0ZtRB9RKXccQL1g1XoZ4p69sM,37540
sqlalchemy/sql/default_comparator.py,sha256=QaowWtW4apULq_aohDvmj97j0sDtHQQjMRdNxXm83vk,10447
sqlalchemy/sql/dml.py,sha256=7846H52IMJfMYi5Jd-Cv6Hy9hZM4dkonXbjfBjl5ED4,33330
sqlalchemy/sql/elements.py,sha256=MLeecC5dMqeekZmFbPn0J-ODKJj5DBDE5v6kuSkq66I,132898
sqlalchemy/sql/expression.py,sha256=vFZ9MmBlC9Fg8IYzLMAwXgcsnXZhkZbUstY6dO8BFGY,5833
sqlalchemy/sql/functions.py,sha256=ZYKyvPnVKZMtHyyjyNwK0M5UWPrZmFz3vtTqHN-8658,18533
sqlalchemy/sql/naming.py,sha256=foE2lAzngLCFXCeHrpv0S4zT23GCnZLCiata2MPo0kE,4662
sqlalchemy/sql/operators.py,sha256=UeZgb7eRhWd4H7OfJZkx0ZWOjvo5chIUXQsBAIeeTDY,23013
sqlalchemy/sql/schema.py,sha256=awhLY5YjUBah8ZYxW9FBfe6lH0v4fW0UJLTNApnx7E0,145511
sqlalchemy/sql/selectable.py,sha256=o1Hom00WGHjI21Mdb5fkX-f0k2nksQNb_txT0KWK1zQ,118995
sqlalchemy/sql/sqltypes.py,sha256=JGxizqIjO1WFuZpppWj1Yi5cvCyBczb1JqUQeuhQn8s,54879
sqlalchemy/sql/type_api.py,sha256=Xe6yH4slgdLA8HRjT19GBOou51SS9o4oUhyK0xfn04c,42846
sqlalchemy/sql/util.py,sha256=7AsOsyhIq2eSLMWtwvqfTLc2MdCotGzEKQKFE3wk5sk,20382
sqlalchemy/sql/visitors.py,sha256=4ipGvAkqFaSAWgyNuKjx5x_ms8GIy9aq-wC5pj4-Z3g,10271
sqlalchemy/testing/__init__.py,sha256=MwKimX0atzs_SmG2j74GXLiyI8O56e3DLq96tcoL0TM,1095
sqlalchemy/testing/assertions.py,sha256=r1I2nHC599VZcY-5g0JYRQl8bl9kjkf6WFOooOmJ2eE,16112
sqlalchemy/testing/assertsql.py,sha256=-fP9Iuhdu52BJoT1lEj_KED8jy5ay_XiJu7i4Ry9eWA,12335
sqlalchemy/testing/config.py,sha256=nqvVm55Vk0BVNjk1Wj3aYR65j_EEEepfB-W9QSFLU-k,2469
sqlalchemy/testing/distutils_run.py,sha256=tkURrZRwgFiSwseKm1iJRkSjKf2Rtsb3pOXRWtACTHI,247
sqlalchemy/testing/engines.py,sha256=u6GlDMXt0FKqVTQe_QJ5JXAnkA6W-xdw6Fe_5gMAQhg,9359
sqlalchemy/testing/entities.py,sha256=IXqTgAihV-1TZyxL0MWdZzu4rFtxdbWKWFetIJWNGM4,2992
sqlalchemy/testing/exclusions.py,sha256=WuH_tVK5fZJWe8Hu2LzNB4HNQMa_iAUaGC-_6mHUdIM,12570
sqlalchemy/testing/fixtures.py,sha256=q4nK-81z2EWs17TjeJtPmnaJUCtDdoUiIU7jgLq3l_w,10721
sqlalchemy/testing/mock.py,sha256=vj5q-GzJrLW6mMVDLqsppxBu_p7K49VvjfiVt5tn0o8,630
sqlalchemy/testing/pickleable.py,sha256=8I8M4H1XN29pZPMxZdYkmpKWfwzPsUn6WK5FX4UP9L4,2641
sqlalchemy/testing/profiling.py,sha256=Q_wOTS5JtcGBcs2eCYIvoRoDS_FW_HcfEW3hXWB87Zg,8392
sqlalchemy/testing/provision.py,sha256=mU9g6JZEHIshqUkE6PWu-t61FVPs_cUJtEtVFRavj9g,9377
sqlalchemy/testing/replay_fixture.py,sha256=iAxg7XsFkKSCcJnrNPQNJfjMxOgeBAa-ShOkywWPJ4w,5429
sqlalchemy/testing/requirements.py,sha256=aIdvbfugMzrlVdldEbpcwretX-zjiukPhPUSZgulrzU,19949
sqlalchemy/testing/runner.py,sha256=hpNH6MNTif4TnBRySxpm92KgFwDK0mOa8eF7wZXumTI,1607
sqlalchemy/testing/schema.py,sha256=agOzrIMvmuUCeVZY5mYjJ1eJmOP69-wa0gZALtNtJBk,3446
sqlalchemy/testing/util.py,sha256=IJ688AWzichtXVwWgYf_A4BUbcXPGsK6BQP5fvY3h-U,7544
sqlalchemy/testing/warnings.py,sha256=-KskRAh1RkJ_69UIY_WR7i15u21U3gDLQ6nKlnJT7_w,987
sqlalchemy/testing/plugin/__init__.py,sha256=47DEQpj8HBSa-_TImW-5JCeuQeRkm5NMpJWZG3hSuFU,0
sqlalchemy/testing/plugin/bootstrap.py,sha256=Iw8R-d1gqoz_NKFtPyGfdX56QPcQHny_9Lvwov65aVY,1634
sqlalchemy/testing/plugin/noseplugin.py,sha256=In79x6zs9DOngfoYpaHojihWlSd4PeS7Nwzh3M_KNM4,2847
sqlalchemy/testing/plugin/plugin_base.py,sha256=h4RI4nPNdNq9kYABp6IP89Eknm29q8usgO-nWb8Eobc,17120
sqlalchemy/testing/plugin/pytestplugin.py,sha256=Pbc62y7Km0PHXd4M9dm5ThBwrlXkM4WtIX-W1pOaM84,5812
sqlalchemy/testing/suite/__init__.py,sha256=wqCTrb28i5FwhQZOyXVlnz3mA94iQOUBio7lszkFq-g,471
sqlalchemy/testing/suite/test_ddl.py,sha256=Baw0ou9nKdADmrRuXgWzF1FZx0rvkkw3JHc6yw5BN0M,1838
sqlalchemy/testing/suite/test_dialect.py,sha256=ORQPXUt53XtO-5ENlWgs8BpsSdPBDjyMRl4W2UjXLI4,1165
sqlalchemy/testing/suite/test_insert.py,sha256=nP0mgVpsVs72MHMADmihB1oXLbFBpsYsLGO3BlQ7RLU,8132
sqlalchemy/testing/suite/test_reflection.py,sha256=HtJRsJ_vuNMrOhnPTvuIvRg66OakSaSpeCU36zhaSPg,24616
sqlalchemy/testing/suite/test_results.py,sha256=oAcO1tD0I7c9ErMeSvSZBZfz1IBDMJHJTf64Y1pBodk,6685
sqlalchemy/testing/suite/test_select.py,sha256=u0wAz1g-GrAFdZpG4zwSrVckVtjULvjlbd0Z1U1jHAA,5729
sqlalchemy/testing/suite/test_sequence.py,sha256=fmBR4Pc5tOLSkXFxfcqwGx1z3xaxeJeUyqDnTakKTBU,3831
sqlalchemy/testing/suite/test_types.py,sha256=UKa-ZPdpz16mVKvT-9ISRAfqdrqiKaE7IA-_phQQuxo,17088
sqlalchemy/testing/suite/test_update_delete.py,sha256=r5p467r-EUsjEcWGfUE0VPIfN4LLXZpLRnnyBLyyjl4,1582
sqlalchemy/util/__init__.py,sha256=G06a5vBxg27RtWzY6dPZHt1FO8qtOiy_2C9PHTTMblI,2520
sqlalchemy/util/_collections.py,sha256=JZkeYK4GcIE1A5s6MAvHhmUp_X4wp6r7vMGT-iMftZ8,27842
sqlalchemy/util/compat.py,sha256=80OXp3D-F_R-pLf7s-zITPlfCqG1s_5o6KTlY1g2p0Q,6821
sqlalchemy/util/deprecations.py,sha256=D_LTsfb9jHokJtPEWNDRMJOc372xRGNjputAiTIysRU,4403
sqlalchemy/util/langhelpers.py,sha256=Nhe3Y9ieK6JaFYejjYosVOjOSSIBT2V385Hu6HGcyZk,41607
sqlalchemy/util/queue.py,sha256=rs3W0LDhKt7M_dlQEjYpI9KS-bzQmmwN38LE_-RRVvU,6548
sqlalchemy/util/topological.py,sha256=xKsYjjAat4p8cdqRHKwibLzr6WONbPTC0X8Mqg7jYno,2794
SQLAlchemy-1.0.12.dist-info/INSTALLER,sha256=zuuue4knoyJ-UwPPXg8fezS7VCrXJQrAP7zeNuwvFQg,4
sqlalchemy/orm/__pycache__/path_registry.cpython-34.pyc,,
sqlalchemy/ext/__pycache__/associationproxy.cpython-34.pyc,,
sqlalchemy/__pycache__/__init__.cpython-34.pyc,,
sqlalchemy/orm/__pycache__/deprecated_interfaces.cpython-34.pyc,,
sqlalchemy/orm/__pycache__/dynamic.cpython-34.pyc,,
sqlalchemy/event/__pycache__/legacy.cpython-34.pyc,,
sqlalchemy/event/__pycache__/api.cpython-34.pyc,,
sqlalchemy/dialects/__pycache__/postgres.cpython-34.pyc,,
sqlalchemy/testing/__pycache__/profiling.cpython-34.pyc,,
sqlalchemy/dialects/firebird/__pycache__/__init__.cpython-34.pyc,,
sqlalchemy/dialects/mysql/__pycache__/gaerdbms.cpython-34.pyc,,
sqlalchemy/dialects/mysql/__pycache__/oursql.cpython-34.pyc,,
sqlalchemy/testing/suite/__pycache__/test_types.cpython-34.pyc,,
sqlalchemy/event/__pycache__/registry.cpython-34.pyc,,
sqlalchemy/testing/__pycache__/distutils_run.cpython-34.pyc,,
sqlalchemy/dialects/mssql/__pycache__/mxodbc.cpython-34.pyc,,
sqlalchemy/engine/__pycache__/base.cpython-34.pyc,,
sqlalchemy/dialects/postgresql/__pycache__/pypostgresql.cpython-34.pyc,,
sqlalchemy/util/__pycache__/__init__.cpython-34.pyc,,
sqlalchemy/util/__pycache__/topological.cpython-34.pyc,,
sqlalchemy/dialects/firebird/__pycache__/kinterbasdb.cpython-34.pyc,,
sqlalchemy/orm/__pycache__/persistence.cpython-34.pyc,,
sqlalchemy/connectors/__pycache__/mxodbc.cpython-34.pyc,,
sqlalchemy/dialects/mssql/__pycache__/base.cpython-34.pyc,,
sqlalchemy/connectors/__pycache__/__init__.cpython-34.pyc,,
sqlalchemy/sql/__pycache__/elements.cpython-34.pyc,,
sqlalchemy/testing/suite/__pycache__/__init__.cpython-34.pyc,,
sqlalchemy/dialects/__pycache__/__init__.cpython-34.pyc,,
sqlalchemy/util/__pycache__/langhelpers.cpython-34.pyc,,
sqlalchemy/testing/__pycache__/entities.cpython-34.pyc,,
sqlalchemy/engine/__pycache__/interfaces.cpython-34.pyc,,
sqlalchemy/sql/__pycache__/schema.cpython-34.pyc,,
sqlalchemy/ext/__pycache__/baked.cpython-34.pyc,,
sqlalchemy/dialects/postgresql/__pycache__/zxjdbc.cpython-34.pyc,,
sqlalchemy/orm/__pycache__/base.cpython-34.pyc,,
sqlalchemy/connectors/__pycache__/pyodbc.cpython-34.pyc,,
sqlalchemy/sql/__pycache__/annotation.cpython-34.pyc,,
sqlalchemy/dialects/oracle/__pycache__/zxjdbc.cpython-34.pyc,,
sqlalchemy/testing/__pycache__/runner.cpython-34.pyc,,
sqlalchemy/testing/__pycache__/schema.cpython-34.pyc,,
sqlalchemy/orm/__pycache__/relationships.cpython-34.pyc,,
sqlalchemy/__pycache__/pool.cpython-34.pyc,,
sqlalchemy/testing/suite/__pycache__/test_sequence.cpython-34.pyc,,
sqlalchemy/dialects/sqlite/__pycache__/__init__.cpython-34.pyc,,
sqlalchemy/sql/__pycache__/ddl.cpython-34.pyc,,
sqlalchemy/dialects/sybase/__pycache__/pyodbc.cpython-34.pyc,,
sqlalchemy/orm/__pycache__/dependency.cpython-34.pyc,,
sqlalchemy/sql/__pycache__/visitors.cpython-34.pyc,,
sqlalchemy/testing/__pycache__/provision.cpython-34.pyc,,
sqlalchemy/dialects/postgresql/__pycache__/json.cpython-34.pyc,,
sqlalchemy/sql/__pycache__/selectable.cpython-34.pyc,,
sqlalchemy/orm/__pycache__/exc.cpython-34.pyc,,
sqlalchemy/ext/declarative/__pycache__/clsregistry.cpython-34.pyc,,
sqlalchemy/orm/__pycache__/interfaces.cpython-34.pyc,,
sqlalchemy/testing/__pycache__/assertions.cpython-34.pyc,,
sqlalchemy/ext/__pycache__/compiler.cpython-34.pyc,,
sqlalchemy/dialects/oracle/__pycache__/cx_oracle.cpython-34.pyc,,
sqlalchemy/testing/suite/__pycache__/test_select.cpython-34.pyc,,
sqlalchemy/dialects/firebird/__pycache__/fdb.cpython-34.pyc,,
sqlalchemy/orm/__pycache__/unitofwork.cpython-34.pyc,,
sqlalchemy/testing/__pycache__/util.cpython-34.pyc,,
sqlalchemy/dialects/postgresql/__pycache__/psycopg2cffi.cpython-34.pyc,,
sqlalchemy/__pycache__/interfaces.cpython-34.pyc,,
sqlalchemy/engine/__pycache__/util.cpython-34.pyc,,
sqlalchemy/dialects/postgresql/__pycache__/__init__.cpython-34.pyc,,
sqlalchemy/__pycache__/schema.cpython-34.pyc,,
sqlalchemy/orm/__pycache__/sync.cpython-34.pyc,,
sqlalchemy/__pycache__/processors.cpython-34.pyc,,
sqlalchemy/dialects/firebird/__pycache__/base.cpython-34.pyc,,
sqlalchemy/dialects/postgresql/__pycache__/psycopg2.cpython-34.pyc,,
sqlalchemy/databases/__pycache__/__init__.cpython-34.pyc,,
sqlalchemy/sql/__pycache__/sqltypes.cpython-34.pyc,,
sqlalchemy/dialects/oracle/__pycache__/base.cpython-34.pyc,,
sqlalchemy/sql/__pycache__/functions.cpython-34.pyc,,
sqlalchemy/dialects/sqlite/__pycache__/pysqlcipher.cpython-34.pyc,,
sqlalchemy/testing/suite/__pycache__/test_dialect.cpython-34.pyc,,
sqlalchemy/ext/__pycache__/automap.cpython-34.pyc,,
sqlalchemy/testing/__pycache__/mock.cpython-34.pyc,,
sqlalchemy/testing/__pycache__/requirements.cpython-34.pyc,,
sqlalchemy/testing/suite/__pycache__/test_results.cpython-34.pyc,,
sqlalchemy/orm/__pycache__/__init__.cpython-34.pyc,,
sqlalchemy/dialects/postgresql/__pycache__/base.cpython-34.pyc,,
sqlalchemy/util/__pycache__/deprecations.cpython-34.pyc,,
sqlalchemy/dialects/mssql/__pycache__/pyodbc.cpython-34.pyc,,
sqlalchemy/orm/__pycache__/state.cpython-34.pyc,,
sqlalchemy/event/__pycache__/base.cpython-34.pyc,,
sqlalchemy/__pycache__/log.cpython-34.pyc,,
sqlalchemy/connectors/__pycache__/zxJDBC.cpython-34.pyc,,
sqlalchemy/testing/plugin/__pycache__/plugin_base.cpython-34.pyc,,
sqlalchemy/orm/__pycache__/identity.cpython-34.pyc,,
sqlalchemy/dialects/mysql/__pycache__/mysqlconnector.cpython-34.pyc,,
sqlalchemy/orm/__pycache__/attributes.cpython-34.pyc,,
sqlalchemy/ext/declarative/__pycache__/__init__.cpython-34.pyc,,
sqlalchemy/dialects/sqlite/__pycache__/base.cpython-34.pyc,,
sqlalchemy/ext/__pycache__/serializer.cpython-34.pyc,,
sqlalchemy/testing/plugin/__pycache__/pytestplugin.cpython-34.pyc,,
sqlalchemy/orm/__pycache__/properties.cpython-34.pyc,,
sqlalchemy/orm/__pycache__/mapper.cpython-34.pyc,,
sqlalchemy/testing/__pycache__/fixtures.cpython-34.pyc,,
sqlalchemy/sql/__pycache__/base.cpython-34.pyc,,
sqlalchemy/orm/__pycache__/events.cpython-34.pyc,,
sqlalchemy/dialects/mysql/__pycache__/zxjdbc.cpython-34.pyc,,
sqlalchemy/dialects/mysql/__pycache__/__init__.cpython-34.pyc,,
sqlalchemy/orm/__pycache__/strategy_options.cpython-34.pyc,,
sqlalchemy/dialects/sybase/__pycache__/mxodbc.cpython-34.pyc,,
sqlalchemy/util/__pycache__/compat.cpython-34.pyc,,
sqlalchemy/testing/plugin/__pycache__/bootstrap.cpython-34.pyc,,
sqlalchemy/sql/__pycache__/compiler.cpython-34.pyc,,
sqlalchemy/dialects/mysql/__pycache__/mysqldb.cpython-34.pyc,,
sqlalchemy/__pycache__/inspection.cpython-34.pyc,,
sqlalchemy/testing/plugin/__pycache__/__init__.cpython-34.pyc,,
sqlalchemy/dialects/mssql/__pycache__/adodbapi.cpython-34.pyc,,
sqlalchemy/engine/__pycache__/url.cpython-34.pyc,,
sqlalchemy/dialects/oracle/__pycache__/__init__.cpython-34.pyc,,
sqlalchemy/engine/__pycache__/result.cpython-34.pyc,,
sqlalchemy/testing/suite/__pycache__/test_insert.cpython-34.pyc,,
sqlalchemy/event/__pycache__/__init__.cpython-34.pyc,,
sqlalchemy/orm/__pycache__/scoping.cpython-34.pyc,,
sqlalchemy/orm/__pycache__/instrumentation.cpython-34.pyc,,
sqlalchemy/dialects/sybase/__pycache__/base.cpython-34.pyc,,
sqlalchemy/dialects/mysql/__pycache__/pyodbc.cpython-34.pyc,,
sqlalchemy/testing/plugin/__pycache__/noseplugin.cpython-34.pyc,,
sqlalchemy/dialects/mysql/__pycache__/cymysql.cpython-34.pyc,,
sqlalchemy/testing/__pycache__/exclusions.cpython-34.pyc,,
sqlalchemy/ext/__pycache__/mutable.cpython-34.pyc,,
sqlalchemy/sql/__pycache__/default_comparator.cpython-34.pyc,,
sqlalchemy/engine/__pycache__/default.cpython-34.pyc,,
sqlalchemy/__pycache__/types.cpython-34.pyc,,
sqlalchemy/orm/__pycache__/session.cpython-34.pyc,,
sqlalchemy/util/__pycache__/_collections.cpython-34.pyc,,
sqlalchemy/engine/__pycache__/reflection.cpython-34.pyc,,
sqlalchemy/dialects/sybase/__pycache__/pysybase.cpython-34.pyc,,
sqlalchemy/testing/__pycache__/assertsql.cpython-34.pyc,,
sqlalchemy/testing/__pycache__/replay_fixture.cpython-34.pyc,,
sqlalchemy/dialects/mysql/__pycache__/pymysql.cpython-34.pyc,,
sqlalchemy/testing/__pycache__/config.cpython-34.pyc,,
sqlalchemy/orm/__pycache__/strategies.cpython-34.pyc,,
sqlalchemy/dialects/sqlite/__pycache__/pysqlite.cpython-34.pyc,,
sqlalchemy/orm/__pycache__/util.cpython-34.pyc,,
sqlalchemy/dialects/mysql/__pycache__/base.cpython-34.pyc,,
sqlalchemy/sql/__pycache__/crud.cpython-34.pyc,,
sqlalchemy/dialects/postgresql/__pycache__/pg8000.cpython-34.pyc,,
sqlalchemy/engine/__pycache__/__init__.cpython-34.pyc,,
sqlalchemy/orm/__pycache__/loading.cpython-34.pyc,,
sqlalchemy/testing/__pycache__/__init__.cpython-34.pyc,,
sqlalchemy/dialects/postgresql/__pycache__/ranges.cpython-34.pyc,,
sqlalchemy/sql/__pycache__/operators.cpython-34.pyc,,
sqlalchemy/dialects/mssql/__pycache__/__init__.cpython-34.pyc,,
sqlalchemy/testing/__pycache__/pickleable.cpython-34.pyc,,
sqlalchemy/sql/__pycache__/expression.cpython-34.pyc,,
sqlalchemy/dialects/mssql/__pycache__/pymssql.cpython-34.pyc,,
sqlalchemy/sql/__pycache__/naming.cpython-34.pyc,,
sqlalchemy/ext/__pycache__/horizontal_shard.cpython-34.pyc,,
sqlalchemy/dialects/sybase/__pycache__/__init__.cpython-34.pyc,,
sqlalchemy/engine/__pycache__/threadlocal.cpython-34.pyc,,
sqlalchemy/ext/declarative/__pycache__/api.cpython-34.pyc,,
sqlalchemy/testing/__pycache__/warnings.cpython-34.pyc,,
sqlalchemy/sql/__pycache__/util.cpython-34.pyc,,
sqlalchemy/sql/__pycache__/dml.cpython-34.pyc,,
sqlalchemy/ext/__pycache__/__init__.cpython-34.pyc,,
sqlalchemy/dialects/postgresql/__pycache__/hstore.cpython-34.pyc,,
sqlalchemy/orm/__pycache__/collections.cpython-34.pyc,,
sqlalchemy/sql/__pycache__/__init__.cpython-34.pyc,,
sqlalchemy/testing/suite/__pycache__/test_ddl.cpython-34.pyc,,
sqlalchemy/ext/__pycache__/orderinglist.cpython-34.pyc,,
sqlalchemy/dialects/postgresql/__pycache__/constraints.cpython-34.pyc,,
sqlalchemy/__pycache__/exc.cpython-34.pyc,,
sqlalchemy/testing/suite/__pycache__/test_update_delete.cpython-34.pyc,,
sqlalchemy/engine/__pycache__/strategies.cpython-34.pyc,,
sqlalchemy/ext/declarative/__pycache__/base.cpython-34.pyc,,
sqlalchemy/orm/__pycache__/evaluator.cpython-34.pyc,,
sqlalchemy/orm/__pycache__/query.cpython-34.pyc,,
sqlalchemy/dialects/mssql/__pycache__/zxjdbc.cpython-34.pyc,,
sqlalchemy/orm/__pycache__/descriptor_props.cpython-34.pyc,,
sqlalchemy/__pycache__/events.cpython-34.pyc,,
sqlalchemy/sql/__pycache__/type_api.cpython-34.pyc,,
sqlalchemy/util/__pycache__/queue.cpython-34.pyc,,
sqlalchemy/ext/__pycache__/hybrid.cpython-34.pyc,,
sqlalchemy/event/__pycache__/attr.cpython-34.pyc,,
sqlalchemy/testing/suite/__pycache__/test_reflection.cpython-34.pyc,,
sqlalchemy/dialects/mssql/__pycache__/information_schema.cpython-34.pyc,,
sqlalchemy/ext/__pycache__/instrumentation.cpython-34.pyc,,
sqlalchemy/testing/__pycache__/engines.cpython-34.pyc,,
lib/python3.4/site-packages/SQLAlchemy-1.0.12.dist-info/WHEEL (new file, 5 lines)
@@ -0,0 +1,5 @@
Wheel-Version: 1.0
Generator: bdist_wheel (0.30.0)
Root-Is-Purelib: false
Tag: cp34-cp34m-linux_x86_64
lib/python3.4/site-packages/SQLAlchemy-1.0.12.dist-info/metadata.json (new file, 1 line)
@@ -0,0 +1 @@
{"classifiers": ["Development Status :: 5 - Production/Stable", "Intended Audience :: Developers", "License :: OSI Approved :: MIT License", "Programming Language :: Python", "Programming Language :: Python :: 3", "Programming Language :: Python :: Implementation :: CPython", "Programming Language :: Python :: Implementation :: Jython", "Programming Language :: Python :: Implementation :: PyPy", "Topic :: Database :: Front-Ends", "Operating System :: OS Independent"], "description_content_type": "UNKNOWN", "extensions": {"python.details": {"contacts": [{"email": "mike_mp@zzzcomputing.com", "name": "Mike Bayer", "role": "author"}], "document_names": {"description": "DESCRIPTION.rst"}, "project_urls": {"Home": "http://www.sqlalchemy.org"}}}, "generator": "bdist_wheel (0.30.0)", "license": "MIT License", "metadata_version": "2.0", "name": "SQLAlchemy", "summary": "Database Abstraction Library", "test_requires": [{"requires": ["mock", "pytest (>=2.5.2)", "pytest-xdist"]}], "version": "1.0.12"}

lib/python3.4/site-packages/ed25519-1.4.dist-info/DESCRIPTION.rst (new file, 11 lines)
@@ -0,0 +1,11 @@
Python bindings to the Ed25519 public-key signature system.

This offers a comfortable python interface to a C implementation of the Ed25519 public-key signature system (http://ed25519.cr.yp.to/), using the portable 'ref' code from the 'SUPERCOP' benchmarking suite.

This system provides high (128-bit) security, short (32-byte) keys, short (64-byte) signatures, and fast (2-6ms) operation. Please see the README for more details.
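A quick sketch of the sign/verify round trip this package exposes (an editorial illustration; the message bytes are invented)::

  # Sketch only: generate a keypair, sign a message, verify the signature.
  import ed25519

  signing_key, verifying_key = ed25519.create_keypair()
  sig = signing_key.sign(b'hello world', encoding='base64')

  try:
      verifying_key.verify(sig, b'hello world', encoding='base64')
      print('signature is valid')
  except ed25519.BadSignatureError:
      print('signature is invalid')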
lib/python3.4/site-packages/ed25519-1.4.dist-info/METADATA
@@ -1,4 +1,4 @@
-Metadata-Version: 1.1
+Metadata-Version: 2.0
 Name: ed25519
 Version: 1.4
 Summary: Ed25519 public-key signatures
@@ -6,16 +6,7 @@ Home-page: https://github.com/warner/python-ed25519
 Author: Brian Warner
 Author-email: warner-python-ed25519@lothar.com
 License: MIT
-Description: Python bindings to the Ed25519 public-key signature system.
-
-        This offers a comfortable python interface to a C implementation of the
-        Ed25519 public-key signature system (http://ed25519.cr.yp.to/), using the
-        portable 'ref' code from the 'SUPERCOP' benchmarking suite.
-
-        This system provides high (128-bit) security, short (32-byte) keys, short
-        (64-byte) signatures, and fast (2-6ms) operation. Please see the README for
-        more details.
-
+Description-Content-Type: UNKNOWN
 Platform: UNKNOWN
 Classifier: Development Status :: 5 - Production/Stable
 Classifier: Intended Audience :: Developers
@@ -26,3 +17,15 @@ Classifier: Programming Language :: Python :: 2.7
 Classifier: Programming Language :: Python :: 3.3
 Classifier: Programming Language :: Python :: 3.4
 Classifier: Topic :: Security :: Cryptography
+
+Python bindings to the Ed25519 public-key signature system.
+
+This offers a comfortable python interface to a C implementation of the
+Ed25519 public-key signature system (http://ed25519.cr.yp.to/), using the
+portable 'ref' code from the 'SUPERCOP' benchmarking suite.
+
+This system provides high (128-bit) security, short (32-byte) keys, short
+(64-byte) signatures, and fast (2-6ms) operation. Please see the README for
+more details.
+
+
lib/python3.4/site-packages/ed25519-1.4.dist-info/RECORD (new file, 17 lines)
@@ -0,0 +1,17 @@
ed25519/__init__.py,sha256=0AicD1xQAforRdrUWwmmURJkZ3Gi1lqaifukwZNYJos,401
ed25519/_ed25519.cpython-34m.so,sha256=-qvpNKMbtiJoFhWHlvH83lGmJEntE9ISrt8hYZE4zig,262968
ed25519/_version.py,sha256=yb119RosJrH_RO02_o3o12GWQvkxx3xD4X7UrJW9vTY,469
ed25519/keys.py,sha256=AbMFsbxn0qbwmQ6HntpNURsOGq_y4puwFxs6U7Of2eo,7123
ed25519/test_ed25519.py,sha256=IG8ot-yARHi6PoyJY6ixS1l2L23hE1lCXbSH-XQPCCM,12389
../../../bin/edsig,sha256=SA1mUUWCjAAaSEe6MKSpVWg-2qXwuiuK3PodCAUwCN0,2853
ed25519-1.4.dist-info/DESCRIPTION.rst,sha256=8UWGEqjPrB7zPyxLA5Ep6JL58ANbe0Wybqth188exdc,434
ed25519-1.4.dist-info/METADATA,sha256=8xAIfsJS4nw5H1ui1jHsVntmwcMjIzm4j_LHEaW3wNQ,1148
ed25519-1.4.dist-info/RECORD,,
ed25519-1.4.dist-info/WHEEL,sha256=AEztX7vHDtcgysb-4-5-DyIKMLIPg6NMxY9dXTRdoXQ,104
ed25519-1.4.dist-info/metadata.json,sha256=6X6ChTS1aIj99pNHtLNerEBCuO-F-P2Z1GgSMt2svQw,841
ed25519-1.4.dist-info/top_level.txt,sha256=U3-N9ZJMBO9MUuZLwoiMbsWSkxsd0TfkNSuzO6O_gYY,8
ed25519-1.4.dist-info/INSTALLER,sha256=zuuue4knoyJ-UwPPXg8fezS7VCrXJQrAP7zeNuwvFQg,4
ed25519/__pycache__/keys.cpython-34.pyc,,
ed25519/__pycache__/_version.cpython-34.pyc,,
ed25519/__pycache__/__init__.cpython-34.pyc,,
ed25519/__pycache__/test_ed25519.cpython-34.pyc,,
lib/python3.4/site-packages/ed25519-1.4.dist-info/WHEEL (new file, 5 lines)
@@ -0,0 +1,5 @@
Wheel-Version: 1.0
Generator: bdist_wheel (0.30.0)
Root-Is-Purelib: false
Tag: cp34-cp34m-linux_x86_64
lib/python3.4/site-packages/ed25519-1.4.dist-info/metadata.json (new file, 1 line)
@@ -0,0 +1 @@
{"classifiers": ["Development Status :: 5 - Production/Stable", "Intended Audience :: Developers", "License :: OSI Approved :: MIT License", "Programming Language :: Python", "Programming Language :: Python :: 2.6", "Programming Language :: Python :: 2.7", "Programming Language :: Python :: 3.3", "Programming Language :: Python :: 3.4", "Topic :: Security :: Cryptography"], "description_content_type": "UNKNOWN", "extensions": {"python.details": {"contacts": [{"email": "warner-python-ed25519@lothar.com", "name": "Brian Warner", "role": "author"}], "document_names": {"description": "DESCRIPTION.rst"}, "project_urls": {"Home": "https://github.com/warner/python-ed25519"}}}, "generator": "bdist_wheel (0.30.0)", "license": "MIT", "metadata_version": "2.0", "name": "ed25519", "summary": "Ed25519 public-key signatures", "version": "1.4"}
BIN lib/python3.4/site-packages/ed25519/_ed25519.cpython-34m.so (new executable file; binary file not shown)
BIN lib/python3.4/site-packages/ed25519/_ed25519.cpython-35m-x86_64-linux-gnu.so (new executable file; binary file not shown)
BIN lib/python3.4/site-packages/ed25519/_ed25519.cpython-36m-x86_64-linux-gnu.so (new executable file; binary file not shown)
lib/python3.4/site-packages/netifaces-0.10.6.dist-info/DESCRIPTION.rst (new file, 199 lines)
@@ -0,0 +1,199 @@
netifaces 0.10.6
|
||||
================
|
||||
|
||||
.. image:: https://drone.io/bitbucket.org/al45tair/netifaces/status.png
|
||||
:target: https://drone.io/bitbucket.org/al45tair/netifaces/latest
|
||||
:alt: Build Status
|
||||
|
||||
1. What is this?
|
||||
----------------
|
||||
|
||||
It's been annoying me for some time that there's no easy way to get the
|
||||
address(es) of the machine's network interfaces from Python. There is
|
||||
a good reason for this difficulty, which is that it is virtually impossible
|
||||
to do so in a portable manner. However, it seems to me that there should
|
||||
be a package you can easy_install that will take care of working out the
|
||||
details of doing so on the machine you're using, then you can get on with
|
||||
writing Python code without concerning yourself with the nitty gritty of
|
||||
system-dependent low-level networking APIs.
|
||||
|
||||
This package attempts to solve that problem.
|
||||
|
||||
2. How do I use it?
|
||||
-------------------
|
||||
|
||||
First you need to install it, which you can do by typing::
|
||||
|
||||
tar xvzf netifaces-0.10.6.tar.gz
|
||||
cd netifaces-0.10.6
|
||||
python setup.py install
|
||||
|
||||
**Note that you will need the relevant developer tools for your platform**,
|
||||
as netifaces is written in C and installing this way will compile the extension.
|
||||
|
||||
Once that's done, you'll need to start Python and do something like the
|
||||
following::
|
||||
|
||||
>>> import netifaces
|
||||
|
||||
Then if you enter
|
||||
|
||||
>>> netifaces.interfaces()
|
||||
['lo0', 'gif0', 'stf0', 'en0', 'en1', 'fw0']
|
||||
|
||||
you'll see the list of interface identifiers for your machine.
|
||||
|
||||
You can ask for the addresses of a particular interface by doing
|
||||
|
||||
>>> netifaces.ifaddresses('lo0')
|
||||
{18: [{'addr': ''}], 2: [{'peer': '127.0.0.1', 'netmask': '255.0.0.0', 'addr': '127.0.0.1'}], 30: [{'peer': '::1', 'netmask': 'ffff:ffff:ffff:ffff:ffff:ffff:ffff:ffff', 'addr': '::1'}, {'peer': '', 'netmask': 'ffff:ffff:ffff:ffff::', 'addr': 'fe80::1%lo0'}]}
|
||||
|
||||
Hmmmm. That result looks a bit cryptic; let's break it apart and explain
|
||||
what each piece means. It returned a dictionary, so let's look there first::
|
||||
|
||||
{ 18: [...], 2: [...], 30: [...] }
|
||||
|
||||
Each of the numbers refers to a particular address family. In this case, we
|
||||
have three address families listed; on my system, 18 is ``AF_LINK`` (which means
|
||||
the link layer interface, e.g. Ethernet), 2 is ``AF_INET`` (normal Internet
|
||||
addresses), and 30 is ``AF_INET6`` (IPv6).
|
||||
|
||||
But wait! Don't use these numbers in your code. The numeric values here are
|
||||
system dependent; fortunately, I thought of that when writing netifaces, so
|
||||
the module declares a range of values that you might need. e.g.
|
||||
|
||||
>>> netifaces.AF_LINK
|
||||
18
|
||||
|
||||
Again, on your system, the number may be different.
|
||||
|
||||
So, what we've established is that the dictionary that's returned has one
|
||||
entry for each address family for which this interface has an address. Let's
|
||||
take a look at the ``AF_INET`` addresses now:
|
||||
|
||||
>>> addrs = netifaces.ifaddresses('lo0')
|
||||
>>> addrs[netifaces.AF_INET]
|
||||
[{'peer': '127.0.0.1', 'netmask': '255.0.0.0', 'addr': '127.0.0.1'}]
|
||||
|
||||
You might be wondering why this value is a list. The reason is that it's
|
||||
possible for an interface to have more than one address, even within the
|
||||
same family. I'll say that again: *you can have more than one address of
|
||||
the same type associated with each interface*.
|
||||
|
||||
*Asking for "the" address of a particular interface doesn't make sense.*
|
||||
|
||||
Right, so, we can see that this particular interface only has one address,
|
||||
and, because it's a loopback interface, it's point-to-point and therefore
|
||||
has a *peer* address rather than a broadcast address.
|
||||
|
||||
Let's look at a more interesting interface.
|
||||
|
||||
>>> addrs = netifaces.ifaddresses('en0')
|
||||
>>> addrs[netifaces.AF_INET]
|
||||
[{'broadcast': '10.15.255.255', 'netmask': '255.240.0.0', 'addr': '10.0.1.4'}, {'broadcast': '192.168.0.255', 'addr': '192.168.0.47'}]
|
||||
|
||||
This interface has two addresses (see, I told you...) Both of them are
|
||||
regular IPv4 addresses, although in one case the netmask has been changed
|
||||
from its default. The netmask *may not* appear on your system if it's set
|
||||
to the default for the address range.
|
||||
|
||||
Because this interface isn't point-to-point, it also has broadcast addresses.
|
||||
|
||||
Now, say we want, instead of the IP addresses, to get the MAC address; that
|
||||
is, the hardware address of the Ethernet adapter running this interface. We
|
||||
can do
|
||||
|
||||
>>> addrs[netifaces.AF_LINK]
|
||||
[{'addr': '00:12:34:56:78:9a'}]
|
||||
|
||||
Note that this may not be available on platforms without getifaddrs(), unless
|
||||
they happen to implement ``SIOCGIFHWADDR``. Note also that you just get the
|
||||
address; it's unlikely that you'll see anything else with an ``AF_LINK`` address.
|
||||
Oh, and don't assume that all ``AF_LINK`` addresses are Ethernet; you might, for
|
||||
instance, be on a Mac, in which case:
|
||||
|
||||
>>> addrs = netifaces.ifaddresses('fw0')
|
||||
>>> addrs[netifaces.AF_LINK]
|
||||
[{'addr': '00:12:34:56:78:9a:bc:de'}]
|
||||
|
||||
No, that isn't an exceptionally long Ethernet MAC address---it's a FireWire
|
||||
address.
As of version 0.10.0, you can also obtain a list of gateways on your
machine:

>>> netifaces.gateways()
{2: [('10.0.1.1', 'en0', True), ('10.2.1.1', 'en1', False)], 30: [('fe80::1', 'en0', True)], 'default': { 2: ('10.0.1.1', 'en0'), 30: ('fe80::1', 'en0') }}

This dictionary is keyed on address family---in this case, ``AF_INET``---and
each entry is a list of gateways as ``(address, interface, is_default)`` tuples.
Notice that here we have two separate gateways for IPv4 (``AF_INET``); some
operating systems support configurations like this and can either route packets
based on their source, or based on administratively configured routing tables.

For convenience, we also allow you to index the dictionary with the special
value ``'default'``, which returns a dictionary mapping address families to the
default gateway in each case. Thus you can get the default IPv4 gateway with

>>> gws = netifaces.gateways()
>>> gws['default'][netifaces.AF_INET]
('10.0.1.1', 'en0')

Do note that there may be no default gateway for any given address family;
this is currently very common for IPv6 and much less common for IPv4, but it
can happen even for ``AF_INET``.
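
Because the ``'default'`` dictionary simply omits address families that have
no default route, use ``.get()`` rather than indexing when you can't be sure;
a minimal sketch:

>>> gws = netifaces.gateways()
>>> gws['default'].get(netifaces.AF_INET)  # None if there's no IPv4 default
('10.0.1.1', 'en0')
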
BTW, if you're trying to configure your machine to have multiple gateways for
the same address family, it's a very good idea to check the documentation for
your operating system *very* carefully, as some systems become extremely
confused or route packets in a non-obvious manner.

I'm very interested in hearing from anyone (on any platform) for whom the
``gateways()`` function doesn't produce the expected results. It's quite
complicated extracting this information from the operating system (whichever
operating system we're talking about), and so I expect there's at least one
system out there where this just won't work.

3. This is great! What platforms does it work on?
--------------------------------------------------

It gets regular testing on OS X, Linux and Windows. It has also been used
successfully on Solaris, and it's expected to work properly on other UNIX-like
systems as well. If you are running something that is not supported, and
wish to contribute a patch, please use BitBucket to send a pull request.

4. What license is this under?
------------------------------

It's an MIT-style license. Here goes:

Copyright (c) 2007-2017 Alastair Houghton

Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.

5. Why the jump to 0.10.0?
--------------------------

Because someone released a fork of netifaces with the version number 0.9.0.
Hopefully skipping past it will avoid any confusion. In addition, starting
with 0.10.0, Python 3 is now supported, and other features/bugfixes have been
included as well. See the CHANGELOG for a more complete list of changes.
220 lib/python3.4/site-packages/netifaces-0.10.6.dist-info/METADATA Normal file
@@ -0,0 +1,220 @@
Metadata-Version: 2.0
Name: netifaces
Version: 0.10.6
Summary: Portable network interface information.
Home-page: https://bitbucket.org/al45tair/netifaces
Author: Alastair Houghton
Author-email: alastair@alastairs-place.net
License: MIT License
Description-Content-Type: UNKNOWN
Platform: UNKNOWN
Classifier: Development Status :: 4 - Beta
Classifier: Intended Audience :: Developers
Classifier: License :: OSI Approved :: MIT License
Classifier: Topic :: System :: Networking
Classifier: Programming Language :: Python
Classifier: Programming Language :: Python :: 2
Classifier: Programming Language :: Python :: 2.5
Classifier: Programming Language :: Python :: 2.6
Classifier: Programming Language :: Python :: 2.7
Classifier: Programming Language :: Python :: 3
@@ -0,0 +1,9 @@
netifaces.cpython-34m.so,sha256=KiLZHMhvo_x40-9D0bLqZoVzQsGbimZY_33SUPowm9E,72976
netifaces-0.10.6.dist-info/DESCRIPTION.rst,sha256=WCNR0xdB7g_1r_U6WwIedMlurGlPeDjvJX-NBElPoII,8555
netifaces-0.10.6.dist-info/METADATA,sha256=InwXovYI_sgETAChE4hBUFbkSwYlZ_gWeKcNvyX8KOA,9322
netifaces-0.10.6.dist-info/RECORD,,
netifaces-0.10.6.dist-info/WHEEL,sha256=AEztX7vHDtcgysb-4-5-DyIKMLIPg6NMxY9dXTRdoXQ,104
netifaces-0.10.6.dist-info/metadata.json,sha256=W-IHSrO0Ma846gdBr18QTsvc9GjGN0SgAnZha0vW9tU,885
netifaces-0.10.6.dist-info/top_level.txt,sha256=PqMTaIuWtSjkdQHX6lH1Lmpv2aqBUYAGqATB8z3A6TQ,10
netifaces-0.10.6.dist-info/zip-safe,sha256=AbpHGcgLb-kRsJGnwFEktk7uzpZOCcBY74-YBdrKVGs,1
netifaces-0.10.6.dist-info/INSTALLER,sha256=zuuue4knoyJ-UwPPXg8fezS7VCrXJQrAP7zeNuwvFQg,4
@@ -0,0 +1,5 @@
Wheel-Version: 1.0
Generator: bdist_wheel (0.30.0)
Root-Is-Purelib: false
Tag: cp34-cp34m-linux_x86_64
@@ -0,0 +1 @@
{"classifiers": ["Development Status :: 4 - Beta", "Intended Audience :: Developers", "License :: OSI Approved :: MIT License", "Topic :: System :: Networking", "Programming Language :: Python", "Programming Language :: Python :: 2", "Programming Language :: Python :: 2.5", "Programming Language :: Python :: 2.6", "Programming Language :: Python :: 2.7", "Programming Language :: Python :: 3"], "description_content_type": "UNKNOWN", "extensions": {"python.details": {"contacts": [{"email": "alastair@alastairs-place.net", "name": "Alastair Houghton", "role": "author"}], "document_names": {"description": "DESCRIPTION.rst"}, "project_urls": {"Home": "https://bitbucket.org/al45tair/netifaces"}}}, "generator": "bdist_wheel (0.30.0)", "license": "MIT License", "metadata_version": "2.0", "name": "netifaces", "summary": "Portable network interface information.", "version": "0.10.6"}
BIN lib/python3.4/site-packages/netifaces.cpython-34m.so Executable file
Binary file not shown.
BIN lib/python3.4/site-packages/netifaces.cpython-35m-x86_64-linux-gnu.so Executable file
Binary file not shown.
BIN lib/python3.4/site-packages/netifaces.cpython-36m-x86_64-linux-gnu.so Executable file
Binary file not shown.
@@ -0,0 +1,39 @@
pip
===

The `PyPA recommended
<https://packaging.python.org/en/latest/current/>`_
tool for installing Python packages.

* `Installation <https://pip.pypa.io/en/stable/installing.html>`_
* `Documentation <https://pip.pypa.io/>`_
* `Changelog <https://pip.pypa.io/en/stable/news.html>`_
* `Github Page <https://github.com/pypa/pip>`_
* `Issue Tracking <https://github.com/pypa/pip/issues>`_
* `User mailing list <http://groups.google.com/group/python-virtualenv>`_
* `Dev mailing list <http://groups.google.com/group/pypa-dev>`_
* User IRC: #pypa on Freenode.
* Dev IRC: #pypa-dev on Freenode.


.. image:: https://img.shields.io/pypi/v/pip.svg
   :target: https://pypi.python.org/pypi/pip

.. image:: https://img.shields.io/travis/pypa/pip/master.svg
   :target: http://travis-ci.org/pypa/pip

.. image:: https://img.shields.io/appveyor/ci/pypa/pip.svg
   :target: https://ci.appveyor.com/project/pypa/pip/history

.. image:: https://readthedocs.org/projects/pip/badge/?version=stable
   :target: https://pip.pypa.io/en/stable

Code of Conduct
---------------

Everyone interacting in the pip project's codebases, issue trackers, chat
rooms, and mailing lists is expected to follow the `PyPA Code of Conduct`_.

.. _PyPA Code of Conduct: https://www.pypa.io/en/latest/code-of-conduct/
69 lib/python3.4/site-packages/pip-9.0.1.dist-info/METADATA Normal file
@@ -0,0 +1,69 @@
Metadata-Version: 2.0
Name: pip
Version: 9.0.1
Summary: The PyPA recommended tool for installing Python packages.
Home-page: https://pip.pypa.io/
Author: The pip developers
Author-email: python-virtualenv@groups.google.com
License: MIT
Keywords: easy_install distutils setuptools egg virtualenv
Platform: UNKNOWN
Classifier: Development Status :: 5 - Production/Stable
Classifier: Intended Audience :: Developers
Classifier: License :: OSI Approved :: MIT License
Classifier: Topic :: Software Development :: Build Tools
Classifier: Programming Language :: Python :: 2
Classifier: Programming Language :: Python :: 2.6
Classifier: Programming Language :: Python :: 2.7
Classifier: Programming Language :: Python :: 3
Classifier: Programming Language :: Python :: 3.3
Classifier: Programming Language :: Python :: 3.4
Classifier: Programming Language :: Python :: 3.5
Classifier: Programming Language :: Python :: Implementation :: PyPy
Requires-Python: >=2.6,!=3.0.*,!=3.1.*,!=3.2.*
Provides-Extra: testing
Requires-Dist: mock; extra == 'testing'
Requires-Dist: pretend; extra == 'testing'
Requires-Dist: pytest; extra == 'testing'
Requires-Dist: scripttest (>=1.3); extra == 'testing'
Requires-Dist: virtualenv (>=1.10); extra == 'testing'
123 lib/python3.4/site-packages/pip-9.0.1.dist-info/RECORD Normal file
@@ -0,0 +1,123 @@
pip/__init__.py,sha256=00QWSreEBjb8Y8sPs8HeqgLXSB-3UrONJxo4J5APxEc,11348
pip/__main__.py,sha256=V6Kh-IEDEFpt1cahRE6MajUF_14qJR_Qsvn4MjWZXzE,584
pip/basecommand.py,sha256=TTlmZesQ4Vuxcto2KqwZGmgmN5ioHEl_DeFev9ie_SA,11910
pip/baseparser.py,sha256=AKMOeF3fTrRroiv0DmTQbdiLW0DQux2KqGC_dJJB9d0,10465
pip/cmdoptions.py,sha256=pRptFz05iFEfSW4Flg3x1_P92sYlFvq7elhnwujikNY,16473
pip/download.py,sha256=rA0wbmqC2n9ejX481YJSidmKgQqQDjdaxkHkHlAN68k,32171
pip/exceptions.py,sha256=BvqH-Jw3tP2b-2IJ2kjrQemOAPMqKrQMLRIZHZQpJXk,8121
pip/index.py,sha256=L6UhtAEZc2qw7BqfQrkPQcw2gCgEw3GukLRSA95BNyI,39950
pip/locations.py,sha256=9rJRlgonC6QC2zGDIn_7mXaoZ9_tF_IHM2BQhWVRgbo,5626
pip/pep425tags.py,sha256=q3kec4f6NHszuGYIhGIbVvs896D06uJAnKFgJ_wce44,10980
pip/status_codes.py,sha256=F6uDG6Gj7RNKQJUDnd87QKqI16Us-t-B0wPF_4QMpWc,156
pip/wheel.py,sha256=QSWmGs2ui-n4UMWm0JUY6aMCcwNKungVzbWsxI9KlJQ,32010
pip/_vendor/__init__.py,sha256=L-0x9jj0HSZen1Fm2U0GUbxfjfwQPIXc4XJ4IAxy8D8,4804
pip/commands/__init__.py,sha256=2Uq3HCdjchJD9FL1LB7rd5v6UySVAVizX0W3EX3hIoE,2244
pip/commands/check.py,sha256=-A7GI1-WZBh9a4P6UoH_aR-J7I8Lz8ly7m3wnCjmevs,1382
pip/commands/completion.py,sha256=kkPgVX7SUcJ_8Juw5GkgWaxHN9_45wmAr9mGs1zXEEs,2453
pip/commands/download.py,sha256=8RuuPmSYgAq3iEDTqZY_1PDXRqREdUULHNjWJeAv7Mo,7810
pip/commands/freeze.py,sha256=h6-yFMpjCjbNj8-gOm5UuoF6cg14N5rPV4TCi3_CeuI,2835
pip/commands/hash.py,sha256=MCt4jEFyfoce0lVeNEz1x49uaTY-VDkKiBvvxrVcHkw,1597
pip/commands/help.py,sha256=84HWkEdnGP_AEBHnn8gJP2Te0XTXRKFoXqXopbOZTNo,982
pip/commands/install.py,sha256=o-CR1TKf-b1qaFv47nNlawqsIfDjXyIzv_iJUw1Trag,18069
pip/commands/list.py,sha256=93bCiFyt2Qut_YHkYHJMZHpXladmxsjS-yOtZeb3uqI,11369
pip/commands/search.py,sha256=oTs9QNdefnrmCV_JeftG0PGiMuYVmiEDF1OUaYsmDao,4502
pip/commands/show.py,sha256=ZYM57_7U8KP9MQIIyHKQdZxmiEZByy-DRzB697VFoTY,5891
pip/commands/uninstall.py,sha256=tz8cXz4WdpUdnt3RvpdQwH6_SNMB50egBIZWa1dwfcc,2884
pip/commands/wheel.py,sha256=z5SEhws2YRMb0Ml1IEkg6jFZMLRpLl86bHCrQbYt5zo,7729
pip/compat/__init__.py,sha256=2Xs_IpsmdRgHbQgQO0c8_lPvHJnQXHyGWxPbLbYJL4c,4672
pip/compat/dictconfig.py,sha256=dRrelPDWrceDSzFT51RTEVY2GuM7UDyc5Igh_tn4Fvk,23096
pip/models/__init__.py,sha256=0Rs7_RA4DxeOkWT5Cq4CQzDrSEhvYcN3TH2cazr72PE,71
pip/models/index.py,sha256=pUfbO__v3mD9j-2n_ClwPS8pVyx4l2wIwyvWt8GMCRA,487
pip/operations/__init__.py,sha256=47DEQpj8HBSa-_TImW-5JCeuQeRkm5NMpJWZG3hSuFU,0
pip/operations/check.py,sha256=uwUN9cs1sPo7c0Sj6pRrSv7b22Pk29SXUImTelVchMQ,1590
pip/operations/freeze.py,sha256=k-7w7LsM-RpPv7ERBzHiPpYkH-GuYfHLyR-Cp_1VPL0,5194
pip/req/__init__.py,sha256=vFwZY8_Vc1WU1zFAespg1My_r_AT3n7cN0W9eX0EFqk,276
pip/req/req_file.py,sha256=fG9MDsXUNPhmGwxUiwrIXEynyD8Q7s3L47-hLZPDXq0,11926
pip/req/req_install.py,sha256=gYrH-lwQMmt55VVbav_EtRIPu94cQbHFHm_Kq6AeHbg,46487
pip/req/req_set.py,sha256=jHspXqcA2FxcF05dgUIAZ5huYPv6bn0wRUX0Z7PKmaA,34462
pip/req/req_uninstall.py,sha256=fdH2VgCjEC8NRYDS7fRu3ZJaBBUEy-N5muwxDX5MBNM,6897
pip/utils/__init__.py,sha256=zk1vF2EzHZX1ZKPwgeC9I6yKvs8IJ6NZEfXgp2IP8hI,27912
pip/utils/appdirs.py,sha256=kj2LK-I2fC5QnEh_A_v-ev_IQMcXaWWF5DE39sNvCLQ,8811
pip/utils/build.py,sha256=4smLRrfSCmXmjEnVnMFh2tBEpNcSLRe6J0ejZJ-wWJE,1312
pip/utils/deprecation.py,sha256=X_FMjtDbMJqfqEkdRrki-mYyIdPB6I6DHUTCA_ChY6M,2232
pip/utils/encoding.py,sha256=NQxGiFS5GbeAveLZTnx92t5r0PYqvt0iRnP2u9SGG1w,971
pip/utils/filesystem.py,sha256=ZEVBuYM3fqr2_lgOESh4Y7fPFszGD474zVm_M3Mb5Tk,899
pip/utils/glibc.py,sha256=jcQYjt_oJLPKVZB28Kauy4Sw70zS-wawxoU1HHX36_0,2939
pip/utils/hashes.py,sha256=oMk7cd3PbJgzpSQyXq1MytMud5f6H5Oa2YY5hYuCq6I,2866
pip/utils/logging.py,sha256=7yWu4gZw-Qclj7X80QVdpGWkdTWGKT4LiUVKcE04pro,3327
pip/utils/outdated.py,sha256=fNwOCL5r2EftPGhgCYGMKu032HC8cV-JAr9lp0HmToM,5455
pip/utils/packaging.py,sha256=qhmli14odw6DIhWJgQYS2Q0RrSbr8nXNcG48f5yTRms,2080
pip/utils/setuptools_build.py,sha256=0blfscmNJW_iZ5DcswJeDB_PbtTEjfK9RL1R1WEDW2E,278
pip/utils/ui.py,sha256=pbDkSAeumZ6jdZcOJ2yAbx8iBgeP2zfpqNnLJK1gskQ,11597
pip/vcs/__init__.py,sha256=WafFliUTHMmsSISV8PHp1M5EXDNSWyJr78zKaQmPLdY,12374
pip/vcs/bazaar.py,sha256=tYTwc4b4off8mr0O2o8SiGejqBDJxcbDBMSMd9-ISYc,3803
pip/vcs/git.py,sha256=5LfWryi78A-2ULjEZJvCTarJ_3l8venwXASlwm8hiug,11197
pip/vcs/mercurial.py,sha256=xG6rDiwHCRytJEs23SIHBXl_SwQo2jkkdD_6rVVP5h4,3472
pip/vcs/subversion.py,sha256=GAuX2Sk7IZvJyEzENKcVld_wGBrQ3fpXDlXjapZEYdI,9350
pip-9.0.1.dist-info/DESCRIPTION.rst,sha256=Va8Wj1XBpTbVQ2Z41mZRJdALEeziiS_ZewWn1H2ecY4,1287
pip-9.0.1.dist-info/METADATA,sha256=mvs_tLoKAbECXY_6QHiVWQsagSL-1UjolQTpScT8JSk,2529
pip-9.0.1.dist-info/RECORD,,
pip-9.0.1.dist-info/WHEEL,sha256=o2k-Qa-RMNIJmUdIc7KU6VWR_ErNRbWNlxDIpl7lm34,110
pip-9.0.1.dist-info/entry_points.txt,sha256=GWc-Wb9WUKZ1EuVWNz-G0l3BeIpbNJLx0OJbZ61AAV0,68
pip-9.0.1.dist-info/metadata.json,sha256=aqvkETDy4mHUBob-2Fn5WWlXORi_M2OSfQ2HQCUU_Fk,1565
pip-9.0.1.dist-info/top_level.txt,sha256=zuuue4knoyJ-UwPPXg8fezS7VCrXJQrAP7zeNuwvFQg,4
../../../bin/pip,sha256=kUtfTrIe4CRluRco6nKs-hUx0Eir2ABPF8Rr_1zK534,272
../../../bin/pip3,sha256=kUtfTrIe4CRluRco6nKs-hUx0Eir2ABPF8Rr_1zK534,272
../../../bin/pip3.4,sha256=kUtfTrIe4CRluRco6nKs-hUx0Eir2ABPF8Rr_1zK534,272
pip-9.0.1.dist-info/INSTALLER,sha256=zuuue4knoyJ-UwPPXg8fezS7VCrXJQrAP7zeNuwvFQg,4
pip/__pycache__/exceptions.cpython-34.pyc,,
pip/utils/__pycache__/ui.cpython-34.pyc,,
pip/__pycache__/basecommand.cpython-34.pyc,,
pip/commands/__pycache__/check.cpython-34.pyc,,
pip/utils/__pycache__/packaging.cpython-34.pyc,,
pip/utils/__pycache__/build.cpython-34.pyc,,
pip/vcs/__pycache__/__init__.cpython-34.pyc,,
pip/__pycache__/download.cpython-34.pyc,,
pip/utils/__pycache__/setuptools_build.cpython-34.pyc,,
pip/req/__pycache__/req_uninstall.cpython-34.pyc,,
pip/utils/__pycache__/deprecation.cpython-34.pyc,,
pip/operations/__pycache__/check.cpython-34.pyc,,
pip/_vendor/__pycache__/__init__.cpython-34.pyc,,
pip/utils/__pycache__/outdated.cpython-34.pyc,,
pip/commands/__pycache__/install.cpython-34.pyc,,
pip/operations/__pycache__/__init__.cpython-34.pyc,,
pip/commands/__pycache__/freeze.cpython-34.pyc,,
pip/req/__pycache__/req_set.cpython-34.pyc,,
pip/operations/__pycache__/freeze.cpython-34.pyc,,
pip/__pycache__/baseparser.cpython-34.pyc,,
pip/commands/__pycache__/hash.cpython-34.pyc,,
pip/commands/__pycache__/download.cpython-34.pyc,,
pip/commands/__pycache__/wheel.cpython-34.pyc,,
pip/commands/__pycache__/help.cpython-34.pyc,,
pip/utils/__pycache__/glibc.cpython-34.pyc,,
pip/__pycache__/locations.cpython-34.pyc,,
pip/commands/__pycache__/list.cpython-34.pyc,,
pip/compat/__pycache__/dictconfig.cpython-34.pyc,,
pip/__pycache__/__init__.cpython-34.pyc,,
pip/utils/__pycache__/hashes.cpython-34.pyc,,
pip/compat/__pycache__/__init__.cpython-34.pyc,,
pip/vcs/__pycache__/git.cpython-34.pyc,,
pip/req/__pycache__/__init__.cpython-34.pyc,,
pip/__pycache__/__main__.cpython-34.pyc,,
pip/__pycache__/status_codes.cpython-34.pyc,,
pip/models/__pycache__/index.cpython-34.pyc,,
pip/__pycache__/pep425tags.cpython-34.pyc,,
pip/commands/__pycache__/uninstall.cpython-34.pyc,,
pip/vcs/__pycache__/bazaar.cpython-34.pyc,,
pip/req/__pycache__/req_install.cpython-34.pyc,,
pip/vcs/__pycache__/mercurial.cpython-34.pyc,,
pip/commands/__pycache__/__init__.cpython-34.pyc,,
pip/commands/__pycache__/show.cpython-34.pyc,,
pip/__pycache__/index.cpython-34.pyc,,
pip/commands/__pycache__/completion.cpython-34.pyc,,
pip/req/__pycache__/req_file.cpython-34.pyc,,
pip/__pycache__/cmdoptions.cpython-34.pyc,,
pip/utils/__pycache__/filesystem.cpython-34.pyc,,
pip/__pycache__/wheel.cpython-34.pyc,,
pip/utils/__pycache__/appdirs.cpython-34.pyc,,
pip/utils/__pycache__/__init__.cpython-34.pyc,,
pip/vcs/__pycache__/subversion.cpython-34.pyc,,
pip/utils/__pycache__/logging.cpython-34.pyc,,
pip/commands/__pycache__/search.cpython-34.pyc,,
pip/utils/__pycache__/encoding.cpython-34.pyc,,
pip/models/__pycache__/__init__.cpython-34.pyc,,
@@ -1,5 +1,5 @@
Wheel-Version: 1.0
Generator: bdist_wheel (0.32.3)
Generator: bdist_wheel (0.29.0)
Root-Is-Purelib: true
Tag: py2-none-any
Tag: py3-none-any
@@ -0,0 +1,5 @@
[console_scripts]
pip = pip:main
pip3 = pip:main
pip3.5 = pip:main
@@ -0,0 +1 @@
{"classifiers": ["Development Status :: 5 - Production/Stable", "Intended Audience :: Developers", "License :: OSI Approved :: MIT License", "Topic :: Software Development :: Build Tools", "Programming Language :: Python :: 2", "Programming Language :: Python :: 2.6", "Programming Language :: Python :: 2.7", "Programming Language :: Python :: 3", "Programming Language :: Python :: 3.3", "Programming Language :: Python :: 3.4", "Programming Language :: Python :: 3.5", "Programming Language :: Python :: Implementation :: PyPy"], "extensions": {"python.commands": {"wrap_console": {"pip": "pip:main", "pip3": "pip:main", "pip3.5": "pip:main"}}, "python.details": {"contacts": [{"email": "python-virtualenv@groups.google.com", "name": "The pip developers", "role": "author"}], "document_names": {"description": "DESCRIPTION.rst"}, "project_urls": {"Home": "https://pip.pypa.io/"}}, "python.exports": {"console_scripts": {"pip": "pip:main", "pip3": "pip:main", "pip3.5": "pip:main"}}}, "extras": ["testing"], "generator": "bdist_wheel (0.29.0)", "keywords": ["easy_install", "distutils", "setuptools", "egg", "virtualenv"], "license": "MIT", "metadata_version": "2.0", "name": "pip", "requires_python": ">=2.6,!=3.0.*,!=3.1.*,!=3.2.*", "run_requires": [{"extra": "testing", "requires": ["mock", "pretend", "pytest", "scripttest (>=1.3)", "virtualenv (>=1.10)"]}], "summary": "The PyPA recommended tool for installing Python packages.", "test_requires": [{"requires": ["mock", "pretend", "pytest", "scripttest (>=1.3)", "virtualenv (>=1.10)"]}], "version": "9.0.1"}
331 lib/python3.4/site-packages/pip/__init__.py Normal file
@@ -0,0 +1,331 @@
#!/usr/bin/env python
from __future__ import absolute_import

import locale
import logging
import os
import optparse
import warnings

import sys
import re

# 2016-06-17 barry@debian.org: urllib3 1.14 added optional support for socks,
# but if invoked (i.e. imported), it will issue a warning to stderr if socks
# isn't available. requests unconditionally imports urllib3's socks contrib
# module, triggering this warning. The warning breaks DEP-8 tests (because of
# the stderr output) and is just plain annoying in normal usage. I don't want
# to add socks as yet another dependency for pip, nor do I want to allow-stder
# in the DEP-8 tests, so just suppress the warning. pdb tells me this has to
# be done before the import of pip.vcs.
from pip._vendor.requests.packages.urllib3.exceptions import DependencyWarning
warnings.filterwarnings("ignore", category=DependencyWarning)  # noqa


from pip.exceptions import InstallationError, CommandError, PipError
from pip.utils import get_installed_distributions, get_prog
from pip.utils import deprecation, dist_is_editable
from pip.vcs import git, mercurial, subversion, bazaar  # noqa
from pip.baseparser import ConfigOptionParser, UpdatingDefaultsHelpFormatter
from pip.commands import get_summaries, get_similar_commands
from pip.commands import commands_dict
from pip._vendor.requests.packages.urllib3.exceptions import (
    InsecureRequestWarning,
)


# assignment for flake8 to be happy

# This fixes a peculiarity when importing via __import__ - as we are
# initialising the pip module, "from pip import cmdoptions" is recursive
# and appears not to work properly in that situation.
import pip.cmdoptions
cmdoptions = pip.cmdoptions

# The version as used in the setup.py and the docs conf.py
__version__ = "9.0.1"


logger = logging.getLogger(__name__)

# Hide the InsecureRequestWarning from urllib3
warnings.filterwarnings("ignore", category=InsecureRequestWarning)


def autocomplete():
    """Command and option completion for the main option parser (and options)
    and its subcommands (and options).

    Enable by sourcing one of the completion shell scripts (bash, zsh or fish).
    """
    # Don't complete if user hasn't sourced bash_completion file.
    if 'PIP_AUTO_COMPLETE' not in os.environ:
        return
    cwords = os.environ['COMP_WORDS'].split()[1:]
    cword = int(os.environ['COMP_CWORD'])
    try:
        current = cwords[cword - 1]
    except IndexError:
        current = ''

    subcommands = [cmd for cmd, summary in get_summaries()]
    options = []
    # subcommand
    try:
        subcommand_name = [w for w in cwords if w in subcommands][0]
    except IndexError:
        subcommand_name = None

    parser = create_main_parser()
    # subcommand options
    if subcommand_name:
        # special case: 'help' subcommand has no options
        if subcommand_name == 'help':
            sys.exit(1)
        # special case: list locally installed dists for uninstall command
        if subcommand_name == 'uninstall' and not current.startswith('-'):
            installed = []
            lc = current.lower()
            for dist in get_installed_distributions(local_only=True):
                if dist.key.startswith(lc) and dist.key not in cwords[1:]:
                    installed.append(dist.key)
            # if there are no dists installed, fall back to option completion
            if installed:
                for dist in installed:
                    print(dist)
                sys.exit(1)

        subcommand = commands_dict[subcommand_name]()
        options += [(opt.get_opt_string(), opt.nargs)
                    for opt in subcommand.parser.option_list_all
                    if opt.help != optparse.SUPPRESS_HELP]

        # filter out previously specified options from available options
        prev_opts = [x.split('=')[0] for x in cwords[1:cword - 1]]
        options = [(x, v) for (x, v) in options if x not in prev_opts]
        # filter options by current input
        options = [(k, v) for k, v in options if k.startswith(current)]
        for option in options:
            opt_label = option[0]
            # append '=' to options which require args
            if option[1]:
                opt_label += '='
            print(opt_label)
    else:
        # show main parser options only when necessary
        if current.startswith('-') or current.startswith('--'):
            opts = [i.option_list for i in parser.option_groups]
            opts.append(parser.option_list)
            opts = (o for it in opts for o in it)

            subcommands += [i.get_opt_string() for i in opts
                            if i.help != optparse.SUPPRESS_HELP]

        print(' '.join([x for x in subcommands if x.startswith(current)]))
    sys.exit(1)


def create_main_parser():
    parser_kw = {
        'usage': '\n%prog <command> [options]',
        'add_help_option': False,
        'formatter': UpdatingDefaultsHelpFormatter(),
        'name': 'global',
        'prog': get_prog(),
    }

    parser = ConfigOptionParser(**parser_kw)
    parser.disable_interspersed_args()

    pip_pkg_dir = os.path.dirname(os.path.dirname(os.path.abspath(__file__)))
    parser.version = 'pip %s from %s (python %s)' % (
        __version__, pip_pkg_dir, sys.version[:3])

    # add the general options
    gen_opts = cmdoptions.make_option_group(cmdoptions.general_group, parser)
    parser.add_option_group(gen_opts)

    parser.main = True  # so the help formatter knows

    # create command listing for description
    command_summaries = get_summaries()
    description = [''] + ['%-27s %s' % (i, j) for i, j in command_summaries]
    parser.description = '\n'.join(description)

    return parser


def parseopts(args):
    parser = create_main_parser()

    # Note: parser calls disable_interspersed_args(), so the result of this
    # call is to split the initial args into the general options before the
    # subcommand and everything else.
    # For example:
    #  args: ['--timeout=5', 'install', '--user', 'INITools']
    #  general_options: ['--timeout==5']
    #  args_else: ['install', '--user', 'INITools']
    general_options, args_else = parser.parse_args(args)

    # --version
    if general_options.version:
        sys.stdout.write(parser.version)
        sys.stdout.write(os.linesep)
        sys.exit()

    # pip || pip help -> print_help()
    if not args_else or (args_else[0] == 'help' and len(args_else) == 1):
        parser.print_help()
        sys.exit()

    # the subcommand name
    cmd_name = args_else[0]

    if cmd_name not in commands_dict:
        guess = get_similar_commands(cmd_name)

        msg = ['unknown command "%s"' % cmd_name]
        if guess:
            msg.append('maybe you meant "%s"' % guess)

        raise CommandError(' - '.join(msg))

    # all the args without the subcommand
    cmd_args = args[:]
    cmd_args.remove(cmd_name)

    return cmd_name, cmd_args


def check_isolated(args):
    isolated = False

    if "--isolated" in args:
        isolated = True

    return isolated


def main(args=None):
    if args is None:
        args = sys.argv[1:]

    # Configure our deprecation warnings to be sent through loggers
    deprecation.install_warning_logger()

    autocomplete()

    try:
        cmd_name, cmd_args = parseopts(args)
    except PipError as exc:
        sys.stderr.write("ERROR: %s" % exc)
        sys.stderr.write(os.linesep)
        sys.exit(1)

    # Needed for locale.getpreferredencoding(False) to work
    # in pip.utils.encoding.auto_decode
    try:
        locale.setlocale(locale.LC_ALL, '')
    except locale.Error as e:
        # setlocale can apparently crash if locale are uninitialized
        logger.debug("Ignoring error %s when setting locale", e)
    command = commands_dict[cmd_name](isolated=check_isolated(cmd_args))
    return command.main(cmd_args)


# ###########################################################
# # Writing freeze files

class FrozenRequirement(object):

    def __init__(self, name, req, editable, comments=()):
        self.name = name
        self.req = req
        self.editable = editable
        self.comments = comments

    _rev_re = re.compile(r'-r(\d+)$')
    _date_re = re.compile(r'-(20\d\d\d\d\d\d)$')

    @classmethod
    def from_dist(cls, dist, dependency_links):
        location = os.path.normcase(os.path.abspath(dist.location))
        comments = []
        from pip.vcs import vcs, get_src_requirement
        if dist_is_editable(dist) and vcs.get_backend_name(location):
            editable = True
            try:
                req = get_src_requirement(dist, location)
            except InstallationError as exc:
                logger.warning(
                    "Error when trying to get requirement for VCS system %s, "
                    "falling back to uneditable format", exc
                )
                req = None
            if req is None:
                logger.warning(
                    'Could not determine repository location of %s', location
                )
                comments.append(
                    '## !! Could not determine repository location'
                )
                req = dist.as_requirement()
                editable = False
        else:
            editable = False
            req = dist.as_requirement()
            specs = req.specs
            assert len(specs) == 1 and specs[0][0] in ["==", "==="], \
                'Expected 1 spec with == or ===; specs = %r; dist = %r' % \
                (specs, dist)
            version = specs[0][1]
            ver_match = cls._rev_re.search(version)
            date_match = cls._date_re.search(version)
            if ver_match or date_match:
                svn_backend = vcs.get_backend('svn')
                if svn_backend:
                    svn_location = svn_backend().get_location(
                        dist,
                        dependency_links,
                    )
                if not svn_location:
                    logger.warning(
                        'Warning: cannot find svn location for %s', req)
                    comments.append(
                        '## FIXME: could not find svn URL in dependency_links '
                        'for this package:'
                    )
                else:
                    comments.append(
                        '# Installing as editable to satisfy requirement %s:' %
                        req
                    )
                    if ver_match:
                        rev = ver_match.group(1)
                    else:
                        rev = '{%s}' % date_match.group(1)
                    editable = True
                    req = '%s@%s#egg=%s' % (
                        svn_location,
                        rev,
                        cls.egg_name(dist)
                    )
        return cls(dist.project_name, req, editable, comments)

    @staticmethod
    def egg_name(dist):
        name = dist.egg_name()
        match = re.search(r'-py\d\.\d$', name)
        if match:
            name = name[:match.start()]
        return name

    def __str__(self):
        req = self.req
        if self.editable:
            req = '-e %s' % req
        return '\n'.join(list(self.comments) + [str(req)]) + '\n'


if __name__ == '__main__':
    sys.exit(main())
@@ -13,7 +13,7 @@ if __package__ == '':
    path = os.path.dirname(os.path.dirname(__file__))
    sys.path.insert(0, path)

from pip._internal import main as _main  # isort:skip  # noqa
import pip  # noqa

if __name__ == '__main__':
    sys.exit(_main())
    sys.exit(pip.main())
@@ -70,13 +70,11 @@ if DEBUNDLED:
    vendored("six")
    vendored("six.moves")
    vendored("six.moves.urllib")
    vendored("six.moves.urllib.parse")
    vendored("packaging")
    vendored("packaging.version")
    vendored("packaging.specifiers")
    vendored("pkg_resources")
    vendored("progress")
    vendored("pytoml")
    vendored("retrying")
    vendored("requests")
    vendored("requests.packages")
@@ -111,4 +109,3 @@ if DEBUNDLED:
    vendored("requests.packages.urllib3.util.ssl_")
    vendored("requests.packages.urllib3.util.timeout")
    vendored("requests.packages.urllib3.util.url")
    vendored("urllib3")
@@ -2,48 +2,41 @@
from __future__ import absolute_import

import logging
import logging.config
import optparse
import os
import sys
import optparse
import warnings

from pip._internal.cli import cmdoptions
from pip._internal.cli.parser import (
    ConfigOptionParser, UpdatingDefaultsHelpFormatter,
)
from pip._internal.cli.status_codes import (
    ERROR, PREVIOUS_BUILD_DIR_ERROR, SUCCESS, UNKNOWN_ERROR,
    VIRTUALENV_NOT_FOUND,
)
from pip._internal.download import PipSession
from pip._internal.exceptions import (
    BadCommand, CommandError, InstallationError, PreviousBuildDirError,
    UninstallationError,
)
from pip._internal.index import PackageFinder
from pip._internal.locations import running_under_virtualenv
from pip._internal.req.constructors import (
    install_req_from_editable, install_req_from_line,
)
from pip._internal.req.req_file import parse_requirements
from pip._internal.utils.logging import setup_logging
from pip._internal.utils.misc import get_prog, normalize_path
from pip._internal.utils.outdated import pip_version_check
from pip._internal.utils.typing import MYPY_CHECK_RUNNING
from pip import cmdoptions
from pip.index import PackageFinder
from pip.locations import running_under_virtualenv
from pip.download import PipSession
from pip.exceptions import (BadCommand, InstallationError, UninstallationError,
                            CommandError, PreviousBuildDirError)

from pip.compat import logging_dictConfig
from pip.baseparser import ConfigOptionParser, UpdatingDefaultsHelpFormatter
from pip.req import InstallRequirement, parse_requirements
from pip.status_codes import (
    SUCCESS, ERROR, UNKNOWN_ERROR, VIRTUALENV_NOT_FOUND,
    PREVIOUS_BUILD_DIR_ERROR,
)
from pip.utils import deprecation, get_prog, normalize_path
from pip.utils.logging import IndentingFormatter
from pip.utils.outdated import pip_version_check

if MYPY_CHECK_RUNNING:
    from typing import Optional  # noqa: F401

__all__ = ['Command']


logger = logging.getLogger(__name__)


class Command(object):
    name = None  # type: Optional[str]
    usage = None  # type: Optional[str]
    hidden = False  # type: bool
    ignore_require_venv = False  # type: bool
    name = None
    usage = None
    hidden = False
    log_streams = ("ext://sys.stdout", "ext://sys.stderr")

    def __init__(self, isolated=False):
        parser_kw = {
@@ -112,18 +105,97 @@ class Command(object):
    def main(self, args):
        options, args = self.parse_args(args)

        # Set verbosity so that it can be used elsewhere.
        self.verbosity = options.verbose - options.quiet
        if options.quiet:
            if options.quiet == 1:
                level = "WARNING"
            if options.quiet == 2:
                level = "ERROR"
            else:
                level = "CRITICAL"
        elif options.verbose:
            level = "DEBUG"
        else:
            level = "INFO"

        setup_logging(
            verbosity=self.verbosity,
            no_color=options.no_color,
            user_log_file=options.log,
        # The root logger should match the "console" level *unless* we
        # specified "--log" to send debug logs to a file.
        root_level = level
        if options.log:
            root_level = "DEBUG"

        logging_dictConfig({
            "version": 1,
            "disable_existing_loggers": False,
            "filters": {
                "exclude_warnings": {
                    "()": "pip.utils.logging.MaxLevelFilter",
                    "level": logging.WARNING,
                },
            },
            "formatters": {
                "indent": {
                    "()": IndentingFormatter,
                    "format": "%(message)s",
                },
            },
            "handlers": {
                "console": {
                    "level": level,
                    "class": "pip.utils.logging.ColorizedStreamHandler",
                    "stream": self.log_streams[0],
                    "filters": ["exclude_warnings"],
                    "formatter": "indent",
                },
                "console_errors": {
                    "level": "WARNING",
                    "class": "pip.utils.logging.ColorizedStreamHandler",
                    "stream": self.log_streams[1],
                    "formatter": "indent",
                },
                "user_log": {
                    "level": "DEBUG",
                    "class": "pip.utils.logging.BetterRotatingFileHandler",
                    "filename": options.log or "/dev/null",
                    "delay": True,
                    "formatter": "indent",
                },
            },
            "root": {
                "level": root_level,
                "handlers": list(filter(None, [
                    "console",
                    "console_errors",
                    "user_log" if options.log else None,
                ])),
            },
            # Disable any logging besides WARNING unless we have DEBUG level
            # logging enabled. These use both pip._vendor and the bare names
            # for the case where someone unbundles our libraries.
            "loggers": dict(
                (
                    name,
                    {
                        "level": (
                            "WARNING"
                            if level in ["INFO", "ERROR"]
                            else "DEBUG"
                        ),
                    },
                )
                for name in ["pip._vendor", "distlib", "requests", "urllib3"]
            ),
        })

        if sys.version_info[:2] == (2, 6):
            warnings.warn(
                "Python 2.6 is no longer supported by the Python core team, "
                "please upgrade your Python. A future version of pip will "
                "drop support for Python 2.6",
                deprecation.Python26DeprecationWarning
            )

        # TODO: Try to get these passing down from the command?
        # TODO: try to get these passing down from the command?
        #       without resorting to os.environ to hold these.
        #       This also affects isolated builds and it should.

        if options.no_input:
            os.environ['PIP_NO_INPUT'] = '1'
@@ -131,7 +203,7 @@ class Command(object):
        if options.exists_action:
            os.environ['PIP_EXISTS_ACTION'] = ' '.join(options.exists_action)

        if options.require_venv and not self.ignore_require_venv:
        if options.require_venv:
            # If a venv is required check if it can really be found
            if not running_under_virtualenv():
                logger.critical(
@@ -165,29 +237,19 @@ class Command(object):
            logger.debug('Exception information:', exc_info=True)

            return ERROR
        except BaseException:
        except:
            logger.critical('Exception:', exc_info=True)

            return UNKNOWN_ERROR
        finally:
            allow_version_check = (
                # Does this command have the index_group options?
                hasattr(options, "no_index") and
                # Is this command allowed to perform this check?
                not (options.disable_pip_version_check or options.no_index)
            )
            # Check if we're using the latest version of pip available
            if allow_version_check:
                session = self._build_session(
            if (not options.disable_pip_version_check and not
                    getattr(options, "no_index", False)):
                with self._build_session(
                    options,
                    retries=0,
                    timeout=min(5, options.timeout)
                )
                with session:
                    pip_version_check(session, options)

            # Shutdown the logging module
            logging.shutdown()
                        timeout=min(5, options.timeout)) as session:
                    pip_version_check(session)

        return SUCCESS
@@ -200,56 +262,54 @@ class RequirementCommand(Command):
        """
        Marshal cmd line args into a requirement set.
        """
        # NOTE: As a side-effect, options.require_hashes and
        #       requirement_set.require_hashes may be updated

        for filename in options.constraints:
            for req_to_add in parse_requirements(
            for req in parse_requirements(
                    filename,
                    constraint=True, finder=finder, options=options,
                    session=session, wheel_cache=wheel_cache):
                req_to_add.is_direct = True
                requirement_set.add_requirement(req_to_add)
                requirement_set.add_requirement(req)

        for req in args:
            req_to_add = install_req_from_line(
            requirement_set.add_requirement(
                InstallRequirement.from_line(
                    req, None, isolated=options.isolated_mode,
                    wheel_cache=wheel_cache
                )
            req_to_add.is_direct = True
            requirement_set.add_requirement(req_to_add)
            )

        for req in options.editables:
            req_to_add = install_req_from_editable(
            requirement_set.add_requirement(
                InstallRequirement.from_editable(
                    req,
                    default_vcs=options.default_vcs,
                    isolated=options.isolated_mode,
                    wheel_cache=wheel_cache
                )
            req_to_add.is_direct = True
            requirement_set.add_requirement(req_to_add)
            )

        found_req_in_file = False
        for filename in options.requirements:
            for req_to_add in parse_requirements(
            for req in parse_requirements(
                    filename,
                    finder=finder, options=options, session=session,
                    wheel_cache=wheel_cache):
                req_to_add.is_direct = True
                requirement_set.add_requirement(req_to_add)
                found_req_in_file = True
                requirement_set.add_requirement(req)
        # If --require-hashes was a line in a requirements file, tell
        # RequirementSet about it:
        requirement_set.require_hashes = options.require_hashes

        if not (args or options.editables or options.requirements):
        if not (args or options.editables or found_req_in_file):
            opts = {'name': name}
            if options.find_links:
                raise CommandError(
                    'You must give at least one requirement to %(name)s '
                    '(maybe you meant "pip %(name)s %(links)s"?)' %
                msg = ('You must give at least one requirement to '
                       '%(name)s (maybe you meant "pip %(name)s '
                       '%(links)s"?)' %
                       dict(opts, links=' '.join(options.find_links)))
            else:
                raise CommandError(
                    'You must give at least one requirement to %(name)s '
                    '(see "pip help %(name)s")' % opts)
                msg = ('You must give at least one requirement '
                       'to %(name)s (see "pip help %(name)s")' % opts)
            logger.warning(msg)

    def _build_package_finder(self, options, session,
                              platform=None, python_versions=None,
@@ -274,5 +334,4 @@ class RequirementCommand(Command):
            versions=python_versions,
            abi=abi,
            implementation=implementation,
            prefer_binary=options.prefer_binary,
        )
@@ -1,19 +1,23 @@
"""Base option parser setup"""
from __future__ import absolute_import

import logging
import optparse
import sys
import optparse
import os
import re
import textwrap
from distutils.util import strtobool

from pip._vendor.six import string_types
from pip._vendor.six.moves import configparser
from pip.locations import (
    legacy_config_file, config_basename, running_under_virtualenv,
    site_config_files
)
from pip.utils import appdirs, get_terminal_size

from pip._internal.cli.status_codes import UNKNOWN_ERROR
from pip._internal.configuration import Configuration, ConfigurationError
from pip._internal.utils.compat import get_terminal_size

logger = logging.getLogger(__name__)
_environ_prefix_re = re.compile(r"^PIP_", re.I)


class PrettyHelpFormatter(optparse.IndentedHelpFormatter):
@ -133,15 +137,58 @@ class ConfigOptionParser(CustomOptionParser):
|
|||
"""Custom option parser which updates its defaults by checking the
|
||||
configuration files and environmental variables"""
|
||||
|
||||
isolated = False
|
||||
|
||||
def __init__(self, *args, **kwargs):
|
||||
self.config = configparser.RawConfigParser()
|
||||
self.name = kwargs.pop('name')
|
||||
|
||||
isolated = kwargs.pop("isolated", False)
|
||||
self.config = Configuration(isolated)
|
||||
|
||||
self.isolated = kwargs.pop("isolated", False)
|
||||
self.files = self.get_config_files()
|
||||
if self.files:
|
||||
self.config.read(self.files)
|
||||
assert self.name
|
||||
optparse.OptionParser.__init__(self, *args, **kwargs)
|
||||
|
||||
def get_config_files(self):
|
||||
# the files returned by this method will be parsed in order with the
|
||||
# first files listed being overridden by later files in standard
|
||||
# ConfigParser fashion
|
||||
config_file = os.environ.get('PIP_CONFIG_FILE', False)
|
||||
if config_file == os.devnull:
|
||||
return []
|
||||
|
||||
# at the base we have any site-wide configuration
|
||||
files = list(site_config_files)
|
||||
|
||||
# per-user configuration next
|
||||
if not self.isolated:
|
||||
if config_file and os.path.exists(config_file):
|
||||
files.append(config_file)
|
||||
else:
|
||||
# This is the legacy config file, we consider it to be a lower
|
||||
# priority than the new file location.
|
||||
files.append(legacy_config_file)
|
||||
|
||||
# This is the new config file, we consider it to be a higher
|
||||
# priority than the legacy file.
|
||||
files.append(
|
||||
os.path.join(
|
||||
appdirs.user_config_dir("pip"),
|
||||
config_basename,
|
||||
)
|
||||
)
|
||||
|
||||
# finally virtualenv configuration first trumping others
|
||||
if running_under_virtualenv():
|
||||
venv_config_file = os.path.join(
|
||||
sys.prefix,
|
||||
config_basename,
|
||||
)
|
||||
if os.path.exists(venv_config_file):
|
||||
files.append(venv_config_file)
|
||||
|
||||
return files
|
||||
|
||||
def check_default(self, option, key, val):
|
||||
try:
|
||||
return option.check_value(key, val)
|
||||
|
|
@ -149,43 +196,30 @@ class ConfigOptionParser(CustomOptionParser):
|
|||
print("An error occurred during configuration: %s" % exc)
|
||||
sys.exit(3)
|
||||
|
||||
def _get_ordered_configuration_items(self):
|
||||
# Configuration gives keys in an unordered manner. Order them.
|
||||
override_order = ["global", self.name, ":env:"]
|
||||
|
||||
# Pool the options into different groups
|
||||
section_items = {name: [] for name in override_order}
|
||||
for section_key, val in self.config.items():
|
||||
# ignore empty values
|
||||
if not val:
|
||||
logger.debug(
|
||||
"Ignoring configuration key '%s' as it's value is empty.",
|
||||
section_key
|
||||
)
|
||||
continue
|
||||
|
||||
section, key = section_key.split(".", 1)
|
||||
if section in override_order:
|
||||
section_items[section].append((key, val))
|
||||
|
||||
# Yield each group in their override order
|
||||
for section in override_order:
|
||||
for key, val in section_items[section]:
|
||||
yield key, val
|
||||
|
||||
def _update_defaults(self, defaults):
|
||||
"""Updates the given defaults with values from the config files and
|
||||
the environ. Does a little special handling for certain types of
|
||||
options (lists)."""
|
||||
|
||||
# Then go and look for the other sources of configuration:
|
||||
config = {}
|
||||
# 1. config files
|
||||
for section in ('global', self.name):
|
||||
config.update(
|
||||
self.normalize_keys(self.get_config_section(section))
|
||||
)
|
||||
# 2. environmental variables
|
||||
if not self.isolated:
|
||||
config.update(self.normalize_keys(self.get_environ_vars()))
|
||||
# Accumulate complex default state.
|
||||
self.values = optparse.Values(self.defaults)
|
||||
late_eval = set()
|
||||
# Then set the options with those values
|
||||
for key, val in self._get_ordered_configuration_items():
|
||||
# '--' because configuration supports only long names
|
||||
option = self.get_option('--' + key)
|
||||
for key, val in config.items():
|
||||
# ignore empty values
|
||||
if not val:
|
||||
continue
|
||||
|
||||
option = self.get_option(key)
|
||||
# Ignore options not present in this parser. E.g. non-globals put
|
||||
# in [global] by users that want them to apply to all applicable
|
||||
# commands.
|
||||
|
|
@ -193,14 +227,7 @@ class ConfigOptionParser(CustomOptionParser):
|
|||
continue
|
||||
|
||||
if option.action in ('store_true', 'store_false', 'count'):
|
||||
try:
|
||||
val = strtobool(val)
|
||||
except ValueError:
|
||||
error_msg = invalid_config_error_message(
|
||||
option.action, key, val
|
||||
)
|
||||
self.error(error_msg)
|
||||
|
||||
elif option.action == 'append':
|
||||
val = val.split()
|
||||
val = [self.check_default(option, key, v) for v in val]
|
||||
|
|
@ -222,6 +249,30 @@ class ConfigOptionParser(CustomOptionParser):
|
|||
self.values = None
|
||||
return defaults
|
||||
|
||||
def normalize_keys(self, items):
|
||||
"""Return a config dictionary with normalized keys regardless of
|
||||
whether the keys were specified in environment variables or in config
|
||||
files"""
|
||||
normalized = {}
|
||||
for key, val in items:
|
||||
key = key.replace('_', '-')
|
||||
if not key.startswith('--'):
|
||||
key = '--%s' % key # only prefer long opts
|
||||
normalized[key] = val
|
||||
return normalized
|
||||
|
||||
def get_config_section(self, name):
|
||||
"""Get a section of a configuration"""
|
||||
if self.config.has_section(name):
|
||||
return self.config.items(name)
|
||||
return []
|
||||
|
||||
def get_environ_vars(self):
|
||||
"""Returns a generator with all environmental vars with prefix PIP_"""
|
||||
for key, val in os.environ.items():
|
||||
if _environ_prefix_re.search(key):
|
||||
yield (_environ_prefix_re.sub("", key).lower(), val)
|
||||
|
||||
def get_default_values(self):
|
||||
"""Overriding to make updating the defaults after instantiation of
|
||||
the option parser possible, _update_defaults() does the dirty work."""
|
||||
|
|
@ -229,12 +280,6 @@ class ConfigOptionParser(CustomOptionParser):
|
|||
# Old, pre-Optik 1.5 behaviour.
|
||||
return optparse.Values(self.defaults)
|
||||
|
||||
# Load the configuration, or error out in case of an error
|
||||
try:
|
||||
self.config.load()
|
||||
except ConfigurationError as err:
|
||||
self.exit(UNKNOWN_ERROR, str(err))
|
||||
|
||||
defaults = self._update_defaults(self.defaults.copy()) # ours
|
||||
for option in self._get_all_options():
|
||||
default = defaults.get(option.dest)
|
||||
|
|
@ -245,17 +290,4 @@ class ConfigOptionParser(CustomOptionParser):
|
|||
|
||||
def error(self, msg):
|
||||
self.print_usage(sys.stderr)
|
||||
self.exit(UNKNOWN_ERROR, "%s\n" % msg)
|
||||
|
||||
|
||||
def invalid_config_error_message(action, key, val):
|
||||
"""Returns a better error message when invalid configuration option
|
||||
is provided."""
|
||||
if action in ('store_true', 'store_false'):
|
||||
return ("{0} is not a valid value for {1} option, "
|
||||
"please specify a boolean value like yes/no, "
|
||||
"true/false or 1/0 instead.").format(val, key)
|
||||
|
||||
return ("{0} is not a valid value for {1} option, "
|
||||
"please specify a numerical value like 1/0 "
|
||||
"instead.").format(val, key)
|
||||
self.exit(2, "%s\n" % msg)
|
||||
|
|
@@ -9,20 +9,16 @@ pass on state. To be consistent, all options will follow this design.
"""
from __future__ import absolute_import

import warnings
from functools import partial
from optparse import SUPPRESS_HELP, Option, OptionGroup
from optparse import OptionGroup, SUPPRESS_HELP, Option
import warnings

from pip._internal.exceptions import CommandError
from pip._internal.locations import USER_CACHE_DIR, src_prefix
from pip._internal.models.format_control import FormatControl
from pip._internal.models.index import PyPI
from pip._internal.utils.hashes import STRONG_HASHES
from pip._internal.utils.typing import MYPY_CHECK_RUNNING
from pip._internal.utils.ui import BAR_TYPES

if MYPY_CHECK_RUNNING:
    from typing import Any  # noqa: F401
from pip.index import (
    FormatControl, fmt_ctl_handle_mutual_exclude, fmt_ctl_no_binary,
    fmt_ctl_no_use_wheel)
from pip.models import PyPI
from pip.locations import USER_CACHE_DIR, src_prefix
from pip.utils.hashes import STRONG_HASHES


def make_option_group(group, parser):

@@ -37,6 +33,12 @@ def make_option_group(group, parser):
    return option_group


def resolve_wheel_no_use_binary(options):
    if not options.use_wheel:
        control = options.format_control
        fmt_ctl_no_use_wheel(control)


def check_install_build_global(options, check_options=None):
    """Disable wheels if per-setup.py call options are set.

@@ -52,50 +54,10 @@ def check_install_build_global(options, check_options=None):
    names = ["build_options", "global_options", "install_options"]
    if any(map(getname, names)):
        control = options.format_control
        control.disallow_binaries()
        fmt_ctl_no_binary(control)
        warnings.warn(
            'Disabling all use of wheels due to the use of --build-options '
            '/ --global-options / --install-options.', stacklevel=2,
        )


def check_dist_restriction(options, check_target=False):
    """Function for determining if custom platform options are allowed.

    :param options: The OptionParser options.
    :param check_target: Whether or not to check if --target is being used.
    """
    dist_restriction_set = any([
        options.python_version,
        options.platform,
        options.abi,
        options.implementation,
    ])

    binary_only = FormatControl(set(), {':all:'})
    sdist_dependencies_allowed = (
        options.format_control != binary_only and
        not options.ignore_dependencies
    )

    # Installations or downloads using dist restrictions must not combine
    # source distributions and dist-specific wheels, as they are not
    # gauranteed to be locally compatible.
    if dist_restriction_set and sdist_dependencies_allowed:
        raise CommandError(
            "When restricting platform and interpreter constraints using "
            "--python-version, --platform, --abi, or --implementation, "
            "either --no-deps must be set, or --only-binary=:all: must be "
            "set and --no-binary must not be set (or must be set to "
            ":none:)."
        )

    if check_target:
        if dist_restriction_set and not options.target_dir:
            raise CommandError(
                "Can not use any platform or abi specific options unless "
                "installing via '--target'"
            )
            '/ --global-options / --install-options.', stacklevel=2)


###########
@@ -107,8 +69,7 @@ help_ = partial(
    '-h', '--help',
    dest='help',
    action='help',
    help='Show help.',
)  # type: Any
    help='Show help.')

isolated_mode = partial(
    Option,

@@ -129,8 +90,7 @@ require_virtualenv = partial(
    dest='require_venv',
    action='store_true',
    default=False,
    help=SUPPRESS_HELP
)  # type: Any
    help=SUPPRESS_HELP)

verbose = partial(
    Option,

@@ -141,22 +101,12 @@ verbose = partial(
    help='Give more output. Option is additive, and can be used up to 3 times.'
)

no_color = partial(
    Option,
    '--no-color',
    dest='no_color',
    action='store_true',
    default=False,
    help="Suppress colored output",
)

version = partial(
    Option,
    '-V', '--version',
    dest='version',
    action='store_true',
    help='Show version and exit.',
)  # type: Any
    help='Show version and exit.')

quiet = partial(
    Option,

@@ -164,25 +114,10 @@ quiet = partial(
    dest='quiet',
    action='count',
    default=0,
    help=(
        'Give less output. Option is additive, and can be used up to 3'
    help=('Give less output. Option is additive, and can be used up to 3'
          ' times (corresponding to WARNING, ERROR, and CRITICAL logging'
          ' levels).'
    ),
)  # type: Any

progress_bar = partial(
    Option,
    '--progress-bar',
    dest='progress_bar',
    type='choice',
    choices=list(BAR_TYPES.keys()),
    default='on',
    help=(
        'Specify type of progress to be displayed [' +
        '|'.join(BAR_TYPES.keys()) + '] (default: %default)'
    ),
)  # type: Any
          ' levels).')
)

log = partial(
    Option,

@@ -190,7 +125,7 @@ log = partial(
    dest="log",
    metavar="path",
    help="Path to a verbose appending log."
)  # type: Any
)

no_input = partial(
    Option,

@@ -199,8 +134,7 @@ no_input = partial(
    dest='no_input',
    action='store_true',
    default=False,
    help=SUPPRESS_HELP
)  # type: Any
    help=SUPPRESS_HELP)

proxy = partial(
    Option,

@@ -208,8 +142,7 @@ proxy = partial(
    dest='proxy',
    type='str',
    default='',
    help="Specify a proxy in the form [user:passwd@]proxy.server:port."
)  # type: Any
    help="Specify a proxy in the form [user:passwd@]proxy.server:port.")

retries = partial(
    Option,

@@ -218,8 +151,7 @@ retries = partial(
    type='int',
    default=5,
    help="Maximum number of retries each connection should attempt "
         "(default %default times).",
)  # type: Any
         "(default %default times).")

timeout = partial(
    Option,

@@ -228,8 +160,16 @@ timeout = partial(
    dest='timeout',
    type='float',
    default=15,
    help='Set the socket timeout (default %default seconds).',
)  # type: Any
    help='Set the socket timeout (default %default seconds).')

default_vcs = partial(
    Option,
    # The default version control system for editables, e.g. 'svn'
    '--default-vcs',
    dest='default_vcs',
    type='str',
    default='',
    help=SUPPRESS_HELP)

skip_requirements_regex = partial(
    Option,

@@ -238,8 +178,7 @@ skip_requirements_regex = partial(
    dest='skip_requirements_regex',
    type='str',
    default='',
    help=SUPPRESS_HELP,
)  # type: Any
    help=SUPPRESS_HELP)


def exists_action():

@@ -253,8 +192,7 @@ def exists_action():
        action='append',
        metavar='action',
        help="Default action when a path already exists: "
             "(s)witch, (i)gnore, (w)ipe, (b)ackup, (a)bort).",
    )
             "(s)witch, (i)gnore, (w)ipe, (b)ackup, (a)bort.")


cert = partial(

@@ -263,8 +201,7 @@ cert = partial(
    dest='cert',
    type='str',
    metavar='path',
    help="Path to alternate CA bundle.",
)  # type: Any
    help="Path to alternate CA bundle.")

client_cert = partial(
    Option,

@@ -274,8 +211,7 @@ client_cert = partial(
    default=None,
    metavar='path',
    help="Path to SSL client certificate, a single file containing the "
         "private key and the certificate in PEM format.",
)  # type: Any
         "private key and the certificate in PEM format.")

index_url = partial(
    Option,

@@ -286,8 +222,7 @@ index_url = partial(
    help="Base URL of Python Package Index (default %default). "
         "This should point to a repository compliant with PEP 503 "
         "(the simple repository API) or a local directory laid out "
         "in the same format.",
)  # type: Any
         "in the same format.")


def extra_index_url():

@@ -299,7 +234,7 @@ def extra_index_url():
        default=[],
        help="Extra URLs of package indexes to use in addition to "
             "--index-url. Should follow the same rules as "
             "--index-url.",
             "--index-url."
    )


@@ -309,8 +244,7 @@ no_index = partial(
    dest='no_index',
    action='store_true',
    default=False,
    help='Ignore package index (only looking at --find-links URLs instead).',
)  # type: Any
    help='Ignore package index (only looking at --find-links URLs instead).')


def find_links():

@@ -322,10 +256,30 @@ def find_links():
        metavar='url',
        help="If a url or path to an html file, then parse for links to "
             "archives. If a local path or file:// url that's a directory, "
             "then look for archives in the directory listing.",
             "then look for archives in the directory listing.")


def allow_external():
    return Option(
        "--allow-external",
        dest="allow_external",
        action="append",
        default=[],
        metavar="PACKAGE",
        help=SUPPRESS_HELP,
    )


allow_all_external = partial(
    Option,
    "--allow-all-external",
    dest="allow_all_external",
    action="store_true",
    default=False,
    help=SUPPRESS_HELP,
)


def trusted_host():
    return Option(
        "--trusted-host",
@@ -338,6 +292,38 @@ def trusted_host():
    )


# Remove after 7.0
no_allow_external = partial(
    Option,
    "--no-allow-external",
    dest="allow_all_external",
    action="store_false",
    default=False,
    help=SUPPRESS_HELP,
)


# Remove --allow-insecure after 7.0
def allow_unsafe():
    return Option(
        "--allow-unverified", "--allow-insecure",
        dest="allow_unverified",
        action="append",
        default=[],
        metavar="PACKAGE",
        help=SUPPRESS_HELP,
    )

# Remove after 7.0
no_allow_unsafe = partial(
    Option,
    "--no-allow-insecure",
    dest="allow_all_insecure",
    action="store_false",
    default=False,
    help=SUPPRESS_HELP
)

# Remove after 1.5
process_dependency_links = partial(
    Option,

@@ -346,7 +332,7 @@ process_dependency_links = partial(
    action="store_true",
    default=False,
    help="Enable the processing of dependency links.",
)  # type: Any
)


def constraints():

@@ -357,8 +343,7 @@ def constraints():
        default=[],
        metavar='file',
        help='Constrain versions using the given constraints file. '
             'This option can be used multiple times.'
    )
             'This option can be used multiple times.')


def requirements():

@@ -369,8 +354,7 @@ def requirements():
        default=[],
        metavar='file',
        help='Install from the given requirements file. '
             'This option can be used multiple times.'
    )
             'This option can be used multiple times.')


def editable():

@@ -384,7 +368,6 @@ def editable():
              '"develop mode") from a local project path or a VCS url.'),
    )


src = partial(
    Option,
    '--src', '--source', '--source-dir', '--source-directory',

@@ -394,7 +377,28 @@ src = partial(
    help='Directory to check out editable projects into. '
         'The default in a virtualenv is "<venv path>/src". '
         'The default for global installs is "<current dir>/src".'
)  # type: Any
)

# XXX: deprecated, remove in 9.0
use_wheel = partial(
    Option,
    '--use-wheel',
    dest='use_wheel',
    action='store_true',
    default=True,
    help=SUPPRESS_HELP,
)

# XXX: deprecated, remove in 9.0
no_use_wheel = partial(
    Option,
    '--no-use-wheel',
    dest='use_wheel',
    action='store_false',
    default=True,
    help=('Do not Find and prefer wheel archives when searching indexes and '
          'find-links locations. DEPRECATED in favour of --no-binary.'),
)


def _get_format_control(values, option):
@@ -403,112 +407,41 @@ def _get_format_control(values, option):


def _handle_no_binary(option, opt_str, value, parser):
    existing = _get_format_control(parser.values, option)
    FormatControl.handle_mutual_excludes(
        value, existing.no_binary, existing.only_binary,
    )
    existing = getattr(parser.values, option.dest)
    fmt_ctl_handle_mutual_exclude(
        value, existing.no_binary, existing.only_binary)


def _handle_only_binary(option, opt_str, value, parser):
    existing = _get_format_control(parser.values, option)
    FormatControl.handle_mutual_excludes(
        value, existing.only_binary, existing.no_binary,
    )
    existing = getattr(parser.values, option.dest)
    fmt_ctl_handle_mutual_exclude(
        value, existing.only_binary, existing.no_binary)


def no_binary():
    format_control = FormatControl(set(), set())
    return Option(
        "--no-binary", dest="format_control", action="callback",
        callback=_handle_no_binary, type="str",
        default=format_control,
        default=FormatControl(set(), set()),
        help="Do not use binary packages. Can be supplied multiple times, and "
             "each time adds to the existing value. Accepts either :all: to "
             "disable all binary packages, :none: to empty the set, or one or "
             "more package names with commas between them. Note that some "
             "packages are tricky to compile and may fail to install when "
             "this option is used on them.",
    )
             "this option is used on them.")


def only_binary():
    format_control = FormatControl(set(), set())
    return Option(
        "--only-binary", dest="format_control", action="callback",
        callback=_handle_only_binary, type="str",
        default=format_control,
        default=FormatControl(set(), set()),
        help="Do not use source packages. Can be supplied multiple times, and "
             "each time adds to the existing value. Accepts either :all: to "
             "disable all source packages, :none: to empty the set, or one or "
             "more package names with commas between them. Packages without "
             "binary distributions will fail to install when this option is "
             "used on them.",
    )


platform = partial(
    Option,
    '--platform',
    dest='platform',
    metavar='platform',
    default=None,
    help=("Only use wheels compatible with <platform>. "
          "Defaults to the platform of the running system."),
)


python_version = partial(
    Option,
    '--python-version',
    dest='python_version',
    metavar='python_version',
    default=None,
    help=("Only use wheels compatible with Python "
          "interpreter version <version>. If not specified, then the "
          "current system interpreter minor version is used. A major "
          "version (e.g. '2') can be specified to match all "
          "minor revs of that major version. A minor version "
          "(e.g. '34') can also be specified."),
)


implementation = partial(
    Option,
    '--implementation',
    dest='implementation',
    metavar='implementation',
    default=None,
    help=("Only use wheels compatible with Python "
          "implementation <implementation>, e.g. 'pp', 'jy', 'cp', "
          " or 'ip'. If not specified, then the current "
          "interpreter implementation is used. Use 'py' to force "
          "implementation-agnostic wheels."),
)


abi = partial(
    Option,
    '--abi',
    dest='abi',
    metavar='abi',
    default=None,
    help=("Only use wheels compatible with Python "
          "abi <abi>, e.g. 'pypy_41'. If not specified, then the "
          "current interpreter abi tag is used. Generally "
          "you will need to specify --implementation, "
          "--platform, and --python-version when using "
          "this option."),
)


def prefer_binary():
    return Option(
        "--prefer-binary",
        dest="prefer_binary",
        action="store_true",
        default=False,
        help="Prefer older binary packages over newer source packages."
    )
             "used on them.")


cache_dir = partial(

@@ -534,39 +467,22 @@ no_deps = partial(
    dest='ignore_dependencies',
    action='store_true',
    default=False,
    help="Don't install package dependencies.",
)  # type: Any
    help="Don't install package dependencies.")

build_dir = partial(
    Option,
    '-b', '--build', '--build-dir', '--build-directory',
    dest='build_dir',
    metavar='dir',
    help='Directory to unpack packages into and build in. Note that '
         'an initial build still takes place in a temporary directory. '
         'The location of temporary directories can be controlled by setting '
         'the TMPDIR environment variable (TEMP on Windows) appropriately. '
         'When passed, build directories are not cleaned in case of failures.'
)  # type: Any
    help='Directory to unpack packages into and build in.'
)

ignore_requires_python = partial(
    Option,
    '--ignore-requires-python',
    dest='ignore_requires_python',
    action='store_true',
    help='Ignore the Requires-Python information.'
)  # type: Any

no_build_isolation = partial(
    Option,
    '--no-build-isolation',
    dest='build_isolation',
    action='store_false',
    default=True,
    help='Disable isolation when building a modern source distribution. '
         'Build dependencies specified by PEP 518 must be already installed '
         'if this option is used.'
)  # type: Any
    help='Ignore the Requires-Python information.')

install_options = partial(
    Option,

@@ -578,8 +494,7 @@ install_options = partial(
         "command (use like --install-option=\"--install-scripts=/usr/local/"
         "bin\"). Use multiple --install-option options to pass multiple "
         "options to setup.py install. If you are using an option with a "
         "directory path, be sure to use absolute path.",
)  # type: Any
         "directory path, be sure to use absolute path.")

global_options = partial(
    Option,

@@ -588,16 +503,14 @@ global_options = partial(
    action='append',
    metavar='options',
    help="Extra global options to be supplied to the setup.py "
         "call before the install command.",
)  # type: Any
         "call before the install command.")

no_clean = partial(
    Option,
    '--no-clean',
    action='store_true',
    default=False,
    help="Don't clean up build directories."
)  # type: Any
    help="Don't clean up build directories.")

pre = partial(
    Option,

@@ -605,8 +518,7 @@ pre = partial(
    action='store_true',
    default=False,
    help="Include pre-release and development versions. By default, "
         "pip only finds stable versions.",
)  # type: Any
         "pip only finds stable versions.")

disable_pip_version_check = partial(
    Option,

@@ -615,9 +527,7 @@ disable_pip_version_check = partial(
    action="store_true",
    default=True,
    help="Don't periodically check PyPI to determine whether a new version "
         "of pip is available for download. Implied with --no-index.",
)  # type: Any

         "of pip is available for download. Implied with --no-index.")

# Deprecated, Remove later
always_unzip = partial(

@@ -626,7 +536,7 @@ always_unzip = partial(
    dest='always_unzip',
    action='store_true',
    help=SUPPRESS_HELP,
)  # type: Any
)


def _merge_hash(option, opt_str, value, parser):

@@ -656,8 +566,7 @@ hash = partial(
    callback=_merge_hash,
    type='string',
    help="Verify that the package's archive matches this "
         'hash before installing. Example: --hash=sha256:abcdef...',
)  # type: Any
         'hash before installing. Example: --hash=sha256:abcdef...')


require_hashes = partial(

@@ -668,8 +577,7 @@ require_hashes = partial(
    default=False,
    help='Require a hash to check each requirement against, for '
         'repeatable installs. This option is implied when any package in a '
         'requirements file has a --hash option.',
)  # type: Any
         'requirements file has a --hash option.')


##########
@@ -690,6 +598,7 @@ general_group = {
        proxy,
        retries,
        timeout,
        default_vcs,
        skip_requirements_regex,
        exists_action,
        trusted_host,

@@ -698,11 +607,10 @@ general_group = {
        cache_dir,
        no_cache,
        disable_pip_version_check,
        no_color,
    ]
}

index_group = {
non_deprecated_index_group = {
    'name': 'Package Index Options',
    'options': [
        index_url,

@@ -712,3 +620,14 @@ index_group = {
        process_dependency_links,
    ]
}

index_group = {
    'name': 'Package Index Options (including deprecated options)',
    'options': non_deprecated_index_group['options'] + [
        allow_external,
        allow_all_external,
        no_allow_external,
        allow_unsafe,
        no_allow_unsafe,
    ]
}
@@ -3,25 +3,35 @@ Package containing all pip commands
"""
from __future__ import absolute_import

from pip._internal.commands.completion import CompletionCommand
from pip._internal.commands.configuration import ConfigurationCommand
from pip._internal.commands.download import DownloadCommand
from pip._internal.commands.freeze import FreezeCommand
from pip._internal.commands.hash import HashCommand
from pip._internal.commands.help import HelpCommand
from pip._internal.commands.list import ListCommand
from pip._internal.commands.check import CheckCommand
from pip._internal.commands.search import SearchCommand
from pip._internal.commands.show import ShowCommand
from pip._internal.commands.install import InstallCommand
from pip._internal.commands.uninstall import UninstallCommand
from pip._internal.commands.wheel import WheelCommand
from pip.commands.completion import CompletionCommand
from pip.commands.download import DownloadCommand
from pip.commands.freeze import FreezeCommand
from pip.commands.hash import HashCommand
from pip.commands.help import HelpCommand
from pip.commands.list import ListCommand
from pip.commands.check import CheckCommand
from pip.commands.search import SearchCommand
from pip.commands.show import ShowCommand
from pip.commands.install import InstallCommand
from pip.commands.uninstall import UninstallCommand
from pip.commands.wheel import WheelCommand

from pip._internal.utils.typing import MYPY_CHECK_RUNNING

if MYPY_CHECK_RUNNING:
    from typing import List, Type  # noqa: F401
    from pip._internal.cli.base_command import Command  # noqa: F401
commands_dict = {
    CompletionCommand.name: CompletionCommand,
    FreezeCommand.name: FreezeCommand,
    HashCommand.name: HashCommand,
    HelpCommand.name: HelpCommand,
    SearchCommand.name: SearchCommand,
    ShowCommand.name: ShowCommand,
    InstallCommand.name: InstallCommand,
    UninstallCommand.name: UninstallCommand,
    DownloadCommand.name: DownloadCommand,
    ListCommand.name: ListCommand,
    CheckCommand.name: CheckCommand,
    WheelCommand.name: WheelCommand,
}


commands_order = [
    InstallCommand,

@@ -31,15 +41,12 @@ commands_order = [
    ListCommand,
    ShowCommand,
    CheckCommand,
    ConfigurationCommand,
    SearchCommand,
    WheelCommand,
    HashCommand,
    CompletionCommand,
    HelpCommand,
]  # type: List[Type[Command]]

commands_dict = {c.name: c for c in commands_order}
]


def get_summaries(ordered=True):
39
lib/python3.4/site-packages/pip/commands/check.py
Normal file
@@ -0,0 +1,39 @@
import logging

from pip.basecommand import Command
from pip.operations.check import check_requirements
from pip.utils import get_installed_distributions


logger = logging.getLogger(__name__)


class CheckCommand(Command):
    """Verify installed packages have compatible dependencies."""
    name = 'check'
    usage = """
      %prog [options]"""
    summary = 'Verify installed packages have compatible dependencies.'

    def run(self, options, args):
        dists = get_installed_distributions(local_only=False, skip=())
        missing_reqs_dict, incompatible_reqs_dict = check_requirements(dists)

        for dist in dists:
            key = '%s==%s' % (dist.project_name, dist.version)

            for requirement in missing_reqs_dict.get(key, []):
                logger.info(
                    "%s %s requires %s, which is not installed.",
                    dist.project_name, dist.version, requirement.project_name)

            for requirement, actual in incompatible_reqs_dict.get(key, []):
                logger.info(
                    "%s %s has requirement %s, but you have %s %s.",
                    dist.project_name, dist.version, requirement,
                    actual.project_name, actual.version)

        if missing_reqs_dict or incompatible_reqs_dict:
            return 1
        else:
            logger.info("No broken requirements found.")

@@ -1,10 +1,7 @@
from __future__ import absolute_import

import sys
import textwrap

from pip._internal.cli.base_command import Command
from pip._internal.utils.misc import get_prog
from pip.basecommand import Command

BASE_COMPLETION = """
# pip %(shell)s completion start%(script)s# pip %(shell)s completion end

@@ -12,44 +9,38 @@ BASE_COMPLETION = """

COMPLETION_SCRIPTS = {
    'bash': """
        _pip_completion()
        {
_pip_completion()
{
            COMPREPLY=( $( COMP_WORDS="${COMP_WORDS[*]}" \\
                           COMP_CWORD=$COMP_CWORD \\
                           PIP_AUTO_COMPLETE=1 $1 ) )
        }
        complete -o default -F _pip_completion %(prog)s
    """,
    'zsh': """
        function _pip_completion {
}
complete -o default -F _pip_completion pip
""", 'zsh': """
function _pip_completion {
          local words cword
          read -Ac words
          read -cn cword
          reply=( $( COMP_WORDS="$words[*]" \\
                     COMP_CWORD=$(( cword-1 )) \\
                     PIP_AUTO_COMPLETE=1 $words[1] ) )
        }
        compctl -K _pip_completion %(prog)s
    """,
    'fish': """
        function __fish_complete_pip
            set -lx COMP_WORDS (commandline -o) ""
            set -lx COMP_CWORD ( \\
                math (contains -i -- (commandline -t) $COMP_WORDS)-1 \\
            )
            set -lx PIP_AUTO_COMPLETE 1
            string split \\ -- (eval $COMP_WORDS[1])
        end
        complete -fa "(__fish_complete_pip)" -c %(prog)s
    """,
}
compctl -K _pip_completion pip
""", 'fish': """
function __fish_complete_pip
    set -lx COMP_WORDS (commandline -o) ""
    set -lx COMP_CWORD (math (contains -i -- (commandline -t) $COMP_WORDS)-1)
    set -lx PIP_AUTO_COMPLETE 1
    string split \ -- (eval $COMP_WORDS[1])
end
complete -fa "(__fish_complete_pip)" -c pip
"""}


class CompletionCommand(Command):
    """A helper command to be used for command completion."""
    name = 'completion'
    summary = 'A helper command used for command completion.'
    ignore_require_venv = True

    def __init__(self, *args, **kw):
        super(CompletionCommand, self).__init__(*args, **kw)

@@ -82,11 +73,7 @@ class CompletionCommand(Command):
        shells = COMPLETION_SCRIPTS.keys()
        shell_options = ['--' + shell for shell in sorted(shells)]
        if options.shell in shells:
            script = textwrap.dedent(
                COMPLETION_SCRIPTS.get(options.shell, '') % {
                    'prog': get_prog(),
                }
            )
            script = COMPLETION_SCRIPTS.get(options.shell, '')
            print(BASE_COMPLETION % {'script': script, 'shell': options.shell})
        else:
            sys.stderr.write(
@@ -3,15 +3,15 @@ from __future__ import absolute_import
import logging
import os

from pip._internal.cli import cmdoptions
from pip._internal.cli.base_command import RequirementCommand
from pip._internal.operations.prepare import RequirementPreparer
from pip._internal.req import RequirementSet
from pip._internal.req.req_tracker import RequirementTracker
from pip._internal.resolve import Resolver
from pip._internal.utils.filesystem import check_path_owner
from pip._internal.utils.misc import ensure_dir, normalize_path
from pip._internal.utils.temp_dir import TempDirectory
from pip.exceptions import CommandError
from pip.index import FormatControl
from pip.req import RequirementSet
from pip.basecommand import RequirementCommand
from pip import cmdoptions
from pip.utils import ensure_dir, normalize_path
from pip.utils.build import BuildDirectory
from pip.utils.filesystem import check_path_owner


logger = logging.getLogger(__name__)

@@ -33,8 +33,8 @@ class DownloadCommand(RequirementCommand):
    usage = """
      %prog [options] <requirement specifier> [package-index-options] ...
      %prog [options] -r <requirements file> [package-index-options] ...
      %prog [options] <vcs project url> ...
      %prog [options] <local project path> ...
      %prog [options] [-e] <vcs project url> ...
      %prog [options] [-e] <local project path> ...
      %prog [options] <archive url/path> ..."""

    summary = 'Download packages.'

@@ -45,19 +45,17 @@ class DownloadCommand(RequirementCommand):
        cmd_opts = self.cmd_opts

        cmd_opts.add_option(cmdoptions.constraints())
        cmd_opts.add_option(cmdoptions.editable())
        cmd_opts.add_option(cmdoptions.requirements())
        cmd_opts.add_option(cmdoptions.build_dir())
        cmd_opts.add_option(cmdoptions.no_deps())
        cmd_opts.add_option(cmdoptions.global_options())
        cmd_opts.add_option(cmdoptions.no_binary())
        cmd_opts.add_option(cmdoptions.only_binary())
        cmd_opts.add_option(cmdoptions.prefer_binary())
        cmd_opts.add_option(cmdoptions.src())
        cmd_opts.add_option(cmdoptions.pre())
        cmd_opts.add_option(cmdoptions.no_clean())
        cmd_opts.add_option(cmdoptions.require_hashes())
        cmd_opts.add_option(cmdoptions.progress_bar())
        cmd_opts.add_option(cmdoptions.no_build_isolation())

        cmd_opts.add_option(
            '-d', '--dest', '--destination-dir', '--destination-directory',

@@ -67,13 +65,55 @@ class DownloadCommand(RequirementCommand):
            help=("Download packages into <dir>."),
        )

        cmd_opts.add_option(cmdoptions.platform())
        cmd_opts.add_option(cmdoptions.python_version())
        cmd_opts.add_option(cmdoptions.implementation())
        cmd_opts.add_option(cmdoptions.abi())
        cmd_opts.add_option(
            '--platform',
            dest='platform',
            metavar='platform',
            default=None,
            help=("Only download wheels compatible with <platform>. "
                  "Defaults to the platform of the running system."),
        )

        cmd_opts.add_option(
            '--python-version',
            dest='python_version',
            metavar='python_version',
            default=None,
            help=("Only download wheels compatible with Python "
                  "interpreter version <version>. If not specified, then the "
                  "current system interpreter minor version is used. A major "
                  "version (e.g. '2') can be specified to match all "
                  "minor revs of that major version. A minor version "
                  "(e.g. '34') can also be specified."),
        )

        cmd_opts.add_option(
            '--implementation',
            dest='implementation',
            metavar='implementation',
            default=None,
            help=("Only download wheels compatible with Python "
                  "implementation <implementation>, e.g. 'pp', 'jy', 'cp', "
                  " or 'ip'. If not specified, then the current "
                  "interpreter implementation is used. Use 'py' to force "
                  "implementation-agnostic wheels."),
        )

        cmd_opts.add_option(
            '--abi',
            dest='abi',
            metavar='abi',
            default=None,
            help=("Only download wheels compatible with Python "
                  "abi <abi>, e.g. 'pypy_41'. If not specified, then the "
                  "current interpreter abi tag is used. Generally "
                  "you will need to specify --implementation, "
                  "--platform, and --python-version when using "
                  "this option."),
        )

        index_opts = cmdoptions.make_option_group(
            cmdoptions.index_group,
            cmdoptions.non_deprecated_index_group,
            self.parser,
        )

@@ -82,16 +122,26 @@ class DownloadCommand(RequirementCommand):

    def run(self, options, args):
        options.ignore_installed = True
        # editable doesn't really make sense for `pip download`, but the bowels
        # of the RequirementSet code require that property.
        options.editables = []

        if options.python_version:
            python_versions = [options.python_version]
        else:
            python_versions = None

        cmdoptions.check_dist_restriction(options)
        dist_restriction_set = any([
            options.python_version,
            options.platform,
            options.abi,
            options.implementation,
        ])
        binary_only = FormatControl(set(), set([':all:']))
        if dist_restriction_set and options.format_control != binary_only:
            raise CommandError(
                "--only-binary=:all: must be set and --no-binary must not "
                "be set (or must be set to :none:) when restricting platform "
                "and interpreter constraints using --python-version, "
                "--platform, --abi, or --implementation."
            )

        options.src_dir = os.path.abspath(options.src_dir)
        options.download_dir = normalize_path(options.download_dir)

@@ -119,12 +169,18 @@ class DownloadCommand(RequirementCommand):
            )
            options.cache_dir = None

        with RequirementTracker() as req_tracker, TempDirectory(
            options.build_dir, delete=build_delete, kind="download"
        ) as directory:
        with BuildDirectory(options.build_dir,
                            delete=build_delete) as build_dir:

            requirement_set = RequirementSet(
                require_hashes=options.require_hashes,
                build_dir=build_dir,
                src_dir=options.src_dir,
                download_dir=options.download_dir,
                ignore_installed=True,
                ignore_dependencies=options.ignore_dependencies,
                session=session,
                isolated=options.isolated_mode,
                require_hashes=options.require_hashes
            )
            self.populate_requirement_set(
                requirement_set,

@@ -136,36 +192,18 @@ class DownloadCommand(RequirementCommand):
                None
            )

            preparer = RequirementPreparer(
                build_dir=directory.path,
                src_dir=options.src_dir,
                download_dir=options.download_dir,
                wheel_download_dir=None,
                progress_bar=options.progress_bar,
                build_isolation=options.build_isolation,
                req_tracker=req_tracker,
            )
            if not requirement_set.has_requirements:
                return

            resolver = Resolver(
                preparer=preparer,
                finder=finder,
                session=session,
                wheel_cache=None,
                use_user_site=False,
                upgrade_strategy="to-satisfy-only",
                force_reinstall=False,
                ignore_dependencies=options.ignore_dependencies,
                ignore_requires_python=False,
                ignore_installed=True,
                isolated=options.isolated_mode,
            )
            resolver.resolve(requirement_set)
            requirement_set.prepare_files(finder)

            downloaded = ' '.join([
                req.name for req in requirement_set.successfully_downloaded
            ])
            if downloaded:
                logger.info('Successfully downloaded %s', downloaded)
                logger.info(
                    'Successfully downloaded %s', downloaded
                )

            # Clean up
            if not options.no_clean:

@@ -2,13 +2,14 @@ from __future__ import absolute_import

import sys

from pip._internal.cache import WheelCache
from pip._internal.cli.base_command import Command
from pip._internal.models.format_control import FormatControl
from pip._internal.operations.freeze import freeze
from pip._internal.utils.compat import stdlib_pkgs
import pip
from pip.compat import stdlib_pkgs
from pip.basecommand import Command
from pip.operations.freeze import freeze
from pip.wheel import WheelCache

DEV_PKGS = {'pip', 'setuptools', 'distribute', 'wheel'}

DEV_PKGS = ('pip', 'setuptools', 'distribute', 'wheel')


class FreezeCommand(Command):

@@ -62,16 +63,11 @@ class FreezeCommand(Command):
            action='store_true',
            help='Do not skip these packages in the output:'
                 ' %s' % ', '.join(DEV_PKGS))
        self.cmd_opts.add_option(
            '--exclude-editable',
            dest='exclude_editable',
            action='store_true',
            help='Exclude editable package from output.')

        self.parser.insert_option_group(0, self.cmd_opts)

    def run(self, options, args):
        format_control = FormatControl(set(), set())
        format_control = pip.index.FormatControl(set(), set())
        wheel_cache = WheelCache(options.cache_dir, format_control)
        skip = set(stdlib_pkgs)
        if not options.freeze_all:

@@ -85,12 +81,7 @@ class FreezeCommand(Command):
            skip_regex=options.skip_requirements_regex,
            isolated=options.isolated_mode,
            wheel_cache=wheel_cache,
            skip=skip,
            exclude_editable=options.exclude_editable,
        )
            skip=skip)

        try:
            for line in freeze(**freeze_kwargs):
                sys.stdout.write(line + '\n')
        finally:
            wheel_cache.cleanup()

@@ -4,10 +4,11 @@ import hashlib
import logging
import sys

from pip._internal.cli.base_command import Command
from pip._internal.cli.status_codes import ERROR
from pip._internal.utils.hashes import FAVORITE_HASH, STRONG_HASHES
from pip._internal.utils.misc import read_chunks
from pip.basecommand import Command
from pip.status_codes import ERROR
from pip.utils import read_chunks
from pip.utils.hashes import FAVORITE_HASH, STRONG_HASHES


logger = logging.getLogger(__name__)

@@ -23,7 +24,6 @@ class HashCommand(Command):
    name = 'hash'
    usage = '%prog [options] <file> ...'
    summary = 'Compute hashes of package archives.'
    ignore_require_venv = True

    def __init__(self, *args, **kw):
        super(HashCommand, self).__init__(*args, **kw)

@@ -1,8 +1,7 @@
from __future__ import absolute_import

from pip._internal.cli.base_command import Command
from pip._internal.cli.status_codes import SUCCESS
from pip._internal.exceptions import CommandError
from pip.basecommand import Command, SUCCESS
from pip.exceptions import CommandError


class HelpCommand(Command):

@@ -11,10 +10,9 @@ class HelpCommand(Command):
    usage = """
      %prog <command>"""
    summary = 'Show help for commands.'
    ignore_require_venv = True

    def run(self, options, args):
        from pip._internal.commands import commands_dict, get_similar_commands
        from pip.commands import commands_dict, get_similar_commands

        try:
            # 'pip help' with no args is handled by pip.__init__.parseopt()
455
lib/python3.4/site-packages/pip/commands/install.py
Normal file
@@ -0,0 +1,455 @@
from __future__ import absolute_import

import logging
import operator
import os
import tempfile
import shutil
import warnings
try:
    import wheel
except ImportError:
    wheel = None

from pip.req import RequirementSet
from pip.basecommand import RequirementCommand
from pip.locations import virtualenv_no_global, distutils_scheme
from pip.exceptions import (
    InstallationError, CommandError, PreviousBuildDirError,
)
from pip import cmdoptions
from pip.utils import ensure_dir, get_installed_version
from pip.utils.build import BuildDirectory
from pip.utils.deprecation import RemovedInPip10Warning
from pip.utils.filesystem import check_path_owner
from pip.wheel import WheelCache, WheelBuilder

from pip.locations import running_under_virtualenv

logger = logging.getLogger(__name__)


class InstallCommand(RequirementCommand):
    """
    Install packages from:

    - PyPI (and other indexes) using requirement specifiers.
    - VCS project urls.
    - Local project directories.
    - Local or remote source archives.

    pip also supports installing from "requirements files", which provide
    an easy way to specify a whole environment to be installed.
    """
    name = 'install'

    usage = """
      %prog [options] <requirement specifier> [package-index-options] ...
      %prog [options] -r <requirements file> [package-index-options] ...
      %prog [options] [-e] <vcs project url> ...
      %prog [options] [-e] <local project path> ...
      %prog [options] <archive url/path> ..."""

    summary = 'Install packages.'

    def __init__(self, *args, **kw):
        super(InstallCommand, self).__init__(*args, **kw)

        default_user = True
        if running_under_virtualenv():
            default_user = False
        if os.geteuid() == 0:
            default_user = False

        cmd_opts = self.cmd_opts

        cmd_opts.add_option(cmdoptions.constraints())
        cmd_opts.add_option(cmdoptions.editable())
        cmd_opts.add_option(cmdoptions.requirements())
        cmd_opts.add_option(cmdoptions.build_dir())

        cmd_opts.add_option(
            '-t', '--target',
            dest='target_dir',
            metavar='dir',
            default=None,
            help='Install packages into <dir>. '
                 'By default this will not replace existing files/folders in '
                 '<dir>. Use --upgrade to replace existing packages in <dir> '
                 'with new versions.'
        )

        cmd_opts.add_option(
            '-d', '--download', '--download-dir', '--download-directory',
            dest='download_dir',
            metavar='dir',
            default=None,
            help=("Download packages into <dir> instead of installing them, "
                  "regardless of what's already installed."),
        )

        cmd_opts.add_option(cmdoptions.src())

        cmd_opts.add_option(
            '-U', '--upgrade',
            dest='upgrade',
            action='store_true',
            help='Upgrade all specified packages to the newest available '
                 'version. The handling of dependencies depends on the '
                 'upgrade-strategy used.'
        )

        cmd_opts.add_option(
            '--upgrade-strategy',
            dest='upgrade_strategy',
            default='eager',
            choices=['only-if-needed', 'eager'],
            help='Determines how dependency upgrading should be handled. '
                 '"eager" - dependencies are upgraded regardless of '
                 'whether the currently installed version satisfies the '
                 'requirements of the upgraded package(s). '
                 '"only-if-needed" - are upgraded only when they do not '
                 'satisfy the requirements of the upgraded package(s).'
        )

        cmd_opts.add_option(
            '--force-reinstall',
            dest='force_reinstall',
            action='store_true',
            help='When upgrading, reinstall all packages even if they are '
                 'already up-to-date.')

        cmd_opts.add_option(
            '-I', '--ignore-installed',
            dest='ignore_installed',
            action='store_true',
            default=default_user,
            help='Ignore the installed packages (reinstalling instead).')

        cmd_opts.add_option(cmdoptions.ignore_requires_python())
        cmd_opts.add_option(cmdoptions.no_deps())

        cmd_opts.add_option(cmdoptions.install_options())
        cmd_opts.add_option(cmdoptions.global_options())

        cmd_opts.add_option(
            '--user',
            dest='use_user_site',
            action='store_true',
            default=default_user,
            help="Install to the Python user install directory for your "
                 "platform. Typically ~/.local/, or %APPDATA%\Python on "
                 "Windows. (See the Python documentation for site.USER_BASE "
                 "for full details.) On Debian systems, this is the "
                 "default when running outside of a virtual environment "
                 "and not as root.")

        cmd_opts.add_option(
            '--system',
            dest='use_user_site',
            action='store_false',
            help="Install using the system scheme (overrides --user on "
                 "Debian systems)")

        cmd_opts.add_option(
            '--egg',
            dest='as_egg',
            action='store_true',
            help="Install packages as eggs, not 'flat', like pip normally "
                 "does. This option is not about installing *from* eggs. "
                 "(WARNING: Because this option overrides pip's normal install"
                 " logic, requirements files may not behave as expected.)")

        cmd_opts.add_option(
            '--root',
            dest='root_path',
            metavar='dir',
            default=None,
            help="Install everything relative to this alternate root "
                 "directory.")

        cmd_opts.add_option(
            '--prefix',
            dest='prefix_path',
            metavar='dir',
            default=None,
            help="Installation prefix where lib, bin and other top-level "
                 "folders are placed")

        cmd_opts.add_option(
            "--compile",
            action="store_true",
            dest="compile",
            default=True,
            help="Compile py files to pyc",
        )

        cmd_opts.add_option(
            "--no-compile",
            action="store_false",
            dest="compile",
            help="Do not compile py files to pyc",
        )

        cmd_opts.add_option(cmdoptions.use_wheel())
        cmd_opts.add_option(cmdoptions.no_use_wheel())
        cmd_opts.add_option(cmdoptions.no_binary())
        cmd_opts.add_option(cmdoptions.only_binary())
        cmd_opts.add_option(cmdoptions.pre())
        cmd_opts.add_option(cmdoptions.no_clean())
        cmd_opts.add_option(cmdoptions.require_hashes())

        index_opts = cmdoptions.make_option_group(
            cmdoptions.index_group,
            self.parser,
        )

        self.parser.insert_option_group(0, index_opts)
        self.parser.insert_option_group(0, cmd_opts)

    def run(self, options, args):
        cmdoptions.resolve_wheel_no_use_binary(options)
        cmdoptions.check_install_build_global(options)

        if options.as_egg:
            warnings.warn(
                "--egg has been deprecated and will be removed in the future. "
                "This flag is mutually exclusive with large parts of pip, and "
                "actually using it invalidates pip's ability to manage the "
                "installation process.",
                RemovedInPip10Warning,
            )

        if options.allow_external:
            warnings.warn(
                "--allow-external has been deprecated and will be removed in "
                "the future. Due to changes in the repository protocol, it no "
                "longer has any effect.",
                RemovedInPip10Warning,
            )

        if options.allow_all_external:
            warnings.warn(
                "--allow-all-external has been deprecated and will be removed "
                "in the future. Due to changes in the repository protocol, it "
                "no longer has any effect.",
                RemovedInPip10Warning,
            )

        if options.allow_unverified:
            warnings.warn(
                "--allow-unverified has been deprecated and will be removed "
                "in the future. Due to changes in the repository protocol, it "
                "no longer has any effect.",
                RemovedInPip10Warning,
            )

        if options.download_dir:
            warnings.warn(
                "pip install --download has been deprecated and will be "
                "removed in the future. Pip now has a download command that "
                "should be used instead.",
                RemovedInPip10Warning,
            )
            options.ignore_installed = True

        if options.build_dir:
            options.build_dir = os.path.abspath(options.build_dir)

        options.src_dir = os.path.abspath(options.src_dir)
        install_options = options.install_options or []
        if options.use_user_site:
            if options.prefix_path:
                raise CommandError(
                    "Can not combine '--user' and '--prefix' as they imply "
                    "different installation locations"
                )
            if virtualenv_no_global():
                raise InstallationError(
                    "Can not perform a '--user' install. User site-packages "
                    "are not visible in this virtualenv."
                )
            install_options.append('--user')
            install_options.append('--prefix=')

        temp_target_dir = None
        if options.target_dir:
            options.ignore_installed = True
            temp_target_dir = tempfile.mkdtemp()
            options.target_dir = os.path.abspath(options.target_dir)
            if (os.path.exists(options.target_dir) and not
                    os.path.isdir(options.target_dir)):
                raise CommandError(
                    "Target path exists but is not a directory, will not "
                    "continue."
                )
            install_options.append('--home=' + temp_target_dir)

        global_options = options.global_options or []

        with self._build_session(options) as session:

            finder = self._build_package_finder(options, session)
            build_delete = (not (options.no_clean or options.build_dir))
            wheel_cache = WheelCache(options.cache_dir, options.format_control)
            if options.cache_dir and not check_path_owner(options.cache_dir):
                logger.warning(
                    "The directory '%s' or its parent directory is not owned "
                    "by the current user and caching wheels has been "
                    "disabled. check the permissions and owner of that "
                    "directory. If executing pip with sudo, you may want "
                    "sudo's -H flag.",
                    options.cache_dir,
                )
                options.cache_dir = None

            with BuildDirectory(options.build_dir,
                                delete=build_delete) as build_dir:
                requirement_set = RequirementSet(
                    build_dir=build_dir,
                    src_dir=options.src_dir,
                    download_dir=options.download_dir,
                    upgrade=options.upgrade,
                    upgrade_strategy=options.upgrade_strategy,
                    as_egg=options.as_egg,
                    ignore_installed=options.ignore_installed,
|
||||
ignore_dependencies=options.ignore_dependencies,
|
||||
ignore_requires_python=options.ignore_requires_python,
|
||||
force_reinstall=options.force_reinstall,
|
||||
use_user_site=options.use_user_site,
|
||||
target_dir=temp_target_dir,
|
||||
session=session,
|
||||
pycompile=options.compile,
|
||||
isolated=options.isolated_mode,
|
||||
wheel_cache=wheel_cache,
|
||||
require_hashes=options.require_hashes,
|
||||
)
|
||||
|
||||
self.populate_requirement_set(
|
||||
requirement_set, args, options, finder, session, self.name,
|
||||
wheel_cache
|
||||
)
|
||||
|
||||
if not requirement_set.has_requirements:
|
||||
return
|
||||
|
||||
try:
|
||||
if (options.download_dir or not wheel or not
|
||||
options.cache_dir):
|
||||
# on -d don't do complex things like building
|
||||
# wheels, and don't try to build wheels when wheel is
|
||||
# not installed.
|
||||
requirement_set.prepare_files(finder)
|
||||
else:
|
||||
# build wheels before install.
|
||||
wb = WheelBuilder(
|
||||
requirement_set,
|
||||
finder,
|
||||
build_options=[],
|
||||
global_options=[],
|
||||
)
|
||||
# Ignore the result: a failed wheel will be
|
||||
# installed from the sdist/vcs whatever.
|
||||
wb.build(autobuilding=True)
|
||||
|
||||
if not options.download_dir:
|
||||
requirement_set.install(
|
||||
install_options,
|
||||
global_options,
|
||||
root=options.root_path,
|
||||
prefix=options.prefix_path,
|
||||
)
|
||||
|
||||
possible_lib_locations = get_lib_location_guesses(
|
||||
user=options.use_user_site,
|
||||
home=temp_target_dir,
|
||||
root=options.root_path,
|
||||
prefix=options.prefix_path,
|
||||
isolated=options.isolated_mode,
|
||||
)
|
||||
reqs = sorted(
|
||||
requirement_set.successfully_installed,
|
||||
key=operator.attrgetter('name'))
|
||||
items = []
|
||||
for req in reqs:
|
||||
item = req.name
|
||||
try:
|
||||
installed_version = get_installed_version(
|
||||
req.name, possible_lib_locations
|
||||
)
|
||||
if installed_version:
|
||||
item += '-' + installed_version
|
||||
except Exception:
|
||||
pass
|
||||
items.append(item)
|
||||
installed = ' '.join(items)
|
||||
if installed:
|
||||
logger.info('Successfully installed %s', installed)
|
||||
else:
|
||||
downloaded = ' '.join([
|
||||
req.name
|
||||
for req in requirement_set.successfully_downloaded
|
||||
])
|
||||
if downloaded:
|
||||
logger.info(
|
||||
'Successfully downloaded %s', downloaded
|
||||
)
|
||||
except PreviousBuildDirError:
|
||||
options.no_clean = True
|
||||
raise
|
||||
finally:
|
||||
# Clean up
|
||||
if not options.no_clean:
|
||||
requirement_set.cleanup_files()
|
||||
|
||||
if options.target_dir:
|
||||
ensure_dir(options.target_dir)
|
||||
|
||||
# Checking both purelib and platlib directories for installed
|
||||
# packages to be moved to target directory
|
||||
lib_dir_list = []
|
||||
|
||||
purelib_dir = distutils_scheme('', home=temp_target_dir)['purelib']
|
||||
platlib_dir = distutils_scheme('', home=temp_target_dir)['platlib']
|
||||
|
||||
if os.path.exists(purelib_dir):
|
||||
lib_dir_list.append(purelib_dir)
|
||||
if os.path.exists(platlib_dir) and platlib_dir != purelib_dir:
|
||||
lib_dir_list.append(platlib_dir)
|
||||
|
||||
for lib_dir in lib_dir_list:
|
||||
for item in os.listdir(lib_dir):
|
||||
target_item_dir = os.path.join(options.target_dir, item)
|
||||
if os.path.exists(target_item_dir):
|
||||
if not options.upgrade:
|
||||
logger.warning(
|
||||
'Target directory %s already exists. Specify '
|
||||
'--upgrade to force replacement.',
|
||||
target_item_dir
|
||||
)
|
||||
continue
|
||||
if os.path.islink(target_item_dir):
|
||||
logger.warning(
|
||||
'Target directory %s already exists and is '
|
||||
'a link. Pip will not automatically replace '
|
||||
'links, please remove if replacement is '
|
||||
'desired.',
|
||||
target_item_dir
|
||||
)
|
||||
continue
|
||||
if os.path.isdir(target_item_dir):
|
||||
shutil.rmtree(target_item_dir)
|
||||
else:
|
||||
os.remove(target_item_dir)
|
||||
|
||||
shutil.move(
|
||||
os.path.join(lib_dir, item),
|
||||
target_item_dir
|
||||
)
|
||||
shutil.rmtree(temp_target_dir)
|
||||
return requirement_set
|
||||
|
||||
|
||||
def get_lib_location_guesses(*args, **kwargs):
|
||||
scheme = distutils_scheme('', *args, **kwargs)
|
||||
return [scheme['purelib'], scheme['platlib']]
|
||||
|
|
@@ -2,18 +2,21 @@ from __future__ import absolute_import

import json
import logging
import warnings
try:
    from itertools import zip_longest
except ImportError:
    from itertools import izip_longest as zip_longest

from pip._vendor import six
from pip._vendor.six.moves import zip_longest

from pip._internal.cli import cmdoptions
from pip._internal.cli.base_command import Command
from pip._internal.exceptions import CommandError
from pip._internal.index import PackageFinder
from pip._internal.utils.misc import (
    dist_is_editable, get_installed_distributions,
)
from pip._internal.utils.packaging import get_installer
from pip.basecommand import Command
from pip.exceptions import CommandError
from pip.index import PackageFinder
from pip.utils import (
    get_installed_distributions, dist_is_editable)
from pip.utils.deprecation import RemovedInPip10Warning
from pip.cmdoptions import make_option_group, index_group

logger = logging.getLogger(__name__)


@@ -75,10 +78,9 @@ class ListCommand(Command):
            '--format',
            action='store',
            dest='list_format',
            default="columns",
            choices=('columns', 'freeze', 'json'),
            help="Select the output format among: columns (default), freeze, "
                 "or json",
            choices=('legacy', 'columns', 'freeze', 'json'),
            help="Select the output format among: legacy (default), columns, "
                 "freeze or json.",
        )

        cmd_opts.add_option(

@@ -89,22 +91,7 @@ class ListCommand(Command):
                 "installed packages.",
        )

        cmd_opts.add_option(
            '--exclude-editable',
            action='store_false',
            dest='include_editable',
            help='Exclude editable package from output.',
        )
        cmd_opts.add_option(
            '--include-editable',
            action='store_true',
            dest='include_editable',
            help='Include editable package from output.',
            default=True,
        )
        index_opts = cmdoptions.make_option_group(
            cmdoptions.index_group, self.parser
        )
        index_opts = make_option_group(index_group, self.parser)

        self.parser.insert_option_group(0, index_opts)
        self.parser.insert_option_group(0, cmd_opts)

@@ -123,6 +110,39 @@ class ListCommand(Command):
        )

    def run(self, options, args):
        if options.allow_external:
            warnings.warn(
                "--allow-external has been deprecated and will be removed in "
                "the future. Due to changes in the repository protocol, it no "
                "longer has any effect.",
                RemovedInPip10Warning,
            )

        if options.allow_all_external:
            warnings.warn(
                "--allow-all-external has been deprecated and will be removed "
                "in the future. Due to changes in the repository protocol, it "
                "no longer has any effect.",
                RemovedInPip10Warning,
            )

        if options.allow_unverified:
            warnings.warn(
                "--allow-unverified has been deprecated and will be removed "
                "in the future. Due to changes in the repository protocol, it "
                "no longer has any effect.",
                RemovedInPip10Warning,
            )

        if options.list_format is None:
            warnings.warn(
                "The default format will switch to columns in the future. "
                "You can use --format=(legacy|columns) (or define a "
                "format=(legacy|columns) in your pip.conf under the [list] "
                "section) to disable this warning.",
                RemovedInPip10Warning,
            )

        if options.outdated and options.uptodate:
            raise CommandError(
                "Options --outdated and --uptodate cannot be combined.")

@@ -131,7 +151,6 @@ class ListCommand(Command):
            local_only=options.local,
            user_only=options.user,
            editables_only=options.editable,
            include_editables=options.include_editable,
        )

        if options.outdated:

@@ -160,7 +179,7 @@ class ListCommand(Command):
        dep_keys = set()
        for dist in packages:
            dep_keys.update(requirement.key for requirement in dist.requires())
        return {pkg for pkg in packages if pkg.key not in dep_keys}
        return set(pkg for pkg in packages if pkg.key not in dep_keys)

    def iter_packages_latest_infos(self, packages, options):
        index_urls = [options.index_url] + options.extra_index_urls

@@ -201,6 +220,23 @@ class ListCommand(Command):
            dist.latest_filetype = typ
            yield dist

    def output_legacy(self, dist):
        if dist_is_editable(dist):
            return '%s (%s, %s)' % (
                dist.project_name,
                dist.version,
                dist.location,
            )
        else:
            return '%s (%s)' % (dist.project_name, dist.version)

    def output_legacy_latest(self, dist):
        return '%s - Latest: %s [%s]' % (
            self.output_legacy(dist),
            dist.latest_version,
            dist.latest_filetype,
        )

    def output_package_listing(self, packages, options):
        packages = sorted(
            packages,

@@ -211,13 +247,15 @@ class ListCommand(Command):
            self.output_package_listing_columns(data, header)
        elif options.list_format == 'freeze':
            for dist in packages:
                if options.verbose >= 1:
                    logger.info("%s==%s (%s)", dist.project_name,
                                dist.version, dist.location)
                else:
                    logger.info("%s==%s", dist.project_name, dist.version)
        elif options.list_format == 'json':
            logger.info(format_for_json(packages, options))
        else:  # legacy
            for dist in packages:
                if options.outdated:
                    logger.info(self.output_legacy_latest(dist))
                else:
                    logger.info(self.output_legacy(dist))

    def output_package_listing_columns(self, data, header):
        # insert the header first: we need to know the size of column names

@@ -265,10 +303,8 @@ def format_for_columns(pkgs, options):
    header = ["Package", "Version"]

    data = []
    if options.verbose >= 1 or any(dist_is_editable(x) for x in pkgs):
    if any(dist_is_editable(x) for x in pkgs):
        header.append("Location")
    if options.verbose >= 1:
        header.append("Installer")

    for proj in pkgs:
        # if we're working on the 'outdated' list, separate out the

@@ -279,10 +315,8 @@ def format_for_columns(pkgs, options):
        row.append(proj.latest_version)
        row.append(proj.latest_filetype)

        if options.verbose >= 1 or dist_is_editable(proj):
        if dist_is_editable(proj):
            row.append(proj.location)
        if options.verbose >= 1:
            row.append(get_installer(proj))

        data.append(row)


@@ -296,9 +330,6 @@ def format_for_json(packages, options):
            'name': dist.project_name,
            'version': six.text_type(dist.version),
        }
        if options.verbose >= 1:
            info['location'] = dist.location
            info['installer'] = get_installer(dist)
        if options.outdated:
            info['latest_version'] = six.text_type(dist.latest_version)
            info['latest_filetype'] = dist.latest_filetype
@@ -3,21 +3,19 @@ from __future__ import absolute_import
import logging
import sys
import textwrap
from collections import OrderedDict

from pip._vendor import pkg_resources
from pip.basecommand import Command, SUCCESS
from pip.compat import OrderedDict
from pip.download import PipXmlrpcTransport
from pip.models import PyPI
from pip.utils import get_terminal_size
from pip.utils.logging import indent_log
from pip.exceptions import CommandError
from pip.status_codes import NO_MATCHES_FOUND
from pip._vendor.packaging.version import parse as parse_version
# NOTE: XMLRPC Client is not annotated in typeshed as on 2017-07-17, which is
# why we ignore the type on this import
from pip._vendor.six.moves import xmlrpc_client  # type: ignore
from pip._vendor import pkg_resources
from pip._vendor.six.moves import xmlrpc_client

from pip._internal.cli.base_command import Command
from pip._internal.cli.status_codes import NO_MATCHES_FOUND, SUCCESS
from pip._internal.download import PipXmlrpcTransport
from pip._internal.exceptions import CommandError
from pip._internal.models.index import PyPI
from pip._internal.utils.compat import get_terminal_size
from pip._internal.utils.logging import indent_log

logger = logging.getLogger(__name__)


@@ -28,7 +26,6 @@ class SearchCommand(Command):
    usage = """
      %prog [options] <query>"""
    summary = 'Search PyPI for packages.'
    ignore_require_venv = True

    def __init__(self, *args, **kw):
        super(SearchCommand, self).__init__(*args, **kw)

@@ -99,7 +96,7 @@ def print_results(hits, name_column_width=None, terminal_width=None):
        return
    if name_column_width is None:
        name_column_width = max([
            len(hit['name']) + len(highest_version(hit.get('versions', ['-'])))
            len(hit['name']) + len(hit.get('versions', ['-'])[-1])
            for hit in hits
        ]) + 4

@@ -107,7 +104,7 @@ def print_results(hits, name_column_width=None, terminal_width=None):
    for hit in hits:
        name = hit['name']
        summary = hit['summary'] or ''
        latest = highest_version(hit.get('versions', ['-']))
        version = hit.get('versions', ['-'])[-1]
        if terminal_width is not None:
            target_width = terminal_width - name_column_width - 5
            if target_width > 10:

@@ -116,12 +113,13 @@ def print_results(hits, name_column_width=None, terminal_width=None):
            summary = ('\n' + ' ' * (name_column_width + 3)).join(summary)

        line = '%-*s - %s' % (name_column_width,
                              '%s (%s)' % (name, latest), summary)
                              '%s (%s)' % (name, version), summary)
        try:
            logger.info(line)
            if name in installed_packages:
                dist = pkg_resources.get_distribution(name)
                with indent_log():
                    latest = highest_version(hit['versions'])
                    if dist.version == latest:
                        logger.info('INSTALLED: %s (latest)', dist.version)
                    else:
@@ -1,29 +1,24 @@
from __future__ import absolute_import

from email.parser import FeedParser
import logging
import os
from email.parser import FeedParser  # type: ignore

from pip.basecommand import Command
from pip.status_codes import SUCCESS, ERROR
from pip._vendor import pkg_resources
from pip._vendor.packaging.utils import canonicalize_name

from pip._internal.cli.base_command import Command
from pip._internal.cli.status_codes import ERROR, SUCCESS

logger = logging.getLogger(__name__)


class ShowCommand(Command):
    """
    Show information about one or more installed packages.

    The output is in RFC-compliant mail header format.
    """
    """Show information about one or more installed packages."""
    name = 'show'
    usage = """
      %prog [options] <package> ..."""
    summary = 'Show information about installed packages.'
    ignore_require_venv = True

    def __init__(self, *args, **kw):
        super(ShowCommand, self).__init__(*args, **kw)

@@ -131,14 +126,7 @@ def print_results(distributions, list_files=False, verbose=False):
        results_printed = True
        if i > 0:
            logger.info("---")

        name = dist.get('name', '')
        required_by = [
            pkg.project_name for pkg in pkg_resources.working_set
            if name in [required.name for required in pkg.requires()]
        ]

        logger.info("Name: %s", name)
        logger.info("Name: %s", dist.get('name', ''))
        logger.info("Version: %s", dist.get('version', ''))
        logger.info("Summary: %s", dist.get('summary', ''))
        logger.info("Home-page: %s", dist.get('home-page', ''))

@@ -147,8 +135,6 @@ def print_results(distributions, list_files=False, verbose=False):
        logger.info("License: %s", dist.get('license', ''))
        logger.info("Location: %s", dist.get('location', ''))
        logger.info("Requires: %s", ', '.join(dist.get('requires', [])))
        logger.info("Required-by: %s", ', '.join(required_by))

        if verbose:
            logger.info("Metadata-Version: %s",
                        dist.get('metadata-version', ''))
@@ -1,12 +1,10 @@
from __future__ import absolute_import

from pip._vendor.packaging.utils import canonicalize_name

from pip._internal.cli.base_command import Command
from pip._internal.exceptions import InstallationError
from pip._internal.req import parse_requirements
from pip._internal.req.constructors import install_req_from_line
from pip._internal.utils.misc import protect_pip_from_modification_on_windows
import pip
from pip.wheel import WheelCache
from pip.req import InstallRequirement, RequirementSet, parse_requirements
from pip.basecommand import Command
from pip.exceptions import InstallationError


class UninstallCommand(Command):

@@ -46,33 +44,33 @@ class UninstallCommand(Command):

    def run(self, options, args):
        with self._build_session(options) as session:
            reqs_to_uninstall = {}
            for name in args:
                req = install_req_from_line(
                    name, isolated=options.isolated_mode,
            format_control = pip.index.FormatControl(set(), set())
            wheel_cache = WheelCache(options.cache_dir, format_control)
            requirement_set = RequirementSet(
                build_dir=None,
                src_dir=None,
                download_dir=None,
                isolated=options.isolated_mode,
                session=session,
                wheel_cache=wheel_cache,
            )
            for name in args:
                requirement_set.add_requirement(
                    InstallRequirement.from_line(
                        name, isolated=options.isolated_mode,
                        wheel_cache=wheel_cache
                    )
                )
                if req.name:
                    reqs_to_uninstall[canonicalize_name(req.name)] = req
            for filename in options.requirements:
                for req in parse_requirements(
                        filename,
                        options=options,
                        session=session):
                    if req.name:
                        reqs_to_uninstall[canonicalize_name(req.name)] = req
            if not reqs_to_uninstall:
                        session=session,
                        wheel_cache=wheel_cache):
                    requirement_set.add_requirement(req)
            if not requirement_set.has_requirements:
                raise InstallationError(
                    'You must give at least one requirement to %(name)s (see '
                    '"pip help %(name)s")' % dict(name=self.name)
                )

            protect_pip_from_modification_on_windows(
                modifying_pip="pip" in reqs_to_uninstall
            )

            for req in reqs_to_uninstall.values():
                uninstall_pathset = req.uninstall(
                    auto_confirm=options.yes, verbose=self.verbosity > 0,
                )
                if uninstall_pathset:
                    uninstall_pathset.commit()
            requirement_set.uninstall(auto_confirm=options.yes)
@@ -3,17 +3,17 @@ from __future__ import absolute_import

import logging
import os
import warnings

from pip.basecommand import RequirementCommand
from pip.exceptions import CommandError, PreviousBuildDirError
from pip.req import RequirementSet
from pip.utils import import_or_raise
from pip.utils.build import BuildDirectory
from pip.utils.deprecation import RemovedInPip10Warning
from pip.wheel import WheelCache, WheelBuilder
from pip import cmdoptions

from pip._internal.cache import WheelCache
from pip._internal.cli import cmdoptions
from pip._internal.cli.base_command import RequirementCommand
from pip._internal.exceptions import CommandError, PreviousBuildDirError
from pip._internal.operations.prepare import RequirementPreparer
from pip._internal.req import RequirementSet
from pip._internal.req.req_tracker import RequirementTracker
from pip._internal.resolve import Resolver
from pip._internal.utils.temp_dir import TempDirectory
from pip._internal.wheel import WheelBuilder

logger = logging.getLogger(__name__)


@@ -56,17 +56,16 @@ class WheelCommand(RequirementCommand):
            help=("Build wheels into <dir>, where the default is the "
                  "current working directory."),
        )
        cmd_opts.add_option(cmdoptions.use_wheel())
        cmd_opts.add_option(cmdoptions.no_use_wheel())
        cmd_opts.add_option(cmdoptions.no_binary())
        cmd_opts.add_option(cmdoptions.only_binary())
        cmd_opts.add_option(cmdoptions.prefer_binary())
        cmd_opts.add_option(
            '--build-option',
            dest='build_options',
            metavar='options',
            action='append',
            help="Extra arguments to be supplied to 'setup.py bdist_wheel'.",
        )
        cmd_opts.add_option(cmdoptions.no_build_isolation())
            help="Extra arguments to be supplied to 'setup.py bdist_wheel'.")
        cmd_opts.add_option(cmdoptions.constraints())
        cmd_opts.add_option(cmdoptions.editable())
        cmd_opts.add_option(cmdoptions.requirements())

@@ -74,7 +73,6 @@ class WheelCommand(RequirementCommand):
        cmd_opts.add_option(cmdoptions.ignore_requires_python())
        cmd_opts.add_option(cmdoptions.no_deps())
        cmd_opts.add_option(cmdoptions.build_dir())
        cmd_opts.add_option(cmdoptions.progress_bar())

        cmd_opts.add_option(
            '--global-option',

@@ -103,9 +101,55 @@ class WheelCommand(RequirementCommand):
        self.parser.insert_option_group(0, index_opts)
        self.parser.insert_option_group(0, cmd_opts)

    def check_required_packages(self):
        import_or_raise(
            'wheel.bdist_wheel',
            CommandError,
            "'pip wheel' requires the 'wheel' package. To fix this, run: "
            "pip install wheel"
        )
        pkg_resources = import_or_raise(
            'pkg_resources',
            CommandError,
            "'pip wheel' requires setuptools >= 0.8 for dist-info support."
            " To fix this, run: pip install --upgrade setuptools"
        )
        if not hasattr(pkg_resources, 'DistInfoDistribution'):
            raise CommandError(
                "'pip wheel' requires setuptools >= 0.8 for dist-info "
                "support. To fix this, run: pip install --upgrade "
                "setuptools"
            )

    def run(self, options, args):
        self.check_required_packages()
        cmdoptions.resolve_wheel_no_use_binary(options)
        cmdoptions.check_install_build_global(options)

        if options.allow_external:
            warnings.warn(
                "--allow-external has been deprecated and will be removed in "
                "the future. Due to changes in the repository protocol, it no "
                "longer has any effect.",
                RemovedInPip10Warning,
            )

        if options.allow_all_external:
            warnings.warn(
                "--allow-all-external has been deprecated and will be removed "
                "in the future. Due to changes in the repository protocol, it "
                "no longer has any effect.",
                RemovedInPip10Warning,
            )

        if options.allow_unverified:
            warnings.warn(
                "--allow-unverified has been deprecated and will be removed "
                "in the future. Due to changes in the repository protocol, it "
                "no longer has any effect.",
                RemovedInPip10Warning,
            )

        index_urls = [options.index_url] + options.extra_index_urls
        if options.no_index:
            logger.debug('Ignoring indexes: %s', ','.join(index_urls))

@@ -120,57 +164,39 @@ class WheelCommand(RequirementCommand):
            finder = self._build_package_finder(options, session)
            build_delete = (not (options.no_clean or options.build_dir))
            wheel_cache = WheelCache(options.cache_dir, options.format_control)

            with RequirementTracker() as req_tracker, TempDirectory(
                options.build_dir, delete=build_delete, kind="wheel"
            ) as directory:

            with BuildDirectory(options.build_dir,
                                delete=build_delete) as build_dir:
                requirement_set = RequirementSet(
                    require_hashes=options.require_hashes,
                )

                try:
                    self.populate_requirement_set(
                        requirement_set, args, options, finder, session,
                        self.name, wheel_cache
                    )

                    preparer = RequirementPreparer(
                        build_dir=directory.path,
                        build_dir=build_dir,
                        src_dir=options.src_dir,
                        download_dir=None,
                        wheel_download_dir=options.wheel_dir,
                        progress_bar=options.progress_bar,
                        build_isolation=options.build_isolation,
                        req_tracker=req_tracker,
                    )

                    resolver = Resolver(
                        preparer=preparer,
                        finder=finder,
                        ignore_dependencies=options.ignore_dependencies,
                        ignore_installed=True,
                        ignore_requires_python=options.ignore_requires_python,
                        isolated=options.isolated_mode,
                        session=session,
                        wheel_cache=wheel_cache,
                        use_user_site=False,
                        upgrade_strategy="to-satisfy-only",
                        force_reinstall=False,
                        ignore_dependencies=options.ignore_dependencies,
                        ignore_requires_python=options.ignore_requires_python,
                        ignore_installed=True,
                        isolated=options.isolated_mode,
                        wheel_download_dir=options.wheel_dir,
                        require_hashes=options.require_hashes
                    )
                    resolver.resolve(requirement_set)

                self.populate_requirement_set(
                    requirement_set, args, options, finder, session, self.name,
                    wheel_cache
                )

                if not requirement_set.has_requirements:
                    return

                try:
                    # build wheels
                    wb = WheelBuilder(
                        finder, preparer, wheel_cache,
                        requirement_set,
                        finder,
                        build_options=options.build_options or [],
                        global_options=options.global_options or [],
                        no_clean=options.no_clean,
                    )
                    wheels_built_successfully = wb.build(
                        requirement_set.requirements.values(), session=session,
                    )
                    if not wheels_built_successfully:
                    if not wb.build():
                        raise CommandError(
                            "Failed to build one or more wheels"
                        )

@@ -180,4 +206,3 @@ class WheelCommand(RequirementCommand):
        finally:
            if not options.no_clean:
                requirement_set.cleanup_files()
                wheel_cache.cleanup()
164
lib/python3.4/site-packages/pip/compat/__init__.py
Normal file
@@ -0,0 +1,164 @@
"""Stuff that differs in different Python versions and platform
distributions."""
from __future__ import absolute_import, division

import os
import sys

from pip._vendor.six import text_type

try:
    from logging.config import dictConfig as logging_dictConfig
except ImportError:
    from pip.compat.dictconfig import dictConfig as logging_dictConfig

try:
    from collections import OrderedDict
except ImportError:
    from pip._vendor.ordereddict import OrderedDict

try:
    import ipaddress
except ImportError:
    try:
        from pip._vendor import ipaddress
    except ImportError:
        import ipaddr as ipaddress
        ipaddress.ip_address = ipaddress.IPAddress
        ipaddress.ip_network = ipaddress.IPNetwork


try:
    import sysconfig

    def get_stdlib():
        paths = [
            sysconfig.get_path("stdlib"),
            sysconfig.get_path("platstdlib"),
        ]
        return set(filter(bool, paths))
except ImportError:
    from distutils import sysconfig

    def get_stdlib():
        paths = [
            sysconfig.get_python_lib(standard_lib=True),
            sysconfig.get_python_lib(standard_lib=True, plat_specific=True),
        ]
        return set(filter(bool, paths))


__all__ = [
    "logging_dictConfig", "ipaddress", "uses_pycache", "console_to_str",
    "native_str", "get_path_uid", "stdlib_pkgs", "WINDOWS", "samefile",
    "OrderedDict",
]


if sys.version_info >= (3, 4):
    uses_pycache = True
    from importlib.util import cache_from_source
else:
    import imp
    uses_pycache = hasattr(imp, 'cache_from_source')
    if uses_pycache:
        cache_from_source = imp.cache_from_source
    else:
        cache_from_source = None


if sys.version_info >= (3,):
    def console_to_str(s):
        try:
            return s.decode(sys.__stdout__.encoding)
        except UnicodeDecodeError:
            return s.decode('utf_8')

    def native_str(s, replace=False):
        if isinstance(s, bytes):
            return s.decode('utf-8', 'replace' if replace else 'strict')
        return s

else:
    def console_to_str(s):
        return s

    def native_str(s, replace=False):
        # Replace is ignored -- unicode to UTF-8 can't fail
        if isinstance(s, text_type):
            return s.encode('utf-8')
        return s


def total_seconds(td):
    if hasattr(td, "total_seconds"):
        return td.total_seconds()
    else:
        val = td.microseconds + (td.seconds + td.days * 24 * 3600) * 10 ** 6
        return val / 10 ** 6


def get_path_uid(path):
    """
    Return path's uid.

    Does not follow symlinks:
        https://github.com/pypa/pip/pull/935#discussion_r5307003

    Placed this function in compat due to differences on AIX and
    Jython, that should eventually go away.

    :raises OSError: When path is a symlink or can't be read.
    """
    if hasattr(os, 'O_NOFOLLOW'):
        fd = os.open(path, os.O_RDONLY | os.O_NOFOLLOW)
        file_uid = os.fstat(fd).st_uid
        os.close(fd)
    else:  # AIX and Jython
        # WARNING: time of check vulnerability, but best we can do w/o NOFOLLOW
        if not os.path.islink(path):
            # older versions of Jython don't have `os.fstat`
            file_uid = os.stat(path).st_uid
        else:
            # raise OSError for parity with os.O_NOFOLLOW above
            raise OSError(
                "%s is a symlink; Will not return uid for symlinks" % path
            )
    return file_uid


def expanduser(path):
    """
    Expand ~ and ~user constructions.

    Includes a workaround for http://bugs.python.org/issue14768
    """
    expanded = os.path.expanduser(path)
    if path.startswith('~/') and expanded.startswith('//'):
        expanded = expanded[1:]
    return expanded


# packages in the stdlib that may have installation metadata, but should not be
# considered 'installed'. this theoretically could be determined based on
# dist.location (py27:`sysconfig.get_paths()['stdlib']`,
# py26:sysconfig.get_config_vars('LIBDEST')), but fear platform variation may
# make this ineffective, so hard-coding
stdlib_pkgs = ('python', 'wsgiref')
if sys.version_info >= (2, 7):
    stdlib_pkgs += ('argparse',)


# windows detection, covers cpython and ironpython
WINDOWS = (sys.platform.startswith("win") or
           (sys.platform == 'cli' and os.name == 'nt'))


def samefile(file1, file2):
    """Provide an alternative for os.path.samefile on Windows/Python2"""
    if hasattr(os.path, 'samefile'):
        return os.path.samefile(file1, file2)
    else:
        path1 = os.path.normcase(os.path.abspath(file1))
        path2 = os.path.normcase(os.path.abspath(file2))
        return path1 == path2
565
lib/python3.4/site-packages/pip/compat/dictconfig.py
Normal file
@@ -0,0 +1,565 @@
# This is a copy of the Python logging.config.dictconfig module,
# reproduced with permission. It is provided here for backwards
# compatibility for Python versions prior to 2.7.
#
# Copyright 2009-2010 by Vinay Sajip. All Rights Reserved.
#
# Permission to use, copy, modify, and distribute this software and its
# documentation for any purpose and without fee is hereby granted,
# provided that the above copyright notice appear in all copies and that
# both that copyright notice and this permission notice appear in
# supporting documentation, and that the name of Vinay Sajip
# not be used in advertising or publicity pertaining to distribution
# of the software without specific, written prior permission.
# VINAY SAJIP DISCLAIMS ALL WARRANTIES WITH REGARD TO THIS SOFTWARE, INCLUDING
# ALL IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS. IN NO EVENT SHALL
# VINAY SAJIP BE LIABLE FOR ANY SPECIAL, INDIRECT OR CONSEQUENTIAL DAMAGES OR
# ANY DAMAGES WHATSOEVER RESULTING FROM LOSS OF USE, DATA OR PROFITS, WHETHER
# IN AN ACTION OF CONTRACT, NEGLIGENCE OR OTHER TORTIOUS ACTION, ARISING OUT
# OF OR IN CONNECTION WITH THE USE OR PERFORMANCE OF THIS SOFTWARE.
from __future__ import absolute_import

import logging.handlers
import re
import sys
import types

from pip._vendor import six

# flake8: noqa

IDENTIFIER = re.compile('^[a-z_][a-z0-9_]*$', re.I)


def valid_ident(s):
    m = IDENTIFIER.match(s)
    if not m:
        raise ValueError('Not a valid Python identifier: %r' % s)
    return True

#
# This function is defined in logging only in recent versions of Python
#
try:
    from logging import _checkLevel
except ImportError:
    def _checkLevel(level):
        if isinstance(level, int):
            rv = level
        elif str(level) == level:
            if level not in logging._levelNames:
                raise ValueError('Unknown level: %r' % level)
            rv = logging._levelNames[level]
        else:
            raise TypeError('Level not an integer or a '
                            'valid string: %r' % level)
        return rv

# The ConvertingXXX classes are wrappers around standard Python containers,
# and they serve to convert any suitable values in the container. The
# conversion converts base dicts, lists and tuples to their wrapped
# equivalents, whereas strings which match a conversion format are converted
# appropriately.
#
# Each wrapper should have a configurator attribute holding the actual
# configurator to use for conversion.


class ConvertingDict(dict):
    """A converting dictionary wrapper."""

    def __getitem__(self, key):
        value = dict.__getitem__(self, key)
        result = self.configurator.convert(value)
        # If the converted value is different, save for next time
        if value is not result:
            self[key] = result
            if type(result) in (ConvertingDict, ConvertingList,
                                ConvertingTuple):
                result.parent = self
                result.key = key
        return result

    def get(self, key, default=None):
        value = dict.get(self, key, default)
        result = self.configurator.convert(value)
        # If the converted value is different, save for next time
        if value is not result:
            self[key] = result
            if type(result) in (ConvertingDict, ConvertingList,
                                ConvertingTuple):
                result.parent = self
                result.key = key
        return result

    def pop(self, key, default=None):
        value = dict.pop(self, key, default)
        result = self.configurator.convert(value)
        if value is not result:
            if type(result) in (ConvertingDict, ConvertingList,
                                ConvertingTuple):
                result.parent = self
                result.key = key
        return result


class ConvertingList(list):
    """A converting list wrapper."""
    def __getitem__(self, key):
        value = list.__getitem__(self, key)
        result = self.configurator.convert(value)
        # If the converted value is different, save for next time
        if value is not result:
            self[key] = result
            if type(result) in (ConvertingDict, ConvertingList,
                                ConvertingTuple):
                result.parent = self
                result.key = key
        return result

    def pop(self, idx=-1):
        value = list.pop(self, idx)
        result = self.configurator.convert(value)
        if value is not result:
            if type(result) in (ConvertingDict, ConvertingList,
                                ConvertingTuple):
                result.parent = self
        return result


class ConvertingTuple(tuple):
    """A converting tuple wrapper."""
    def __getitem__(self, key):
        value = tuple.__getitem__(self, key)
        result = self.configurator.convert(value)
        if value is not result:
            if type(result) in (ConvertingDict, ConvertingList,
                                ConvertingTuple):
                result.parent = self
                result.key = key
        return result


class BaseConfigurator(object):
    """
    The configurator base class which defines some useful defaults.
    """

    CONVERT_PATTERN = re.compile(r'^(?P<prefix>[a-z]+)://(?P<suffix>.*)$')

    WORD_PATTERN = re.compile(r'^\s*(\w+)\s*')
    DOT_PATTERN = re.compile(r'^\.\s*(\w+)\s*')
    INDEX_PATTERN = re.compile(r'^\[\s*(\w+)\s*\]\s*')
    DIGIT_PATTERN = re.compile(r'^\d+$')

    value_converters = {
        'ext' : 'ext_convert',
        'cfg' : 'cfg_convert',
    }

    # We might want to use a different one, e.g. importlib
    importer = __import__

    def __init__(self, config):
        self.config = ConvertingDict(config)
        self.config.configurator = self

    def resolve(self, s):
        """
        Resolve strings to objects using standard import and attribute
        syntax.
        """
        name = s.split('.')
        used = name.pop(0)
        try:
            found = self.importer(used)
            for frag in name:
                used += '.' + frag
                try:
                    found = getattr(found, frag)
                except AttributeError:
                    self.importer(used)
                    found = getattr(found, frag)
            return found
        except ImportError:
            e, tb = sys.exc_info()[1:]
            v = ValueError('Cannot resolve %r: %s' % (s, e))
            v.__cause__, v.__traceback__ = e, tb
            raise v

    def ext_convert(self, value):
        """Default converter for the ext:// protocol."""
        return self.resolve(value)

    def cfg_convert(self, value):
        """Default converter for the cfg:// protocol."""
        rest = value
        m = self.WORD_PATTERN.match(rest)
        if m is None:
            raise ValueError("Unable to convert %r" % value)
        else:
            rest = rest[m.end():]
            d = self.config[m.groups()[0]]
            # print d, rest
            while rest:
                m = self.DOT_PATTERN.match(rest)
                if m:
                    d = d[m.groups()[0]]
                else:
                    m = self.INDEX_PATTERN.match(rest)
                    if m:
                        idx = m.groups()[0]
                        if not self.DIGIT_PATTERN.match(idx):
                            d = d[idx]
                        else:
                            try:
                                n = int(idx)  # try as number first (most likely)
                                d = d[n]
                            except TypeError:
                                d = d[idx]
                if m:
                    rest = rest[m.end():]
                else:
                    raise ValueError('Unable to convert '
                                     '%r at %r' % (value, rest))
        # rest should be empty
        return d

    def convert(self, value):
        """
        Convert values to an appropriate type. dicts, lists and tuples are
        replaced by their converting alternatives. Strings are checked to
        see if they have a conversion format and are converted if they do.
        """
        if not isinstance(value, ConvertingDict) and isinstance(value, dict):
            value = ConvertingDict(value)
            value.configurator = self
        elif not isinstance(value, ConvertingList) and isinstance(value, list):
            value = ConvertingList(value)
            value.configurator = self
        elif not isinstance(value, ConvertingTuple) and\
                isinstance(value, tuple):
            value = ConvertingTuple(value)
            value.configurator = self
        elif isinstance(value, six.string_types):  # str for py3k
            m = self.CONVERT_PATTERN.match(value)
            if m:
                d = m.groupdict()
                prefix = d['prefix']
                converter = self.value_converters.get(prefix, None)
                if converter:
                    suffix = d['suffix']
                    converter = getattr(self, converter)
                    value = converter(suffix)
        return value

    def configure_custom(self, config):
        """Configure an object with a user-supplied factory."""
        c = config.pop('()')
        if not hasattr(c, '__call__') and hasattr(types, 'ClassType') and type(c) != types.ClassType:
            c = self.resolve(c)
        props = config.pop('.', None)
        # Check for valid identifiers
        kwargs = dict((k, config[k]) for k in config if valid_ident(k))
        result = c(**kwargs)
        if props:
            for name, value in props.items():
                setattr(result, name, value)
        return result

    def as_tuple(self, value):
        """Utility function which converts lists to tuples."""
        if isinstance(value, list):
            value = tuple(value)
        return value


class DictConfigurator(BaseConfigurator):
    """
    Configure logging using a dictionary-like object to describe the
    configuration.
    """

    def configure(self):
        """Do the configuration."""

        config = self.config
        if 'version' not in config:
            raise ValueError("dictionary doesn't specify a version")
        if config['version'] != 1:
            raise ValueError("Unsupported version: %s" % config['version'])
        incremental = config.pop('incremental', False)
        EMPTY_DICT = {}
        logging._acquireLock()
        try:
            if incremental:
                handlers = config.get('handlers', EMPTY_DICT)
                # incremental handler config only if handler name
                # ties in to logging._handlers (Python 2.7)
                if sys.version_info[:2] == (2, 7):
                    for name in handlers:
                        if name not in logging._handlers:
                            raise ValueError('No handler found with '
                                             'name %r' % name)
                        else:
                            try:
                                handler = logging._handlers[name]
                                handler_config = handlers[name]
                                level = handler_config.get('level', None)
                                if level:
                                    handler.setLevel(_checkLevel(level))
                            except StandardError as e:
                                raise ValueError('Unable to configure handler '
                                                 '%r: %s' % (name, e))
                loggers = config.get('loggers', EMPTY_DICT)
                for name in loggers:
                    try:
                        self.configure_logger(name, loggers[name], True)
                    except StandardError as e:
                        raise ValueError('Unable to configure logger '
                                         '%r: %s' % (name, e))
                root = config.get('root', None)
                if root:
                    try:
                        self.configure_root(root, True)
                    except StandardError as e:
                        raise ValueError('Unable to configure root '
                                         'logger: %s' % e)
            else:
                disable_existing = config.pop('disable_existing_loggers', True)

                logging._handlers.clear()
                del logging._handlerList[:]

                # Do formatters first - they don't refer to anything else
                formatters = config.get('formatters', EMPTY_DICT)
                for name in formatters:
                    try:
                        formatters[name] = self.configure_formatter(
                            formatters[name])
                    except StandardError as e:
                        raise ValueError('Unable to configure '
                                         'formatter %r: %s' % (name, e))
                # Next, do filters - they don't refer to anything else, either
                filters = config.get('filters', EMPTY_DICT)
                for name in filters:
                    try:
                        filters[name] = self.configure_filter(filters[name])
                    except StandardError as e:
                        raise ValueError('Unable to configure '
                                         'filter %r: %s' % (name, e))

                # Next, do handlers - they refer to formatters and filters
                # As handlers can refer to other handlers, sort the keys
                # to allow a deterministic order of configuration
                handlers = config.get('handlers', EMPTY_DICT)
                for name in sorted(handlers):
                    try:
                        handler = self.configure_handler(handlers[name])
                        handler.name = name
                        handlers[name] = handler
                    except StandardError as e:
                        raise ValueError('Unable to configure handler '
                                         '%r: %s' % (name, e))
                # Next, do loggers - they refer to handlers and filters

                # we don't want to lose the existing loggers,
                # since other threads may have pointers to them.
                # existing is set to contain all existing loggers,
                # and as we go through the new configuration we
                # remove any which are configured. At the end,
                # what's left in existing is the set of loggers
                # which were in the previous configuration but
                # which are not in the new configuration.
                root = logging.root
                existing = list(root.manager.loggerDict)
                # The list needs to be sorted so that we can
                # avoid disabling child loggers of explicitly
                # named loggers. With a sorted list it is easier
                # to find the child loggers.
                existing.sort()
                # We'll keep the list of existing loggers
                # which are children of named loggers here...
                child_loggers = []
                # now set up the new ones...
                loggers = config.get('loggers', EMPTY_DICT)
                for name in loggers:
                    if name in existing:
                        i = existing.index(name)
                        prefixed = name + "."
                        pflen = len(prefixed)
                        num_existing = len(existing)
                        i = i + 1  # look at the entry after name
                        while (i < num_existing) and\
                                (existing[i][:pflen] == prefixed):
                            child_loggers.append(existing[i])
                            i = i + 1
                        existing.remove(name)
                    try:
                        self.configure_logger(name, loggers[name])
                    except StandardError as e:
                        raise ValueError('Unable to configure logger '
                                         '%r: %s' % (name, e))

                # Disable any old loggers. There's no point deleting
                # them as other threads may continue to hold references
                # and by disabling them, you stop them doing any logging.
                # However, don't disable children of named loggers, as that's
                # probably not what was intended by the user.
                for log in existing:
                    logger = root.manager.loggerDict[log]
                    if log in child_loggers:
                        logger.level = logging.NOTSET
                        logger.handlers = []
                        logger.propagate = True
                    elif disable_existing:
                        logger.disabled = True

                # And finally, do the root logger
                root = config.get('root', None)
                if root:
                    try:
                        self.configure_root(root)
                    except StandardError as e:
                        raise ValueError('Unable to configure root '
                                         'logger: %s' % e)
        finally:
            logging._releaseLock()

    def configure_formatter(self, config):
        """Configure a formatter from a dictionary."""
        if '()' in config:
            factory = config['()']  # for use in exception handler
            try:
                result = self.configure_custom(config)
            except TypeError as te:
                if "'format'" not in str(te):
                    raise
                # Name of parameter changed from fmt to format.
                # Retry with old name.
                # This is so that code can be used with older Python versions
                #(e.g. by Django)
                config['fmt'] = config.pop('format')
                config['()'] = factory
                result = self.configure_custom(config)
        else:
            fmt = config.get('format', None)
            dfmt = config.get('datefmt', None)
            result = logging.Formatter(fmt, dfmt)
        return result

    def configure_filter(self, config):
        """Configure a filter from a dictionary."""
        if '()' in config:
            result = self.configure_custom(config)
        else:
            name = config.get('name', '')
            result = logging.Filter(name)
        return result

    def add_filters(self, filterer, filters):
        """Add filters to a filterer from a list of names."""
        for f in filters:
            try:
                filterer.addFilter(self.config['filters'][f])
            except StandardError as e:
                raise ValueError('Unable to add filter %r: %s' % (f, e))

    def configure_handler(self, config):
        """Configure a handler from a dictionary."""
        formatter = config.pop('formatter', None)
        if formatter:
            try:
                formatter = self.config['formatters'][formatter]
            except StandardError as e:
                raise ValueError('Unable to set formatter '
                                 '%r: %s' % (formatter, e))
        level = config.pop('level', None)
        filters = config.pop('filters', None)
        if '()' in config:
            c = config.pop('()')
            if not hasattr(c, '__call__') and hasattr(types, 'ClassType') and type(c) != types.ClassType:
                c = self.resolve(c)
            factory = c
        else:
            klass = self.resolve(config.pop('class'))
            # Special case for handler which refers to another handler
            if issubclass(klass, logging.handlers.MemoryHandler) and\
                    'target' in config:
                try:
                    config['target'] = self.config['handlers'][config['target']]
                except StandardError as e:
                    raise ValueError('Unable to set target handler '
                                     '%r: %s' % (config['target'], e))
            elif issubclass(klass, logging.handlers.SMTPHandler) and\
                    'mailhost' in config:
                config['mailhost'] = self.as_tuple(config['mailhost'])
            elif issubclass(klass, logging.handlers.SysLogHandler) and\
                    'address' in config:
                config['address'] = self.as_tuple(config['address'])
            factory = klass
        kwargs = dict((k, config[k]) for k in config if valid_ident(k))
        try:
            result = factory(**kwargs)
        except TypeError as te:
            if "'stream'" not in str(te):
                raise
            # The argument name changed from strm to stream
            # Retry with old name.
            # This is so that code can be used with older Python versions
            #(e.g. by Django)
            kwargs['strm'] = kwargs.pop('stream')
            result = factory(**kwargs)
        if formatter:
            result.setFormatter(formatter)
        if level is not None:
            result.setLevel(_checkLevel(level))
        if filters:
            self.add_filters(result, filters)
        return result

    def add_handlers(self, logger, handlers):
        """Add handlers to a logger from a list of names."""
        for h in handlers:
            try:
                logger.addHandler(self.config['handlers'][h])
            except StandardError as e:
                raise ValueError('Unable to add handler %r: %s' % (h, e))

    def common_logger_config(self, logger, config, incremental=False):
        """
        Perform configuration which is common to root and non-root loggers.
        """
        level = config.get('level', None)
        if level is not None:
            logger.setLevel(_checkLevel(level))
        if not incremental:
            # Remove any existing handlers
            for h in logger.handlers[:]:
                logger.removeHandler(h)
            handlers = config.get('handlers', None)
            if handlers:
                self.add_handlers(logger, handlers)
            filters = config.get('filters', None)
            if filters:
                self.add_filters(logger, filters)

    def configure_logger(self, name, config, incremental=False):
        """Configure a non-root logger from a dictionary."""
        logger = logging.getLogger(name)
        self.common_logger_config(logger, config, incremental)
        propagate = config.get('propagate', None)
        if propagate is not None:
            logger.propagate = propagate

    def configure_root(self, config, incremental=False):
        """Configure a root logger from a dictionary."""
        root = logging.getLogger()
        self.common_logger_config(root, config, incremental)

dictConfigClass = DictConfigurator


def dictConfig(config):
    """Configure logging using a dictionary."""
    dictConfigClass(config).configure()
@@ -11,48 +11,44 @@ import platform
import re
import shutil
import sys

from pip._vendor import requests, six, urllib3
from pip._vendor.cachecontrol import CacheControlAdapter
from pip._vendor.cachecontrol.caches import FileCache
from pip._vendor.lockfile import LockError
from pip._vendor.requests.adapters import BaseAdapter, HTTPAdapter
from pip._vendor.requests.auth import AuthBase, HTTPBasicAuth
from pip._vendor.requests.models import CONTENT_CHUNK_SIZE, Response
from pip._vendor.requests.structures import CaseInsensitiveDict
from pip._vendor.requests.utils import get_netrc_auth
# NOTE: XMLRPC Client is not annotated in typeshed as on 2017-07-17, which is
# why we ignore the type on this import
from pip._vendor.six.moves import xmlrpc_client  # type: ignore
from pip._vendor.six.moves.urllib import parse as urllib_parse
from pip._vendor.six.moves.urllib import request as urllib_request
from pip._vendor.six.moves.urllib.parse import unquote as urllib_unquote
from pip._vendor.urllib3.util import IS_PYOPENSSL

import pip
from pip._internal.exceptions import HashMismatch, InstallationError
from pip._internal.locations import write_delete_marker_file
from pip._internal.models.index import PyPI
from pip._internal.utils.encoding import auto_decode
from pip._internal.utils.filesystem import check_path_owner
from pip._internal.utils.glibc import libc_ver
from pip._internal.utils.logging import indent_log
from pip._internal.utils.misc import (
    ARCHIVE_EXTENSIONS, ask_path_exists, backup_dir, call_subprocess, consume,
    display_path, format_size, get_installed_version, rmtree, splitext,
    unpack_file,
)
from pip._internal.utils.setuptools_build import SETUPTOOLS_SHIM
from pip._internal.utils.temp_dir import TempDirectory
from pip._internal.utils.ui import DownloadProgressProvider
from pip._internal.vcs import vcs
import tempfile

try:
    import ssl  # noqa
    HAS_TLS = True
except ImportError:
    ssl = None
    HAS_TLS = False

from pip._vendor.six.moves.urllib import parse as urllib_parse
from pip._vendor.six.moves.urllib import request as urllib_request

import pip

from pip.exceptions import InstallationError, HashMismatch
from pip.models import PyPI
from pip.utils import (splitext, rmtree, format_size, display_path,
                       backup_dir, ask_path_exists, unpack_file,
                       ARCHIVE_EXTENSIONS, consume, call_subprocess)
from pip.utils.encoding import auto_decode
from pip.utils.filesystem import check_path_owner
from pip.utils.logging import indent_log
from pip.utils.setuptools_build import SETUPTOOLS_SHIM
from pip.utils.glibc import libc_ver
from pip.utils.ui import DownloadProgressBar, DownloadProgressSpinner
from pip.locations import write_delete_marker_file
from pip.vcs import vcs
from pip._vendor import requests, six
from pip._vendor.requests.adapters import BaseAdapter, HTTPAdapter
from pip._vendor.requests.auth import AuthBase, HTTPBasicAuth
from pip._vendor.requests.models import CONTENT_CHUNK_SIZE, Response
from pip._vendor.requests.utils import get_netrc_auth
from pip._vendor.requests.structures import CaseInsensitiveDict
from pip._vendor.requests.packages import urllib3
from pip._vendor.cachecontrol import CacheControlAdapter
from pip._vendor.cachecontrol.caches import FileCache
from pip._vendor.lockfile import LockError
from pip._vendor.six.moves import xmlrpc_client

HAS_TLS = (ssl is not None) or IS_PYOPENSSL

__all__ = ['get_file_content',
           'is_url', 'url_to_path', 'path_to_url',
@@ -120,13 +116,10 @@ def user_agent():
    if platform.machine():
        data["cpu"] = platform.machine()

    if HAS_TLS:
    # Python 2.6 doesn't have ssl.OPENSSL_VERSION.
    if HAS_TLS and sys.version_info[:2] > (2, 6):
        data["openssl_version"] = ssl.OPENSSL_VERSION

    setuptools_version = get_installed_version("setuptools")
    if setuptools_version is not None:
        data["setuptools_version"] = setuptools_version

    return "{data[installer][name]}/{data[installer][version]} {json}".format(
        data=data,
        json=json.dumps(data, separators=(",", ":"), sort_keys=True),
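The return statement in this hunk leans on str.format's item-access syntax ("{data[installer][name]}"); a tiny standalone illustration with made-up values:

# Hypothetical values, shown only to illustrate the field-access style of
# str.format used by user_agent() above.
import json

data = {"installer": {"name": "pip", "version": "9.0.1"}, "python": "3.4.2"}
ua = "{data[installer][name]}/{data[installer][version]} {json}".format(
    data=data,
    json=json.dumps(data, separators=(",", ":"), sort_keys=True),
)
print(ua)  # pip/9.0.1 {"installer":{...},"python":"3.4.2"}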
@@ -210,9 +203,8 @@ class MultiDomainBasicAuth(AuthBase):
        if "@" in netloc:
            userinfo = netloc.rsplit("@", 1)[0]
            if ":" in userinfo:
                user, pwd = userinfo.split(":", 1)
                return (urllib_unquote(user), urllib_unquote(pwd))
            return urllib_unquote(userinfo), None
                return userinfo.split(":", 1)
            return userinfo, None
        return None, None
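The parsing above strips credentials out of a URL's network location; a standalone sketch (the URL and password are invented, and the stdlib urllib.parse stands in for the vendored six.moves imports):

# Standard-library equivalent of the credential parsing in this hunk.
from urllib.parse import unquote, urlsplit

netloc = urlsplit("https://user:p%40ss@pypi.example.org/simple").netloc
userinfo = netloc.rsplit("@", 1)[0]   # -> "user:p%40ss"
user, pwd = userinfo.split(":", 1)
print(unquote(user), unquote(pwd))    # -> user p@ss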
@@ -350,9 +342,7 @@ class PipSession(requests.Session):
            # connection got interrupted in some way. A 503 error in general
            # is typically considered a transient error so we'll go ahead and
            # retry it.
            # A 500 may indicate transient error in Amazon S3
            # A 520 or 527 - may indicate transient error in CloudFlare
            status_forcelist=[500, 503, 520, 527],
            status_forcelist=[503],

            # Add a small amount of back off between failed requests in
            # order to prevent hammering the service.
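The retry policy being diffed here is ordinary urllib3 configuration; a minimal sketch of the same idea outside pip, using requests/urllib3 directly rather than pip's vendored copies (the numbers mirror the hunk above):

# Treat a handful of status codes as transient and back off between attempts.
import requests
from requests.adapters import HTTPAdapter
from urllib3.util.retry import Retry

retries = Retry(
    total=3,
    backoff_factor=0.25,                    # small back-off between attempts
    status_forcelist=[500, 503, 520, 527],  # the codes discussed above
)
session = requests.Session()
session.mount("https://", HTTPAdapter(max_retries=retries))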
@@ -386,7 +376,7 @@ class PipSession(requests.Session):
        # We want to use a non-validating adapter for any requests which are
        # deemed insecure.
        for host in insecure_hosts:
            self.mount("https://{}/".format(host), insecure_adapter)
            self.mount("https://{0}/".format(host), insecure_adapter)

    def request(self, method, url, *args, **kwargs):
        # Allow setting a default timeout on a session

@@ -398,12 +388,7 @@ class PipSession(requests.Session):

def get_file_content(url, comes_from=None, session=None):
    """Gets the content of a file; it may be a filename, file: URL, or
    http: URL. Returns (location, content). Content is unicode.

    :param url: File path or url.
    :param comes_from: Origin description of requirements.
    :param session: Instance of pip.download.PipSession.
    """
    http: URL. Returns (location, content). Content is unicode."""
    if session is None:
        raise TypeError(
            "get_file_content() missing 1 required keyword argument: 'session'"

@@ -524,13 +509,14 @@ def _progress_indicator(iterable, *args, **kwargs):
    return iterable


def _download_url(resp, link, content_file, hashes, progress_bar):
def _download_url(resp, link, content_file, hashes):
    try:
        total_length = int(resp.headers['content-length'])
    except (ValueError, KeyError, TypeError):
        total_length = 0

    cached_resp = getattr(resp, "from_cache", False)

    if logger.getEffectiveLevel() > logging.INFO:
        show_progress = False
    elif cached_resp:

@@ -594,12 +580,12 @@ def _download_url(resp, link, content_file, hashes, progress_bar):
    url = link.url_without_fragment

    if show_progress:  # We don't show progress on cached responses
        progress_indicator = DownloadProgressProvider(progress_bar,
                                                      max=total_length)
        if total_length:
            logger.info("Downloading %s (%s)", url, format_size(total_length))
            progress_indicator = DownloadProgressBar(max=total_length).iter
        else:
            logger.info("Downloading %s", url)
            progress_indicator = DownloadProgressSpinner().iter
    elif cached_resp:
        logger.info("Using cached %s", url)
    else:

@@ -647,13 +633,14 @@ def _copy_file(filename, location, link):


def unpack_http_url(link, location, download_dir=None,
                    session=None, hashes=None, progress_bar="on"):
                    session=None, hashes=None):
    if session is None:
        raise TypeError(
            "unpack_http_url() missing 1 required keyword argument: 'session'"
        )

    with TempDirectory(kind="unpack") as temp_dir:
    temp_dir = tempfile.mkdtemp('-unpack', 'pip-')

    # If a download dir is specified, is the file already downloaded there?
    already_downloaded_path = None
    if download_dir:

@@ -668,12 +655,11 @@ def unpack_http_url(link, location, download_dir=None,
        # let's download to a tmp dir
        from_path, content_type = _download_http_url(link,
                                                     session,
                                                     temp_dir.path,
                                                     hashes,
                                                     progress_bar)
                                                     temp_dir,
                                                     hashes)

        # unpack the archive to the build dir location. even when only
        # downloading archives, they have to be unpacked to parse dependencies
    # unpack the archive to the build dir location. even when only downloading
    # archives, they have to be unpacked to parse dependencies
    unpack_file(from_path, location, content_type, link)

    # a download dir is specified; let's copy the archive there

@@ -682,6 +668,7 @@ def unpack_http_url(link, location, download_dir=None,

    if not already_downloaded_path:
        os.unlink(from_path)
    rmtree(temp_dir)


def unpack_file_url(link, location, download_dir=None, hashes=None):

@@ -798,8 +785,7 @@ class PipXmlrpcTransport(xmlrpc_client.Transport):


def unpack_url(link, location, download_dir=None,
               only_download=False, session=None, hashes=None,
               progress_bar="on"):
               only_download=False, session=None, hashes=None):
    """Unpack link.
    If link is a VCS link:
        if only_download, export into download_dir and ignore location

@@ -832,14 +818,13 @@ def unpack_url(link, location, download_dir=None,
            location,
            download_dir,
            session,
            hashes=hashes,
            progress_bar=progress_bar
            hashes=hashes
        )
    if only_download:
        write_delete_marker_file(location)


def _download_http_url(link, session, temp_dir, hashes, progress_bar):
def _download_http_url(link, session, temp_dir, hashes):
    """Download link url into temp_dir using provided session"""
    target_url = link.url.split('#', 1)[0]
    try:

@@ -894,7 +879,7 @@ def _download_http_url(link, session, temp_dir, hashes, progress_bar):
        filename += ext
    file_path = os.path.join(temp_dir, filename)
    with open(file_path, 'wb') as content_file:
        _download_url(resp, link, content_file, hashes, progress_bar)
        _download_url(resp, link, content_file, hashes)
    return file_path, content_type


@@ -10,10 +10,6 @@ class PipError(Exception):
    """Base pip exception"""


class ConfigurationError(PipError):
    """General exception in configuration"""


class InstallationError(PipError):
    """General exception during installation"""


@@ -162,8 +158,7 @@ class HashMissing(HashError):
        self.gotten_hash = gotten_hash

    def body(self):
        # Dodge circular import.
        from pip._internal.utils.hashes import FAVORITE_HASH
        from pip.utils.hashes import FAVORITE_HASH  # Dodge circular import.

        package = None
        if self.req:

@@ -247,22 +242,3 @@ class HashMismatch(HashError):
class UnsupportedPythonVersion(InstallationError):
    """Unsupported python version according to Requires-Python package
    metadata."""


class ConfigurationFileCouldNotBeLoaded(ConfigurationError):
    """When there are errors while loading a configuration file
    """

    def __init__(self, reason="could not be loaded", fname=None, error=None):
        super(ConfigurationFileCouldNotBeLoaded, self).__init__(error)
        self.reason = reason
        self.fname = fname
        self.error = error

    def __str__(self):
        if self.fname is not None:
            message_part = " in {}.".format(self.fname)
        else:
            assert self.error is not None
            message_part = ".\n{}\n".format(self.error.message)
        return "Configuration file {}{}".format(self.reason, message_part)
@@ -1,46 +1,44 @@
"""Routines related to PyPI, indexes"""
from __future__ import absolute_import

import cgi
import itertools
import logging
import mimetypes
import os
import posixpath
import re
import sys
import cgi
from collections import namedtuple
import itertools
import sys
import os
import re
import mimetypes
import posixpath
import warnings

from pip._vendor import html5lib, requests, six
from pip._vendor.distlib.compat import unescape
from pip._vendor.packaging import specifiers
from pip._vendor.packaging.utils import canonicalize_name
from pip._vendor.packaging.version import parse as parse_version
from pip._vendor.requests.exceptions import SSLError
from pip._vendor.six.moves.urllib import parse as urllib_parse
from pip._vendor.six.moves.urllib import request as urllib_request

from pip._internal.download import HAS_TLS, is_url, path_to_url, url_to_path
from pip._internal.exceptions import (
    BestVersionAlreadyInstalled, DistributionNotFound, InvalidWheelFilename,
from pip.compat import ipaddress
from pip.utils import (
    cached_property, splitext, normalize_path,
    ARCHIVE_EXTENSIONS, SUPPORTED_EXTENSIONS,
)
from pip.utils.deprecation import RemovedInPip10Warning
from pip.utils.logging import indent_log
from pip.utils.packaging import check_requires_python
from pip.exceptions import (
    DistributionNotFound, BestVersionAlreadyInstalled, InvalidWheelFilename,
    UnsupportedWheel,
)
from pip._internal.models.candidate import InstallationCandidate
from pip._internal.models.format_control import FormatControl
from pip._internal.models.index import PyPI
from pip._internal.models.link import Link
from pip._internal.pep425tags import get_supported
from pip._internal.utils.compat import ipaddress
from pip._internal.utils.deprecation import deprecated
from pip._internal.utils.logging import indent_log
from pip._internal.utils.misc import (
    ARCHIVE_EXTENSIONS, SUPPORTED_EXTENSIONS, normalize_path,
    remove_auth_from_url,
)
from pip._internal.utils.packaging import check_requires_python
from pip._internal.wheel import Wheel, wheel_ext
from pip.download import HAS_TLS, is_url, path_to_url, url_to_path
from pip.wheel import Wheel, wheel_ext
from pip.pep425tags import get_supported
from pip._vendor import html5lib, requests, six
from pip._vendor.packaging.version import parse as parse_version
from pip._vendor.packaging.utils import canonicalize_name
from pip._vendor.packaging import specifiers
from pip._vendor.requests.exceptions import SSLError
from pip._vendor.distlib.compat import unescape

__all__ = ['FormatControl', 'PackageFinder']

__all__ = ['FormatControl', 'fmt_ctl_handle_mutual_exclude', 'PackageFinder']


SECURE_ORIGINS = [

@@ -59,120 +57,45 @@ SECURE_ORIGINS = [
logger = logging.getLogger(__name__)


def _get_content_type(url, session):
    """Get the Content-Type of the given url, using a HEAD request"""
    scheme, netloc, path, query, fragment = urllib_parse.urlsplit(url)
    if scheme not in {'http', 'https'}:
        # FIXME: some warning or something?
        # assertion error?
        return ''
class InstallationCandidate(object):

    resp = session.head(url, allow_redirects=True)
    resp.raise_for_status()
    def __init__(self, project, version, location):
        self.project = project
        self.version = parse_version(version)
        self.location = location
        self._key = (self.project, self.version, self.location)

    return resp.headers.get("Content-Type", "")


def _handle_get_page_fail(link, reason, url, meth=None):
    if meth is None:
        meth = logger.debug
    meth("Could not fetch URL %s: %s - skipping", link, reason)


def _get_html_page(link, session=None):
    if session is None:
        raise TypeError(
            "_get_html_page() missing 1 required keyword argument: 'session'"
    def __repr__(self):
        return "<InstallationCandidate({0!r}, {1!r}, {2!r})>".format(
            self.project, self.version, self.location,
        )

    url = link.url
    url = url.split('#', 1)[0]
    def __hash__(self):
        return hash(self._key)

    # Check for VCS schemes that do not support lookup as web pages.
    from pip._internal.vcs import VcsSupport
    for scheme in VcsSupport.schemes:
        if url.lower().startswith(scheme) and url[len(scheme)] in '+:':
            logger.debug('Cannot look at %s URL %s', scheme, link)
            return None
    def __lt__(self, other):
        return self._compare(other, lambda s, o: s < o)

    try:
        filename = link.filename
        for bad_ext in ARCHIVE_EXTENSIONS:
            if filename.endswith(bad_ext):
                content_type = _get_content_type(url, session=session)
                if content_type.lower().startswith('text/html'):
                    break
                else:
                    logger.debug(
                        'Skipping page %s because of Content-Type: %s',
                        link,
                        content_type,
                    )
                    return
    def __le__(self, other):
        return self._compare(other, lambda s, o: s <= o)

        logger.debug('Getting page %s', url)
    def __eq__(self, other):
        return self._compare(other, lambda s, o: s == o)

        # Tack index.html onto file:// URLs that point to directories
        (scheme, netloc, path, params, query, fragment) = \
            urllib_parse.urlparse(url)
        if (scheme == 'file' and
                os.path.isdir(urllib_request.url2pathname(path))):
            # add trailing slash if not present so urljoin doesn't trim
            # final segment
            if not url.endswith('/'):
                url += '/'
            url = urllib_parse.urljoin(url, 'index.html')
            logger.debug(' file: URL is directory, getting %s', url)
    def __ge__(self, other):
        return self._compare(other, lambda s, o: s >= o)

        resp = session.get(
            url,
            headers={
                "Accept": "text/html",
                # We don't want to blindly return cached data for
                # /simple/, because authors are generally expecting that
                # twine upload && pip install will function, but if
                # they've done a pip install in the last ~10 minutes
                # it won't. Thus by setting this to zero we will not
                # blindly use any cached data, however the benefit of
                # using max-age=0 instead of no-cache, is that we will
                # still support conditional requests, so we will still
                # minimize traffic sent in cases where the page hasn't
                # changed at all, we will just always incur the round
                # trip for the conditional GET now instead of only
                # once per 10 minutes.
                # For more information, please see pypa/pip#5670.
                "Cache-Control": "max-age=0",
            },
        )
        resp.raise_for_status()
    def __gt__(self, other):
        return self._compare(other, lambda s, o: s > o)

        # The check for archives above only works if the url ends with
        # something that looks like an archive. However that is not a
        # requirement of an url. Unless we issue a HEAD request on every
        # url we cannot know ahead of time for sure if something is HTML
        # or not. However we can check after we've downloaded it.
        content_type = resp.headers.get('Content-Type', 'unknown')
        if not content_type.lower().startswith("text/html"):
            logger.debug(
                'Skipping page %s because of Content-Type: %s',
                link,
                content_type,
            )
            return
    def __ne__(self, other):
        return self._compare(other, lambda s, o: s != o)

        inst = HTMLPage(resp.content, resp.url, resp.headers)
    except requests.HTTPError as exc:
        _handle_get_page_fail(link, exc, url)
    except SSLError as exc:
        reason = "There was a problem confirming the ssl certificate: "
        reason += str(exc)
        _handle_get_page_fail(link, reason, url, meth=logger.info)
    except requests.ConnectionError as exc:
        _handle_get_page_fail(link, "connection error: %s" % exc, url)
    except requests.Timeout:
        _handle_get_page_fail(link, "timed out", url)
    else:
        return inst
    def _compare(self, other, method):
        if not isinstance(other, InstallationCandidate):
            return NotImplemented

        return method(self._key, other._key)
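Since _compare() delegates to the (project, parsed version, location) key built in __init__, candidates order by version rather than by string; a small sketch with an invented package:

# Hypothetical candidates for a package "foo".
cands = [
    InstallationCandidate('foo', '1.10', 'https://example.org/foo-1.10.tar.gz'),
    InstallationCandidate('foo', '1.2', 'https://example.org/foo-1.2.tar.gz'),
    InstallationCandidate('foo', '1.9', 'https://example.org/foo-1.9.tar.gz'),
]
best = max(cands)
print(best.version)  # 1.10 -- parse_version orders versions, not strings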
class PackageFinder(object):

@@ -185,8 +108,7 @@ class PackageFinder(object):
    def __init__(self, find_links, index_urls, allow_all_prereleases=False,
                 trusted_hosts=None, process_dependency_links=False,
                 session=None, format_control=None, platform=None,
                 versions=None, abi=None, implementation=None,
                 prefer_binary=False):
                 versions=None, abi=None, implementation=None):
        """Create a PackageFinder.

        :param format_control: A FormatControl object or None. Used to control

@@ -254,9 +176,6 @@ class PackageFinder(object):
            impl=implementation,
        )

        # Do we prefer old, but valid, binary dist over new source dist
        self.prefer_binary = prefer_binary

        # If we don't have TLS enabled, then WARN if anyplace we're looking
        # relies on TLS.
        if not HAS_TLS:

@@ -270,31 +189,16 @@ class PackageFinder(object):
                )
                break

    def get_formatted_locations(self):
        lines = []
        if self.index_urls and self.index_urls != [PyPI.simple_url]:
            lines.append(
                "Looking in indexes: {}".format(", ".join(
                    remove_auth_from_url(url) for url in self.index_urls))
            )
        if self.find_links:
            lines.append(
                "Looking in links: {}".format(", ".join(self.find_links))
            )
        return "\n".join(lines)

    def add_dependency_links(self, links):
        # FIXME: this shouldn't be global list this, it should only
        # apply to requirements of the package that specifies the
        # dependency_links value
        # FIXME: also, we should track comes_from (i.e., use Link)
        # # FIXME: this shouldn't be global list this, it should only
        # # apply to requirements of the package that specifies the
        # # dependency_links value
        # # FIXME: also, we should track comes_from (i.e., use Link)
        if self.process_dependency_links:
            deprecated(
            warnings.warn(
                "Dependency Links processing has been deprecated and will be "
                "removed in a future release.",
                replacement="PEP 508 URL dependencies",
                gone_in="18.2",
                issue=4187,
                RemovedInPip10Warning,
            )
            self.dependency_links.extend(links)

@@ -337,16 +241,14 @@ class PackageFinder(object):
            else:
                logger.warning(
                    "Url '%s' is ignored: it is neither a file "
                    "nor a directory.", url,
                )
                    "nor a directory.", url)
        elif is_url(url):
            # Only add url with clear scheme
            urls.append(url)
        else:
            logger.warning(
                "Url '%s' is ignored. It is either a non-existing "
                "path or lacks a specific scheme.", url,
            )
                "path or lacks a specific scheme.", url)

        return files, urls

@@ -359,14 +261,11 @@ class PackageFinder(object):
        1. existing installs
        2. wheels ordered via Wheel.support_index_min(self.valid_tags)
        3. source archives
        If prefer_binary was set, then all wheels are sorted above sources.
        Note: it was considered to embed this logic into the Link
              comparison operators, but then different sdist links
              with the same version, would have to be considered equal
        """
        support_num = len(self.valid_tags)
        build_tag = tuple()
        binary_preference = 0
        if candidate.location.is_wheel:
            # can raise InvalidWheelFilename
            wheel = Wheel(candidate.location.filename)

@@ -375,16 +274,10 @@ class PackageFinder(object):
                    "%s is not a supported wheel for this platform. It "
                    "can't be sorted." % wheel.filename
                )
            if self.prefer_binary:
                binary_preference = 1
            pri = -(wheel.support_index_min(self.valid_tags))
            if wheel.build_tag is not None:
                match = re.match(r'^(\d+)(.*)$', wheel.build_tag)
                build_tag_groups = match.groups()
                build_tag = (int(build_tag_groups[0]), build_tag_groups[1])
        else:  # sdist
            pri = -(support_num)
        return (binary_preference, candidate.version, build_tag, pri)
        return (candidate.version, pri)

    def _validate_secure_origin(self, logger, location):
        # Determine if this url used a secure transport mechanism
@@ -448,9 +341,9 @@ class PackageFinder(object):
        # log a warning that we are ignoring it.
        logger.warning(
            "The repository located at %s is not a trusted or secure host and "
            "is being ignored. If this repository is available via HTTPS we "
            "recommend you use HTTPS instead, otherwise you may silence "
            "this warning and allow it anyway with '--trusted-host %s'.",
            "is being ignored. If this repository is available via HTTPS it "
            "is recommended to use HTTPS instead, otherwise you may silence "
            "this warning and allow it anyways with '--trusted-host %s'.",
            parsed.hostname,
            parsed.hostname,
        )

@@ -490,13 +383,13 @@ class PackageFinder(object):
        index_locations = self._get_index_urls_locations(project_name)
        index_file_loc, index_url_loc = self._sort_locations(index_locations)
        fl_file_loc, fl_url_loc = self._sort_locations(
            self.find_links, expand_dir=True,
        )
            self.find_links, expand_dir=True)
        dep_file_loc, dep_url_loc = self._sort_locations(self.dependency_links)

        file_locations = (Link(url) for url in itertools.chain(
            index_file_loc, fl_file_loc, dep_file_loc,
        ))
        file_locations = (
            Link(url) for url in itertools.chain(
                index_file_loc, fl_file_loc, dep_file_loc)
        )

        # We trust every url that the user has given us whether it was given
        # via --index-url or --find-links

@@ -518,7 +411,7 @@ class PackageFinder(object):
            logger.debug('* %s', location)

        canonical_name = canonicalize_name(project_name)
        formats = self.format_control.get_allowed_formats(canonical_name)
        formats = fmt_ctl_formats(self.format_control, canonical_name)
        search = Search(project_name, canonical_name, formats)
        find_links_versions = self._package_versions(
            # We trust every directly linked archive in find_links

@@ -531,7 +424,7 @@ class PackageFinder(object):
            logger.debug('Analyzing links from page %s', page.url)
            with indent_log():
                page_versions.extend(
                    self._package_versions(page.iter_links(), search)
                    self._package_versions(page.links, search)
                )

        dependency_versions = self._package_versions(

@@ -611,7 +504,7 @@ class PackageFinder(object):
                req,
                ', '.join(
                    sorted(
                        {str(c.version) for c in all_candidates},
                        set(str(c.version) for c in all_candidates),
                        key=parse_version,
                    )
                )

@@ -722,13 +615,11 @@ class PackageFinder(object):
                return
            if ext not in SUPPORTED_EXTENSIONS:
                self._log_skipped_link(
                    link, 'unsupported archive format: %s' % ext,
                )
                    link, 'unsupported archive format: %s' % ext)
                return
            if "binary" not in search.formats and ext == wheel_ext:
                self._log_skipped_link(
                    link, 'No binaries permitted for %s' % search.supplied,
                )
                    link, 'No binaries permitted for %s' % search.supplied)
                return
            if "macosx10" in link.path and ext == '.zip':
                self._log_skipped_link(link, 'macosx10 one')

@@ -754,15 +645,14 @@ class PackageFinder(object):
        # This should be up by the search.ok_binary check, but see issue 2700.
        if "source" not in search.formats and ext != wheel_ext:
            self._log_skipped_link(
                link, 'No sources permitted for %s' % search.supplied,
            )
                link, 'No sources permitted for %s' % search.supplied)
            return

        if not version:
            version = egg_info_matches(egg_info, search.supplied, link)
        if version is None:
            self._log_skipped_link(
                link, 'Missing project version for %s' % search.supplied)
                link, 'wrong project name (not %s)' % search.supplied)
            return

        match = self._py_version_re.search(version)

@@ -790,7 +680,7 @@ class PackageFinder(object):
        return InstallationCandidate(search.supplied, version, link)

    def _get_page(self, link):
        return _get_html_page(link, session=self.session)
        return HTMLPage.get_page(link, session=self.session)


def egg_info_matches(

@@ -810,7 +700,7 @@ def egg_info_matches(
        return None
    if search_name is None:
        full_match = match.group(0)
        return full_match.split('-', 1)[-1]
        return full_match[full_match.index('-'):]
    name = match.group(0).lower()
    # To match the "safe" name that pkg_resources creates:
    name = name.replace('_', '-')

@@ -822,71 +712,384 @@ def egg_info_matches(
    return None


def _determine_base_url(document, page_url):
    """Determine the HTML document's base URL.

    This looks for a ``<base>`` tag in the HTML document. If present, its href
    attribute denotes the base URL of anchor tags in the document. If there is
    no such tag (or if it does not have a valid href attribute), the HTML
    file's URL is used as the base URL.

    :param document: An HTML document representation. The current
        implementation expects the result of ``html5lib.parse()``.
    :param page_url: The URL of the HTML document.
    """
    for base in document.findall(".//base"):
        href = base.get("href")
        if href is not None:
            return href
    return page_url
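A quick sketch of the base-URL resolution described in the docstring above; the HTML snippet and URLs are invented:

# A <base> tag overrides the page URL for relative links on the page.
from pip._vendor import html5lib  # plain `import html5lib` outside pip

html = b'<html><head><base href="https://mirror.example.org/simple/"></head>'
doc = html5lib.parse(html, namespaceHTMLElements=False)
print(_determine_base_url(doc, "https://pypi.example.org/simple/foo/"))
# -> https://mirror.example.org/simple/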
def _get_encoding_from_headers(headers):
    """Determine if we have any encoding information in our headers.
    """
    if headers and "Content-Type" in headers:
        content_type, params = cgi.parse_header(headers["Content-Type"])
        if "charset" in params:
            return params['charset']
    return None
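The cgi.parse_header() call is doing the work here: it splits a header value into the main value and a dict of parameters. For example:

import cgi

content_type, params = cgi.parse_header("text/html; charset=UTF-8")
print(content_type)        # text/html
print(params["charset"])   # UTF-8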
_CLEAN_LINK_RE = re.compile(r'[^a-z0-9$&+,/:;=?@.#%_\\|-]', re.I)


def _clean_link(url):
    """Makes sure a link is fully encoded. That is, if a ' ' shows up in
    the link, it will be rewritten to %20 (while not over-quoting
    % or other characters)."""
    return _CLEAN_LINK_RE.sub(lambda match: '%%%2x' % ord(match.group(0)), url)
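A one-line illustration of _clean_link() on an invented URL; only characters outside the allowed set are percent-encoded, so existing '%' escapes survive:

print(_clean_link('https://example.org/my file.tar.gz'))
# -> https://example.org/my%20file.tar.gz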
class HTMLPage(object):
    """Represents one page, along with its URL"""

    def __init__(self, content, url, headers=None):
        # Determine if we have any encoding information in our headers
        encoding = None
        if headers and "Content-Type" in headers:
            content_type, params = cgi.parse_header(headers["Content-Type"])

            if "charset" in params:
                encoding = params['charset']

        self.content = content
        self.parsed = html5lib.parse(
            self.content,
            transport_encoding=encoding,
            namespaceHTMLElements=False,
        )
        self.url = url
        self.headers = headers

    def __str__(self):
        return self.url

    def iter_links(self):
        """Yields all links in the page"""
        document = html5lib.parse(
            self.content,
            transport_encoding=_get_encoding_from_headers(self.headers),
            namespaceHTMLElements=False,
    @classmethod
    def get_page(cls, link, skip_archives=True, session=None):
        if session is None:
            raise TypeError(
                "get_page() missing 1 required keyword argument: 'session'"
            )
        base_url = _determine_base_url(document, self.url)
        for anchor in document.findall(".//a"):

        url = link.url
        url = url.split('#', 1)[0]

        # Check for VCS schemes that do not support lookup as web pages.
        from pip.vcs import VcsSupport
        for scheme in VcsSupport.schemes:
            if url.lower().startswith(scheme) and url[len(scheme)] in '+:':
                logger.debug('Cannot look at %s URL %s', scheme, link)
                return None

        try:
            if skip_archives:
                filename = link.filename
                for bad_ext in ARCHIVE_EXTENSIONS:
                    if filename.endswith(bad_ext):
                        content_type = cls._get_content_type(
                            url, session=session,
                        )
                        if content_type.lower().startswith('text/html'):
                            break
                        else:
                            logger.debug(
                                'Skipping page %s because of Content-Type: %s',
                                link,
                                content_type,
                            )
                            return

            logger.debug('Getting page %s', url)

            # Tack index.html onto file:// URLs that point to directories
            (scheme, netloc, path, params, query, fragment) = \
                urllib_parse.urlparse(url)
            if (scheme == 'file' and
                    os.path.isdir(urllib_request.url2pathname(path))):
                # add trailing slash if not present so urljoin doesn't trim
                # final segment
                if not url.endswith('/'):
                    url += '/'
                url = urllib_parse.urljoin(url, 'index.html')
                logger.debug(' file: URL is directory, getting %s', url)

            resp = session.get(
                url,
                headers={
                    "Accept": "text/html",
                    "Cache-Control": "max-age=600",
                },
            )
            resp.raise_for_status()

            # The check for archives above only works if the url ends with
            # something that looks like an archive. However that is not a
            # requirement of an url. Unless we issue a HEAD request on every
            # url we cannot know ahead of time for sure if something is HTML
            # or not. However we can check after we've downloaded it.
            content_type = resp.headers.get('Content-Type', 'unknown')
            if not content_type.lower().startswith("text/html"):
                logger.debug(
                    'Skipping page %s because of Content-Type: %s',
                    link,
                    content_type,
                )
                return

            inst = cls(resp.content, resp.url, resp.headers)
        except requests.HTTPError as exc:
            cls._handle_fail(link, exc, url)
        except SSLError as exc:
            reason = ("There was a problem confirming the ssl certificate: "
                      "%s" % exc)
            cls._handle_fail(link, reason, url, meth=logger.info)
        except requests.ConnectionError as exc:
            cls._handle_fail(link, "connection error: %s" % exc, url)
        except requests.Timeout:
            cls._handle_fail(link, "timed out", url)
        else:
            return inst

    @staticmethod
    def _handle_fail(link, reason, url, meth=None):
        if meth is None:
            meth = logger.debug

        meth("Could not fetch URL %s: %s - skipping", link, reason)

    @staticmethod
    def _get_content_type(url, session):
        """Get the Content-Type of the given url, using a HEAD request"""
        scheme, netloc, path, query, fragment = urllib_parse.urlsplit(url)
        if scheme not in ('http', 'https'):
            # FIXME: some warning or something?
            # assertion error?
            return ''

        resp = session.head(url, allow_redirects=True)
        resp.raise_for_status()

        return resp.headers.get("Content-Type", "")

    @cached_property
    def base_url(self):
        bases = [
            x for x in self.parsed.findall(".//base")
            if x.get("href") is not None
        ]
        if bases and bases[0].get("href"):
            return bases[0].get("href")
        else:
            return self.url

    @property
    def links(self):
        """Yields all links in the page"""
        for anchor in self.parsed.findall(".//a"):
            if anchor.get("href"):
                href = anchor.get("href")
                url = _clean_link(urllib_parse.urljoin(base_url, href))
                url = self.clean_link(
                    urllib_parse.urljoin(self.base_url, href)
                )
                pyrequire = anchor.get('data-requires-python')
                pyrequire = unescape(pyrequire) if pyrequire else None
                yield Link(url, self.url, requires_python=pyrequire)
                yield Link(url, self, requires_python=pyrequire)

    _clean_re = re.compile(r'[^a-z0-9$&+,/:;=?@.#%_\\|-]', re.I)

    def clean_link(self, url):
        """Makes sure a link is fully encoded. That is, if a ' ' shows up in
        the link, it will be rewritten to %20 (while not over-quoting
        % or other characters)."""
        return self._clean_re.sub(
            lambda match: '%%%2x' % ord(match.group(0)), url)
class Link(object):

    def __init__(self, url, comes_from=None, requires_python=None):
        """
        Object representing a parsed link from https://pypi.python.org/simple/*

        url:
            url of the resource pointed to (href of the link)
        comes_from:
            instance of HTMLPage where the link was found, or string.
        requires_python:
            String containing the `Requires-Python` metadata field, specified
            in PEP 345. This may be specified by a data-requires-python
            attribute in the HTML link tag, as described in PEP 503.
        """

        # url can be a UNC windows share
        if url.startswith('\\\\'):
            url = path_to_url(url)

        self.url = url
        self.comes_from = comes_from
        self.requires_python = requires_python if requires_python else None

    def __str__(self):
        if self.requires_python:
            rp = ' (requires-python:%s)' % self.requires_python
        else:
            rp = ''
        if self.comes_from:
            return '%s (from %s)%s' % (self.url, self.comes_from, rp)
        else:
            return str(self.url)

    def __repr__(self):
        return '<Link %s>' % self

    def __eq__(self, other):
        if not isinstance(other, Link):
            return NotImplemented
        return self.url == other.url

    def __ne__(self, other):
        if not isinstance(other, Link):
            return NotImplemented
        return self.url != other.url

    def __lt__(self, other):
        if not isinstance(other, Link):
            return NotImplemented
        return self.url < other.url

    def __le__(self, other):
        if not isinstance(other, Link):
            return NotImplemented
        return self.url <= other.url

    def __gt__(self, other):
        if not isinstance(other, Link):
            return NotImplemented
        return self.url > other.url

    def __ge__(self, other):
        if not isinstance(other, Link):
            return NotImplemented
        return self.url >= other.url

    def __hash__(self):
        return hash(self.url)

    @property
    def filename(self):
        _, netloc, path, _, _ = urllib_parse.urlsplit(self.url)
        name = posixpath.basename(path.rstrip('/')) or netloc
        name = urllib_parse.unquote(name)
        assert name, ('URL %r produced no filename' % self.url)
        return name

    @property
    def scheme(self):
        return urllib_parse.urlsplit(self.url)[0]

    @property
    def netloc(self):
        return urllib_parse.urlsplit(self.url)[1]

    @property
    def path(self):
        return urllib_parse.unquote(urllib_parse.urlsplit(self.url)[2])

    def splitext(self):
        return splitext(posixpath.basename(self.path.rstrip('/')))

    @property
    def ext(self):
        return self.splitext()[1]

    @property
    def url_without_fragment(self):
        scheme, netloc, path, query, fragment = urllib_parse.urlsplit(self.url)
        return urllib_parse.urlunsplit((scheme, netloc, path, query, None))

    _egg_fragment_re = re.compile(r'[#&]egg=([^&]*)')

    @property
    def egg_fragment(self):
        match = self._egg_fragment_re.search(self.url)
        if not match:
            return None
        return match.group(1)

    _subdirectory_fragment_re = re.compile(r'[#&]subdirectory=([^&]*)')

    @property
    def subdirectory_fragment(self):
        match = self._subdirectory_fragment_re.search(self.url)
        if not match:
            return None
        return match.group(1)

    _hash_re = re.compile(
        r'(sha1|sha224|sha384|sha256|sha512|md5)=([a-f0-9]+)'
    )

    @property
    def hash(self):
        match = self._hash_re.search(self.url)
        if match:
            return match.group(2)
        return None

    @property
    def hash_name(self):
        match = self._hash_re.search(self.url)
        if match:
            return match.group(1)
        return None

    @property
    def show_url(self):
        return posixpath.basename(self.url.split('#', 1)[0].split('?', 1)[0])

    @property
    def is_wheel(self):
        return self.ext == wheel_ext

    @property
    def is_artifact(self):
        """
        Determines if this points to an actual artifact (e.g. a tarball) or if
        it points to an "abstract" thing like a path or a VCS location.
        """
        from pip.vcs import vcs

        if self.scheme in vcs.all_schemes:
            return False

        return True
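A short sketch of the fragment handling these properties implement; the URL is invented:

# One simple-index style link exercising the regexes defined above.
link = Link('https://example.org/foo-1.0.tar.gz'
            '#egg=foo&sha256=deadbeef&subdirectory=src')
print(link.egg_fragment)           # foo
print(link.hash_name, link.hash)   # sha256 deadbeef
print(link.subdirectory_fragment)  # src
print(link.url_without_fragment)   # https://example.org/foo-1.0.tar.gz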
FormatControl = namedtuple('FormatControl', 'no_binary only_binary')
"""This object has two fields, no_binary and only_binary.

If a field is falsy, it isn't set. If it is {':all:'}, it should match all
packages except those listed in the other field. Only one field can be set
to {':all:'} at a time. The rest of the time exact package name matches
are listed, with any given package only showing up in one field at a time.
"""


def fmt_ctl_handle_mutual_exclude(value, target, other):
    new = value.split(',')
    while ':all:' in new:
        other.clear()
        target.clear()
        target.add(':all:')
        del new[:new.index(':all:') + 1]
        if ':none:' not in new:
            # Without a none, we want to discard everything as :all: covers it
            return
    for name in new:
        if name == ':none:':
            target.clear()
            continue
        name = canonicalize_name(name)
        other.discard(name)
        target.add(name)


def fmt_ctl_formats(fmt_ctl, canonical_name):
    result = set(["binary", "source"])
    if canonical_name in fmt_ctl.only_binary:
        result.discard('source')
    elif canonical_name in fmt_ctl.no_binary:
        result.discard('binary')
    elif ':all:' in fmt_ctl.only_binary:
        result.discard('source')
    elif ':all:' in fmt_ctl.no_binary:
        result.discard('binary')
    return frozenset(result)
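A small worked example of how these two helpers interact, equivalent to passing --no-binary :all: followed by --only-binary foo on the command line (package names invented):

fmt_ctl = FormatControl(no_binary=set(), only_binary=set())
fmt_ctl_handle_mutual_exclude(':all:', fmt_ctl.no_binary, fmt_ctl.only_binary)
fmt_ctl_handle_mutual_exclude('foo', fmt_ctl.only_binary, fmt_ctl.no_binary)
print(fmt_ctl_formats(fmt_ctl, 'foo'))  # frozenset({'binary'})
print(fmt_ctl_formats(fmt_ctl, 'bar'))  # frozenset({'source'})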
def fmt_ctl_no_binary(fmt_ctl):
    fmt_ctl_handle_mutual_exclude(
        ':all:', fmt_ctl.no_binary, fmt_ctl.only_binary)


def fmt_ctl_no_use_wheel(fmt_ctl):
    fmt_ctl_no_binary(fmt_ctl)
    warnings.warn(
        '--no-use-wheel is deprecated and will be removed in the future. '
        ' Please use --no-binary :all: instead.', RemovedInPip10Warning,
        stacklevel=2)


Search = namedtuple('Search', 'supplied canonical formats')

@@ -3,15 +3,15 @@ from __future__ import absolute_import

import os
import os.path
import platform
import site
import sys
import sysconfig
from distutils import sysconfig as distutils_sysconfig
from distutils.command.install import SCHEME_KEYS  # type: ignore

from pip._internal.utils import appdirs
from pip._internal.utils.compat import WINDOWS, expanduser
from distutils import sysconfig
from distutils.command.install import install, SCHEME_KEYS  # noqa

from pip.compat import WINDOWS, expanduser
from pip.utils import appdirs


# Application Directories
USER_CACHE_DIR = appdirs.user_cache_dir("pip")

@@ -80,18 +80,8 @@ src_prefix = os.path.abspath(src_prefix)

# FIXME doesn't account for venv linked to global site-packages

site_packages = sysconfig.get_path("purelib")
# This is because of a bug in PyPy's sysconfig module, see
# https://bitbucket.org/pypy/pypy/issues/2506/sysconfig-returns-incorrect-paths
# for more information.
if platform.python_implementation().lower() == "pypy":
    site_packages = distutils_sysconfig.get_python_lib()
try:
    # Use getusersitepackages if this is present, as it ensures that the
    # value is initialised properly.
    user_site = site.getusersitepackages()
except AttributeError:
    user_site = site.USER_SITE
site_packages = sysconfig.get_python_lib()
user_site = site.USER_SITE
user_dir = expanduser('~')
if WINDOWS:
    bin_py = os.path.join(sys.prefix, 'Scripts')

@@ -119,6 +109,7 @@ else:
        legacy_storage_dir,
        config_basename,
    )

# Forcing to use /usr/local/bin for standard macOS framework installs
# Also log to ~/Library/Logs/ for use with the Console.app log viewer
if sys.platform[:6] == 'darwin' and sys.prefix[:16] == '/System/Library/':

@@ -129,9 +120,6 @@ site_config_files = [
    for path in appdirs.site_config_dirs('pip')
]

venv_config_file = os.path.join(sys.prefix, config_basename)
new_config_file = os.path.join(appdirs.user_config_dir("pip"), config_basename)


def distutils_scheme(dist_name, user=False, home=None, root=None,
                     isolated=False, prefix=None):

@@ -155,7 +143,7 @@ def distutils_scheme(dist_name, user=False, home=None, root=None,
    # NOTE: setting user or home has the side-effect of creating the home dir
    # or user base for installations during finalize_options()
    # ideally, we'd prefer a scheme class that has no side-effects.
    assert not (user and prefix), "user={} prefix={}".format(user, prefix)
    assert not (user and prefix), "user={0} prefix={1}".format(user, prefix)
    i.user = user or i.user
    if user:
        i.prefix = ""
4 lib/python3.4/site-packages/pip/models/__init__.py (new file)
@@ -0,0 +1,4 @@
from pip.models.index import Index, PyPI


__all__ = ["Index", "PyPI"]
16 lib/python3.4/site-packages/pip/models/index.py (new file)
@@ -0,0 +1,16 @@
from pip._vendor.six.moves.urllib import parse as urllib_parse


class Index(object):
    def __init__(self, url):
        self.url = url
        self.netloc = urllib_parse.urlsplit(url).netloc
        self.simple_url = self.url_to_path('simple')
        self.pypi_url = self.url_to_path('pypi')
        self.pip_json_url = self.url_to_path('pypi/pip/json')

    def url_to_path(self, path):
        return urllib_parse.urljoin(self.url, path)


PyPI = Index('https://pypi.python.org/')
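Concretely, the urljoin calls give the module-level PyPI instance the familiar endpoints:

print(PyPI.simple_url)    # https://pypi.python.org/simple
print(PyPI.pypi_url)      # https://pypi.python.org/pypi
print(PyPI.pip_json_url)  # https://pypi.python.org/pypi/pip/json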
49 lib/python3.4/site-packages/pip/operations/check.py (new file)
@@ -0,0 +1,49 @@


def check_requirements(installed_dists):
    missing_reqs_dict = {}
    incompatible_reqs_dict = {}

    for dist in installed_dists:
        key = '%s==%s' % (dist.project_name, dist.version)

        missing_reqs = list(get_missing_reqs(dist, installed_dists))
        if missing_reqs:
            missing_reqs_dict[key] = missing_reqs

        incompatible_reqs = list(get_incompatible_reqs(
            dist, installed_dists))
        if incompatible_reqs:
            incompatible_reqs_dict[key] = incompatible_reqs

    return (missing_reqs_dict, incompatible_reqs_dict)


def get_missing_reqs(dist, installed_dists):
    """Return all of the requirements of `dist` that aren't present in
    `installed_dists`.

    """
    installed_names = set(d.project_name.lower() for d in installed_dists)
    missing_requirements = set()

    for requirement in dist.requires():
        if requirement.project_name.lower() not in installed_names:
            missing_requirements.add(requirement)
            yield requirement


def get_incompatible_reqs(dist, installed_dists):
    """Return all of the requirements of `dist` that are present in
    `installed_dists`, but have incompatible versions.

    """
    installed_dists_by_name = {}
    for installed_dist in installed_dists:
        installed_dists_by_name[installed_dist.project_name] = installed_dist

    for requirement in dist.requires():
        present_dist = installed_dists_by_name.get(requirement.project_name)

        if present_dist and present_dist not in requirement:
            yield (requirement, present_dist)
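A sketch of driving check_requirements() over the current environment, roughly what `pip check` does with its output formatting stripped down:

from pip._vendor import pkg_resources

missing, incompatible = check_requirements(list(pkg_resources.working_set))
for dist_key, reqs in missing.items():
    print('%s is missing %s' % (dist_key, ', '.join(str(r) for r in reqs)))
for dist_key, pairs in incompatible.items():
    for req, present in pairs:
        print('%s requires %s but %s is installed' % (dist_key, req, present))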
132 lib/python3.4/site-packages/pip/operations/freeze.py (new file)
@@ -0,0 +1,132 @@
from __future__ import absolute_import

import logging
import re

import pip
from pip.req import InstallRequirement
from pip.req.req_file import COMMENT_RE
from pip.utils import get_installed_distributions
from pip._vendor import pkg_resources
from pip._vendor.packaging.utils import canonicalize_name
from pip._vendor.pkg_resources import RequirementParseError


logger = logging.getLogger(__name__)


def freeze(
        requirement=None,
        find_links=None, local_only=None, user_only=None, skip_regex=None,
        default_vcs=None,
        isolated=False,
        wheel_cache=None,
        skip=()):
    find_links = find_links or []
    skip_match = None

    if skip_regex:
        skip_match = re.compile(skip_regex).search

    dependency_links = []

    for dist in pkg_resources.working_set:
        if dist.has_metadata('dependency_links.txt'):
            dependency_links.extend(
                dist.get_metadata_lines('dependency_links.txt')
            )
    for link in find_links:
        if '#egg=' in link:
            dependency_links.append(link)
    for link in find_links:
        yield '-f %s' % link
    installations = {}
    for dist in get_installed_distributions(local_only=local_only,
                                            skip=(),
                                            user_only=user_only):
        try:
            req = pip.FrozenRequirement.from_dist(
                dist,
                dependency_links
            )
        except RequirementParseError:
            logger.warning(
                "Could not parse requirement: %s",
                dist.project_name
            )
            continue
        installations[req.name] = req

    if requirement:
        # the options that don't get turned into an InstallRequirement
        # should only be emitted once, even if the same option is in multiple
        # requirements files, so we need to keep track of what has been emitted
        # so that we don't emit it again if it's seen again
        emitted_options = set()
        for req_file_path in requirement:
            with open(req_file_path) as req_file:
                for line in req_file:
                    if (not line.strip() or
                            line.strip().startswith('#') or
                            (skip_match and skip_match(line)) or
                            line.startswith((
                                '-r', '--requirement',
                                '-Z', '--always-unzip',
                                '-f', '--find-links',
                                '-i', '--index-url',
                                '--pre',
                                '--trusted-host',
                                '--process-dependency-links',
                                '--extra-index-url'))):
                        line = line.rstrip()
                        if line not in emitted_options:
                            emitted_options.add(line)
                            yield line
                        continue

                    if line.startswith('-e') or line.startswith('--editable'):
                        if line.startswith('-e'):
                            line = line[2:].strip()
                        else:
                            line = line[len('--editable'):].strip().lstrip('=')
                        line_req = InstallRequirement.from_editable(
                            line,
                            default_vcs=default_vcs,
                            isolated=isolated,
                            wheel_cache=wheel_cache,
                        )
                    else:
                        line_req = InstallRequirement.from_line(
                            COMMENT_RE.sub('', line).strip(),
                            isolated=isolated,
                            wheel_cache=wheel_cache,
                        )

                    if not line_req.name:
                        logger.info(
                            "Skipping line in requirement file [%s] because "
                            "it's not clear what it would install: %s",
                            req_file_path, line.strip(),
                        )
                        logger.info(
                            " (add #egg=PackageName to the URL to avoid"
                            " this warning)"
                        )
                    elif line_req.name not in installations:
                        logger.warning(
                            "Requirement file [%s] contains %s, but that "
                            "package is not installed",
                            req_file_path, COMMENT_RE.sub('', line).strip(),
                        )
                    else:
                        yield str(installations[line_req.name]).rstrip()
                        del installations[line_req.name]

    yield (
        '## The following requirements were added by '
        'pip freeze:'
    )
    for installation in sorted(
            installations.values(), key=lambda x: x.name.lower()):
        if canonicalize_name(installation.name) not in skip:
            yield str(installation).rstrip()
|
|||
"""Generate and work with PEP 425 Compatibility Tags."""
|
||||
from __future__ import absolute_import
|
||||
|
||||
import distutils.util
|
||||
import logging
|
||||
import platform
|
||||
import re
|
||||
import sys
|
||||
import sysconfig
|
||||
import warnings
|
||||
from collections import OrderedDict
|
||||
import platform
|
||||
import logging
|
||||
|
||||
import pip._internal.utils.glibc
|
||||
from pip._internal.utils.compat import get_extension_suffixes
|
||||
try:
|
||||
import sysconfig
|
||||
except ImportError: # pragma nocover
|
||||
# Python < 2.7
|
||||
import distutils.sysconfig as sysconfig
|
||||
import distutils.util
|
||||
|
||||
from pip.compat import OrderedDict
|
||||
import pip.utils.glibc
|
||||
|
||||
logger = logging.getLogger(__name__)
|
||||
|
||||
|
|
@ -22,7 +26,7 @@ def get_config_var(var):
|
|||
try:
|
||||
return sysconfig.get_config_var(var)
|
||||
except IOError as e: # Issue #1074
|
||||
warnings.warn("{}".format(e), RuntimeWarning)
|
||||
warnings.warn("{0}".format(e), RuntimeWarning)
|
||||
return None
|
||||
|
||||
|
||||
|
|
@ -62,7 +66,7 @@ def get_impl_tag():
|
|||
"""
|
||||
Returns the Tag for this specific implementation.
|
||||
"""
|
||||
return "{}{}".format(get_abbr_impl(), get_impl_ver())
|
||||
return "{0}{1}".format(get_abbr_impl(), get_impl_ver())
|
||||
|
||||
|
||||
def get_flag(var, fallback, expected=True, warn=True):
|
||||
|
|
@ -82,7 +86,7 @@ def get_abi_tag():
|
|||
(CPython 2, PyPy)."""
|
||||
soabi = get_config_var('SOABI')
|
||||
impl = get_abbr_impl()
|
||||
if not soabi and impl in {'cp', 'pp'} and hasattr(sys, 'maxunicode'):
|
||||
if not soabi and impl in ('cp', 'pp') and hasattr(sys, 'maxunicode'):
|
||||
d = ''
|
||||
m = ''
|
||||
u = ''
|
||||
|
|
@ -129,7 +133,7 @@ def get_platform():
|
|||
elif machine == "ppc64" and _is_running_32bit():
|
||||
machine = "ppc"
|
||||
|
||||
return 'macosx_{}_{}_{}'.format(split_ver[0], split_ver[1], machine)
|
||||
return 'macosx_{0}_{1}_{2}'.format(split_ver[0], split_ver[1], machine)
|
||||
|
||||
# XXX remove distutils dependency
|
||||
result = distutils.util.get_platform().replace('.', '_').replace('-', '_')
|
||||
|
|
@ -143,7 +147,7 @@ def get_platform():
|
|||
|
||||
def is_manylinux1_compatible():
|
||||
# Only Linux, and only x86-64 / i686
|
||||
if get_platform() not in {"linux_x86_64", "linux_i686"}:
|
||||
if get_platform() not in ("linux_x86_64", "linux_i686"):
|
||||
return False
|
||||
|
||||
# Check for presence of _manylinux module
|
||||
|
|
@ -155,7 +159,7 @@ def is_manylinux1_compatible():
|
|||
pass
|
||||
|
||||
# Check glibc version. CentOS 5 uses glibc 2.5.
|
||||
return pip._internal.utils.glibc.have_compatible_glibc(2, 5)
|
||||
return pip.utils.glibc.have_compatible_glibc(2, 5)
|
||||
|
||||
|
||||
def get_darwin_arches(major, minor, machine):
|
||||
|
|
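The glibc probe referenced on both sides of this hunk lives in pip's glibc utility module. As a rough illustration of the general technique (a hedged sketch of the approach, not pip's exact code), the running glibc version can be read through ctypes:

# Sketch: query the running glibc version via ctypes, the same general
# idea behind have_compatible_glibc() on Linux.
import ctypes

def glibc_version_string():
    try:
        process_namespace = ctypes.CDLL(None)
        gnu_get_libc_version = process_namespace.gnu_get_libc_version
    except (OSError, AttributeError):
        return None  # not glibc (e.g. musl) or not Linux
    gnu_get_libc_version.restype = ctypes.c_char_p
    version = gnu_get_libc_version()
    if not isinstance(version, str):
        version = version.decode('ascii')
    return version

print(glibc_version_string())  # e.g. '2.5' on CentOS 5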
@@ -253,9 +257,10 @@ def get_supported(versions=None, noarch=False, platform=None,
        abis[0:0] = [abi]

    abi3s = set()
    for suffix in get_extension_suffixes():
        if suffix.startswith('.abi'):
            abi3s.add(suffix.split('.', 2)[1])
    import imp
    for suffix in imp.get_suffixes():
        if suffix[0].startswith('.abi'):
            abi3s.add(suffix[0].split('.', 2)[1])

    abis.extend(sorted(list(abi3s)))

@@ -268,7 +273,7 @@ def get_supported(versions=None, noarch=False, platform=None,
            match = _osx_arch_pat.match(arch)
            if match:
                name, major, minor, actual_arch = match.groups()
                tpl = '{}_{}_%i_%s'.format(name, major)
                tpl = '{0}_{1}_%i_%s'.format(name, major)
                arches = []
                for m in reversed(range(int(minor) + 1)):
                    for a in get_darwin_arches(int(major), m, actual_arch):

@@ -289,7 +294,7 @@ def get_supported(versions=None, noarch=False, platform=None,
    # abi3 modules compatible with older version of Python
    for version in versions[1:]:
        # abi3 was introduced in Python 3.2
        if version in {'31', '30'}:
        if version in ('31', '30'):
            break
        for abi in abi3s:  # empty set if not Python 3
            for arch in arches:

@@ -313,5 +318,7 @@ def get_supported(versions=None, noarch=False, platform=None,

    return supported

supported_tags = get_supported()
supported_tags_noarch = get_supported(noarch=True)

implementation_tag = get_impl_tag()
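For context on what these helpers produce, here is a minimal, self-contained sketch (illustrative names, not pip's internals) that derives an implementation tag the same way `get_impl_tag()` does, using only the standard library:

# Minimal sketch of how a PEP 425 implementation tag is assembled.
import sys
import platform

def abbr_impl():
    # CPython -> 'cp', PyPy -> 'pp', Jython -> 'jy', IronPython -> 'ip'
    impl = platform.python_implementation()
    return {'CPython': 'cp', 'PyPy': 'pp',
            'Jython': 'jy', 'IronPython': 'ip'}.get(impl, impl.lower())

def impl_ver():
    # Major and minor version digits, e.g. '27' or '36'
    return ''.join(map(str, sys.version_info[:2]))

print("{0}{1}".format(abbr_impl(), impl_ver()))  # e.g. 'cp36'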
10 lib/python3.4/site-packages/pip/req/__init__.py (Normal file)
@@ -0,0 +1,10 @@
from __future__ import absolute_import

from .req_install import InstallRequirement
from .req_set import RequirementSet, Requirements
from .req_file import parse_requirements

__all__ = [
    "RequirementSet", "Requirements", "InstallRequirement",
    "parse_requirements",
]
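As a usage illustration for the helpers re-exported above (hedged: this mirrors a common pip 9-era recipe; these are internal APIs that later pip releases moved or removed):

# Sketch: read a requirements file through pip 9's internal parser.
# parse_requirements yields InstallRequirement objects; the session
# argument is required. Internal API - not stable across pip versions.
from pip.download import PipSession
from pip.req import parse_requirements

for req in parse_requirements('requirements.txt', session=PipSession()):
    print(req.name, req.req)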
@@ -4,33 +4,28 @@ Requirements file parsing

from __future__ import absolute_import

import optparse
import os
import re
import shlex
import sys
import optparse
import warnings

from pip._vendor.six.moves import filterfalse
from pip._vendor.six.moves.urllib import parse as urllib_parse
from pip._vendor.six.moves import filterfalse

from pip._internal.cli import cmdoptions
from pip._internal.download import get_file_content
from pip._internal.exceptions import RequirementsFileParseError
from pip._internal.req.constructors import (
    install_req_from_editable, install_req_from_line,
)
import pip
from pip.download import get_file_content
from pip.req.req_install import InstallRequirement
from pip.exceptions import (RequirementsFileParseError)
from pip.utils.deprecation import RemovedInPip10Warning
from pip import cmdoptions

__all__ = ['parse_requirements']

SCHEME_RE = re.compile(r'^(http|https|file):', re.I)
COMMENT_RE = re.compile(r'(^|\s)+#.*$')

# Matches environment variable-style values in '${MY_VARIABLE_1}' with the
# variable name consisting of only uppercase letters, digits or the '_'
# (underscore). This follows the POSIX standard defined in IEEE Std 1003.1,
# 2013 Edition.
ENV_VAR_RE = re.compile(r'(?P<var>\$\{(?P<name>[A-Z0-9_]+)\})')

SUPPORTED_OPTIONS = [
    cmdoptions.constraints,
    cmdoptions.editable,

@@ -39,6 +34,13 @@ SUPPORTED_OPTIONS = [
    cmdoptions.index_url,
    cmdoptions.find_links,
    cmdoptions.extra_index_url,
    cmdoptions.allow_external,
    cmdoptions.allow_all_external,
    cmdoptions.no_allow_external,
    cmdoptions.allow_unsafe,
    cmdoptions.no_allow_unsafe,
    cmdoptions.use_wheel,
    cmdoptions.no_use_wheel,
    cmdoptions.always_unzip,
    cmdoptions.no_binary,
    cmdoptions.only_binary,

@@ -102,7 +104,6 @@ def preprocess(content, options):
    lines_enum = join_lines(lines_enum)
    lines_enum = ignore_comments(lines_enum)
    lines_enum = skip_regex(lines_enum, options)
    lines_enum = expand_env_variables(lines_enum)
    return lines_enum


@@ -126,7 +127,7 @@ def process_line(line, filename, line_number, finder=None, comes_from=None,
    :param constraint: If True, parsing a constraints file.
    :param options: OptionParser options that we may update
    """
    parser = build_parser(line)
    parser = build_parser()
    defaults = parser.get_default_values()
    defaults.index_url = None
    if finder:

@@ -140,8 +141,7 @@ def process_line(line, filename, line_number, finder=None, comes_from=None,

    # preserve for the nested code path
    line_comes_from = '%s %s (line %s)' % (
        '-c' if constraint else '-r', filename, line_number,
    )
        '-c' if constraint else '-r', filename, line_number)

    # yield a line requirement
    if args_str:

@@ -153,7 +153,7 @@ def process_line(line, filename, line_number, finder=None, comes_from=None,
        for dest in SUPPORTED_OPTIONS_REQ_DEST:
            if dest in opts.__dict__ and opts.__dict__[dest]:
                req_options[dest] = opts.__dict__[dest]
        yield install_req_from_line(
        yield InstallRequirement.from_line(
            args_str, line_comes_from, constraint=constraint,
            isolated=isolated, options=req_options, wheel_cache=wheel_cache
        )

@@ -161,9 +161,11 @@ def process_line(line, filename, line_number, finder=None, comes_from=None,
    # yield an editable requirement
    elif opts.editables:
        isolated = options.isolated_mode if options else False
        yield install_req_from_editable(
        default_vcs = options.default_vcs if options else None
        yield InstallRequirement.from_editable(
            opts.editables[0], comes_from=line_comes_from,
            constraint=constraint, isolated=isolated, wheel_cache=wheel_cache
            constraint=constraint, default_vcs=default_vcs, isolated=isolated,
            wheel_cache=wheel_cache
        )

    # parse a nested requirements file

@@ -196,8 +198,35 @@ def process_line(line, filename, line_number, finder=None, comes_from=None,

    # set finder options
    elif finder:
        if opts.allow_external:
            warnings.warn(
                "--allow-external has been deprecated and will be removed in "
                "the future. Due to changes in the repository protocol, it no "
                "longer has any effect.",
                RemovedInPip10Warning,
            )

        if opts.allow_all_external:
            warnings.warn(
                "--allow-all-external has been deprecated and will be removed "
                "in the future. Due to changes in the repository protocol, it "
                "no longer has any effect.",
                RemovedInPip10Warning,
            )

        if opts.allow_unverified:
            warnings.warn(
                "--allow-unverified has been deprecated and will be removed "
                "in the future. Due to changes in the repository protocol, it "
                "no longer has any effect.",
                RemovedInPip10Warning,
            )

        if opts.index_url:
            finder.index_urls = [opts.index_url]
        if opts.use_wheel is False:
            finder.use_wheel = False
            pip.index.fmt_ctl_no_use_wheel(finder.format_control)
        if opts.no_index is True:
            finder.index_urls = []
        if opts.extra_index_urls:

@@ -238,7 +267,7 @@ def break_args_options(line):
    return ' '.join(args), ' '.join(options)


def build_parser(line):
def build_parser():
    """
    Return a parser for parsing requirement lines
    """

@@ -252,8 +281,6 @@ def build_parser(line):
    # By default optparse sys.exits on parsing errors. We want to wrap
    # that in our own exception.
    def parser_exit(self, msg):
        # add offending line
        msg = 'Invalid requirement: %s\n%s' % (line, msg)
        raise RequirementsFileParseError(msg)
    parser.exit = parser_exit


@@ -309,32 +336,7 @@ def skip_regex(lines_enum, options):
    skip_regex = options.skip_requirements_regex if options else None
    if skip_regex:
        pattern = re.compile(skip_regex)
        lines_enum = filterfalse(lambda e: pattern.search(e[1]), lines_enum)
        lines_enum = filterfalse(
            lambda e: pattern.search(e[1]),
            lines_enum)
    return lines_enum


def expand_env_variables(lines_enum):
    """Replace all environment variables that can be retrieved via `os.getenv`.

    The only allowed format for environment variables defined in the
    requirement file is `${MY_VARIABLE_1}` to ensure two things:

    1. Strings that contain a `$` aren't accidentally (partially) expanded.
    2. Ensure consistency across platforms for requirement files.

    These points are the result of a discussion on the `github pull
    request #3514 <https://github.com/pypa/pip/pull/3514>`_.

    Valid characters in variable names follow the `POSIX standard
    <http://pubs.opengroup.org/onlinepubs/9699919799/>`_ and are limited
    to uppercase letter, digits and the `_` (underscore).
    """
    for line_number, line in lines_enum:
        for env_var, var_name in ENV_VAR_RE.findall(line):
            value = os.getenv(var_name)
            if not value:
                continue

            line = line.replace(env_var, value)

        yield line_number, line
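The `${VAR}` expansion above exists only on the newer (left-hand) side of this diff. An illustrative stand-alone sketch of the same mechanism (the regex is copied from ENV_VAR_RE above; the rest is demo code, not pip's API):

# Sketch of the ${VAR} expansion implemented by expand_env_variables().
import os
import re

ENV_VAR_RE = re.compile(r'(?P<var>\$\{(?P<name>[A-Z0-9_]+)\})')

def expand(line):
    for env_var, var_name in ENV_VAR_RE.findall(line):
        value = os.getenv(var_name)
        if value:
            line = line.replace(env_var, value)
    return line

os.environ['INDEX_HOST'] = 'pypi.example.org'
print(expand('--index-url https://${INDEX_HOST}/simple'))
# --index-url https://pypi.example.org/simple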
File diff suppressed because it is too large
798 lib/python3.4/site-packages/pip/req/req_set.py (Normal file)
@@ -0,0 +1,798 @@
from __future__ import absolute_import

from collections import defaultdict
from itertools import chain
import logging
import os

from pip._vendor import pkg_resources
from pip._vendor import requests

from pip.compat import expanduser
from pip.download import (is_file_url, is_dir_url, is_vcs_url, url_to_path,
                          unpack_url)
from pip.exceptions import (InstallationError, BestVersionAlreadyInstalled,
                            DistributionNotFound, PreviousBuildDirError,
                            HashError, HashErrors, HashUnpinned,
                            DirectoryUrlHashUnsupported, VcsHashUnsupported,
                            UnsupportedPythonVersion)
from pip.req.req_install import InstallRequirement
from pip.utils import (
    display_path, dist_in_usersite, ensure_dir, normalize_path)
from pip.utils.hashes import MissingHashes
from pip.utils.logging import indent_log
from pip.utils.packaging import check_dist_requires_python
from pip.vcs import vcs
from pip.wheel import Wheel

logger = logging.getLogger(__name__)


class Requirements(object):

    def __init__(self):
        self._keys = []
        self._dict = {}

    def keys(self):
        return self._keys

    def values(self):
        return [self._dict[key] for key in self._keys]

    def __contains__(self, item):
        return item in self._keys

    def __setitem__(self, key, value):
        if key not in self._keys:
            self._keys.append(key)
        self._dict[key] = value

    def __getitem__(self, key):
        return self._dict[key]

    def __repr__(self):
        values = ['%s: %s' % (repr(k), repr(self[k])) for k in self.keys()]
        return 'Requirements({%s})' % ', '.join(values)


class DistAbstraction(object):
    """Abstracts out the wheel vs non-wheel prepare_files logic.

    The requirements for anything installable are as follows:
     - we must be able to determine the requirement name
       (or we can't correctly handle the non-upgrade case).
     - we must be able to generate a list of run-time dependencies
       without installing any additional packages (or we would
       have to either burn time by doing temporary isolated installs
       or alternatively violate pips 'don't start installing unless
       all requirements are available' rule - neither of which are
       desirable).
     - for packages with setup requirements, we must also be able
       to determine their requirements without installing additional
       packages (for the same reason as run-time dependencies)
     - we must be able to create a Distribution object exposing the
       above metadata.
    """

    def __init__(self, req_to_install):
        self.req_to_install = req_to_install

    def dist(self, finder):
        """Return a setuptools Dist object."""
        raise NotImplementedError(self.dist)

    def prep_for_dist(self):
        """Ensure that we can get a Dist for this requirement."""
        raise NotImplementedError(self.dist)


def make_abstract_dist(req_to_install):
    """Factory to make an abstract dist object.

    Preconditions: Either an editable req with a source_dir, or satisfied_by or
    a wheel link, or a non-editable req with a source_dir.

    :return: A concrete DistAbstraction.
    """
    if req_to_install.editable:
        return IsSDist(req_to_install)
    elif req_to_install.link and req_to_install.link.is_wheel:
        return IsWheel(req_to_install)
    else:
        return IsSDist(req_to_install)


class IsWheel(DistAbstraction):

    def dist(self, finder):
        return list(pkg_resources.find_distributions(
            self.req_to_install.source_dir))[0]

    def prep_for_dist(self):
        # FIXME:https://github.com/pypa/pip/issues/1112
        pass


class IsSDist(DistAbstraction):

    def dist(self, finder):
        dist = self.req_to_install.get_dist()
        # FIXME: shouldn't be globally added:
        if dist.has_metadata('dependency_links.txt'):
            finder.add_dependency_links(
                dist.get_metadata_lines('dependency_links.txt')
            )
        return dist

    def prep_for_dist(self):
        self.req_to_install.run_egg_info()
        self.req_to_install.assert_source_matches_version()


class Installed(DistAbstraction):

    def dist(self, finder):
        return self.req_to_install.satisfied_by

    def prep_for_dist(self):
        pass


class RequirementSet(object):

    def __init__(self, build_dir, src_dir, download_dir, upgrade=False,
                 upgrade_strategy=None, ignore_installed=False, as_egg=False,
                 target_dir=None, ignore_dependencies=False,
                 force_reinstall=False, use_user_site=False, session=None,
                 pycompile=True, isolated=False, wheel_download_dir=None,
                 wheel_cache=None, require_hashes=False,
                 ignore_requires_python=False):
        """Create a RequirementSet.

        :param wheel_download_dir: Where still-packed .whl files should be
            written to. If None they are written to the download_dir parameter.
            Separate to download_dir to permit only keeping wheel archives for
            pip wheel.
        :param download_dir: Where still packed archives should be written to.
            If None they are not saved, and are deleted immediately after
            unpacking.
        :param wheel_cache: The pip wheel cache, for passing to
            InstallRequirement.
        """
        if session is None:
            raise TypeError(
                "RequirementSet() missing 1 required keyword argument: "
                "'session'"
            )

        self.build_dir = build_dir
        self.src_dir = src_dir
        # XXX: download_dir and wheel_download_dir overlap semantically and may
        # be combined if we're willing to have non-wheel archives present in
        # the wheelhouse output by 'pip wheel'.
        self.download_dir = download_dir
        self.upgrade = upgrade
        self.upgrade_strategy = upgrade_strategy
        self.ignore_installed = ignore_installed
        self.force_reinstall = force_reinstall
        self.requirements = Requirements()
        # Mapping of alias: real_name
        self.requirement_aliases = {}
        self.unnamed_requirements = []
        self.ignore_dependencies = ignore_dependencies
        self.ignore_requires_python = ignore_requires_python
        self.successfully_downloaded = []
        self.successfully_installed = []
        self.reqs_to_cleanup = []
        self.as_egg = as_egg
        self.use_user_site = use_user_site
        self.target_dir = target_dir  # set from --target option
        self.session = session
        self.pycompile = pycompile
        self.isolated = isolated
        if wheel_download_dir:
            wheel_download_dir = normalize_path(wheel_download_dir)
        self.wheel_download_dir = wheel_download_dir
        self._wheel_cache = wheel_cache
        self.require_hashes = require_hashes
        # Maps from install_req -> dependencies_of_install_req
        self._dependencies = defaultdict(list)

    def __str__(self):
        reqs = [req for req in self.requirements.values()
                if not req.comes_from]
        reqs.sort(key=lambda req: req.name.lower())
        return ' '.join([str(req.req) for req in reqs])

    def __repr__(self):
        reqs = [req for req in self.requirements.values()]
        reqs.sort(key=lambda req: req.name.lower())
        reqs_str = ', '.join([str(req.req) for req in reqs])
        return ('<%s object; %d requirement(s): %s>'
                % (self.__class__.__name__, len(reqs), reqs_str))

    def add_requirement(self, install_req, parent_req_name=None,
                        extras_requested=None):
        """Add install_req as a requirement to install.

        :param parent_req_name: The name of the requirement that needed this
            added. The name is used because when multiple unnamed requirements
            resolve to the same name, we could otherwise end up with dependency
            links that point outside the Requirements set. parent_req must
            already be added. Note that None implies that this is a user
            supplied requirement, vs an inferred one.
        :param extras_requested: an iterable of extras used to evaluate the
            environment markers.
        :return: Additional requirements to scan. That is either [] if
            the requirement is not applicable, or [install_req] if the
            requirement is applicable and has just been added.
        """
        name = install_req.name
        if not install_req.match_markers(extras_requested):
            logger.warning("Ignoring %s: markers '%s' don't match your "
                           "environment", install_req.name,
                           install_req.markers)
            return []

        # This check has to come after we filter requirements with the
        # environment markers.
        if install_req.link and install_req.link.is_wheel:
            wheel = Wheel(install_req.link.filename)
            if not wheel.supported():
                raise InstallationError(
                    "%s is not a supported wheel on this platform." %
                    wheel.filename
                )

        install_req.as_egg = self.as_egg
        install_req.use_user_site = self.use_user_site
        install_req.target_dir = self.target_dir
        install_req.pycompile = self.pycompile
        install_req.is_direct = (parent_req_name is None)

        if not name:
            # url or path requirement w/o an egg fragment
            self.unnamed_requirements.append(install_req)
            return [install_req]
        else:
            try:
                existing_req = self.get_requirement(name)
            except KeyError:
                existing_req = None
            if (parent_req_name is None and existing_req and not
                    existing_req.constraint and
                    existing_req.extras == install_req.extras and not
                    existing_req.req.specifier == install_req.req.specifier):
                raise InstallationError(
                    'Double requirement given: %s (already in %s, name=%r)'
                    % (install_req, existing_req, name))
            if not existing_req:
                # Add requirement
                self.requirements[name] = install_req
                # FIXME: what about other normalizations? E.g., _ vs. -?
                if name.lower() != name:
                    self.requirement_aliases[name.lower()] = name
                result = [install_req]
            else:
                # Assume there's no need to scan, and that we've already
                # encountered this for scanning.
                result = []
                if not install_req.constraint and existing_req.constraint:
                    if (install_req.link and not (existing_req.link and
                            install_req.link.path == existing_req.link.path)):
                        self.reqs_to_cleanup.append(install_req)
                        raise InstallationError(
                            "Could not satisfy constraints for '%s': "
                            "installation from path or url cannot be "
                            "constrained to a version" % name)
                    # If we're now installing a constraint, mark the existing
                    # object for real installation.
                    existing_req.constraint = False
                    existing_req.extras = tuple(
                        sorted(set(existing_req.extras).union(
                            set(install_req.extras))))
                    logger.debug("Setting %s extras to: %s",
                                 existing_req, existing_req.extras)
                    # And now we need to scan this.
                    result = [existing_req]
                # Canonicalise to the already-added object for the backref
                # check below.
                install_req = existing_req
            if parent_req_name:
                parent_req = self.get_requirement(parent_req_name)
                self._dependencies[parent_req].append(install_req)
            return result

    def has_requirement(self, project_name):
        name = project_name.lower()
        if (name in self.requirements and
                not self.requirements[name].constraint or
                name in self.requirement_aliases and
                not self.requirements[self.requirement_aliases[name]].constraint):
            return True
        return False

    @property
    def has_requirements(self):
        return list(req for req in self.requirements.values() if not
                    req.constraint) or self.unnamed_requirements

    @property
    def is_download(self):
        if self.download_dir:
            self.download_dir = expanduser(self.download_dir)
            if os.path.exists(self.download_dir):
                return True
            else:
                logger.critical('Could not find download directory')
                raise InstallationError(
                    "Could not find or access download directory '%s'"
                    % display_path(self.download_dir))
        return False

    def get_requirement(self, project_name):
        for name in project_name, project_name.lower():
            if name in self.requirements:
                return self.requirements[name]
            if name in self.requirement_aliases:
                return self.requirements[self.requirement_aliases[name]]
        raise KeyError("No project with the name %r" % project_name)

    def uninstall(self, auto_confirm=False):
        for req in self.requirements.values():
            if req.constraint:
                continue
            req.uninstall(auto_confirm=auto_confirm)
            req.commit_uninstall()

    def prepare_files(self, finder):
        """
        Prepare process. Create temp directories, download and/or unpack files.
        """
        # make the wheelhouse
        if self.wheel_download_dir:
            ensure_dir(self.wheel_download_dir)

        # If any top-level requirement has a hash specified, enter
        # hash-checking mode, which requires hashes from all.
        root_reqs = self.unnamed_requirements + self.requirements.values()
        require_hashes = (self.require_hashes or
                          any(req.has_hash_options for req in root_reqs))
        if require_hashes and self.as_egg:
            raise InstallationError(
                '--egg is not allowed with --require-hashes mode, since it '
                'delegates dependency resolution to setuptools and could thus '
                'result in installation of unhashed packages.')

        # Actually prepare the files, and collect any exceptions. Most hash
        # exceptions cannot be checked ahead of time, because
        # req.populate_link() needs to be called before we can make decisions
        # based on link type.
        discovered_reqs = []
        hash_errors = HashErrors()
        for req in chain(root_reqs, discovered_reqs):
            try:
                discovered_reqs.extend(self._prepare_file(
                    finder,
                    req,
                    require_hashes=require_hashes,
                    ignore_dependencies=self.ignore_dependencies))
            except HashError as exc:
                exc.req = req
                hash_errors.append(exc)

        if hash_errors:
            raise hash_errors

    def _is_upgrade_allowed(self, req):
        return self.upgrade and (
            self.upgrade_strategy == "eager" or (
                self.upgrade_strategy == "only-if-needed" and req.is_direct
            )
        )

    def _check_skip_installed(self, req_to_install, finder):
        """Check if req_to_install should be skipped.

        This will check if the req is installed, and whether we should upgrade
        or reinstall it, taking into account all the relevant user options.

        After calling this req_to_install will only have satisfied_by set to
        None if the req_to_install is to be upgraded/reinstalled etc. Any
        other value will be a dist recording the current thing installed that
        satisfies the requirement.

        Note that for vcs urls and the like we can't assess skipping in this
        routine - we simply identify that we need to pull the thing down,
        then later on it is pulled down and introspected to assess upgrade/
        reinstalls etc.

        :return: A text reason for why it was skipped, or None.
        """
        # Check whether to upgrade/reinstall this req or not.
        req_to_install.check_if_exists()
        if req_to_install.satisfied_by:
            upgrade_allowed = self._is_upgrade_allowed(req_to_install)

            # Is the best version already installed?
            best_installed = False

            if upgrade_allowed:
                # For link based requirements we have to pull the
                # tree down and inspect to assess the version #, so
                # it's handled way down.
                if not (self.force_reinstall or req_to_install.link):
                    try:
                        finder.find_requirement(
                            req_to_install, upgrade_allowed)
                    except BestVersionAlreadyInstalled:
                        best_installed = True
                    except DistributionNotFound:
                        # No distribution found, so we squash the
                        # error - it will be raised later when we
                        # re-try later to do the install.
                        # Why don't we just raise here?
                        pass

                if not best_installed:
                    # don't uninstall conflict if user install and
                    # conflict is not user install
                    if not (self.use_user_site and not
                            dist_in_usersite(req_to_install.satisfied_by)):
                        req_to_install.conflicts_with = \
                            req_to_install.satisfied_by
                    req_to_install.satisfied_by = None

            # Figure out a nice message to say why we're skipping this.
            if best_installed:
                skip_reason = 'already up-to-date'
            elif self.upgrade_strategy == "only-if-needed":
                skip_reason = 'not upgraded as not directly required'
            else:
                skip_reason = 'already satisfied'

            return skip_reason
        else:
            return None

    def _prepare_file(self,
                      finder,
                      req_to_install,
                      require_hashes=False,
                      ignore_dependencies=False):
        """Prepare a single requirements file.

        :return: A list of additional InstallRequirements to also install.
        """
        # Tell user what we are doing for this requirement:
        # obtain (editable), skipping, processing (local url), collecting
        # (remote url or package name)
        if req_to_install.constraint or req_to_install.prepared:
            return []

        req_to_install.prepared = True

        # ###################### #
        # # print log messages # #
        # ###################### #
        if req_to_install.editable:
            logger.info('Obtaining %s', req_to_install)
        else:
            # satisfied_by is only evaluated by calling _check_skip_installed,
            # so it must be None here.
            assert req_to_install.satisfied_by is None
            if not self.ignore_installed:
                skip_reason = self._check_skip_installed(
                    req_to_install, finder)

            if req_to_install.satisfied_by:
                assert skip_reason is not None, (
                    '_check_skip_installed returned None but '
                    'req_to_install.satisfied_by is set to %r'
                    % (req_to_install.satisfied_by,))
                logger.info(
                    'Requirement %s: %s', skip_reason,
                    req_to_install)
            else:
                if (req_to_install.link and
                        req_to_install.link.scheme == 'file'):
                    path = url_to_path(req_to_install.link.url)
                    logger.info('Processing %s', display_path(path))
                else:
                    logger.info('Collecting %s', req_to_install)

        with indent_log():
            # ################################ #
            # # vcs update or unpack archive # #
            # ################################ #
            if req_to_install.editable:
                if require_hashes:
                    raise InstallationError(
                        'The editable requirement %s cannot be installed when '
                        'requiring hashes, because there is no single file to '
                        'hash.' % req_to_install)
                req_to_install.ensure_has_source_dir(self.src_dir)
                req_to_install.update_editable(not self.is_download)
                abstract_dist = make_abstract_dist(req_to_install)
                abstract_dist.prep_for_dist()
                if self.is_download:
                    req_to_install.archive(self.download_dir)
                req_to_install.check_if_exists()
            elif req_to_install.satisfied_by:
                if require_hashes:
                    logger.debug(
                        'Since it is already installed, we are trusting this '
                        'package without checking its hash. To ensure a '
                        'completely repeatable environment, install into an '
                        'empty virtualenv.')
                abstract_dist = Installed(req_to_install)
            else:
                # @@ if filesystem packages are not marked
                # editable in a req, a non deterministic error
                # occurs when the script attempts to unpack the
                # build directory
                req_to_install.ensure_has_source_dir(self.build_dir)
                # If a checkout exists, it's unwise to keep going. version
                # inconsistencies are logged later, but do not fail the
                # installation.
                # FIXME: this won't upgrade when there's an existing
                # package unpacked in `req_to_install.source_dir`
                if os.path.exists(
                        os.path.join(req_to_install.source_dir, 'setup.py')):
                    raise PreviousBuildDirError(
                        "pip can't proceed with requirements '%s' due to a"
                        " pre-existing build directory (%s). This is "
                        "likely due to a previous installation that failed"
                        ". pip is being responsible and not assuming it "
                        "can delete this. Please delete it and try again."
                        % (req_to_install, req_to_install.source_dir)
                    )
                req_to_install.populate_link(
                    finder,
                    self._is_upgrade_allowed(req_to_install),
                    require_hashes
                )
                # We can't hit this spot and have populate_link return None.
                # req_to_install.satisfied_by is None here (because we're
                # guarded) and upgrade has no impact except when satisfied_by
                # is not None.
                # Then inside find_requirement existing_applicable -> False
                # If no new versions are found, DistributionNotFound is raised,
                # otherwise a result is guaranteed.
                assert req_to_install.link
                link = req_to_install.link

                # Now that we have the real link, we can tell what kind of
                # requirements we have and raise some more informative errors
                # than otherwise. (For example, we can raise VcsHashUnsupported
                # for a VCS URL rather than HashMissing.)
                if require_hashes:
                    # We could check these first 2 conditions inside
                    # unpack_url and save repetition of conditions, but then
                    # we would report less-useful error messages for
                    # unhashable requirements, complaining that there's no
                    # hash provided.
                    if is_vcs_url(link):
                        raise VcsHashUnsupported()
                    elif is_file_url(link) and is_dir_url(link):
                        raise DirectoryUrlHashUnsupported()
                    if (not req_to_install.original_link and
                            not req_to_install.is_pinned):
                        # Unpinned packages are asking for trouble when a new
                        # version is uploaded. This isn't a security check, but
                        # it saves users a surprising hash mismatch in the
                        # future.
                        #
                        # file:/// URLs aren't pinnable, so don't complain
                        # about them not being pinned.
                        raise HashUnpinned()
                hashes = req_to_install.hashes(
                    trust_internet=not require_hashes)
                if require_hashes and not hashes:
                    # Known-good hashes are missing for this requirement, so
                    # shim it with a facade object that will provoke hash
                    # computation and then raise a HashMissing exception
                    # showing the user what the hash should be.
                    hashes = MissingHashes()

                try:
                    download_dir = self.download_dir
                    # We always delete unpacked sdists after pip ran.
                    autodelete_unpacked = True
                    if req_to_install.link.is_wheel \
                            and self.wheel_download_dir:
                        # when doing 'pip wheel` we download wheels to a
                        # dedicated dir.
                        download_dir = self.wheel_download_dir
                    if req_to_install.link.is_wheel:
                        if download_dir:
                            # When downloading, we only unpack wheels to get
                            # metadata.
                            autodelete_unpacked = True
                        else:
                            # When installing a wheel, we use the unpacked
                            # wheel.
                            autodelete_unpacked = False
                    unpack_url(
                        req_to_install.link, req_to_install.source_dir,
                        download_dir, autodelete_unpacked,
                        session=self.session, hashes=hashes)
                except requests.HTTPError as exc:
                    logger.critical(
                        'Could not install requirement %s because '
                        'of error %s',
                        req_to_install,
                        exc,
                    )
                    raise InstallationError(
                        'Could not install requirement %s because '
                        'of HTTP error %s for URL %s' %
                        (req_to_install, exc, req_to_install.link)
                    )
                abstract_dist = make_abstract_dist(req_to_install)
                abstract_dist.prep_for_dist()
                if self.is_download:
                    # Make a .zip of the source_dir we already created.
                    if req_to_install.link.scheme in vcs.all_schemes:
                        req_to_install.archive(self.download_dir)
                # req_to_install.req is only avail after unpack for URL
                # pkgs repeat check_if_exists to uninstall-on-upgrade
                # (#14)
                if not self.ignore_installed:
                    req_to_install.check_if_exists()
                if req_to_install.satisfied_by:
                    if self.upgrade or self.ignore_installed:
                        # don't uninstall conflict if user install and
                        # conflict is not user install
                        if not (self.use_user_site and not
                                dist_in_usersite(
                                    req_to_install.satisfied_by)):
                            req_to_install.conflicts_with = \
                                req_to_install.satisfied_by
                        req_to_install.satisfied_by = None
                    else:
                        logger.info(
                            'Requirement already satisfied (use '
                            '--upgrade to upgrade): %s',
                            req_to_install,
                        )

            # ###################### #
            # # parse dependencies # #
            # ###################### #
            dist = abstract_dist.dist(finder)
            try:
                check_dist_requires_python(dist)
            except UnsupportedPythonVersion as e:
                if self.ignore_requires_python:
                    logger.warning(e.args[0])
                else:
                    req_to_install.remove_temporary_source()
                    raise
            more_reqs = []

            def add_req(subreq, extras_requested):
                sub_install_req = InstallRequirement(
                    str(subreq),
                    req_to_install,
                    isolated=self.isolated,
                    wheel_cache=self._wheel_cache,
                )
                more_reqs.extend(self.add_requirement(
                    sub_install_req, req_to_install.name,
                    extras_requested=extras_requested))

            # We add req_to_install before its dependencies, so that we
            # can refer to it when adding dependencies.
            if not self.has_requirement(req_to_install.name):
                # 'unnamed' requirements will get added here
                self.add_requirement(req_to_install, None)

            if not ignore_dependencies:
                if (req_to_install.extras):
                    logger.debug(
                        "Installing extra requirements: %r",
                        ','.join(req_to_install.extras),
                    )
                missing_requested = sorted(
                    set(req_to_install.extras) - set(dist.extras)
                )
                for missing in missing_requested:
                    logger.warning(
                        '%s does not provide the extra \'%s\'',
                        dist, missing
                    )

                available_requested = sorted(
                    set(dist.extras) & set(req_to_install.extras)
                )
                for subreq in dist.requires(available_requested):
                    add_req(subreq, extras_requested=available_requested)

            # cleanup tmp src
            self.reqs_to_cleanup.append(req_to_install)

            if not req_to_install.editable and not req_to_install.satisfied_by:
                # XXX: --no-install leads this to report 'Successfully
                # downloaded' for only non-editable reqs, even though we took
                # action on them.
                self.successfully_downloaded.append(req_to_install)

        return more_reqs

    def cleanup_files(self):
        """Clean up files, remove builds."""
        logger.debug('Cleaning up...')
        with indent_log():
            for req in self.reqs_to_cleanup:
                req.remove_temporary_source()

    def _to_install(self):
        """Create the installation order.

        The installation order is topological - requirements are installed
        before the requiring thing. We break cycles at an arbitrary point,
        and make no other guarantees.
        """
        # The current implementation, which we may change at any point
        # installs the user specified things in the order given, except when
        # dependencies must come earlier to achieve topological order.
        order = []
        ordered_reqs = set()

        def schedule(req):
            if req.satisfied_by or req in ordered_reqs:
                return
            if req.constraint:
                return
            ordered_reqs.add(req)
            for dep in self._dependencies[req]:
                schedule(dep)
            order.append(req)
        for install_req in self.requirements.values():
            schedule(install_req)
        return order

    def install(self, install_options, global_options=(), *args, **kwargs):
        """
        Install everything in this set (after having downloaded and unpacked
        the packages)
        """
        to_install = self._to_install()

        if to_install:
            logger.info(
                'Installing collected packages: %s',
                ', '.join([req.name for req in to_install]),
            )

        with indent_log():
            for requirement in to_install:
                if requirement.conflicts_with:
                    logger.info(
                        'Found existing installation: %s',
                        requirement.conflicts_with,
                    )
                    with indent_log():
                        requirement.uninstall(auto_confirm=True)
                try:
                    requirement.install(
                        install_options,
                        global_options,
                        *args,
                        **kwargs
                    )
                except:
                    # if install did not succeed, rollback previous uninstall
                    if (requirement.conflicts_with and not
                            requirement.install_succeeded):
                        requirement.rollback_uninstall()
                    raise
                else:
                    if (requirement.conflicts_with and
                            requirement.install_succeeded):
                        requirement.commit_uninstall()
                requirement.remove_temporary_source()

        self.successfully_installed = to_install
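The `_to_install()` scheduling above is a depth-first, post-order topological sort over the recorded dependency edges. A stand-alone sketch of the same idea (illustrative names and data, not pip's API):

# Sketch: install-order scheduling via post-order DFS, as in
# RequirementSet._to_install(). Dependencies land before dependents;
# cycles are broken wherever the traversal happens to revisit a node.
from collections import defaultdict

dependencies = defaultdict(list)
dependencies['flask'] = ['werkzeug', 'jinja2']
dependencies['jinja2'] = ['markupsafe']

order, seen = [], set()

def schedule(name):
    if name in seen:
        return
    seen.add(name)
    for dep in dependencies[name]:
        schedule(dep)
    order.append(name)

schedule('flask')
print(order)  # ['werkzeug', 'markupsafe', 'jinja2', 'flask']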
195 lib/python3.4/site-packages/pip/req/req_uninstall.py (Normal file)
@@ -0,0 +1,195 @@
from __future__ import absolute_import

import logging
import os
import tempfile

from pip.compat import uses_pycache, WINDOWS, cache_from_source
from pip.exceptions import UninstallationError
from pip.utils import rmtree, ask, is_local, renames, normalize_path
from pip.utils.logging import indent_log


logger = logging.getLogger(__name__)


class UninstallPathSet(object):
    """A set of file paths to be removed in the uninstallation of a
    requirement."""
    def __init__(self, dist):
        self.paths = set()
        self._refuse = set()
        self.pth = {}
        self.dist = dist
        self.save_dir = None
        self._moved_paths = []

    def _permitted(self, path):
        """
        Return True if the given path is one we are permitted to
        remove/modify, False otherwise.

        """
        return is_local(path)

    def add(self, path):
        head, tail = os.path.split(path)

        # we normalize the head to resolve parent directory symlinks, but not
        # the tail, since we only want to uninstall symlinks, not their targets
        path = os.path.join(normalize_path(head), os.path.normcase(tail))

        if not os.path.exists(path):
            return
        if self._permitted(path):
            self.paths.add(path)
        else:
            self._refuse.add(path)

        # __pycache__ files can show up after 'installed-files.txt' is created,
        # due to imports
        if os.path.splitext(path)[1] == '.py' and uses_pycache:
            self.add(cache_from_source(path))

    def add_pth(self, pth_file, entry):
        pth_file = normalize_path(pth_file)
        if self._permitted(pth_file):
            if pth_file not in self.pth:
                self.pth[pth_file] = UninstallPthEntries(pth_file)
            self.pth[pth_file].add(entry)
        else:
            self._refuse.add(pth_file)

    def compact(self, paths):
        """Compact a path set to contain the minimal number of paths
        necessary to contain all paths in the set. If /a/path/ and
        /a/path/to/a/file.txt are both in the set, leave only the
        shorter path."""
        short_paths = set()
        for path in sorted(paths, key=len):
            if not any([
                    (path.startswith(shortpath) and
                     path[len(shortpath.rstrip(os.path.sep))] == os.path.sep)
                    for shortpath in short_paths]):
                short_paths.add(path)
        return short_paths

    def _stash(self, path):
        return os.path.join(
            self.save_dir, os.path.splitdrive(path)[1].lstrip(os.path.sep))

    def remove(self, auto_confirm=False):
        """Remove paths in ``self.paths`` with confirmation (unless
        ``auto_confirm`` is True)."""
        if not self.paths:
            logger.info(
                "Can't uninstall '%s'. No files were found to uninstall.",
                self.dist.project_name,
            )
            return
        logger.info(
            'Uninstalling %s-%s:',
            self.dist.project_name, self.dist.version
        )

        with indent_log():
            paths = sorted(self.compact(self.paths))

            if auto_confirm:
                response = 'y'
            else:
                for path in paths:
                    logger.info(path)
                response = ask('Proceed (y/n)? ', ('y', 'n'))
            if self._refuse:
                logger.info('Not removing or modifying (outside of prefix):')
                for path in self.compact(self._refuse):
                    logger.info(path)
            if response == 'y':
                self.save_dir = tempfile.mkdtemp(suffix='-uninstall',
                                                 prefix='pip-')
                for path in paths:
                    new_path = self._stash(path)
                    logger.debug('Removing file or directory %s', path)
                    self._moved_paths.append(path)
                    renames(path, new_path)
                for pth in self.pth.values():
                    pth.remove()
                logger.info(
                    'Successfully uninstalled %s-%s',
                    self.dist.project_name, self.dist.version
                )

    def rollback(self):
        """Rollback the changes previously made by remove()."""
        if self.save_dir is None:
            logger.error(
                "Can't roll back %s; was not uninstalled",
                self.dist.project_name,
            )
            return False
        logger.info('Rolling back uninstall of %s', self.dist.project_name)
        for path in self._moved_paths:
            tmp_path = self._stash(path)
            logger.debug('Replacing %s', path)
            renames(tmp_path, path)
        for pth in self.pth.values():
            pth.rollback()

    def commit(self):
        """Remove temporary save dir: rollback will no longer be possible."""
        if self.save_dir is not None:
            rmtree(self.save_dir)
            self.save_dir = None
            self._moved_paths = []


class UninstallPthEntries(object):
    def __init__(self, pth_file):
        if not os.path.isfile(pth_file):
            raise UninstallationError(
                "Cannot remove entries from nonexistent file %s" % pth_file
            )
        self.file = pth_file
        self.entries = set()
        self._saved_lines = None

    def add(self, entry):
        entry = os.path.normcase(entry)
        # On Windows, os.path.normcase converts the entry to use
        # backslashes. This is correct for entries that describe absolute
        # paths outside of site-packages, but all the others use forward
        # slashes.
        if WINDOWS and not os.path.splitdrive(entry)[0]:
            entry = entry.replace('\\', '/')
        self.entries.add(entry)

    def remove(self):
        logger.debug('Removing pth entries from %s:', self.file)
        with open(self.file, 'rb') as fh:
            # windows uses '\r\n' with py3k, but uses '\n' with py2.x
            lines = fh.readlines()
            self._saved_lines = lines
        if any(b'\r\n' in line for line in lines):
            endline = '\r\n'
        else:
            endline = '\n'
        for entry in self.entries:
            try:
                logger.debug('Removing entry: %s', entry)
                lines.remove((entry + endline).encode("utf-8"))
            except ValueError:
                pass
        with open(self.file, 'wb') as fh:
            fh.writelines(lines)

    def rollback(self):
        if self._saved_lines is None:
            logger.error(
                'Cannot roll back changes to %s, none were made', self.file
            )
            return False
        logger.debug('Rolling %s back to previous state', self.file)
        with open(self.file, 'wb') as fh:
            fh.writelines(self._saved_lines)
        return True
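`UninstallPathSet.compact()` above is a small but subtle set reduction; here is a stand-alone sketch demonstrating the behaviour on sample paths (illustrative demo code, not pip's API):

# Sketch: keep only the shortest covering paths, as compact() does.
# Processing shortest-first guarantees a parent directory is seen
# before anything inside it.
import os

def compact(paths):
    short_paths = set()
    for path in sorted(paths, key=len):
        covered = any(
            path.startswith(sp) and
            path[len(sp.rstrip(os.path.sep))] == os.path.sep
            for sp in short_paths
        )
        if not covered:
            short_paths.add(path)
    return short_paths

print(compact({'/a/path', '/a/path/to/a/file.txt', '/a/other.txt'}))
# {'/a/path', '/a/other.txt'}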
@@ -1,5 +1,6 @@
from __future__ import absolute_import

from collections import deque
import contextlib
import errno
import io

@@ -7,33 +8,26 @@ import locale
# we have a submodule named 'logging' which would shadow this if we used the
# regular name:
import logging as std_logging
import re
import os
import posixpath
import re
import shutil
import stat
import subprocess
import sys
import tarfile
import zipfile
from collections import deque

from pip._vendor import pkg_resources
# NOTE: retrying is not annotated in typeshed as on 2017-07-17, which is
# why we ignore the type on this import.
from pip._vendor.retrying import retry  # type: ignore
from pip._vendor.six import PY2
from pip._vendor.six.moves import input
from pip._vendor.six.moves.urllib import parse as urllib_parse

from pip._internal.exceptions import CommandError, InstallationError
from pip._internal.locations import (
    running_under_virtualenv, site_packages, user_site, virtualenv_no_global,
from pip.exceptions import InstallationError
from pip.compat import console_to_str, expanduser, stdlib_pkgs
from pip.locations import (
    site_packages, user_site, running_under_virtualenv, virtualenv_no_global,
    write_delete_marker_file,
)
from pip._internal.utils.compat import (
    WINDOWS, console_to_str, expanduser, stdlib_pkgs,
)
from pip._vendor import pkg_resources
from pip._vendor.six.moves import input
from pip._vendor.six import PY2
from pip._vendor.retrying import retry

if PY2:
    from io import BytesIO as StringIO

@@ -46,11 +40,11 @@ __all__ = ['rmtree', 'display_path', 'backup_dir',
           'is_svn_page', 'file_contents',
           'split_leading_dir', 'has_leading_dir',
           'normalize_path',
           'renames', 'get_prog',
           'renames', 'get_terminal_size', 'get_prog',
           'unzip_file', 'untar_file', 'unpack_file', 'call_subprocess',
           'captured_stdout', 'ensure_dir',
           'ARCHIVE_EXTENSIONS', 'SUPPORTED_EXTENSIONS',
           'get_installed_version', 'remove_auth_from_url']
           'get_installed_version']


logger = std_logging.getLogger(__name__)

@@ -94,11 +88,8 @@ def ensure_dir(path):

def get_prog():
    try:
        prog = os.path.basename(sys.argv[0])
        if prog in ('__main__.py', '-c'):
        if os.path.basename(sys.argv[0]) in ('__main__.py', '-c'):
            return "%s -m pip" % sys.executable
        else:
            return prog
    except (AttributeError, TypeError, IndexError):
        pass
    return 'pip'

@@ -187,16 +178,12 @@ def format_size(bytes):


def is_installable_dir(path):
    """Is path a directory containing setup.py or pyproject.toml?
    """
    """Return True if `path` is a directory containing a setup.py file."""
    if not os.path.isdir(path):
        return False
    setup_py = os.path.join(path, 'setup.py')
    if os.path.isfile(setup_py):
        return True
    pyproject_toml = os.path.join(path, 'pyproject.toml')
    if os.path.isfile(pyproject_toml):
        return True
    return False

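The newer `is_installable_dir()` shown on the left-hand side of the hunk above boils down to a two-marker check; a compact stand-alone sketch of the same logic:

# Sketch: a directory is installable if it carries either a setup.py
# or a pyproject.toml.
import os

def is_installable_dir(path):
    if not os.path.isdir(path):
        return False
    return any(
        os.path.isfile(os.path.join(path, marker))
        for marker in ('setup.py', 'pyproject.toml')
    )

print(is_installable_dir('.'))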
@@ -309,7 +296,7 @@ def is_local(path):
    if running_under_virtualenv():
        return path.startswith(normalize_path(sys.prefix))
    else:
        from pip._internal.locations import distutils_scheme
        from pip.locations import distutils_scheme
        if path.startswith(prefix):
            for local_path in distutils_scheme("").values():
                if path.startswith(normalize_path(local_path)):

@@ -339,7 +326,7 @@ def dist_in_usersite(dist):
def dist_in_site_packages(dist):
    """
    Return True if given Distribution is installed in
    sysconfig.get_python_lib().
    distutils.sysconfig.get_python_lib().
    """
    return normalize_path(
        dist_location(dist)

@@ -369,7 +356,7 @@ def get_installed_distributions(local_only=True,
    ``skip`` argument is an iterable of lower-case project names to
    ignore; defaults to stdlib_pkgs

    If ``include_editables`` is False, don't report editables.
    If ``editables`` is False, don't report editables.

    If ``editables_only`` is True , only report editables.


@@ -463,6 +450,36 @@ def dist_location(dist):
    return dist.location


def get_terminal_size():
    """Returns a tuple (x, y) representing the width (x) and the height (y)
    in characters of the terminal window."""
    def ioctl_GWINSZ(fd):
        try:
            import fcntl
            import termios
            import struct
            cr = struct.unpack(
                'hh',
                fcntl.ioctl(fd, termios.TIOCGWINSZ, '1234')
            )
        except:
            return None
        if cr == (0, 0):
            return None
        return cr
    cr = ioctl_GWINSZ(0) or ioctl_GWINSZ(1) or ioctl_GWINSZ(2)
    if not cr:
        try:
            fd = os.open(os.ctermid(), os.O_RDONLY)
            cr = ioctl_GWINSZ(fd)
            os.close(fd)
        except:
            pass
    if not cr:
        cr = (os.environ.get('LINES', 25), os.environ.get('COLUMNS', 80))
    return int(cr[1]), int(cr[0])


def current_umask():
    """Get the current umask which involves having to set it temporarily."""
    mask = os.umask(0)
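For comparison, on Python 3.3+ the standard library already provides this via `shutil` (a hedged aside; pip 9 still supported Python 2, hence the hand-rolled ioctl version above):

# Modern equivalent of the get_terminal_size() helper shown above.
import shutil

columns, lines = shutil.get_terminal_size(fallback=(80, 25))
print(columns, lines)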
@ -607,7 +624,7 @@ def unpack_file(filename, location, content_type, link):
|
|||
elif (content_type and content_type.startswith('text/html') and
|
||||
is_svn_page(file_contents(filename))):
|
||||
# We don't really care about this
|
||||
from pip._internal.vcs.subversion import Subversion
|
||||
from pip.vcs.subversion import Subversion
|
||||
Subversion('svn+' + link.url).unpack(location)
|
||||
else:
|
||||
# FIXME: handle?
|
||||
|
|
@ -625,14 +642,7 @@ def unpack_file(filename, location, content_type, link):
|
|||
def call_subprocess(cmd, show_stdout=True, cwd=None,
|
||||
on_returncode='raise',
|
||||
command_desc=None,
|
||||
extra_environ=None, unset_environ=None, spinner=None):
|
||||
"""
|
||||
Args:
|
||||
unset_environ: an iterable of environment variable names to unset
|
||||
prior to calling subprocess.Popen().
|
||||
"""
|
||||
if unset_environ is None:
|
||||
unset_environ = []
|
||||
extra_environ=None, spinner=None):
|
||||
# This function's handling of subprocess output is confusing and I
|
||||
# previously broke it terribly, so as penance I will write a long comment
|
||||
# explaining things.
|
||||
|
|
@ -669,21 +679,17 @@ def call_subprocess(cmd, show_stdout=True, cwd=None,
|
|||
env = os.environ.copy()
|
||||
if extra_environ:
|
||||
env.update(extra_environ)
|
||||
for name in unset_environ:
|
||||
env.pop(name, None)
|
||||
try:
|
||||
proc = subprocess.Popen(
|
||||
cmd, stderr=subprocess.STDOUT, stdin=subprocess.PIPE,
|
||||
stdout=stdout, cwd=cwd, env=env,
|
||||
)
|
||||
proc.stdin.close()
|
||||
cmd, stderr=subprocess.STDOUT, stdin=None, stdout=stdout,
|
||||
cwd=cwd, env=env)
|
||||
except Exception as exc:
|
||||
logger.critical(
|
||||
"Error %s while executing command %s", exc, command_desc,
|
||||
)
|
||||
raise
|
||||
all_output = []
|
||||
if stdout is not None:
|
||||
all_output = []
|
||||
while True:
|
||||
line = console_to_str(proc.stdout.readline())
|
||||
if not line:
|
||||
|
|
@ -697,11 +703,7 @@ def call_subprocess(cmd, show_stdout=True, cwd=None,
|
|||
# Update the spinner
|
||||
if spinner is not None:
|
||||
spinner.spin()
|
||||
try:
|
||||
proc.wait()
|
||||
finally:
|
||||
if proc.stdout:
|
||||
proc.stdout.close()
|
||||
if spinner is not None:
|
||||
if proc.returncode:
|
||||
spinner.finish("error")
|
||||
|
|
@ -843,15 +845,17 @@ class cached_property(object):
|
|||
return value
|
||||
|
||||
|
||||
def get_installed_version(dist_name, working_set=None):
|
||||
def get_installed_version(dist_name, lookup_dirs=None):
|
||||
"""Get the installed version of dist_name avoiding pkg_resources cache"""
|
||||
# Create a requirement that we'll look for inside of setuptools.
|
||||
req = pkg_resources.Requirement.parse(dist_name)
|
||||
|
||||
if working_set is None:
|
||||
# We want to avoid having this cached, so we need to construct a new
|
||||
# working set each time.
|
||||
if lookup_dirs is None:
|
||||
working_set = pkg_resources.WorkingSet()
|
||||
else:
|
||||
working_set = pkg_resources.WorkingSet(lookup_dirs)
|
||||
|
||||
# Get the installed distribution from our working set
|
||||
dist = working_set.find(req)
|
||||
|
|
@ -864,95 +868,3 @@ def get_installed_version(dist_name, working_set=None):
|
|||
def consume(iterator):
|
||||
"""Consume an iterable at C speed."""
|
||||
deque(iterator, maxlen=0)
|
||||
|
||||
|
||||
# Simulates an enum
|
||||
def enum(*sequential, **named):
|
||||
enums = dict(zip(sequential, range(len(sequential))), **named)
|
||||
reverse = {value: key for key, value in enums.items()}
|
||||
enums['reverse_mapping'] = reverse
|
||||
return type('Enum', (), enums)
|
||||
|
||||
|
||||
def make_vcs_requirement_url(repo_url, rev, egg_project_name, subdir=None):
|
||||
"""
|
||||
Return the URL for a VCS requirement.
|
||||
|
||||
Args:
|
||||
repo_url: the remote VCS url, with any needed VCS prefix (e.g. "git+").
|
||||
"""
|
||||
req = '{}@{}#egg={}'.format(repo_url, rev, egg_project_name)
|
||||
if subdir:
|
||||
req += '&subdirectory={}'.format(subdir)
|
||||
|
||||
return req
|
||||
|
||||
|
||||
def split_auth_from_netloc(netloc):
    """
    Parse out and remove the auth information from a netloc.

    Returns: (netloc, (username, password)).
    """
    if '@' not in netloc:
        return netloc, (None, None)

    # Split from the right because that's how urllib.parse.urlsplit()
    # behaves if more than one @ is present (which can be checked using
    # the password attribute of urlsplit()'s return value).
    auth, netloc = netloc.rsplit('@', 1)
    if ':' in auth:
        # Split from the left because that's how urllib.parse.urlsplit()
        # behaves if more than one : is present (which again can be checked
        # using the password attribute of the return value)
        user_pass = tuple(auth.split(':', 1))
    else:
        user_pass = auth, None

    return netloc, user_pass

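A short sketch of the split (host and credentials invented):

    assert split_auth_from_netloc('user:secret@example.com') == \
        ('example.com', ('user', 'secret'))
    assert split_auth_from_netloc('example.com') == ('example.com', (None, None))
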
def remove_auth_from_url(url):
    # Return a copy of url with 'username:password@' removed.
    # username/pass params are passed to subversion through flags
    # and are not recognized in the url.

    # parsed url
    purl = urllib_parse.urlsplit(url)
    netloc, user_pass = split_auth_from_netloc(purl.netloc)

    # stripped url
    url_pieces = (
        purl.scheme, netloc, purl.path, purl.query, purl.fragment
    )
    surl = urllib_parse.urlunsplit(url_pieces)
    return surl


def protect_pip_from_modification_on_windows(modifying_pip):
    """Protection of pip.exe from modification on Windows

    On Windows, any operation modifying pip should be run as:
        python -m pip ...
    """
    pip_names = [
        "pip.exe",
        "pip{}.exe".format(sys.version_info[0]),
        "pip{}.{}.exe".format(*sys.version_info[:2])
    ]

    # See https://github.com/pypa/pip/issues/1299 for more discussion
    should_show_use_python_msg = (
        modifying_pip and
        WINDOWS and
        os.path.basename(sys.argv[0]) in pip_names
    )

    if should_show_use_python_msg:
        new_command = [
            sys.executable, "-m", "pip"
        ] + sys.argv[1:]
        raise CommandError(
            'To modify pip, please run the following command:\n{}'
            .format(" ".join(new_command))
        )
@@ -7,10 +7,9 @@ from __future__ import absolute_import
import os
import sys

from pip.compat import WINDOWS, expanduser
from pip._vendor.six import PY2, text_type

from pip._internal.utils.compat import WINDOWS, expanduser


def user_cache_dir(appname):
    r"""

@@ -61,7 +60,7 @@ def user_cache_dir(appname):


def user_data_dir(appname, roaming=False):
    r"""
    """
    Return full path to the user-specific data dir for this application.

    "appname" is the name of application.

@@ -75,7 +74,6 @@ def user_data_dir(appname, roaming=False):

    Typical user data directories are:
        macOS:                  ~/Library/Application Support/<AppName>
                                if it exists, else ~/.config/<AppName>
        Unix:                   ~/.local/share/<AppName>    # or in
                                $XDG_DATA_HOME, if defined
        Win XP (not roaming):   C:\Documents and Settings\<username>\ ...

@@ -95,13 +93,6 @@ def user_data_dir(appname, roaming=False):
        path = os.path.join(
            expanduser('~/Library/Application Support/'),
            appname,
        ) if os.path.isdir(os.path.join(
            expanduser('~/Library/Application Support/'),
            appname,
        )
        ) else os.path.join(
            expanduser('~/.config/'),
            appname,
        )
    else:
        path = os.path.join(

@@ -146,7 +137,7 @@ def user_config_dir(appname, roaming=True):
# for the discussion regarding site_config_dirs locations
# see <https://github.com/pypa/pip/issues/1733>
def site_config_dirs(appname):
    r"""Return a list of potential user-shared config dirs for this application.
    """Return a list of potential user-shared config dirs for this application.

    "appname" is the name of application.

@@ -231,7 +222,6 @@ def _get_win_folder_with_ctypes(csidl_name):

    return buf.value


if WINDOWS:
    try:
        import ctypes
42 lib/python3.4/site-packages/pip/utils/build.py Normal file
@@ -0,0 +1,42 @@
from __future__ import absolute_import

import os.path
import tempfile

from pip.utils import rmtree


class BuildDirectory(object):

    def __init__(self, name=None, delete=None):
        # If we were not given an explicit directory, and we were not given an
        # explicit delete option, then we'll default to deleting.
        if name is None and delete is None:
            delete = True

        if name is None:
            # We realpath here because some systems have their default tmpdir
            # symlinked to another directory.  This tends to confuse build
            # scripts, so we canonicalize the path by traversing potential
            # symlinks here.
            name = os.path.realpath(tempfile.mkdtemp(prefix="pip-build-"))
            # If we were not given an explicit directory, and we were not given
            # an explicit delete option, then we'll default to deleting.
            if delete is None:
                delete = True

        self.name = name
        self.delete = delete

    def __repr__(self):
        return "<{} {!r}>".format(self.__class__.__name__, self.name)

    def __enter__(self):
        return self.name

    def __exit__(self, exc, value, tb):
        self.cleanup()

    def cleanup(self):
        if self.delete:
            rmtree(self.name)
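A minimal usage sketch for BuildDirectory (the directory name is chosen by tempfile; not part of the diff):

    # When no name is given, a 'pip-build-*' tempdir is created and, because
    # delete then defaults to True, removed again on exit.
    with BuildDirectory() as build_dir:
        print('building in', build_dir)
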
76 lib/python3.4/site-packages/pip/utils/deprecation.py Normal file
@@ -0,0 +1,76 @@
"""
A module that implements tooling to enable easy warnings about deprecations.
"""
from __future__ import absolute_import

import logging
import warnings


class PipDeprecationWarning(Warning):
    pass


class Pending(object):
    pass


class RemovedInPip10Warning(PipDeprecationWarning):
    pass


class RemovedInPip11Warning(PipDeprecationWarning, Pending):
    pass


class Python26DeprecationWarning(PipDeprecationWarning):
    pass


# Warnings <-> Logging Integration


_warnings_showwarning = None


def _showwarning(message, category, filename, lineno, file=None, line=None):
    if file is not None:
        if _warnings_showwarning is not None:
            _warnings_showwarning(
                message, category, filename, lineno, file, line,
            )
    else:
        if issubclass(category, PipDeprecationWarning):
            # We use a specially named logger which will handle all of the
            # deprecation messages for pip.
            logger = logging.getLogger("pip.deprecations")

            # This is purposely using the % formatter here instead of letting
            # the logging module handle the interpolation. This is because we
            # want it to appear as if someone typed this entire message out.
            log_message = "DEPRECATION: %s" % message

            # PipDeprecationWarnings that are Pending still have at least 2
            # versions to go until they are removed so they can just be
            # warnings. Otherwise, they will be removed in the very next
            # version of pip. We want these to be more obvious so we use the
            # ERROR logging level.
            if issubclass(category, Pending):
                logger.warning(log_message)
            else:
                logger.error(log_message)
        else:
            _warnings_showwarning(
                message, category, filename, lineno, file, line,
            )


def install_warning_logger():
    # Enable our Deprecation Warnings
    warnings.simplefilter("default", PipDeprecationWarning, append=True)

    global _warnings_showwarning

    if _warnings_showwarning is None:
        _warnings_showwarning = warnings.showwarning
        warnings.showwarning = _showwarning
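A sketch of how the pieces above fit together (the message text is invented):

    import warnings
    install_warning_logger()
    warnings.warn("old option goes away", RemovedInPip10Warning)
    # Not Pending, so _showwarning() routes it through the "pip.deprecations"
    # logger at ERROR level as "DEPRECATION: old option goes away".
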
@@ -1,7 +1,7 @@
import codecs
import locale
import re
import sys


BOMS = [
    (codecs.BOM_UTF8, 'utf8'),

@@ -13,7 +13,7 @@ BOMS = [
    (codecs.BOM_UTF32_LE, 'utf32-le'),
]

ENCODING_RE = re.compile(br'coding[:=]\s*([-\w.]+)')
ENCODING_RE = re.compile(b'coding[:=]\s*([-\w.]+)')


def auto_decode(data):

@@ -28,6 +28,4 @@ def auto_decode(data):
        if line[0:1] == b'#' and ENCODING_RE.search(line):
            encoding = ENCODING_RE.search(line).groups()[0].decode('ascii')
            return data.decode(encoding)
    return data.decode(
        locale.getpreferredencoding(False) or sys.getdefaultencoding(),
    )
    return data.decode(locale.getpreferredencoding(False))
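A sketch of the coding-cookie path above, assuming the elided start of auto_decode() has already ruled out a BOM (the bytes are invented):

    data = b'# -*- coding: latin-1 -*-\nname = "caf\xe9"\n'
    text = auto_decode(data)  # ENCODING_RE finds 'latin-1' and decodes with it
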

@@ -1,7 +1,7 @@
import os
import os.path

from pip._internal.utils.compat import get_path_uid
from pip.compat import get_path_uid


def check_path_owner(path):

@@ -1,7 +1,8 @@
from __future__ import absolute_import

import ctypes
import re
import ctypes
import platform
import warnings


@@ -72,13 +73,9 @@ def have_compatible_glibc(required_major, minimum_minor):
# misleading. Solution: instead of using platform, use our code that actually
# works.
def libc_ver():
    """Try to determine the glibc version

    Returns a tuple of strings (lib, version) which default to empty strings
    in case the lookup fails.
    """
    glibc_version = glibc_version_string()
    if glibc_version is None:
        return ("", "")
        # For non-glibc platforms, fall back on platform.libc_ver
        return platform.libc_ver()
    else:
        return ("glibc", glibc_version)

@@ -2,12 +2,10 @@ from __future__ import absolute_import

import hashlib

from pip.exceptions import HashMismatch, HashMissing, InstallationError
from pip.utils import read_chunks
from pip._vendor.six import iteritems, iterkeys, itervalues

from pip._internal.exceptions import (
    HashMismatch, HashMissing, InstallationError,
)
from pip._internal.utils.misc import read_chunks

# The recommended hash algo of the moment. Change this whenever the state of
# the art changes; it won't hurt backward compatibility.
130 lib/python3.4/site-packages/pip/utils/logging.py Normal file
@@ -0,0 +1,130 @@
from __future__ import absolute_import

import contextlib
import logging
import logging.handlers
import os

try:
    import threading
except ImportError:
    import dummy_threading as threading

from pip.compat import WINDOWS
from pip.utils import ensure_dir

try:
    from pip._vendor import colorama
# Lots of different errors can come from this, including SystemError and
# ImportError.
except Exception:
    colorama = None


_log_state = threading.local()
_log_state.indentation = 0


@contextlib.contextmanager
def indent_log(num=2):
    """
    A context manager which will cause the log output to be indented for any
    log messages emitted inside it.
    """
    _log_state.indentation += num
    try:
        yield
    finally:
        _log_state.indentation -= num


def get_indentation():
    return getattr(_log_state, 'indentation', 0)


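A usage sketch for indent_log() (the logger name is invented); the extra indentation only shows up on handlers whose formatter is the IndentingFormatter defined next:

    logger = logging.getLogger("pip.example")
    logger.info("Collecting example")
    with indent_log():
        logger.info("Downloading example")  # rendered two spaces deeper
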
class IndentingFormatter(logging.Formatter):

    def format(self, record):
        """
        Calls the standard formatter, but will indent all of the log messages
        by our current indentation level.
        """
        formatted = logging.Formatter.format(self, record)
        formatted = "".join([
            (" " * get_indentation()) + line
            for line in formatted.splitlines(True)
        ])
        return formatted


def _color_wrap(*colors):
    def wrapped(inp):
        return "".join(list(colors) + [inp, colorama.Style.RESET_ALL])
    return wrapped


class ColorizedStreamHandler(logging.StreamHandler):

    # Don't build up a list of colors if we don't have colorama
    if colorama:
        COLORS = [
            # This needs to be in order from highest logging level to lowest.
            (logging.ERROR, _color_wrap(colorama.Fore.RED)),
            (logging.WARNING, _color_wrap(colorama.Fore.YELLOW)),
        ]
    else:
        COLORS = []

    def __init__(self, stream=None):
        logging.StreamHandler.__init__(self, stream)

        if WINDOWS and colorama:
            self.stream = colorama.AnsiToWin32(self.stream)

    def should_color(self):
        # Don't colorize things if we do not have colorama
        if not colorama:
            return False

        real_stream = (
            self.stream if not isinstance(self.stream, colorama.AnsiToWin32)
            else self.stream.wrapped
        )

        # If the stream is a tty we should color it
        if hasattr(real_stream, "isatty") and real_stream.isatty():
            return True

        # If we have an ANSI term we should color it
        if os.environ.get("TERM") == "ANSI":
            return True

        # If anything else we should not color it
        return False

    def format(self, record):
        msg = logging.StreamHandler.format(self, record)

        if self.should_color():
            for level, color in self.COLORS:
                if record.levelno >= level:
                    msg = color(msg)
                    break

        return msg


class BetterRotatingFileHandler(logging.handlers.RotatingFileHandler):

    def _open(self):
        ensure_dir(os.path.dirname(self.baseFilename))
        return logging.handlers.RotatingFileHandler._open(self)


class MaxLevelFilter(logging.Filter):

    def __init__(self, level):
        self.level = level

    def filter(self, record):
        return record.levelno < self.level
@@ -6,13 +6,15 @@ import logging
import os.path
import sys

from pip._vendor import lockfile, pkg_resources
from pip._vendor import lockfile
from pip._vendor.packaging import version as packaging_version

from pip._internal.index import PackageFinder
from pip._internal.utils.compat import WINDOWS
from pip._internal.utils.filesystem import check_path_owner
from pip._internal.utils.misc import ensure_dir, get_installed_version
from pip.compat import total_seconds, WINDOWS
from pip.models import PyPI
from pip.locations import USER_CACHE_DIR, running_under_virtualenv
from pip.utils import ensure_dir, get_installed_version
from pip.utils.filesystem import check_path_owner


SELFCHECK_DATE_FMT = "%Y-%m-%dT%H:%M:%SZ"


@@ -20,27 +22,43 @@ SELFCHECK_DATE_FMT = "%Y-%m-%dT%H:%M:%SZ"
logger = logging.getLogger(__name__)


class SelfCheckState(object):
    def __init__(self, cache_dir):
        self.state = {}
        self.statefile_path = None
class VirtualenvSelfCheckState(object):
    def __init__(self):
        self.statefile_path = os.path.join(sys.prefix, "pip-selfcheck.json")

        # Try to load the existing state
        if cache_dir:
            self.statefile_path = os.path.join(cache_dir, "selfcheck.json")
        # Load the existing state
        try:
            with open(self.statefile_path) as statefile:
                self.state = json.load(statefile)
        except (IOError, ValueError):
            self.state = {}

    def save(self, pypi_version, current_time):
        # Attempt to write out our version check file
        with open(self.statefile_path, "w") as statefile:
            json.dump(
                {
                    "last_check": current_time.strftime(SELFCHECK_DATE_FMT),
                    "pypi_version": pypi_version,
                },
                statefile,
                sort_keys=True,
                separators=(",", ":")
            )


class GlobalSelfCheckState(object):
    def __init__(self):
        self.statefile_path = os.path.join(USER_CACHE_DIR, "selfcheck.json")

        # Load the existing state
        try:
            with open(self.statefile_path) as statefile:
                self.state = json.load(statefile)[sys.prefix]
        except (IOError, ValueError, KeyError):
            # Explicitly suppressing exceptions, since we don't want to
            # error out if the cache file is invalid.
            pass
            self.state = {}

    def save(self, pypi_version, current_time):
        # If we do not have a path to cache in, don't bother saving.
        if not self.statefile_path:
            return

        # Check to make sure that we own the directory
        if not check_path_owner(os.path.dirname(self.statefile_path)):
            return

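For orientation, the JSON that save() writes looks roughly like this (the values are invented); the virtualenv variant stores the flat dict, while the global variant keys its on-disk state by sys.prefix:

    import json
    print(json.dumps(
        {"last_check": "2018-01-01T00:00:00Z", "pypi_version": "9.0.1"},
        sort_keys=True, separators=(",", ":")))
    # {"last_check":"2018-01-01T00:00:00Z","pypi_version":"9.0.1"}
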
@@ -67,21 +85,14 @@ class SelfCheckState(object):
                separators=(",", ":"))


def was_installed_by_pip(pkg):
    """Checks whether pkg was installed by pip

    This is used not to display the upgrade message when pip is in fact
    installed by a system package manager, such as dnf on Fedora.
    """
    try:
        dist = pkg_resources.get_distribution(pkg)
        return (dist.has_metadata('INSTALLER') and
                'pip' in dist.get_metadata_lines('INSTALLER'))
    except pkg_resources.DistributionNotFound:
        return False
def load_selfcheck_statefile():
    if running_under_virtualenv():
        return VirtualenvSelfCheckState()
    else:
        return GlobalSelfCheckState()


def pip_version_check(session, options):
def pip_version_check(session):
    """Check for an update for pip.

    Limit the frequency of checks to once per week. State is stored either in

@@ -89,14 +100,14 @@ def pip_version_check(session, options):
    of the pip script path.
    """
    installed_version = get_installed_version("pip")
    if not installed_version:
    if installed_version is None:
        return

    pip_version = packaging_version.parse(installed_version)
    pypi_version = None

    try:
        state = SelfCheckState(cache_dir=options.cache_dir)
        state = load_selfcheck_statefile()

        current_time = datetime.datetime.utcnow()
        # Determine if we need to refresh the state

@@ -105,26 +116,23 @@ def pip_version_check(session, options):
                state.state["last_check"],
                SELFCHECK_DATE_FMT
            )
            if (current_time - last_check).total_seconds() < 7 * 24 * 60 * 60:
            if total_seconds(current_time - last_check) < 7 * 24 * 60 * 60:
                pypi_version = state.state["pypi_version"]

        # Refresh the version if we need to or just see if we need to warn
        if pypi_version is None:
            # Let's use PackageFinder to see what the latest pip version is
            finder = PackageFinder(
                find_links=options.find_links,
                index_urls=[options.index_url] + options.extra_index_urls,
                allow_all_prereleases=False,  # Explicitly set to False
                trusted_hosts=options.trusted_hosts,
                process_dependency_links=options.process_dependency_links,
                session=session,
            resp = session.get(
                PyPI.pip_json_url,
                headers={"Accept": "application/json"},
            )
            all_candidates = finder.find_all_candidates("pip")
            if not all_candidates:
                return
            pypi_version = str(
                max(all_candidates, key=lambda c: c.version).version
            resp.raise_for_status()
            pypi_version = [
                v for v in sorted(
                    list(resp.json()["releases"]),
                    key=packaging_version.parse,
                )
                if not packaging_version.parse(v).is_prerelease
            ][-1]

            # save that we've performed a check
            state.save(pypi_version, current_time)

@@ -133,8 +141,7 @@ def pip_version_check(session, options):

        # Determine if our pypi_version is older
        if (pip_version < remote_version and
                pip_version.base_version != remote_version.base_version and
                was_installed_by_pip('pip')):
                pip_version.base_version != remote_version.base_version):
            # Advise "python -m pip" on Windows to avoid issues
            # with overwriting pip.exe.
            if WINDOWS:

@@ -147,6 +154,7 @@ def pip_version_check(session, options):
                "'%s install --upgrade pip' command.",
                pip_version, pypi_version, pip_cmd
            )

    except Exception:
        logger.debug(
            "There was an error checking the latest version of pip",
@@ -1,14 +1,15 @@
from __future__ import absolute_import

from email.parser import FeedParser

import logging
import sys
from email.parser import FeedParser  # type: ignore

from pip._vendor.packaging import specifiers
from pip._vendor.packaging import version
from pip._vendor import pkg_resources
from pip._vendor.packaging import specifiers, version

from pip._internal import exceptions
from pip._internal.utils.misc import display_path
from pip import exceptions

logger = logging.getLogger(__name__)


@@ -36,20 +37,16 @@ def check_requires_python(requires_python):
def get_metadata(dist):
    if (isinstance(dist, pkg_resources.DistInfoDistribution) and
            dist.has_metadata('METADATA')):
        metadata = dist.get_metadata('METADATA')
        return dist.get_metadata('METADATA')
    elif dist.has_metadata('PKG-INFO'):
        metadata = dist.get_metadata('PKG-INFO')
    else:
        logger.warning("No metadata found in %s", display_path(dist.location))
        metadata = ''

    feed_parser = FeedParser()
    feed_parser.feed(metadata)
    return feed_parser.close()
        return dist.get_metadata('PKG-INFO')


def check_dist_requires_python(dist):
    pkg_info_dict = get_metadata(dist)
    metadata = get_metadata(dist)
    feed_parser = FeedParser()
    feed_parser.feed(metadata)
    pkg_info_dict = feed_parser.close()
    requires_python = pkg_info_dict.get('Requires-Python')
    try:
        if not check_requires_python(requires_python):

@@ -61,15 +58,6 @@ def check_dist_requires_python(dist):
        )
    except specifiers.InvalidSpecifier as e:
        logger.warning(
            "Package %s has an invalid Requires-Python entry %s - %s",
            dist.project_name, requires_python, e,
        )
            "Package %s has an invalid Requires-Python entry %s - %s" % (
                dist.project_name, requires_python, e))
        return


def get_installer(dist):
    if dist.has_metadata('INSTALLER'):
        for line in dist.get_metadata_lines('INSTALLER'):
            if line.strip():
                return line.strip()
    return ''
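A sketch of the metadata round-trip used above (the metadata text is invented): PKG-INFO is an RFC 822-style document, so FeedParser exposes Requires-Python as a header:

    from email.parser import FeedParser
    parser = FeedParser()
    parser.feed('Name: example\nRequires-Python: >=3.4\n')
    assert parser.close().get('Requires-Python') == '>=3.4'
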
@@ -1,28 +1,22 @@
from __future__ import absolute_import, division
from __future__ import absolute_import
from __future__ import division

import contextlib
import itertools
import logging
import sys
from signal import signal, SIGINT, default_int_handler
import time
from signal import SIGINT, default_int_handler, signal
import contextlib
import logging

from pip.compat import WINDOWS
from pip.utils import format_size
from pip.utils.logging import get_indentation
from pip._vendor import six
from pip._vendor.progress.bar import (
    Bar, ChargingBar, FillingCirclesBar, FillingSquaresBar, IncrementalBar,
    ShadyBar,
)
from pip._vendor.progress.helpers import HIDE_CURSOR, SHOW_CURSOR, WritelnMixin
from pip._vendor.progress.bar import Bar, IncrementalBar
from pip._vendor.progress.helpers import (WritelnMixin,
                                          HIDE_CURSOR, SHOW_CURSOR)
from pip._vendor.progress.spinner import Spinner

from pip._internal.utils.compat import WINDOWS
from pip._internal.utils.logging import get_indentation
from pip._internal.utils.misc import format_size
from pip._internal.utils.typing import MYPY_CHECK_RUNNING

if MYPY_CHECK_RUNNING:
    from typing import Any  # noqa: F401

try:
    from pip._vendor import colorama
# Lots of different errors can come from this, including SystemError and

@@ -60,7 +54,7 @@ def _select_progress_class(preferred, fallback):
        return preferred


_BaseBar = _select_progress_class(IncrementalBar, Bar)  # type: Any
_BaseBar = _select_progress_class(IncrementalBar, Bar)


class InterruptibleMixin(object):

@@ -118,20 +112,6 @@ class InterruptibleMixin(object):
        self.original_handler(signum, frame)


class SilentBar(Bar):

    def update(self):
        pass


class BlueEmojiBar(IncrementalBar):

    suffix = "%(percent)d%%"
    bar_prefix = " "
    bar_suffix = " "
    phases = (u"\U0001F539", u"\U0001F537", u"\U0001F535")  # type: Any


class DownloadProgressMixin(object):

    def __init__(self, *args, **kwargs):

@@ -191,54 +171,13 @@ class WindowsMixin(object):
            self.file.flush = lambda: self.file.wrapped.flush()


class BaseDownloadProgressBar(WindowsMixin, InterruptibleMixin,
                              DownloadProgressMixin):
class DownloadProgressBar(WindowsMixin, InterruptibleMixin,
                          DownloadProgressMixin, _BaseBar):

    file = sys.stdout
    message = "%(percent)d%%"
    suffix = "%(downloaded)s %(download_speed)s %(pretty_eta)s"

# NOTE: The "type: ignore" comments on the following classes are there to
#       work around https://github.com/python/typing/issues/241


class DefaultDownloadProgressBar(BaseDownloadProgressBar,
                                 _BaseBar):  # type: ignore
    pass


class DownloadSilentBar(BaseDownloadProgressBar, SilentBar):  # type: ignore
    pass


class DownloadIncrementalBar(BaseDownloadProgressBar,  # type: ignore
                             IncrementalBar):
    pass


class DownloadChargingBar(BaseDownloadProgressBar,  # type: ignore
                          ChargingBar):
    pass


class DownloadShadyBar(BaseDownloadProgressBar, ShadyBar):  # type: ignore
    pass


class DownloadFillingSquaresBar(BaseDownloadProgressBar,  # type: ignore
                                FillingSquaresBar):
    pass


class DownloadFillingCirclesBar(BaseDownloadProgressBar,  # type: ignore
                                FillingCirclesBar):
    pass


class DownloadBlueEmojiProgressBar(BaseDownloadProgressBar,  # type: ignore
                                   BlueEmojiBar):
    pass


class DownloadProgressSpinner(WindowsMixin, InterruptibleMixin,
                              DownloadProgressMixin, WritelnMixin, Spinner):

@@ -266,22 +205,6 @@ class DownloadProgressSpinner(WindowsMixin, InterruptibleMixin,
        self.writeln(line)


BAR_TYPES = {
    "off": (DownloadSilentBar, DownloadSilentBar),
    "on": (DefaultDownloadProgressBar, DownloadProgressSpinner),
    "ascii": (DownloadIncrementalBar, DownloadProgressSpinner),
    "pretty": (DownloadFillingCirclesBar, DownloadProgressSpinner),
    "emoji": (DownloadBlueEmojiProgressBar, DownloadProgressSpinner)
}


def DownloadProgressProvider(progress_bar, max=None):
    if max is None or max == 0:
        return BAR_TYPES[progress_bar][1]().iter
    else:
        return BAR_TYPES[progress_bar][0](max=max).iter

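A hypothetical use of DownloadProgressProvider: pick a renderer by name and wrap an iterable of downloaded chunks (the sizes are invented; max counts total bytes):

    chunks = (b'x' * 1024 for _ in range(10))
    for chunk in DownloadProgressProvider('on', max=10 * 1024)(chunks):
        pass  # consume the chunk; the bar advances as items are yielded
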
################################################################
# Generic "something is happening" spinners
#
366 lib/python3.4/site-packages/pip/vcs/__init__.py Normal file
@@ -0,0 +1,366 @@
"""Handles all VCS (version control) support"""
from __future__ import absolute_import

import errno
import logging
import os
import shutil
import sys

from pip._vendor.six.moves.urllib import parse as urllib_parse

from pip.exceptions import BadCommand
from pip.utils import (display_path, backup_dir, call_subprocess,
                       rmtree, ask_path_exists)


__all__ = ['vcs', 'get_src_requirement']


logger = logging.getLogger(__name__)


class VcsSupport(object):
    _registry = {}
    schemes = ['ssh', 'git', 'hg', 'bzr', 'sftp', 'svn']

    def __init__(self):
        # Register more schemes with urlparse for various version control
        # systems
        urllib_parse.uses_netloc.extend(self.schemes)
        # Python >= 2.7.4, 3.3 doesn't have uses_fragment
        if getattr(urllib_parse, 'uses_fragment', None):
            urllib_parse.uses_fragment.extend(self.schemes)
        super(VcsSupport, self).__init__()

    def __iter__(self):
        return self._registry.__iter__()

    @property
    def backends(self):
        return list(self._registry.values())

    @property
    def dirnames(self):
        return [backend.dirname for backend in self.backends]

    @property
    def all_schemes(self):
        schemes = []
        for backend in self.backends:
            schemes.extend(backend.schemes)
        return schemes

    def register(self, cls):
        if not hasattr(cls, 'name'):
            logger.warning('Cannot register VCS %s', cls.__name__)
            return
        if cls.name not in self._registry:
            self._registry[cls.name] = cls
            logger.debug('Registered VCS backend: %s', cls.name)

    def unregister(self, cls=None, name=None):
        if name in self._registry:
            del self._registry[name]
        elif cls in self._registry.values():
            del self._registry[cls.name]
        else:
            logger.warning('Cannot unregister because no class or name given')

    def get_backend_name(self, location):
        """
        Return the name of the version control backend if found at given
        location, e.g. vcs.get_backend_name('/path/to/vcs/checkout')
        """
        for vc_type in self._registry.values():
            if vc_type.controls_location(location):
                logger.debug('Determine that %s uses VCS: %s',
                             location, vc_type.name)
                return vc_type.name
        return None

    def get_backend(self, name):
        name = name.lower()
        if name in self._registry:
            return self._registry[name]

    def get_backend_from_location(self, location):
        vc_type = self.get_backend_name(location)
        if vc_type:
            return self.get_backend(vc_type)
        return None


vcs = VcsSupport()


class VersionControl(object):
    name = ''
    dirname = ''
    # List of supported schemes for this Version Control
    schemes = ()

    def __init__(self, url=None, *args, **kwargs):
        self.url = url
        super(VersionControl, self).__init__(*args, **kwargs)

    def _is_local_repository(self, repo):
        """
        posix absolute paths start with os.path.sep,
        win32 ones start with drive (like c:\\folder)
        """
        drive, tail = os.path.splitdrive(repo)
        return repo.startswith(os.path.sep) or drive

    # See issue #1083 for why this method was introduced:
    # https://github.com/pypa/pip/issues/1083
    def translate_egg_surname(self, surname):
        # For example, Django has branches of the form "stable/1.7.x".
        return surname.replace('/', '_')

    def export(self, location):
        """
        Export the repository at the url to the destination location
        i.e. only download the files, without VCS information
        """
        raise NotImplementedError

    def get_url_rev(self):
        """
        Returns the correct repository URL and revision by parsing the given
        repository URL
        """
        error_message = (
            "Sorry, '%s' is a malformed VCS url. "
            "The format is <vcs>+<protocol>://<url>, "
            "e.g. svn+http://myrepo/svn/MyApp#egg=MyApp"
        )
        assert '+' in self.url, error_message % self.url
        url = self.url.split('+', 1)[1]
        scheme, netloc, path, query, frag = urllib_parse.urlsplit(url)
        rev = None
        if '@' in path:
            path, rev = path.rsplit('@', 1)
        url = urllib_parse.urlunsplit((scheme, netloc, path, query, ''))
        return url, rev

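An illustrative parse (the URL is invented): the "<vcs>+" prefix is split off and an "@revision" suffix, if present, is separated from the path:

    backend = VersionControl(url='git+https://example.com/repo.git@v1.0')
    assert backend.get_url_rev() == ('https://example.com/repo.git', 'v1.0')
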
    def get_info(self, location):
        """
        Returns (url, revision), where both are strings
        """
        assert not location.rstrip('/').endswith(self.dirname), \
            'Bad directory: %s' % location
        return self.get_url(location), self.get_revision(location)

    def normalize_url(self, url):
        """
        Normalize a URL for comparison by unquoting it and removing any
        trailing slash.
        """
        return urllib_parse.unquote(url).rstrip('/')

    def compare_urls(self, url1, url2):
        """
        Compare two repo URLs for identity, ignoring incidental differences.
        """
        return (self.normalize_url(url1) == self.normalize_url(url2))

    def obtain(self, dest):
        """
        Called when installing or updating an editable package, takes the
        source path of the checkout.
        """
        raise NotImplementedError

    def switch(self, dest, url, rev_options):
        """
        Switch the repo at ``dest`` to point to ``URL``.
        """
        raise NotImplementedError

    def update(self, dest, rev_options):
        """
        Update an already-existing repo to the given ``rev_options``.
        """
        raise NotImplementedError

    def check_version(self, dest, rev_options):
        """
        Return True if the version is identical to what exists and
        doesn't need to be updated.
        """
        raise NotImplementedError

    def check_destination(self, dest, url, rev_options, rev_display):
        """
        Prepare a location to receive a checkout/clone.

        Return True if the location is ready for (and requires) a
        checkout/clone, False otherwise.
        """
        checkout = True
        prompt = False
        if os.path.exists(dest):
            checkout = False
            if os.path.exists(os.path.join(dest, self.dirname)):
                existing_url = self.get_url(dest)
                if self.compare_urls(existing_url, url):
                    logger.debug(
                        '%s in %s exists, and has correct URL (%s)',
                        self.repo_name.title(),
                        display_path(dest),
                        url,
                    )
                    if not self.check_version(dest, rev_options):
                        logger.info(
                            'Updating %s %s%s',
                            display_path(dest),
                            self.repo_name,
                            rev_display,
                        )
                        self.update(dest, rev_options)
                    else:
                        logger.info(
                            'Skipping because already up-to-date.')
                else:
                    logger.warning(
                        '%s %s in %s exists with URL %s',
                        self.name,
                        self.repo_name,
                        display_path(dest),
                        existing_url,
                    )
                    prompt = ('(s)witch, (i)gnore, (w)ipe, (b)ackup ',
                              ('s', 'i', 'w', 'b'))
            else:
                logger.warning(
                    'Directory %s already exists, and is not a %s %s.',
                    dest,
                    self.name,
                    self.repo_name,
                )
                prompt = ('(i)gnore, (w)ipe, (b)ackup ', ('i', 'w', 'b'))
        if prompt:
            logger.warning(
                'The plan is to install the %s repository %s',
                self.name,
                url,
            )
            response = ask_path_exists('What to do?  %s' % prompt[0],
                                       prompt[1])

            if response == 's':
                logger.info(
                    'Switching %s %s to %s%s',
                    self.repo_name,
                    display_path(dest),
                    url,
                    rev_display,
                )
                self.switch(dest, url, rev_options)
            elif response == 'i':
                # do nothing
                pass
            elif response == 'w':
                logger.warning('Deleting %s', display_path(dest))
                rmtree(dest)
                checkout = True
            elif response == 'b':
                dest_dir = backup_dir(dest)
                logger.warning(
                    'Backing up %s to %s', display_path(dest), dest_dir,
                )
                shutil.move(dest, dest_dir)
                checkout = True
            elif response == 'a':
                sys.exit(-1)
        return checkout

    def unpack(self, location):
        """
        Clean up current location and download the url repository
        (and VCS info) into location
        """
        if os.path.exists(location):
            rmtree(location)
        self.obtain(location)

    def get_src_requirement(self, dist, location):
        """
        Return a string representing the requirement needed to
        redownload the files currently present in location, something
        like:
          {repository_url}@{revision}#egg={project_name}-{version_identifier}
        """
        raise NotImplementedError

    def get_url(self, location):
        """
        Return the url used at location
        Used in get_info or check_destination
        """
        raise NotImplementedError

    def get_revision(self, location):
        """
        Return the current revision of the files at location
        Used in get_info
        """
        raise NotImplementedError

    def run_command(self, cmd, show_stdout=True, cwd=None,
                    on_returncode='raise',
                    command_desc=None,
                    extra_environ=None, spinner=None):
        """
        Run a VCS subcommand
        This is simply a wrapper around call_subprocess that adds the VCS
        command name, and checks that the VCS is available
        """
        cmd = [self.name] + cmd
        try:
            return call_subprocess(cmd, show_stdout, cwd,
                                   on_returncode,
                                   command_desc, extra_environ,
                                   spinner)
        except OSError as e:
            # errno.ENOENT = no such file or directory
            # In other words, the VCS executable isn't available
            if e.errno == errno.ENOENT:
                raise BadCommand('Cannot find command %r' % self.name)
            else:
                raise  # re-raise exception if a different error occurred

    @classmethod
    def controls_location(cls, location):
        """
        Check if a location is controlled by the vcs.
        It is meant to be overridden to implement smarter detection
        mechanisms for specific vcs.
        """
        logger.debug('Checking in %s for %s (%s)...',
                     location, cls.dirname, cls.name)
        path = os.path.join(location, cls.dirname)
        return os.path.exists(path)


def get_src_requirement(dist, location):
    version_control = vcs.get_backend_from_location(location)
    if version_control:
        try:
            return version_control().get_src_requirement(dist,
                                                         location)
        except BadCommand:
            logger.warning(
                'cannot determine version of editable source in %s '
                '(%s command not found in path)',
                location,
                version_control.name,
            )
            return dist.as_requirement()
    logger.warning(
        'cannot determine version of editable source in %s (is not SVN '
        'checkout, Git clone, Mercurial clone or Bazaar branch)',
        location,
    )
    return dist.as_requirement()

@@ -2,15 +2,18 @@ from __future__ import absolute_import

import logging
import os
import tempfile

from pip._vendor.six.moves.urllib import parse as urllib_parse
# TODO: Get this into six.moves.urllib.parse
try:
    from urllib import parse as urllib_parse
except ImportError:
    import urlparse as urllib_parse

from pip.utils import rmtree, display_path
from pip.vcs import vcs, VersionControl
from pip.download import path_to_url

from pip._internal.download import path_to_url
from pip._internal.utils.misc import (
    display_path, make_vcs_requirement_url, rmtree,
)
from pip._internal.utils.temp_dir import TempDirectory
from pip._internal.vcs import VersionControl, vcs

logger = logging.getLogger(__name__)


@@ -26,54 +29,56 @@ class Bazaar(VersionControl):

    def __init__(self, url=None, *args, **kwargs):
        super(Bazaar, self).__init__(url, *args, **kwargs)
        # This is only needed for python <2.7.5
        # Python >= 2.7.4, 3.3 doesn't have uses_fragment or non_hierarchical
        # Register lp but do not expose as a scheme to support bzr+lp.
        if getattr(urllib_parse, 'uses_fragment', None):
            urllib_parse.uses_fragment.extend(['lp'])

    def get_base_rev_args(self, rev):
        return ['-r', rev]
            urllib_parse.non_hierarchical.extend(['lp'])

    def export(self, location):
        """
        Export the Bazaar repository at the url to the destination location
        """
        # Remove the location to make sure Bazaar can export it correctly
        temp_dir = tempfile.mkdtemp('-export', 'pip-')
        self.unpack(temp_dir)
        if os.path.exists(location):
            # Remove the location to make sure Bazaar can export it correctly
            rmtree(location)
        try:
            self.run_command(['export', location], cwd=temp_dir,
                             show_stdout=False)
        finally:
            rmtree(temp_dir)

        with TempDirectory(kind="export") as temp_dir:
            self.unpack(temp_dir.path)
    def switch(self, dest, url, rev_options):
        self.run_command(['switch', url], cwd=dest)

            self.run_command(
                ['export', location],
                cwd=temp_dir.path, show_stdout=False,
            )
    def update(self, dest, rev_options):
        self.run_command(['pull', '-q'] + rev_options, cwd=dest)

    def fetch_new(self, dest, url, rev_options):
        rev_display = rev_options.to_display()
    def obtain(self, dest):
        url, rev = self.get_url_rev()
        if rev:
            rev_options = ['-r', rev]
            rev_display = ' (to revision %s)' % rev
        else:
            rev_options = []
            rev_display = ''
        if self.check_destination(dest, url, rev_options, rev_display):
            logger.info(
                'Checking out %s%s to %s',
                url,
                rev_display,
                display_path(dest),
            )
        cmd_args = ['branch', '-q'] + rev_options.to_args() + [url, dest]
        self.run_command(cmd_args)
            self.run_command(['branch', '-q'] + rev_options + [url, dest])

    def switch(self, dest, url, rev_options):
        self.run_command(['switch', url], cwd=dest)

    def update(self, dest, url, rev_options):
        cmd_args = ['pull', '-q'] + rev_options.to_args()
        self.run_command(cmd_args, cwd=dest)

    def get_url_rev_and_auth(self, url):
    def get_url_rev(self):
        # hotfix the URL scheme after removing bzr+ from bzr+ssh://; re-add it
        url, rev, user_pass = super(Bazaar, self).get_url_rev_and_auth(url)
        url, rev = super(Bazaar, self).get_url_rev()
        if url.startswith('ssh://'):
            url = 'bzr+' + url
        return url, rev, user_pass
        return url, rev

    def get_url(self, location):
        urls = self.run_command(['info'], show_stdout=False, cwd=location)

@@ -90,8 +95,7 @@ class Bazaar(VersionControl):

    def get_revision(self, location):
        revision = self.run_command(
            ['revno'], show_stdout=False, cwd=location,
        )
            ['revno'], show_stdout=False, cwd=location)
        return revision.splitlines()[-1]

    def get_src_requirement(self, dist, location):

@@ -100,11 +104,11 @@ class Bazaar(VersionControl):
            return None
        if not repo.lower().startswith('bzr:'):
            repo = 'bzr+' + repo
        current_rev = self.get_revision(location)
        egg_project_name = dist.egg_name().split('-', 1)[0]
        return make_vcs_requirement_url(repo, current_rev, egg_project_name)
        current_rev = self.get_revision(location)
        return '%s@%s#egg=%s' % (repo, current_rev, egg_project_name)

    def is_commit_id_equal(self, dest, name):
    def check_version(self, dest, rev_options):
        """Always assume the versions don't match"""
        return False

300 lib/python3.4/site-packages/pip/vcs/git.py Normal file
@@ -0,0 +1,300 @@
from __future__ import absolute_import

import logging
import tempfile
import os.path

from pip.compat import samefile
from pip.exceptions import BadCommand
from pip._vendor.six.moves.urllib import parse as urllib_parse
from pip._vendor.six.moves.urllib import request as urllib_request
from pip._vendor.packaging.version import parse as parse_version

from pip.utils import display_path, rmtree
from pip.vcs import vcs, VersionControl


urlsplit = urllib_parse.urlsplit
urlunsplit = urllib_parse.urlunsplit


logger = logging.getLogger(__name__)


class Git(VersionControl):
    name = 'git'
    dirname = '.git'
    repo_name = 'clone'
    schemes = (
        'git', 'git+http', 'git+https', 'git+ssh', 'git+git', 'git+file',
    )

    def __init__(self, url=None, *args, **kwargs):

        # Works around an apparent Git bug
        # (see http://article.gmane.org/gmane.comp.version-control.git/146500)
        if url:
            scheme, netloc, path, query, fragment = urlsplit(url)
            if scheme.endswith('file'):
                initial_slashes = path[:-len(path.lstrip('/'))]
                newpath = (
                    initial_slashes +
                    urllib_request.url2pathname(path)
                    .replace('\\', '/').lstrip('/')
                )
                url = urlunsplit((scheme, netloc, newpath, query, fragment))
                after_plus = scheme.find('+') + 1
                url = scheme[:after_plus] + urlunsplit(
                    (scheme[after_plus:], netloc, newpath, query, fragment),
                )

        super(Git, self).__init__(url, *args, **kwargs)

    def get_git_version(self):
        VERSION_PFX = 'git version '
        version = self.run_command(['version'], show_stdout=False)
        if version.startswith(VERSION_PFX):
            version = version[len(VERSION_PFX):]
        else:
            version = ''
        # get first 3 positions of the git version because
        # on Windows it is x.y.z.windows.t, and this parses as
        # LegacyVersion, which is always smaller than a Version.
        version = '.'.join(version.split('.')[:3])
        return parse_version(version)

    def export(self, location):
        """Export the Git repository at the url to the destination location"""
        temp_dir = tempfile.mkdtemp('-export', 'pip-')
        self.unpack(temp_dir)
        try:
            if not location.endswith('/'):
                location = location + '/'
            self.run_command(
                ['checkout-index', '-a', '-f', '--prefix', location],
                show_stdout=False, cwd=temp_dir)
        finally:
            rmtree(temp_dir)

    def check_rev_options(self, rev, dest, rev_options):
        """Check the revision options before checkout to compensate that tags
        and branches may need origin/ as a prefix.
        Returns the SHA1 of the branch or tag if found.
        """
        revisions = self.get_short_refs(dest)

        origin_rev = 'origin/%s' % rev
        if origin_rev in revisions:
            # remote branch
            return [revisions[origin_rev]]
        elif rev in revisions:
            # a local tag or branch name
            return [revisions[rev]]
        else:
            logger.warning(
                "Could not find a tag or branch '%s', assuming commit.", rev,
            )
            return rev_options

    def check_version(self, dest, rev_options):
        """
        Compare the current sha to the ref. ref may be a branch or tag name,
        but current rev will always point to a sha. This means that a branch
        or tag will never compare as True. So this ultimately only matches
        against exact shas.
        """
        return self.get_revision(dest).startswith(rev_options[0])

    def switch(self, dest, url, rev_options):
        self.run_command(['config', 'remote.origin.url', url], cwd=dest)
        self.run_command(['checkout', '-q'] + rev_options, cwd=dest)

        self.update_submodules(dest)

    def update(self, dest, rev_options):
        # First fetch changes from the default remote
        if self.get_git_version() >= parse_version('1.9.0'):
            # fetch tags in addition to everything else
            self.run_command(['fetch', '-q', '--tags'], cwd=dest)
        else:
            self.run_command(['fetch', '-q'], cwd=dest)
        # Then reset to wanted revision (maybe even origin/master)
        if rev_options:
            rev_options = self.check_rev_options(
                rev_options[0], dest, rev_options,
            )
        self.run_command(['reset', '--hard', '-q'] + rev_options, cwd=dest)
        #: update submodules
        self.update_submodules(dest)

    def obtain(self, dest):
        url, rev = self.get_url_rev()
        if rev:
            rev_options = [rev]
            rev_display = ' (to %s)' % rev
        else:
            rev_options = ['origin/master']
            rev_display = ''
        if self.check_destination(dest, url, rev_options, rev_display):
            logger.info(
                'Cloning %s%s to %s', url, rev_display, display_path(dest),
            )
            self.run_command(['clone', '-q', url, dest])

            if rev:
                rev_options = self.check_rev_options(rev, dest, rev_options)
                # Only do a checkout if rev_options differs from HEAD
                if not self.check_version(dest, rev_options):
                    self.run_command(
                        ['checkout', '-q'] + rev_options,
                        cwd=dest,
                    )
            #: repo may contain submodules
            self.update_submodules(dest)

    def get_url(self, location):
        """Return URL of the first remote encountered."""
        remotes = self.run_command(
            ['config', '--get-regexp', 'remote\..*\.url'],
            show_stdout=False, cwd=location)
        remotes = remotes.splitlines()
        found_remote = remotes[0]
        for remote in remotes:
            if remote.startswith('remote.origin.url '):
                found_remote = remote
                break
        url = found_remote.split(' ')[1]
        return url.strip()

    def get_revision(self, location):
        current_rev = self.run_command(
            ['rev-parse', 'HEAD'], show_stdout=False, cwd=location)
        return current_rev.strip()

    def get_full_refs(self, location):
        """Yields tuples of (commit, ref) for branches and tags"""
        output = self.run_command(['show-ref'],
                                  show_stdout=False, cwd=location)
        for line in output.strip().splitlines():
            commit, ref = line.split(' ', 1)
            yield commit.strip(), ref.strip()

    def is_ref_remote(self, ref):
        return ref.startswith('refs/remotes/')

    def is_ref_branch(self, ref):
        return ref.startswith('refs/heads/')

    def is_ref_tag(self, ref):
        return ref.startswith('refs/tags/')

    def is_ref_commit(self, ref):
        """A ref is a commit sha if it is not anything else"""
        return not any((
            self.is_ref_remote(ref),
            self.is_ref_branch(ref),
            self.is_ref_tag(ref),
        ))

    # Should deprecate `get_refs` since it's ambiguous
    def get_refs(self, location):
        return self.get_short_refs(location)

    def get_short_refs(self, location):
        """Return map of named refs (branches or tags) to commit hashes."""
        rv = {}
        for commit, ref in self.get_full_refs(location):
            ref_name = None
            if self.is_ref_remote(ref):
                ref_name = ref[len('refs/remotes/'):]
            elif self.is_ref_branch(ref):
                ref_name = ref[len('refs/heads/'):]
            elif self.is_ref_tag(ref):
                ref_name = ref[len('refs/tags/'):]
            if ref_name is not None:
                rv[ref_name] = commit
        return rv

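For orientation, get_short_refs() above returns a dict shaped like this (the hashes and names are invented):

    refs = {
        'origin/master': '1a2b3c',  # from refs/remotes/origin/master
        'master': '1a2b3c',         # from refs/heads/master
        'v1.0': '4d5e6f',           # from refs/tags/v1.0
    }
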
    def _get_subdirectory(self, location):
        """Return the relative path of setup.py to the git repo root."""
        # find the repo root
        git_dir = self.run_command(['rev-parse', '--git-dir'],
                                   show_stdout=False, cwd=location).strip()
        if not os.path.isabs(git_dir):
            git_dir = os.path.join(location, git_dir)
        root_dir = os.path.join(git_dir, '..')
        # find setup.py
        orig_location = location
        while not os.path.exists(os.path.join(location, 'setup.py')):
            last_location = location
            location = os.path.dirname(location)
            if location == last_location:
                # We've traversed up to the root of the filesystem without
                # finding setup.py
                logger.warning(
                    "Could not find setup.py for directory %s (tried all "
                    "parent directories)",
                    orig_location,
                )
                return None
        # relative path of setup.py to repo root
        if samefile(root_dir, location):
            return None
        return os.path.relpath(location, root_dir)

    def get_src_requirement(self, dist, location):
        repo = self.get_url(location)
        if not repo.lower().startswith('git:'):
            repo = 'git+' + repo
        egg_project_name = dist.egg_name().split('-', 1)[0]
        if not repo:
            return None
        current_rev = self.get_revision(location)
        req = '%s@%s#egg=%s' % (repo, current_rev, egg_project_name)
        subdirectory = self._get_subdirectory(location)
        if subdirectory:
            req += '&subdirectory=' + subdirectory
        return req

    def get_url_rev(self):
        """
        Prefixes stub URLs like 'user@hostname:user/repo.git' with 'ssh://'.
        That's required because although they use SSH they sometimes don't
        work with an ssh:// scheme (e.g. GitHub). But we need a scheme for
        parsing. Hence we remove it again afterwards and return it as a stub.
        """
        if '://' not in self.url:
            assert 'file:' not in self.url
            self.url = self.url.replace('git+', 'git+ssh://')
            url, rev = super(Git, self).get_url_rev()
            url = url.replace('ssh://', '')
        else:
            url, rev = super(Git, self).get_url_rev()

        return url, rev

    def update_submodules(self, location):
        if not os.path.exists(os.path.join(location, '.gitmodules')):
            return
        self.run_command(
            ['submodule', 'update', '--init', '--recursive', '-q'],
            cwd=location,
        )

    @classmethod
    def controls_location(cls, location):
        if super(Git, cls).controls_location(location):
            return True
        try:
            r = cls().run_command(['rev-parse'],
                                  cwd=location,
                                  show_stdout=False,
                                  on_returncode='ignore')
            return not r
        except BadCommand:
            logger.debug("could not determine if %s is under git control "
                         "because git is not available", location)
            return False


vcs.register(Git)

@@ -2,13 +2,13 @@ from __future__ import absolute_import

import logging
import os
import tempfile

from pip.utils import display_path, rmtree
from pip.vcs import vcs, VersionControl
from pip.download import path_to_url
from pip._vendor.six.moves import configparser

from pip._internal.download import path_to_url
from pip._internal.utils.misc import display_path, make_vcs_requirement_url
from pip._internal.utils.temp_dir import TempDirectory
from pip._internal.vcs import VersionControl, vcs

logger = logging.getLogger(__name__)


@@ -19,29 +19,15 @@ class Mercurial(VersionControl):
    repo_name = 'clone'
    schemes = ('hg', 'hg+http', 'hg+https', 'hg+ssh', 'hg+static-http')

    def get_base_rev_args(self, rev):
        return [rev]

    def export(self, location):
        """Export the Hg repository at the url to the destination location"""
        with TempDirectory(kind="export") as temp_dir:
            self.unpack(temp_dir.path)

        temp_dir = tempfile.mkdtemp('-export', 'pip-')
        self.unpack(temp_dir)
        try:
            self.run_command(
                ['archive', location], show_stdout=False, cwd=temp_dir.path
            )

    def fetch_new(self, dest, url, rev_options):
        rev_display = rev_options.to_display()
        logger.info(
            'Cloning hg %s%s to %s',
            url,
            rev_display,
            display_path(dest),
        )
        self.run_command(['clone', '--noupdate', '-q', url, dest])
        cmd_args = ['update', '-q'] + rev_options.to_args()
        self.run_command(cmd_args, cwd=dest)
                ['archive', location], show_stdout=False, cwd=temp_dir)
        finally:
            rmtree(temp_dir)

    def switch(self, dest, url, rev_options):
        repo_config = os.path.join(dest, self.dirname, 'hgrc')


@@ -56,13 +42,29 @@ class Mercurial(VersionControl):
                'Could not switch Mercurial repository to %s: %s', url, exc,
            )
        else:
            cmd_args = ['update', '-q'] + rev_options.to_args()
            self.run_command(cmd_args, cwd=dest)
            self.run_command(['update', '-q'] + rev_options, cwd=dest)

    def update(self, dest, url, rev_options):
    def update(self, dest, rev_options):
        self.run_command(['pull', '-q'], cwd=dest)
        cmd_args = ['update', '-q'] + rev_options.to_args()
        self.run_command(cmd_args, cwd=dest)
        self.run_command(['update', '-q'] + rev_options, cwd=dest)

    def obtain(self, dest):
        url, rev = self.get_url_rev()
        if rev:
            rev_options = [rev]
            rev_display = ' (to revision %s)' % rev
        else:
            rev_options = []
            rev_display = ''
        if self.check_destination(dest, url, rev_options, rev_display):
            logger.info(
                'Cloning hg %s%s to %s',
                url,
                rev_display,
                display_path(dest),
            )
            self.run_command(['clone', '--noupdate', '-q', url, dest])
            self.run_command(['update', '-q'] + rev_options, cwd=dest)

    def get_url(self, location):
        url = self.run_command(


@@ -88,14 +90,14 @@ class Mercurial(VersionControl):
        repo = self.get_url(location)
        if not repo.lower().startswith('hg:'):
            repo = 'hg+' + repo
        current_rev_hash = self.get_revision_hash(location)
        egg_project_name = dist.egg_name().split('-', 1)[0]
        return make_vcs_requirement_url(repo, current_rev_hash,
                                        egg_project_name)
        if not repo:
            return None
        current_rev_hash = self.get_revision_hash(location)
        return '%s@%s#egg=%s' % (repo, current_rev_hash, egg_project_name)

    def is_commit_id_equal(self, dest, name):
    def check_version(self, dest, rev_options):
        """Always assume the versions don't match"""
        return False


vcs.register(Mercurial)
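Both revision-handling styles appear interleaved above: the newer side wraps the revision in a RevOptions-style object and calls `to_args()`, while the older side passes a plain list around. A small sketch of the difference, where the class below is an illustrative stand-in only, not pip's actual API:

# Illustrative only: a minimal stand-in for the newer RevOptions-style object.
class RevOptions(object):
    def __init__(self, rev=None):
        self.rev = rev

    def to_args(self):
        # hg takes the revision as a positional argument to `update`
        return [self.rev] if self.rev else []

    def to_display(self):
        return ' (to revision %s)' % self.rev if self.rev else ''

# Newer style: ['update', '-q'] + rev_options.to_args()
new_cmd = ['update', '-q'] + RevOptions('1a2b3c').to_args()
# Older style: rev_options is already a plain list such as ['1a2b3c']
old_cmd = ['update', '-q'] + ['1a2b3c']
assert new_cmd == old_cmd == ['update', '-q', '1a2b3c']

Either way the resulting argv is identical; the object form just centralizes the display string and the empty-revision case.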
@@ -4,15 +4,17 @@ import logging
import os
import re

from pip._internal.models.link import Link
from pip._internal.utils.logging import indent_log
from pip._internal.utils.misc import (
    display_path, make_vcs_requirement_url, rmtree, split_auth_from_netloc,
)
from pip._internal.vcs import VersionControl, vcs
from pip._vendor.six.moves.urllib import parse as urllib_parse

from pip.index import Link
from pip.utils import rmtree, display_path
from pip.utils.logging import indent_log
from pip.vcs import vcs, VersionControl

_svn_xml_url_re = re.compile('url="([^"]+)"')
_svn_rev_re = re.compile(r'committed-rev="(\d+)"')
_svn_rev_re = re.compile('committed-rev="(\d+)"')
_svn_url_re = re.compile(r'URL: (.+)')
_svn_revision_re = re.compile(r'Revision: (.+)')
_svn_info_xml_rev_re = re.compile(r'\s*revision="(\d+)"')
_svn_info_xml_url_re = re.compile(r'<url>(.*)</url>')


@@ -26,40 +28,71 @@ class Subversion(VersionControl):
    repo_name = 'checkout'
    schemes = ('svn', 'svn+ssh', 'svn+http', 'svn+https', 'svn+svn')

    def get_base_rev_args(self, rev):
        return ['-r', rev]
    def get_info(self, location):
        """Returns (url, revision), where both are strings"""
        assert not location.rstrip('/').endswith(self.dirname), \
            'Bad directory: %s' % location
        output = self.run_command(
            ['info', location],
            show_stdout=False,
            extra_environ={'LANG': 'C'},
        )
        match = _svn_url_re.search(output)
        if not match:
            logger.warning(
                'Cannot determine URL of svn checkout %s',
                display_path(location),
            )
            logger.debug('Output that cannot be parsed: \n%s', output)
            return None, None
        url = match.group(1).strip()
        match = _svn_revision_re.search(output)
        if not match:
            logger.warning(
                'Cannot determine revision of svn checkout %s',
                display_path(location),
            )
            logger.debug('Output that cannot be parsed: \n%s', output)
            return url, None
        return url, match.group(1)

    def export(self, location):
        """Export the svn repository at the url to the destination location"""
        url, rev_options = self.get_url_rev_options(self.url)

        url, rev = self.get_url_rev()
        rev_options = get_rev_options(url, rev)
        url = self.remove_auth_from_url(url)
        logger.info('Exporting svn repository %s to %s', url, location)
        with indent_log():
            if os.path.exists(location):
                # Subversion doesn't like to check out over an existing
                # directory --force fixes this, but was only added in svn 1.5
                rmtree(location)
            cmd_args = ['export'] + rev_options.to_args() + [url, location]
            self.run_command(cmd_args, show_stdout=False)
            self.run_command(
                ['export'] + rev_options + [url, location],
                show_stdout=False)

    def fetch_new(self, dest, url, rev_options):
        rev_display = rev_options.to_display()
    def switch(self, dest, url, rev_options):
        self.run_command(['switch'] + rev_options + [url, dest])

    def update(self, dest, rev_options):
        self.run_command(['update'] + rev_options + [dest])

    def obtain(self, dest):
        url, rev = self.get_url_rev()
        rev_options = get_rev_options(url, rev)
        url = self.remove_auth_from_url(url)
        if rev:
            rev_display = ' (to revision %s)' % rev
        else:
            rev_display = ''
        if self.check_destination(dest, url, rev_options, rev_display):
            logger.info(
                'Checking out %s%s to %s',
                url,
                rev_display,
                display_path(dest),
            )
            cmd_args = ['checkout', '-q'] + rev_options.to_args() + [url, dest]
            self.run_command(cmd_args)

    def switch(self, dest, url, rev_options):
        cmd_args = ['switch'] + rev_options.to_args() + [url, dest]
        self.run_command(cmd_args)

    def update(self, dest, url, rev_options):
        cmd_args = ['update'] + rev_options.to_args() + [dest]
        self.run_command(cmd_args)
            self.run_command(['checkout', '-q'] + rev_options + [url, dest])

    def get_location(self, dist, dependency_links):
        for url in dependency_links:


@@ -95,41 +128,19 @@ class Subversion(VersionControl):
            dirurl, localrev = self._get_svn_url_rev(base)

            if base == location:
                base = dirurl + '/'   # save the root url
            elif not dirurl or not dirurl.startswith(base):
                base_url = dirurl + '/'   # save the root url
            elif not dirurl or not dirurl.startswith(base_url):
                dirs[:] = []
                continue    # not part of the same svn tree, skip it
            revision = max(revision, localrev)
        return revision

    def get_netloc_and_auth(self, netloc, scheme):
        """
        This override allows the auth information to be passed to svn via the
        --username and --password options instead of via the URL.
        """
        if scheme == 'ssh':
            # The --username and --password options can't be used for
            # svn+ssh URLs, so keep the auth information in the URL.
            return super(Subversion, self).get_netloc_and_auth(
                netloc, scheme)

        return split_auth_from_netloc(netloc)

    def get_url_rev_and_auth(self, url):
    def get_url_rev(self):
        # hotfix the URL scheme after removing svn+ from svn+ssh:// readd it
        url, rev, user_pass = super(Subversion, self).get_url_rev_and_auth(url)
        url, rev = super(Subversion, self).get_url_rev()
        if url.startswith('ssh://'):
            url = 'svn+' + url
        return url, rev, user_pass

    def make_rev_args(self, username, password):
        extra_args = []
        if username:
            extra_args += ['--username', username]
        if password:
            extra_args += ['--password', password]

        return extra_args
        return url, rev

    def get_url(self, location):
        # In cases where the source is in a subdirectory, not alongside


@@ -152,7 +163,7 @@ class Subversion(VersionControl):
        return self._get_svn_url_rev(location)[0]

    def _get_svn_url_rev(self, location):
        from pip._internal.exceptions import InstallationError
        from pip.exceptions import InstallationError

        entries_path = os.path.join(location, self.dirname, 'entries')
        if os.path.exists(entries_path):


@@ -199,15 +210,60 @@ class Subversion(VersionControl):
        repo = self.get_url(location)
        if repo is None:
            return None
        repo = 'svn+' + repo
        rev = self.get_revision(location)
        # FIXME: why not project name?
        egg_project_name = dist.egg_name().split('-', 1)[0]
        return make_vcs_requirement_url(repo, rev, egg_project_name)
        rev = self.get_revision(location)
        return 'svn+%s@%s#egg=%s' % (repo, rev, egg_project_name)

    def is_commit_id_equal(self, dest, name):
    def check_version(self, dest, rev_options):
        """Always assume the versions don't match"""
        return False

    @staticmethod
    def remove_auth_from_url(url):
        # Return a copy of url with 'username:password@' removed.
        # username/pass params are passed to subversion through flags
        # and are not recognized in the url.

        # parsed url
        purl = urllib_parse.urlsplit(url)
        stripped_netloc = \
            purl.netloc.split('@')[-1]

        # stripped url
        url_pieces = (
            purl.scheme, stripped_netloc, purl.path, purl.query, purl.fragment
        )
        surl = urllib_parse.urlunsplit(url_pieces)
        return surl


def get_rev_options(url, rev):
    if rev:
        rev_options = ['-r', rev]
    else:
        rev_options = []

    r = urllib_parse.urlsplit(url)
    if hasattr(r, 'username'):
        # >= Python-2.5
        username, password = r.username, r.password
    else:
        netloc = r[1]
        if '@' in netloc:
            auth = netloc.split('@')[0]
            if ':' in auth:
                username, password = auth.split(':', 1)
            else:
                username, password = auth, None
        else:
            username, password = None, None

    if username:
        rev_options += ['--username', username]
    if password:
        rev_options += ['--password', password]
    return rev_options


vcs.register(Subversion)
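The `remove_auth_from_url` helper above is easy to exercise on its own. A standalone sketch using the stdlib `urllib.parse` directly (pip's code goes through the vendored six wrapper), with a made-up URL:

from urllib.parse import urlsplit, urlunsplit

def remove_auth_from_url(url):
    # Drop 'user:password@' from the netloc; svn gets credentials via
    # the --username/--password flags instead of via the URL.
    purl = urlsplit(url)
    stripped_netloc = purl.netloc.split('@')[-1]
    return urlunsplit(
        (purl.scheme, stripped_netloc, purl.path, purl.query, purl.fragment)
    )

# Hypothetical URL, for illustration:
print(remove_auth_from_url('svn+https://alice:s3cret@svn.example.org/repo/trunk'))
# svn+https://svn.example.org/repo/trunk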
@@ -3,44 +3,44 @@ Support for installing and building the "wheel" binary package format.
"""
from __future__ import absolute_import

import collections
import compileall
import csv
import errno
import functools
import hashlib
import logging
import os
import os.path
import re
import shutil
import stat
import sys
import tempfile
import warnings

from base64 import urlsafe_b64encode
from email.parser import Parser

from pip._vendor import pkg_resources
from pip._vendor.distlib.scripts import ScriptMaker
from pip._vendor.packaging.utils import canonicalize_name
from pip._vendor.six import StringIO

from pip._internal import pep425tags
from pip._internal.download import path_to_url, unpack_url
from pip._internal.exceptions import (
    InstallationError, InvalidWheelFilename, UnsupportedWheel,
import pip
from pip.compat import expanduser
from pip.download import path_to_url, unpack_url
from pip.exceptions import (
    InstallationError, InvalidWheelFilename, UnsupportedWheel)
from pip.locations import distutils_scheme, PIP_DELETE_MARKER_FILENAME
from pip import pep425tags
from pip.utils import (
    call_subprocess, ensure_dir, captured_stdout, rmtree, read_chunks,
)
from pip._internal.locations import (
    PIP_DELETE_MARKER_FILENAME, distutils_scheme,
)
from pip._internal.utils.logging import indent_log
from pip._internal.utils.misc import (
    call_subprocess, captured_stdout, ensure_dir, read_chunks,
)
from pip._internal.utils.setuptools_build import SETUPTOOLS_SHIM
from pip._internal.utils.temp_dir import TempDirectory
from pip._internal.utils.typing import MYPY_CHECK_RUNNING
from pip._internal.utils.ui import open_spinner
from pip.utils.ui import open_spinner
from pip.utils.logging import indent_log
from pip.utils.setuptools_build import SETUPTOOLS_SHIM
from pip._vendor.distlib.scripts import ScriptMaker
from pip._vendor import pkg_resources
from pip._vendor.packaging.utils import canonicalize_name
from pip._vendor.six.moves import configparser

if MYPY_CHECK_RUNNING:
    from typing import Dict, List, Optional  # noqa: F401

wheel_ext = '.whl'
@@ -50,9 +50,107 @@ VERSION_COMPATIBLE = (1, 0)
logger = logging.getLogger(__name__)


def rehash(path, blocksize=1 << 20):
    """Return (hash, length) for path using hashlib.sha256()"""
    h = hashlib.sha256()
class WheelCache(object):
    """A cache of wheels for future installs."""

    def __init__(self, cache_dir, format_control):
        """Create a wheel cache.

        :param cache_dir: The root of the cache.
        :param format_control: A pip.index.FormatControl object to limit
            binaries being read from the cache.
        """
        self._cache_dir = expanduser(cache_dir) if cache_dir else None
        self._format_control = format_control

    def cached_wheel(self, link, package_name):
        return cached_wheel(
            self._cache_dir, link, self._format_control, package_name)


def _cache_for_link(cache_dir, link):
    """
    Return a directory to store cached wheels in for link.

    Because there are M wheels for any one sdist, we provide a directory
    to cache them in, and then consult that directory when looking up
    cache hits.

    We only insert things into the cache if they have plausible version
    numbers, so that we don't contaminate the cache with things that were not
    unique. E.g. ./package might have dozens of installs done for it and build
    a version of 0.0...and if we built and cached a wheel, we'd end up using
    the same wheel even if the source has been edited.

    :param cache_dir: The cache_dir being used by pip.
    :param link: The link of the sdist for which this will cache wheels.
    """

    # We want to generate an url to use as our cache key, we don't want to just
    # re-use the URL because it might have other items in the fragment and we
    # don't care about those.
    key_parts = [link.url_without_fragment]
    if link.hash_name is not None and link.hash is not None:
        key_parts.append("=".join([link.hash_name, link.hash]))
    key_url = "#".join(key_parts)

    # Encode our key url with sha224, we'll use this because it has similar
    # security properties to sha256, but with a shorter total output (and thus
    # less secure). However the differences don't make a lot of difference for
    # our use case here.
    hashed = hashlib.sha224(key_url.encode()).hexdigest()

    # We want to nest the directories some to prevent having a ton of top level
    # directories where we might run out of sub directories on some FS.
    parts = [hashed[:2], hashed[2:4], hashed[4:6], hashed[6:]]

    # Inside of the base location for cached wheels, expand our parts and join
    # them all together.
    return os.path.join(cache_dir, "wheels", *parts)


def cached_wheel(cache_dir, link, format_control, package_name):
    if not cache_dir:
        return link
    if not link:
        return link
    if link.is_wheel:
        return link
    if not link.is_artifact:
        return link
    if not package_name:
        return link
    canonical_name = canonicalize_name(package_name)
    formats = pip.index.fmt_ctl_formats(format_control, canonical_name)
    if "binary" not in formats:
        return link
    root = _cache_for_link(cache_dir, link)
    try:
        wheel_names = os.listdir(root)
    except OSError as e:
        if e.errno in (errno.ENOENT, errno.ENOTDIR):
            return link
        raise
    candidates = []
    for wheel_name in wheel_names:
        try:
            wheel = Wheel(wheel_name)
        except InvalidWheelFilename:
            continue
        if not wheel.supported():
            # Built for a different python/arch/etc
            continue
        candidates.append((wheel.support_index_min(), wheel_name))
    if not candidates:
        return link
    candidates.sort()
    path = os.path.join(root, candidates[0][1])
    return pip.index.Link(path_to_url(path))


def rehash(path, algo='sha256', blocksize=1 << 20):
    """Return (hash, length) for path using hashlib.new(algo)"""
    h = hashlib.new(algo)
    length = 0
    with open(path, 'rb') as f:
        for block in read_chunks(f, size=blocksize):
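The cache layout built by `_cache_for_link` is easy to reproduce: sha224 the link URL (plus any hash fragment), then split the hex digest into nested directories so no single directory grows too large. A self-contained sketch with a made-up URL and cache root:

import hashlib
import os.path

# Hypothetical sdist URL standing in for link.url_without_fragment:
key_url = 'https://files.example.org/packages/project-1.0.tar.gz'
hashed = hashlib.sha224(key_url.encode()).hexdigest()

# Nest the first three byte-pairs as directories, remainder as the leaf.
parts = [hashed[:2], hashed[2:4], hashed[4:6], hashed[6:]]
print(os.path.join('/tmp/pip-cache', 'wheels', *parts))
# e.g. /tmp/pip-cache/wheels/ab/cd/ef/<remaining 50 hex characters>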
@@ -91,8 +189,7 @@ def fix_script(path):
            script.write(rest)
        return True


dist_info_re = re.compile(r"""^(?P<namever>(?P<name>.+?)(-(?P<ver>.+?))?)
dist_info_re = re.compile(r"""^(?P<namever>(?P<name>.+?)(-(?P<ver>\d.+?))?)
                                \.dist-info$""", re.VERBOSE)
@@ -127,86 +224,21 @@ def get_entrypoints(filename):
        data.write("\n")
    data.seek(0)

    # get the entry points and then the script names
    entry_points = pkg_resources.EntryPoint.parse_map(data)
    console = entry_points.get('console_scripts', {})
    gui = entry_points.get('gui_scripts', {})
    cp = configparser.RawConfigParser()
    cp.optionxform = lambda option: option
    cp.readfp(data)

    def _split_ep(s):
        """get the string representation of EntryPoint, remove space and split
        on '='"""
        return str(s).replace(" ", "").split("=")

    # convert the EntryPoint objects into strings with module:function
    console = dict(_split_ep(v) for v in console.values())
    gui = dict(_split_ep(v) for v in gui.values())
    console = {}
    gui = {}
    if cp.has_section('console_scripts'):
        console = dict(cp.items('console_scripts'))
    if cp.has_section('gui_scripts'):
        gui = dict(cp.items('gui_scripts'))
    return console, gui


def message_about_scripts_not_on_PATH(scripts):
    # type: (List[str]) -> Optional[str]
    """Determine if any scripts are not on PATH and format a warning.

    Returns a warning message if one or more scripts are not on PATH,
    otherwise None.
    """
    if not scripts:
        return None

    # Group scripts by the path they were installed in
    grouped_by_dir = collections.defaultdict(set)  # type: Dict[str, set]
    for destfile in scripts:
        parent_dir = os.path.dirname(destfile)
        script_name = os.path.basename(destfile)
        grouped_by_dir[parent_dir].add(script_name)

    # We don't want to warn for directories that are on PATH.
    not_warn_dirs = [
        os.path.normcase(i).rstrip(os.sep) for i in
        os.environ.get("PATH", "").split(os.pathsep)
    ]
    # If an executable sits with sys.executable, we don't warn for it.
    # This covers the case of venv invocations without activating the venv.
    not_warn_dirs.append(os.path.normcase(os.path.dirname(sys.executable)))
    warn_for = {
        parent_dir: scripts for parent_dir, scripts in grouped_by_dir.items()
        if os.path.normcase(parent_dir) not in not_warn_dirs
    }
    if not warn_for:
        return None

    # Format a message
    msg_lines = []
    for parent_dir, scripts in warn_for.items():
        scripts = sorted(scripts)
        if len(scripts) == 1:
            start_text = "script {} is".format(scripts[0])
        else:
            start_text = "scripts {} are".format(
                ", ".join(scripts[:-1]) + " and " + scripts[-1]
            )

        msg_lines.append(
            "The {} installed in '{}' which is not on PATH."
            .format(start_text, parent_dir)
        )

    last_line_fmt = (
        "Consider adding {} to PATH or, if you prefer "
        "to suppress this warning, use --no-warn-script-location."
    )
    if len(msg_lines) == 1:
        msg_lines.append(last_line_fmt.format("this directory"))
    else:
        msg_lines.append(last_line_fmt.format("these directories"))

    # Returns the formatted multiline message
    return "\n".join(msg_lines)


def move_wheel_files(name, req, wheeldir, user=False, home=None, root=None,
                     pycompile=True, scheme=None, isolated=False, prefix=None,
                     warn_script_location=True):
                     pycompile=True, scheme=None, isolated=False, prefix=None):
    """Install a wheel"""

    if not scheme:
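Both parsing strategies in that hunk read the same entry_points.txt format. A standalone sketch of the configparser route, using the Python 3 stdlib configparser and a made-up entry-points payload (pip's own code goes through the vendored six wrapper and the now-deprecated readfp spelling):

import io
from configparser import RawConfigParser

data = io.StringIO(
    "[console_scripts]\n"
    "mytool = mypkg.cli:main\n"        # hypothetical script -> module:function
    "[gui_scripts]\n"
    "mytool-gui = mypkg.gui:run\n"
)

cp = RawConfigParser()
cp.optionxform = lambda option: option   # keep script names case-sensitive
cp.read_file(data)                       # modern spelling of readfp()

console = dict(cp.items('console_scripts')) if cp.has_section('console_scripts') else {}
gui = dict(cp.items('gui_scripts')) if cp.has_section('gui_scripts') else {}
print(console)  # {'mytool': 'mypkg.cli:main'}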
@@ -283,17 +315,6 @@ def move_wheel_files(name, req, wheeldir, user=False, home=None, root=None,
                # uninstalled.
                ensure_dir(destdir)

                # copyfile (called below) truncates the destination if it
                # exists and then writes the new contents. This is fine in most
                # cases, but can cause a segfault if pip has loaded a shared
                # object (e.g. from pyopenssl through its vendored urllib3)
                # Since the shared object is mmap'd an attempt to call a
                # symbol in it will then cause a segfault. Unlinking the file
                # allows writing of new contents while allowing the process to
                # continue to use the old copy.
                if os.path.exists(destfile):
                    os.unlink(destfile)

                # We use copyfile (not move, copy, or copy2) to be extra sure
                # that we are not moving directories over (copyfile fails for
                # directories) as well as to ensure that we are not copying
@@ -364,7 +385,7 @@ def move_wheel_files(name, req, wheeldir, user=False, home=None, root=None,
    # Ensure we don't generate any variants for scripts because this is almost
    # never what somebody wants.
    # See https://bitbucket.org/pypa/distlib/issue/35/
    maker.variants = {''}
    maker.variants = set(('', ))

    # This is required because otherwise distlib creates scripts that are not
    # executable.
@@ -390,7 +411,7 @@ def move_wheel_files(name, req, wheeldir, user=False, home=None, root=None,
    }

    maker._get_script_text = _get_script_text
    maker.script_template = r"""# -*- coding: utf-8 -*-
    maker.script_template = """# -*- coding: utf-8 -*-
import re
import sys
@@ -467,16 +488,9 @@ if __name__ == '__main__':

    # Generate the console and GUI entry points specified in the wheel
    if len(console) > 0:
        generated_console_scripts = maker.make_multiple(
            ['%s = %s' % kv for kv in console.items()]
        generated.extend(
            maker.make_multiple(['%s = %s' % kv for kv in console.items()])
        )
        generated.extend(generated_console_scripts)

        if warn_script_location:
            msg = message_about_scripts_not_on_PATH(generated_console_scripts)
            if msg is not None:
                logger.warning(msg)

    if len(gui) > 0:
        generated.extend(
            maker.make_multiple(
@@ -500,22 +514,53 @@ if __name__ == '__main__':
        with open_for_csv(temp_record, 'w+') as record_out:
            reader = csv.reader(record_in)
            writer = csv.writer(record_out)
            outrows = []
            for row in reader:
                row[0] = installed.pop(row[0], row[0])
                if row[0] in changed:
                    row[1], row[2] = rehash(row[0])
                outrows.append(tuple(row))
            for f in generated:
                digest, length = rehash(f)
                outrows.append((normpath(f, lib_dir), digest, length))
            for f in installed:
                outrows.append((installed[f], '', ''))
            for row in sorted(outrows):
                writer.writerow(row)
            for f in generated:
                h, l = rehash(f)
                writer.writerow((normpath(f, lib_dir), h, l))
            for f in installed:
                writer.writerow((installed[f], '', ''))
    shutil.move(temp_record, record)


def _unique(fn):
    @functools.wraps(fn)
    def unique(*args, **kw):
        seen = set()
        for item in fn(*args, **kw):
            if item not in seen:
                seen.add(item)
                yield item
    return unique


# TODO: this goes somewhere besides the wheel module
@_unique
def uninstallation_paths(dist):
    """
    Yield all the uninstallation paths for dist based on RECORD-without-.pyc

    Yield paths to all the files in RECORD. For each .py file in RECORD, add
    the .pyc in the same directory.

    UninstallPathSet.add() takes care of the __pycache__ .pyc.
    """
    from pip.utils import FakeFile  # circular import
    r = csv.reader(FakeFile(dist.get_metadata_lines('RECORD')))
    for row in r:
        path = os.path.join(dist.location, row[0])
        yield path
        if path.endswith('.py'):
            dn, fn = os.path.split(path)
            base = fn[:-3]
            path = os.path.join(dn, base + '.pyc')
            yield path
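Each RECORD row pairs an installed path with a digest and a byte length, which is exactly the `(hash, length)` tuple `rehash` computes. A standalone sketch of one such row, using `iter(f.read, b'')` chunking in place of pip's `read_chunks` helper and a throwaway file name (both are illustrative assumptions, not pip's exact output format):

import csv
import hashlib
import io
import os

def rehash(path, algo='sha256', blocksize=1 << 20):
    # Return (hexdigest, length) for path -- the shape used for RECORD rows.
    h = hashlib.new(algo)
    length = 0
    with open(path, 'rb') as f:
        for block in iter(lambda: f.read(blocksize), b''):
            h.update(block)
            length += len(block)
    return h.hexdigest(), str(length)

# Hypothetical generated script file:
with open('demo_script.py', 'w') as f:
    f.write("print('hello')\n")

digest, length = rehash('demo_script.py')
out = io.StringIO()
csv.writer(out).writerow(('demo_script.py', digest, length))
print(out.getvalue().strip())
os.remove('demo_script.py')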
@@ -531,7 +576,7 @@ def wheel_version(source_dir):
        version = wheel_data['Wheel-Version'].strip()
        version = tuple(map(int, version.split('.')))
        return version
    except Exception:
    except:
        return False
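Note that the older side reintroduces a bare `except:`, which also swallows KeyboardInterrupt and SystemExit; that is why the newer side narrows it to `except Exception:`. The parse itself is small enough to sketch with an inline, made-up metadata blob (pip reads this from `<name>.dist-info/WHEEL` on disk):

from email.parser import Parser

# Made-up WHEEL metadata, in the RFC 822 style email.parser expects:
raw = (
    "Wheel-Version: 1.0\n"
    "Generator: bdist_wheel (0.30.0)\n"
    "Root-Is-Purelib: true\n"
)

wheel_data = Parser().parsestr(raw)
version = wheel_data['Wheel-Version'].strip()
version = tuple(map(int, version.split('.')))
print(version)  # (1, 0) -- compared against VERSION_COMPATIBLE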
@@ -570,8 +615,8 @@ class Wheel(object):
    # TODO: maybe move the install code into this class

    wheel_file_re = re.compile(
        r"""^(?P<namever>(?P<name>.+?)-(?P<ver>.*?))
        ((-(?P<build>\d[^-]*?))?-(?P<pyver>.+?)-(?P<abi>.+?)-(?P<plat>.+?)
        r"""^(?P<namever>(?P<name>.+?)-(?P<ver>\d.*?))
        ((-(?P<build>\d.*?))?-(?P<pyver>.+?)-(?P<abi>.+?)-(?P<plat>.+?)
        \.whl|\.dist-info)$""",
        re.VERBOSE
    )
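The filename regex is simple to exercise on its own; a quick check against a realistic wheel name, using the older of the two patterns from this hunk (the one requiring the version to start with a digit):

import re

wheel_file_re = re.compile(
    r"""^(?P<namever>(?P<name>.+?)-(?P<ver>\d.*?))
    ((-(?P<build>\d.*?))?-(?P<pyver>.+?)-(?P<abi>.+?)-(?P<plat>.+?)
    \.whl|\.dist-info)$""",
    re.VERBOSE
)

m = wheel_file_re.match('requests-2.19.1-py2.py3-none-any.whl')
print(m.group('name'), m.group('ver'), m.group('pyver'),
      m.group('abi'), m.group('plat'))
# requests 2.19.1 py2.py3 none any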
@@ -590,16 +635,15 @@ class Wheel(object):
        # we'll assume "_" means "-" due to wheel naming scheme
        # (https://github.com/pypa/pip/issues/1150)
        self.version = wheel_info.group('ver').replace('_', '-')
        self.build_tag = wheel_info.group('build')
        self.pyversions = wheel_info.group('pyver').split('.')
        self.abis = wheel_info.group('abi').split('.')
        self.plats = wheel_info.group('plat').split('.')

        # All the tag combinations from this file
        self.file_tags = {
        self.file_tags = set(
            (x, y, z) for x in self.pyversions
            for y in self.abis for z in self.plats
        }
        )

    def support_index_min(self, tags=None):
        """
@@ -609,66 +653,54 @@ class Wheel(object):
        None if the wheel is not supported.
        """
        if tags is None:  # for mock
            tags = pep425tags.get_supported()
            tags = pep425tags.supported_tags
        indexes = [tags.index(c) for c in self.file_tags if c in tags]
        return min(indexes) if indexes else None

    def supported(self, tags=None):
        """Is this wheel supported on this system?"""
        if tags is None:  # for mock
            tags = pep425tags.get_supported()
            tags = pep425tags.supported_tags
        return bool(set(tags).intersection(self.file_tags))


class WheelBuilder(object):
    """Build wheels from a RequirementSet."""

    def __init__(self, finder, preparer, wheel_cache,
                 build_options=None, global_options=None, no_clean=False):
    def __init__(self, requirement_set, finder, build_options=None,
                 global_options=None):
        self.requirement_set = requirement_set
        self.finder = finder
        self.preparer = preparer
        self.wheel_cache = wheel_cache

        self._wheel_dir = preparer.wheel_download_dir

        self._cache_root = requirement_set._wheel_cache._cache_dir
        self._wheel_dir = requirement_set.wheel_download_dir
        self.build_options = build_options or []
        self.global_options = global_options or []
        self.no_clean = no_clean

    def _build_one(self, req, output_dir, python_tag=None):
        """Build one wheel.

        :return: The filename of the built wheel, or None if the build failed.
        """
        # Install build deps into temporary directory (PEP 518)
        with req.build_env:
            return self._build_one_inside_env(req, output_dir,
                                              python_tag=python_tag)

    def _build_one_inside_env(self, req, output_dir, python_tag=None):
        with TempDirectory(kind="wheel") as temp_dir:
            if self.__build_one(req, temp_dir.path, python_tag=python_tag):
        tempd = tempfile.mkdtemp('pip-wheel-')
        try:
                wheel_name = os.listdir(temp_dir.path)[0]
            if self.__build_one(req, tempd, python_tag=python_tag):
                try:
                    wheel_name = os.listdir(tempd)[0]
                    wheel_path = os.path.join(output_dir, wheel_name)
                    shutil.move(
                        os.path.join(temp_dir.path, wheel_name), wheel_path
                    )
                    shutil.move(os.path.join(tempd, wheel_name), wheel_path)
                    logger.info('Stored in directory: %s', output_dir)
                    return wheel_path
                except Exception:
                except:
                    pass
            # Ignore return, we can't do anything else useful.
            self._clean_one(req)
            return None
        finally:
            rmtree(tempd)

    def _base_setup_args(self, req):
        # NOTE: Eventually, we'd want to also -S to the flags here, when we're
        # isolating. Currently, it breaks Python in virtualenvs, because it
        # relies on site.py to find parts of the standard library outside the
        # virtualenv.
        return [
            sys.executable, '-u', '-c',
            sys.executable, "-u", '-c',
            SETUPTOOLS_SHIM % req.setup_py
        ] + list(self.global_options)
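`support_index_min` simply maps a wheel's tag triples onto their positions in the interpreter's ordered list of supported tags and takes the minimum, so a lower index means a more specific (better) match. A toy illustration with made-up tags (real lists come from pep425tags and are much longer):

# Hypothetical supported-tags list, most specific first, as pep425tags yields:
supported = [
    ('cp37', 'cp37m', 'manylinux1_x86_64'),
    ('cp37', 'cp37m', 'linux_x86_64'),
    ('cp37', 'none', 'any'),
    ('py3', 'none', 'any'),
]

# Tag combinations a pure-Python wheel like name-1.0-py3-none-any.whl carries:
file_tags = {('py3', 'none', 'any')}

indexes = [supported.index(c) for c in file_tags if c in supported]
print(min(indexes) if indexes else None)  # 3 -- supported, but least specific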
@@ -688,7 +720,7 @@ class WheelBuilder(object):
            call_subprocess(wheel_args, cwd=req.setup_py_dir,
                            show_stdout=False, spinner=spinner)
            return True
        except Exception:
        except:
            spinner.finish("error")
            logger.error('Failed building wheel for %s', req.name)
            return False
@@ -701,58 +733,53 @@ class WheelBuilder(object):
        try:
            call_subprocess(clean_args, cwd=req.source_dir, show_stdout=False)
            return True
        except Exception:
        except:
            logger.error('Failed cleaning build dir for %s', req.name)
            return False

    def build(self, requirements, session, autobuilding=False):
    def build(self, autobuilding=False):
        """Build wheels.

        :param unpack: If True, replace the sdist we built from with the
            newly built wheel, in preparation for installation.
        :return: True if all the wheels built correctly.
        """
        from pip._internal import index
        from pip._internal.models.link import Link
        assert self._wheel_dir or (autobuilding and self._cache_root)
        # unpack sdists and constructs req set
        self.requirement_set.prepare_files(self.finder)

        building_is_possible = self._wheel_dir or (
            autobuilding and self.wheel_cache.cache_dir
        )
        assert building_is_possible
        reqset = self.requirement_set.requirements.values()

        buildset = []
        format_control = self.finder.format_control
        for req in requirements:
        for req in reqset:
            if req.constraint:
                continue
            if req.is_wheel:
                if not autobuilding:
                    logger.info(
                        'Skipping %s, due to already being wheel.', req.name,
                    )
                        'Skipping %s, due to already being wheel.', req.name)
            elif autobuilding and req.editable:
                pass
            elif autobuilding and req.link and not req.link.is_artifact:
                pass
            elif autobuilding and not req.source_dir:
                pass
            elif autobuilding and req.link and not req.link.is_artifact:
                # VCS checkout. Build wheel just for this run.
                buildset.append((req, True))
            else:
                ephem_cache = False
                if autobuilding:
                    link = req.link
                    base, ext = link.splitext()
                    if index.egg_info_matches(base, None, link) is None:
                        # E.g. local directory. Build wheel just for this run.
                        ephem_cache = True
                    if "binary" not in format_control.get_allowed_formats(
                    if pip.index.egg_info_matches(base, None, link) is None:
                        # Doesn't look like a package - don't autobuild a wheel
                        # because we'll have no way to lookup the result sanely
                        continue
                    if "binary" not in pip.index.fmt_ctl_formats(
                            self.finder.format_control,
                            canonicalize_name(req.name)):
                        logger.info(
                            "Skipping bdist_wheel for %s, due to binaries "
                            "being disabled for it.", req.name,
                        )
                            "being disabled for it.", req.name)
                        continue
                buildset.append((req, ephem_cache))
                buildset.append(req)

        if not buildset:
            return True
@@ -760,19 +787,15 @@ class WheelBuilder(object):
        # Build the wheels.
        logger.info(
            'Building wheels for collected packages: %s',
            ', '.join([req.name for (req, _) in buildset]),
            ', '.join([req.name for req in buildset]),
        )
        _cache = self.wheel_cache  # shorter name
        with indent_log():
            build_success, build_failure = [], []
            for req, ephem in buildset:
            for req in buildset:
                python_tag = None
                if autobuilding:
                    python_tag = pep425tags.implementation_tag
                    if ephem:
                        output_dir = _cache.get_ephem_path_for_link(req.link)
                    else:
                        output_dir = _cache.get_path_for_link(req.link)
                    output_dir = _cache_for_link(self._cache_root, req.link)
                try:
                    ensure_dir(output_dir)
                except OSError as e:
@@ -803,16 +826,15 @@ class WheelBuilder(object):
                        # set the build directory again - name is known from
                        # the work prepare_files did.
                        req.source_dir = req.build_location(
                            self.preparer.build_dir
                        )
                            self.requirement_set.build_dir)
                        # Update the link for this.
                        req.link = Link(path_to_url(wheel_file))
                        req.link = pip.index.Link(
                            path_to_url(wheel_file))
                        assert req.link.is_wheel
                        # extract the wheel into the dir
                        unpack_url(
                            req.link, req.source_dir, None, False,
                            session=session,
                        )
                            session=self.requirement_set.session)
                    else:
                        build_failure.append(req)
@@ -0,0 +1,3 @@
UNKNOWN


@@ -0,0 +1 @@
pip
Some files were not shown because too many files have changed in this diff.