diff --git a/lib/python3.4/site-packages/SQLAlchemy-1.0.12.dist-info/DESCRIPTION.rst b/lib/python3.4/site-packages/SQLAlchemy-1.0.12.dist-info/DESCRIPTION.rst deleted file mode 100644 index 479eaf5..0000000 --- a/lib/python3.4/site-packages/SQLAlchemy-1.0.12.dist-info/DESCRIPTION.rst +++ /dev/null @@ -1,137 +0,0 @@ -SQLAlchemy -========== - -The Python SQL Toolkit and Object Relational Mapper - -Introduction -------------- - -SQLAlchemy is the Python SQL toolkit and Object Relational Mapper -that gives application developers the full power and -flexibility of SQL. SQLAlchemy provides a full suite -of well known enterprise-level persistence patterns, -designed for efficient and high-performing database -access, adapted into a simple and Pythonic domain -language. - -Major SQLAlchemy features include: - -* An industrial strength ORM, built - from the core on the identity map, unit of work, - and data mapper patterns. These patterns - allow transparent persistence of objects - using a declarative configuration system. - Domain models - can be constructed and manipulated naturally, - and changes are synchronized with the - current transaction automatically. -* A relationally-oriented query system, exposing - the full range of SQL's capabilities - explicitly, including joins, subqueries, - correlation, and most everything else, - in terms of the object model. - Writing queries with the ORM uses the same - techniques of relational composition you use - when writing SQL. While you can drop into - literal SQL at any time, it's virtually never - needed. -* A comprehensive and flexible system - of eager loading for related collections and objects. - Collections are cached within a session, - and can be loaded on individual access, all - at once using joins, or by query per collection - across the full result set. -* A Core SQL construction system and DBAPI - interaction layer. 
The SQLAlchemy Core is - separate from the ORM and is a full database - abstraction layer in its own right, and includes - an extensible Python-based SQL expression - language, schema metadata, connection pooling, - type coercion, and custom types. -* All primary and foreign key constraints are - assumed to be composite and natural. Surrogate - integer primary keys are of course still the - norm, but SQLAlchemy never assumes or hardcodes - to this model. -* Database introspection and generation. Database - schemas can be "reflected" in one step into - Python structures representing database metadata; - those same structures can then generate - CREATE statements right back out - all within - the Core, independent of the ORM. - -SQLAlchemy's philosophy: - -* SQL databases behave less and less like object - collections the more size and performance start to - matter; object collections behave less and less like - tables and rows the more abstraction starts to matter. - SQLAlchemy aims to accommodate both of these - principles. -* An ORM doesn't need to hide the "R". A relational - database provides rich, set-based functionality - that should be fully exposed. SQLAlchemy's - ORM provides an open-ended set of patterns - that allow a developer to construct a custom - mediation layer between a domain model and - a relational schema, turning the so-called - "object relational impedance" issue into - a distant memory. -* The developer, in all cases, makes all decisions - regarding the design, structure, and naming conventions - of both the object model as well as the relational - schema. SQLAlchemy only provides the means - to automate the execution of these decisions. -* With SQLAlchemy, there's no such thing as - "the ORM generated a bad query" - you - retain full control over the structure of - queries, including how joins are organized, - how subqueries and correlation is used, what - columns are requested. 
Everything SQLAlchemy - does is ultimately the result of a developer- - initiated decision. -* Don't use an ORM if the problem doesn't need one. - SQLAlchemy consists of a Core and separate ORM - component. The Core offers a full SQL expression - language that allows Pythonic construction - of SQL constructs that render directly to SQL - strings for a target database, returning - result sets that are essentially enhanced DBAPI - cursors. -* Transactions should be the norm. With SQLAlchemy's - ORM, nothing goes to permanent storage until - commit() is called. SQLAlchemy encourages applications - to create a consistent means of delineating - the start and end of a series of operations. -* Never render a literal value in a SQL statement. - Bound parameters are used to the greatest degree - possible, allowing query optimizers to cache - query plans effectively and making SQL injection - attacks a non-issue. - -Documentation -------------- - -Latest documentation is at: - -http://www.sqlalchemy.org/docs/ - -Installation / Requirements ---------------------------- - -Full documentation for installation is at -`Installation `_. - -Getting Help / Development / Bug reporting ------------------------------------------- - -Please refer to the `SQLAlchemy Community Guide `_. - -License -------- - -SQLAlchemy is distributed under the `MIT license -`_. 
- - - diff --git a/lib/python3.4/site-packages/SQLAlchemy-1.0.12.dist-info/METADATA b/lib/python3.4/site-packages/SQLAlchemy-1.0.12.dist-info/METADATA deleted file mode 100644 index e5ec428..0000000 --- a/lib/python3.4/site-packages/SQLAlchemy-1.0.12.dist-info/METADATA +++ /dev/null @@ -1,158 +0,0 @@ -Metadata-Version: 2.0 -Name: SQLAlchemy -Version: 1.0.12 -Summary: Database Abstraction Library -Home-page: http://www.sqlalchemy.org -Author: Mike Bayer -Author-email: mike_mp@zzzcomputing.com -License: MIT License -Description-Content-Type: UNKNOWN -Platform: UNKNOWN -Classifier: Development Status :: 5 - Production/Stable -Classifier: Intended Audience :: Developers -Classifier: License :: OSI Approved :: MIT License -Classifier: Programming Language :: Python -Classifier: Programming Language :: Python :: 3 -Classifier: Programming Language :: Python :: Implementation :: CPython -Classifier: Programming Language :: Python :: Implementation :: Jython -Classifier: Programming Language :: Python :: Implementation :: PyPy -Classifier: Topic :: Database :: Front-Ends -Classifier: Operating System :: OS Independent - -SQLAlchemy -========== - -The Python SQL Toolkit and Object Relational Mapper - -Introduction -------------- - -SQLAlchemy is the Python SQL toolkit and Object Relational Mapper -that gives application developers the full power and -flexibility of SQL. SQLAlchemy provides a full suite -of well known enterprise-level persistence patterns, -designed for efficient and high-performing database -access, adapted into a simple and Pythonic domain -language. - -Major SQLAlchemy features include: - -* An industrial strength ORM, built - from the core on the identity map, unit of work, - and data mapper patterns. These patterns - allow transparent persistence of objects - using a declarative configuration system. - Domain models - can be constructed and manipulated naturally, - and changes are synchronized with the - current transaction automatically. 
-* A relationally-oriented query system, exposing - the full range of SQL's capabilities - explicitly, including joins, subqueries, - correlation, and most everything else, - in terms of the object model. - Writing queries with the ORM uses the same - techniques of relational composition you use - when writing SQL. While you can drop into - literal SQL at any time, it's virtually never - needed. -* A comprehensive and flexible system - of eager loading for related collections and objects. - Collections are cached within a session, - and can be loaded on individual access, all - at once using joins, or by query per collection - across the full result set. -* A Core SQL construction system and DBAPI - interaction layer. The SQLAlchemy Core is - separate from the ORM and is a full database - abstraction layer in its own right, and includes - an extensible Python-based SQL expression - language, schema metadata, connection pooling, - type coercion, and custom types. -* All primary and foreign key constraints are - assumed to be composite and natural. Surrogate - integer primary keys are of course still the - norm, but SQLAlchemy never assumes or hardcodes - to this model. -* Database introspection and generation. Database - schemas can be "reflected" in one step into - Python structures representing database metadata; - those same structures can then generate - CREATE statements right back out - all within - the Core, independent of the ORM. - -SQLAlchemy's philosophy: - -* SQL databases behave less and less like object - collections the more size and performance start to - matter; object collections behave less and less like - tables and rows the more abstraction starts to matter. - SQLAlchemy aims to accommodate both of these - principles. -* An ORM doesn't need to hide the "R". A relational - database provides rich, set-based functionality - that should be fully exposed. 
SQLAlchemy's - ORM provides an open-ended set of patterns - that allow a developer to construct a custom - mediation layer between a domain model and - a relational schema, turning the so-called - "object relational impedance" issue into - a distant memory. -* The developer, in all cases, makes all decisions - regarding the design, structure, and naming conventions - of both the object model as well as the relational - schema. SQLAlchemy only provides the means - to automate the execution of these decisions. -* With SQLAlchemy, there's no such thing as - "the ORM generated a bad query" - you - retain full control over the structure of - queries, including how joins are organized, - how subqueries and correlation is used, what - columns are requested. Everything SQLAlchemy - does is ultimately the result of a developer- - initiated decision. -* Don't use an ORM if the problem doesn't need one. - SQLAlchemy consists of a Core and separate ORM - component. The Core offers a full SQL expression - language that allows Pythonic construction - of SQL constructs that render directly to SQL - strings for a target database, returning - result sets that are essentially enhanced DBAPI - cursors. -* Transactions should be the norm. With SQLAlchemy's - ORM, nothing goes to permanent storage until - commit() is called. SQLAlchemy encourages applications - to create a consistent means of delineating - the start and end of a series of operations. -* Never render a literal value in a SQL statement. - Bound parameters are used to the greatest degree - possible, allowing query optimizers to cache - query plans effectively and making SQL injection - attacks a non-issue. - -Documentation -------------- - -Latest documentation is at: - -http://www.sqlalchemy.org/docs/ - -Installation / Requirements ---------------------------- - -Full documentation for installation is at -`Installation `_. 
- -Getting Help / Development / Bug reporting ------------------------------------------- - -Please refer to the `SQLAlchemy Community Guide `_. - -License -------- - -SQLAlchemy is distributed under the `MIT license -`_. - - - diff --git a/lib/python3.4/site-packages/SQLAlchemy-1.0.12.dist-info/RECORD b/lib/python3.4/site-packages/SQLAlchemy-1.0.12.dist-info/RECORD deleted file mode 100644 index 057fad3..0000000 --- a/lib/python3.4/site-packages/SQLAlchemy-1.0.12.dist-info/RECORD +++ /dev/null @@ -1,376 +0,0 @@ -SQLAlchemy-1.0.12.dist-info/DESCRIPTION.rst,sha256=ZN8fj2owI_rw0Emr3_RXqoNfTFkThjiZy7xcCzg1W_g,5013 -SQLAlchemy-1.0.12.dist-info/METADATA,sha256=xCBLJSNub29eg_Bm-fHTUT_al-Sr8jh38ztUF4_s1so,5820 -SQLAlchemy-1.0.12.dist-info/RECORD,, -SQLAlchemy-1.0.12.dist-info/WHEEL,sha256=AEztX7vHDtcgysb-4-5-DyIKMLIPg6NMxY9dXTRdoXQ,104 -SQLAlchemy-1.0.12.dist-info/metadata.json,sha256=QT7EcApgL9QrRqR1YIngngveBNd13H8h-oNK9fsxj0U,1004 -SQLAlchemy-1.0.12.dist-info/top_level.txt,sha256=rp-ZgB7D8G11ivXON5VGPjupT1voYmWqkciDt5Uaw_Q,11 -sqlalchemy/__init__.py,sha256=fTurvwmGkoRt_zdwxoZNWTHg6VdzvBpeHyPmUnexOK4,2112 -sqlalchemy/cprocessors.cpython-34m.so,sha256=hvG3A0r4VO9gevdsLGZYRdqNfG2rahDIFUqJ-fUxAB4,52136 -sqlalchemy/cresultproxy.cpython-34m.so,sha256=piAFu3JE3mOaKpNSg6vcu8jGTl_-X6elUDWS2h_YOfQ,61504 -sqlalchemy/cutils.cpython-34m.so,sha256=-ARQsTXx0XDzghnRNCwdaxm2eeIn2TuEqoU_Wb18h6E,34312 -sqlalchemy/events.py,sha256=j8yref-XfuJxkPKbvnZmB4jeUAIujPcbLAzD2cKV4f4,43944 -sqlalchemy/exc.py,sha256=NhA5R5nDdducWkp0MXtlQ0-Q6iF_rhqkHWblIfuSYGk,11706 -sqlalchemy/inspection.py,sha256=zMa-2nt-OQ0Op1dqq0Z2XCnpdAFSTkqif5Kdi8Wz8AU,3093 -sqlalchemy/interfaces.py,sha256=XSx5y-HittAzc79lU4C7rPbTtSW_Hc2c89NqCy50tsQ,10967 -sqlalchemy/log.py,sha256=opX7UORq5N6_jWxN9aHX9OpiirwAcRA0qq-u5m4SMkQ,6712 -sqlalchemy/pool.py,sha256=-F51TIJYl0XGTV2_sdpV8C1m0jTTQaq0nAezdmSgr84,47220 -sqlalchemy/processors.py,sha256=Li1kdC-I0v03JxeOz4V7u4HAevK6LledyCPvaL06mYc,5220 
-sqlalchemy/schema.py,sha256=rZzZJJ8dT9trLSYknFpHm0N1kRERYwhqHH3QD31SJjc,1182 -sqlalchemy/types.py,sha256=qcoy5xKaurDV4kaXr489GL2sz8FKkWX21Us3ZCqeasg,1650 -sqlalchemy/connectors/__init__.py,sha256=97YbriYu5mcljh7opc1JOScRlf3Tk8ldbn5urBVm4WY,278 -sqlalchemy/connectors/mxodbc.py,sha256=-0iqw2k8e-o3OkAKzoCWuAaEPxlEjslvfRM9hnVXENM,5348 -sqlalchemy/connectors/pyodbc.py,sha256=pG2yf3cEDtTr-w_m4to6jF5l8hZk6MJv69K3cg84NfY,6264 -sqlalchemy/connectors/zxJDBC.py,sha256=2KK_sVSgMsdW0ufZqAwgXjd1FsMb4hqbiUQRAkM0RYg,1868 -sqlalchemy/databases/__init__.py,sha256=BaQyAuMjXNpZYV47hCseHrDtPzTfSw-iqUQYxMWJddw,817 -sqlalchemy/dialects/__init__.py,sha256=7SMul8PL3gkbJRUwAwovHLae5qBBApRF-VcRwU-VtdU,1012 -sqlalchemy/dialects/postgres.py,sha256=heNVHys6E91DIBepXT3ls_4_6N8HTTahrZ49W5IR3M0,614 -sqlalchemy/dialects/firebird/__init__.py,sha256=QYmQ0SaGfq3YjDraCV9ALwqVW5A3KDUF0F6air_qp3Q,664 -sqlalchemy/dialects/firebird/base.py,sha256=IT0prWkh1TFSTke-BqGdVMGdof53zmWWk6zbJZ_TuuI,28170 -sqlalchemy/dialects/firebird/fdb.py,sha256=l4s6_8Z0HvqxgqGz0LNcKWP1qUmEc3M2XM718_drN34,4325 -sqlalchemy/dialects/firebird/kinterbasdb.py,sha256=kCsn2ed4u9fyjcyfEI3rXQdKvL05z9wtf5YjW9-NrvI,6299 -sqlalchemy/dialects/mssql/__init__.py,sha256=G12xmirGZgMzfUKZCA8BFfaCmqUDuYca9Fu2VP_eaks,1081 -sqlalchemy/dialects/mssql/adodbapi.py,sha256=dHZgS3pEDX39ixhlDfTtDcjCq6rdjF85VS7rIZ1TfYo,2493 -sqlalchemy/dialects/mssql/base.py,sha256=xqRmK_npoyH5gl626EjazVnu9TEArmrBIFme_avYFUg,66855 -sqlalchemy/dialects/mssql/information_schema.py,sha256=pwuTsgOCY5eSBW9w-g-pyJDRfyuZ_rOEXXNYRuAroCE,6418 -sqlalchemy/dialects/mssql/mxodbc.py,sha256=G9LypIeEizgxeShtDu2M7Vwm8NopnzaTmnZMD49mYeg,3856 -sqlalchemy/dialects/mssql/pymssql.py,sha256=w92w4YQzXdHb53AjCrBcIRHsf6jmie1iN9H7gJNGX4k,3079 -sqlalchemy/dialects/mssql/pyodbc.py,sha256=KRke1Hizrg3r5iYqxdBI0axXVQ_pZR_UPxLaAdF0mKk,9473 -sqlalchemy/dialects/mssql/zxjdbc.py,sha256=u4uBgwk0LbI7_I5CIvM3C4bBb0pmrw2_DqRh_ehJTkI,2282 
-sqlalchemy/dialects/mysql/__init__.py,sha256=3cQ2juPT8LsZTicPa2J-0rCQjQIQaPgyBzxjV3O_7xs,1171 -sqlalchemy/dialects/mysql/base.py,sha256=rwC8fnhGZaAnsPB1Jhg4sTcrWE2hjxrZJ5deCS0rAOc,122869 -sqlalchemy/dialects/mysql/cymysql.py,sha256=nqsdQA8LBLIc6eilgX6qwkjm7szsUoqMTVYwK9kkfsE,2349 -sqlalchemy/dialects/mysql/gaerdbms.py,sha256=2MxtTsIqlpq_J32HHqDzz-5vu-mC51Lb7PvyGkJa73M,3387 -sqlalchemy/dialects/mysql/mysqlconnector.py,sha256=DMDm684Shk-ijVo7w-yidopYw7EC6EiOmJY56EPawok,5323 -sqlalchemy/dialects/mysql/mysqldb.py,sha256=McqROngxAknbLOXoUAG9o9mP9FQBLs-ouD-JqqI2Ses,6564 -sqlalchemy/dialects/mysql/oursql.py,sha256=rmdr-r66iJ2amqFeGvCohvE8WCl_i6R9KcgVG0uXOQs,8124 -sqlalchemy/dialects/mysql/pymysql.py,sha256=e-qehI-sASmAjEa0ajHqjZjlyJYWsb3RPQY4iBR5pz0,1504 -sqlalchemy/dialects/mysql/pyodbc.py,sha256=Ze9IOKw6ANVQj25IlmSGR8aaJhM0pMuRtbzKF7UsZCY,2665 -sqlalchemy/dialects/mysql/zxjdbc.py,sha256=LIhe2mHSRVgi8I7qmiTMVBRSpuWJVnuDtpHTUivIx0M,3942 -sqlalchemy/dialects/oracle/__init__.py,sha256=UhF2ZyPfT3EFAnP8ZjGng6GnWSzmAkjMax0Lucpn0Bg,797 -sqlalchemy/dialects/oracle/base.py,sha256=2KJO-sU2CVKK1rij6bAQ5ZFJv203_NmzT8dE5qor9wc,55961 -sqlalchemy/dialects/oracle/cx_oracle.py,sha256=-d5tHbNcCyjbgVtAvWfHgSY2yA8C9bvCzxhwkdWFNe0,38635 -sqlalchemy/dialects/oracle/zxjdbc.py,sha256=nC7XOCY3NdTLrEyIacNTnLDCaeVjWn59q8UYssJL8Wo,8112 -sqlalchemy/dialects/postgresql/__init__.py,sha256=SjCtM5b3EaGyRaTyg_i82sh_qjkLEIVUXW91XDihiCM,1299 -sqlalchemy/dialects/postgresql/base.py,sha256=xhdLeHuWioTv9LYW41pcIPsEjD2fyeh7JflkLKmZMB8,104230 -sqlalchemy/dialects/postgresql/constraints.py,sha256=8UDx_2TNQgqIUSRETZPhgninJigQ6rMfdRNI6vIt3Is,3119 -sqlalchemy/dialects/postgresql/hstore.py,sha256=n8Wsd7Uldk3bbg66tTa0NKjVqjhJUbF1mVeUsM7keXA,11402 -sqlalchemy/dialects/postgresql/json.py,sha256=MTlIGinMDa8iaVbZMOzYnremo0xL4tn2wyGTPwnvX6U,12215 -sqlalchemy/dialects/postgresql/pg8000.py,sha256=x6o3P8Ad0wKsuF9qeyip39BKc5ORJZ4nWxv-8qOdj0E,8375 
-sqlalchemy/dialects/postgresql/psycopg2.py,sha256=4ac0upErNRJz6YWJYNbATCU3ncWFvat5kal_Cuq-Jhw,26953 -sqlalchemy/dialects/postgresql/psycopg2cffi.py,sha256=8R3POkJH8z8a2DxwKNmfmQOsxFqsg4tU_OnjGj3OfDA,1651 -sqlalchemy/dialects/postgresql/pypostgresql.py,sha256=raQRfZb8T9-c-jmq1w86Wci5QyiXgf_9_71OInT_sAw,2655 -sqlalchemy/dialects/postgresql/ranges.py,sha256=MihdGXMdmCM6ToIlrj7OJx9Qh_8BX8bv5PSaAepHmII,4814 -sqlalchemy/dialects/postgresql/zxjdbc.py,sha256=AhEGRiAy8q-GM0BStFcsLBgSwjxHkkwy2-BSroIoADo,1397 -sqlalchemy/dialects/sqlite/__init__.py,sha256=0wW0VOhE_RtFDpRcbwvvo3XtD6Y2-SDgG4K7468eh_w,736 -sqlalchemy/dialects/sqlite/base.py,sha256=_L9-854ITf8Fl2BgUymF9fKjDFvXSo7Pb2yuz1CMkDo,55007 -sqlalchemy/dialects/sqlite/pysqlcipher.py,sha256=sgXCqn8ZtNIeTDwyo253Kj5mn4TPlIW3AZCNNmURi2A,4129 -sqlalchemy/dialects/sqlite/pysqlite.py,sha256=G-Cg-iI-ErYsVjOH4UlQTEY9pLnLOLV89ik8q0-reuY,14980 -sqlalchemy/dialects/sybase/__init__.py,sha256=gwCgFR_C_hoj0Re7PiaW3zmKSWaLpsd96UVXdM7EnTM,894 -sqlalchemy/dialects/sybase/base.py,sha256=Xpl3vEd5VDyvoIRMg0DZa48Or--yBSrhaZ2CbTSCt0w,28853 -sqlalchemy/dialects/sybase/mxodbc.py,sha256=E_ask6yFSjyhNPvv7gQsvA41WmyxbBvRGWjCyPVr9Gs,901 -sqlalchemy/dialects/sybase/pyodbc.py,sha256=0a_gKwrIweJGcz3ZRYuQZb5BIvwjGmFEYBo9wGk66kI,2102 -sqlalchemy/dialects/sybase/pysybase.py,sha256=tu2V_EbtgxWYOvt-ybo5_lLiBQzsIFaAtF8e7S1_-rk,3208 -sqlalchemy/engine/__init__.py,sha256=fyIFw2R5wfLQzSbfE9Jz-28ZDP5RyB-5elNH92uTZYM,18803 -sqlalchemy/engine/base.py,sha256=cRqbbG0QuUG-NGs3GOPVQsU0WLsw5bLT0Y07Yf8OOfU,79399 -sqlalchemy/engine/default.py,sha256=U_yaliCazUHp6cfk_NVzhB4F_zOJSyy959rHyk40J4M,36548 -sqlalchemy/engine/interfaces.py,sha256=CmPYM_oDp1zAPH13sKmufO4Tuha6KA-fXRQq-K_3YTE,35908 -sqlalchemy/engine/reflection.py,sha256=jly5YN-cyjoBDxHs9qO6Mlgm1OZSb2NBNFALwZMEGxE,28590 -sqlalchemy/engine/result.py,sha256=ot5RQxa6kjoScXRUR-DTl0iJJISBhmyNTj1JZkZiNsk,44027 -sqlalchemy/engine/strategies.py,sha256=mwy-CTrnXzyaIA1TRQBQ_Z2O8wN0lnTNZwDefEWCR9A,8929 
-sqlalchemy/engine/threadlocal.py,sha256=y4wOLjtbeY-dvp2GcJDtos6F2jzfP11JVAaSFwZ0zRM,4191 -sqlalchemy/engine/url.py,sha256=ZhS_Iqiu6V1kfIM2pcv3ud9fOPXkFOHBv8wiLOqbJhc,8228 -sqlalchemy/engine/util.py,sha256=Tvb9sIkyd6qOwIA-RsBmo5j877UXa5x-jQmhqnhHWRA,2338 -sqlalchemy/event/__init__.py,sha256=KnUVp-NVX6k276ntGffxgkjVmIWR22FSlzrbAKqQ6S4,419 -sqlalchemy/event/api.py,sha256=O2udbj5D7HdXcvsGBQk6-dK9CAFfePTypWOrUdqmhYY,5990 -sqlalchemy/event/attr.py,sha256=VfRJJl4RD24mQaIoDwArWL2hsGOX6ISSU6vKusVMNO0,12053 -sqlalchemy/event/base.py,sha256=DWDKZV19fFsLavu2cXOxXV8NhO3XuCbKcKamBKyXuME,9540 -sqlalchemy/event/legacy.py,sha256=ACnVeBUt8uwVfh1GNRu22cWCADC3CWZdrsBKzAd6UQQ,5814 -sqlalchemy/event/registry.py,sha256=13wx1qdEmcQeCoAmgf_WQEMuR43h3v7iyd2Re54QdOE,7786 -sqlalchemy/ext/__init__.py,sha256=smCZIGgjJprT4ddhuYSLZ8PrTn4NdXPP3j03a038SdE,322 -sqlalchemy/ext/associationproxy.py,sha256=y61Y4UIZNBit5lqk2WzdHTCXIWRrBg3hHbRVsqXjnqE,33422 -sqlalchemy/ext/automap.py,sha256=Aet-3zk2vbsJVLqigwZJYau0hB1D6Y21K65QVWeB5pc,41567 -sqlalchemy/ext/baked.py,sha256=BnVaB4pkQxHk-Fyz4nUw225vCxO_zrDuVC6t5cSF9x8,16967 -sqlalchemy/ext/compiler.py,sha256=aSSlySoTsqN-JkACWFIhv3pq2CuZwxKm6pSDfQoc10Q,16257 -sqlalchemy/ext/horizontal_shard.py,sha256=XEBYIfs0YrTt_2vRuaBY6C33ZOZMUHQb2E4X2s3Szns,4814 -sqlalchemy/ext/hybrid.py,sha256=wNXvuYEEmKy-Nc6z7fu1c2gNWCMOiQA0N14Y3FCq5lo,27989 -sqlalchemy/ext/instrumentation.py,sha256=HRgNiuYJ90_uSKC1iDwsEl8_KXscMQkEb9KeElk-yLE,14856 -sqlalchemy/ext/mutable.py,sha256=lx7b_ewFVe7O6I4gTXdi9M6C6TqxWCFiViqCM2VwUac,25444 -sqlalchemy/ext/orderinglist.py,sha256=UCkuZxTWAQ0num-b5oNm8zNJAmVuIFcbFXt5e7JPx-U,13816 -sqlalchemy/ext/serializer.py,sha256=fK3N1miYF16PSIZDjLFS2zI7y-scZ9qtmopXIfzPqrA,5586 -sqlalchemy/ext/declarative/__init__.py,sha256=Jpwf2EukqwNe4RzDfCmX1p-hQ6pPhJEIL_xunaER3tw,756 -sqlalchemy/ext/declarative/api.py,sha256=PdoO_jh50TWaMvXqnjNh-vX42VqB75ZyliluilphvsU,23317 -sqlalchemy/ext/declarative/base.py,sha256=96SJBOfxpTMsU2jAHrvuXbsjUUJ7TvbLm11R8Hy2Irc,25231 
-sqlalchemy/ext/declarative/clsregistry.py,sha256=jaLLSr-66XvLnA1Z9kxjKatH_XHxWchqEXMKwvjKAXk,10817 -sqlalchemy/orm/__init__.py,sha256=UzDockQEVMaWvr-FE4y1rptrMb5uX5k8v_UNQs82qFY,8033 -sqlalchemy/orm/attributes.py,sha256=OmXkppJEZxRGc0acZZZkSbUhdfDl8ry3Skmvzl3OtLQ,56510 -sqlalchemy/orm/base.py,sha256=F0aRZGK2_1F8phwBHnVYaChkAb-nnTRoFE1VKSvmAwA,14634 -sqlalchemy/orm/collections.py,sha256=TFutWIn_c07DI48FDOKMsFMnAoQB3BG2FnEMGzEF3iI,53549 -sqlalchemy/orm/dependency.py,sha256=phB8nS1788FSd4dWa2j9d4uj6QFlRL7nzcXvh3Bb7Zo,46192 -sqlalchemy/orm/deprecated_interfaces.py,sha256=A63t6ivbZB3Wq8vWgL8I05uTRR6whcWnIPkquuTIPXU,18254 -sqlalchemy/orm/descriptor_props.py,sha256=uk5r77w1VUWVgn0bkgOItkAlMh9FRgeT6OCgOHz3_bM,25141 -sqlalchemy/orm/dynamic.py,sha256=I_YP7X-H9HLjeFHmYgsOas6JPdqg0Aqe0kaltt4HVzA,13283 -sqlalchemy/orm/evaluator.py,sha256=Hozggsd_Fi0YyqHrr9-tldtOA9NLX0MVBF4e2vSM6GY,4731 -sqlalchemy/orm/events.py,sha256=yRaoXlBL78b3l11itTrAy42UhLu42-7cgXKCFUGNXSg,69410 -sqlalchemy/orm/exc.py,sha256=P5lxi5RMFokiHL136VBK0AP3UmAlJcSDHtzgo-M6Kgs,5439 -sqlalchemy/orm/identity.py,sha256=zsb8xOZaPYKvs4sGhyxW21mILQDrtdSuzD4sTyeKdJs,9021 -sqlalchemy/orm/instrumentation.py,sha256=xtq9soM3mpMws7xqNJIFYXqKw65p2nnxCTfmMpuvpeI,17510 -sqlalchemy/orm/interfaces.py,sha256=AqitvZ_BBkB6L503uhdH55nxHplleJ2kQMwM7xKq9Sc,21552 -sqlalchemy/orm/loading.py,sha256=cjC8DQ5g8_rMxroYrYHfW5s35Z5OFSNBUu0-LpxW7hI,22878 -sqlalchemy/orm/mapper.py,sha256=sfooeslzwWAKN7WNIQoZ2Y3u_mCyIxd0tebp4yEUu8k,115074 -sqlalchemy/orm/path_registry.py,sha256=8Pah0P8yPVUyRjoET7DvIMGtM5PC8HZJC4GtxAyqVAs,8370 -sqlalchemy/orm/persistence.py,sha256=WzUUNm1UGm5mGxbv94hLTQowEDNoXfU1VoyGnoKeN_g,51028 -sqlalchemy/orm/properties.py,sha256=HR3eoY3Ze3FUPPNCXM_FruWz4pEMWrGlqtCGiK2G1qE,10426 -sqlalchemy/orm/query.py,sha256=2q2XprzbZhIlAbs0vihIr9dgqfJtcbrjNewgE9q26gE,147616 -sqlalchemy/orm/relationships.py,sha256=79LRGGz8MxsKsAlv0vuZ6MYZXzDXXtfiOCZg-IQ9hiU,116992 
-sqlalchemy/orm/scoping.py,sha256=Ao-K4iqg4pBp7Si5JOAlro5zUL_r500TC3lVLcFMLDs,6421 -sqlalchemy/orm/session.py,sha256=yctpvCsLUcFv9Sy8keT1SElZ2VH5DNScYtO7Z77ptYI,111314 -sqlalchemy/orm/state.py,sha256=4LwwftOtPQldH12SKZV2UFgzqPOCj40QfQ08knZs0_E,22984 -sqlalchemy/orm/strategies.py,sha256=rdLEs2pPrF8nqcQqezyG-fGdmE11r22fUva4ES3KGOE,58529 -sqlalchemy/orm/strategy_options.py,sha256=_z7ZblWCnXh8bZpGSOXDoUwtdUqnXdCaWfKXYDgCuH0,34973 -sqlalchemy/orm/sync.py,sha256=B-d-H1Gzw1TkflpvgJeQghwTzqObzhZCQdvEdSPyDeE,5451 -sqlalchemy/orm/unitofwork.py,sha256=EQvZ7RZ-u5wJT51BWTeMJJi-tt22YRnmqywGUCn0Qrc,23343 -sqlalchemy/orm/util.py,sha256=Mj3NXDd8Mwp4O5Vr5zvRGFUZRlB65WpExdDBFJp04wQ,38092 -sqlalchemy/sql/__init__.py,sha256=IFCJYIilmmAQRnSDhv9Y6LQUSpx6pUU5zp9VT7sOx0c,1737 -sqlalchemy/sql/annotation.py,sha256=8ncgAVUo5QCoinApKjREi8esWNMFklcBqie8Q42KsaQ,6136 -sqlalchemy/sql/base.py,sha256=TuXOp7z0Q30qKAjhgcsts6WGvRbvg6F7OBojMQAxjX0,20990 -sqlalchemy/sql/compiler.py,sha256=G0Ft_Dmq1AousO66eagPhI0g9Vkqui_c_LjqY0AbImU,100710 -sqlalchemy/sql/crud.py,sha256=X86dyvzEnbj0-oeJO5ufi6zXxbSKBtDeu5JHlNg-BJU,19837 -sqlalchemy/sql/ddl.py,sha256=nkjd_B4lKwC2GeyPjE0ZtRB9RKXccQL1g1XoZ4p69sM,37540 -sqlalchemy/sql/default_comparator.py,sha256=QaowWtW4apULq_aohDvmj97j0sDtHQQjMRdNxXm83vk,10447 -sqlalchemy/sql/dml.py,sha256=7846H52IMJfMYi5Jd-Cv6Hy9hZM4dkonXbjfBjl5ED4,33330 -sqlalchemy/sql/elements.py,sha256=MLeecC5dMqeekZmFbPn0J-ODKJj5DBDE5v6kuSkq66I,132898 -sqlalchemy/sql/expression.py,sha256=vFZ9MmBlC9Fg8IYzLMAwXgcsnXZhkZbUstY6dO8BFGY,5833 -sqlalchemy/sql/functions.py,sha256=ZYKyvPnVKZMtHyyjyNwK0M5UWPrZmFz3vtTqHN-8658,18533 -sqlalchemy/sql/naming.py,sha256=foE2lAzngLCFXCeHrpv0S4zT23GCnZLCiata2MPo0kE,4662 -sqlalchemy/sql/operators.py,sha256=UeZgb7eRhWd4H7OfJZkx0ZWOjvo5chIUXQsBAIeeTDY,23013 -sqlalchemy/sql/schema.py,sha256=awhLY5YjUBah8ZYxW9FBfe6lH0v4fW0UJLTNApnx7E0,145511 -sqlalchemy/sql/selectable.py,sha256=o1Hom00WGHjI21Mdb5fkX-f0k2nksQNb_txT0KWK1zQ,118995 
-sqlalchemy/sql/sqltypes.py,sha256=JGxizqIjO1WFuZpppWj1Yi5cvCyBczb1JqUQeuhQn8s,54879 -sqlalchemy/sql/type_api.py,sha256=Xe6yH4slgdLA8HRjT19GBOou51SS9o4oUhyK0xfn04c,42846 -sqlalchemy/sql/util.py,sha256=7AsOsyhIq2eSLMWtwvqfTLc2MdCotGzEKQKFE3wk5sk,20382 -sqlalchemy/sql/visitors.py,sha256=4ipGvAkqFaSAWgyNuKjx5x_ms8GIy9aq-wC5pj4-Z3g,10271 -sqlalchemy/testing/__init__.py,sha256=MwKimX0atzs_SmG2j74GXLiyI8O56e3DLq96tcoL0TM,1095 -sqlalchemy/testing/assertions.py,sha256=r1I2nHC599VZcY-5g0JYRQl8bl9kjkf6WFOooOmJ2eE,16112 -sqlalchemy/testing/assertsql.py,sha256=-fP9Iuhdu52BJoT1lEj_KED8jy5ay_XiJu7i4Ry9eWA,12335 -sqlalchemy/testing/config.py,sha256=nqvVm55Vk0BVNjk1Wj3aYR65j_EEEepfB-W9QSFLU-k,2469 -sqlalchemy/testing/distutils_run.py,sha256=tkURrZRwgFiSwseKm1iJRkSjKf2Rtsb3pOXRWtACTHI,247 -sqlalchemy/testing/engines.py,sha256=u6GlDMXt0FKqVTQe_QJ5JXAnkA6W-xdw6Fe_5gMAQhg,9359 -sqlalchemy/testing/entities.py,sha256=IXqTgAihV-1TZyxL0MWdZzu4rFtxdbWKWFetIJWNGM4,2992 -sqlalchemy/testing/exclusions.py,sha256=WuH_tVK5fZJWe8Hu2LzNB4HNQMa_iAUaGC-_6mHUdIM,12570 -sqlalchemy/testing/fixtures.py,sha256=q4nK-81z2EWs17TjeJtPmnaJUCtDdoUiIU7jgLq3l_w,10721 -sqlalchemy/testing/mock.py,sha256=vj5q-GzJrLW6mMVDLqsppxBu_p7K49VvjfiVt5tn0o8,630 -sqlalchemy/testing/pickleable.py,sha256=8I8M4H1XN29pZPMxZdYkmpKWfwzPsUn6WK5FX4UP9L4,2641 -sqlalchemy/testing/profiling.py,sha256=Q_wOTS5JtcGBcs2eCYIvoRoDS_FW_HcfEW3hXWB87Zg,8392 -sqlalchemy/testing/provision.py,sha256=mU9g6JZEHIshqUkE6PWu-t61FVPs_cUJtEtVFRavj9g,9377 -sqlalchemy/testing/replay_fixture.py,sha256=iAxg7XsFkKSCcJnrNPQNJfjMxOgeBAa-ShOkywWPJ4w,5429 -sqlalchemy/testing/requirements.py,sha256=aIdvbfugMzrlVdldEbpcwretX-zjiukPhPUSZgulrzU,19949 -sqlalchemy/testing/runner.py,sha256=hpNH6MNTif4TnBRySxpm92KgFwDK0mOa8eF7wZXumTI,1607 -sqlalchemy/testing/schema.py,sha256=agOzrIMvmuUCeVZY5mYjJ1eJmOP69-wa0gZALtNtJBk,3446 -sqlalchemy/testing/util.py,sha256=IJ688AWzichtXVwWgYf_A4BUbcXPGsK6BQP5fvY3h-U,7544 
-sqlalchemy/testing/warnings.py,sha256=-KskRAh1RkJ_69UIY_WR7i15u21U3gDLQ6nKlnJT7_w,987 -sqlalchemy/testing/plugin/__init__.py,sha256=47DEQpj8HBSa-_TImW-5JCeuQeRkm5NMpJWZG3hSuFU,0 -sqlalchemy/testing/plugin/bootstrap.py,sha256=Iw8R-d1gqoz_NKFtPyGfdX56QPcQHny_9Lvwov65aVY,1634 -sqlalchemy/testing/plugin/noseplugin.py,sha256=In79x6zs9DOngfoYpaHojihWlSd4PeS7Nwzh3M_KNM4,2847 -sqlalchemy/testing/plugin/plugin_base.py,sha256=h4RI4nPNdNq9kYABp6IP89Eknm29q8usgO-nWb8Eobc,17120 -sqlalchemy/testing/plugin/pytestplugin.py,sha256=Pbc62y7Km0PHXd4M9dm5ThBwrlXkM4WtIX-W1pOaM84,5812 -sqlalchemy/testing/suite/__init__.py,sha256=wqCTrb28i5FwhQZOyXVlnz3mA94iQOUBio7lszkFq-g,471 -sqlalchemy/testing/suite/test_ddl.py,sha256=Baw0ou9nKdADmrRuXgWzF1FZx0rvkkw3JHc6yw5BN0M,1838 -sqlalchemy/testing/suite/test_dialect.py,sha256=ORQPXUt53XtO-5ENlWgs8BpsSdPBDjyMRl4W2UjXLI4,1165 -sqlalchemy/testing/suite/test_insert.py,sha256=nP0mgVpsVs72MHMADmihB1oXLbFBpsYsLGO3BlQ7RLU,8132 -sqlalchemy/testing/suite/test_reflection.py,sha256=HtJRsJ_vuNMrOhnPTvuIvRg66OakSaSpeCU36zhaSPg,24616 -sqlalchemy/testing/suite/test_results.py,sha256=oAcO1tD0I7c9ErMeSvSZBZfz1IBDMJHJTf64Y1pBodk,6685 -sqlalchemy/testing/suite/test_select.py,sha256=u0wAz1g-GrAFdZpG4zwSrVckVtjULvjlbd0Z1U1jHAA,5729 -sqlalchemy/testing/suite/test_sequence.py,sha256=fmBR4Pc5tOLSkXFxfcqwGx1z3xaxeJeUyqDnTakKTBU,3831 -sqlalchemy/testing/suite/test_types.py,sha256=UKa-ZPdpz16mVKvT-9ISRAfqdrqiKaE7IA-_phQQuxo,17088 -sqlalchemy/testing/suite/test_update_delete.py,sha256=r5p467r-EUsjEcWGfUE0VPIfN4LLXZpLRnnyBLyyjl4,1582 -sqlalchemy/util/__init__.py,sha256=G06a5vBxg27RtWzY6dPZHt1FO8qtOiy_2C9PHTTMblI,2520 -sqlalchemy/util/_collections.py,sha256=JZkeYK4GcIE1A5s6MAvHhmUp_X4wp6r7vMGT-iMftZ8,27842 -sqlalchemy/util/compat.py,sha256=80OXp3D-F_R-pLf7s-zITPlfCqG1s_5o6KTlY1g2p0Q,6821 -sqlalchemy/util/deprecations.py,sha256=D_LTsfb9jHokJtPEWNDRMJOc372xRGNjputAiTIysRU,4403 -sqlalchemy/util/langhelpers.py,sha256=Nhe3Y9ieK6JaFYejjYosVOjOSSIBT2V385Hu6HGcyZk,41607 
-sqlalchemy/util/queue.py,sha256=rs3W0LDhKt7M_dlQEjYpI9KS-bzQmmwN38LE_-RRVvU,6548 -sqlalchemy/util/topological.py,sha256=xKsYjjAat4p8cdqRHKwibLzr6WONbPTC0X8Mqg7jYno,2794 -SQLAlchemy-1.0.12.dist-info/INSTALLER,sha256=zuuue4knoyJ-UwPPXg8fezS7VCrXJQrAP7zeNuwvFQg,4 -sqlalchemy/orm/__pycache__/path_registry.cpython-34.pyc,, -sqlalchemy/ext/__pycache__/associationproxy.cpython-34.pyc,, -sqlalchemy/__pycache__/__init__.cpython-34.pyc,, -sqlalchemy/orm/__pycache__/deprecated_interfaces.cpython-34.pyc,, -sqlalchemy/orm/__pycache__/dynamic.cpython-34.pyc,, -sqlalchemy/event/__pycache__/legacy.cpython-34.pyc,, -sqlalchemy/event/__pycache__/api.cpython-34.pyc,, -sqlalchemy/dialects/__pycache__/postgres.cpython-34.pyc,, -sqlalchemy/testing/__pycache__/profiling.cpython-34.pyc,, -sqlalchemy/dialects/firebird/__pycache__/__init__.cpython-34.pyc,, -sqlalchemy/dialects/mysql/__pycache__/gaerdbms.cpython-34.pyc,, -sqlalchemy/dialects/mysql/__pycache__/oursql.cpython-34.pyc,, -sqlalchemy/testing/suite/__pycache__/test_types.cpython-34.pyc,, -sqlalchemy/event/__pycache__/registry.cpython-34.pyc,, -sqlalchemy/testing/__pycache__/distutils_run.cpython-34.pyc,, -sqlalchemy/dialects/mssql/__pycache__/mxodbc.cpython-34.pyc,, -sqlalchemy/engine/__pycache__/base.cpython-34.pyc,, -sqlalchemy/dialects/postgresql/__pycache__/pypostgresql.cpython-34.pyc,, -sqlalchemy/util/__pycache__/__init__.cpython-34.pyc,, -sqlalchemy/util/__pycache__/topological.cpython-34.pyc,, -sqlalchemy/dialects/firebird/__pycache__/kinterbasdb.cpython-34.pyc,, -sqlalchemy/orm/__pycache__/persistence.cpython-34.pyc,, -sqlalchemy/connectors/__pycache__/mxodbc.cpython-34.pyc,, -sqlalchemy/dialects/mssql/__pycache__/base.cpython-34.pyc,, -sqlalchemy/connectors/__pycache__/__init__.cpython-34.pyc,, -sqlalchemy/sql/__pycache__/elements.cpython-34.pyc,, -sqlalchemy/testing/suite/__pycache__/__init__.cpython-34.pyc,, -sqlalchemy/dialects/__pycache__/__init__.cpython-34.pyc,, 
-sqlalchemy/util/__pycache__/langhelpers.cpython-34.pyc,, -sqlalchemy/testing/__pycache__/entities.cpython-34.pyc,, -sqlalchemy/engine/__pycache__/interfaces.cpython-34.pyc,, -sqlalchemy/sql/__pycache__/schema.cpython-34.pyc,, -sqlalchemy/ext/__pycache__/baked.cpython-34.pyc,, -sqlalchemy/dialects/postgresql/__pycache__/zxjdbc.cpython-34.pyc,, -sqlalchemy/orm/__pycache__/base.cpython-34.pyc,, -sqlalchemy/connectors/__pycache__/pyodbc.cpython-34.pyc,, -sqlalchemy/sql/__pycache__/annotation.cpython-34.pyc,, -sqlalchemy/dialects/oracle/__pycache__/zxjdbc.cpython-34.pyc,, -sqlalchemy/testing/__pycache__/runner.cpython-34.pyc,, -sqlalchemy/testing/__pycache__/schema.cpython-34.pyc,, -sqlalchemy/orm/__pycache__/relationships.cpython-34.pyc,, -sqlalchemy/__pycache__/pool.cpython-34.pyc,, -sqlalchemy/testing/suite/__pycache__/test_sequence.cpython-34.pyc,, -sqlalchemy/dialects/sqlite/__pycache__/__init__.cpython-34.pyc,, -sqlalchemy/sql/__pycache__/ddl.cpython-34.pyc,, -sqlalchemy/dialects/sybase/__pycache__/pyodbc.cpython-34.pyc,, -sqlalchemy/orm/__pycache__/dependency.cpython-34.pyc,, -sqlalchemy/sql/__pycache__/visitors.cpython-34.pyc,, -sqlalchemy/testing/__pycache__/provision.cpython-34.pyc,, -sqlalchemy/dialects/postgresql/__pycache__/json.cpython-34.pyc,, -sqlalchemy/sql/__pycache__/selectable.cpython-34.pyc,, -sqlalchemy/orm/__pycache__/exc.cpython-34.pyc,, -sqlalchemy/ext/declarative/__pycache__/clsregistry.cpython-34.pyc,, -sqlalchemy/orm/__pycache__/interfaces.cpython-34.pyc,, -sqlalchemy/testing/__pycache__/assertions.cpython-34.pyc,, -sqlalchemy/ext/__pycache__/compiler.cpython-34.pyc,, -sqlalchemy/dialects/oracle/__pycache__/cx_oracle.cpython-34.pyc,, -sqlalchemy/testing/suite/__pycache__/test_select.cpython-34.pyc,, -sqlalchemy/dialects/firebird/__pycache__/fdb.cpython-34.pyc,, -sqlalchemy/orm/__pycache__/unitofwork.cpython-34.pyc,, -sqlalchemy/testing/__pycache__/util.cpython-34.pyc,, -sqlalchemy/dialects/postgresql/__pycache__/psycopg2cffi.cpython-34.pyc,, 
-sqlalchemy/__pycache__/interfaces.cpython-34.pyc,, -sqlalchemy/engine/__pycache__/util.cpython-34.pyc,, -sqlalchemy/dialects/postgresql/__pycache__/__init__.cpython-34.pyc,, -sqlalchemy/__pycache__/schema.cpython-34.pyc,, -sqlalchemy/orm/__pycache__/sync.cpython-34.pyc,, -sqlalchemy/__pycache__/processors.cpython-34.pyc,, -sqlalchemy/dialects/firebird/__pycache__/base.cpython-34.pyc,, -sqlalchemy/dialects/postgresql/__pycache__/psycopg2.cpython-34.pyc,, -sqlalchemy/databases/__pycache__/__init__.cpython-34.pyc,, -sqlalchemy/sql/__pycache__/sqltypes.cpython-34.pyc,, -sqlalchemy/dialects/oracle/__pycache__/base.cpython-34.pyc,, -sqlalchemy/sql/__pycache__/functions.cpython-34.pyc,, -sqlalchemy/dialects/sqlite/__pycache__/pysqlcipher.cpython-34.pyc,, -sqlalchemy/testing/suite/__pycache__/test_dialect.cpython-34.pyc,, -sqlalchemy/ext/__pycache__/automap.cpython-34.pyc,, -sqlalchemy/testing/__pycache__/mock.cpython-34.pyc,, -sqlalchemy/testing/__pycache__/requirements.cpython-34.pyc,, -sqlalchemy/testing/suite/__pycache__/test_results.cpython-34.pyc,, -sqlalchemy/orm/__pycache__/__init__.cpython-34.pyc,, -sqlalchemy/dialects/postgresql/__pycache__/base.cpython-34.pyc,, -sqlalchemy/util/__pycache__/deprecations.cpython-34.pyc,, -sqlalchemy/dialects/mssql/__pycache__/pyodbc.cpython-34.pyc,, -sqlalchemy/orm/__pycache__/state.cpython-34.pyc,, -sqlalchemy/event/__pycache__/base.cpython-34.pyc,, -sqlalchemy/__pycache__/log.cpython-34.pyc,, -sqlalchemy/connectors/__pycache__/zxJDBC.cpython-34.pyc,, -sqlalchemy/testing/plugin/__pycache__/plugin_base.cpython-34.pyc,, -sqlalchemy/orm/__pycache__/identity.cpython-34.pyc,, -sqlalchemy/dialects/mysql/__pycache__/mysqlconnector.cpython-34.pyc,, -sqlalchemy/orm/__pycache__/attributes.cpython-34.pyc,, -sqlalchemy/ext/declarative/__pycache__/__init__.cpython-34.pyc,, -sqlalchemy/dialects/sqlite/__pycache__/base.cpython-34.pyc,, -sqlalchemy/ext/__pycache__/serializer.cpython-34.pyc,, 
-sqlalchemy/testing/plugin/__pycache__/pytestplugin.cpython-34.pyc,, -sqlalchemy/orm/__pycache__/properties.cpython-34.pyc,, -sqlalchemy/orm/__pycache__/mapper.cpython-34.pyc,, -sqlalchemy/testing/__pycache__/fixtures.cpython-34.pyc,, -sqlalchemy/sql/__pycache__/base.cpython-34.pyc,, -sqlalchemy/orm/__pycache__/events.cpython-34.pyc,, -sqlalchemy/dialects/mysql/__pycache__/zxjdbc.cpython-34.pyc,, -sqlalchemy/dialects/mysql/__pycache__/__init__.cpython-34.pyc,, -sqlalchemy/orm/__pycache__/strategy_options.cpython-34.pyc,, -sqlalchemy/dialects/sybase/__pycache__/mxodbc.cpython-34.pyc,, -sqlalchemy/util/__pycache__/compat.cpython-34.pyc,, -sqlalchemy/testing/plugin/__pycache__/bootstrap.cpython-34.pyc,, -sqlalchemy/sql/__pycache__/compiler.cpython-34.pyc,, -sqlalchemy/dialects/mysql/__pycache__/mysqldb.cpython-34.pyc,, -sqlalchemy/__pycache__/inspection.cpython-34.pyc,, -sqlalchemy/testing/plugin/__pycache__/__init__.cpython-34.pyc,, -sqlalchemy/dialects/mssql/__pycache__/adodbapi.cpython-34.pyc,, -sqlalchemy/engine/__pycache__/url.cpython-34.pyc,, -sqlalchemy/dialects/oracle/__pycache__/__init__.cpython-34.pyc,, -sqlalchemy/engine/__pycache__/result.cpython-34.pyc,, -sqlalchemy/testing/suite/__pycache__/test_insert.cpython-34.pyc,, -sqlalchemy/event/__pycache__/__init__.cpython-34.pyc,, -sqlalchemy/orm/__pycache__/scoping.cpython-34.pyc,, -sqlalchemy/orm/__pycache__/instrumentation.cpython-34.pyc,, -sqlalchemy/dialects/sybase/__pycache__/base.cpython-34.pyc,, -sqlalchemy/dialects/mysql/__pycache__/pyodbc.cpython-34.pyc,, -sqlalchemy/testing/plugin/__pycache__/noseplugin.cpython-34.pyc,, -sqlalchemy/dialects/mysql/__pycache__/cymysql.cpython-34.pyc,, -sqlalchemy/testing/__pycache__/exclusions.cpython-34.pyc,, -sqlalchemy/ext/__pycache__/mutable.cpython-34.pyc,, -sqlalchemy/sql/__pycache__/default_comparator.cpython-34.pyc,, -sqlalchemy/engine/__pycache__/default.cpython-34.pyc,, -sqlalchemy/__pycache__/types.cpython-34.pyc,, 
-sqlalchemy/orm/__pycache__/session.cpython-34.pyc,, -sqlalchemy/util/__pycache__/_collections.cpython-34.pyc,, -sqlalchemy/engine/__pycache__/reflection.cpython-34.pyc,, -sqlalchemy/dialects/sybase/__pycache__/pysybase.cpython-34.pyc,, -sqlalchemy/testing/__pycache__/assertsql.cpython-34.pyc,, -sqlalchemy/testing/__pycache__/replay_fixture.cpython-34.pyc,, -sqlalchemy/dialects/mysql/__pycache__/pymysql.cpython-34.pyc,, -sqlalchemy/testing/__pycache__/config.cpython-34.pyc,, -sqlalchemy/orm/__pycache__/strategies.cpython-34.pyc,, -sqlalchemy/dialects/sqlite/__pycache__/pysqlite.cpython-34.pyc,, -sqlalchemy/orm/__pycache__/util.cpython-34.pyc,, -sqlalchemy/dialects/mysql/__pycache__/base.cpython-34.pyc,, -sqlalchemy/sql/__pycache__/crud.cpython-34.pyc,, -sqlalchemy/dialects/postgresql/__pycache__/pg8000.cpython-34.pyc,, -sqlalchemy/engine/__pycache__/__init__.cpython-34.pyc,, -sqlalchemy/orm/__pycache__/loading.cpython-34.pyc,, -sqlalchemy/testing/__pycache__/__init__.cpython-34.pyc,, -sqlalchemy/dialects/postgresql/__pycache__/ranges.cpython-34.pyc,, -sqlalchemy/sql/__pycache__/operators.cpython-34.pyc,, -sqlalchemy/dialects/mssql/__pycache__/__init__.cpython-34.pyc,, -sqlalchemy/testing/__pycache__/pickleable.cpython-34.pyc,, -sqlalchemy/sql/__pycache__/expression.cpython-34.pyc,, -sqlalchemy/dialects/mssql/__pycache__/pymssql.cpython-34.pyc,, -sqlalchemy/sql/__pycache__/naming.cpython-34.pyc,, -sqlalchemy/ext/__pycache__/horizontal_shard.cpython-34.pyc,, -sqlalchemy/dialects/sybase/__pycache__/__init__.cpython-34.pyc,, -sqlalchemy/engine/__pycache__/threadlocal.cpython-34.pyc,, -sqlalchemy/ext/declarative/__pycache__/api.cpython-34.pyc,, -sqlalchemy/testing/__pycache__/warnings.cpython-34.pyc,, -sqlalchemy/sql/__pycache__/util.cpython-34.pyc,, -sqlalchemy/sql/__pycache__/dml.cpython-34.pyc,, -sqlalchemy/ext/__pycache__/__init__.cpython-34.pyc,, -sqlalchemy/dialects/postgresql/__pycache__/hstore.cpython-34.pyc,, 
-sqlalchemy/orm/__pycache__/collections.cpython-34.pyc,, -sqlalchemy/sql/__pycache__/__init__.cpython-34.pyc,, -sqlalchemy/testing/suite/__pycache__/test_ddl.cpython-34.pyc,, -sqlalchemy/ext/__pycache__/orderinglist.cpython-34.pyc,, -sqlalchemy/dialects/postgresql/__pycache__/constraints.cpython-34.pyc,, -sqlalchemy/__pycache__/exc.cpython-34.pyc,, -sqlalchemy/testing/suite/__pycache__/test_update_delete.cpython-34.pyc,, -sqlalchemy/engine/__pycache__/strategies.cpython-34.pyc,, -sqlalchemy/ext/declarative/__pycache__/base.cpython-34.pyc,, -sqlalchemy/orm/__pycache__/evaluator.cpython-34.pyc,, -sqlalchemy/orm/__pycache__/query.cpython-34.pyc,, -sqlalchemy/dialects/mssql/__pycache__/zxjdbc.cpython-34.pyc,, -sqlalchemy/orm/__pycache__/descriptor_props.cpython-34.pyc,, -sqlalchemy/__pycache__/events.cpython-34.pyc,, -sqlalchemy/sql/__pycache__/type_api.cpython-34.pyc,, -sqlalchemy/util/__pycache__/queue.cpython-34.pyc,, -sqlalchemy/ext/__pycache__/hybrid.cpython-34.pyc,, -sqlalchemy/event/__pycache__/attr.cpython-34.pyc,, -sqlalchemy/testing/suite/__pycache__/test_reflection.cpython-34.pyc,, -sqlalchemy/dialects/mssql/__pycache__/information_schema.cpython-34.pyc,, -sqlalchemy/ext/__pycache__/instrumentation.cpython-34.pyc,, -sqlalchemy/testing/__pycache__/engines.cpython-34.pyc,, diff --git a/lib/python3.4/site-packages/SQLAlchemy-1.0.12.dist-info/WHEEL b/lib/python3.4/site-packages/SQLAlchemy-1.0.12.dist-info/WHEEL deleted file mode 100644 index 1fdf70f..0000000 --- a/lib/python3.4/site-packages/SQLAlchemy-1.0.12.dist-info/WHEEL +++ /dev/null @@ -1,5 +0,0 @@ -Wheel-Version: 1.0 -Generator: bdist_wheel (0.30.0) -Root-Is-Purelib: false -Tag: cp34-cp34m-linux_x86_64 - diff --git a/lib/python3.4/site-packages/SQLAlchemy-1.0.12.dist-info/metadata.json b/lib/python3.4/site-packages/SQLAlchemy-1.0.12.dist-info/metadata.json deleted file mode 100644 index a446465..0000000 --- a/lib/python3.4/site-packages/SQLAlchemy-1.0.12.dist-info/metadata.json +++ /dev/null @@ -1 +0,0 @@ 
-{"classifiers": ["Development Status :: 5 - Production/Stable", "Intended Audience :: Developers", "License :: OSI Approved :: MIT License", "Programming Language :: Python", "Programming Language :: Python :: 3", "Programming Language :: Python :: Implementation :: CPython", "Programming Language :: Python :: Implementation :: Jython", "Programming Language :: Python :: Implementation :: PyPy", "Topic :: Database :: Front-Ends", "Operating System :: OS Independent"], "description_content_type": "UNKNOWN", "extensions": {"python.details": {"contacts": [{"email": "mike_mp@zzzcomputing.com", "name": "Mike Bayer", "role": "author"}], "document_names": {"description": "DESCRIPTION.rst"}, "project_urls": {"Home": "http://www.sqlalchemy.org"}}}, "generator": "bdist_wheel (0.30.0)", "license": "MIT License", "metadata_version": "2.0", "name": "SQLAlchemy", "summary": "Database Abstraction Library", "test_requires": [{"requires": ["mock", "pytest (>=2.5.2)", "pytest-xdist"]}], "version": "1.0.12"} \ No newline at end of file diff --git a/lib/python3.4/site-packages/ed25519-1.4.dist-info/DESCRIPTION.rst b/lib/python3.4/site-packages/ed25519-1.4.dist-info/DESCRIPTION.rst deleted file mode 100644 index 2d8b087..0000000 --- a/lib/python3.4/site-packages/ed25519-1.4.dist-info/DESCRIPTION.rst +++ /dev/null @@ -1,11 +0,0 @@ -Python bindings to the Ed25519 public-key signature system. - -This offers a comfortable python interface to a C implementation of the -Ed25519 public-key signature system (http://ed25519.cr.yp.to/), using the -portable 'ref' code from the 'SUPERCOP' benchmarking suite. - -This system provides high (128-bit) security, short (32-byte) keys, short -(64-byte) signatures, and fast (2-6ms) operation. Please see the README for -more details. 
- - diff --git a/lib/python3.4/site-packages/ed25519-1.4.dist-info/RECORD b/lib/python3.4/site-packages/ed25519-1.4.dist-info/RECORD deleted file mode 100644 index 7b7cbd4..0000000 --- a/lib/python3.4/site-packages/ed25519-1.4.dist-info/RECORD +++ /dev/null @@ -1,17 +0,0 @@ -ed25519/__init__.py,sha256=0AicD1xQAforRdrUWwmmURJkZ3Gi1lqaifukwZNYJos,401 -ed25519/_ed25519.cpython-34m.so,sha256=-qvpNKMbtiJoFhWHlvH83lGmJEntE9ISrt8hYZE4zig,262968 -ed25519/_version.py,sha256=yb119RosJrH_RO02_o3o12GWQvkxx3xD4X7UrJW9vTY,469 -ed25519/keys.py,sha256=AbMFsbxn0qbwmQ6HntpNURsOGq_y4puwFxs6U7Of2eo,7123 -ed25519/test_ed25519.py,sha256=IG8ot-yARHi6PoyJY6ixS1l2L23hE1lCXbSH-XQPCCM,12389 -../../../bin/edsig,sha256=SA1mUUWCjAAaSEe6MKSpVWg-2qXwuiuK3PodCAUwCN0,2853 -ed25519-1.4.dist-info/DESCRIPTION.rst,sha256=8UWGEqjPrB7zPyxLA5Ep6JL58ANbe0Wybqth188exdc,434 -ed25519-1.4.dist-info/METADATA,sha256=8xAIfsJS4nw5H1ui1jHsVntmwcMjIzm4j_LHEaW3wNQ,1148 -ed25519-1.4.dist-info/RECORD,, -ed25519-1.4.dist-info/WHEEL,sha256=AEztX7vHDtcgysb-4-5-DyIKMLIPg6NMxY9dXTRdoXQ,104 -ed25519-1.4.dist-info/metadata.json,sha256=6X6ChTS1aIj99pNHtLNerEBCuO-F-P2Z1GgSMt2svQw,841 -ed25519-1.4.dist-info/top_level.txt,sha256=U3-N9ZJMBO9MUuZLwoiMbsWSkxsd0TfkNSuzO6O_gYY,8 -ed25519-1.4.dist-info/INSTALLER,sha256=zuuue4knoyJ-UwPPXg8fezS7VCrXJQrAP7zeNuwvFQg,4 -ed25519/__pycache__/keys.cpython-34.pyc,, -ed25519/__pycache__/_version.cpython-34.pyc,, -ed25519/__pycache__/__init__.cpython-34.pyc,, -ed25519/__pycache__/test_ed25519.cpython-34.pyc,, diff --git a/lib/python3.4/site-packages/ed25519-1.4.dist-info/WHEEL b/lib/python3.4/site-packages/ed25519-1.4.dist-info/WHEEL deleted file mode 100644 index 1fdf70f..0000000 --- a/lib/python3.4/site-packages/ed25519-1.4.dist-info/WHEEL +++ /dev/null @@ -1,5 +0,0 @@ -Wheel-Version: 1.0 -Generator: bdist_wheel (0.30.0) -Root-Is-Purelib: false -Tag: cp34-cp34m-linux_x86_64 - diff --git a/lib/python3.4/site-packages/ed25519-1.4.dist-info/metadata.json 
b/lib/python3.4/site-packages/ed25519-1.4.dist-info/metadata.json deleted file mode 100644 index 12a665f..0000000 --- a/lib/python3.4/site-packages/ed25519-1.4.dist-info/metadata.json +++ /dev/null @@ -1 +0,0 @@ -{"classifiers": ["Development Status :: 5 - Production/Stable", "Intended Audience :: Developers", "License :: OSI Approved :: MIT License", "Programming Language :: Python", "Programming Language :: Python :: 2.6", "Programming Language :: Python :: 2.7", "Programming Language :: Python :: 3.3", "Programming Language :: Python :: 3.4", "Topic :: Security :: Cryptography"], "description_content_type": "UNKNOWN", "extensions": {"python.details": {"contacts": [{"email": "warner-python-ed25519@lothar.com", "name": "Brian Warner", "role": "author"}], "document_names": {"description": "DESCRIPTION.rst"}, "project_urls": {"Home": "https://github.com/warner/python-ed25519"}}}, "generator": "bdist_wheel (0.30.0)", "license": "MIT", "metadata_version": "2.0", "name": "ed25519", "summary": "Ed25519 public-key signatures", "version": "1.4"} \ No newline at end of file diff --git a/lib/python3.4/site-packages/ed25519/_ed25519.cpython-34m.so b/lib/python3.4/site-packages/ed25519/_ed25519.cpython-34m.so deleted file mode 100755 index a07ea54..0000000 Binary files a/lib/python3.4/site-packages/ed25519/_ed25519.cpython-34m.so and /dev/null differ diff --git a/lib/python3.4/site-packages/ed25519/_ed25519.cpython-35m-x86_64-linux-gnu.so b/lib/python3.4/site-packages/ed25519/_ed25519.cpython-35m-x86_64-linux-gnu.so deleted file mode 100755 index ea40ec0..0000000 Binary files a/lib/python3.4/site-packages/ed25519/_ed25519.cpython-35m-x86_64-linux-gnu.so and /dev/null differ diff --git a/lib/python3.4/site-packages/ed25519/_ed25519.cpython-36m-x86_64-linux-gnu.so b/lib/python3.4/site-packages/ed25519/_ed25519.cpython-36m-x86_64-linux-gnu.so deleted file mode 100755 index 15b7049..0000000 Binary files 
a/lib/python3.4/site-packages/ed25519/_ed25519.cpython-36m-x86_64-linux-gnu.so and /dev/null differ diff --git a/lib/python3.4/site-packages/netifaces-0.10.6.dist-info/DESCRIPTION.rst b/lib/python3.4/site-packages/netifaces-0.10.6.dist-info/DESCRIPTION.rst deleted file mode 100644 index 27fd20f..0000000 --- a/lib/python3.4/site-packages/netifaces-0.10.6.dist-info/DESCRIPTION.rst +++ /dev/null @@ -1,199 +0,0 @@ -netifaces 0.10.6 -================ - -.. image:: https://drone.io/bitbucket.org/al45tair/netifaces/status.png - :target: https://drone.io/bitbucket.org/al45tair/netifaces/latest - :alt: Build Status - -1. What is this? ----------------- - -It's been annoying me for some time that there's no easy way to get the -address(es) of the machine's network interfaces from Python. There is -a good reason for this difficulty, which is that it is virtually impossible -to do so in a portable manner. However, it seems to me that there should -be a package you can easy_install that will take care of working out the -details of doing so on the machine you're using, then you can get on with -writing Python code without concerning yourself with the nitty gritty of -system-dependent low-level networking APIs. - -This package attempts to solve that problem. - -2. How do I use it? -------------------- - -First you need to install it, which you can do by typing:: - - tar xvzf netifaces-0.10.6.tar.gz - cd netifaces-0.10.6 - python setup.py install - -**Note that you will need the relevant developer tools for your platform**, -as netifaces is written in C and installing this way will compile the extension. - -Once that's done, you'll need to start Python and do something like the -following:: - ->>> import netifaces - -Then if you enter - ->>> netifaces.interfaces() -['lo0', 'gif0', 'stf0', 'en0', 'en1', 'fw0'] - -you'll see the list of interface identifiers for your machine. 
- -You can ask for the addresses of a particular interface by doing - ->>> netifaces.ifaddresses('lo0') -{18: [{'addr': ''}], 2: [{'peer': '127.0.0.1', 'netmask': '255.0.0.0', 'addr': '127.0.0.1'}], 30: [{'peer': '::1', 'netmask': 'ffff:ffff:ffff:ffff:ffff:ffff:ffff:ffff', 'addr': '::1'}, {'peer': '', 'netmask': 'ffff:ffff:ffff:ffff::', 'addr': 'fe80::1%lo0'}]} - -Hmmmm. That result looks a bit cryptic; let's break it apart and explain -what each piece means. It returned a dictionary, so let's look there first:: - - { 18: [...], 2: [...], 30: [...] } - -Each of the numbers refers to a particular address family. In this case, we -have three address families listed; on my system, 18 is ``AF_LINK`` (which means -the link layer interface, e.g. Ethernet), 2 is ``AF_INET`` (normal Internet -addresses), and 30 is ``AF_INET6`` (IPv6). - -But wait! Don't use these numbers in your code. The numeric values here are -system dependent; fortunately, I thought of that when writing netifaces, so -the module declares a range of values that you might need. e.g. - ->>> netifaces.AF_LINK -18 - -Again, on your system, the number may be different. - -So, what we've established is that the dictionary that's returned has one -entry for each address family for which this interface has an address. Let's -take a look at the ``AF_INET`` addresses now: - ->>> addrs = netifaces.ifaddresses('lo0') ->>> addrs[netifaces.AF_INET] -[{'peer': '127.0.0.1', 'netmask': '255.0.0.0', 'addr': '127.0.0.1'}] - -You might be wondering why this value is a list. The reason is that it's -possible for an interface to have more than one address, even within the -same family. I'll say that again: *you can have more than one address of -the same type associated with each interface*. 
- -*Asking for "the" address of a particular interface doesn't make sense.* - -Right, so, we can see that this particular interface only has one address, -and, because it's a loopback interface, it's point-to-point and therefore -has a *peer* address rather than a broadcast address. - -Let's look at a more interesting interface. - ->>> addrs = netifaces.ifaddresses('en0') ->>> addrs[netifaces.AF_INET] -[{'broadcast': '10.15.255.255', 'netmask': '255.240.0.0', 'addr': '10.0.1.4'}, {'broadcast': '192.168.0.255', 'addr': '192.168.0.47'}] - -This interface has two addresses (see, I told you...) Both of them are -regular IPv4 addresses, although in one case the netmask has been changed -from its default. The netmask *may not* appear on your system if it's set -to the default for the address range. - -Because this interface isn't point-to-point, it also has broadcast addresses. - -Now, say we want, instead of the IP addresses, to get the MAC address; that -is, the hardware address of the Ethernet adapter running this interface. We -can do - ->>> addrs[netifaces.AF_LINK] -[{'addr': '00:12:34:56:78:9a'}] - -Note that this may not be available on platforms without getifaddrs(), unless -they happen to implement ``SIOCGIFHWADDR``. Note also that you just get the -address; it's unlikely that you'll see anything else with an ``AF_LINK`` address. -Oh, and don't assume that all ``AF_LINK`` addresses are Ethernet; you might, for -instance, be on a Mac, in which case: - ->>> addrs = netifaces.ifaddresses('fw0') ->>> addrs[netifaces.AF_LINK] -[{'addr': '00:12:34:56:78:9a:bc:de'}] - -No, that isn't an exceptionally long Ethernet MAC address---it's a FireWire -address. 
- -As of version 0.10.0, you can also obtain a list of gateways on your -machine: - ->>> netifaces.gateways() -{2: [('10.0.1.1', 'en0', True), ('10.2.1.1', 'en1', False)], 30: [('fe80::1', 'en0', True)], 'default': { 2: ('10.0.1.1', 'en0'), 30: ('fe80::1', 'en0') }} - -This dictionary is keyed on address family---in this case, ``AF_INET``---and -each entry is a list of gateways as ``(address, interface, is_default)`` tuples. -Notice that here we have two separate gateways for IPv4 (``AF_INET``); some -operating systems support configurations like this and can either route packets -based on their source, or based on administratively configured routing tables. - -For convenience, we also allow you to index the dictionary with the special -value ``'default'``, which returns a dictionary mapping address families to the -default gateway in each case. Thus you can get the default IPv4 gateway with - ->>> gws = netifaces.gateways() ->>> gws['default'][netifaces.AF_INET] -('10.0.1.1', 'en0') - -Do note that there may be no default gateway for any given address family; -this is currently very common for IPv6 and much less common for IPv4 but it -can happen even for ``AF_INET``. - -BTW, if you're trying to configure your machine to have multiple gateways for -the same address family, it's a very good idea to check the documentation for -your operating system *very* carefully, as some systems become extremely -confused or route packets in a non-obvious manner. - -I'm very interested in hearing from anyone (on any platform) for whom the -``gateways()`` method doesn't produce the expected results. It's quite -complicated extracting this information from the operating system (whichever -operating system we're talking about), and so I expect there's at least one -system out there where this just won't work. - -3. This is great! What platforms does it work on? --------------------------------------------------- - -It gets regular testing on OS X, Linux and Windows. 
It has also been used -successfully on Solaris, and it's expected to work properly on other UNIX-like -systems as well. If you are running something that is not supported, and -wish to contribute a patch, please use BitBucket to send a pull request. - -4. What license is this under? ------------------------------- - -It's an MIT-style license. Here goes: - -Copyright (c) 2007-2017 Alastair Houghton - -Permission is hereby granted, free of charge, to any person obtaining a copy -of this software and associated documentation files (the "Software"), to deal -in the Software without restriction, including without limitation the rights -to use, copy, modify, merge, publish, distribute, sublicense, and/or sell -copies of the Software, and to permit persons to whom the Software is -furnished to do so, subject to the following conditions: - -The above copyright notice and this permission notice shall be included in all -copies or substantial portions of the Software. - -THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR -IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, -FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE -AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER -LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, -OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE -SOFTWARE. - -5. Why the jump to 0.10.0? --------------------------- - -Because someone released a fork of netifaces with the version 0.9.0. -Hopefully skipping the version number should remove any confusion. In -addition starting with 0.10.0 Python 3 is now supported and other -features/bugfixes have been included as well. See the CHANGELOG for a -more complete list of changes. 
- - diff --git a/lib/python3.4/site-packages/netifaces-0.10.6.dist-info/METADATA b/lib/python3.4/site-packages/netifaces-0.10.6.dist-info/METADATA deleted file mode 100644 index 59b98bf..0000000 --- a/lib/python3.4/site-packages/netifaces-0.10.6.dist-info/METADATA +++ /dev/null @@ -1,220 +0,0 @@ -Metadata-Version: 2.0 -Name: netifaces -Version: 0.10.6 -Summary: Portable network interface information. -Home-page: https://bitbucket.org/al45tair/netifaces -Author: Alastair Houghton -Author-email: alastair@alastairs-place.net -License: MIT License -Description-Content-Type: UNKNOWN -Platform: UNKNOWN -Classifier: Development Status :: 4 - Beta -Classifier: Intended Audience :: Developers -Classifier: License :: OSI Approved :: MIT License -Classifier: Topic :: System :: Networking -Classifier: Programming Language :: Python -Classifier: Programming Language :: Python :: 2 -Classifier: Programming Language :: Python :: 2.5 -Classifier: Programming Language :: Python :: 2.6 -Classifier: Programming Language :: Python :: 2.7 -Classifier: Programming Language :: Python :: 3 - -netifaces 0.10.6 -================ - -.. image:: https://drone.io/bitbucket.org/al45tair/netifaces/status.png - :target: https://drone.io/bitbucket.org/al45tair/netifaces/latest - :alt: Build Status - -1. What is this? ----------------- - -It's been annoying me for some time that there's no easy way to get the -address(es) of the machine's network interfaces from Python. There is -a good reason for this difficulty, which is that it is virtually impossible -to do so in a portable manner. However, it seems to me that there should -be a package you can easy_install that will take care of working out the -details of doing so on the machine you're using, then you can get on with -writing Python code without concerning yourself with the nitty gritty of -system-dependent low-level networking APIs. - -This package attempts to solve that problem. - -2. How do I use it? 
-------------------- - -First you need to install it, which you can do by typing:: - - tar xvzf netifaces-0.10.6.tar.gz - cd netifaces-0.10.6 - python setup.py install - -**Note that you will need the relevant developer tools for your platform**, -as netifaces is written in C and installing this way will compile the extension. - -Once that's done, you'll need to start Python and do something like the -following:: - ->>> import netifaces - -Then if you enter - ->>> netifaces.interfaces() -['lo0', 'gif0', 'stf0', 'en0', 'en1', 'fw0'] - -you'll see the list of interface identifiers for your machine. - -You can ask for the addresses of a particular interface by doing - ->>> netifaces.ifaddresses('lo0') -{18: [{'addr': ''}], 2: [{'peer': '127.0.0.1', 'netmask': '255.0.0.0', 'addr': '127.0.0.1'}], 30: [{'peer': '::1', 'netmask': 'ffff:ffff:ffff:ffff:ffff:ffff:ffff:ffff', 'addr': '::1'}, {'peer': '', 'netmask': 'ffff:ffff:ffff:ffff::', 'addr': 'fe80::1%lo0'}]} - -Hmmmm. That result looks a bit cryptic; let's break it apart and explain -what each piece means. It returned a dictionary, so let's look there first:: - - { 18: [...], 2: [...], 30: [...] } - -Each of the numbers refers to a particular address family. In this case, we -have three address families listed; on my system, 18 is ``AF_LINK`` (which means -the link layer interface, e.g. Ethernet), 2 is ``AF_INET`` (normal Internet -addresses), and 30 is ``AF_INET6`` (IPv6). - -But wait! Don't use these numbers in your code. The numeric values here are -system dependent; fortunately, I thought of that when writing netifaces, so -the module declares a range of values that you might need. e.g. - ->>> netifaces.AF_LINK -18 - -Again, on your system, the number may be different. - -So, what we've established is that the dictionary that's returned has one -entry for each address family for which this interface has an address. 
Let's -take a look at the ``AF_INET`` addresses now: - ->>> addrs = netifaces.ifaddresses('lo0') ->>> addrs[netifaces.AF_INET] -[{'peer': '127.0.0.1', 'netmask': '255.0.0.0', 'addr': '127.0.0.1'}] - -You might be wondering why this value is a list. The reason is that it's -possible for an interface to have more than one address, even within the -same family. I'll say that again: *you can have more than one address of -the same type associated with each interface*. - -*Asking for "the" address of a particular interface doesn't make sense.* - -Right, so, we can see that this particular interface only has one address, -and, because it's a loopback interface, it's point-to-point and therefore -has a *peer* address rather than a broadcast address. - -Let's look at a more interesting interface. - ->>> addrs = netifaces.ifaddresses('en0') ->>> addrs[netifaces.AF_INET] -[{'broadcast': '10.15.255.255', 'netmask': '255.240.0.0', 'addr': '10.0.1.4'}, {'broadcast': '192.168.0.255', 'addr': '192.168.0.47'}] - -This interface has two addresses (see, I told you...) Both of them are -regular IPv4 addresses, although in one case the netmask has been changed -from its default. The netmask *may not* appear on your system if it's set -to the default for the address range. - -Because this interface isn't point-to-point, it also has broadcast addresses. - -Now, say we want, instead of the IP addresses, to get the MAC address; that -is, the hardware address of the Ethernet adapter running this interface. We -can do - ->>> addrs[netifaces.AF_LINK] -[{'addr': '00:12:34:56:78:9a'}] - -Note that this may not be available on platforms without getifaddrs(), unless -they happen to implement ``SIOCGIFHWADDR``. Note also that you just get the -address; it's unlikely that you'll see anything else with an ``AF_LINK`` address. 
-Oh, and don't assume that all ``AF_LINK`` addresses are Ethernet; you might, for -instance, be on a Mac, in which case: - ->>> addrs = netifaces.ifaddresses('fw0') ->>> addrs[netifaces.AF_LINK] -[{'addr': '00:12:34:56:78:9a:bc:de'}] - -No, that isn't an exceptionally long Ethernet MAC address---it's a FireWire -address. - -As of version 0.10.0, you can also obtain a list of gateways on your -machine: - ->>> netifaces.gateways() -{2: [('10.0.1.1', 'en0', True), ('10.2.1.1', 'en1', False)], 30: [('fe80::1', 'en0', True)], 'default': { 2: ('10.0.1.1', 'en0'), 30: ('fe80::1', 'en0') }} - -This dictionary is keyed on address family---in this case, ``AF_INET``---and -each entry is a list of gateways as ``(address, interface, is_default)`` tuples. -Notice that here we have two separate gateways for IPv4 (``AF_INET``); some -operating systems support configurations like this and can either route packets -based on their source, or based on administratively configured routing tables. - -For convenience, we also allow you to index the dictionary with the special -value ``'default'``, which returns a dictionary mapping address families to the -default gateway in each case. Thus you can get the default IPv4 gateway with - ->>> gws = netifaces.gateways() ->>> gws['default'][netifaces.AF_INET] -('10.0.1.1', 'en0') - -Do note that there may be no default gateway for any given address family; -this is currently very common for IPv6 and much less common for IPv4 but it -can happen even for ``AF_INET``. - -BTW, if you're trying to configure your machine to have multiple gateways for -the same address family, it's a very good idea to check the documentation for -your operating system *very* carefully, as some systems become extremely -confused or route packets in a non-obvious manner. - -I'm very interested in hearing from anyone (on any platform) for whom the -``gateways()`` method doesn't produce the expected results. 
It's quite -complicated extracting this information from the operating system (whichever -operating system we're talking about), and so I expect there's at least one -system out there where this just won't work. - -3. This is great! What platforms does it work on? --------------------------------------------------- - -It gets regular testing on OS X, Linux and Windows. It has also been used -successfully on Solaris, and it's expected to work properly on other UNIX-like -systems as well. If you are running something that is not supported, and -wish to contribute a patch, please use BitBucket to send a pull request. - -4. What license is this under? ------------------------------- - -It's an MIT-style license. Here goes: - -Copyright (c) 2007-2017 Alastair Houghton - -Permission is hereby granted, free of charge, to any person obtaining a copy -of this software and associated documentation files (the "Software"), to deal -in the Software without restriction, including without limitation the rights -to use, copy, modify, merge, publish, distribute, sublicense, and/or sell -copies of the Software, and to permit persons to whom the Software is -furnished to do so, subject to the following conditions: - -The above copyright notice and this permission notice shall be included in all -copies or substantial portions of the Software. - -THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR -IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, -FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE -AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER -LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, -OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE -SOFTWARE. - -5. Why the jump to 0.10.0? --------------------------- - -Because someone released a fork of netifaces with the version 0.9.0. 
-Hopefully skipping the version number should remove any confusion. In -addition starting with 0.10.0 Python 3 is now supported and other -features/bugfixes have been included as well. See the CHANGELOG for a -more complete list of changes. - - diff --git a/lib/python3.4/site-packages/netifaces-0.10.6.dist-info/RECORD b/lib/python3.4/site-packages/netifaces-0.10.6.dist-info/RECORD deleted file mode 100644 index c9438a2..0000000 --- a/lib/python3.4/site-packages/netifaces-0.10.6.dist-info/RECORD +++ /dev/null @@ -1,9 +0,0 @@ -netifaces.cpython-34m.so,sha256=KiLZHMhvo_x40-9D0bLqZoVzQsGbimZY_33SUPowm9E,72976 -netifaces-0.10.6.dist-info/DESCRIPTION.rst,sha256=WCNR0xdB7g_1r_U6WwIedMlurGlPeDjvJX-NBElPoII,8555 -netifaces-0.10.6.dist-info/METADATA,sha256=InwXovYI_sgETAChE4hBUFbkSwYlZ_gWeKcNvyX8KOA,9322 -netifaces-0.10.6.dist-info/RECORD,, -netifaces-0.10.6.dist-info/WHEEL,sha256=AEztX7vHDtcgysb-4-5-DyIKMLIPg6NMxY9dXTRdoXQ,104 -netifaces-0.10.6.dist-info/metadata.json,sha256=W-IHSrO0Ma846gdBr18QTsvc9GjGN0SgAnZha0vW9tU,885 -netifaces-0.10.6.dist-info/top_level.txt,sha256=PqMTaIuWtSjkdQHX6lH1Lmpv2aqBUYAGqATB8z3A6TQ,10 -netifaces-0.10.6.dist-info/zip-safe,sha256=AbpHGcgLb-kRsJGnwFEktk7uzpZOCcBY74-YBdrKVGs,1 -netifaces-0.10.6.dist-info/INSTALLER,sha256=zuuue4knoyJ-UwPPXg8fezS7VCrXJQrAP7zeNuwvFQg,4 diff --git a/lib/python3.4/site-packages/netifaces-0.10.6.dist-info/WHEEL b/lib/python3.4/site-packages/netifaces-0.10.6.dist-info/WHEEL deleted file mode 100644 index 1fdf70f..0000000 --- a/lib/python3.4/site-packages/netifaces-0.10.6.dist-info/WHEEL +++ /dev/null @@ -1,5 +0,0 @@ -Wheel-Version: 1.0 -Generator: bdist_wheel (0.30.0) -Root-Is-Purelib: false -Tag: cp34-cp34m-linux_x86_64 - diff --git a/lib/python3.4/site-packages/netifaces-0.10.6.dist-info/metadata.json b/lib/python3.4/site-packages/netifaces-0.10.6.dist-info/metadata.json deleted file mode 100644 index 7de3738..0000000 --- a/lib/python3.4/site-packages/netifaces-0.10.6.dist-info/metadata.json +++ /dev/null @@ -1 +0,0 
@@ -{"classifiers": ["Development Status :: 4 - Beta", "Intended Audience :: Developers", "License :: OSI Approved :: MIT License", "Topic :: System :: Networking", "Programming Language :: Python", "Programming Language :: Python :: 2", "Programming Language :: Python :: 2.5", "Programming Language :: Python :: 2.6", "Programming Language :: Python :: 2.7", "Programming Language :: Python :: 3"], "description_content_type": "UNKNOWN", "extensions": {"python.details": {"contacts": [{"email": "alastair@alastairs-place.net", "name": "Alastair Houghton", "role": "author"}], "document_names": {"description": "DESCRIPTION.rst"}, "project_urls": {"Home": "https://bitbucket.org/al45tair/netifaces"}}}, "generator": "bdist_wheel (0.30.0)", "license": "MIT License", "metadata_version": "2.0", "name": "netifaces", "summary": "Portable network interface information.", "version": "0.10.6"} \ No newline at end of file diff --git a/lib/python3.4/site-packages/netifaces.cpython-34m.so b/lib/python3.4/site-packages/netifaces.cpython-34m.so deleted file mode 100755 index 376c5cd..0000000 Binary files a/lib/python3.4/site-packages/netifaces.cpython-34m.so and /dev/null differ diff --git a/lib/python3.4/site-packages/netifaces.cpython-35m-x86_64-linux-gnu.so b/lib/python3.4/site-packages/netifaces.cpython-35m-x86_64-linux-gnu.so deleted file mode 100755 index 6393135..0000000 Binary files a/lib/python3.4/site-packages/netifaces.cpython-35m-x86_64-linux-gnu.so and /dev/null differ diff --git a/lib/python3.4/site-packages/netifaces.cpython-36m-x86_64-linux-gnu.so b/lib/python3.4/site-packages/netifaces.cpython-36m-x86_64-linux-gnu.so deleted file mode 100755 index f87df74..0000000 Binary files a/lib/python3.4/site-packages/netifaces.cpython-36m-x86_64-linux-gnu.so and /dev/null differ diff --git a/lib/python3.4/site-packages/pip-9.0.1.dist-info/DESCRIPTION.rst b/lib/python3.4/site-packages/pip-9.0.1.dist-info/DESCRIPTION.rst deleted file mode 100644 index 8ef94c4..0000000 --- 
a/lib/python3.4/site-packages/pip-9.0.1.dist-info/DESCRIPTION.rst +++ /dev/null @@ -1,39 +0,0 @@ -pip -=== - -The `PyPA recommended -`_ -tool for installing Python packages. - -* `Installation `_ -* `Documentation `_ -* `Changelog `_ -* `Github Page `_ -* `Issue Tracking `_ -* `User mailing list `_ -* `Dev mailing list `_ -* User IRC: #pypa on Freenode. -* Dev IRC: #pypa-dev on Freenode. - - -.. image:: https://img.shields.io/pypi/v/pip.svg - :target: https://pypi.python.org/pypi/pip - -.. image:: https://img.shields.io/travis/pypa/pip/master.svg - :target: http://travis-ci.org/pypa/pip - -.. image:: https://img.shields.io/appveyor/ci/pypa/pip.svg - :target: https://ci.appveyor.com/project/pypa/pip/history - -.. image:: https://readthedocs.org/projects/pip/badge/?version=stable - :target: https://pip.pypa.io/en/stable - -Code of Conduct ---------------- - -Everyone interacting in the pip project's codebases, issue trackers, chat -rooms, and mailing lists is expected to follow the `PyPA Code of Conduct`_. - -.. _PyPA Code of Conduct: https://www.pypa.io/en/latest/code-of-conduct/ - - diff --git a/lib/python3.4/site-packages/pip-9.0.1.dist-info/METADATA b/lib/python3.4/site-packages/pip-9.0.1.dist-info/METADATA deleted file mode 100644 index 600a905..0000000 --- a/lib/python3.4/site-packages/pip-9.0.1.dist-info/METADATA +++ /dev/null @@ -1,69 +0,0 @@ -Metadata-Version: 2.0 -Name: pip -Version: 9.0.1 -Summary: The PyPA recommended tool for installing Python packages. 
-Home-page: https://pip.pypa.io/ -Author: The pip developers -Author-email: python-virtualenv@groups.google.com -License: MIT -Keywords: easy_install distutils setuptools egg virtualenv -Platform: UNKNOWN -Classifier: Development Status :: 5 - Production/Stable -Classifier: Intended Audience :: Developers -Classifier: License :: OSI Approved :: MIT License -Classifier: Topic :: Software Development :: Build Tools -Classifier: Programming Language :: Python :: 2 -Classifier: Programming Language :: Python :: 2.6 -Classifier: Programming Language :: Python :: 2.7 -Classifier: Programming Language :: Python :: 3 -Classifier: Programming Language :: Python :: 3.3 -Classifier: Programming Language :: Python :: 3.4 -Classifier: Programming Language :: Python :: 3.5 -Classifier: Programming Language :: Python :: Implementation :: PyPy -Requires-Python: >=2.6,!=3.0.*,!=3.1.*,!=3.2.* -Provides-Extra: testing -Requires-Dist: mock; extra == 'testing' -Requires-Dist: pretend; extra == 'testing' -Requires-Dist: pytest; extra == 'testing' -Requires-Dist: scripttest (>=1.3); extra == 'testing' -Requires-Dist: virtualenv (>=1.10); extra == 'testing' - -pip -=== - -The `PyPA recommended -`_ -tool for installing Python packages. - -* `Installation `_ -* `Documentation `_ -* `Changelog `_ -* `Github Page `_ -* `Issue Tracking `_ -* `User mailing list `_ -* `Dev mailing list `_ -* User IRC: #pypa on Freenode. -* Dev IRC: #pypa-dev on Freenode. - - -.. image:: https://img.shields.io/pypi/v/pip.svg - :target: https://pypi.python.org/pypi/pip - -.. image:: https://img.shields.io/travis/pypa/pip/master.svg - :target: http://travis-ci.org/pypa/pip - -.. image:: https://img.shields.io/appveyor/ci/pypa/pip.svg - :target: https://ci.appveyor.com/project/pypa/pip/history - -.. 
image:: https://readthedocs.org/projects/pip/badge/?version=stable - :target: https://pip.pypa.io/en/stable - -Code of Conduct ---------------- - -Everyone interacting in the pip project's codebases, issue trackers, chat -rooms, and mailing lists is expected to follow the `PyPA Code of Conduct`_. - -.. _PyPA Code of Conduct: https://www.pypa.io/en/latest/code-of-conduct/ - - diff --git a/lib/python3.4/site-packages/pip-9.0.1.dist-info/RECORD b/lib/python3.4/site-packages/pip-9.0.1.dist-info/RECORD deleted file mode 100644 index 86fe98d..0000000 --- a/lib/python3.4/site-packages/pip-9.0.1.dist-info/RECORD +++ /dev/null @@ -1,123 +0,0 @@ -pip/__init__.py,sha256=00QWSreEBjb8Y8sPs8HeqgLXSB-3UrONJxo4J5APxEc,11348 -pip/__main__.py,sha256=V6Kh-IEDEFpt1cahRE6MajUF_14qJR_Qsvn4MjWZXzE,584 -pip/basecommand.py,sha256=TTlmZesQ4Vuxcto2KqwZGmgmN5ioHEl_DeFev9ie_SA,11910 -pip/baseparser.py,sha256=AKMOeF3fTrRroiv0DmTQbdiLW0DQux2KqGC_dJJB9d0,10465 -pip/cmdoptions.py,sha256=pRptFz05iFEfSW4Flg3x1_P92sYlFvq7elhnwujikNY,16473 -pip/download.py,sha256=rA0wbmqC2n9ejX481YJSidmKgQqQDjdaxkHkHlAN68k,32171 -pip/exceptions.py,sha256=BvqH-Jw3tP2b-2IJ2kjrQemOAPMqKrQMLRIZHZQpJXk,8121 -pip/index.py,sha256=L6UhtAEZc2qw7BqfQrkPQcw2gCgEw3GukLRSA95BNyI,39950 -pip/locations.py,sha256=9rJRlgonC6QC2zGDIn_7mXaoZ9_tF_IHM2BQhWVRgbo,5626 -pip/pep425tags.py,sha256=q3kec4f6NHszuGYIhGIbVvs896D06uJAnKFgJ_wce44,10980 -pip/status_codes.py,sha256=F6uDG6Gj7RNKQJUDnd87QKqI16Us-t-B0wPF_4QMpWc,156 -pip/wheel.py,sha256=QSWmGs2ui-n4UMWm0JUY6aMCcwNKungVzbWsxI9KlJQ,32010 -pip/_vendor/__init__.py,sha256=L-0x9jj0HSZen1Fm2U0GUbxfjfwQPIXc4XJ4IAxy8D8,4804 -pip/commands/__init__.py,sha256=2Uq3HCdjchJD9FL1LB7rd5v6UySVAVizX0W3EX3hIoE,2244 -pip/commands/check.py,sha256=-A7GI1-WZBh9a4P6UoH_aR-J7I8Lz8ly7m3wnCjmevs,1382 -pip/commands/completion.py,sha256=kkPgVX7SUcJ_8Juw5GkgWaxHN9_45wmAr9mGs1zXEEs,2453 -pip/commands/download.py,sha256=8RuuPmSYgAq3iEDTqZY_1PDXRqREdUULHNjWJeAv7Mo,7810 
-pip/commands/freeze.py,sha256=h6-yFMpjCjbNj8-gOm5UuoF6cg14N5rPV4TCi3_CeuI,2835 -pip/commands/hash.py,sha256=MCt4jEFyfoce0lVeNEz1x49uaTY-VDkKiBvvxrVcHkw,1597 -pip/commands/help.py,sha256=84HWkEdnGP_AEBHnn8gJP2Te0XTXRKFoXqXopbOZTNo,982 -pip/commands/install.py,sha256=o-CR1TKf-b1qaFv47nNlawqsIfDjXyIzv_iJUw1Trag,18069 -pip/commands/list.py,sha256=93bCiFyt2Qut_YHkYHJMZHpXladmxsjS-yOtZeb3uqI,11369 -pip/commands/search.py,sha256=oTs9QNdefnrmCV_JeftG0PGiMuYVmiEDF1OUaYsmDao,4502 -pip/commands/show.py,sha256=ZYM57_7U8KP9MQIIyHKQdZxmiEZByy-DRzB697VFoTY,5891 -pip/commands/uninstall.py,sha256=tz8cXz4WdpUdnt3RvpdQwH6_SNMB50egBIZWa1dwfcc,2884 -pip/commands/wheel.py,sha256=z5SEhws2YRMb0Ml1IEkg6jFZMLRpLl86bHCrQbYt5zo,7729 -pip/compat/__init__.py,sha256=2Xs_IpsmdRgHbQgQO0c8_lPvHJnQXHyGWxPbLbYJL4c,4672 -pip/compat/dictconfig.py,sha256=dRrelPDWrceDSzFT51RTEVY2GuM7UDyc5Igh_tn4Fvk,23096 -pip/models/__init__.py,sha256=0Rs7_RA4DxeOkWT5Cq4CQzDrSEhvYcN3TH2cazr72PE,71 -pip/models/index.py,sha256=pUfbO__v3mD9j-2n_ClwPS8pVyx4l2wIwyvWt8GMCRA,487 -pip/operations/__init__.py,sha256=47DEQpj8HBSa-_TImW-5JCeuQeRkm5NMpJWZG3hSuFU,0 -pip/operations/check.py,sha256=uwUN9cs1sPo7c0Sj6pRrSv7b22Pk29SXUImTelVchMQ,1590 -pip/operations/freeze.py,sha256=k-7w7LsM-RpPv7ERBzHiPpYkH-GuYfHLyR-Cp_1VPL0,5194 -pip/req/__init__.py,sha256=vFwZY8_Vc1WU1zFAespg1My_r_AT3n7cN0W9eX0EFqk,276 -pip/req/req_file.py,sha256=fG9MDsXUNPhmGwxUiwrIXEynyD8Q7s3L47-hLZPDXq0,11926 -pip/req/req_install.py,sha256=gYrH-lwQMmt55VVbav_EtRIPu94cQbHFHm_Kq6AeHbg,46487 -pip/req/req_set.py,sha256=jHspXqcA2FxcF05dgUIAZ5huYPv6bn0wRUX0Z7PKmaA,34462 -pip/req/req_uninstall.py,sha256=fdH2VgCjEC8NRYDS7fRu3ZJaBBUEy-N5muwxDX5MBNM,6897 -pip/utils/__init__.py,sha256=zk1vF2EzHZX1ZKPwgeC9I6yKvs8IJ6NZEfXgp2IP8hI,27912 -pip/utils/appdirs.py,sha256=kj2LK-I2fC5QnEh_A_v-ev_IQMcXaWWF5DE39sNvCLQ,8811 -pip/utils/build.py,sha256=4smLRrfSCmXmjEnVnMFh2tBEpNcSLRe6J0ejZJ-wWJE,1312 -pip/utils/deprecation.py,sha256=X_FMjtDbMJqfqEkdRrki-mYyIdPB6I6DHUTCA_ChY6M,2232 
-pip/utils/encoding.py,sha256=NQxGiFS5GbeAveLZTnx92t5r0PYqvt0iRnP2u9SGG1w,971 -pip/utils/filesystem.py,sha256=ZEVBuYM3fqr2_lgOESh4Y7fPFszGD474zVm_M3Mb5Tk,899 -pip/utils/glibc.py,sha256=jcQYjt_oJLPKVZB28Kauy4Sw70zS-wawxoU1HHX36_0,2939 -pip/utils/hashes.py,sha256=oMk7cd3PbJgzpSQyXq1MytMud5f6H5Oa2YY5hYuCq6I,2866 -pip/utils/logging.py,sha256=7yWu4gZw-Qclj7X80QVdpGWkdTWGKT4LiUVKcE04pro,3327 -pip/utils/outdated.py,sha256=fNwOCL5r2EftPGhgCYGMKu032HC8cV-JAr9lp0HmToM,5455 -pip/utils/packaging.py,sha256=qhmli14odw6DIhWJgQYS2Q0RrSbr8nXNcG48f5yTRms,2080 -pip/utils/setuptools_build.py,sha256=0blfscmNJW_iZ5DcswJeDB_PbtTEjfK9RL1R1WEDW2E,278 -pip/utils/ui.py,sha256=pbDkSAeumZ6jdZcOJ2yAbx8iBgeP2zfpqNnLJK1gskQ,11597 -pip/vcs/__init__.py,sha256=WafFliUTHMmsSISV8PHp1M5EXDNSWyJr78zKaQmPLdY,12374 -pip/vcs/bazaar.py,sha256=tYTwc4b4off8mr0O2o8SiGejqBDJxcbDBMSMd9-ISYc,3803 -pip/vcs/git.py,sha256=5LfWryi78A-2ULjEZJvCTarJ_3l8venwXASlwm8hiug,11197 -pip/vcs/mercurial.py,sha256=xG6rDiwHCRytJEs23SIHBXl_SwQo2jkkdD_6rVVP5h4,3472 -pip/vcs/subversion.py,sha256=GAuX2Sk7IZvJyEzENKcVld_wGBrQ3fpXDlXjapZEYdI,9350 -pip-9.0.1.dist-info/DESCRIPTION.rst,sha256=Va8Wj1XBpTbVQ2Z41mZRJdALEeziiS_ZewWn1H2ecY4,1287 -pip-9.0.1.dist-info/METADATA,sha256=mvs_tLoKAbECXY_6QHiVWQsagSL-1UjolQTpScT8JSk,2529 -pip-9.0.1.dist-info/RECORD,, -pip-9.0.1.dist-info/WHEEL,sha256=o2k-Qa-RMNIJmUdIc7KU6VWR_ErNRbWNlxDIpl7lm34,110 -pip-9.0.1.dist-info/entry_points.txt,sha256=GWc-Wb9WUKZ1EuVWNz-G0l3BeIpbNJLx0OJbZ61AAV0,68 -pip-9.0.1.dist-info/metadata.json,sha256=aqvkETDy4mHUBob-2Fn5WWlXORi_M2OSfQ2HQCUU_Fk,1565 -pip-9.0.1.dist-info/top_level.txt,sha256=zuuue4knoyJ-UwPPXg8fezS7VCrXJQrAP7zeNuwvFQg,4 -../../../bin/pip,sha256=kUtfTrIe4CRluRco6nKs-hUx0Eir2ABPF8Rr_1zK534,272 -../../../bin/pip3,sha256=kUtfTrIe4CRluRco6nKs-hUx0Eir2ABPF8Rr_1zK534,272 -../../../bin/pip3.4,sha256=kUtfTrIe4CRluRco6nKs-hUx0Eir2ABPF8Rr_1zK534,272 -pip-9.0.1.dist-info/INSTALLER,sha256=zuuue4knoyJ-UwPPXg8fezS7VCrXJQrAP7zeNuwvFQg,4 
-pip/__pycache__/exceptions.cpython-34.pyc,, -pip/utils/__pycache__/ui.cpython-34.pyc,, -pip/__pycache__/basecommand.cpython-34.pyc,, -pip/commands/__pycache__/check.cpython-34.pyc,, -pip/utils/__pycache__/packaging.cpython-34.pyc,, -pip/utils/__pycache__/build.cpython-34.pyc,, -pip/vcs/__pycache__/__init__.cpython-34.pyc,, -pip/__pycache__/download.cpython-34.pyc,, -pip/utils/__pycache__/setuptools_build.cpython-34.pyc,, -pip/req/__pycache__/req_uninstall.cpython-34.pyc,, -pip/utils/__pycache__/deprecation.cpython-34.pyc,, -pip/operations/__pycache__/check.cpython-34.pyc,, -pip/_vendor/__pycache__/__init__.cpython-34.pyc,, -pip/utils/__pycache__/outdated.cpython-34.pyc,, -pip/commands/__pycache__/install.cpython-34.pyc,, -pip/operations/__pycache__/__init__.cpython-34.pyc,, -pip/commands/__pycache__/freeze.cpython-34.pyc,, -pip/req/__pycache__/req_set.cpython-34.pyc,, -pip/operations/__pycache__/freeze.cpython-34.pyc,, -pip/__pycache__/baseparser.cpython-34.pyc,, -pip/commands/__pycache__/hash.cpython-34.pyc,, -pip/commands/__pycache__/download.cpython-34.pyc,, -pip/commands/__pycache__/wheel.cpython-34.pyc,, -pip/commands/__pycache__/help.cpython-34.pyc,, -pip/utils/__pycache__/glibc.cpython-34.pyc,, -pip/__pycache__/locations.cpython-34.pyc,, -pip/commands/__pycache__/list.cpython-34.pyc,, -pip/compat/__pycache__/dictconfig.cpython-34.pyc,, -pip/__pycache__/__init__.cpython-34.pyc,, -pip/utils/__pycache__/hashes.cpython-34.pyc,, -pip/compat/__pycache__/__init__.cpython-34.pyc,, -pip/vcs/__pycache__/git.cpython-34.pyc,, -pip/req/__pycache__/__init__.cpython-34.pyc,, -pip/__pycache__/__main__.cpython-34.pyc,, -pip/__pycache__/status_codes.cpython-34.pyc,, -pip/models/__pycache__/index.cpython-34.pyc,, -pip/__pycache__/pep425tags.cpython-34.pyc,, -pip/commands/__pycache__/uninstall.cpython-34.pyc,, -pip/vcs/__pycache__/bazaar.cpython-34.pyc,, -pip/req/__pycache__/req_install.cpython-34.pyc,, -pip/vcs/__pycache__/mercurial.cpython-34.pyc,, 
-pip/commands/__pycache__/__init__.cpython-34.pyc,, -pip/commands/__pycache__/show.cpython-34.pyc,, -pip/__pycache__/index.cpython-34.pyc,, -pip/commands/__pycache__/completion.cpython-34.pyc,, -pip/req/__pycache__/req_file.cpython-34.pyc,, -pip/__pycache__/cmdoptions.cpython-34.pyc,, -pip/utils/__pycache__/filesystem.cpython-34.pyc,, -pip/__pycache__/wheel.cpython-34.pyc,, -pip/utils/__pycache__/appdirs.cpython-34.pyc,, -pip/utils/__pycache__/__init__.cpython-34.pyc,, -pip/vcs/__pycache__/subversion.cpython-34.pyc,, -pip/utils/__pycache__/logging.cpython-34.pyc,, -pip/commands/__pycache__/search.cpython-34.pyc,, -pip/utils/__pycache__/encoding.cpython-34.pyc,, -pip/models/__pycache__/__init__.cpython-34.pyc,, diff --git a/lib/python3.4/site-packages/pip-9.0.1.dist-info/entry_points.txt b/lib/python3.4/site-packages/pip-9.0.1.dist-info/entry_points.txt deleted file mode 100644 index c02a8d5..0000000 --- a/lib/python3.4/site-packages/pip-9.0.1.dist-info/entry_points.txt +++ /dev/null @@ -1,5 +0,0 @@ -[console_scripts] -pip = pip:main -pip3 = pip:main -pip3.5 = pip:main - diff --git a/lib/python3.4/site-packages/pip-9.0.1.dist-info/metadata.json b/lib/python3.4/site-packages/pip-9.0.1.dist-info/metadata.json deleted file mode 100644 index 9eae02c..0000000 --- a/lib/python3.4/site-packages/pip-9.0.1.dist-info/metadata.json +++ /dev/null @@ -1 +0,0 @@ -{"classifiers": ["Development Status :: 5 - Production/Stable", "Intended Audience :: Developers", "License :: OSI Approved :: MIT License", "Topic :: Software Development :: Build Tools", "Programming Language :: Python :: 2", "Programming Language :: Python :: 2.6", "Programming Language :: Python :: 2.7", "Programming Language :: Python :: 3", "Programming Language :: Python :: 3.3", "Programming Language :: Python :: 3.4", "Programming Language :: Python :: 3.5", "Programming Language :: Python :: Implementation :: PyPy"], "extensions": {"python.commands": {"wrap_console": {"pip": "pip:main", "pip3": "pip:main", 
"pip3.5": "pip:main"}}, "python.details": {"contacts": [{"email": "python-virtualenv@groups.google.com", "name": "The pip developers", "role": "author"}], "document_names": {"description": "DESCRIPTION.rst"}, "project_urls": {"Home": "https://pip.pypa.io/"}}, "python.exports": {"console_scripts": {"pip": "pip:main", "pip3": "pip:main", "pip3.5": "pip:main"}}}, "extras": ["testing"], "generator": "bdist_wheel (0.29.0)", "keywords": ["easy_install", "distutils", "setuptools", "egg", "virtualenv"], "license": "MIT", "metadata_version": "2.0", "name": "pip", "requires_python": ">=2.6,!=3.0.*,!=3.1.*,!=3.2.*", "run_requires": [{"extra": "testing", "requires": ["mock", "pretend", "pytest", "scripttest (>=1.3)", "virtualenv (>=1.10)"]}], "summary": "The PyPA recommended tool for installing Python packages.", "test_requires": [{"requires": ["mock", "pretend", "pytest", "scripttest (>=1.3)", "virtualenv (>=1.10)"]}], "version": "9.0.1"} \ No newline at end of file diff --git a/lib/python3.4/site-packages/pip/__init__.py b/lib/python3.4/site-packages/pip/__init__.py deleted file mode 100644 index 9c1d8f9..0000000 --- a/lib/python3.4/site-packages/pip/__init__.py +++ /dev/null @@ -1,331 +0,0 @@ -#!/usr/bin/env python -from __future__ import absolute_import - -import locale -import logging -import os -import optparse -import warnings - -import sys -import re - -# 2016-06-17 barry@debian.org: urllib3 1.14 added optional support for socks, -# but if invoked (i.e. imported), it will issue a warning to stderr if socks -# isn't available. requests unconditionally imports urllib3's socks contrib -# module, triggering this warning. The warning breaks DEP-8 tests (because of -# the stderr output) and is just plain annoying in normal usage. I don't want -# to add socks as yet another dependency for pip, nor do I want to allow-stder -# in the DEP-8 tests, so just suppress the warning. pdb tells me this has to -# be done before the import of pip.vcs. 
-from pip._vendor.requests.packages.urllib3.exceptions import DependencyWarning -warnings.filterwarnings("ignore", category=DependencyWarning) # noqa - - -from pip.exceptions import InstallationError, CommandError, PipError -from pip.utils import get_installed_distributions, get_prog -from pip.utils import deprecation, dist_is_editable -from pip.vcs import git, mercurial, subversion, bazaar # noqa -from pip.baseparser import ConfigOptionParser, UpdatingDefaultsHelpFormatter -from pip.commands import get_summaries, get_similar_commands -from pip.commands import commands_dict -from pip._vendor.requests.packages.urllib3.exceptions import ( - InsecureRequestWarning, -) - - -# assignment for flake8 to be happy - -# This fixes a peculiarity when importing via __import__ - as we are -# initialising the pip module, "from pip import cmdoptions" is recursive -# and appears not to work properly in that situation. -import pip.cmdoptions -cmdoptions = pip.cmdoptions - -# The version as used in the setup.py and the docs conf.py -__version__ = "9.0.1" - - -logger = logging.getLogger(__name__) - -# Hide the InsecureRequestWarning from urllib3 -warnings.filterwarnings("ignore", category=InsecureRequestWarning) - - -def autocomplete(): - """Command and option completion for the main option parser (and options) - and its subcommands (and options). - - Enable by sourcing one of the completion shell scripts (bash, zsh or fish). - """ - # Don't complete if user hasn't sourced bash_completion file. 
- if 'PIP_AUTO_COMPLETE' not in os.environ: - return - cwords = os.environ['COMP_WORDS'].split()[1:] - cword = int(os.environ['COMP_CWORD']) - try: - current = cwords[cword - 1] - except IndexError: - current = '' - - subcommands = [cmd for cmd, summary in get_summaries()] - options = [] - # subcommand - try: - subcommand_name = [w for w in cwords if w in subcommands][0] - except IndexError: - subcommand_name = None - - parser = create_main_parser() - # subcommand options - if subcommand_name: - # special case: 'help' subcommand has no options - if subcommand_name == 'help': - sys.exit(1) - # special case: list locally installed dists for uninstall command - if subcommand_name == 'uninstall' and not current.startswith('-'): - installed = [] - lc = current.lower() - for dist in get_installed_distributions(local_only=True): - if dist.key.startswith(lc) and dist.key not in cwords[1:]: - installed.append(dist.key) - # if there are no dists installed, fall back to option completion - if installed: - for dist in installed: - print(dist) - sys.exit(1) - - subcommand = commands_dict[subcommand_name]() - options += [(opt.get_opt_string(), opt.nargs) - for opt in subcommand.parser.option_list_all - if opt.help != optparse.SUPPRESS_HELP] - - # filter out previously specified options from available options - prev_opts = [x.split('=')[0] for x in cwords[1:cword - 1]] - options = [(x, v) for (x, v) in options if x not in prev_opts] - # filter options by current input - options = [(k, v) for k, v in options if k.startswith(current)] - for option in options: - opt_label = option[0] - # append '=' to options which require args - if option[1]: - opt_label += '=' - print(opt_label) - else: - # show main parser options only when necessary - if current.startswith('-') or current.startswith('--'): - opts = [i.option_list for i in parser.option_groups] - opts.append(parser.option_list) - opts = (o for it in opts for o in it) - - subcommands += [i.get_opt_string() for i in opts - if 
i.help != optparse.SUPPRESS_HELP] - - print(' '.join([x for x in subcommands if x.startswith(current)])) - sys.exit(1) - - -def create_main_parser(): - parser_kw = { - 'usage': '\n%prog [options]', - 'add_help_option': False, - 'formatter': UpdatingDefaultsHelpFormatter(), - 'name': 'global', - 'prog': get_prog(), - } - - parser = ConfigOptionParser(**parser_kw) - parser.disable_interspersed_args() - - pip_pkg_dir = os.path.dirname(os.path.dirname(os.path.abspath(__file__))) - parser.version = 'pip %s from %s (python %s)' % ( - __version__, pip_pkg_dir, sys.version[:3]) - - # add the general options - gen_opts = cmdoptions.make_option_group(cmdoptions.general_group, parser) - parser.add_option_group(gen_opts) - - parser.main = True # so the help formatter knows - - # create command listing for description - command_summaries = get_summaries() - description = [''] + ['%-27s %s' % (i, j) for i, j in command_summaries] - parser.description = '\n'.join(description) - - return parser - - -def parseopts(args): - parser = create_main_parser() - - # Note: parser calls disable_interspersed_args(), so the result of this - # call is to split the initial args into the general options before the - # subcommand and everything else. 
- # For example: - # args: ['--timeout=5', 'install', '--user', 'INITools'] - # general_options: ['--timeout==5'] - # args_else: ['install', '--user', 'INITools'] - general_options, args_else = parser.parse_args(args) - - # --version - if general_options.version: - sys.stdout.write(parser.version) - sys.stdout.write(os.linesep) - sys.exit() - - # pip || pip help -> print_help() - if not args_else or (args_else[0] == 'help' and len(args_else) == 1): - parser.print_help() - sys.exit() - - # the subcommand name - cmd_name = args_else[0] - - if cmd_name not in commands_dict: - guess = get_similar_commands(cmd_name) - - msg = ['unknown command "%s"' % cmd_name] - if guess: - msg.append('maybe you meant "%s"' % guess) - - raise CommandError(' - '.join(msg)) - - # all the args without the subcommand - cmd_args = args[:] - cmd_args.remove(cmd_name) - - return cmd_name, cmd_args - - -def check_isolated(args): - isolated = False - - if "--isolated" in args: - isolated = True - - return isolated - - -def main(args=None): - if args is None: - args = sys.argv[1:] - - # Configure our deprecation warnings to be sent through loggers - deprecation.install_warning_logger() - - autocomplete() - - try: - cmd_name, cmd_args = parseopts(args) - except PipError as exc: - sys.stderr.write("ERROR: %s" % exc) - sys.stderr.write(os.linesep) - sys.exit(1) - - # Needed for locale.getpreferredencoding(False) to work - # in pip.utils.encoding.auto_decode - try: - locale.setlocale(locale.LC_ALL, '') - except locale.Error as e: - # setlocale can apparently crash if locale are uninitialized - logger.debug("Ignoring error %s when setting locale", e) - command = commands_dict[cmd_name](isolated=check_isolated(cmd_args)) - return command.main(cmd_args) - - -# ########################################################### -# # Writing freeze files - -class FrozenRequirement(object): - - def __init__(self, name, req, editable, comments=()): - self.name = name - self.req = req - self.editable = editable - 
self.comments = comments - - _rev_re = re.compile(r'-r(\d+)$') - _date_re = re.compile(r'-(20\d\d\d\d\d\d)$') - - @classmethod - def from_dist(cls, dist, dependency_links): - location = os.path.normcase(os.path.abspath(dist.location)) - comments = [] - from pip.vcs import vcs, get_src_requirement - if dist_is_editable(dist) and vcs.get_backend_name(location): - editable = True - try: - req = get_src_requirement(dist, location) - except InstallationError as exc: - logger.warning( - "Error when trying to get requirement for VCS system %s, " - "falling back to uneditable format", exc - ) - req = None - if req is None: - logger.warning( - 'Could not determine repository location of %s', location - ) - comments.append( - '## !! Could not determine repository location' - ) - req = dist.as_requirement() - editable = False - else: - editable = False - req = dist.as_requirement() - specs = req.specs - assert len(specs) == 1 and specs[0][0] in ["==", "==="], \ - 'Expected 1 spec with == or ===; specs = %r; dist = %r' % \ - (specs, dist) - version = specs[0][1] - ver_match = cls._rev_re.search(version) - date_match = cls._date_re.search(version) - if ver_match or date_match: - svn_backend = vcs.get_backend('svn') - if svn_backend: - svn_location = svn_backend().get_location( - dist, - dependency_links, - ) - if not svn_location: - logger.warning( - 'Warning: cannot find svn location for %s', req) - comments.append( - '## FIXME: could not find svn URL in dependency_links ' - 'for this package:' - ) - else: - comments.append( - '# Installing as editable to satisfy requirement %s:' % - req - ) - if ver_match: - rev = ver_match.group(1) - else: - rev = '{%s}' % date_match.group(1) - editable = True - req = '%s@%s#egg=%s' % ( - svn_location, - rev, - cls.egg_name(dist) - ) - return cls(dist.project_name, req, editable, comments) - - @staticmethod - def egg_name(dist): - name = dist.egg_name() - match = re.search(r'-py\d\.\d$', name) - if match: - name = name[:match.start()] - 
return name - - def __str__(self): - req = self.req - if self.editable: - req = '-e %s' % req - return '\n'.join(list(self.comments) + [str(req)]) + '\n' - - -if __name__ == '__main__': - sys.exit(main()) diff --git a/lib/python3.4/site-packages/pip/commands/check.py b/lib/python3.4/site-packages/pip/commands/check.py deleted file mode 100644 index 70458ad..0000000 --- a/lib/python3.4/site-packages/pip/commands/check.py +++ /dev/null @@ -1,39 +0,0 @@ -import logging - -from pip.basecommand import Command -from pip.operations.check import check_requirements -from pip.utils import get_installed_distributions - - -logger = logging.getLogger(__name__) - - -class CheckCommand(Command): - """Verify installed packages have compatible dependencies.""" - name = 'check' - usage = """ - %prog [options]""" - summary = 'Verify installed packages have compatible dependencies.' - - def run(self, options, args): - dists = get_installed_distributions(local_only=False, skip=()) - missing_reqs_dict, incompatible_reqs_dict = check_requirements(dists) - - for dist in dists: - key = '%s==%s' % (dist.project_name, dist.version) - - for requirement in missing_reqs_dict.get(key, []): - logger.info( - "%s %s requires %s, which is not installed.", - dist.project_name, dist.version, requirement.project_name) - - for requirement, actual in incompatible_reqs_dict.get(key, []): - logger.info( - "%s %s has requirement %s, but you have %s %s.", - dist.project_name, dist.version, requirement, - actual.project_name, actual.version) - - if missing_reqs_dict or incompatible_reqs_dict: - return 1 - else: - logger.info("No broken requirements found.") diff --git a/lib/python3.4/site-packages/pip/commands/install.py b/lib/python3.4/site-packages/pip/commands/install.py deleted file mode 100644 index 39292b1..0000000 --- a/lib/python3.4/site-packages/pip/commands/install.py +++ /dev/null @@ -1,455 +0,0 @@ -from __future__ import absolute_import - -import logging -import operator -import os -import 
tempfile -import shutil -import warnings -try: - import wheel -except ImportError: - wheel = None - -from pip.req import RequirementSet -from pip.basecommand import RequirementCommand -from pip.locations import virtualenv_no_global, distutils_scheme -from pip.exceptions import ( - InstallationError, CommandError, PreviousBuildDirError, -) -from pip import cmdoptions -from pip.utils import ensure_dir, get_installed_version -from pip.utils.build import BuildDirectory -from pip.utils.deprecation import RemovedInPip10Warning -from pip.utils.filesystem import check_path_owner -from pip.wheel import WheelCache, WheelBuilder - -from pip.locations import running_under_virtualenv - -logger = logging.getLogger(__name__) - - -class InstallCommand(RequirementCommand): - """ - Install packages from: - - - PyPI (and other indexes) using requirement specifiers. - - VCS project urls. - - Local project directories. - - Local or remote source archives. - - pip also supports installing from "requirements files", which provide - an easy way to specify a whole environment to be installed. - """ - name = 'install' - - usage = """ - %prog [options] [package-index-options] ... - %prog [options] -r [package-index-options] ... - %prog [options] [-e] ... - %prog [options] [-e] ... - %prog [options] ...""" - - summary = 'Install packages.' - - def __init__(self, *args, **kw): - super(InstallCommand, self).__init__(*args, **kw) - - default_user = True - if running_under_virtualenv(): - default_user = False - if os.geteuid() == 0: - default_user = False - - cmd_opts = self.cmd_opts - - cmd_opts.add_option(cmdoptions.constraints()) - cmd_opts.add_option(cmdoptions.editable()) - cmd_opts.add_option(cmdoptions.requirements()) - cmd_opts.add_option(cmdoptions.build_dir()) - - cmd_opts.add_option( - '-t', '--target', - dest='target_dir', - metavar='dir', - default=None, - help='Install packages into . ' - 'By default this will not replace existing files/folders in ' - '. 
Use --upgrade to replace existing packages in ' - 'with new versions.' - ) - - cmd_opts.add_option( - '-d', '--download', '--download-dir', '--download-directory', - dest='download_dir', - metavar='dir', - default=None, - help=("Download packages into instead of installing them, " - "regardless of what's already installed."), - ) - - cmd_opts.add_option(cmdoptions.src()) - - cmd_opts.add_option( - '-U', '--upgrade', - dest='upgrade', - action='store_true', - help='Upgrade all specified packages to the newest available ' - 'version. The handling of dependencies depends on the ' - 'upgrade-strategy used.' - ) - - cmd_opts.add_option( - '--upgrade-strategy', - dest='upgrade_strategy', - default='eager', - choices=['only-if-needed', 'eager'], - help='Determines how dependency upgrading should be handled. ' - '"eager" - dependencies are upgraded regardless of ' - 'whether the currently installed version satisfies the ' - 'requirements of the upgraded package(s). ' - '"only-if-needed" - are upgraded only when they do not ' - 'satisfy the requirements of the upgraded package(s).' - ) - - cmd_opts.add_option( - '--force-reinstall', - dest='force_reinstall', - action='store_true', - help='When upgrading, reinstall all packages even if they are ' - 'already up-to-date.') - - cmd_opts.add_option( - '-I', '--ignore-installed', - dest='ignore_installed', - action='store_true', - default=default_user, - help='Ignore the installed packages (reinstalling instead).') - - cmd_opts.add_option(cmdoptions.ignore_requires_python()) - cmd_opts.add_option(cmdoptions.no_deps()) - - cmd_opts.add_option(cmdoptions.install_options()) - cmd_opts.add_option(cmdoptions.global_options()) - - cmd_opts.add_option( - '--user', - dest='use_user_site', - action='store_true', - default=default_user, - help="Install to the Python user install directory for your " - "platform. Typically ~/.local/, or %APPDATA%\Python on " - "Windows. 
(See the Python documentation for site.USER_BASE " - "for full details.) On Debian systems, this is the " - "default when running outside of a virtual environment " - "and not as root.") - - cmd_opts.add_option( - '--system', - dest='use_user_site', - action='store_false', - help="Install using the system scheme (overrides --user on " - "Debian systems)") - - cmd_opts.add_option( - '--egg', - dest='as_egg', - action='store_true', - help="Install packages as eggs, not 'flat', like pip normally " - "does. This option is not about installing *from* eggs. " - "(WARNING: Because this option overrides pip's normal install" - " logic, requirements files may not behave as expected.)") - - cmd_opts.add_option( - '--root', - dest='root_path', - metavar='dir', - default=None, - help="Install everything relative to this alternate root " - "directory.") - - cmd_opts.add_option( - '--prefix', - dest='prefix_path', - metavar='dir', - default=None, - help="Installation prefix where lib, bin and other top-level " - "folders are placed") - - cmd_opts.add_option( - "--compile", - action="store_true", - dest="compile", - default=True, - help="Compile py files to pyc", - ) - - cmd_opts.add_option( - "--no-compile", - action="store_false", - dest="compile", - help="Do not compile py files to pyc", - ) - - cmd_opts.add_option(cmdoptions.use_wheel()) - cmd_opts.add_option(cmdoptions.no_use_wheel()) - cmd_opts.add_option(cmdoptions.no_binary()) - cmd_opts.add_option(cmdoptions.only_binary()) - cmd_opts.add_option(cmdoptions.pre()) - cmd_opts.add_option(cmdoptions.no_clean()) - cmd_opts.add_option(cmdoptions.require_hashes()) - - index_opts = cmdoptions.make_option_group( - cmdoptions.index_group, - self.parser, - ) - - self.parser.insert_option_group(0, index_opts) - self.parser.insert_option_group(0, cmd_opts) - - def run(self, options, args): - cmdoptions.resolve_wheel_no_use_binary(options) - cmdoptions.check_install_build_global(options) - - if options.as_egg: - warnings.warn( - "--egg 
has been deprecated and will be removed in the future. " - "This flag is mutually exclusive with large parts of pip, and " - "actually using it invalidates pip's ability to manage the " - "installation process.", - RemovedInPip10Warning, - ) - - if options.allow_external: - warnings.warn( - "--allow-external has been deprecated and will be removed in " - "the future. Due to changes in the repository protocol, it no " - "longer has any effect.", - RemovedInPip10Warning, - ) - - if options.allow_all_external: - warnings.warn( - "--allow-all-external has been deprecated and will be removed " - "in the future. Due to changes in the repository protocol, it " - "no longer has any effect.", - RemovedInPip10Warning, - ) - - if options.allow_unverified: - warnings.warn( - "--allow-unverified has been deprecated and will be removed " - "in the future. Due to changes in the repository protocol, it " - "no longer has any effect.", - RemovedInPip10Warning, - ) - - if options.download_dir: - warnings.warn( - "pip install --download has been deprecated and will be " - "removed in the future. Pip now has a download command that " - "should be used instead.", - RemovedInPip10Warning, - ) - options.ignore_installed = True - - if options.build_dir: - options.build_dir = os.path.abspath(options.build_dir) - - options.src_dir = os.path.abspath(options.src_dir) - install_options = options.install_options or [] - if options.use_user_site: - if options.prefix_path: - raise CommandError( - "Can not combine '--user' and '--prefix' as they imply " - "different installation locations" - ) - if virtualenv_no_global(): - raise InstallationError( - "Can not perform a '--user' install. User site-packages " - "are not visible in this virtualenv." 
- ) - install_options.append('--user') - install_options.append('--prefix=') - - temp_target_dir = None - if options.target_dir: - options.ignore_installed = True - temp_target_dir = tempfile.mkdtemp() - options.target_dir = os.path.abspath(options.target_dir) - if (os.path.exists(options.target_dir) and not - os.path.isdir(options.target_dir)): - raise CommandError( - "Target path exists but is not a directory, will not " - "continue." - ) - install_options.append('--home=' + temp_target_dir) - - global_options = options.global_options or [] - - with self._build_session(options) as session: - - finder = self._build_package_finder(options, session) - build_delete = (not (options.no_clean or options.build_dir)) - wheel_cache = WheelCache(options.cache_dir, options.format_control) - if options.cache_dir and not check_path_owner(options.cache_dir): - logger.warning( - "The directory '%s' or its parent directory is not owned " - "by the current user and caching wheels has been " - "disabled. check the permissions and owner of that " - "directory. 
If executing pip with sudo, you may want " - "sudo's -H flag.", - options.cache_dir, - ) - options.cache_dir = None - - with BuildDirectory(options.build_dir, - delete=build_delete) as build_dir: - requirement_set = RequirementSet( - build_dir=build_dir, - src_dir=options.src_dir, - download_dir=options.download_dir, - upgrade=options.upgrade, - upgrade_strategy=options.upgrade_strategy, - as_egg=options.as_egg, - ignore_installed=options.ignore_installed, - ignore_dependencies=options.ignore_dependencies, - ignore_requires_python=options.ignore_requires_python, - force_reinstall=options.force_reinstall, - use_user_site=options.use_user_site, - target_dir=temp_target_dir, - session=session, - pycompile=options.compile, - isolated=options.isolated_mode, - wheel_cache=wheel_cache, - require_hashes=options.require_hashes, - ) - - self.populate_requirement_set( - requirement_set, args, options, finder, session, self.name, - wheel_cache - ) - - if not requirement_set.has_requirements: - return - - try: - if (options.download_dir or not wheel or not - options.cache_dir): - # on -d don't do complex things like building - # wheels, and don't try to build wheels when wheel is - # not installed. - requirement_set.prepare_files(finder) - else: - # build wheels before install. - wb = WheelBuilder( - requirement_set, - finder, - build_options=[], - global_options=[], - ) - # Ignore the result: a failed wheel will be - # installed from the sdist/vcs whatever. 
- wb.build(autobuilding=True) - - if not options.download_dir: - requirement_set.install( - install_options, - global_options, - root=options.root_path, - prefix=options.prefix_path, - ) - - possible_lib_locations = get_lib_location_guesses( - user=options.use_user_site, - home=temp_target_dir, - root=options.root_path, - prefix=options.prefix_path, - isolated=options.isolated_mode, - ) - reqs = sorted( - requirement_set.successfully_installed, - key=operator.attrgetter('name')) - items = [] - for req in reqs: - item = req.name - try: - installed_version = get_installed_version( - req.name, possible_lib_locations - ) - if installed_version: - item += '-' + installed_version - except Exception: - pass - items.append(item) - installed = ' '.join(items) - if installed: - logger.info('Successfully installed %s', installed) - else: - downloaded = ' '.join([ - req.name - for req in requirement_set.successfully_downloaded - ]) - if downloaded: - logger.info( - 'Successfully downloaded %s', downloaded - ) - except PreviousBuildDirError: - options.no_clean = True - raise - finally: - # Clean up - if not options.no_clean: - requirement_set.cleanup_files() - - if options.target_dir: - ensure_dir(options.target_dir) - - # Checking both purelib and platlib directories for installed - # packages to be moved to target directory - lib_dir_list = [] - - purelib_dir = distutils_scheme('', home=temp_target_dir)['purelib'] - platlib_dir = distutils_scheme('', home=temp_target_dir)['platlib'] - - if os.path.exists(purelib_dir): - lib_dir_list.append(purelib_dir) - if os.path.exists(platlib_dir) and platlib_dir != purelib_dir: - lib_dir_list.append(platlib_dir) - - for lib_dir in lib_dir_list: - for item in os.listdir(lib_dir): - target_item_dir = os.path.join(options.target_dir, item) - if os.path.exists(target_item_dir): - if not options.upgrade: - logger.warning( - 'Target directory %s already exists. 
Specify ' - '--upgrade to force replacement.', - target_item_dir - ) - continue - if os.path.islink(target_item_dir): - logger.warning( - 'Target directory %s already exists and is ' - 'a link. Pip will not automatically replace ' - 'links, please remove if replacement is ' - 'desired.', - target_item_dir - ) - continue - if os.path.isdir(target_item_dir): - shutil.rmtree(target_item_dir) - else: - os.remove(target_item_dir) - - shutil.move( - os.path.join(lib_dir, item), - target_item_dir - ) - shutil.rmtree(temp_target_dir) - return requirement_set - - -def get_lib_location_guesses(*args, **kwargs): - scheme = distutils_scheme('', *args, **kwargs) - return [scheme['purelib'], scheme['platlib']] diff --git a/lib/python3.4/site-packages/pip/compat/__init__.py b/lib/python3.4/site-packages/pip/compat/__init__.py deleted file mode 100644 index 099672c..0000000 --- a/lib/python3.4/site-packages/pip/compat/__init__.py +++ /dev/null @@ -1,164 +0,0 @@ -"""Stuff that differs in different Python versions and platform -distributions.""" -from __future__ import absolute_import, division - -import os -import sys - -from pip._vendor.six import text_type - -try: - from logging.config import dictConfig as logging_dictConfig -except ImportError: - from pip.compat.dictconfig import dictConfig as logging_dictConfig - -try: - from collections import OrderedDict -except ImportError: - from pip._vendor.ordereddict import OrderedDict - -try: - import ipaddress -except ImportError: - try: - from pip._vendor import ipaddress - except ImportError: - import ipaddr as ipaddress - ipaddress.ip_address = ipaddress.IPAddress - ipaddress.ip_network = ipaddress.IPNetwork - - -try: - import sysconfig - - def get_stdlib(): - paths = [ - sysconfig.get_path("stdlib"), - sysconfig.get_path("platstdlib"), - ] - return set(filter(bool, paths)) -except ImportError: - from distutils import sysconfig - - def get_stdlib(): - paths = [ - sysconfig.get_python_lib(standard_lib=True), - 
sysconfig.get_python_lib(standard_lib=True, plat_specific=True), - ] - return set(filter(bool, paths)) - - -__all__ = [ - "logging_dictConfig", "ipaddress", "uses_pycache", "console_to_str", - "native_str", "get_path_uid", "stdlib_pkgs", "WINDOWS", "samefile", - "OrderedDict", -] - - -if sys.version_info >= (3, 4): - uses_pycache = True - from importlib.util import cache_from_source -else: - import imp - uses_pycache = hasattr(imp, 'cache_from_source') - if uses_pycache: - cache_from_source = imp.cache_from_source - else: - cache_from_source = None - - -if sys.version_info >= (3,): - def console_to_str(s): - try: - return s.decode(sys.__stdout__.encoding) - except UnicodeDecodeError: - return s.decode('utf_8') - - def native_str(s, replace=False): - if isinstance(s, bytes): - return s.decode('utf-8', 'replace' if replace else 'strict') - return s - -else: - def console_to_str(s): - return s - - def native_str(s, replace=False): - # Replace is ignored -- unicode to UTF-8 can't fail - if isinstance(s, text_type): - return s.encode('utf-8') - return s - - -def total_seconds(td): - if hasattr(td, "total_seconds"): - return td.total_seconds() - else: - val = td.microseconds + (td.seconds + td.days * 24 * 3600) * 10 ** 6 - return val / 10 ** 6 - - -def get_path_uid(path): - """ - Return path's uid. - - Does not follow symlinks: - https://github.com/pypa/pip/pull/935#discussion_r5307003 - - Placed this function in compat due to differences on AIX and - Jython, that should eventually go away. - - :raises OSError: When path is a symlink or can't be read. 
- """ - if hasattr(os, 'O_NOFOLLOW'): - fd = os.open(path, os.O_RDONLY | os.O_NOFOLLOW) - file_uid = os.fstat(fd).st_uid - os.close(fd) - else: # AIX and Jython - # WARNING: time of check vulnerability, but best we can do w/o NOFOLLOW - if not os.path.islink(path): - # older versions of Jython don't have `os.fstat` - file_uid = os.stat(path).st_uid - else: - # raise OSError for parity with os.O_NOFOLLOW above - raise OSError( - "%s is a symlink; Will not return uid for symlinks" % path - ) - return file_uid - - -def expanduser(path): - """ - Expand ~ and ~user constructions. - - Includes a workaround for http://bugs.python.org/issue14768 - """ - expanded = os.path.expanduser(path) - if path.startswith('~/') and expanded.startswith('//'): - expanded = expanded[1:] - return expanded - - -# packages in the stdlib that may have installation metadata, but should not be -# considered 'installed'. this theoretically could be determined based on -# dist.location (py27:`sysconfig.get_paths()['stdlib']`, -# py26:sysconfig.get_config_vars('LIBDEST')), but fear platform variation may -# make this ineffective, so hard-coding -stdlib_pkgs = ('python', 'wsgiref') -if sys.version_info >= (2, 7): - stdlib_pkgs += ('argparse',) - - -# windows detection, covers cpython and ironpython -WINDOWS = (sys.platform.startswith("win") or - (sys.platform == 'cli' and os.name == 'nt')) - - -def samefile(file1, file2): - """Provide an alternative for os.path.samefile on Windows/Python2""" - if hasattr(os.path, 'samefile'): - return os.path.samefile(file1, file2) - else: - path1 = os.path.normcase(os.path.abspath(file1)) - path2 = os.path.normcase(os.path.abspath(file2)) - return path1 == path2 diff --git a/lib/python3.4/site-packages/pip/compat/dictconfig.py b/lib/python3.4/site-packages/pip/compat/dictconfig.py deleted file mode 100644 index ec684aa..0000000 --- a/lib/python3.4/site-packages/pip/compat/dictconfig.py +++ /dev/null @@ -1,565 +0,0 @@ -# This is a copy of the Python 
logging.config.dictconfig module, -# reproduced with permission. It is provided here for backwards -# compatibility for Python versions prior to 2.7. -# -# Copyright 2009-2010 by Vinay Sajip. All Rights Reserved. -# -# Permission to use, copy, modify, and distribute this software and its -# documentation for any purpose and without fee is hereby granted, -# provided that the above copyright notice appear in all copies and that -# both that copyright notice and this permission notice appear in -# supporting documentation, and that the name of Vinay Sajip -# not be used in advertising or publicity pertaining to distribution -# of the software without specific, written prior permission. -# VINAY SAJIP DISCLAIMS ALL WARRANTIES WITH REGARD TO THIS SOFTWARE, INCLUDING -# ALL IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS. IN NO EVENT SHALL -# VINAY SAJIP BE LIABLE FOR ANY SPECIAL, INDIRECT OR CONSEQUENTIAL DAMAGES OR -# ANY DAMAGES WHATSOEVER RESULTING FROM LOSS OF USE, DATA OR PROFITS, WHETHER -# IN AN ACTION OF CONTRACT, NEGLIGENCE OR OTHER TORTIOUS ACTION, ARISING OUT -# OF OR IN CONNECTION WITH THE USE OR PERFORMANCE OF THIS SOFTWARE. 
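The module below backports `logging.config.dictConfig` for pre-2.7 Pythons. For context, a minimal configuration dictionary in the version-1 schema it consumes looks like the following; the `brief`/`console`/`myapp` names are illustrative, and this sketch uses the stdlib `dictConfig` rather than the backport itself.

```python
import logging
from logging.config import dictConfig  # stdlib version of the backport below

LOGGING = {
    'version': 1,  # required by the dictConfig schema
    'disable_existing_loggers': False,
    'formatters': {
        'brief': {'format': '%(levelname)s:%(name)s:%(message)s'},
    },
    'handlers': {
        'console': {
            'class': 'logging.StreamHandler',  # resolved via resolve()
            'formatter': 'brief',
            'level': 'DEBUG',
        },
    },
    'loggers': {
        'myapp': {'handlers': ['console'], 'level': 'INFO'},
    },
}

dictConfig(LOGGING)
log = logging.getLogger('myapp')
```

Formatters are configured first, then filters, then handlers, then loggers, matching the ordering in `DictConfigurator.configure` below.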
-from __future__ import absolute_import - -import logging.handlers -import re -import sys -import types - -from pip._vendor import six - -# flake8: noqa - -IDENTIFIER = re.compile('^[a-z_][a-z0-9_]*$', re.I) - - -def valid_ident(s): - m = IDENTIFIER.match(s) - if not m: - raise ValueError('Not a valid Python identifier: %r' % s) - return True - -# -# This function is defined in logging only in recent versions of Python -# -try: - from logging import _checkLevel -except ImportError: - def _checkLevel(level): - if isinstance(level, int): - rv = level - elif str(level) == level: - if level not in logging._levelNames: - raise ValueError('Unknown level: %r' % level) - rv = logging._levelNames[level] - else: - raise TypeError('Level not an integer or a ' - 'valid string: %r' % level) - return rv - -# The ConvertingXXX classes are wrappers around standard Python containers, -# and they serve to convert any suitable values in the container. The -# conversion converts base dicts, lists and tuples to their wrapped -# equivalents, whereas strings which match a conversion format are converted -# appropriately. -# -# Each wrapper should have a configurator attribute holding the actual -# configurator to use for conversion. 
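The converting-wrapper idea described in the comment above — convert values lazily on access and cache the converted result back into the container — can be illustrated with a stripped-down analogue. This `UpperDict` is hypothetical and much simpler than the `ConvertingDict`/`ConvertingList`/`ConvertingTuple` classes that follow.

```python
class UpperDict(dict):
    """Toy analogue of ConvertingDict: convert values on access and
    save the converted result for next time."""

    def __getitem__(self, key):
        value = dict.__getitem__(self, key)
        if isinstance(value, str) and not value.isupper():
            value = value.upper()
            self[key] = value  # cache the conversion, like ConvertingDict
        return value


d = UpperDict(name='vinay')
first = d['name']
```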
- - -class ConvertingDict(dict): - """A converting dictionary wrapper.""" - - def __getitem__(self, key): - value = dict.__getitem__(self, key) - result = self.configurator.convert(value) - # If the converted value is different, save for next time - if value is not result: - self[key] = result - if type(result) in (ConvertingDict, ConvertingList, - ConvertingTuple): - result.parent = self - result.key = key - return result - - def get(self, key, default=None): - value = dict.get(self, key, default) - result = self.configurator.convert(value) - # If the converted value is different, save for next time - if value is not result: - self[key] = result - if type(result) in (ConvertingDict, ConvertingList, - ConvertingTuple): - result.parent = self - result.key = key - return result - - def pop(self, key, default=None): - value = dict.pop(self, key, default) - result = self.configurator.convert(value) - if value is not result: - if type(result) in (ConvertingDict, ConvertingList, - ConvertingTuple): - result.parent = self - result.key = key - return result - - -class ConvertingList(list): - """A converting list wrapper.""" - def __getitem__(self, key): - value = list.__getitem__(self, key) - result = self.configurator.convert(value) - # If the converted value is different, save for next time - if value is not result: - self[key] = result - if type(result) in (ConvertingDict, ConvertingList, - ConvertingTuple): - result.parent = self - result.key = key - return result - - def pop(self, idx=-1): - value = list.pop(self, idx) - result = self.configurator.convert(value) - if value is not result: - if type(result) in (ConvertingDict, ConvertingList, - ConvertingTuple): - result.parent = self - return result - - -class ConvertingTuple(tuple): - """A converting tuple wrapper.""" - def __getitem__(self, key): - value = tuple.__getitem__(self, key) - result = self.configurator.convert(value) - if value is not result: - if type(result) in (ConvertingDict, ConvertingList, - 
ConvertingTuple): - result.parent = self - result.key = key - return result - - -class BaseConfigurator(object): - """ - The configurator base class which defines some useful defaults. - """ - - CONVERT_PATTERN = re.compile(r'^(?P[a-z]+)://(?P.*)$') - - WORD_PATTERN = re.compile(r'^\s*(\w+)\s*') - DOT_PATTERN = re.compile(r'^\.\s*(\w+)\s*') - INDEX_PATTERN = re.compile(r'^\[\s*(\w+)\s*\]\s*') - DIGIT_PATTERN = re.compile(r'^\d+$') - - value_converters = { - 'ext' : 'ext_convert', - 'cfg' : 'cfg_convert', - } - - # We might want to use a different one, e.g. importlib - importer = __import__ - - def __init__(self, config): - self.config = ConvertingDict(config) - self.config.configurator = self - - def resolve(self, s): - """ - Resolve strings to objects using standard import and attribute - syntax. - """ - name = s.split('.') - used = name.pop(0) - try: - found = self.importer(used) - for frag in name: - used += '.' + frag - try: - found = getattr(found, frag) - except AttributeError: - self.importer(used) - found = getattr(found, frag) - return found - except ImportError: - e, tb = sys.exc_info()[1:] - v = ValueError('Cannot resolve %r: %s' % (s, e)) - v.__cause__, v.__traceback__ = e, tb - raise v - - def ext_convert(self, value): - """Default converter for the ext:// protocol.""" - return self.resolve(value) - - def cfg_convert(self, value): - """Default converter for the cfg:// protocol.""" - rest = value - m = self.WORD_PATTERN.match(rest) - if m is None: - raise ValueError("Unable to convert %r" % value) - else: - rest = rest[m.end():] - d = self.config[m.groups()[0]] - # print d, rest - while rest: - m = self.DOT_PATTERN.match(rest) - if m: - d = d[m.groups()[0]] - else: - m = self.INDEX_PATTERN.match(rest) - if m: - idx = m.groups()[0] - if not self.DIGIT_PATTERN.match(idx): - d = d[idx] - else: - try: - n = int(idx) # try as number first (most likely) - d = d[n] - except TypeError: - d = d[idx] - if m: - rest = rest[m.end():] - else: - raise 
ValueError('Unable to convert ' - '%r at %r' % (value, rest)) - # rest should be empty - return d - - def convert(self, value): - """ - Convert values to an appropriate type. dicts, lists and tuples are - replaced by their converting alternatives. Strings are checked to - see if they have a conversion format and are converted if they do. - """ - if not isinstance(value, ConvertingDict) and isinstance(value, dict): - value = ConvertingDict(value) - value.configurator = self - elif not isinstance(value, ConvertingList) and isinstance(value, list): - value = ConvertingList(value) - value.configurator = self - elif not isinstance(value, ConvertingTuple) and\ - isinstance(value, tuple): - value = ConvertingTuple(value) - value.configurator = self - elif isinstance(value, six.string_types): # str for py3k - m = self.CONVERT_PATTERN.match(value) - if m: - d = m.groupdict() - prefix = d['prefix'] - converter = self.value_converters.get(prefix, None) - if converter: - suffix = d['suffix'] - converter = getattr(self, converter) - value = converter(suffix) - return value - - def configure_custom(self, config): - """Configure an object with a user-supplied factory.""" - c = config.pop('()') - if not hasattr(c, '__call__') and hasattr(types, 'ClassType') and type(c) != types.ClassType: - c = self.resolve(c) - props = config.pop('.', None) - # Check for valid identifiers - kwargs = dict((k, config[k]) for k in config if valid_ident(k)) - result = c(**kwargs) - if props: - for name, value in props.items(): - setattr(result, name, value) - return result - - def as_tuple(self, value): - """Utility function which converts lists to tuples.""" - if isinstance(value, list): - value = tuple(value) - return value - - -class DictConfigurator(BaseConfigurator): - """ - Configure logging using a dictionary-like object to describe the - configuration. 
- """ - - def configure(self): - """Do the configuration.""" - - config = self.config - if 'version' not in config: - raise ValueError("dictionary doesn't specify a version") - if config['version'] != 1: - raise ValueError("Unsupported version: %s" % config['version']) - incremental = config.pop('incremental', False) - EMPTY_DICT = {} - logging._acquireLock() - try: - if incremental: - handlers = config.get('handlers', EMPTY_DICT) - # incremental handler config only if handler name - # ties in to logging._handlers (Python 2.7) - if sys.version_info[:2] == (2, 7): - for name in handlers: - if name not in logging._handlers: - raise ValueError('No handler found with ' - 'name %r' % name) - else: - try: - handler = logging._handlers[name] - handler_config = handlers[name] - level = handler_config.get('level', None) - if level: - handler.setLevel(_checkLevel(level)) - except StandardError as e: - raise ValueError('Unable to configure handler ' - '%r: %s' % (name, e)) - loggers = config.get('loggers', EMPTY_DICT) - for name in loggers: - try: - self.configure_logger(name, loggers[name], True) - except StandardError as e: - raise ValueError('Unable to configure logger ' - '%r: %s' % (name, e)) - root = config.get('root', None) - if root: - try: - self.configure_root(root, True) - except StandardError as e: - raise ValueError('Unable to configure root ' - 'logger: %s' % e) - else: - disable_existing = config.pop('disable_existing_loggers', True) - - logging._handlers.clear() - del logging._handlerList[:] - - # Do formatters first - they don't refer to anything else - formatters = config.get('formatters', EMPTY_DICT) - for name in formatters: - try: - formatters[name] = self.configure_formatter( - formatters[name]) - except StandardError as e: - raise ValueError('Unable to configure ' - 'formatter %r: %s' % (name, e)) - # Next, do filters - they don't refer to anything else, either - filters = config.get('filters', EMPTY_DICT) - for name in filters: - try: - filters[name] = 
self.configure_filter(filters[name]) - except StandardError as e: - raise ValueError('Unable to configure ' - 'filter %r: %s' % (name, e)) - - # Next, do handlers - they refer to formatters and filters - # As handlers can refer to other handlers, sort the keys - # to allow a deterministic order of configuration - handlers = config.get('handlers', EMPTY_DICT) - for name in sorted(handlers): - try: - handler = self.configure_handler(handlers[name]) - handler.name = name - handlers[name] = handler - except StandardError as e: - raise ValueError('Unable to configure handler ' - '%r: %s' % (name, e)) - # Next, do loggers - they refer to handlers and filters - - # we don't want to lose the existing loggers, - # since other threads may have pointers to them. - # existing is set to contain all existing loggers, - # and as we go through the new configuration we - # remove any which are configured. At the end, - # what's left in existing is the set of loggers - # which were in the previous configuration but - # which are not in the new configuration. - root = logging.root - existing = list(root.manager.loggerDict) - # The list needs to be sorted so that we can - # avoid disabling child loggers of explicitly - # named loggers. With a sorted list it is easier - # to find the child loggers. - existing.sort() - # We'll keep the list of existing loggers - # which are children of named loggers here... - child_loggers = [] - # now set up the new ones... - loggers = config.get('loggers', EMPTY_DICT) - for name in loggers: - if name in existing: - i = existing.index(name) - prefixed = name + "." 
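The sorted-prefix scan performed here — walking the sorted `existing` list to collect child loggers of an explicitly configured logger — can be shown in isolation. The `find_children` helper is an invented name for this sketch.

```python
def find_children(existing, name):
    """Collect entries in `existing` that are dotted children of `name`,
    using the same sorted-prefix scan as the configurator above."""
    existing = sorted(existing)
    prefixed = name + "."
    pflen = len(prefixed)
    i = existing.index(name) + 1  # look at the entry after `name`
    children = []
    # children of `name` sort immediately after it
    while i < len(existing) and existing[i][:pflen] == prefixed:
        children.append(existing[i])
        i += 1
    return children


kids = find_children(['a', 'a.b', 'a.b.c', 'ab', 'z'], 'a')
```

Sorting first is what makes the scan terminate early: `'ab'` sorts after `'a.b.c'` but fails the `'a.'` prefix test, ending the loop.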
- pflen = len(prefixed) - num_existing = len(existing) - i = i + 1 # look at the entry after name - while (i < num_existing) and\ - (existing[i][:pflen] == prefixed): - child_loggers.append(existing[i]) - i = i + 1 - existing.remove(name) - try: - self.configure_logger(name, loggers[name]) - except StandardError as e: - raise ValueError('Unable to configure logger ' - '%r: %s' % (name, e)) - - # Disable any old loggers. There's no point deleting - # them as other threads may continue to hold references - # and by disabling them, you stop them doing any logging. - # However, don't disable children of named loggers, as that's - # probably not what was intended by the user. - for log in existing: - logger = root.manager.loggerDict[log] - if log in child_loggers: - logger.level = logging.NOTSET - logger.handlers = [] - logger.propagate = True - elif disable_existing: - logger.disabled = True - - # And finally, do the root logger - root = config.get('root', None) - if root: - try: - self.configure_root(root) - except StandardError as e: - raise ValueError('Unable to configure root ' - 'logger: %s' % e) - finally: - logging._releaseLock() - - def configure_formatter(self, config): - """Configure a formatter from a dictionary.""" - if '()' in config: - factory = config['()'] # for use in exception handler - try: - result = self.configure_custom(config) - except TypeError as te: - if "'format'" not in str(te): - raise - # Name of parameter changed from fmt to format. - # Retry with old name. - # This is so that code can be used with older Python versions - #(e.g. 
by Django) - config['fmt'] = config.pop('format') - config['()'] = factory - result = self.configure_custom(config) - else: - fmt = config.get('format', None) - dfmt = config.get('datefmt', None) - result = logging.Formatter(fmt, dfmt) - return result - - def configure_filter(self, config): - """Configure a filter from a dictionary.""" - if '()' in config: - result = self.configure_custom(config) - else: - name = config.get('name', '') - result = logging.Filter(name) - return result - - def add_filters(self, filterer, filters): - """Add filters to a filterer from a list of names.""" - for f in filters: - try: - filterer.addFilter(self.config['filters'][f]) - except StandardError as e: - raise ValueError('Unable to add filter %r: %s' % (f, e)) - - def configure_handler(self, config): - """Configure a handler from a dictionary.""" - formatter = config.pop('formatter', None) - if formatter: - try: - formatter = self.config['formatters'][formatter] - except StandardError as e: - raise ValueError('Unable to set formatter ' - '%r: %s' % (formatter, e)) - level = config.pop('level', None) - filters = config.pop('filters', None) - if '()' in config: - c = config.pop('()') - if not hasattr(c, '__call__') and hasattr(types, 'ClassType') and type(c) != types.ClassType: - c = self.resolve(c) - factory = c - else: - klass = self.resolve(config.pop('class')) - # Special case for handler which refers to another handler - if issubclass(klass, logging.handlers.MemoryHandler) and\ - 'target' in config: - try: - config['target'] = self.config['handlers'][config['target']] - except StandardError as e: - raise ValueError('Unable to set target handler ' - '%r: %s' % (config['target'], e)) - elif issubclass(klass, logging.handlers.SMTPHandler) and\ - 'mailhost' in config: - config['mailhost'] = self.as_tuple(config['mailhost']) - elif issubclass(klass, logging.handlers.SysLogHandler) and\ - 'address' in config: - config['address'] = self.as_tuple(config['address']) - factory = klass - 
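The kwargs construction that follows passes the remaining config keys through `valid_ident` before calling the handler factory. A standalone sketch of that filtering is below; note it is a tolerant variant that silently drops invalid keys, whereas the `valid_ident` defined above raises `ValueError` instead.

```python
import re

# same pattern as the module-level IDENTIFIER regex above
IDENT = re.compile('^[a-z_][a-z0-9_]*$', re.I)


def filter_kwargs(config):
    """Keep only keys that are valid Python identifiers, mirroring the
    dict comprehension used when calling the handler factory."""
    return dict((k, config[k]) for k in config if IDENT.match(k))


kwargs = filter_kwargs({'stream': 'sys.stderr', '2bad': 1, 'level': 'INFO'})
```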
kwargs = dict((k, config[k]) for k in config if valid_ident(k)) - try: - result = factory(**kwargs) - except TypeError as te: - if "'stream'" not in str(te): - raise - # The argument name changed from strm to stream - # Retry with old name. - # This is so that code can be used with older Python versions - #(e.g. by Django) - kwargs['strm'] = kwargs.pop('stream') - result = factory(**kwargs) - if formatter: - result.setFormatter(formatter) - if level is not None: - result.setLevel(_checkLevel(level)) - if filters: - self.add_filters(result, filters) - return result - - def add_handlers(self, logger, handlers): - """Add handlers to a logger from a list of names.""" - for h in handlers: - try: - logger.addHandler(self.config['handlers'][h]) - except StandardError as e: - raise ValueError('Unable to add handler %r: %s' % (h, e)) - - def common_logger_config(self, logger, config, incremental=False): - """ - Perform configuration which is common to root and non-root loggers. - """ - level = config.get('level', None) - if level is not None: - logger.setLevel(_checkLevel(level)) - if not incremental: - # Remove any existing handlers - for h in logger.handlers[:]: - logger.removeHandler(h) - handlers = config.get('handlers', None) - if handlers: - self.add_handlers(logger, handlers) - filters = config.get('filters', None) - if filters: - self.add_filters(logger, filters) - - def configure_logger(self, name, config, incremental=False): - """Configure a non-root logger from a dictionary.""" - logger = logging.getLogger(name) - self.common_logger_config(logger, config, incremental) - propagate = config.get('propagate', None) - if propagate is not None: - logger.propagate = propagate - - def configure_root(self, config, incremental=False): - """Configure a root logger from a dictionary.""" - root = logging.getLogger() - self.common_logger_config(root, config, incremental) - -dictConfigClass = DictConfigurator - - -def dictConfig(config): - """Configure logging using a 
dictionary.""" - dictConfigClass(config).configure() diff --git a/lib/python3.4/site-packages/pip/models/__init__.py b/lib/python3.4/site-packages/pip/models/__init__.py deleted file mode 100644 index 1d727d7..0000000 --- a/lib/python3.4/site-packages/pip/models/__init__.py +++ /dev/null @@ -1,4 +0,0 @@ -from pip.models.index import Index, PyPI - - -__all__ = ["Index", "PyPI"] diff --git a/lib/python3.4/site-packages/pip/models/index.py b/lib/python3.4/site-packages/pip/models/index.py deleted file mode 100644 index be99119..0000000 --- a/lib/python3.4/site-packages/pip/models/index.py +++ /dev/null @@ -1,16 +0,0 @@ -from pip._vendor.six.moves.urllib import parse as urllib_parse - - -class Index(object): - def __init__(self, url): - self.url = url - self.netloc = urllib_parse.urlsplit(url).netloc - self.simple_url = self.url_to_path('simple') - self.pypi_url = self.url_to_path('pypi') - self.pip_json_url = self.url_to_path('pypi/pip/json') - - def url_to_path(self, path): - return urllib_parse.urljoin(self.url, path) - - -PyPI = Index('https://pypi.python.org/') diff --git a/lib/python3.4/site-packages/pip/operations/check.py b/lib/python3.4/site-packages/pip/operations/check.py deleted file mode 100644 index 2cf67aa..0000000 --- a/lib/python3.4/site-packages/pip/operations/check.py +++ /dev/null @@ -1,49 +0,0 @@ - - -def check_requirements(installed_dists): - missing_reqs_dict = {} - incompatible_reqs_dict = {} - - for dist in installed_dists: - key = '%s==%s' % (dist.project_name, dist.version) - - missing_reqs = list(get_missing_reqs(dist, installed_dists)) - if missing_reqs: - missing_reqs_dict[key] = missing_reqs - - incompatible_reqs = list(get_incompatible_reqs( - dist, installed_dists)) - if incompatible_reqs: - incompatible_reqs_dict[key] = incompatible_reqs - - return (missing_reqs_dict, incompatible_reqs_dict) - - -def get_missing_reqs(dist, installed_dists): - """Return all of the requirements of `dist` that aren't present in - `installed_dists`. 
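The case-insensitive membership test that `get_missing_reqs` performs can be sketched standalone. This toy version works on plain name strings rather than `pkg_resources` requirement and distribution objects; `missing_reqs` is an invented name.

```python
def missing_reqs(required_names, installed_names):
    """Toy version of the check above: required names with no matching
    installed distribution, compared case-insensitively."""
    installed = set(name.lower() for name in installed_names)
    return [req for req in required_names if req.lower() not in installed]


missing = missing_reqs(['Flask', 'itsdangerous'], ['flask', 'Werkzeug'])
```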
- - """ - installed_names = set(d.project_name.lower() for d in installed_dists) - missing_requirements = set() - - for requirement in dist.requires(): - if requirement.project_name.lower() not in installed_names: - missing_requirements.add(requirement) - yield requirement - - -def get_incompatible_reqs(dist, installed_dists): - """Return all of the requirements of `dist` that are present in - `installed_dists`, but have incompatible versions. - - """ - installed_dists_by_name = {} - for installed_dist in installed_dists: - installed_dists_by_name[installed_dist.project_name] = installed_dist - - for requirement in dist.requires(): - present_dist = installed_dists_by_name.get(requirement.project_name) - - if present_dist and present_dist not in requirement: - yield (requirement, present_dist) diff --git a/lib/python3.4/site-packages/pip/operations/freeze.py b/lib/python3.4/site-packages/pip/operations/freeze.py deleted file mode 100644 index 920c2c1..0000000 --- a/lib/python3.4/site-packages/pip/operations/freeze.py +++ /dev/null @@ -1,132 +0,0 @@ -from __future__ import absolute_import - -import logging -import re - -import pip -from pip.req import InstallRequirement -from pip.req.req_file import COMMENT_RE -from pip.utils import get_installed_distributions -from pip._vendor import pkg_resources -from pip._vendor.packaging.utils import canonicalize_name -from pip._vendor.pkg_resources import RequirementParseError - - -logger = logging.getLogger(__name__) - - -def freeze( - requirement=None, - find_links=None, local_only=None, user_only=None, skip_regex=None, - default_vcs=None, - isolated=False, - wheel_cache=None, - skip=()): - find_links = find_links or [] - skip_match = None - - if skip_regex: - skip_match = re.compile(skip_regex).search - - dependency_links = [] - - for dist in pkg_resources.working_set: - if dist.has_metadata('dependency_links.txt'): - dependency_links.extend( - dist.get_metadata_lines('dependency_links.txt') - ) - for link in find_links: - 
if '#egg=' in link: - dependency_links.append(link) - for link in find_links: - yield '-f %s' % link - installations = {} - for dist in get_installed_distributions(local_only=local_only, - skip=(), - user_only=user_only): - try: - req = pip.FrozenRequirement.from_dist( - dist, - dependency_links - ) - except RequirementParseError: - logger.warning( - "Could not parse requirement: %s", - dist.project_name - ) - continue - installations[req.name] = req - - if requirement: - # the options that don't get turned into an InstallRequirement - # should only be emitted once, even if the same option is in multiple - # requirements files, so we need to keep track of what has been emitted - # so that we don't emit it again if it's seen again - emitted_options = set() - for req_file_path in requirement: - with open(req_file_path) as req_file: - for line in req_file: - if (not line.strip() or - line.strip().startswith('#') or - (skip_match and skip_match(line)) or - line.startswith(( - '-r', '--requirement', - '-Z', '--always-unzip', - '-f', '--find-links', - '-i', '--index-url', - '--pre', - '--trusted-host', - '--process-dependency-links', - '--extra-index-url'))): - line = line.rstrip() - if line not in emitted_options: - emitted_options.add(line) - yield line - continue - - if line.startswith('-e') or line.startswith('--editable'): - if line.startswith('-e'): - line = line[2:].strip() - else: - line = line[len('--editable'):].strip().lstrip('=') - line_req = InstallRequirement.from_editable( - line, - default_vcs=default_vcs, - isolated=isolated, - wheel_cache=wheel_cache, - ) - else: - line_req = InstallRequirement.from_line( - COMMENT_RE.sub('', line).strip(), - isolated=isolated, - wheel_cache=wheel_cache, - ) - - if not line_req.name: - logger.info( - "Skipping line in requirement file [%s] because " - "it's not clear what it would install: %s", - req_file_path, line.strip(), - ) - logger.info( - " (add #egg=PackageName to the URL to avoid" - " this warning)" - ) - elif 
line_req.name not in installations: - logger.warning( - "Requirement file [%s] contains %s, but that " - "package is not installed", - req_file_path, COMMENT_RE.sub('', line).strip(), - ) - else: - yield str(installations[line_req.name]).rstrip() - del installations[line_req.name] - - yield( - '## The following requirements were added by ' - 'pip freeze:' - ) - for installation in sorted( - installations.values(), key=lambda x: x.name.lower()): - if canonicalize_name(installation.name) not in skip: - yield str(installation).rstrip() diff --git a/lib/python3.4/site-packages/pip/req/__init__.py b/lib/python3.4/site-packages/pip/req/__init__.py deleted file mode 100644 index 00185a4..0000000 --- a/lib/python3.4/site-packages/pip/req/__init__.py +++ /dev/null @@ -1,10 +0,0 @@ -from __future__ import absolute_import - -from .req_install import InstallRequirement -from .req_set import RequirementSet, Requirements -from .req_file import parse_requirements - -__all__ = [ - "RequirementSet", "Requirements", "InstallRequirement", - "parse_requirements", -] diff --git a/lib/python3.4/site-packages/pip/req/req_set.py b/lib/python3.4/site-packages/pip/req/req_set.py deleted file mode 100644 index 76aec06..0000000 --- a/lib/python3.4/site-packages/pip/req/req_set.py +++ /dev/null @@ -1,798 +0,0 @@ -from __future__ import absolute_import - -from collections import defaultdict -from itertools import chain -import logging -import os - -from pip._vendor import pkg_resources -from pip._vendor import requests - -from pip.compat import expanduser -from pip.download import (is_file_url, is_dir_url, is_vcs_url, url_to_path, - unpack_url) -from pip.exceptions import (InstallationError, BestVersionAlreadyInstalled, - DistributionNotFound, PreviousBuildDirError, - HashError, HashErrors, HashUnpinned, - DirectoryUrlHashUnsupported, VcsHashUnsupported, - UnsupportedPythonVersion) -from pip.req.req_install import InstallRequirement -from pip.utils import ( - display_path, dist_in_usersite, 
ensure_dir, normalize_path) -from pip.utils.hashes import MissingHashes -from pip.utils.logging import indent_log -from pip.utils.packaging import check_dist_requires_python -from pip.vcs import vcs -from pip.wheel import Wheel - -logger = logging.getLogger(__name__) - - -class Requirements(object): - - def __init__(self): - self._keys = [] - self._dict = {} - - def keys(self): - return self._keys - - def values(self): - return [self._dict[key] for key in self._keys] - - def __contains__(self, item): - return item in self._keys - - def __setitem__(self, key, value): - if key not in self._keys: - self._keys.append(key) - self._dict[key] = value - - def __getitem__(self, key): - return self._dict[key] - - def __repr__(self): - values = ['%s: %s' % (repr(k), repr(self[k])) for k in self.keys()] - return 'Requirements({%s})' % ', '.join(values) - - -class DistAbstraction(object): - """Abstracts out the wheel vs non-wheel prepare_files logic. - - The requirements for anything installable are as follows: - - we must be able to determine the requirement name - (or we can't correctly handle the non-upgrade case). - - we must be able to generate a list of run-time dependencies - without installing any additional packages (or we would - have to either burn time by doing temporary isolated installs - or alternatively violate pips 'don't start installing unless - all requirements are available' rule - neither of which are - desirable). - - for packages with setup requirements, we must also be able - to determine their requirements without installing additional - packages (for the same reason as run-time dependencies) - - we must be able to create a Distribution object exposing the - above metadata. 
- """ - - def __init__(self, req_to_install): - self.req_to_install = req_to_install - - def dist(self, finder): - """Return a setuptools Dist object.""" - raise NotImplementedError(self.dist) - - def prep_for_dist(self): - """Ensure that we can get a Dist for this requirement.""" - raise NotImplementedError(self.dist) - - -def make_abstract_dist(req_to_install): - """Factory to make an abstract dist object. - - Preconditions: Either an editable req with a source_dir, or satisfied_by or - a wheel link, or a non-editable req with a source_dir. - - :return: A concrete DistAbstraction. - """ - if req_to_install.editable: - return IsSDist(req_to_install) - elif req_to_install.link and req_to_install.link.is_wheel: - return IsWheel(req_to_install) - else: - return IsSDist(req_to_install) - - -class IsWheel(DistAbstraction): - - def dist(self, finder): - return list(pkg_resources.find_distributions( - self.req_to_install.source_dir))[0] - - def prep_for_dist(self): - # FIXME:https://github.com/pypa/pip/issues/1112 - pass - - -class IsSDist(DistAbstraction): - - def dist(self, finder): - dist = self.req_to_install.get_dist() - # FIXME: shouldn't be globally added: - if dist.has_metadata('dependency_links.txt'): - finder.add_dependency_links( - dist.get_metadata_lines('dependency_links.txt') - ) - return dist - - def prep_for_dist(self): - self.req_to_install.run_egg_info() - self.req_to_install.assert_source_matches_version() - - -class Installed(DistAbstraction): - - def dist(self, finder): - return self.req_to_install.satisfied_by - - def prep_for_dist(self): - pass - - -class RequirementSet(object): - - def __init__(self, build_dir, src_dir, download_dir, upgrade=False, - upgrade_strategy=None, ignore_installed=False, as_egg=False, - target_dir=None, ignore_dependencies=False, - force_reinstall=False, use_user_site=False, session=None, - pycompile=True, isolated=False, wheel_download_dir=None, - wheel_cache=None, require_hashes=False, - ignore_requires_python=False): - 
"""Create a RequirementSet. - - :param wheel_download_dir: Where still-packed .whl files should be - written to. If None they are written to the download_dir parameter. - Separate to download_dir to permit only keeping wheel archives for - pip wheel. - :param download_dir: Where still packed archives should be written to. - If None they are not saved, and are deleted immediately after - unpacking. - :param wheel_cache: The pip wheel cache, for passing to - InstallRequirement. - """ - if session is None: - raise TypeError( - "RequirementSet() missing 1 required keyword argument: " - "'session'" - ) - - self.build_dir = build_dir - self.src_dir = src_dir - # XXX: download_dir and wheel_download_dir overlap semantically and may - # be combined if we're willing to have non-wheel archives present in - # the wheelhouse output by 'pip wheel'. - self.download_dir = download_dir - self.upgrade = upgrade - self.upgrade_strategy = upgrade_strategy - self.ignore_installed = ignore_installed - self.force_reinstall = force_reinstall - self.requirements = Requirements() - # Mapping of alias: real_name - self.requirement_aliases = {} - self.unnamed_requirements = [] - self.ignore_dependencies = ignore_dependencies - self.ignore_requires_python = ignore_requires_python - self.successfully_downloaded = [] - self.successfully_installed = [] - self.reqs_to_cleanup = [] - self.as_egg = as_egg - self.use_user_site = use_user_site - self.target_dir = target_dir # set from --target option - self.session = session - self.pycompile = pycompile - self.isolated = isolated - if wheel_download_dir: - wheel_download_dir = normalize_path(wheel_download_dir) - self.wheel_download_dir = wheel_download_dir - self._wheel_cache = wheel_cache - self.require_hashes = require_hashes - # Maps from install_req -> dependencies_of_install_req - self._dependencies = defaultdict(list) - - def __str__(self): - reqs = [req for req in self.requirements.values() - if not req.comes_from] - reqs.sort(key=lambda req: 
req.name.lower()) - return ' '.join([str(req.req) for req in reqs]) - - def __repr__(self): - reqs = [req for req in self.requirements.values()] - reqs.sort(key=lambda req: req.name.lower()) - reqs_str = ', '.join([str(req.req) for req in reqs]) - return ('<%s object; %d requirement(s): %s>' - % (self.__class__.__name__, len(reqs), reqs_str)) - - def add_requirement(self, install_req, parent_req_name=None, - extras_requested=None): - """Add install_req as a requirement to install. - - :param parent_req_name: The name of the requirement that needed this - added. The name is used because when multiple unnamed requirements - resolve to the same name, we could otherwise end up with dependency - links that point outside the Requirements set. parent_req must - already be added. Note that None implies that this is a user - supplied requirement, vs an inferred one. - :param extras_requested: an iterable of extras used to evaluate the - environement markers. - :return: Additional requirements to scan. That is either [] if - the requirement is not applicable, or [install_req] if the - requirement is applicable and has just been added. - """ - name = install_req.name - if not install_req.match_markers(extras_requested): - logger.warning("Ignoring %s: markers '%s' don't match your " - "environment", install_req.name, - install_req.markers) - return [] - - # This check has to come after we filter requirements with the - # environment markers. - if install_req.link and install_req.link.is_wheel: - wheel = Wheel(install_req.link.filename) - if not wheel.supported(): - raise InstallationError( - "%s is not a supported wheel on this platform." 
% - wheel.filename - ) - - install_req.as_egg = self.as_egg - install_req.use_user_site = self.use_user_site - install_req.target_dir = self.target_dir - install_req.pycompile = self.pycompile - install_req.is_direct = (parent_req_name is None) - - if not name: - # url or path requirement w/o an egg fragment - self.unnamed_requirements.append(install_req) - return [install_req] - else: - try: - existing_req = self.get_requirement(name) - except KeyError: - existing_req = None - if (parent_req_name is None and existing_req and not - existing_req.constraint and - existing_req.extras == install_req.extras and not - existing_req.req.specifier == install_req.req.specifier): - raise InstallationError( - 'Double requirement given: %s (already in %s, name=%r)' - % (install_req, existing_req, name)) - if not existing_req: - # Add requirement - self.requirements[name] = install_req - # FIXME: what about other normalizations? E.g., _ vs. -? - if name.lower() != name: - self.requirement_aliases[name.lower()] = name - result = [install_req] - else: - # Assume there's no need to scan, and that we've already - # encountered this for scanning. - result = [] - if not install_req.constraint and existing_req.constraint: - if (install_req.link and not (existing_req.link and - install_req.link.path == existing_req.link.path)): - self.reqs_to_cleanup.append(install_req) - raise InstallationError( - "Could not satisfy constraints for '%s': " - "installation from path or url cannot be " - "constrained to a version" % name) - # If we're now installing a constraint, mark the existing - # object for real installation. - existing_req.constraint = False - existing_req.extras = tuple( - sorted(set(existing_req.extras).union( - set(install_req.extras)))) - logger.debug("Setting %s extras to: %s", - existing_req, existing_req.extras) - # And now we need to scan this. - result = [existing_req] - # Canonicalise to the already-added object for the backref - # check below. 
- install_req = existing_req - if parent_req_name: - parent_req = self.get_requirement(parent_req_name) - self._dependencies[parent_req].append(install_req) - return result - - def has_requirement(self, project_name): - name = project_name.lower() - if (name in self.requirements and - not self.requirements[name].constraint or - name in self.requirement_aliases and - not self.requirements[self.requirement_aliases[name]].constraint): - return True - return False - - @property - def has_requirements(self): - return list(req for req in self.requirements.values() if not - req.constraint) or self.unnamed_requirements - - @property - def is_download(self): - if self.download_dir: - self.download_dir = expanduser(self.download_dir) - if os.path.exists(self.download_dir): - return True - else: - logger.critical('Could not find download directory') - raise InstallationError( - "Could not find or access download directory '%s'" - % display_path(self.download_dir)) - return False - - def get_requirement(self, project_name): - for name in project_name, project_name.lower(): - if name in self.requirements: - return self.requirements[name] - if name in self.requirement_aliases: - return self.requirements[self.requirement_aliases[name]] - raise KeyError("No project with the name %r" % project_name) - - def uninstall(self, auto_confirm=False): - for req in self.requirements.values(): - if req.constraint: - continue - req.uninstall(auto_confirm=auto_confirm) - req.commit_uninstall() - - def prepare_files(self, finder): - """ - Prepare process. Create temp directories, download and/or unpack files. - """ - # make the wheelhouse - if self.wheel_download_dir: - ensure_dir(self.wheel_download_dir) - - # If any top-level requirement has a hash specified, enter - # hash-checking mode, which requires hashes from all. 
- root_reqs = self.unnamed_requirements + self.requirements.values() - require_hashes = (self.require_hashes or - any(req.has_hash_options for req in root_reqs)) - if require_hashes and self.as_egg: - raise InstallationError( - '--egg is not allowed with --require-hashes mode, since it ' - 'delegates dependency resolution to setuptools and could thus ' - 'result in installation of unhashed packages.') - - # Actually prepare the files, and collect any exceptions. Most hash - # exceptions cannot be checked ahead of time, because - # req.populate_link() needs to be called before we can make decisions - # based on link type. - discovered_reqs = [] - hash_errors = HashErrors() - for req in chain(root_reqs, discovered_reqs): - try: - discovered_reqs.extend(self._prepare_file( - finder, - req, - require_hashes=require_hashes, - ignore_dependencies=self.ignore_dependencies)) - except HashError as exc: - exc.req = req - hash_errors.append(exc) - - if hash_errors: - raise hash_errors - - def _is_upgrade_allowed(self, req): - return self.upgrade and ( - self.upgrade_strategy == "eager" or ( - self.upgrade_strategy == "only-if-needed" and req.is_direct - ) - ) - - def _check_skip_installed(self, req_to_install, finder): - """Check if req_to_install should be skipped. - - This will check if the req is installed, and whether we should upgrade - or reinstall it, taking into account all the relevant user options. - - After calling this req_to_install will only have satisfied_by set to - None if the req_to_install is to be upgraded/reinstalled etc. Any - other value will be a dist recording the current thing installed that - satisfies the requirement. - - Note that for vcs urls and the like we can't assess skipping in this - routine - we simply identify that we need to pull the thing down, - then later on it is pulled down and introspected to assess upgrade/ - reinstalls etc. - - :return: A text reason for why it was skipped, or None. 
- """ - # Check whether to upgrade/reinstall this req or not. - req_to_install.check_if_exists() - if req_to_install.satisfied_by: - upgrade_allowed = self._is_upgrade_allowed(req_to_install) - - # Is the best version is installed. - best_installed = False - - if upgrade_allowed: - # For link based requirements we have to pull the - # tree down and inspect to assess the version #, so - # its handled way down. - if not (self.force_reinstall or req_to_install.link): - try: - finder.find_requirement( - req_to_install, upgrade_allowed) - except BestVersionAlreadyInstalled: - best_installed = True - except DistributionNotFound: - # No distribution found, so we squash the - # error - it will be raised later when we - # re-try later to do the install. - # Why don't we just raise here? - pass - - if not best_installed: - # don't uninstall conflict if user install and - # conflict is not user install - if not (self.use_user_site and not - dist_in_usersite(req_to_install.satisfied_by)): - req_to_install.conflicts_with = \ - req_to_install.satisfied_by - req_to_install.satisfied_by = None - - # Figure out a nice message to say why we're skipping this. - if best_installed: - skip_reason = 'already up-to-date' - elif self.upgrade_strategy == "only-if-needed": - skip_reason = 'not upgraded as not directly required' - else: - skip_reason = 'already satisfied' - - return skip_reason - else: - return None - - def _prepare_file(self, - finder, - req_to_install, - require_hashes=False, - ignore_dependencies=False): - """Prepare a single requirements file. - - :return: A list of additional InstallRequirements to also install. 
- """ - # Tell user what we are doing for this requirement: - # obtain (editable), skipping, processing (local url), collecting - # (remote url or package name) - if req_to_install.constraint or req_to_install.prepared: - return [] - - req_to_install.prepared = True - - # ###################### # - # # print log messages # # - # ###################### # - if req_to_install.editable: - logger.info('Obtaining %s', req_to_install) - else: - # satisfied_by is only evaluated by calling _check_skip_installed, - # so it must be None here. - assert req_to_install.satisfied_by is None - if not self.ignore_installed: - skip_reason = self._check_skip_installed( - req_to_install, finder) - - if req_to_install.satisfied_by: - assert skip_reason is not None, ( - '_check_skip_installed returned None but ' - 'req_to_install.satisfied_by is set to %r' - % (req_to_install.satisfied_by,)) - logger.info( - 'Requirement %s: %s', skip_reason, - req_to_install) - else: - if (req_to_install.link and - req_to_install.link.scheme == 'file'): - path = url_to_path(req_to_install.link.url) - logger.info('Processing %s', display_path(path)) - else: - logger.info('Collecting %s', req_to_install) - - with indent_log(): - # ################################ # - # # vcs update or unpack archive # # - # ################################ # - if req_to_install.editable: - if require_hashes: - raise InstallationError( - 'The editable requirement %s cannot be installed when ' - 'requiring hashes, because there is no single file to ' - 'hash.' 
% req_to_install) - req_to_install.ensure_has_source_dir(self.src_dir) - req_to_install.update_editable(not self.is_download) - abstract_dist = make_abstract_dist(req_to_install) - abstract_dist.prep_for_dist() - if self.is_download: - req_to_install.archive(self.download_dir) - req_to_install.check_if_exists() - elif req_to_install.satisfied_by: - if require_hashes: - logger.debug( - 'Since it is already installed, we are trusting this ' - 'package without checking its hash. To ensure a ' - 'completely repeatable environment, install into an ' - 'empty virtualenv.') - abstract_dist = Installed(req_to_install) - else: - # @@ if filesystem packages are not marked - # editable in a req, a non deterministic error - # occurs when the script attempts to unpack the - # build directory - req_to_install.ensure_has_source_dir(self.build_dir) - # If a checkout exists, it's unwise to keep going. version - # inconsistencies are logged later, but do not fail the - # installation. - # FIXME: this won't upgrade when there's an existing - # package unpacked in `req_to_install.source_dir` - if os.path.exists( - os.path.join(req_to_install.source_dir, 'setup.py')): - raise PreviousBuildDirError( - "pip can't proceed with requirements '%s' due to a" - " pre-existing build directory (%s). This is " - "likely due to a previous installation that failed" - ". pip is being responsible and not assuming it " - "can delete this. Please delete it and try again." - % (req_to_install, req_to_install.source_dir) - ) - req_to_install.populate_link( - finder, - self._is_upgrade_allowed(req_to_install), - require_hashes - ) - # We can't hit this spot and have populate_link return None. - # req_to_install.satisfied_by is None here (because we're - # guarded) and upgrade has no impact except when satisfied_by - # is not None. - # Then inside find_requirement existing_applicable -> False - # If no new versions are found, DistributionNotFound is raised, - # otherwise a result is guaranteed. 
- assert req_to_install.link - link = req_to_install.link - - # Now that we have the real link, we can tell what kind of - # requirements we have and raise some more informative errors - # than otherwise. (For example, we can raise VcsHashUnsupported - # for a VCS URL rather than HashMissing.) - if require_hashes: - # We could check these first 2 conditions inside - # unpack_url and save repetition of conditions, but then - # we would report less-useful error messages for - # unhashable requirements, complaining that there's no - # hash provided. - if is_vcs_url(link): - raise VcsHashUnsupported() - elif is_file_url(link) and is_dir_url(link): - raise DirectoryUrlHashUnsupported() - if (not req_to_install.original_link and - not req_to_install.is_pinned): - # Unpinned packages are asking for trouble when a new - # version is uploaded. This isn't a security check, but - # it saves users a surprising hash mismatch in the - # future. - # - # file:/// URLs aren't pinnable, so don't complain - # about them not being pinned. - raise HashUnpinned() - hashes = req_to_install.hashes( - trust_internet=not require_hashes) - if require_hashes and not hashes: - # Known-good hashes are missing for this requirement, so - # shim it with a facade object that will provoke hash - # computation and then raise a HashMissing exception - # showing the user what the hash should be. - hashes = MissingHashes() - - try: - download_dir = self.download_dir - # We always delete unpacked sdists after pip ran. - autodelete_unpacked = True - if req_to_install.link.is_wheel \ - and self.wheel_download_dir: - # when doing 'pip wheel` we download wheels to a - # dedicated dir. - download_dir = self.wheel_download_dir - if req_to_install.link.is_wheel: - if download_dir: - # When downloading, we only unpack wheels to get - # metadata. - autodelete_unpacked = True - else: - # When installing a wheel, we use the unpacked - # wheel. 
- autodelete_unpacked = False - unpack_url( - req_to_install.link, req_to_install.source_dir, - download_dir, autodelete_unpacked, - session=self.session, hashes=hashes) - except requests.HTTPError as exc: - logger.critical( - 'Could not install requirement %s because ' - 'of error %s', - req_to_install, - exc, - ) - raise InstallationError( - 'Could not install requirement %s because ' - 'of HTTP error %s for URL %s' % - (req_to_install, exc, req_to_install.link) - ) - abstract_dist = make_abstract_dist(req_to_install) - abstract_dist.prep_for_dist() - if self.is_download: - # Make a .zip of the source_dir we already created. - if req_to_install.link.scheme in vcs.all_schemes: - req_to_install.archive(self.download_dir) - # req_to_install.req is only avail after unpack for URL - # pkgs repeat check_if_exists to uninstall-on-upgrade - # (#14) - if not self.ignore_installed: - req_to_install.check_if_exists() - if req_to_install.satisfied_by: - if self.upgrade or self.ignore_installed: - # don't uninstall conflict if user install and - # conflict is not user install - if not (self.use_user_site and not - dist_in_usersite( - req_to_install.satisfied_by)): - req_to_install.conflicts_with = \ - req_to_install.satisfied_by - req_to_install.satisfied_by = None - else: - logger.info( - 'Requirement already satisfied (use ' - '--upgrade to upgrade): %s', - req_to_install, - ) - - # ###################### # - # # parse dependencies # # - # ###################### # - dist = abstract_dist.dist(finder) - try: - check_dist_requires_python(dist) - except UnsupportedPythonVersion as e: - if self.ignore_requires_python: - logger.warning(e.args[0]) - else: - req_to_install.remove_temporary_source() - raise - more_reqs = [] - - def add_req(subreq, extras_requested): - sub_install_req = InstallRequirement( - str(subreq), - req_to_install, - isolated=self.isolated, - wheel_cache=self._wheel_cache, - ) - more_reqs.extend(self.add_requirement( - sub_install_req, req_to_install.name, - 
extras_requested=extras_requested)) - - # We add req_to_install before its dependencies, so that we - # can refer to it when adding dependencies. - if not self.has_requirement(req_to_install.name): - # 'unnamed' requirements will get added here - self.add_requirement(req_to_install, None) - - if not ignore_dependencies: - if (req_to_install.extras): - logger.debug( - "Installing extra requirements: %r", - ','.join(req_to_install.extras), - ) - missing_requested = sorted( - set(req_to_install.extras) - set(dist.extras) - ) - for missing in missing_requested: - logger.warning( - '%s does not provide the extra \'%s\'', - dist, missing - ) - - available_requested = sorted( - set(dist.extras) & set(req_to_install.extras) - ) - for subreq in dist.requires(available_requested): - add_req(subreq, extras_requested=available_requested) - - # cleanup tmp src - self.reqs_to_cleanup.append(req_to_install) - - if not req_to_install.editable and not req_to_install.satisfied_by: - # XXX: --no-install leads this to report 'Successfully - # downloaded' for only non-editable reqs, even though we took - # action on them. - self.successfully_downloaded.append(req_to_install) - - return more_reqs - - def cleanup_files(self): - """Clean up files, remove builds.""" - logger.debug('Cleaning up...') - with indent_log(): - for req in self.reqs_to_cleanup: - req.remove_temporary_source() - - def _to_install(self): - """Create the installation order. - - The installation order is topological - requirements are installed - before the requiring thing. We break cycles at an arbitrary point, - and make no other guarantees. - """ - # The current implementation, which we may change at any point - # installs the user specified things in the order given, except when - # dependencies must come earlier to achieve topological order. 
- order = [] - ordered_reqs = set() - - def schedule(req): - if req.satisfied_by or req in ordered_reqs: - return - if req.constraint: - return - ordered_reqs.add(req) - for dep in self._dependencies[req]: - schedule(dep) - order.append(req) - for install_req in self.requirements.values(): - schedule(install_req) - return order - - def install(self, install_options, global_options=(), *args, **kwargs): - """ - Install everything in this set (after having downloaded and unpacked - the packages) - """ - to_install = self._to_install() - - if to_install: - logger.info( - 'Installing collected packages: %s', - ', '.join([req.name for req in to_install]), - ) - - with indent_log(): - for requirement in to_install: - if requirement.conflicts_with: - logger.info( - 'Found existing installation: %s', - requirement.conflicts_with, - ) - with indent_log(): - requirement.uninstall(auto_confirm=True) - try: - requirement.install( - install_options, - global_options, - *args, - **kwargs - ) - except: - # if install did not succeed, rollback previous uninstall - if (requirement.conflicts_with and not - requirement.install_succeeded): - requirement.rollback_uninstall() - raise - else: - if (requirement.conflicts_with and - requirement.install_succeeded): - requirement.commit_uninstall() - requirement.remove_temporary_source() - - self.successfully_installed = to_install diff --git a/lib/python3.4/site-packages/pip/req/req_uninstall.py b/lib/python3.4/site-packages/pip/req/req_uninstall.py deleted file mode 100644 index 5248430..0000000 --- a/lib/python3.4/site-packages/pip/req/req_uninstall.py +++ /dev/null @@ -1,195 +0,0 @@ -from __future__ import absolute_import - -import logging -import os -import tempfile - -from pip.compat import uses_pycache, WINDOWS, cache_from_source -from pip.exceptions import UninstallationError -from pip.utils import rmtree, ask, is_local, renames, normalize_path -from pip.utils.logging import indent_log - - -logger = logging.getLogger(__name__) - - 
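The `_to_install` scheduler shown above performs a post-order depth-first walk over pip's `install_req -> dependencies` map, so every requirement is emitted only after everything it depends on; cycles are broken wherever the walk happens to revisit a node, matching the "no other guarantees" caveat in its docstring. A minimal standalone sketch of the same idea (function and parameter names here are illustrative, not pip's API):

```python
def install_order(roots, dependencies):
    """Return items in topological order: dependencies first.

    `dependencies` maps each item to the list of items it depends on,
    mirroring pip's `self._dependencies` defaultdict. A cycle is broken
    at whatever point the walk first revisits a node, so ordering within
    a cycle is arbitrary but the walk always terminates.
    """
    order = []
    seen = set()

    def schedule(item):
        if item in seen:
            return
        seen.add(item)
        for dep in dependencies.get(item, []):
            schedule(dep)
        # Emitted only after all of this item's dependencies.
        order.append(item)

    for root in roots:
        schedule(root)
    return order
```

For example, `install_order(['app'], {'app': ['lib'], 'lib': ['base']})` yields `['base', 'lib', 'app']`: the leaf dependency installs first, the user-requested package last.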
-class UninstallPathSet(object): - """A set of file paths to be removed in the uninstallation of a - requirement.""" - def __init__(self, dist): - self.paths = set() - self._refuse = set() - self.pth = {} - self.dist = dist - self.save_dir = None - self._moved_paths = [] - - def _permitted(self, path): - """ - Return True if the given path is one we are permitted to - remove/modify, False otherwise. - - """ - return is_local(path) - - def add(self, path): - head, tail = os.path.split(path) - - # we normalize the head to resolve parent directory symlinks, but not - # the tail, since we only want to uninstall symlinks, not their targets - path = os.path.join(normalize_path(head), os.path.normcase(tail)) - - if not os.path.exists(path): - return - if self._permitted(path): - self.paths.add(path) - else: - self._refuse.add(path) - - # __pycache__ files can show up after 'installed-files.txt' is created, - # due to imports - if os.path.splitext(path)[1] == '.py' and uses_pycache: - self.add(cache_from_source(path)) - - def add_pth(self, pth_file, entry): - pth_file = normalize_path(pth_file) - if self._permitted(pth_file): - if pth_file not in self.pth: - self.pth[pth_file] = UninstallPthEntries(pth_file) - self.pth[pth_file].add(entry) - else: - self._refuse.add(pth_file) - - def compact(self, paths): - """Compact a path set to contain the minimal number of paths - necessary to contain all paths in the set. 
If /a/path/ and - /a/path/to/a/file.txt are both in the set, leave only the - shorter path.""" - short_paths = set() - for path in sorted(paths, key=len): - if not any([ - (path.startswith(shortpath) and - path[len(shortpath.rstrip(os.path.sep))] == os.path.sep) - for shortpath in short_paths]): - short_paths.add(path) - return short_paths - - def _stash(self, path): - return os.path.join( - self.save_dir, os.path.splitdrive(path)[1].lstrip(os.path.sep)) - - def remove(self, auto_confirm=False): - """Remove paths in ``self.paths`` with confirmation (unless - ``auto_confirm`` is True).""" - if not self.paths: - logger.info( - "Can't uninstall '%s'. No files were found to uninstall.", - self.dist.project_name, - ) - return - logger.info( - 'Uninstalling %s-%s:', - self.dist.project_name, self.dist.version - ) - - with indent_log(): - paths = sorted(self.compact(self.paths)) - - if auto_confirm: - response = 'y' - else: - for path in paths: - logger.info(path) - response = ask('Proceed (y/n)? 
', ('y', 'n')) - if self._refuse: - logger.info('Not removing or modifying (outside of prefix):') - for path in self.compact(self._refuse): - logger.info(path) - if response == 'y': - self.save_dir = tempfile.mkdtemp(suffix='-uninstall', - prefix='pip-') - for path in paths: - new_path = self._stash(path) - logger.debug('Removing file or directory %s', path) - self._moved_paths.append(path) - renames(path, new_path) - for pth in self.pth.values(): - pth.remove() - logger.info( - 'Successfully uninstalled %s-%s', - self.dist.project_name, self.dist.version - ) - - def rollback(self): - """Rollback the changes previously made by remove().""" - if self.save_dir is None: - logger.error( - "Can't roll back %s; was not uninstalled", - self.dist.project_name, - ) - return False - logger.info('Rolling back uninstall of %s', self.dist.project_name) - for path in self._moved_paths: - tmp_path = self._stash(path) - logger.debug('Replacing %s', path) - renames(tmp_path, path) - for pth in self.pth.values(): - pth.rollback() - - def commit(self): - """Remove temporary save dir: rollback will no longer be possible.""" - if self.save_dir is not None: - rmtree(self.save_dir) - self.save_dir = None - self._moved_paths = [] - - -class UninstallPthEntries(object): - def __init__(self, pth_file): - if not os.path.isfile(pth_file): - raise UninstallationError( - "Cannot remove entries from nonexistent file %s" % pth_file - ) - self.file = pth_file - self.entries = set() - self._saved_lines = None - - def add(self, entry): - entry = os.path.normcase(entry) - # On Windows, os.path.normcase converts the entry to use - # backslashes. This is correct for entries that describe absolute - # paths outside of site-packages, but all the others use forward - # slashes. 
- if WINDOWS and not os.path.splitdrive(entry)[0]: - entry = entry.replace('\\', '/') - self.entries.add(entry) - - def remove(self): - logger.debug('Removing pth entries from %s:', self.file) - with open(self.file, 'rb') as fh: - # windows uses '\r\n' with py3k, but uses '\n' with py2.x - lines = fh.readlines() - self._saved_lines = lines - if any(b'\r\n' in line for line in lines): - endline = '\r\n' - else: - endline = '\n' - for entry in self.entries: - try: - logger.debug('Removing entry: %s', entry) - lines.remove((entry + endline).encode("utf-8")) - except ValueError: - pass - with open(self.file, 'wb') as fh: - fh.writelines(lines) - - def rollback(self): - if self._saved_lines is None: - logger.error( - 'Cannot roll back changes to %s, none were made', self.file - ) - return False - logger.debug('Rolling %s back to previous state', self.file) - with open(self.file, 'wb') as fh: - fh.writelines(self._saved_lines) - return True diff --git a/lib/python3.4/site-packages/pip/utils/build.py b/lib/python3.4/site-packages/pip/utils/build.py deleted file mode 100644 index fc65cfa..0000000 --- a/lib/python3.4/site-packages/pip/utils/build.py +++ /dev/null @@ -1,42 +0,0 @@ -from __future__ import absolute_import - -import os.path -import tempfile - -from pip.utils import rmtree - - -class BuildDirectory(object): - - def __init__(self, name=None, delete=None): - # If we were not given an explicit directory, and we were not given an - # explicit delete option, then we'll default to deleting. - if name is None and delete is None: - delete = True - - if name is None: - # We realpath here because some systems have their default tmpdir - # symlinked to another directory. This tends to confuse build - # scripts, so we canonicalize the path by traversing potential - # symlinks here. 
- name = os.path.realpath(tempfile.mkdtemp(prefix="pip-build-")) - # If we were not given an explicit directory, and we were not given - # an explicit delete option, then we'll default to deleting. - if delete is None: - delete = True - - self.name = name - self.delete = delete - - def __repr__(self): - return "<{} {!r}>".format(self.__class__.__name__, self.name) - - def __enter__(self): - return self.name - - def __exit__(self, exc, value, tb): - self.cleanup() - - def cleanup(self): - if self.delete: - rmtree(self.name) diff --git a/lib/python3.4/site-packages/pip/utils/deprecation.py b/lib/python3.4/site-packages/pip/utils/deprecation.py deleted file mode 100644 index c3f799e..0000000 --- a/lib/python3.4/site-packages/pip/utils/deprecation.py +++ /dev/null @@ -1,76 +0,0 @@ -""" -A module that implements tooling to enable easy warnings about deprecations. -""" -from __future__ import absolute_import - -import logging -import warnings - - -class PipDeprecationWarning(Warning): - pass - - -class Pending(object): - pass - - -class RemovedInPip10Warning(PipDeprecationWarning): - pass - - -class RemovedInPip11Warning(PipDeprecationWarning, Pending): - pass - - -class Python26DeprecationWarning(PipDeprecationWarning): - pass - - -# Warnings <-> Logging Integration - - -_warnings_showwarning = None - - -def _showwarning(message, category, filename, lineno, file=None, line=None): - if file is not None: - if _warnings_showwarning is not None: - _warnings_showwarning( - message, category, filename, lineno, file, line, - ) - else: - if issubclass(category, PipDeprecationWarning): - # We use a specially named logger which will handle all of the - # deprecation messages for pip. - logger = logging.getLogger("pip.deprecations") - - # This is purposely using the % formatter here instead of letting - # the logging module handle the interpolation. This is because we - # want it to appear as if someone typed this entire message out. 
- log_message = "DEPRECATION: %s" % message - - # PipDeprecationWarnings that are Pending still have at least 2 - # versions to go until they are removed so they can just be - # warnings. Otherwise, they will be removed in the very next - # version of pip. We want these to be more obvious so we use the - # ERROR logging level. - if issubclass(category, Pending): - logger.warning(log_message) - else: - logger.error(log_message) - else: - _warnings_showwarning( - message, category, filename, lineno, file, line, - ) - - -def install_warning_logger(): - # Enable our Deprecation Warnings - warnings.simplefilter("default", PipDeprecationWarning, append=True) - - global _warnings_showwarning - - if _warnings_showwarning is None: - _warnings_showwarning = warnings.showwarning - warnings.showwarning = _showwarning diff --git a/lib/python3.4/site-packages/pip/utils/logging.py b/lib/python3.4/site-packages/pip/utils/logging.py deleted file mode 100644 index 1c1053a..0000000 --- a/lib/python3.4/site-packages/pip/utils/logging.py +++ /dev/null @@ -1,130 +0,0 @@ -from __future__ import absolute_import - -import contextlib -import logging -import logging.handlers -import os - -try: - import threading -except ImportError: - import dummy_threading as threading - -from pip.compat import WINDOWS -from pip.utils import ensure_dir - -try: - from pip._vendor import colorama -# Lots of different errors can come from this, including SystemError and -# ImportError. -except Exception: - colorama = None - - -_log_state = threading.local() -_log_state.indentation = 0 - - -@contextlib.contextmanager -def indent_log(num=2): - """ - A context manager which will cause the log output to be indented for any - log messages emitted inside it. 
- """ - _log_state.indentation += num - try: - yield - finally: - _log_state.indentation -= num - - -def get_indentation(): - return getattr(_log_state, 'indentation', 0) - - -class IndentingFormatter(logging.Formatter): - - def format(self, record): - """ - Calls the standard formatter, but will indent all of the log messages - by our current indentation level. - """ - formatted = logging.Formatter.format(self, record) - formatted = "".join([ - (" " * get_indentation()) + line - for line in formatted.splitlines(True) - ]) - return formatted - - -def _color_wrap(*colors): - def wrapped(inp): - return "".join(list(colors) + [inp, colorama.Style.RESET_ALL]) - return wrapped - - -class ColorizedStreamHandler(logging.StreamHandler): - - # Don't build up a list of colors if we don't have colorama - if colorama: - COLORS = [ - # This needs to be in order from highest logging level to lowest. - (logging.ERROR, _color_wrap(colorama.Fore.RED)), - (logging.WARNING, _color_wrap(colorama.Fore.YELLOW)), - ] - else: - COLORS = [] - - def __init__(self, stream=None): - logging.StreamHandler.__init__(self, stream) - - if WINDOWS and colorama: - self.stream = colorama.AnsiToWin32(self.stream) - - def should_color(self): - # Don't colorize things if we do not have colorama - if not colorama: - return False - - real_stream = ( - self.stream if not isinstance(self.stream, colorama.AnsiToWin32) - else self.stream.wrapped - ) - - # If the stream is a tty we should color it - if hasattr(real_stream, "isatty") and real_stream.isatty(): - return True - - # If we have an ASNI term we should color it - if os.environ.get("TERM") == "ANSI": - return True - - # If anything else we should not color it - return False - - def format(self, record): - msg = logging.StreamHandler.format(self, record) - - if self.should_color(): - for level, color in self.COLORS: - if record.levelno >= level: - msg = color(msg) - break - - return msg - - -class 
BetterRotatingFileHandler(logging.handlers.RotatingFileHandler): - - def _open(self): - ensure_dir(os.path.dirname(self.baseFilename)) - return logging.handlers.RotatingFileHandler._open(self) - - -class MaxLevelFilter(logging.Filter): - - def __init__(self, level): - self.level = level - - def filter(self, record): - return record.levelno < self.level diff --git a/lib/python3.4/site-packages/pip/vcs/__init__.py b/lib/python3.4/site-packages/pip/vcs/__init__.py deleted file mode 100644 index 8d3dbb2..0000000 --- a/lib/python3.4/site-packages/pip/vcs/__init__.py +++ /dev/null @@ -1,366 +0,0 @@ -"""Handles all VCS (version control) support""" -from __future__ import absolute_import - -import errno -import logging -import os -import shutil -import sys - -from pip._vendor.six.moves.urllib import parse as urllib_parse - -from pip.exceptions import BadCommand -from pip.utils import (display_path, backup_dir, call_subprocess, - rmtree, ask_path_exists) - - -__all__ = ['vcs', 'get_src_requirement'] - - -logger = logging.getLogger(__name__) - - -class VcsSupport(object): - _registry = {} - schemes = ['ssh', 'git', 'hg', 'bzr', 'sftp', 'svn'] - - def __init__(self): - # Register more schemes with urlparse for various version control - # systems - urllib_parse.uses_netloc.extend(self.schemes) - # Python >= 2.7.4, 3.3 doesn't have uses_fragment - if getattr(urllib_parse, 'uses_fragment', None): - urllib_parse.uses_fragment.extend(self.schemes) - super(VcsSupport, self).__init__() - - def __iter__(self): - return self._registry.__iter__() - - @property - def backends(self): - return list(self._registry.values()) - - @property - def dirnames(self): - return [backend.dirname for backend in self.backends] - - @property - def all_schemes(self): - schemes = [] - for backend in self.backends: - schemes.extend(backend.schemes) - return schemes - - def register(self, cls): - if not hasattr(cls, 'name'): - logger.warning('Cannot register VCS %s', cls.__name__) - return - if cls.name not 
in self._registry: - self._registry[cls.name] = cls - logger.debug('Registered VCS backend: %s', cls.name) - - def unregister(self, cls=None, name=None): - if name in self._registry: - del self._registry[name] - elif cls in self._registry.values(): - del self._registry[cls.name] - else: - logger.warning('Cannot unregister because no class or name given') - - def get_backend_name(self, location): - """ - Return the name of the version control backend if found at given - location, e.g. vcs.get_backend_name('/path/to/vcs/checkout') - """ - for vc_type in self._registry.values(): - if vc_type.controls_location(location): - logger.debug('Determine that %s uses VCS: %s', - location, vc_type.name) - return vc_type.name - return None - - def get_backend(self, name): - name = name.lower() - if name in self._registry: - return self._registry[name] - - def get_backend_from_location(self, location): - vc_type = self.get_backend_name(location) - if vc_type: - return self.get_backend(vc_type) - return None - - -vcs = VcsSupport() - - -class VersionControl(object): - name = '' - dirname = '' - # List of supported schemes for this Version Control - schemes = () - - def __init__(self, url=None, *args, **kwargs): - self.url = url - super(VersionControl, self).__init__(*args, **kwargs) - - def _is_local_repository(self, repo): - """ - posix absolute paths start with os.path.sep, - win32 ones start with drive (like c:\\folder) - """ - drive, tail = os.path.splitdrive(repo) - return repo.startswith(os.path.sep) or drive - - # See issue #1083 for why this method was introduced: - # https://github.com/pypa/pip/issues/1083 - def translate_egg_surname(self, surname): - # For example, Django has branches of the form "stable/1.7.x". - return surname.replace('/', '_') - - def export(self, location): - """ - Export the repository at the url to the destination location - i.e. 
only download the files, without vcs informations - """ - raise NotImplementedError - - def get_url_rev(self): - """ - Returns the correct repository URL and revision by parsing the given - repository URL - """ - error_message = ( - "Sorry, '%s' is a malformed VCS url. " - "The format is +://, " - "e.g. svn+http://myrepo/svn/MyApp#egg=MyApp" - ) - assert '+' in self.url, error_message % self.url - url = self.url.split('+', 1)[1] - scheme, netloc, path, query, frag = urllib_parse.urlsplit(url) - rev = None - if '@' in path: - path, rev = path.rsplit('@', 1) - url = urllib_parse.urlunsplit((scheme, netloc, path, query, '')) - return url, rev - - def get_info(self, location): - """ - Returns (url, revision), where both are strings - """ - assert not location.rstrip('/').endswith(self.dirname), \ - 'Bad directory: %s' % location - return self.get_url(location), self.get_revision(location) - - def normalize_url(self, url): - """ - Normalize a URL for comparison by unquoting it and removing any - trailing slash. - """ - return urllib_parse.unquote(url).rstrip('/') - - def compare_urls(self, url1, url2): - """ - Compare two repo URLs for identity, ignoring incidental differences. - """ - return (self.normalize_url(url1) == self.normalize_url(url2)) - - def obtain(self, dest): - """ - Called when installing or updating an editable package, takes the - source path of the checkout. - """ - raise NotImplementedError - - def switch(self, dest, url, rev_options): - """ - Switch the repo at ``dest`` to point to ``URL``. - """ - raise NotImplementedError - - def update(self, dest, rev_options): - """ - Update an already-existing repo to the given ``rev_options``. - """ - raise NotImplementedError - - def check_version(self, dest, rev_options): - """ - Return True if the version is identical to what exists and - doesn't need to be updated. 
- """ - raise NotImplementedError - - def check_destination(self, dest, url, rev_options, rev_display): - """ - Prepare a location to receive a checkout/clone. - - Return True if the location is ready for (and requires) a - checkout/clone, False otherwise. - """ - checkout = True - prompt = False - if os.path.exists(dest): - checkout = False - if os.path.exists(os.path.join(dest, self.dirname)): - existing_url = self.get_url(dest) - if self.compare_urls(existing_url, url): - logger.debug( - '%s in %s exists, and has correct URL (%s)', - self.repo_name.title(), - display_path(dest), - url, - ) - if not self.check_version(dest, rev_options): - logger.info( - 'Updating %s %s%s', - display_path(dest), - self.repo_name, - rev_display, - ) - self.update(dest, rev_options) - else: - logger.info( - 'Skipping because already up-to-date.') - else: - logger.warning( - '%s %s in %s exists with URL %s', - self.name, - self.repo_name, - display_path(dest), - existing_url, - ) - prompt = ('(s)witch, (i)gnore, (w)ipe, (b)ackup ', - ('s', 'i', 'w', 'b')) - else: - logger.warning( - 'Directory %s already exists, and is not a %s %s.', - dest, - self.name, - self.repo_name, - ) - prompt = ('(i)gnore, (w)ipe, (b)ackup ', ('i', 'w', 'b')) - if prompt: - logger.warning( - 'The plan is to install the %s repository %s', - self.name, - url, - ) - response = ask_path_exists('What to do? 
%s' % prompt[0], - prompt[1]) - - if response == 's': - logger.info( - 'Switching %s %s to %s%s', - self.repo_name, - display_path(dest), - url, - rev_display, - ) - self.switch(dest, url, rev_options) - elif response == 'i': - # do nothing - pass - elif response == 'w': - logger.warning('Deleting %s', display_path(dest)) - rmtree(dest) - checkout = True - elif response == 'b': - dest_dir = backup_dir(dest) - logger.warning( - 'Backing up %s to %s', display_path(dest), dest_dir, - ) - shutil.move(dest, dest_dir) - checkout = True - elif response == 'a': - sys.exit(-1) - return checkout - - def unpack(self, location): - """ - Clean up current location and download the url repository - (and vcs infos) into location - """ - if os.path.exists(location): - rmtree(location) - self.obtain(location) - - def get_src_requirement(self, dist, location): - """ - Return a string representing the requirement needed to - redownload the files currently present in location, something - like: - {repository_url}@{revision}#egg={project_name}-{version_identifier} - """ - raise NotImplementedError - - def get_url(self, location): - """ - Return the url used at location - Used in get_info or check_destination - """ - raise NotImplementedError - - def get_revision(self, location): - """ - Return the current revision of the files at location - Used in get_info - """ - raise NotImplementedError - - def run_command(self, cmd, show_stdout=True, cwd=None, - on_returncode='raise', - command_desc=None, - extra_environ=None, spinner=None): - """ - Run a VCS subcommand - This is simply a wrapper around call_subprocess that adds the VCS - command name, and checks that the VCS is available - """ - cmd = [self.name] + cmd - try: - return call_subprocess(cmd, show_stdout, cwd, - on_returncode, - command_desc, extra_environ, - spinner) - except OSError as e: - # errno.ENOENT = no such file or directory - # In other words, the VCS executable isn't available - if e.errno == errno.ENOENT: - raise 
BadCommand('Cannot find command %r' % self.name) - else: - raise # re-raise exception if a different error occurred - - @classmethod - def controls_location(cls, location): - """ - Check if a location is controlled by the vcs. - It is meant to be overridden to implement smarter detection - mechanisms for specific vcs. - """ - logger.debug('Checking in %s for %s (%s)...', - location, cls.dirname, cls.name) - path = os.path.join(location, cls.dirname) - return os.path.exists(path) - - -def get_src_requirement(dist, location): - version_control = vcs.get_backend_from_location(location) - if version_control: - try: - return version_control().get_src_requirement(dist, - location) - except BadCommand: - logger.warning( - 'cannot determine version of editable source in %s ' - '(%s command not found in path)', - location, - version_control.name, - ) - return dist.as_requirement() - logger.warning( - 'cannot determine version of editable source in %s (is not SVN ' - 'checkout, Git clone, Mercurial clone or Bazaar branch)', - location, - ) - return dist.as_requirement() diff --git a/lib/python3.4/site-packages/pip/vcs/git.py b/lib/python3.4/site-packages/pip/vcs/git.py deleted file mode 100644 index 2187dd8..0000000 --- a/lib/python3.4/site-packages/pip/vcs/git.py +++ /dev/null @@ -1,300 +0,0 @@ -from __future__ import absolute_import - -import logging -import tempfile -import os.path - -from pip.compat import samefile -from pip.exceptions import BadCommand -from pip._vendor.six.moves.urllib import parse as urllib_parse -from pip._vendor.six.moves.urllib import request as urllib_request -from pip._vendor.packaging.version import parse as parse_version - -from pip.utils import display_path, rmtree -from pip.vcs import vcs, VersionControl - - -urlsplit = urllib_parse.urlsplit -urlunsplit = urllib_parse.urlunsplit - - -logger = logging.getLogger(__name__) - - -class Git(VersionControl): - name = 'git' - dirname = '.git' - repo_name = 'clone' - schemes = ( - 'git', 'git+http', 
'git+https', 'git+ssh', 'git+git', 'git+file', - ) - - def __init__(self, url=None, *args, **kwargs): - - # Works around an apparent Git bug - # (see http://article.gmane.org/gmane.comp.version-control.git/146500) - if url: - scheme, netloc, path, query, fragment = urlsplit(url) - if scheme.endswith('file'): - initial_slashes = path[:-len(path.lstrip('/'))] - newpath = ( - initial_slashes + - urllib_request.url2pathname(path) - .replace('\\', '/').lstrip('/') - ) - url = urlunsplit((scheme, netloc, newpath, query, fragment)) - after_plus = scheme.find('+') + 1 - url = scheme[:after_plus] + urlunsplit( - (scheme[after_plus:], netloc, newpath, query, fragment), - ) - - super(Git, self).__init__(url, *args, **kwargs) - - def get_git_version(self): - VERSION_PFX = 'git version ' - version = self.run_command(['version'], show_stdout=False) - if version.startswith(VERSION_PFX): - version = version[len(VERSION_PFX):] - else: - version = '' - # get first 3 positions of the git version becasue - # on windows it is x.y.z.windows.t, and this parses as - # LegacyVersion which always smaller than a Version. - version = '.'.join(version.split('.')[:3]) - return parse_version(version) - - def export(self, location): - """Export the Git repository at the url to the destination location""" - temp_dir = tempfile.mkdtemp('-export', 'pip-') - self.unpack(temp_dir) - try: - if not location.endswith('/'): - location = location + '/' - self.run_command( - ['checkout-index', '-a', '-f', '--prefix', location], - show_stdout=False, cwd=temp_dir) - finally: - rmtree(temp_dir) - - def check_rev_options(self, rev, dest, rev_options): - """Check the revision options before checkout to compensate that tags - and branches may need origin/ as a prefix. - Returns the SHA1 of the branch or tag if found. 
- """ - revisions = self.get_short_refs(dest) - - origin_rev = 'origin/%s' % rev - if origin_rev in revisions: - # remote branch - return [revisions[origin_rev]] - elif rev in revisions: - # a local tag or branch name - return [revisions[rev]] - else: - logger.warning( - "Could not find a tag or branch '%s', assuming commit.", rev, - ) - return rev_options - - def check_version(self, dest, rev_options): - """ - Compare the current sha to the ref. ref may be a branch or tag name, - but current rev will always point to a sha. This means that a branch - or tag will never compare as True. So this ultimately only matches - against exact shas. - """ - return self.get_revision(dest).startswith(rev_options[0]) - - def switch(self, dest, url, rev_options): - self.run_command(['config', 'remote.origin.url', url], cwd=dest) - self.run_command(['checkout', '-q'] + rev_options, cwd=dest) - - self.update_submodules(dest) - - def update(self, dest, rev_options): - # First fetch changes from the default remote - if self.get_git_version() >= parse_version('1.9.0'): - # fetch tags in addition to everything else - self.run_command(['fetch', '-q', '--tags'], cwd=dest) - else: - self.run_command(['fetch', '-q'], cwd=dest) - # Then reset to wanted revision (maybe even origin/master) - if rev_options: - rev_options = self.check_rev_options( - rev_options[0], dest, rev_options, - ) - self.run_command(['reset', '--hard', '-q'] + rev_options, cwd=dest) - #: update submodules - self.update_submodules(dest) - - def obtain(self, dest): - url, rev = self.get_url_rev() - if rev: - rev_options = [rev] - rev_display = ' (to %s)' % rev - else: - rev_options = ['origin/master'] - rev_display = '' - if self.check_destination(dest, url, rev_options, rev_display): - logger.info( - 'Cloning %s%s to %s', url, rev_display, display_path(dest), - ) - self.run_command(['clone', '-q', url, dest]) - - if rev: - rev_options = self.check_rev_options(rev, dest, rev_options) - # Only do a checkout if rev_options 
differs from HEAD - if not self.check_version(dest, rev_options): - self.run_command( - ['checkout', '-q'] + rev_options, - cwd=dest, - ) - #: repo may contain submodules - self.update_submodules(dest) - - def get_url(self, location): - """Return URL of the first remote encountered.""" - remotes = self.run_command( - ['config', '--get-regexp', 'remote\..*\.url'], - show_stdout=False, cwd=location) - remotes = remotes.splitlines() - found_remote = remotes[0] - for remote in remotes: - if remote.startswith('remote.origin.url '): - found_remote = remote - break - url = found_remote.split(' ')[1] - return url.strip() - - def get_revision(self, location): - current_rev = self.run_command( - ['rev-parse', 'HEAD'], show_stdout=False, cwd=location) - return current_rev.strip() - - def get_full_refs(self, location): - """Yields tuples of (commit, ref) for branches and tags""" - output = self.run_command(['show-ref'], - show_stdout=False, cwd=location) - for line in output.strip().splitlines(): - commit, ref = line.split(' ', 1) - yield commit.strip(), ref.strip() - - def is_ref_remote(self, ref): - return ref.startswith('refs/remotes/') - - def is_ref_branch(self, ref): - return ref.startswith('refs/heads/') - - def is_ref_tag(self, ref): - return ref.startswith('refs/tags/') - - def is_ref_commit(self, ref): - """A ref is a commit sha if it is not anything else""" - return not any(( - self.is_ref_remote(ref), - self.is_ref_branch(ref), - self.is_ref_tag(ref), - )) - - # Should deprecate `get_refs` since it's ambiguous - def get_refs(self, location): - return self.get_short_refs(location) - - def get_short_refs(self, location): - """Return map of named refs (branches or tags) to commit hashes.""" - rv = {} - for commit, ref in self.get_full_refs(location): - ref_name = None - if self.is_ref_remote(ref): - ref_name = ref[len('refs/remotes/'):] - elif self.is_ref_branch(ref): - ref_name = ref[len('refs/heads/'):] - elif self.is_ref_tag(ref): - ref_name = 
ref[len('refs/tags/'):] - if ref_name is not None: - rv[ref_name] = commit - return rv - - def _get_subdirectory(self, location): - """Return the relative path of setup.py to the git repo root.""" - # find the repo root - git_dir = self.run_command(['rev-parse', '--git-dir'], - show_stdout=False, cwd=location).strip() - if not os.path.isabs(git_dir): - git_dir = os.path.join(location, git_dir) - root_dir = os.path.join(git_dir, '..') - # find setup.py - orig_location = location - while not os.path.exists(os.path.join(location, 'setup.py')): - last_location = location - location = os.path.dirname(location) - if location == last_location: - # We've traversed up to the root of the filesystem without - # finding setup.py - logger.warning( - "Could not find setup.py for directory %s (tried all " - "parent directories)", - orig_location, - ) - return None - # relative path of setup.py to repo root - if samefile(root_dir, location): - return None - return os.path.relpath(location, root_dir) - - def get_src_requirement(self, dist, location): - repo = self.get_url(location) - if not repo.lower().startswith('git:'): - repo = 'git+' + repo - egg_project_name = dist.egg_name().split('-', 1)[0] - if not repo: - return None - current_rev = self.get_revision(location) - req = '%s@%s#egg=%s' % (repo, current_rev, egg_project_name) - subdirectory = self._get_subdirectory(location) - if subdirectory: - req += '&subdirectory=' + subdirectory - return req - - def get_url_rev(self): - """ - Prefixes stub URLs like 'user@hostname:user/repo.git' with 'ssh://'. - That's required because although they use SSH they sometimes doesn't - work with a ssh:// scheme (e.g. Github). But we need a scheme for - parsing. Hence we remove it again afterwards and return it as a stub. 
- """ - if '://' not in self.url: - assert 'file:' not in self.url - self.url = self.url.replace('git+', 'git+ssh://') - url, rev = super(Git, self).get_url_rev() - url = url.replace('ssh://', '') - else: - url, rev = super(Git, self).get_url_rev() - - return url, rev - - def update_submodules(self, location): - if not os.path.exists(os.path.join(location, '.gitmodules')): - return - self.run_command( - ['submodule', 'update', '--init', '--recursive', '-q'], - cwd=location, - ) - - @classmethod - def controls_location(cls, location): - if super(Git, cls).controls_location(location): - return True - try: - r = cls().run_command(['rev-parse'], - cwd=location, - show_stdout=False, - on_returncode='ignore') - return not r - except BadCommand: - logger.debug("could not determine if %s is under git control " - "because git is not available", location) - return False - - -vcs.register(Git) diff --git a/lib/python3.4/site-packages/pkg_resources-0.0.0.dist-info/DESCRIPTION.rst b/lib/python3.4/site-packages/pkg_resources-0.0.0.dist-info/DESCRIPTION.rst deleted file mode 100644 index e118723..0000000 --- a/lib/python3.4/site-packages/pkg_resources-0.0.0.dist-info/DESCRIPTION.rst +++ /dev/null @@ -1,3 +0,0 @@ -UNKNOWN - - diff --git a/lib/python3.4/site-packages/pkg_resources-0.0.0.dist-info/INSTALLER b/lib/python3.4/site-packages/pkg_resources-0.0.0.dist-info/INSTALLER deleted file mode 100644 index a1b589e..0000000 --- a/lib/python3.4/site-packages/pkg_resources-0.0.0.dist-info/INSTALLER +++ /dev/null @@ -1 +0,0 @@ -pip diff --git a/lib/python3.4/site-packages/pkg_resources-0.0.0.dist-info/RECORD b/lib/python3.4/site-packages/pkg_resources-0.0.0.dist-info/RECORD deleted file mode 100644 index 6e71c0c..0000000 --- a/lib/python3.4/site-packages/pkg_resources-0.0.0.dist-info/RECORD +++ /dev/null @@ -1,36 +0,0 @@ -pkg_resources/__init__.py,sha256=qasrGUKwGQ8dGJP5SOEhLJoWRizj5HinbD2bXfrOH28,103308 
-pkg_resources/_vendor/__init__.py,sha256=47DEQpj8HBSa-_TImW-5JCeuQeRkm5NMpJWZG3hSuFU,0 -pkg_resources/_vendor/appdirs.py,sha256=tgGaL0m4Jo2VeuGfoOOifLv7a7oUEJu2n1vRkqoPw-0,22374 -pkg_resources/_vendor/pyparsing.py,sha256=PifeLY3-WhIcBVzLtv0U4T_pwDtPruBhBCkg5vLqa28,229867 -pkg_resources/_vendor/six.py,sha256=A6hdJZVjI3t_geebZ9BzUvwRrIXo0lfwzQlM2LcKyas,30098 -pkg_resources/_vendor/packaging/__about__.py,sha256=zkcCPTN_6TcLW0Nrlg0176-R1QQ_WVPTm8sz1R4-HjM,720 -pkg_resources/_vendor/packaging/__init__.py,sha256=_vNac5TrzwsrzbOFIbF-5cHqc_Y2aPT2D7zrIR06BOo,513 -pkg_resources/_vendor/packaging/_compat.py,sha256=Vi_A0rAQeHbU-a9X0tt1yQm9RqkgQbDSxzRw8WlU9kA,860 -pkg_resources/_vendor/packaging/_structures.py,sha256=RImECJ4c_wTlaTYYwZYLHEiebDMaAJmK1oPARhw1T5o,1416 -pkg_resources/_vendor/packaging/markers.py,sha256=uEcBBtGvzqltgnArqb9c4RrcInXezDLos14zbBHhWJo,8248 -pkg_resources/_vendor/packaging/requirements.py,sha256=SikL2UynbsT0qtY9ltqngndha_sfo0w6XGFhAhoSoaQ,4355 -pkg_resources/_vendor/packaging/specifiers.py,sha256=SAMRerzO3fK2IkFZCaZkuwZaL_EGqHNOz4pni4vhnN0,28025 -pkg_resources/_vendor/packaging/utils.py,sha256=3m6WvPm6NNxE8rkTGmn0r75B_GZSGg7ikafxHsBN1WA,421 -pkg_resources/_vendor/packaging/version.py,sha256=OwGnxYfr2ghNzYx59qWIBkrK3SnB6n-Zfd1XaLpnnM0,11556 -pkg_resources/extern/__init__.py,sha256=JUtlHHvlxHSNuB4pWqNjcx7n6kG-fwXg7qmJ2zNJlIY,2487 -pkg_resources-0.0.0.dist-info/DESCRIPTION.rst,sha256=OCTuuN6LcWulhHS3d5rfjdsQtW22n7HENFRh6jC6ego,10 -pkg_resources-0.0.0.dist-info/METADATA,sha256=FOYDX6cmnDUkWo-yhqWQYtjKIMZR2IW2G1GFZhA6gUQ,177 -pkg_resources-0.0.0.dist-info/RECORD,, -pkg_resources-0.0.0.dist-info/WHEEL,sha256=o2k-Qa-RMNIJmUdIc7KU6VWR_ErNRbWNlxDIpl7lm34,110 -pkg_resources-0.0.0.dist-info/metadata.json,sha256=8ZVRFU96pY_wnWouockCkvXw981Y0iDB5nQFFGq8ZiY,221 -pkg_resources-0.0.0.dist-info/INSTALLER,sha256=zuuue4knoyJ-UwPPXg8fezS7VCrXJQrAP7zeNuwvFQg,4 -pkg_resources/_vendor/packaging/__pycache__/version.cpython-34.pyc,, 
-pkg_resources/__pycache__/__init__.cpython-34.pyc,, -pkg_resources/_vendor/__pycache__/appdirs.cpython-34.pyc,, -pkg_resources/_vendor/packaging/__pycache__/_structures.cpython-34.pyc,, -pkg_resources/_vendor/packaging/__pycache__/__init__.cpython-34.pyc,, -pkg_resources/_vendor/packaging/__pycache__/specifiers.cpython-34.pyc,, -pkg_resources/_vendor/__pycache__/__init__.cpython-34.pyc,, -pkg_resources/_vendor/packaging/__pycache__/utils.cpython-34.pyc,, -pkg_resources/_vendor/__pycache__/six.cpython-34.pyc,, -pkg_resources/_vendor/__pycache__/pyparsing.cpython-34.pyc,, -pkg_resources/_vendor/packaging/__pycache__/markers.cpython-34.pyc,, -pkg_resources/_vendor/packaging/__pycache__/__about__.cpython-34.pyc,, -pkg_resources/_vendor/packaging/__pycache__/_compat.cpython-34.pyc,, -pkg_resources/_vendor/packaging/__pycache__/requirements.cpython-34.pyc,, -pkg_resources/extern/__pycache__/__init__.cpython-34.pyc,, diff --git a/lib/python3.4/site-packages/pkg_resources-0.0.0.dist-info/metadata.json b/lib/python3.4/site-packages/pkg_resources-0.0.0.dist-info/metadata.json deleted file mode 100644 index f7d360a..0000000 --- a/lib/python3.4/site-packages/pkg_resources-0.0.0.dist-info/metadata.json +++ /dev/null @@ -1 +0,0 @@ -{"extensions": {"python.details": {"document_names": {"description": "DESCRIPTION.rst"}}}, "generator": "bdist_wheel (0.29.0)", "metadata_version": "2.0", "name": "pkg_resources", "summary": "UNKNOWN", "version": "0.0.0"} \ No newline at end of file diff --git a/lib/python3.4/site-packages/setuptools-36.6.0.dist-info/DESCRIPTION.rst b/lib/python3.4/site-packages/setuptools-36.6.0.dist-info/DESCRIPTION.rst deleted file mode 100644 index 027d10d..0000000 --- a/lib/python3.4/site-packages/setuptools-36.6.0.dist-info/DESCRIPTION.rst +++ /dev/null @@ -1,36 +0,0 @@ -.. image:: https://img.shields.io/pypi/v/setuptools.svg - :target: https://pypi.org/project/setuptools - -.. 
image:: https://readthedocs.org/projects/setuptools/badge/?version=latest - :target: https://setuptools.readthedocs.io - -.. image:: https://img.shields.io/travis/pypa/setuptools/master.svg?label=Linux%20build%20%40%20Travis%20CI - :target: http://travis-ci.org/pypa/setuptools - -.. image:: https://img.shields.io/appveyor/ci/jaraco/setuptools/master.svg?label=Windows%20build%20%40%20Appveyor - :target: https://ci.appveyor.com/project/jaraco/setuptools/branch/master - -.. image:: https://img.shields.io/pypi/pyversions/setuptools.svg - -See the `Installation Instructions -`_ in the Python Packaging -User's Guide for instructions on installing, upgrading, and uninstalling -Setuptools. - -The project is `maintained at GitHub `_. - -Questions and comments should be directed to the `distutils-sig -mailing list `_. -Bug reports and especially tested patches may be -submitted directly to the `bug tracker -`_. - - -Code of Conduct ---------------- - -Everyone interacting in the setuptools project's codebases, issue trackers, -chat rooms, and mailing lists is expected to follow the -`PyPA Code of Conduct `_. 
- - diff --git a/lib/python3.4/site-packages/setuptools-36.6.0.dist-info/INSTALLER b/lib/python3.4/site-packages/setuptools-36.6.0.dist-info/INSTALLER deleted file mode 100644 index a1b589e..0000000 --- a/lib/python3.4/site-packages/setuptools-36.6.0.dist-info/INSTALLER +++ /dev/null @@ -1 +0,0 @@ -pip diff --git a/lib/python3.4/site-packages/setuptools-36.6.0.dist-info/RECORD b/lib/python3.4/site-packages/setuptools-36.6.0.dist-info/RECORD deleted file mode 100644 index 2b060aa..0000000 --- a/lib/python3.4/site-packages/setuptools-36.6.0.dist-info/RECORD +++ /dev/null @@ -1,159 +0,0 @@ -easy_install.py,sha256=MDC9vt5AxDsXX5qcKlBz2TnW6Tpuv_AobnfhCJ9X3PM,126 -pkg_resources/__init__.py,sha256=0q4Rx1CSzw9caT4ewfrQmAAC60NZCjSQU-9vQjP34yo,106202 -pkg_resources/py31compat.py,sha256=-ysVqoxLetAnL94uM0kHkomKQTC1JZLN2ZUjqUhMeKE,600 -pkg_resources/_vendor/__init__.py,sha256=47DEQpj8HBSa-_TImW-5JCeuQeRkm5NMpJWZG3hSuFU,0 -pkg_resources/_vendor/appdirs.py,sha256=tgGaL0m4Jo2VeuGfoOOifLv7a7oUEJu2n1vRkqoPw-0,22374 -pkg_resources/_vendor/pyparsing.py,sha256=PifeLY3-WhIcBVzLtv0U4T_pwDtPruBhBCkg5vLqa28,229867 -pkg_resources/_vendor/six.py,sha256=A6hdJZVjI3t_geebZ9BzUvwRrIXo0lfwzQlM2LcKyas,30098 -pkg_resources/_vendor/packaging/__about__.py,sha256=zkcCPTN_6TcLW0Nrlg0176-R1QQ_WVPTm8sz1R4-HjM,720 -pkg_resources/_vendor/packaging/__init__.py,sha256=_vNac5TrzwsrzbOFIbF-5cHqc_Y2aPT2D7zrIR06BOo,513 -pkg_resources/_vendor/packaging/_compat.py,sha256=Vi_A0rAQeHbU-a9X0tt1yQm9RqkgQbDSxzRw8WlU9kA,860 -pkg_resources/_vendor/packaging/_structures.py,sha256=RImECJ4c_wTlaTYYwZYLHEiebDMaAJmK1oPARhw1T5o,1416 -pkg_resources/_vendor/packaging/markers.py,sha256=uEcBBtGvzqltgnArqb9c4RrcInXezDLos14zbBHhWJo,8248 -pkg_resources/_vendor/packaging/requirements.py,sha256=SikL2UynbsT0qtY9ltqngndha_sfo0w6XGFhAhoSoaQ,4355 -pkg_resources/_vendor/packaging/specifiers.py,sha256=SAMRerzO3fK2IkFZCaZkuwZaL_EGqHNOz4pni4vhnN0,28025 
-pkg_resources/_vendor/packaging/utils.py,sha256=3m6WvPm6NNxE8rkTGmn0r75B_GZSGg7ikafxHsBN1WA,421 -pkg_resources/_vendor/packaging/version.py,sha256=OwGnxYfr2ghNzYx59qWIBkrK3SnB6n-Zfd1XaLpnnM0,11556 -pkg_resources/extern/__init__.py,sha256=JUtlHHvlxHSNuB4pWqNjcx7n6kG-fwXg7qmJ2zNJlIY,2487 -setuptools/__init__.py,sha256=MsRcLyrl8E49pBeFZ-PSwST-I2adqjvkfCC1h9gl0TQ,5037 -setuptools/archive_util.py,sha256=Z58-gbZQ0j92UJy7X7uZevwI28JTVEXd__AjKy4aw78,6613 -setuptools/build_meta.py,sha256=Z8fCFFJooVDcBuSUlVBWgwV41B9raH1sINpOP5-4o2Y,4756 -setuptools/cli-32.exe,sha256=dfEuovMNnA2HLa3jRfMPVi5tk4R7alCbpTvuxtCyw0Y,65536 -setuptools/cli-64.exe,sha256=KLABu5pyrnokJCv6skjXZ6GsXeyYHGcqOUT3oHI3Xpo,74752 -setuptools/cli.exe,sha256=dfEuovMNnA2HLa3jRfMPVi5tk4R7alCbpTvuxtCyw0Y,65536 -setuptools/config.py,sha256=ms8JAS3aHsOun-OO-jyvrQq3txyRE2AwKOiZP1aTan8,16317 -setuptools/dep_util.py,sha256=fgixvC1R7sH3r13ktyf7N0FALoqEXL1cBarmNpSEoWg,935 -setuptools/depends.py,sha256=hC8QIDcM3VDpRXvRVA6OfL9AaQfxvhxHcN_w6sAyNq8,5837 -setuptools/dist.py,sha256=PZjofGBK1ZzA-VpbwuTlxf9XMkvwmGYPSIqUl8FpE2k,40364 -setuptools/extension.py,sha256=uc6nHI-MxwmNCNPbUiBnybSyqhpJqjbhvOQ-emdvt_E,1729 -setuptools/glob.py,sha256=Y-fpv8wdHZzv9DPCaGACpMSBWJ6amq_1e0R_i8_el4w,5207 -setuptools/gui-32.exe,sha256=XBr0bHMA6Hpz2s9s9Bzjl-PwXfa9nH4ie0rFn4V2kWA,65536 -setuptools/gui-64.exe,sha256=aYKMhX1IJLn4ULHgWX0sE0yREUt6B3TEHf_jOw6yNyE,75264 -setuptools/gui.exe,sha256=XBr0bHMA6Hpz2s9s9Bzjl-PwXfa9nH4ie0rFn4V2kWA,65536 -setuptools/launch.py,sha256=sd7ejwhBocCDx_wG9rIs0OaZ8HtmmFU8ZC6IR_S0Lvg,787 -setuptools/lib2to3_ex.py,sha256=t5e12hbR2pi9V4ezWDTB4JM-AISUnGOkmcnYHek3xjg,2013 -setuptools/monkey.py,sha256=s-yH6vfMFxXMrfVInT9_3gnEyAn-TYMHtXVNUOVI4T8,5791 -setuptools/msvc.py,sha256=AEbWNLJ0pTuHJSkQuBZET6wr_d2-yGGPkdHCMdIKWB4,40884 -setuptools/namespaces.py,sha256=F0Nrbv8KCT2OrO7rwa03om4N4GZKAlnce-rr-cgDQa8,3199 -setuptools/package_index.py,sha256=ELInXIlJZqNbeAKAHYZVDLbwOkYZt-o-vyaFK_eS_N0,39970 
-setuptools/py26compat.py,sha256=VRGHC7z2gliR4_uICJsQNodUcNUzybpus3BrJkWbnK4,679 -setuptools/py27compat.py,sha256=3mwxRMDk5Q5O1rSXOERbQDXhFqwDJhhUitfMW_qpUCo,536 -setuptools/py31compat.py,sha256=qGRk3tefux8HbhNzhM0laR3mD8vhAZtffZgzLkBMXJs,1645 -setuptools/py33compat.py,sha256=W8_JFZr8WQbJT_7-JFWjc_6lHGtoMK-4pCrHIwk5JN0,998 -setuptools/py36compat.py,sha256=VUDWxmu5rt4QHlGTRtAFu6W5jvfL6WBjeDAzeoBy0OM,2891 -setuptools/sandbox.py,sha256=hkGRod5_yt3EBHkGnRBf7uK1YceoqFpTT4b__9ZZ5UU,14549 -setuptools/script (dev).tmpl,sha256=f7MR17dTkzaqkCMSVseyOCMVrPVSMdmTQsaB8cZzfuI,201 -setuptools/script.tmpl,sha256=WGTt5piezO27c-Dbx6l5Q4T3Ff20A5z7872hv3aAhYY,138 -setuptools/site-patch.py,sha256=BVt6yIrDMXJoflA5J6DJIcsJUfW_XEeVhOzelTTFDP4,2307 -setuptools/ssl_support.py,sha256=Axo1QtiAtsvuENZq_BvhW5PeWw2nrX39-4qoSiVpB6w,8220 -setuptools/unicode_utils.py,sha256=NOiZ_5hD72A6w-4wVj8awHFM3n51Kmw1Ic_vx15XFqw,996 -setuptools/version.py,sha256=og_cuZQb0QI6ukKZFfZWPlr1HgJBPPn2vO2m_bI9ZTE,144 -setuptools/windows_support.py,sha256=5GrfqSP2-dLGJoZTq2g6dCKkyQxxa2n5IQiXlJCoYEE,714 -setuptools/command/__init__.py,sha256=-X7tSQahlz8sbGu_Xq9bqumFE117jU56E96tDDufNqw,590 -setuptools/command/alias.py,sha256=KjpE0sz_SDIHv3fpZcIQK-sCkJz-SrC6Gmug6b9Nkc8,2426 -setuptools/command/bdist_egg.py,sha256=TGN1XVQb9V8Rf-msDKaIZWmeGQf81HT83oqXJ_3M0gg,17441 -setuptools/command/bdist_rpm.py,sha256=B7l0TnzCGb-0nLlm6rS00jWLkojASwVmdhW2w5Qz_Ak,1508 -setuptools/command/bdist_wininst.py,sha256=_6dz3lpB1tY200LxKPLM7qgwTCceOMgaWFF-jW2-pm0,637 -setuptools/command/build_clib.py,sha256=bQ9aBr-5ZSO-9fGsGsDLz0mnnFteHUZnftVLkhvHDq0,4484 -setuptools/command/build_ext.py,sha256=dO89j-IC0dAjSty1sSZxvi0LSdkPGR_ZPXFuAAFDZj4,13049 -setuptools/command/build_py.py,sha256=yWyYaaS9F3o9JbIczn064A5g1C5_UiKRDxGaTqYbtLE,9596 -setuptools/command/develop.py,sha256=PuVOjmGWGfvHZmOBMj_bdeU087kl0jhnMHqKcDODBDE,8024 -setuptools/command/dist_info.py,sha256=7Ewmog46orGjzME5UA_GQvqewRd1s25aCLxsfHCKqq8,924 
-setuptools/command/easy_install.py,sha256=eruE4R4JfOTx0_0hDYMMElpup33Qkn0P44lclgP8dA0,85973 -setuptools/command/egg_info.py,sha256=HNUt2tQAAp8dULFS_6Qk9vflESI7jdqlCqq-VVQi7AA,25016 -setuptools/command/install.py,sha256=a0EZpL_A866KEdhicTGbuyD_TYl1sykfzdrri-zazT4,4683 -setuptools/command/install_egg_info.py,sha256=bMgeIeRiXzQ4DAGPV1328kcjwQjHjOWU4FngAWLV78Q,2203 -setuptools/command/install_lib.py,sha256=11mxf0Ch12NsuYwS8PHwXBRvyh671QAM4cTRh7epzG0,3840 -setuptools/command/install_scripts.py,sha256=UD0rEZ6861mTYhIdzcsqKnUl8PozocXWl9VBQ1VTWnc,2439 -setuptools/command/launcher manifest.xml,sha256=xlLbjWrB01tKC0-hlVkOKkiSPbzMml2eOPtJ_ucCnbE,628 -setuptools/command/py36compat.py,sha256=SzjZcOxF7zdFUT47Zv2n7AM3H8koDys_0OpS-n9gIfc,4986 -setuptools/command/register.py,sha256=bHlMm1qmBbSdahTOT8w6UhA-EgeQIz7p6cD-qOauaiI,270 -setuptools/command/rotate.py,sha256=co5C1EkI7P0GGT6Tqz-T2SIj2LBJTZXYELpmao6d4KQ,2164 -setuptools/command/saveopts.py,sha256=za7QCBcQimKKriWcoCcbhxPjUz30gSB74zuTL47xpP4,658 -setuptools/command/sdist.py,sha256=VldpcHRSlDrvvK2uV9O6HjQA2OtHCUa4QaMkYCYwTrA,6919 -setuptools/command/setopt.py,sha256=NTWDyx-gjDF-txf4dO577s7LOzHVoKR0Mq33rFxaRr8,5085 -setuptools/command/test.py,sha256=koi5lqjhXHlt0B3egYb98qRVETzKXKhWDD5OQY-AKuA,9044 -setuptools/command/upload.py,sha256=i1gfItZ3nQOn5FKXb8tLC2Kd7eKC8lWO4bdE6NqGpE4,1172 -setuptools/command/upload_docs.py,sha256=oXiGplM_cUKLwE4CWWw98RzCufAu8tBhMC97GegFcms,7311 -setuptools/extern/__init__.py,sha256=ZtCLYQ8JTtOtm7SYoxekZw-UzY3TR50SRIUaeqr2ROk,131 -setuptools-36.6.0.dist-info/DESCRIPTION.rst,sha256=1sSNG6a5L3fSMo1x9uE3jvumlEODgeqBUtSaYp_VVLw,1421 -setuptools-36.6.0.dist-info/METADATA,sha256=GLuJ3zbtJdt_nwgq9UIpUoXOis1Ub4tWeOTKQIZHT1s,2847 -setuptools-36.6.0.dist-info/RECORD,, -setuptools-36.6.0.dist-info/WHEEL,sha256=kdsN-5OJAZIiHN-iO4Rhl82KyS0bDWf4uBwMbkNafr8,110 -setuptools-36.6.0.dist-info/dependency_links.txt,sha256=HlkCFkoK5TbZ5EMLbLKYhLcY_E31kBWD8TqW2EgmatQ,239 
-setuptools-36.6.0.dist-info/entry_points.txt,sha256=jBqCYDlVjl__sjYFGXo1JQGIMAYFJE-prYWUtnMZEew,2990 -setuptools-36.6.0.dist-info/metadata.json,sha256=4yqt7_oaFRn8AA20H0H5W2AByP8z-0HuDpwGyiQH6UU,4916 -setuptools-36.6.0.dist-info/top_level.txt,sha256=2HUXVVwA4Pff1xgTFr3GsTXXKaPaO6vlG6oNJ_4u4Tg,38 -setuptools-36.6.0.dist-info/zip-safe,sha256=AbpHGcgLb-kRsJGnwFEktk7uzpZOCcBY74-YBdrKVGs,1 -../../../bin/easy_install,sha256=tsci1id0sS7h2uWc2NQJYflZoKSI8AR-W02mXYmf7Es,300 -../../../bin/easy_install-3.4,sha256=tsci1id0sS7h2uWc2NQJYflZoKSI8AR-W02mXYmf7Es,300 -setuptools-36.6.0.dist-info/INSTALLER,sha256=zuuue4knoyJ-UwPPXg8fezS7VCrXJQrAP7zeNuwvFQg,4 -setuptools/__pycache__/py31compat.cpython-34.pyc,, -setuptools/command/__pycache__/bdist_egg.cpython-34.pyc,, -setuptools/__pycache__/sandbox.cpython-34.pyc,, -setuptools/__pycache__/__init__.cpython-34.pyc,, -setuptools/extern/__pycache__/__init__.cpython-34.pyc,, -setuptools/__pycache__/site-patch.cpython-34.pyc,, -setuptools/__pycache__/config.cpython-34.pyc,, -setuptools/__pycache__/py26compat.cpython-34.pyc,, -setuptools/command/__pycache__/bdist_rpm.cpython-34.pyc,, -setuptools/command/__pycache__/test.cpython-34.pyc,, -pkg_resources/extern/__pycache__/__init__.cpython-34.pyc,, -setuptools/command/__pycache__/upload_docs.cpython-34.pyc,, -setuptools/__pycache__/ssl_support.cpython-34.pyc,, -setuptools/command/__pycache__/alias.cpython-34.pyc,, -pkg_resources/_vendor/__pycache__/pyparsing.cpython-34.pyc,, -pkg_resources/_vendor/packaging/__pycache__/__about__.cpython-34.pyc,, -setuptools/__pycache__/py33compat.cpython-34.pyc,, -setuptools/command/__pycache__/build_py.cpython-34.pyc,, -pkg_resources/_vendor/packaging/__pycache__/_compat.cpython-34.pyc,, -setuptools/command/__pycache__/install_lib.cpython-34.pyc,, -setuptools/command/__pycache__/dist_info.cpython-34.pyc,, -setuptools/__pycache__/build_meta.cpython-34.pyc,, -setuptools/command/__pycache__/bdist_wininst.cpython-34.pyc,, 
-setuptools/__pycache__/extension.cpython-34.pyc,, -pkg_resources/_vendor/packaging/__pycache__/_structures.cpython-34.pyc,, -pkg_resources/_vendor/packaging/__pycache__/__init__.cpython-34.pyc,, -setuptools/command/__pycache__/install_scripts.cpython-34.pyc,, -setuptools/command/__pycache__/install.cpython-34.pyc,, -setuptools/__pycache__/py27compat.cpython-34.pyc,, -setuptools/__pycache__/py36compat.cpython-34.pyc,, -setuptools/command/__pycache__/sdist.cpython-34.pyc,, -setuptools/__pycache__/package_index.cpython-34.pyc,, -setuptools/__pycache__/msvc.cpython-34.pyc,, -setuptools/__pycache__/archive_util.cpython-34.pyc,, -setuptools/command/__pycache__/egg_info.cpython-34.pyc,, -pkg_resources/_vendor/packaging/__pycache__/requirements.cpython-34.pyc,, -setuptools/__pycache__/lib2to3_ex.cpython-34.pyc,, -setuptools/command/__pycache__/install_egg_info.cpython-34.pyc,, -setuptools/command/__pycache__/upload.cpython-34.pyc,, -setuptools/command/__pycache__/build_ext.cpython-34.pyc,, -pkg_resources/__pycache__/__init__.cpython-34.pyc,, -pkg_resources/_vendor/__pycache__/appdirs.cpython-34.pyc,, -setuptools/__pycache__/namespaces.cpython-34.pyc,, -setuptools/__pycache__/monkey.cpython-34.pyc,, -setuptools/command/__pycache__/build_clib.cpython-34.pyc,, -pkg_resources/_vendor/__pycache__/six.cpython-34.pyc,, -pkg_resources/__pycache__/py31compat.cpython-34.pyc,, -pkg_resources/_vendor/packaging/__pycache__/specifiers.cpython-34.pyc,, -setuptools/__pycache__/dist.cpython-34.pyc,, -setuptools/__pycache__/depends.cpython-34.pyc,, -__pycache__/easy_install.cpython-34.pyc,, -setuptools/__pycache__/dep_util.cpython-34.pyc,, -setuptools/command/__pycache__/setopt.cpython-34.pyc,, -setuptools/__pycache__/version.cpython-34.pyc,, -setuptools/__pycache__/windows_support.cpython-34.pyc,, -pkg_resources/_vendor/packaging/__pycache__/utils.cpython-34.pyc,, -setuptools/__pycache__/glob.cpython-34.pyc,, -setuptools/command/__pycache__/develop.cpython-34.pyc,, 
-pkg_resources/_vendor/packaging/__pycache__/markers.cpython-34.pyc,, -setuptools/__pycache__/launch.cpython-34.pyc,, -setuptools/command/__pycache__/rotate.cpython-34.pyc,, -pkg_resources/_vendor/packaging/__pycache__/version.cpython-34.pyc,, -setuptools/command/__pycache__/py36compat.cpython-34.pyc,, -setuptools/command/__pycache__/__init__.cpython-34.pyc,, -setuptools/command/__pycache__/register.cpython-34.pyc,, -setuptools/__pycache__/unicode_utils.cpython-34.pyc,, -pkg_resources/_vendor/__pycache__/__init__.cpython-34.pyc,, -setuptools/command/__pycache__/easy_install.cpython-34.pyc,, -setuptools/command/__pycache__/saveopts.cpython-34.pyc,, diff --git a/lib/python3.4/site-packages/setuptools-36.6.0.dist-info/metadata.json b/lib/python3.4/site-packages/setuptools-36.6.0.dist-info/metadata.json deleted file mode 100644 index cb17d38..0000000 --- a/lib/python3.4/site-packages/setuptools-36.6.0.dist-info/metadata.json +++ /dev/null @@ -1 +0,0 @@ -{"classifiers": ["Development Status :: 5 - Production/Stable", "Intended Audience :: Developers", "License :: OSI Approved :: MIT License", "Operating System :: OS Independent", "Programming Language :: Python :: 2", "Programming Language :: Python :: 2.6", "Programming Language :: Python :: 2.7", "Programming Language :: Python :: 3", "Programming Language :: Python :: 3.3", "Programming Language :: Python :: 3.4", "Programming Language :: Python :: 3.5", "Programming Language :: Python :: 3.6", "Topic :: Software Development :: Libraries :: Python Modules", "Topic :: System :: Archiving :: Packaging", "Topic :: System :: Systems Administration", "Topic :: Utilities"], "description_content_type": "text/x-rst; charset=UTF-8", "extensions": {"python.commands": {"wrap_console": {"easy_install": "setuptools.command.easy_install:main", "easy_install-3.6": "setuptools.command.easy_install:main"}}, "python.details": {"contacts": [{"email": "distutils-sig@python.org", "name": "Python Packaging Authority", "role": "author"}], 
"document_names": {"description": "DESCRIPTION.rst"}, "project_urls": {"Home": "https://github.com/pypa/setuptools"}}, "python.exports": {"console_scripts": {"easy_install": "setuptools.command.easy_install:main", "easy_install-3.6": "setuptools.command.easy_install:main"}, "distutils.commands": {"alias": "setuptools.command.alias:alias", "bdist_egg": "setuptools.command.bdist_egg:bdist_egg", "bdist_rpm": "setuptools.command.bdist_rpm:bdist_rpm", "bdist_wininst": "setuptools.command.bdist_wininst:bdist_wininst", "build_clib": "setuptools.command.build_clib:build_clib", "build_ext": "setuptools.command.build_ext:build_ext", "build_py": "setuptools.command.build_py:build_py", "develop": "setuptools.command.develop:develop", "dist_info": "setuptools.command.dist_info:dist_info", "easy_install": "setuptools.command.easy_install:easy_install", "egg_info": "setuptools.command.egg_info:egg_info", "install": "setuptools.command.install:install", "install_egg_info": "setuptools.command.install_egg_info:install_egg_info", "install_lib": "setuptools.command.install_lib:install_lib", "install_scripts": "setuptools.command.install_scripts:install_scripts", "register": "setuptools.command.register:register", "rotate": "setuptools.command.rotate:rotate", "saveopts": "setuptools.command.saveopts:saveopts", "sdist": "setuptools.command.sdist:sdist", "setopt": "setuptools.command.setopt:setopt", "test": "setuptools.command.test:test", "upload": "setuptools.command.upload:upload", "upload_docs": "setuptools.command.upload_docs:upload_docs"}, "distutils.setup_keywords": {"convert_2to3_doctests": "setuptools.dist:assert_string_list", "dependency_links": "setuptools.dist:assert_string_list", "eager_resources": "setuptools.dist:assert_string_list", "entry_points": "setuptools.dist:check_entry_points", "exclude_package_data": "setuptools.dist:check_package_data", "extras_require": "setuptools.dist:check_extras", "include_package_data": "setuptools.dist:assert_bool", "install_requires": 
"setuptools.dist:check_requirements", "namespace_packages": "setuptools.dist:check_nsp", "package_data": "setuptools.dist:check_package_data", "packages": "setuptools.dist:check_packages", "python_requires": "setuptools.dist:check_specifier", "setup_requires": "setuptools.dist:check_requirements", "test_loader": "setuptools.dist:check_importable", "test_runner": "setuptools.dist:check_importable", "test_suite": "setuptools.dist:check_test_suite", "tests_require": "setuptools.dist:check_requirements", "use_2to3": "setuptools.dist:assert_bool", "use_2to3_exclude_fixers": "setuptools.dist:assert_string_list", "use_2to3_fixers": "setuptools.dist:assert_string_list", "zip_safe": "setuptools.dist:assert_bool"}, "egg_info.writers": {"PKG-INFO": "setuptools.command.egg_info:write_pkg_info", "dependency_links.txt": "setuptools.command.egg_info:overwrite_arg", "depends.txt": "setuptools.command.egg_info:warn_depends_obsolete", "eager_resources.txt": "setuptools.command.egg_info:overwrite_arg", "entry_points.txt": "setuptools.command.egg_info:write_entries", "namespace_packages.txt": "setuptools.command.egg_info:overwrite_arg", "requires.txt": "setuptools.command.egg_info:write_requirements", "top_level.txt": "setuptools.command.egg_info:write_toplevel_names"}, "setuptools.installation": {"eggsecutable": "setuptools.command.easy_install:bootstrap"}}}, "extras": ["certs", "ssl"], "generator": "bdist_wheel (0.30.0)", "keywords": ["CPAN", "PyPI", "distutils", "eggs", "package", "management"], "metadata_version": "2.0", "name": "setuptools", "requires_python": ">=2.6,!=3.0.*,!=3.1.*,!=3.2.*", "run_requires": [{"extra": "certs", "requires": ["certifi (==2016.9.26)"]}, {"environment": "sys_platform=='win32'", "extra": "ssl", "requires": ["wincertstore (==0.2)"]}], "summary": "Easily download, build, install, upgrade, and uninstall Python packages", "version": "36.6.0"} \ No newline at end of file diff --git a/lib/python3.4/site-packages/setuptools/command/register.py 
b/lib/python3.4/site-packages/setuptools/command/register.py deleted file mode 100644 index 8d6336a..0000000 --- a/lib/python3.4/site-packages/setuptools/command/register.py +++ /dev/null @@ -1,10 +0,0 @@ -import distutils.command.register as orig - - -class register(orig.register): - __doc__ = orig.register.__doc__ - - def run(self): - # Make sure that we are using valid current name/version info - self.run_command('egg_info') - orig.register.run(self) diff --git a/lib/python3.4/site-packages/setuptools/extern/__init__.py b/lib/python3.4/site-packages/setuptools/extern/__init__.py deleted file mode 100644 index 2cd08b7..0000000 --- a/lib/python3.4/site-packages/setuptools/extern/__init__.py +++ /dev/null @@ -1,4 +0,0 @@ -from pkg_resources.extern import VendorImporter - -names = 'six', -VendorImporter(__name__, names, 'pkg_resources._vendor').install() diff --git a/lib/python3.4/site-packages/setuptools/py26compat.py b/lib/python3.4/site-packages/setuptools/py26compat.py deleted file mode 100644 index 4d3add8..0000000 --- a/lib/python3.4/site-packages/setuptools/py26compat.py +++ /dev/null @@ -1,31 +0,0 @@ -""" -Compatibility Support for Python 2.6 and earlier -""" - -import sys - -try: - from urllib.parse import splittag -except ImportError: - from urllib import splittag - - -def strip_fragment(url): - """ - In Python issue 8280, Python 2.7 and - later was patched to disregard the fragment when making URL requests. - Do the same for Python 2.6 and earlier.
- """ - url, fragment = splittag(url) - return url - - -if sys.version_info >= (2, 7): - strip_fragment = lambda x: x - -try: - from importlib import import_module -except ImportError: - - def import_module(module_name): - return __import__(module_name, fromlist=['__name__']) diff --git a/lib/python3.4/site-packages/setuptools/py31compat.py b/lib/python3.4/site-packages/setuptools/py31compat.py deleted file mode 100644 index 44b025d..0000000 --- a/lib/python3.4/site-packages/setuptools/py31compat.py +++ /dev/null @@ -1,56 +0,0 @@ -import sys -import unittest - -__all__ = ['get_config_vars', 'get_path'] - -try: - # Python 2.7 or >=3.2 - from sysconfig import get_config_vars, get_path -except ImportError: - from distutils.sysconfig import get_config_vars, get_python_lib - - def get_path(name): - if name not in ('platlib', 'purelib'): - raise ValueError("Name must be purelib or platlib") - return get_python_lib(name == 'platlib') - - -try: - # Python >=3.2 - from tempfile import TemporaryDirectory -except ImportError: - import shutil - import tempfile - - class TemporaryDirectory(object): - """ - Very simple temporary directory context manager. - Will try to delete afterward, but will also ignore OS and similar - errors on deletion. 
- """ - - def __init__(self): - self.name = None # Handle mkdtemp raising an exception - self.name = tempfile.mkdtemp() - - def __enter__(self): - return self.name - - def __exit__(self, exctype, excvalue, exctrace): - try: - shutil.rmtree(self.name, True) - except OSError: # removal errors are not the only possible - pass - self.name = None - - -unittest_main = unittest.main - -_PY31 = (3, 1) <= sys.version_info[:2] < (3, 2) -if _PY31: - # on Python 3.1, translate testRunner==None to TextTestRunner - # for compatibility with Python 2.6, 2.7, and 3.2+ - def unittest_main(*args, **kwargs): - if 'testRunner' in kwargs and kwargs['testRunner'] is None: - kwargs['testRunner'] = unittest.TextTestRunner - return unittest.main(*args, **kwargs) diff --git a/lib/python3.4/site-packages/six-1.11.0.dist-info/DESCRIPTION.rst b/lib/python3.4/site-packages/six-1.11.0.dist-info/DESCRIPTION.rst deleted file mode 100644 index 09c2c99..0000000 --- a/lib/python3.4/site-packages/six-1.11.0.dist-info/DESCRIPTION.rst +++ /dev/null @@ -1,27 +0,0 @@ -.. image:: http://img.shields.io/pypi/v/six.svg - :target: https://pypi.python.org/pypi/six - -.. image:: https://travis-ci.org/benjaminp/six.svg?branch=master - :target: https://travis-ci.org/benjaminp/six - -.. image:: http://img.shields.io/badge/license-MIT-green.svg - :target: https://github.com/benjaminp/six/blob/master/LICENSE - -Six is a Python 2 and 3 compatibility library. It provides utility functions -for smoothing over the differences between the Python versions with the goal of -writing Python code that is compatible on both Python versions. See the -documentation for more information on what is provided. - -Six supports every Python version since 2.6. It is contained in only one Python -file, so it can be easily copied into your project. (The copyright and license -notice must be retained.) - -Online documentation is at http://six.rtfd.org. - -Bugs can be reported to https://github.com/benjaminp/six. 
The code can also -be found there. - -For questions about six or porting in general, email the python-porting mailing -list: https://mail.python.org/mailman/listinfo/python-porting - - diff --git a/lib/python3.4/site-packages/six-1.11.0.dist-info/INSTALLER b/lib/python3.4/site-packages/six-1.11.0.dist-info/INSTALLER deleted file mode 100644 index a1b589e..0000000 --- a/lib/python3.4/site-packages/six-1.11.0.dist-info/INSTALLER +++ /dev/null @@ -1 +0,0 @@ -pip diff --git a/lib/python3.4/site-packages/six-1.11.0.dist-info/RECORD b/lib/python3.4/site-packages/six-1.11.0.dist-info/RECORD deleted file mode 100644 index 99350b6..0000000 --- a/lib/python3.4/site-packages/six-1.11.0.dist-info/RECORD +++ /dev/null @@ -1,9 +0,0 @@ -six.py,sha256=A08MPb-Gi9FfInI3IW7HimXFmEH2T2IPzHgDvdhZPRA,30888 -six-1.11.0.dist-info/DESCRIPTION.rst,sha256=gPBoq1Ruc1QDWyLeXPlieL3F-XZz1_WXB-5gctCfg-A,1098 -six-1.11.0.dist-info/METADATA,sha256=06nZXaDYN3vnC-pmUjhkECYFH_a--ywvcPIpUdNeH1o,1607 -six-1.11.0.dist-info/RECORD,, -six-1.11.0.dist-info/WHEEL,sha256=o2k-Qa-RMNIJmUdIc7KU6VWR_ErNRbWNlxDIpl7lm34,110 -six-1.11.0.dist-info/metadata.json,sha256=ac3f4f7MpSHSnZ1SqhHCwsL7FGWMG0gBEb0hhS2eSSM,703 -six-1.11.0.dist-info/top_level.txt,sha256=_iVH_iYEtEXnD8nYGQYpYFUvkUW9sEO1GYbkeKSAais,4 -six-1.11.0.dist-info/INSTALLER,sha256=zuuue4knoyJ-UwPPXg8fezS7VCrXJQrAP7zeNuwvFQg,4 -__pycache__/six.cpython-34.pyc,, diff --git a/lib/python3.4/site-packages/six-1.11.0.dist-info/metadata.json b/lib/python3.4/site-packages/six-1.11.0.dist-info/metadata.json deleted file mode 100644 index 2c7fcea..0000000 --- a/lib/python3.4/site-packages/six-1.11.0.dist-info/metadata.json +++ /dev/null @@ -1 +0,0 @@ -{"classifiers": ["Programming Language :: Python :: 2", "Programming Language :: Python :: 3", "Intended Audience :: Developers", "License :: OSI Approved :: MIT License", "Topic :: Software Development :: Libraries", "Topic :: Utilities"], "extensions": {"python.details": {"contacts": [{"email": "benjamin@python.org", 
"name": "Benjamin Peterson", "role": "author"}], "document_names": {"description": "DESCRIPTION.rst"}, "project_urls": {"Home": "http://pypi.python.org/pypi/six/"}}}, "generator": "bdist_wheel (0.29.0)", "license": "MIT", "metadata_version": "2.0", "name": "six", "summary": "Python 2 and 3 compatibility utilities", "test_requires": [{"requires": ["pytest"]}], "version": "1.11.0"} \ No newline at end of file diff --git a/lib/python3.4/site-packages/sqlalchemy/cprocessors.cpython-34m.so b/lib/python3.4/site-packages/sqlalchemy/cprocessors.cpython-34m.so deleted file mode 100755 index 7b1c78e..0000000 Binary files a/lib/python3.4/site-packages/sqlalchemy/cprocessors.cpython-34m.so and /dev/null differ diff --git a/lib/python3.4/site-packages/sqlalchemy/cprocessors.cpython-35m-x86_64-linux-gnu.so b/lib/python3.4/site-packages/sqlalchemy/cprocessors.cpython-35m-x86_64-linux-gnu.so deleted file mode 100755 index 8d5baf4..0000000 Binary files a/lib/python3.4/site-packages/sqlalchemy/cprocessors.cpython-35m-x86_64-linux-gnu.so and /dev/null differ diff --git a/lib/python3.4/site-packages/sqlalchemy/cprocessors.cpython-36m-x86_64-linux-gnu.so b/lib/python3.4/site-packages/sqlalchemy/cprocessors.cpython-36m-x86_64-linux-gnu.so deleted file mode 100755 index 19f3794..0000000 Binary files a/lib/python3.4/site-packages/sqlalchemy/cprocessors.cpython-36m-x86_64-linux-gnu.so and /dev/null differ diff --git a/lib/python3.4/site-packages/sqlalchemy/cresultproxy.cpython-34m.so b/lib/python3.4/site-packages/sqlalchemy/cresultproxy.cpython-34m.so deleted file mode 100755 index 27c0c4f..0000000 Binary files a/lib/python3.4/site-packages/sqlalchemy/cresultproxy.cpython-34m.so and /dev/null differ diff --git a/lib/python3.4/site-packages/sqlalchemy/cresultproxy.cpython-35m-x86_64-linux-gnu.so b/lib/python3.4/site-packages/sqlalchemy/cresultproxy.cpython-35m-x86_64-linux-gnu.so deleted file mode 100755 index 526596f..0000000 Binary files 
a/lib/python3.4/site-packages/sqlalchemy/cresultproxy.cpython-35m-x86_64-linux-gnu.so and /dev/null differ diff --git a/lib/python3.4/site-packages/sqlalchemy/cresultproxy.cpython-36m-x86_64-linux-gnu.so b/lib/python3.4/site-packages/sqlalchemy/cresultproxy.cpython-36m-x86_64-linux-gnu.so deleted file mode 100755 index 20aacaa..0000000 Binary files a/lib/python3.4/site-packages/sqlalchemy/cresultproxy.cpython-36m-x86_64-linux-gnu.so and /dev/null differ diff --git a/lib/python3.4/site-packages/sqlalchemy/cutils.cpython-34m.so b/lib/python3.4/site-packages/sqlalchemy/cutils.cpython-34m.so deleted file mode 100755 index b5b447d..0000000 Binary files a/lib/python3.4/site-packages/sqlalchemy/cutils.cpython-34m.so and /dev/null differ diff --git a/lib/python3.4/site-packages/sqlalchemy/cutils.cpython-35m-x86_64-linux-gnu.so b/lib/python3.4/site-packages/sqlalchemy/cutils.cpython-35m-x86_64-linux-gnu.so deleted file mode 100755 index 15aaff6..0000000 Binary files a/lib/python3.4/site-packages/sqlalchemy/cutils.cpython-35m-x86_64-linux-gnu.so and /dev/null differ diff --git a/lib/python3.4/site-packages/sqlalchemy/cutils.cpython-36m-x86_64-linux-gnu.so b/lib/python3.4/site-packages/sqlalchemy/cutils.cpython-36m-x86_64-linux-gnu.so deleted file mode 100755 index 8cfcf0d..0000000 Binary files a/lib/python3.4/site-packages/sqlalchemy/cutils.cpython-36m-x86_64-linux-gnu.so and /dev/null differ diff --git a/lib/python3.4/site-packages/wheel-0.30.0.dist-info/DESCRIPTION.rst b/lib/python3.4/site-packages/wheel-0.30.0.dist-info/DESCRIPTION.rst deleted file mode 100644 index 9f37cad..0000000 --- a/lib/python3.4/site-packages/wheel-0.30.0.dist-info/DESCRIPTION.rst +++ /dev/null @@ -1,340 +0,0 @@ -Wheel -===== - -A built-package format for Python. - -A wheel is a ZIP-format archive with a specially formatted filename -and the .whl extension. It is designed to contain all the files for a -PEP 376 compatible install in a way that is very close to the on-disk -format. 
Many packages will be properly installed with only the "Unpack" -step (simply extracting the file onto sys.path), and the unpacked archive -preserves enough information to "Spread" (copy data and scripts to their -final locations) at any later time. - -The wheel project provides a `bdist_wheel` command for setuptools -(requires setuptools >= 0.8.0). Wheel files can be installed with a -newer `pip` from https://github.com/pypa/pip or with wheel's own command -line utility. - -The wheel documentation is at http://wheel.rtfd.org/. The file format -is documented in PEP 427 (http://www.python.org/dev/peps/pep-0427/). - -The reference implementation is at https://github.com/pypa/wheel - -Why not egg? ------------- - -Python's egg format predates the packaging related standards we have -today, the most important being PEP 376 "Database of Installed Python -Distributions" which specifies the .dist-info directory (instead of -.egg-info) and PEP 426 "Metadata for Python Software Packages 2.0" -which specifies how to express dependencies (instead of requires.txt -in .egg-info). - -Wheel implements these things. It also provides a richer file naming -convention that communicates the Python implementation and ABI as well -as simply the language version used in a particular package. - -Unlike .egg, wheel will be a fully-documented standard at the binary -level that is truly easy to install even if you do not want to use the -reference implementation. - - -Code of Conduct ---------------- - -Everyone interacting in the wheel project's codebases, issue trackers, chat -rooms, and mailing lists is expected to follow the `PyPA Code of Conduct`_. - -.. _PyPA Code of Conduct: https://www.pypa.io/en/latest/code-of-conduct/ - - -0.30.0 -====== -- Added py-limited-api {cp32|cp33|cp34|...} flag to produce cpNN.abi3.{arch} - tags on CPython 3. -- Documented the ``license_file`` metadata key -- Improved Python, abi tagging for `wheel convert`. Thanks Ales Erjavec. 
-- Fixed `>` being prepended to lines starting with "From" in the long description -- Added support for specifying a build number (as per PEP 427). - Thanks Ian Cordasco. -- Made the order of files in generated ZIP files deterministic. - Thanks Matthias Bach. -- Made the order of requirements in metadata deterministic. Thanks Chris Lamb. -- Fixed `wheel install` clobbering existing files -- Improved the error message when trying to verify an unsigned wheel file -- Removed support for Python 2.6, 3.2 and 3.3. - -0.29.0 -====== -- Fix compression type of files in archive (Issue #155, Pull Request #62, - thanks Xavier Fernandez) - -0.28.0 -====== -- Fix file modes in archive (Issue #154) - -0.27.0 -====== -- Support forcing a platform tag using `--plat-name` on pure-Python wheels, as - well as nonstandard platform tags on non-pure wheels (Pull Request #60, Issue - #144, thanks Andrés Díaz) -- Add SOABI tags to platform-specific wheels built for Python 2.X (Pull Request - #55, Issue #63, Issue #101) -- Support reproducible wheel files, wheels that can be rebuilt and will hash to - the same values as previous builds (Pull Request #52, Issue #143, thanks - Barry Warsaw) -- Support for changes in keyring >= 8.0 (Pull Request #61, thanks Jason R. 
- Coombs) -- Use the file context manager when checking if dependency_links.txt is empty, - fixes problems building wheels under PyPy on Windows (Issue #150, thanks - Cosimo Lupo) -- Don't attempt to (recursively) create a build directory ending with `..` - (invalid on all platforms, but code was only executed on Windows) (Issue #91) -- Added the PyPA Code of Conduct (Pull Request #56) - -0.26.0 -====== -- Fix multiple entrypoint comparison failure on Python 3 (Issue #148) - -0.25.0 -====== -- Add Python 3.5 to tox configuration -- Deterministic (sorted) metadata -- Fix tagging for Python 3.5 compatibility -- Support py2-none-'arch' and py3-none-'arch' tags -- Treat data-only wheels as pure -- Write to temporary file and rename when using wheel install --force - -0.24.0 -====== -- The python tag used for pure-python packages is now .pyN (major version - only). This change actually occurred in 0.23.0 when the --python-tag - option was added, but was not explicitly mentioned in the changelog then. -- wininst2wheel and egg2wheel removed. Use "wheel convert [archive]" - instead. -- Wheel now supports setuptools style conditional requirements via the - extras_require={} syntax. Separate 'extra' names from conditions using - the : character. Wheel's own setup.py does this. (The empty-string - extra is the same as install_requires.) These conditional requirements - should work the same whether the package is installed by wheel or - by setup.py. - -0.23.0 -====== -- Compatibility tag flags added to the bdist_wheel command -- sdist should include files necessary for tests -- 'wheel convert' can now also convert unpacked eggs to wheel -- Rename pydist.json to metadata.json to avoid stepping on the PEP -- The --skip-scripts option has been removed, and not generating scripts is now - the default. The option was a temporary approach until installers could - generate scripts themselves. That is now the case with pip 1.5 and later. 
- Note that using pip 1.4 to install a wheel without scripts will leave the - installation without entry-point wrappers. The "wheel install-scripts" - command can be used to generate the scripts in such cases. -- Thank you contributors - -0.22.0 -====== -- Include entry_points.txt, scripts a.k.a. commands, in experimental - pydist.json -- Improved test_requires parsing -- Python 2.6 fixes, "wheel version" command courtesy pombredanne - -0.21.0 -====== -- Pregenerated scripts are the default again. -- "setup.py bdist_wheel --skip-scripts" turns them off. -- setuptools is no longer a listed requirement for the 'wheel' - package. It is of course still required in order for bdist_wheel - to work. -- "python -m wheel" avoids importing pkg_resources until it's necessary. - -0.20.0 -====== -- No longer include console_scripts in wheels. Ordinary scripts (shell files, - standalone Python files) are included as usual. -- Include new command "python -m wheel install-scripts [distribution - [distribution ...]]" to install the console_scripts (setuptools-style - scripts using pkg_resources) for a distribution. - -0.19.0 -====== -- pymeta.json becomes pydist.json - -0.18.0 -====== -- Python 3 Unicode improvements - -0.17.0 -====== -- Support latest PEP-426 "pymeta.json" (json-format metadata) - -0.16.0 -====== -- Python 2.6 compatibility bugfix (thanks John McFarlane) -- Non-prerelease version number - -1.0.0a2 -======= -- Bugfix for C-extension tags for CPython 3.3 (using SOABI) - -1.0.0a1 -======= -- Bugfix for bdist_wininst converter "wheel convert" -- Bugfix for dists where "is pure" is None instead of True or False - -1.0.0a0 -======= -- Update for version 1.0 of Wheel (PEP accepted). -- Python 3 fix for moving Unicode Description to metadata body -- Include rudimentary API documentation in Sphinx (thanks Kevin Horn) - -0.15.0 -====== -- Various improvements - -0.14.0 -====== -- Changed the signature format to better comply with the current JWS spec. 
- Breaks all existing signatures. -- Include ``wheel unsign`` command to remove RECORD.jws from an archive. -- Put the description in the newly allowed payload section of PKG-INFO - (METADATA) files. - -0.13.0 -====== -- Use distutils instead of sysconfig to get installation paths; can install - headers. -- Improve WheelFile() sort. -- Allow bootstrap installs without any pkg_resources. - -0.12.0 -====== -- Unit test for wheel.tool.install - -0.11.0 -====== -- API cleanup - -0.10.3 -====== -- Scripts fixer fix - -0.10.2 -====== -- Fix keygen - -0.10.1 -====== -- Preserve attributes on install. - -0.10.0 -====== -- Include a copy of pkg_resources. Wheel can now install into a virtualenv - that does not have distribute (though most packages still require - pkg_resources to actually work; wheel install distribute) -- Define a new setup.cfg section [wheel]. universal=1 will - apply the py2.py3-none-any tag for pure python wheels. - -0.9.7 -===== -- Only import dirspec when needed. dirspec is only needed to find the - configuration for keygen/signing operations. - -0.9.6 -===== -- requires-dist from setup.cfg overwrites any requirements from setup.py - Care must be taken that the requirements are the same in both cases, - or just always install from wheel. -- drop dirspec requirement on win32 -- improved command line utility, adds 'wheel convert [egg or wininst]' to - convert legacy binary formats to wheel - -0.9.5 -===== -- Wheel's own wheel file can be executed by Python, and can install itself: - ``python wheel-0.9.5-py27-none-any/wheel install ...`` -- Use argparse; basic ``wheel install`` command should run with only stdlib - dependencies. -- Allow requires_dist in setup.cfg's [metadata] section. In addition to - dependencies in setup.py, but will only be interpreted when installing - from wheel, not from sdist. Can be qualified with environment markers. 
- -0.9.4 -===== -- Fix wheel.signatures in sdist - -0.9.3 -===== -- Integrated digital signatures support without C extensions. -- Integrated "wheel install" command (single package, no dependency - resolution) including compatibility check. -- Support Python 3.3 -- Use Metadata 1.3 (PEP 426) - -0.9.2 -===== -- Automatic signing if WHEEL_TOOL points to the wheel binary -- Even more Python 3 fixes - -0.9.1 -===== -- 'wheel sign' uses the keys generated by 'wheel keygen' (instead of generating - a new key at random each time) -- Python 2/3 encoding/decoding fixes -- Run tests on Python 2.6 (without signature verification) - -0.9 -=== -- Updated digital signatures scheme -- Python 3 support for digital signatures -- Always verify RECORD hashes on extract -- "wheel" command line tool to sign, verify, unpack wheel files - -0.8 -=== -- none/any draft pep tags update -- improved wininst2wheel script -- doc changes and other improvements - -0.7 -=== -- sort .dist-info at end of wheel archive -- Windows & Python 3 fixes from Paul Moore -- pep8 -- scripts to convert wininst & egg to wheel - -0.6 -=== -- require distribute >= 0.6.28 -- stop using verlib - -0.5 -=== -- working pretty well - -0.4.2 -===== -- hyphenated name fix - -0.4 -=== -- improve test coverage -- improve Windows compatibility -- include tox.ini courtesy of Marc Abramowitz -- draft hmac sha-256 signing function - -0.3 -=== -- prototype egg2wheel conversion script - -0.2 -=== -- Python 3 compatibility - -0.1 -=== -- Initial version - - diff --git a/lib/python3.4/site-packages/wheel-0.30.0.dist-info/INSTALLER b/lib/python3.4/site-packages/wheel-0.30.0.dist-info/INSTALLER deleted file mode 100644 index a1b589e..0000000 --- a/lib/python3.4/site-packages/wheel-0.30.0.dist-info/INSTALLER +++ /dev/null @@ -1 +0,0 @@ -pip diff --git a/lib/python3.4/site-packages/wheel-0.30.0.dist-info/LICENSE.txt b/lib/python3.4/site-packages/wheel-0.30.0.dist-info/LICENSE.txt deleted file mode 100644 index c3441e6..0000000 --- 
a/lib/python3.4/site-packages/wheel-0.30.0.dist-info/LICENSE.txt +++ /dev/null @@ -1,22 +0,0 @@ -"wheel" copyright (c) 2012-2014 Daniel Holth and -contributors. - -The MIT License - -Permission is hereby granted, free of charge, to any person obtaining a -copy of this software and associated documentation files (the "Software"), -to deal in the Software without restriction, including without limitation -the rights to use, copy, modify, merge, publish, distribute, sublicense, -and/or sell copies of the Software, and to permit persons to whom the -Software is furnished to do so, subject to the following conditions: - -The above copyright notice and this permission notice shall be included -in all copies or substantial portions of the Software. - -THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR -IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, -FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL -THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR -OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, -ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR -OTHER DEALINGS IN THE SOFTWARE. diff --git a/lib/python3.4/site-packages/wheel-0.30.0.dist-info/METADATA b/lib/python3.4/site-packages/wheel-0.30.0.dist-info/METADATA deleted file mode 100644 index 52b6411..0000000 --- a/lib/python3.4/site-packages/wheel-0.30.0.dist-info/METADATA +++ /dev/null @@ -1,374 +0,0 @@ -Metadata-Version: 2.0 -Name: wheel -Version: 0.30.0 -Summary: A built-package format for Python. 
-Home-page: https://github.com/pypa/wheel -Author: Alex Grönholm -Author-email: alex.gronholm@nextday.fi -License: MIT -Description-Content-Type: UNKNOWN -Keywords: wheel,packaging -Platform: UNKNOWN -Classifier: Development Status :: 5 - Production/Stable -Classifier: Intended Audience :: Developers -Classifier: License :: OSI Approved :: MIT License -Classifier: Programming Language :: Python -Classifier: Programming Language :: Python :: 2 -Classifier: Programming Language :: Python :: 2.7 -Classifier: Programming Language :: Python :: 3 -Classifier: Programming Language :: Python :: 3.4 -Classifier: Programming Language :: Python :: 3.5 -Classifier: Programming Language :: Python :: 3.6 -Provides-Extra: faster-signatures -Requires-Dist: ed25519ll; extra == 'faster-signatures' -Provides-Extra: signatures -Requires-Dist: keyring; extra == 'signatures' -Requires-Dist: keyrings.alt; extra == 'signatures' -Provides-Extra: signatures -Requires-Dist: pyxdg; sys_platform!="win32" and extra == 'signatures' -Provides-Extra: test -Requires-Dist: jsonschema; extra == 'test' -Requires-Dist: pytest (>=3.0.0); extra == 'test' -Requires-Dist: pytest-cov; extra == 'test' -Provides-Extra: tool - -Wheel -===== - -A built-package format for Python. - -A wheel is a ZIP-format archive with a specially formatted filename -and the .whl extension. It is designed to contain all the files for a -PEP 376 compatible install in a way that is very close to the on-disk -format. Many packages will be properly installed with only the "Unpack" -step (simply extracting the file onto sys.path), and the unpacked archive -preserves enough information to "Spread" (copy data and scripts to their -final locations) at any later time. - -The wheel project provides a `bdist_wheel` command for setuptools -(requires setuptools >= 0.8.0). Wheel files can be installed with a -newer `pip` from https://github.com/pypa/pip or with wheel's own command -line utility. 
- -The wheel documentation is at http://wheel.rtfd.org/. The file format -is documented in PEP 427 (http://www.python.org/dev/peps/pep-0427/). - -The reference implementation is at https://github.com/pypa/wheel - -Why not egg? ------------- - -Python's egg format predates the packaging related standards we have -today, the most important being PEP 376 "Database of Installed Python -Distributions" which specifies the .dist-info directory (instead of -.egg-info) and PEP 426 "Metadata for Python Software Packages 2.0" -which specifies how to express dependencies (instead of requires.txt -in .egg-info). - -Wheel implements these things. It also provides a richer file naming -convention that communicates the Python implementation and ABI as well -as simply the language version used in a particular package. - -Unlike .egg, wheel will be a fully-documented standard at the binary -level that is truly easy to install even if you do not want to use the -reference implementation. - - -Code of Conduct ---------------- - -Everyone interacting in the wheel project's codebases, issue trackers, chat -rooms, and mailing lists is expected to follow the `PyPA Code of Conduct`_. - -.. _PyPA Code of Conduct: https://www.pypa.io/en/latest/code-of-conduct/ - - -0.30.0 -====== -- Added py-limited-api {cp32|cp33|cp34|...} flag to produce cpNN.abi3.{arch} - tags on CPython 3. -- Documented the ``license_file`` metadata key -- Improved Python, abi tagging for `wheel convert`. Thanks Ales Erjavec. -- Fixed `>` being prepended to lines starting with "From" in the long description -- Added support for specifying a build number (as per PEP 427). - Thanks Ian Cordasco. -- Made the order of files in generated ZIP files deterministic. - Thanks Matthias Bach. -- Made the order of requirements in metadata deterministic. Thanks Chris Lamb. 
-- Fixed `wheel install` clobbering existing files -- Improved the error message when trying to verify an unsigned wheel file -- Removed support for Python 2.6, 3.2 and 3.3. - -0.29.0 -====== -- Fix compression type of files in archive (Issue #155, Pull Request #62, - thanks Xavier Fernandez) - -0.28.0 -====== -- Fix file modes in archive (Issue #154) - -0.27.0 -====== -- Support forcing a platform tag using `--plat-name` on pure-Python wheels, as - well as nonstandard platform tags on non-pure wheels (Pull Request #60, Issue - #144, thanks Andrés Díaz) -- Add SOABI tags to platform-specific wheels built for Python 2.X (Pull Request - #55, Issue #63, Issue #101) -- Support reproducible wheel files, wheels that can be rebuilt and will hash to - the same values as previous builds (Pull Request #52, Issue #143, thanks - Barry Warsaw) -- Support for changes in keyring >= 8.0 (Pull Request #61, thanks Jason R. - Coombs) -- Use the file context manager when checking if dependency_links.txt is empty, - fixes problems building wheels under PyPy on Windows (Issue #150, thanks - Cosimo Lupo) -- Don't attempt to (recursively) create a build directory ending with `..` - (invalid on all platforms, but code was only executed on Windows) (Issue #91) -- Added the PyPA Code of Conduct (Pull Request #56) - -0.26.0 -====== -- Fix multiple entrypoint comparison failure on Python 3 (Issue #148) - -0.25.0 -====== -- Add Python 3.5 to tox configuration -- Deterministic (sorted) metadata -- Fix tagging for Python 3.5 compatibility -- Support py2-none-'arch' and py3-none-'arch' tags -- Treat data-only wheels as pure -- Write to temporary file and rename when using wheel install --force - -0.24.0 -====== -- The python tag used for pure-python packages is now .pyN (major version - only). This change actually occurred in 0.23.0 when the --python-tag - option was added, but was not explicitly mentioned in the changelog then. -- wininst2wheel and egg2wheel removed. 
Use "wheel convert [archive]" - instead. -- Wheel now supports setuptools style conditional requirements via the - extras_require={} syntax. Separate 'extra' names from conditions using - the : character. Wheel's own setup.py does this. (The empty-string - extra is the same as install_requires.) These conditional requirements - should work the same whether the package is installed by wheel or - by setup.py. - -0.23.0 -====== -- Compatibility tag flags added to the bdist_wheel command -- sdist should include files necessary for tests -- 'wheel convert' can now also convert unpacked eggs to wheel -- Rename pydist.json to metadata.json to avoid stepping on the PEP -- The --skip-scripts option has been removed, and not generating scripts is now - the default. The option was a temporary approach until installers could - generate scripts themselves. That is now the case with pip 1.5 and later. - Note that using pip 1.4 to install a wheel without scripts will leave the - installation without entry-point wrappers. The "wheel install-scripts" - command can be used to generate the scripts in such cases. -- Thank you contributors - -0.22.0 -====== -- Include entry_points.txt, scripts a.k.a. commands, in experimental - pydist.json -- Improved test_requires parsing -- Python 2.6 fixes, "wheel version" command courtesy pombredanne - -0.21.0 -====== -- Pregenerated scripts are the default again. -- "setup.py bdist_wheel --skip-scripts" turns them off. -- setuptools is no longer a listed requirement for the 'wheel' - package. It is of course still required in order for bdist_wheel - to work. -- "python -m wheel" avoids importing pkg_resources until it's necessary. - -0.20.0 -====== -- No longer include console_scripts in wheels. Ordinary scripts (shell files, - standalone Python files) are included as usual. 
-- Include new command "python -m wheel install-scripts [distribution - [distribution ...]]" to install the console_scripts (setuptools-style - scripts using pkg_resources) for a distribution. - -0.19.0 -====== -- pymeta.json becomes pydist.json - -0.18.0 -====== -- Python 3 Unicode improvements - -0.17.0 -====== -- Support latest PEP-426 "pymeta.json" (json-format metadata) - -0.16.0 -====== -- Python 2.6 compatibility bugfix (thanks John McFarlane) -- Non-prerelease version number - -1.0.0a2 -======= -- Bugfix for C-extension tags for CPython 3.3 (using SOABI) - -1.0.0a1 -======= -- Bugfix for bdist_wininst converter "wheel convert" -- Bugfix for dists where "is pure" is None instead of True or False - -1.0.0a0 -======= -- Update for version 1.0 of Wheel (PEP accepted). -- Python 3 fix for moving Unicode Description to metadata body -- Include rudimentary API documentation in Sphinx (thanks Kevin Horn) - -0.15.0 -====== -- Various improvements - -0.14.0 -====== -- Changed the signature format to better comply with the current JWS spec. - Breaks all existing signatures. -- Include ``wheel unsign`` command to remove RECORD.jws from an archive. -- Put the description in the newly allowed payload section of PKG-INFO - (METADATA) files. - -0.13.0 -====== -- Use distutils instead of sysconfig to get installation paths; can install - headers. -- Improve WheelFile() sort. -- Allow bootstrap installs without any pkg_resources. - -0.12.0 -====== -- Unit test for wheel.tool.install - -0.11.0 -====== -- API cleanup - -0.10.3 -====== -- Scripts fixer fix - -0.10.2 -====== -- Fix keygen - -0.10.1 -====== -- Preserve attributes on install. - -0.10.0 -====== -- Include a copy of pkg_resources. Wheel can now install into a virtualenv - that does not have distribute (though most packages still require - pkg_resources to actually work; wheel install distribute) -- Define a new setup.cfg section [wheel]. universal=1 will - apply the py2.py3-none-any tag for pure python wheels. 
- -0.9.7 -===== -- Only import dirspec when needed. dirspec is only needed to find the - configuration for keygen/signing operations. - -0.9.6 -===== -- requires-dist from setup.cfg overwrites any requirements from setup.py - Care must be taken that the requirements are the same in both cases, - or just always install from wheel. -- drop dirspec requirement on win32 -- improved command line utility, adds 'wheel convert [egg or wininst]' to - convert legacy binary formats to wheel - -0.9.5 -===== -- Wheel's own wheel file can be executed by Python, and can install itself: - ``python wheel-0.9.5-py27-none-any/wheel install ...`` -- Use argparse; basic ``wheel install`` command should run with only stdlib - dependencies. -- Allow requires_dist in setup.cfg's [metadata] section. In addition to - dependencies in setup.py, but will only be interpreted when installing - from wheel, not from sdist. Can be qualified with environment markers. - -0.9.4 -===== -- Fix wheel.signatures in sdist - -0.9.3 -===== -- Integrated digital signatures support without C extensions. -- Integrated "wheel install" command (single package, no dependency - resolution) including compatibility check. 
-- Support Python 3.3 -- Use Metadata 1.3 (PEP 426) - -0.9.2 -===== -- Automatic signing if WHEEL_TOOL points to the wheel binary -- Even more Python 3 fixes - -0.9.1 -===== -- 'wheel sign' uses the keys generated by 'wheel keygen' (instead of generating - a new key at random each time) -- Python 2/3 encoding/decoding fixes -- Run tests on Python 2.6 (without signature verification) - -0.9 -=== -- Updated digital signatures scheme -- Python 3 support for digital signatures -- Always verify RECORD hashes on extract -- "wheel" command line tool to sign, verify, unpack wheel files - -0.8 -=== -- none/any draft pep tags update -- improved wininst2wheel script -- doc changes and other improvements - -0.7 -=== -- sort .dist-info at end of wheel archive -- Windows & Python 3 fixes from Paul Moore -- pep8 -- scripts to convert wininst & egg to wheel - -0.6 -=== -- require distribute >= 0.6.28 -- stop using verlib - -0.5 -=== -- working pretty well - -0.4.2 -===== -- hyphenated name fix - -0.4 -=== -- improve test coverage -- improve Windows compatibility -- include tox.ini courtesy of Marc Abramowitz -- draft hmac sha-256 signing function - -0.3 -=== -- prototype egg2wheel conversion script - -0.2 -=== -- Python 3 compatibility - -0.1 -=== -- Initial version - - diff --git a/lib/python3.4/site-packages/wheel-0.30.0.dist-info/RECORD b/lib/python3.4/site-packages/wheel-0.30.0.dist-info/RECORD deleted file mode 100644 index 291a54b..0000000 --- a/lib/python3.4/site-packages/wheel-0.30.0.dist-info/RECORD +++ /dev/null @@ -1,46 +0,0 @@ -wheel/__init__.py,sha256=ja92NKda3sstt4uKroYgFATu736whcI33p3GJNdslLQ,96 -wheel/__main__.py,sha256=K--m7mq-27NO0fm-a8KlthkucCe0w_-0hVxL3uDujkU,419 -wheel/archive.py,sha256=oEv42UnpxkoFMKcLXQ9RD8a8oic4X3oe2_H5FAgJ7_M,2376 -wheel/bdist_wheel.py,sha256=qKWdyvpkdmuLB4_GGIZsjmlcMLZuZDd8tRvaQI0w_eo,18852 -wheel/decorator.py,sha256=U2K77ZZ8x3x5vSIGCcEeh8GAxB6rABB7AlDwRukaoCk,541 -wheel/egg2wheel.py,sha256=me4Iaz4idCvS-xjfAzfb2dXXlXx_w6AgLjH6hi1Bt1A,3043 
-wheel/install.py,sha256=zYQ-A8uQi-R2PwMvOh64YMlQDplqYpcBVM0EmbxZu8Y,18417 -wheel/metadata.py,sha256=SzI1MtzITZJuAJuvUVzEWi60VhgDbXSV_hapyiX0rlw,11561 -wheel/paths.py,sha256=OAtaJgCivlKvJKw1qC3YbJypvp2d38Eka8GQWdBWNZw,1129 -wheel/pep425tags.py,sha256=Lk9zYm1rrHG1X3RKlf9plcwpsoSZT8UR7fG3jhaoZrQ,5760 -wheel/pkginfo.py,sha256=GR76kupQzn1x9sKDaXuE6B6FsZ4OkfRtG7pndlXPvQ4,1257 -wheel/util.py,sha256=eJB-mrhMAaCGcoKhTLDYdpCf5N8BMLtX4usW_7qeZBg,4732 -wheel/wininst2wheel.py,sha256=afPAHWwa7FY0IkpG-BuuuY-dlB93VmFPrXff511NkBk,7772 -wheel/signatures/__init__.py,sha256=O7kZICZvXxN5YRkCYrPmAEr1LpGaZKJh5sLPWIRIoYE,3766 -wheel/signatures/djbec.py,sha256=jnfWxdS7dwLjiO6n0hy-4jLa_71SPrKWL0-7ocDrSHc,7035 -wheel/signatures/ed25519py.py,sha256=nFKDMq4LW2iJKk4IZKMxY46GyZNYPKxuWha9xYHk9lE,1669 -wheel/signatures/keys.py,sha256=k4j4yGZL31Dt2pa5TneIEeq6qkVIXEPExmFxiZxpE1Y,3299 -wheel/tool/__init__.py,sha256=rOy5VFvj-gTKgMwi_u2_iNu_Pq6aqw4rEfaciDTbmwg,13421 -wheel-0.30.0.dist-info/DESCRIPTION.rst,sha256=Alb3Ol--LhPgmWuBBPfzu54xzQ8J2skWNV34XCjhe0k,10549 -wheel-0.30.0.dist-info/LICENSE.txt,sha256=zKniDGrx_Pv2lAjzd3aShsvuvN7TNhAMm0o_NfvmNeQ,1125 -wheel-0.30.0.dist-info/METADATA,sha256=fYLxr6baQD-wDn4Yu8t-8fF7PJuiBTcThsl2UKBE7kg,11815 -wheel-0.30.0.dist-info/RECORD,, -wheel-0.30.0.dist-info/WHEEL,sha256=kdsN-5OJAZIiHN-iO4Rhl82KyS0bDWf4uBwMbkNafr8,110 -wheel-0.30.0.dist-info/entry_points.txt,sha256=pTyeGVsucyfr_BXe5OQKuA1Bp5YKaIAWy5pejkq4Qx0,109 -wheel-0.30.0.dist-info/metadata.json,sha256=neXQocJnVqPTjr4zpuOVdxBGCmjrTsOs76AvP8ngyJY,1522 -wheel-0.30.0.dist-info/top_level.txt,sha256=HxSBIbgEstMPe4eFawhA66Mq-QYHMopXVoAncfjb_1c,6 -../../../bin/wheel,sha256=sjtPVJ0ZS5WdGK7UXcnQrN6MG_czYyrsndkMrC0qluw,279 -wheel-0.30.0.dist-info/INSTALLER,sha256=zuuue4knoyJ-UwPPXg8fezS7VCrXJQrAP7zeNuwvFQg,4 -wheel/signatures/__pycache__/__init__.cpython-34.pyc,, -wheel/__pycache__/decorator.cpython-34.pyc,, -wheel/__pycache__/__main__.cpython-34.pyc,, -wheel/signatures/__pycache__/ed25519py.cpython-34.pyc,, 
-wheel/__pycache__/util.cpython-34.pyc,, -wheel/__pycache__/wininst2wheel.cpython-34.pyc,, -wheel/__pycache__/pkginfo.cpython-34.pyc,, -wheel/__pycache__/__init__.cpython-34.pyc,, -wheel/signatures/__pycache__/djbec.cpython-34.pyc,, -wheel/__pycache__/metadata.cpython-34.pyc,, -wheel/__pycache__/egg2wheel.cpython-34.pyc,, -wheel/signatures/__pycache__/keys.cpython-34.pyc,, -wheel/__pycache__/archive.cpython-34.pyc,, -wheel/__pycache__/bdist_wheel.cpython-34.pyc,, -wheel/tool/__pycache__/__init__.cpython-34.pyc,, -wheel/__pycache__/install.cpython-34.pyc,, -wheel/__pycache__/pep425tags.cpython-34.pyc,, -wheel/__pycache__/paths.cpython-34.pyc,, diff --git a/lib/python3.4/site-packages/wheel-0.30.0.dist-info/WHEEL b/lib/python3.4/site-packages/wheel-0.30.0.dist-info/WHEEL deleted file mode 100644 index 7332a41..0000000 --- a/lib/python3.4/site-packages/wheel-0.30.0.dist-info/WHEEL +++ /dev/null @@ -1,6 +0,0 @@ -Wheel-Version: 1.0 -Generator: bdist_wheel (0.30.0) -Root-Is-Purelib: true -Tag: py2-none-any -Tag: py3-none-any - diff --git a/lib/python3.4/site-packages/wheel-0.30.0.dist-info/entry_points.txt b/lib/python3.4/site-packages/wheel-0.30.0.dist-info/entry_points.txt deleted file mode 100644 index 4ad253e..0000000 --- a/lib/python3.4/site-packages/wheel-0.30.0.dist-info/entry_points.txt +++ /dev/null @@ -1,6 +0,0 @@ -[console_scripts] -wheel = wheel.tool:main - -[distutils.commands] -bdist_wheel = wheel.bdist_wheel:bdist_wheel - diff --git a/lib/python3.4/site-packages/wheel-0.30.0.dist-info/metadata.json b/lib/python3.4/site-packages/wheel-0.30.0.dist-info/metadata.json deleted file mode 100644 index 709ccac..0000000 --- a/lib/python3.4/site-packages/wheel-0.30.0.dist-info/metadata.json +++ /dev/null @@ -1 +0,0 @@ -{"classifiers": ["Development Status :: 5 - Production/Stable", "Intended Audience :: Developers", "License :: OSI Approved :: MIT License", "Programming Language :: Python", "Programming Language :: Python :: 2", "Programming Language :: Python :: 
2.7", "Programming Language :: Python :: 3", "Programming Language :: Python :: 3.4", "Programming Language :: Python :: 3.5", "Programming Language :: Python :: 3.6"], "description_content_type": "UNKNOWN", "extensions": {"python.commands": {"wrap_console": {"wheel": "wheel.tool:main"}}, "python.details": {"contacts": [{"email": "alex.gronholm@nextday.fi", "name": "Alex Gr\u00f6nholm", "role": "author"}], "document_names": {"description": "DESCRIPTION.rst", "license": "LICENSE.txt"}, "project_urls": {"Home": "https://github.com/pypa/wheel"}}, "python.exports": {"console_scripts": {"wheel": "wheel.tool:main"}, "distutils.commands": {"bdist_wheel": "wheel.bdist_wheel:bdist_wheel"}}}, "extras": ["faster-signatures", "signatures", "test", "tool"], "generator": "bdist_wheel (0.30.0)", "keywords": ["wheel", "packaging"], "license": "MIT", "metadata_version": "2.0", "name": "wheel", "run_requires": [{"extra": "faster-signatures", "requires": ["ed25519ll"]}, {"extra": "test", "requires": ["jsonschema", "pytest (>=3.0.0)", "pytest-cov"]}, {"extra": "signatures", "requires": ["keyring", "keyrings.alt"]}, {"environment": "sys_platform!=\"win32\"", "extra": "signatures", "requires": ["pyxdg"]}], "summary": "A built-package format for Python.", "version": "0.30.0"} \ No newline at end of file diff --git a/lib/python3.4/site-packages/wheel-0.30.0.dist-info/top_level.txt b/lib/python3.4/site-packages/wheel-0.30.0.dist-info/top_level.txt deleted file mode 100644 index 2309722..0000000 --- a/lib/python3.4/site-packages/wheel-0.30.0.dist-info/top_level.txt +++ /dev/null @@ -1 +0,0 @@ -wheel diff --git a/lib/python3.4/site-packages/wheel/__init__.py b/lib/python3.4/site-packages/wheel/__init__.py deleted file mode 100644 index 64cd668..0000000 --- a/lib/python3.4/site-packages/wheel/__init__.py +++ /dev/null @@ -1,2 +0,0 @@ -# __variables__ with double-quoted values will be available in setup.py: -__version__ = "0.30.0" diff --git a/lib/python3.4/site-packages/wheel/__main__.py 
b/lib/python3.4/site-packages/wheel/__main__.py deleted file mode 100644 index 8f0c4fe..0000000 --- a/lib/python3.4/site-packages/wheel/__main__.py +++ /dev/null @@ -1,19 +0,0 @@ -""" -Wheel command line tool (enable python -m wheel syntax) -""" - -import sys - - -def main(): # needed for console script - if __package__ == '': - # To be able to run 'python wheel-0.9.whl/wheel': - import os.path - path = os.path.dirname(os.path.dirname(__file__)) - sys.path[0:0] = [path] - import wheel.tool - sys.exit(wheel.tool.main()) - - -if __name__ == "__main__": - sys.exit(main()) diff --git a/lib/python3.4/site-packages/wheel/archive.py b/lib/python3.4/site-packages/wheel/archive.py deleted file mode 100644 index 5b1647a..0000000 --- a/lib/python3.4/site-packages/wheel/archive.py +++ /dev/null @@ -1,80 +0,0 @@ -""" -Archive tools for wheel. -""" - -import os -import os.path -import time -import zipfile -from distutils import log - - -def archive_wheelfile(base_name, base_dir): - """Archive all files under `base_dir` in a whl file and name it like - `base_name`. - """ - olddir = os.path.abspath(os.curdir) - base_name = os.path.abspath(base_name) - try: - os.chdir(base_dir) - return make_wheelfile_inner(base_name) - finally: - os.chdir(olddir) - - -def make_wheelfile_inner(base_name, base_dir='.'): - """Create a whl file from all the files under 'base_dir'. - - Places .dist-info at the end of the archive.""" - - zip_filename = base_name + ".whl" - - log.info("creating '%s' and adding '%s' to it", zip_filename, base_dir) - - # Some applications need reproducible .whl files, but they can't do this - # without forcing the timestamp of the individual ZipInfo objects. See - # issue #143. 
- timestamp = os.environ.get('SOURCE_DATE_EPOCH') - if timestamp is None: - date_time = None - else: - date_time = time.gmtime(int(timestamp))[0:6] - - # XXX support bz2, xz when available - zip = zipfile.ZipFile(zip_filename, "w", compression=zipfile.ZIP_DEFLATED) - - score = {'WHEEL': 1, 'METADATA': 2, 'RECORD': 3} - deferred = [] - - def writefile(path, date_time): - st = os.stat(path) - if date_time is None: - mtime = time.gmtime(st.st_mtime) - date_time = mtime[0:6] - zinfo = zipfile.ZipInfo(path, date_time) - zinfo.external_attr = st.st_mode << 16 - zinfo.compress_type = zipfile.ZIP_DEFLATED - with open(path, 'rb') as fp: - zip.writestr(zinfo, fp.read()) - log.info("adding '%s'" % path) - - for dirpath, dirnames, filenames in os.walk(base_dir): - # Sort the directory names so that `os.walk` will walk them in a - # defined order on the next iteration. - dirnames.sort() - for name in sorted(filenames): - path = os.path.normpath(os.path.join(dirpath, name)) - - if os.path.isfile(path): - if dirpath.endswith('.dist-info'): - deferred.append((score.get(name, 0), path)) - else: - writefile(path, date_time) - - deferred.sort() - for score, path in deferred: - writefile(path, date_time) - - zip.close() - - return zip_filename diff --git a/lib/python3.4/site-packages/wheel/bdist_wheel.py b/lib/python3.4/site-packages/wheel/bdist_wheel.py deleted file mode 100644 index 7fbeb4b..0000000 --- a/lib/python3.4/site-packages/wheel/bdist_wheel.py +++ /dev/null @@ -1,482 +0,0 @@ -""" -Create a wheel (.whl) distribution. - -A wheel is a built archive format. 
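The SOURCE_DATE_EPOCH handling in the deleted archive.py above is what makes wheel archives reproducible: a fixed timestamp overrides each file's mtime in the zip entries. A stand-alone sketch of that logic (illustrative only, not part of wheel itself):

```python
import os
import time

# Sketch of make_wheelfile_inner's timestamp handling above: when
# SOURCE_DATE_EPOCH is set, every ZipInfo gets the same fixed
# (year, month, day, hour, minute, second) tuple, so repeated builds
# of the same tree produce byte-identical archives.
def reproducible_date_time():
    timestamp = os.environ.get('SOURCE_DATE_EPOCH')
    if timestamp is None:
        return None  # caller falls back to each file's mtime
    return time.gmtime(int(timestamp))[0:6]

os.environ['SOURCE_DATE_EPOCH'] = '1499990400'
print(reproducible_date_time())
```

When the variable is unset the function returns None, matching the branch above where `writefile` then derives the tuple from `os.stat(path).st_mtime`.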
-""" - -import csv -import hashlib -import os -import subprocess -import warnings -import shutil -import json -import sys -import re -from email.generator import Generator -from distutils.core import Command -from distutils.sysconfig import get_python_version -from distutils import log as logger -from shutil import rmtree - -import pkg_resources - -from .pep425tags import get_abbr_impl, get_impl_ver, get_abi_tag, get_platform -from .util import native, open_for_csv -from .archive import archive_wheelfile -from .pkginfo import read_pkg_info, write_pkg_info -from .metadata import pkginfo_to_dict -from . import pep425tags, metadata -from . import __version__ as wheel_version - - -safe_name = pkg_resources.safe_name -safe_version = pkg_resources.safe_version - -PY_LIMITED_API_PATTERN = r'cp3\d' - - -def safer_name(name): - return safe_name(name).replace('-', '_') - - -def safer_version(version): - return safe_version(version).replace('-', '_') - - -class bdist_wheel(Command): - - description = 'create a wheel distribution' - - user_options = [('bdist-dir=', 'b', - "temporary directory for creating the distribution"), - ('plat-name=', 'p', - "platform name to embed in generated filenames " - "(default: %s)" % get_platform()), - ('keep-temp', 'k', - "keep the pseudo-installation tree around after " + - "creating the distribution archive"), - ('dist-dir=', 'd', - "directory to put final built distributions in"), - ('skip-build', None, - "skip rebuilding everything (for testing/debugging)"), - ('relative', None, - "build the archive using relative paths" - "(default: false)"), - ('owner=', 'u', - "Owner name used when creating a tar file" - " [default: current user]"), - ('group=', 'g', - "Group name used when creating a tar file" - " [default: current group]"), - ('universal', None, - "make a universal wheel" - " (default: false)"), - ('python-tag=', None, - "Python implementation compatibility tag" - " (default: py%s)" % get_impl_ver()[0]), - ('build-number=', None, - 
"Build number for this particular version. " - "As specified in PEP-0427, this must start with a digit. " - "[default: None]"), - ('py-limited-api=', None, - "Python tag (cp32|cp33|cpNN) for abi3 wheel tag" - " (default: false)"), - ] - - boolean_options = ['keep-temp', 'skip-build', 'relative', 'universal'] - - def initialize_options(self): - self.bdist_dir = None - self.data_dir = None - self.plat_name = None - self.plat_tag = None - self.format = 'zip' - self.keep_temp = False - self.dist_dir = None - self.distinfo_dir = None - self.egginfo_dir = None - self.root_is_pure = None - self.skip_build = None - self.relative = False - self.owner = None - self.group = None - self.universal = False - self.python_tag = 'py' + get_impl_ver()[0] - self.build_number = None - self.py_limited_api = False - self.plat_name_supplied = False - - def finalize_options(self): - if self.bdist_dir is None: - bdist_base = self.get_finalized_command('bdist').bdist_base - self.bdist_dir = os.path.join(bdist_base, 'wheel') - - self.data_dir = self.wheel_dist_name + '.data' - self.plat_name_supplied = self.plat_name is not None - - need_options = ('dist_dir', 'plat_name', 'skip_build') - - self.set_undefined_options('bdist', - *zip(need_options, need_options)) - - self.root_is_pure = not (self.distribution.has_ext_modules() - or self.distribution.has_c_libraries()) - - if self.py_limited_api and not re.match(PY_LIMITED_API_PATTERN, self.py_limited_api): - raise ValueError("py-limited-api must match '%s'" % PY_LIMITED_API_PATTERN) - - # Support legacy [wheel] section for setting universal - wheel = self.distribution.get_option_dict('wheel') - if 'universal' in wheel: - # please don't define this in your global configs - val = wheel['universal'][1].strip() - if val.lower() in ('1', 'true', 'yes'): - self.universal = True - - if self.build_number is not None and not self.build_number[:1].isdigit(): - raise ValueError("Build tag (build-number) must start with a digit.") - - @property - def 
wheel_dist_name(self): - """Return distribution full name with - replaced with _""" - components = (safer_name(self.distribution.get_name()), - safer_version(self.distribution.get_version())) - if self.build_number: - components += (self.build_number,) - return '-'.join(components) - - def get_tag(self): - # bdist sets self.plat_name if unset, we should only use it for purepy - # wheels if the user supplied it. - if self.plat_name_supplied: - plat_name = self.plat_name - elif self.root_is_pure: - plat_name = 'any' - else: - plat_name = self.plat_name or get_platform() - if plat_name in ('linux-x86_64', 'linux_x86_64') and sys.maxsize == 2147483647: - plat_name = 'linux_i686' - plat_name = plat_name.replace('-', '_').replace('.', '_') - - if self.root_is_pure: - if self.universal: - impl = 'py2.py3' - else: - impl = self.python_tag - tag = (impl, 'none', plat_name) - else: - impl_name = get_abbr_impl() - impl_ver = get_impl_ver() - impl = impl_name + impl_ver - # We don't work on CPython 3.1, 3.0. 
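The tag and platform normalization above ultimately feeds the archive filename. A minimal sketch of how the `name-ver(-build)-pytag-abi-plat` basename is joined (hypothetical helper and example inputs, not wheel's actual API, which pulls these values from the distribution and `get_tag()`):

```python
# Hypothetical helper mirroring how the wheel archive basename is
# assembled: '-' in names/versions and '-'/'.' in platform tags are
# folded to '_', then the components are joined with '-'.
def archive_basename(name, version, impl, abi, plat, build=None):
    components = [name.replace('-', '_'), version.replace('-', '_')]
    if build:
        components.append(build)
    plat = plat.replace('-', '_').replace('.', '_')
    components.extend([impl, abi, plat])
    return '-'.join(components)

print(archive_basename('my-pkg', '1.0', 'py2.py3', 'none', 'any'))
# my_pkg-1.0-py2.py3-none-any
```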
- if self.py_limited_api and (impl_name + impl_ver).startswith('cp3'): - impl = self.py_limited_api - abi_tag = 'abi3' - else: - abi_tag = str(get_abi_tag()).lower() - tag = (impl, abi_tag, plat_name) - supported_tags = pep425tags.get_supported( - supplied_platform=plat_name if self.plat_name_supplied else None) - # XXX switch to this alternate implementation for non-pure: - if not self.py_limited_api: - assert tag == supported_tags[0], "%s != %s" % (tag, supported_tags[0]) - assert tag in supported_tags, "would build wheel with unsupported tag {}".format(tag) - return tag - - def get_archive_basename(self): - """Return archive name without extension""" - - impl_tag, abi_tag, plat_tag = self.get_tag() - - archive_basename = "%s-%s-%s-%s" % ( - self.wheel_dist_name, - impl_tag, - abi_tag, - plat_tag) - return archive_basename - - def run(self): - build_scripts = self.reinitialize_command('build_scripts') - build_scripts.executable = 'python' - - if not self.skip_build: - self.run_command('build') - - install = self.reinitialize_command('install', - reinit_subcommands=True) - install.root = self.bdist_dir - install.compile = False - install.skip_build = self.skip_build - install.warn_dir = False - - # A wheel without setuptools scripts is more cross-platform. - # Use the (undocumented) `no_ep` option to setuptools' - # install_scripts command to avoid creating entry point scripts. - install_scripts = self.reinitialize_command('install_scripts') - install_scripts.no_ep = True - - # Use a custom scheme for the archive, because we have to decide - # at installation time which scheme to use. - for key in ('headers', 'scripts', 'data', 'purelib', 'platlib'): - setattr(install, - 'install_' + key, - os.path.join(self.data_dir, key)) - - basedir_observed = '' - - if os.name == 'nt': - # win32 barfs if any of these are ''; could be '.'? 
- # (distutils.command.install:change_roots bug) - basedir_observed = os.path.normpath(os.path.join(self.data_dir, '..')) - self.install_libbase = self.install_lib = basedir_observed - - setattr(install, - 'install_purelib' if self.root_is_pure else 'install_platlib', - basedir_observed) - - logger.info("installing to %s", self.bdist_dir) - - self.run_command('install') - - archive_basename = self.get_archive_basename() - - pseudoinstall_root = os.path.join(self.dist_dir, archive_basename) - if not self.relative: - archive_root = self.bdist_dir - else: - archive_root = os.path.join( - self.bdist_dir, - self._ensure_relative(install.install_base)) - - self.set_undefined_options( - 'install_egg_info', ('target', 'egginfo_dir')) - self.distinfo_dir = os.path.join(self.bdist_dir, - '%s.dist-info' % self.wheel_dist_name) - self.egg2dist(self.egginfo_dir, - self.distinfo_dir) - - self.write_wheelfile(self.distinfo_dir) - - self.write_record(self.bdist_dir, self.distinfo_dir) - - # Make the archive - if not os.path.exists(self.dist_dir): - os.makedirs(self.dist_dir) - wheel_name = archive_wheelfile(pseudoinstall_root, archive_root) - - # Sign the archive - if 'WHEEL_TOOL' in os.environ: - subprocess.call([os.environ['WHEEL_TOOL'], 'sign', wheel_name]) - - # Add to 'Distribution.dist_files' so that the "upload" command works - getattr(self.distribution, 'dist_files', []).append( - ('bdist_wheel', get_python_version(), wheel_name)) - - if not self.keep_temp: - if self.dry_run: - logger.info('removing %s', self.bdist_dir) - else: - rmtree(self.bdist_dir) - - def write_wheelfile(self, wheelfile_base, generator='bdist_wheel (' + wheel_version + ')'): - from email.message import Message - msg = Message() - msg['Wheel-Version'] = '1.0' # of the spec - msg['Generator'] = generator - msg['Root-Is-Purelib'] = str(self.root_is_pure).lower() - if self.build_number is not None: - msg['Build'] = self.build_number - - # Doesn't work for bdist_wininst - impl_tag, abi_tag, plat_tag = 
self.get_tag() - for impl in impl_tag.split('.'): - for abi in abi_tag.split('.'): - for plat in plat_tag.split('.'): - msg['Tag'] = '-'.join((impl, abi, plat)) - - wheelfile_path = os.path.join(wheelfile_base, 'WHEEL') - logger.info('creating %s', wheelfile_path) - with open(wheelfile_path, 'w') as f: - Generator(f, maxheaderlen=0).flatten(msg) - - def _ensure_relative(self, path): - # copied from dir_util, deleted - drive, path = os.path.splitdrive(path) - if path[0:1] == os.sep: - path = drive + path[1:] - return path - - def _pkginfo_to_metadata(self, egg_info_path, pkginfo_path): - return metadata.pkginfo_to_metadata(egg_info_path, pkginfo_path) - - def license_file(self): - """Return license filename from a license-file key in setup.cfg, or None.""" - metadata = self.distribution.get_option_dict('metadata') - if 'license_file' not in metadata: - return None - return metadata['license_file'][1] - - def setupcfg_requirements(self): - """Generate requirements from setup.cfg as - ('Requires-Dist', 'requirement; qualifier') tuples. 
From a metadata - section in setup.cfg: - - [metadata] - provides-extra = extra1 - extra2 - requires-dist = requirement; qualifier - another; qualifier2 - unqualified - - Yields - - ('Provides-Extra', 'extra1'), - ('Provides-Extra', 'extra2'), - ('Requires-Dist', 'requirement; qualifier'), - ('Requires-Dist', 'another; qualifier2'), - ('Requires-Dist', 'unqualified') - """ - metadata = self.distribution.get_option_dict('metadata') - - # our .ini parser folds - to _ in key names: - for key, title in (('provides_extra', 'Provides-Extra'), - ('requires_dist', 'Requires-Dist')): - if key not in metadata: - continue - field = metadata[key] - for line in field[1].splitlines(): - line = line.strip() - if not line: - continue - yield (title, line) - - def add_requirements(self, metadata_path): - """Add additional requirements from setup.cfg to file metadata_path""" - additional = list(self.setupcfg_requirements()) - if not additional: - return - - pkg_info = read_pkg_info(metadata_path) - if 'Provides-Extra' in pkg_info or 'Requires-Dist' in pkg_info: - warnings.warn('setup.cfg requirements overwrite values from setup.py') - del pkg_info['Provides-Extra'] - del pkg_info['Requires-Dist'] - for k, v in additional: - pkg_info[k] = v - write_pkg_info(metadata_path, pkg_info) - - def egg2dist(self, egginfo_path, distinfo_path): - """Convert an .egg-info directory into a .dist-info directory""" - def adios(p): - """Appropriately delete directory, file or link.""" - if os.path.exists(p) and not os.path.islink(p) and os.path.isdir(p): - shutil.rmtree(p) - elif os.path.exists(p): - os.unlink(p) - - adios(distinfo_path) - - if not os.path.exists(egginfo_path): - # There is no egg-info. This is probably because the egg-info - # file/directory is not named matching the distribution name used - # to name the archive file. Check for this case and report - # accordingly. 
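The `setupcfg_requirements` method above reads those `[metadata]` keys through distutils' option dict. A stand-alone approximation using the stdlib `configparser` instead (names and input are illustrative, not wheel's actual parsing path; note distutils folds `-` to `_` in key names, whereas configparser keeps them):

```python
import configparser

# Example setup.cfg fragment matching the docstring above.
SETUP_CFG = """\
[metadata]
provides-extra = extra1
    extra2
requires-dist = requirement; qualifier
    unqualified
"""

def cfg_requirements(text):
    # Yield ('Provides-Extra', ...) / ('Requires-Dist', ...) pairs,
    # one per non-empty line of the multi-line option values.
    parser = configparser.ConfigParser()
    parser.read_string(text)
    for key, title in (('provides-extra', 'Provides-Extra'),
                       ('requires-dist', 'Requires-Dist')):
        if parser.has_option('metadata', key):
            for line in parser.get('metadata', key).splitlines():
                line = line.strip()
                if line:
                    yield (title, line)

print(list(cfg_requirements(SETUP_CFG)))
```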
- import glob - pat = os.path.join(os.path.dirname(egginfo_path), '*.egg-info') - possible = glob.glob(pat) - err = "Egg metadata expected at %s but not found" % (egginfo_path,) - if possible: - alt = os.path.basename(possible[0]) - err += " (%s found - possible misnamed archive file?)" % (alt,) - - raise ValueError(err) - - if os.path.isfile(egginfo_path): - # .egg-info is a single file - pkginfo_path = egginfo_path - pkg_info = self._pkginfo_to_metadata(egginfo_path, egginfo_path) - os.mkdir(distinfo_path) - else: - # .egg-info is a directory - pkginfo_path = os.path.join(egginfo_path, 'PKG-INFO') - pkg_info = self._pkginfo_to_metadata(egginfo_path, pkginfo_path) - - # ignore common egg metadata that is useless to wheel - shutil.copytree(egginfo_path, distinfo_path, - ignore=lambda x, y: {'PKG-INFO', 'requires.txt', 'SOURCES.txt', - 'not-zip-safe'} - ) - - # delete dependency_links if it is only whitespace - dependency_links_path = os.path.join(distinfo_path, 'dependency_links.txt') - with open(dependency_links_path, 'r') as dependency_links_file: - dependency_links = dependency_links_file.read().strip() - if not dependency_links: - adios(dependency_links_path) - - write_pkg_info(os.path.join(distinfo_path, 'METADATA'), pkg_info) - - # XXX deprecated. Still useful for current distribute/setuptools. - metadata_path = os.path.join(distinfo_path, 'METADATA') - self.add_requirements(metadata_path) - - # XXX intentionally a different path than the PEP. 
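The small `adios()` helper defined in `egg2dist` above handles the three cases a stale path can be in. A stand-alone copy with a quick demonstration against a throwaway tree:

```python
import os
import shutil
import tempfile

# Stand-alone copy of the adios() helper above: delete a path whether
# it is a directory, a plain file, or a symlink (a symlink is unlinked
# rather than recursed into, so the link target survives).
def adios(p):
    if os.path.exists(p) and not os.path.islink(p) and os.path.isdir(p):
        shutil.rmtree(p)
    elif os.path.exists(p):
        os.unlink(p)

root = tempfile.mkdtemp()
sub = os.path.join(root, 'pkg.egg-info')
os.mkdir(sub)
open(os.path.join(sub, 'PKG-INFO'), 'w').close()
adios(sub)  # removes the whole directory tree
print(os.path.exists(sub))  # False
shutil.rmtree(root)
```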
- metadata_json_path = os.path.join(distinfo_path, 'metadata.json') - pymeta = pkginfo_to_dict(metadata_path, - distribution=self.distribution) - - if 'description' in pymeta: - description_filename = 'DESCRIPTION.rst' - description_text = pymeta.pop('description') - description_path = os.path.join(distinfo_path, - description_filename) - with open(description_path, "wb") as description_file: - description_file.write(description_text.encode('utf-8')) - pymeta['extensions']['python.details']['document_names']['description'] = \ - description_filename - - # XXX heuristically copy any LICENSE/LICENSE.txt? - license = self.license_file() - if license: - license_filename = 'LICENSE.txt' - shutil.copy(license, os.path.join(self.distinfo_dir, license_filename)) - pymeta['extensions']['python.details']['document_names']['license'] = license_filename - - with open(metadata_json_path, "w") as metadata_json: - json.dump(pymeta, metadata_json, sort_keys=True) - - adios(egginfo_path) - - def write_record(self, bdist_dir, distinfo_dir): - from .util import urlsafe_b64encode - - record_path = os.path.join(distinfo_dir, 'RECORD') - record_relpath = os.path.relpath(record_path, bdist_dir) - - def walk(): - for dir, dirs, files in os.walk(bdist_dir): - dirs.sort() - for f in sorted(files): - yield os.path.join(dir, f) - - def skip(path): - """Wheel hashes every possible file.""" - return (path == record_relpath) - - with open_for_csv(record_path, 'w+') as record_file: - writer = csv.writer(record_file) - for path in walk(): - relpath = os.path.relpath(path, bdist_dir) - if skip(relpath): - hash = '' - size = '' - else: - with open(path, 'rb') as f: - data = f.read() - digest = hashlib.sha256(data).digest() - hash = 'sha256=' + native(urlsafe_b64encode(digest)) - size = len(data) - record_path = os.path.relpath( - path, bdist_dir).replace(os.path.sep, '/') - writer.writerow((record_path, hash, size)) diff --git a/lib/python3.4/site-packages/wheel/decorator.py 
b/lib/python3.4/site-packages/wheel/decorator.py deleted file mode 100644 index e4b56d1..0000000 --- a/lib/python3.4/site-packages/wheel/decorator.py +++ /dev/null @@ -1,19 +0,0 @@ -# from Pyramid - - -class reify(object): - """Put the result of a method which uses this (non-data) - descriptor decorator in the instance dict after the first call, - effectively replacing the decorator with an instance variable. - """ - - def __init__(self, wrapped): - self.wrapped = wrapped - self.__doc__ = wrapped.__doc__ - - def __get__(self, inst, objtype=None): - if inst is None: - return self - val = self.wrapped(inst) - setattr(inst, self.wrapped.__name__, val) - return val diff --git a/lib/python3.4/site-packages/wheel/egg2wheel.py b/lib/python3.4/site-packages/wheel/egg2wheel.py deleted file mode 100644 index 3799909..0000000 --- a/lib/python3.4/site-packages/wheel/egg2wheel.py +++ /dev/null @@ -1,90 +0,0 @@ -#!/usr/bin/env python -import distutils.dist -import os.path -import re -import shutil -import sys -import tempfile -import zipfile -from argparse import ArgumentParser -from distutils.archive_util import make_archive -from glob import iglob - -import wheel.bdist_wheel -from wheel.wininst2wheel import _bdist_wheel_tag - -egg_info_re = re.compile(r'''(?P<name>.+?)-(?P<ver>.+?)
- (-(?P<pyver>.+?))?(-(?P<arch>.+?))?.egg''', re.VERBOSE) - - -def egg2wheel(egg_path, dest_dir): - egg_info = egg_info_re.match(os.path.basename(egg_path)).groupdict() - dir = tempfile.mkdtemp(suffix="_e2w") - if os.path.isfile(egg_path): - # assume we have a bdist_egg otherwise - egg = zipfile.ZipFile(egg_path) - egg.extractall(dir) - else: - # support buildout-style installed eggs directories - for pth in os.listdir(egg_path): - src = os.path.join(egg_path, pth) - if os.path.isfile(src): - shutil.copy2(src, dir) - else: - shutil.copytree(src, os.path.join(dir, pth)) - - dist_info = "%s-%s" % (egg_info['name'], egg_info['ver']) - abi = 'none' - pyver = egg_info['pyver'].replace('.', '') - arch = (egg_info['arch'] or 'any').replace('.', '_').replace('-', '_') - if arch != 'any': - # assume all binary eggs are for CPython - pyver = 'cp' + pyver[2:] - wheel_name = '-'.join(( - dist_info, - pyver, - abi, - arch - )) - root_is_purelib = egg_info['arch'] is None - if root_is_purelib: - bw = wheel.bdist_wheel.bdist_wheel(distutils.dist.Distribution()) - else: - bw = _bdist_wheel_tag(distutils.dist.Distribution()) - - bw.root_is_pure = root_is_purelib - bw.python_tag = pyver - bw.plat_name_supplied = True - bw.plat_name = egg_info['arch'] or 'any' - if not root_is_purelib: - bw.full_tag_supplied = True - bw.full_tag = (pyver, abi, arch) - - dist_info_dir = os.path.join(dir, '%s.dist-info' % dist_info) - bw.egg2dist(os.path.join(dir, 'EGG-INFO'), - dist_info_dir) - bw.write_wheelfile(dist_info_dir, generator='egg2wheel') - bw.write_record(dir, dist_info_dir) - filename = make_archive(os.path.join(dest_dir, wheel_name), 'zip', root_dir=dir) - os.rename(filename, filename[:-3] + 'whl') - shutil.rmtree(dir) - - -def main(): - parser = ArgumentParser() - parser.add_argument('eggs', nargs='*', help="Eggs to convert") - parser.add_argument('--dest-dir', '-d', default=os.path.curdir, - help="Directory to store wheels (default %(default)s)") - parser.add_argument('--verbose', '-v',
action='store_true') - args = parser.parse_args() - for pat in args.eggs: - for egg in iglob(pat): - if args.verbose: - sys.stdout.write("{0}... ".format(egg)) - egg2wheel(egg, args.dest_dir) - if args.verbose: - sys.stdout.write("OK\n") - - -if __name__ == "__main__": - main() diff --git a/lib/python3.4/site-packages/wheel/install.py b/lib/python3.4/site-packages/wheel/install.py deleted file mode 100644 index 5a88a75..0000000 --- a/lib/python3.4/site-packages/wheel/install.py +++ /dev/null @@ -1,494 +0,0 @@ -""" -Operations on existing wheel files, including basic installation. -""" -# XXX see patched pip to install - -import csv -import hashlib -import os.path -import re -import shutil -import sys -import warnings -import zipfile - -from . import signatures -from .decorator import reify -from .paths import get_install_paths -from .pep425tags import get_supported -from .pkginfo import read_pkg_info_bytes -from .util import ( - urlsafe_b64encode, from_json, urlsafe_b64decode, native, binary, HashingFile, - open_for_csv) - -try: - _big_number = sys.maxsize -except NameError: - _big_number = sys.maxint - -# The next major version after this version of the 'wheel' tool: -VERSION_TOO_HIGH = (1, 0) - -# Non-greedy matching of an optional build number may be too clever (more -# invalid wheel filenames will match). Separate regex for .dist-info? -WHEEL_INFO_RE = re.compile( - r"""^(?P<namever>(?P<name>.+?)(-(?P<ver>\d.+?))?) - ((-(?P<build>\d.*?))?-(?P<pyver>.+?)-(?P<abi>.+?)-(?P<plat>.+?) - \.whl|\.dist-info)$""", - re.VERBOSE).match - - -def parse_version(version): - """Use parse_version from pkg_resources or distutils as available.""" - global parse_version - try: - from pkg_resources import parse_version - except ImportError: - from distutils.version import LooseVersion as parse_version - return parse_version(version) - - -class BadWheelFile(ValueError): - pass - - -class WheelFile(object): - """Parse wheel-specific attributes from a wheel (.whl) file and offer - basic installation and verification support.
- - WheelFile can be used to simply parse a wheel filename by avoiding the - methods that require the actual file contents.""" - - WHEEL_INFO = "WHEEL" - RECORD = "RECORD" - - def __init__(self, - filename, - fp=None, - append=False, - context=get_supported): - """ - :param fp: A seekable file-like object or None to open(filename). - :param append: Open archive in append mode. - :param context: Function returning list of supported tags. Wheels - must have the same context to be sortable. - """ - self.filename = filename - self.fp = fp - self.append = append - self.context = context - basename = os.path.basename(filename) - self.parsed_filename = WHEEL_INFO_RE(basename) - if not basename.endswith('.whl') or self.parsed_filename is None: - raise BadWheelFile("Bad filename '%s'" % filename) - - def __repr__(self): - return self.filename - - @property - def distinfo_name(self): - return "%s.dist-info" % self.parsed_filename.group('namever') - - @property - def datadir_name(self): - return "%s.data" % self.parsed_filename.group('namever') - - @property - def record_name(self): - return "%s/%s" % (self.distinfo_name, self.RECORD) - - @property - def wheelinfo_name(self): - return "%s/%s" % (self.distinfo_name, self.WHEEL_INFO) - - @property - def tags(self): - """A wheel file is compatible with the Cartesian product of the - period-delimited tags in its filename. - To choose a wheel file among several candidates having the same - distribution version 'ver', an installer ranks each triple of - (pyver, abi, plat) that its Python installation can run, sorting - the wheels by the best-ranked tag it supports and then by their - arity which is just len(list(compatibility_tags)). 
- """ - tags = self.parsed_filename.groupdict() - for pyver in tags['pyver'].split('.'): - for abi in tags['abi'].split('.'): - for plat in tags['plat'].split('.'): - yield (pyver, abi, plat) - - compatibility_tags = tags - - @property - def arity(self): - """The number of compatibility tags the wheel declares.""" - return len(list(self.compatibility_tags)) - - @property - def rank(self): - """ - Lowest index of any of this wheel's tags in self.context(), and the - arity e.g. (0, 1) - """ - return self.compatibility_rank(self.context()) - - @property - def compatible(self): - return self.rank[0] != _big_number # bad API! - - # deprecated: - def compatibility_rank(self, supported): - """Rank the wheel against the supported tags. Smaller ranks are more - compatible! - - :param supported: A list of compatibility tags that the current - Python implemenation can run. - """ - preferences = [] - for tag in self.compatibility_tags: - try: - preferences.append(supported.index(tag)) - # Tag not present - except ValueError: - pass - if len(preferences): - return (min(preferences), self.arity) - return (_big_number, 0) - - # deprecated - def supports_current_python(self, x): - assert self.context == x, 'context mismatch' - return self.compatible - - # Comparability. - # Wheels are equal if they refer to the same file. - # If two wheels are not equal, compare based on (in this order): - # 1. Name - # 2. Version - # 3. Compatibility rank - # 4. 
Filename (as a tiebreaker) - @property - def _sort_key(self): - return (self.parsed_filename.group('name'), - parse_version(self.parsed_filename.group('ver')), - tuple(-x for x in self.rank), - self.filename) - - def __eq__(self, other): - return self.filename == other.filename - - def __ne__(self, other): - return self.filename != other.filename - - def __lt__(self, other): - if self.context != other.context: - raise TypeError("{0}.context != {1}.context".format(self, other)) - - return self._sort_key < other._sort_key - - # XXX prune - - sn = self.parsed_filename.group('name') - on = other.parsed_filename.group('name') - if sn != on: - return sn < on - sv = parse_version(self.parsed_filename.group('ver')) - ov = parse_version(other.parsed_filename.group('ver')) - if sv != ov: - return sv < ov - # Compatibility - if self.context != other.context: - raise TypeError("{0}.context != {1}.context".format(self, other)) - sc = self.rank - oc = other.rank - if sc is not None and oc is not None and sc != oc: - # Smaller compatibility ranks are "better" than larger ones, - # so we have to reverse the sense of the comparison here! 
- return sc > oc - elif sc is None and oc is not None: - return False - return self.filename < other.filename - - def __gt__(self, other): - return other < self - - def __le__(self, other): - return self == other or self < other - - def __ge__(self, other): - return self == other or other < self - - # - # Methods using the file's contents: - # - - @reify - def zipfile(self): - mode = "r" - if self.append: - mode = "a" - vzf = VerifyingZipFile(self.fp if self.fp else self.filename, mode) - if not self.append: - self.verify(vzf) - return vzf - - @reify - def parsed_wheel_info(self): - """Parse wheel metadata (the .data/WHEEL file)""" - return read_pkg_info_bytes(self.zipfile.read(self.wheelinfo_name)) - - def check_version(self): - version = self.parsed_wheel_info['Wheel-Version'] - if tuple(map(int, version.split('.'))) >= VERSION_TOO_HIGH: - raise ValueError("Wheel version is too high") - - @reify - def install_paths(self): - """ - Consult distutils to get the install paths for our dist. A dict with - ('purelib', 'platlib', 'headers', 'scripts', 'data'). - - We use the name from our filename as the dist name, which means headers - could be installed in the wrong place if the filesystem-escaped name - is different than the Name. Who cares? - """ - name = self.parsed_filename.group('name') - return get_install_paths(name) - - def install(self, force=False, overrides={}): - """ - Install the wheel into site-packages. - """ - - # Utility to get the target directory for a particular key - def get_path(key): - return overrides.get(key) or self.install_paths[key] - - # The base target location is either purelib or platlib - if self.parsed_wheel_info['Root-Is-Purelib'] == 'true': - root = get_path('purelib') - else: - root = get_path('platlib') - - # Parse all the names in the archive - name_trans = {} - for info in self.zipfile.infolist(): - name = info.filename - # Zip files can contain entries representing directories. - # These end in a '/'. 
- # We ignore these, as we create directories on demand. - if name.endswith('/'): - continue - - # Pathnames in a zipfile namelist are always /-separated. - # In theory, paths could start with ./ or have other oddities - # but this won't happen in practical cases of well-formed wheels. - # We'll cover the simple case of an initial './' as it's both easy - # to do and more common than most other oddities. - if name.startswith('./'): - name = name[2:] - - # Split off the base directory to identify files that are to be - # installed in non-root locations - basedir, sep, filename = name.partition('/') - if sep and basedir == self.datadir_name: - # Data file. Target destination is elsewhere - key, sep, filename = filename.partition('/') - if not sep: - raise ValueError("Invalid filename in wheel: {0}".format(name)) - target = get_path(key) - else: - # Normal file. Target destination is root - key = '' - target = root - filename = name - - # Map the actual filename from the zipfile to its intended target - # directory and the pathname relative to that directory. - dest = os.path.normpath(os.path.join(target, filename)) - name_trans[info] = (key, target, filename, dest) - - # We're now ready to start processing the actual install. The process - # is as follows: - # 1. Prechecks - is the wheel valid, is its declared architecture - # OK, etc. [[Responsibility of the caller]] - # 2. Overwrite check - do any of the files to be installed already - # exist? - # 3. Actual install - put the files in their target locations. - # 4. Update RECORD - write a suitably modified RECORD file to - # reflect the actual installed paths. - - if not force: - for info, v in name_trans.items(): - k = info.filename - key, target, filename, dest = v - if os.path.exists(dest): - raise ValueError( - "Wheel file {0} would overwrite {1}. Use force if this is intended".format( - k, dest)) - - # Get the name of our executable, for use when replacing script - # wrapper hashbang lines. 
- # We encode it using getfilesystemencoding, as that is "the name of - # the encoding used to convert Unicode filenames into system file - # names". - exename = sys.executable.encode(sys.getfilesystemencoding()) - record_data = [] - record_name = self.distinfo_name + '/RECORD' - for info, (key, target, filename, dest) in name_trans.items(): - name = info.filename - source = self.zipfile.open(info) - # Skip the RECORD file - if name == record_name: - continue - ddir = os.path.dirname(dest) - if not os.path.isdir(ddir): - os.makedirs(ddir) - - temp_filename = dest + '.part' - try: - with HashingFile(temp_filename, 'wb') as destination: - if key == 'scripts': - hashbang = source.readline() - if hashbang.startswith(b'#!python'): - hashbang = b'#!' + exename + binary(os.linesep) - destination.write(hashbang) - - shutil.copyfileobj(source, destination) - except: - if os.path.exists(temp_filename): - os.unlink(temp_filename) - - raise - - os.rename(temp_filename, dest) - reldest = os.path.relpath(dest, root) - reldest.replace(os.sep, '/') - record_data.append((reldest, destination.digest(), destination.length)) - destination.close() - source.close() - # preserve attributes (especially +x bit for scripts) - attrs = info.external_attr >> 16 - if attrs: # tends to be 0 if Windows. - os.chmod(dest, info.external_attr >> 16) - - record_name = os.path.join(root, self.record_name) - with open_for_csv(record_name, 'w+') as record_file: - writer = csv.writer(record_file) - for reldest, digest, length in sorted(record_data): - writer.writerow((reldest, digest, length)) - writer.writerow((self.record_name, '', '')) - - def verify(self, zipfile=None): - """Configure the VerifyingZipFile `zipfile` by verifying its signature - and setting expected hashes for every hash in RECORD. - Caller must complete the verification process by completely reading - every file in the archive (e.g. 
with extractall).""" - sig = None - if zipfile is None: - zipfile = self.zipfile - zipfile.strict = True - - record_name = '/'.join((self.distinfo_name, 'RECORD')) - sig_name = '/'.join((self.distinfo_name, 'RECORD.jws')) - # tolerate s/mime signatures: - smime_sig_name = '/'.join((self.distinfo_name, 'RECORD.p7s')) - zipfile.set_expected_hash(record_name, None) - zipfile.set_expected_hash(sig_name, None) - zipfile.set_expected_hash(smime_sig_name, None) - record = zipfile.read(record_name) - - record_digest = urlsafe_b64encode(hashlib.sha256(record).digest()) - try: - sig = from_json(native(zipfile.read(sig_name))) - except KeyError: # no signature - pass - if sig: - headers, payload = signatures.verify(sig) - if payload['hash'] != "sha256=" + native(record_digest): - msg = "RECORD.sig claimed RECORD hash {0} != computed hash {1}." - raise BadWheelFile(msg.format(payload['hash'], - native(record_digest))) - - reader = csv.reader((native(r) for r in record.splitlines())) - - for row in reader: - filename = row[0] - hash = row[1] - if not hash: - if filename not in (record_name, sig_name): - sys.stderr.write("%s has no hash!\n" % filename) - continue - algo, data = row[1].split('=', 1) - assert algo == "sha256", "Unsupported hash algorithm" - zipfile.set_expected_hash(filename, urlsafe_b64decode(binary(data))) - - -class VerifyingZipFile(zipfile.ZipFile): - """ZipFile that can assert that each of its extracted contents matches - an expected sha256 hash. 
Note that each file must be completely read in
-    order for its hash to be checked."""
-
-    def __init__(self, file, mode="r",
-                 compression=zipfile.ZIP_STORED,
-                 allowZip64=False):
-        zipfile.ZipFile.__init__(self, file, mode, compression, allowZip64)
-
-        self.strict = False
-        self._expected_hashes = {}
-        self._hash_algorithm = hashlib.sha256
-
-    def set_expected_hash(self, name, hash):
-        """
-        :param name: name of zip entry
-        :param hash: bytes of hash (or None for "don't care")
-        """
-        self._expected_hashes[name] = hash
-
-    def open(self, name_or_info, mode="r", pwd=None):
-        """Return file-like object for 'name'."""
-        # A non-monkey-patched version would contain most of zipfile.py
-        ef = zipfile.ZipFile.open(self, name_or_info, mode, pwd)
-        if isinstance(name_or_info, zipfile.ZipInfo):
-            name = name_or_info.filename
-        else:
-            name = name_or_info
-
-        if name in self._expected_hashes and self._expected_hashes[name] is not None:
-            expected_hash = self._expected_hashes[name]
-            try:
-                _update_crc_orig = ef._update_crc
-            except AttributeError:
-                warnings.warn('Need ZipExtFile._update_crc to implement '
-                              'file hash verification (in Python >= 2.7)')
-                return ef
-            running_hash = self._hash_algorithm()
-            if hasattr(ef, '_eof'):  # py33
-                def _update_crc(data):
-                    _update_crc_orig(data)
-                    running_hash.update(data)
-                    if ef._eof and running_hash.digest() != expected_hash:
-                        raise BadWheelFile("Bad hash for file %r" % ef.name)
-            else:
-                def _update_crc(data, eof=None):
-                    _update_crc_orig(data, eof=eof)
-                    running_hash.update(data)
-                    if eof and running_hash.digest() != expected_hash:
-                        raise BadWheelFile("Bad hash for file %r" % ef.name)
-            ef._update_crc = _update_crc
-        elif self.strict and name not in self._expected_hashes:
-            raise BadWheelFile("No expected hash for file %r" % ef.name)
-        return ef
-
-    def pop(self):
-        """Truncate the last file off this zipfile.
- Assumes infolist() is in the same order as the files (true for - ordinary zip files created by Python)""" - if not self.fp: - raise RuntimeError( - "Attempt to pop from ZIP archive that was already closed") - last = self.infolist().pop() - del self.NameToInfo[last.filename] - self.fp.seek(last.header_offset, os.SEEK_SET) - self.fp.truncate() - self._didModify = True diff --git a/lib/python3.4/site-packages/wheel/metadata.py b/lib/python3.4/site-packages/wheel/metadata.py deleted file mode 100644 index 29638e7..0000000 --- a/lib/python3.4/site-packages/wheel/metadata.py +++ /dev/null @@ -1,338 +0,0 @@ -""" -Tools for converting old- to new-style metadata. -""" - -import email.parser -import os.path -import re -import textwrap -from collections import namedtuple, OrderedDict - -import pkg_resources - -from . import __version__ as wheel_version -from .pkginfo import read_pkg_info -from .util import OrderedDefaultDict - -METADATA_VERSION = "2.0" - -PLURAL_FIELDS = {"classifier": "classifiers", - "provides_dist": "provides", - "provides_extra": "extras"} - -SKIP_FIELDS = set() - -CONTACT_FIELDS = (({"email": "author_email", "name": "author"}, - "author"), - ({"email": "maintainer_email", "name": "maintainer"}, - "maintainer")) - -# commonly filled out as "UNKNOWN" by distutils: -UNKNOWN_FIELDS = {"author", "author_email", "platform", "home_page", "license"} - -# Wheel itself is probably the only program that uses non-extras markers -# in METADATA/PKG-INFO. Support its syntax with the extra at the end only. -EXTRA_RE = re.compile("""^(?P.*?)(;\s*(?P.*?)(extra == '(?P.*?)')?)$""") -KEYWORDS_RE = re.compile("[\0-,]+") - -MayRequiresKey = namedtuple('MayRequiresKey', ('condition', 'extra')) - - -def unique(iterable): - """ - Yield unique values in iterable, preserving order. 
- """ - seen = set() - for value in iterable: - if value not in seen: - seen.add(value) - yield value - - -def handle_requires(metadata, pkg_info, key): - """ - Place the runtime requirements from pkg_info into metadata. - """ - may_requires = OrderedDefaultDict(list) - for value in sorted(pkg_info.get_all(key)): - extra_match = EXTRA_RE.search(value) - if extra_match: - groupdict = extra_match.groupdict() - condition = groupdict['condition'] - extra = groupdict['extra'] - package = groupdict['package'] - if condition.endswith(' and '): - condition = condition[:-5] - else: - condition, extra = None, None - package = value - key = MayRequiresKey(condition, extra) - may_requires[key].append(package) - - if may_requires: - metadata['run_requires'] = [] - - def sort_key(item): - # Both condition and extra could be None, which can't be compared - # against strings in Python 3. - key, value = item - if key.condition is None: - return '' - return key.condition - - for key, value in sorted(may_requires.items(), key=sort_key): - may_requirement = OrderedDict((('requires', value),)) - if key.extra: - may_requirement['extra'] = key.extra - if key.condition: - may_requirement['environment'] = key.condition - metadata['run_requires'].append(may_requirement) - - if 'extras' not in metadata: - metadata['extras'] = [] - metadata['extras'].extend([key.extra for key in may_requires.keys() if key.extra]) - - -def pkginfo_to_dict(path, distribution=None): - """ - Convert PKG-INFO to a prototype Metadata 2.0 (PEP 426) dict. - - The description is included under the key ['description'] rather than - being written to a separate file. 
- - path: path to PKG-INFO file - distribution: optional distutils Distribution() - """ - - metadata = OrderedDefaultDict( - lambda: OrderedDefaultDict(lambda: OrderedDefaultDict(OrderedDict))) - metadata["generator"] = "bdist_wheel (" + wheel_version + ")" - try: - unicode - pkg_info = read_pkg_info(path) - except NameError: - with open(path, 'rb') as pkg_info_file: - pkg_info = email.parser.Parser().parsestr(pkg_info_file.read().decode('utf-8')) - description = None - - if pkg_info['Summary']: - metadata['summary'] = pkginfo_unicode(pkg_info, 'Summary') - del pkg_info['Summary'] - - if pkg_info['Description']: - description = dedent_description(pkg_info) - del pkg_info['Description'] - else: - payload = pkg_info.get_payload() - if isinstance(payload, bytes): - # Avoid a Python 2 Unicode error. - # We still suffer ? glyphs on Python 3. - payload = payload.decode('utf-8') - if payload: - description = payload - - if description: - pkg_info['description'] = description - - for key in sorted(unique(k.lower() for k in pkg_info.keys())): - low_key = key.replace('-', '_') - - if low_key in SKIP_FIELDS: - continue - - if low_key in UNKNOWN_FIELDS and pkg_info.get(key) == 'UNKNOWN': - continue - - if low_key in sorted(PLURAL_FIELDS): - metadata[PLURAL_FIELDS[low_key]] = pkg_info.get_all(key) - - elif low_key == "requires_dist": - handle_requires(metadata, pkg_info, key) - - elif low_key == 'provides_extra': - if 'extras' not in metadata: - metadata['extras'] = [] - metadata['extras'].extend(pkg_info.get_all(key)) - - elif low_key == 'home_page': - metadata['extensions']['python.details']['project_urls'] = {'Home': pkg_info[key]} - - elif low_key == 'keywords': - metadata['keywords'] = KEYWORDS_RE.split(pkg_info[key]) - - else: - metadata[low_key] = pkg_info[key] - - metadata['metadata_version'] = METADATA_VERSION - - if 'extras' in metadata: - metadata['extras'] = sorted(set(metadata['extras'])) - - # include more information if distribution is available - if 
distribution: - for requires, attr in (('test_requires', 'tests_require'),): - try: - requirements = getattr(distribution, attr) - if isinstance(requirements, list): - new_requirements = sorted(convert_requirements(requirements)) - metadata[requires] = [{'requires': new_requirements}] - except AttributeError: - pass - - # handle contacts - contacts = [] - for contact_type, role in CONTACT_FIELDS: - contact = OrderedDict() - for key in sorted(contact_type): - if contact_type[key] in metadata: - contact[key] = metadata.pop(contact_type[key]) - if contact: - contact['role'] = role - contacts.append(contact) - if contacts: - metadata['extensions']['python.details']['contacts'] = contacts - - # convert entry points to exports - try: - with open(os.path.join(os.path.dirname(path), "entry_points.txt"), "r") as ep_file: - ep_map = pkg_resources.EntryPoint.parse_map(ep_file.read()) - exports = OrderedDict() - for group, items in sorted(ep_map.items()): - exports[group] = OrderedDict() - for item in sorted(map(str, items.values())): - name, export = item.split(' = ', 1) - exports[group][name] = export - if exports: - metadata['extensions']['python.exports'] = exports - except IOError: - pass - - # copy console_scripts entry points to commands - if 'python.exports' in metadata['extensions']: - for (ep_script, wrap_script) in (('console_scripts', 'wrap_console'), - ('gui_scripts', 'wrap_gui')): - if ep_script in metadata['extensions']['python.exports']: - metadata['extensions']['python.commands'][wrap_script] = \ - metadata['extensions']['python.exports'][ep_script] - - return metadata - - -def requires_to_requires_dist(requirement): - """Compose the version predicates for requirement in PEP 345 fashion.""" - requires_dist = [] - for op, ver in requirement.specs: - requires_dist.append(op + ver) - if not requires_dist: - return '' - return " (%s)" % ','.join(sorted(requires_dist)) - - -def convert_requirements(requirements): - """Yield Requires-Dist: strings for parsed 
requirements strings.""" - for req in requirements: - parsed_requirement = pkg_resources.Requirement.parse(req) - spec = requires_to_requires_dist(parsed_requirement) - extras = ",".join(parsed_requirement.extras) - if extras: - extras = "[%s]" % extras - yield (parsed_requirement.project_name + extras + spec) - - -def generate_requirements(extras_require): - """ - Convert requirements from a setup()-style dictionary to ('Requires-Dist', 'requirement') - and ('Provides-Extra', 'extra') tuples. - - extras_require is a dictionary of {extra: [requirements]} as passed to setup(), - using the empty extra {'': [requirements]} to hold install_requires. - """ - for extra, depends in extras_require.items(): - condition = '' - if extra and ':' in extra: # setuptools extra:condition syntax - extra, condition = extra.split(':', 1) - extra = pkg_resources.safe_extra(extra) - if extra: - yield ('Provides-Extra', extra) - if condition: - condition += " and " - condition += "extra == '%s'" % extra - if condition: - condition = '; ' + condition - for new_req in convert_requirements(depends): - yield ('Requires-Dist', new_req + condition) - - -def pkginfo_to_metadata(egg_info_path, pkginfo_path): - """ - Convert .egg-info directory with PKG-INFO to the Metadata 1.3 aka - old-draft Metadata 2.0 format. 
- """ - pkg_info = read_pkg_info(pkginfo_path) - pkg_info.replace_header('Metadata-Version', '2.0') - requires_path = os.path.join(egg_info_path, 'requires.txt') - if os.path.exists(requires_path): - with open(requires_path) as requires_file: - requires = requires_file.read() - for extra, reqs in sorted(pkg_resources.split_sections(requires), - key=lambda x: x[0] or ''): - for item in generate_requirements({extra: reqs}): - pkg_info[item[0]] = item[1] - - description = pkg_info['Description'] - if description: - pkg_info.set_payload(dedent_description(pkg_info)) - del pkg_info['Description'] - - return pkg_info - - -def pkginfo_unicode(pkg_info, field): - """Hack to coax Unicode out of an email Message() - Python 3.3+""" - text = pkg_info[field] - field = field.lower() - if not isinstance(text, str): - if not hasattr(pkg_info, 'raw_items'): # Python 3.2 - return str(text) - for item in pkg_info.raw_items(): - if item[0].lower() == field: - text = item[1].encode('ascii', 'surrogateescape') \ - .decode('utf-8') - break - - return text - - -def dedent_description(pkg_info): - """ - Dedent and convert pkg_info['Description'] to Unicode. - """ - description = pkg_info['Description'] - - # Python 3 Unicode handling, sorta. - surrogates = False - if not isinstance(description, str): - surrogates = True - description = pkginfo_unicode(pkg_info, 'Description') - - description_lines = description.splitlines() - description_dedent = '\n'.join( - # if the first line of long_description is blank, - # the first line here will be indented. 
- (description_lines[0].lstrip(), - textwrap.dedent('\n'.join(description_lines[1:])), - '\n')) - - if surrogates: - description_dedent = description_dedent \ - .encode("utf8") \ - .decode("ascii", "surrogateescape") - - return description_dedent - - -if __name__ == "__main__": - import sys - import pprint - - pprint.pprint(pkginfo_to_dict(sys.argv[1])) diff --git a/lib/python3.4/site-packages/wheel/paths.py b/lib/python3.4/site-packages/wheel/paths.py deleted file mode 100644 index afb3cae..0000000 --- a/lib/python3.4/site-packages/wheel/paths.py +++ /dev/null @@ -1,43 +0,0 @@ -""" -Installation paths. - -Map the .data/ subdirectory names to install paths. -""" - -import distutils.command.install as install -import distutils.dist as dist -import os.path -import sys - - -def get_install_command(name): - # late binding due to potential monkeypatching - d = dist.Distribution({'name': name}) - i = install.install(d) - i.finalize_options() - return i - - -def get_install_paths(name): - """ - Return the (distutils) install paths for the named dist. - - A dict with ('purelib', 'platlib', 'headers', 'scripts', 'data') keys. 
- """ - paths = {} - - i = get_install_command(name) - - for key in install.SCHEME_KEYS: - paths[key] = getattr(i, 'install_' + key) - - # pip uses a similar path as an alternative to the system's (read-only) - # include directory: - if hasattr(sys, 'real_prefix'): # virtualenv - paths['headers'] = os.path.join(sys.prefix, - 'include', - 'site', - 'python' + sys.version[:3], - name) - - return paths diff --git a/lib/python3.4/site-packages/wheel/pep425tags.py b/lib/python3.4/site-packages/wheel/pep425tags.py deleted file mode 100644 index 29afdc3..0000000 --- a/lib/python3.4/site-packages/wheel/pep425tags.py +++ /dev/null @@ -1,180 +0,0 @@ -"""Generate and work with PEP 425 Compatibility Tags.""" - -import distutils.util -import platform -import sys -import sysconfig -import warnings - - -def get_config_var(var): - try: - return sysconfig.get_config_var(var) - except IOError as e: # pip Issue #1074 - warnings.warn("{0}".format(e), RuntimeWarning) - return None - - -def get_abbr_impl(): - """Return abbreviated implementation name.""" - impl = platform.python_implementation() - if impl == 'PyPy': - return 'pp' - elif impl == 'Jython': - return 'jy' - elif impl == 'IronPython': - return 'ip' - elif impl == 'CPython': - return 'cp' - - raise LookupError('Unknown Python implementation: ' + impl) - - -def get_impl_ver(): - """Return implementation version.""" - impl_ver = get_config_var("py_version_nodot") - if not impl_ver or get_abbr_impl() == 'pp': - impl_ver = ''.join(map(str, get_impl_version_info())) - return impl_ver - - -def get_impl_version_info(): - """Return sys.version_info-like tuple for use in decrementing the minor - version.""" - if get_abbr_impl() == 'pp': - # as per https://github.com/pypa/pip/issues/2882 - return (sys.version_info[0], sys.pypy_version_info.major, - sys.pypy_version_info.minor) - else: - return sys.version_info[0], sys.version_info[1] - - -def get_flag(var, fallback, expected=True, warn=True): - """Use a fallback method for determining 
SOABI flags if the needed config - var is unset or unavailable.""" - val = get_config_var(var) - if val is None: - if warn: - warnings.warn("Config variable '{0}' is unset, Python ABI tag may " - "be incorrect".format(var), RuntimeWarning, 2) - return fallback() - return val == expected - - -def get_abi_tag(): - """Return the ABI tag based on SOABI (if available) or emulate SOABI - (CPython 2, PyPy).""" - soabi = get_config_var('SOABI') - impl = get_abbr_impl() - if not soabi and impl in ('cp', 'pp') and hasattr(sys, 'maxunicode'): - d = '' - m = '' - u = '' - if get_flag('Py_DEBUG', - lambda: hasattr(sys, 'gettotalrefcount'), - warn=(impl == 'cp')): - d = 'd' - if get_flag('WITH_PYMALLOC', - lambda: impl == 'cp', - warn=(impl == 'cp')): - m = 'm' - if get_flag('Py_UNICODE_SIZE', - lambda: sys.maxunicode == 0x10ffff, - expected=4, - warn=(impl == 'cp' and - sys.version_info < (3, 3))) \ - and sys.version_info < (3, 3): - u = 'u' - abi = '%s%s%s%s%s' % (impl, get_impl_ver(), d, m, u) - elif soabi and soabi.startswith('cpython-'): - abi = 'cp' + soabi.split('-')[1] - elif soabi: - abi = soabi.replace('.', '_').replace('-', '_') - else: - abi = None - return abi - - -def get_platform(): - """Return our platform name 'win32', 'linux_x86_64'""" - # XXX remove distutils dependency - result = distutils.util.get_platform().replace('.', '_').replace('-', '_') - if result == "linux_x86_64" and sys.maxsize == 2147483647: - # pip pull request #3497 - result = "linux_i686" - return result - - -def get_supported(versions=None, supplied_platform=None): - """Return a list of supported tags for each version specified in - `versions`. - - :param versions: a list of string versions, of the form ["33", "32"], - or None. The first version will be assumed to support our ABI. 
- """ - supported = [] - - # Versions must be given with respect to the preference - if versions is None: - versions = [] - version_info = get_impl_version_info() - major = version_info[:-1] - # Support all previous minor Python versions. - for minor in range(version_info[-1], -1, -1): - versions.append(''.join(map(str, major + (minor,)))) - - impl = get_abbr_impl() - - abis = [] - - abi = get_abi_tag() - if abi: - abis[0:0] = [abi] - - abi3s = set() - import imp - for suffix in imp.get_suffixes(): - if suffix[0].startswith('.abi'): - abi3s.add(suffix[0].split('.', 2)[1]) - - abis.extend(sorted(list(abi3s))) - - abis.append('none') - - platforms = [] - if supplied_platform: - platforms.append(supplied_platform) - platforms.append(get_platform()) - - # Current version, current API (built specifically for our Python): - for abi in abis: - for arch in platforms: - supported.append(('%s%s' % (impl, versions[0]), abi, arch)) - - # abi3 modules compatible with older version of Python - for version in versions[1:]: - # abi3 was introduced in Python 3.2 - if version in ('31', '30'): - break - for abi in abi3s: # empty set if not Python 3 - for arch in platforms: - supported.append(("%s%s" % (impl, version), abi, arch)) - - # No abi / arch, but requires our implementation: - for i, version in enumerate(versions): - supported.append(('%s%s' % (impl, version), 'none', 'any')) - if i == 0: - # Tagged specifically as being cross-version compatible - # (with just the major version specified) - supported.append(('%s%s' % (impl, versions[0][0]), 'none', 'any')) - - # Major Python version + platform; e.g. 
binaries not using the Python API - supported.append(('py%s' % (versions[0][0]), 'none', arch)) - - # No abi / arch, generic Python - for i, version in enumerate(versions): - supported.append(('py%s' % (version,), 'none', 'any')) - if i == 0: - supported.append(('py%s' % (version[0]), 'none', 'any')) - - return supported diff --git a/lib/python3.4/site-packages/wheel/pkginfo.py b/lib/python3.4/site-packages/wheel/pkginfo.py deleted file mode 100644 index 115be45..0000000 --- a/lib/python3.4/site-packages/wheel/pkginfo.py +++ /dev/null @@ -1,43 +0,0 @@ -"""Tools for reading and writing PKG-INFO / METADATA without caring -about the encoding.""" - -from email.parser import Parser - -try: - unicode - _PY3 = False -except NameError: - _PY3 = True - -if not _PY3: - from email.generator import Generator - - def read_pkg_info_bytes(bytestr): - return Parser().parsestr(bytestr) - - def read_pkg_info(path): - with open(path, "r") as headers: - message = Parser().parse(headers) - return message - - def write_pkg_info(path, message): - with open(path, 'w') as metadata: - Generator(metadata, mangle_from_=False, maxheaderlen=0).flatten(message) -else: - from email.generator import BytesGenerator - - def read_pkg_info_bytes(bytestr): - headers = bytestr.decode(encoding="ascii", errors="surrogateescape") - message = Parser().parsestr(headers) - return message - - def read_pkg_info(path): - with open(path, "r", - encoding="ascii", - errors="surrogateescape") as headers: - message = Parser().parse(headers) - return message - - def write_pkg_info(path, message): - with open(path, "wb") as out: - BytesGenerator(out, mangle_from_=False, maxheaderlen=0).flatten(message) diff --git a/lib/python3.4/site-packages/wheel/signatures/__init__.py b/lib/python3.4/site-packages/wheel/signatures/__init__.py deleted file mode 100644 index e7a5331..0000000 --- a/lib/python3.4/site-packages/wheel/signatures/__init__.py +++ /dev/null @@ -1,110 +0,0 @@ -""" -Create and verify jws-js format Ed25519 
signatures. -""" - -import json -from ..util import urlsafe_b64decode, urlsafe_b64encode, native, binary - -__all__ = ['sign', 'verify'] - -ed25519ll = None - -ALG = "Ed25519" - - -def get_ed25519ll(): - """Lazy import-and-test of ed25519 module""" - global ed25519ll - - if not ed25519ll: - try: - import ed25519ll # fast (thousands / s) - except (ImportError, OSError): # pragma nocover - from . import ed25519py as ed25519ll # pure Python (hundreds / s) - test() - - return ed25519ll - - -def sign(payload, keypair): - """Return a JWS-JS format signature given a JSON-serializable payload and - an Ed25519 keypair.""" - get_ed25519ll() - # - header = { - "alg": ALG, - "jwk": { - "kty": ALG, # alg -> kty in jwk-08. - "vk": native(urlsafe_b64encode(keypair.vk)) - } - } - - encoded_header = urlsafe_b64encode(binary(json.dumps(header, sort_keys=True))) - encoded_payload = urlsafe_b64encode(binary(json.dumps(payload, sort_keys=True))) - secured_input = b".".join((encoded_header, encoded_payload)) - sig_msg = ed25519ll.crypto_sign(secured_input, keypair.sk) - signature = sig_msg[:ed25519ll.SIGNATUREBYTES] - encoded_signature = urlsafe_b64encode(signature) - - return {"recipients": - [{"header": native(encoded_header), - "signature": native(encoded_signature)}], - "payload": native(encoded_payload)} - - -def assertTrue(condition, message=""): - if not condition: - raise ValueError(message) - - -def verify(jwsjs): - """Return (decoded headers, payload) if all signatures in jwsjs are - consistent, else raise ValueError. 
- - Caller must decide whether the keys are actually trusted.""" - get_ed25519ll() - # XXX forbid duplicate keys in JSON input using object_pairs_hook (2.7+) - recipients = jwsjs["recipients"] - encoded_payload = binary(jwsjs["payload"]) - headers = [] - for recipient in recipients: - assertTrue(len(recipient) == 2, "Unknown recipient key {0}".format(recipient)) - h = binary(recipient["header"]) - s = binary(recipient["signature"]) - header = json.loads(native(urlsafe_b64decode(h))) - assertTrue(header["alg"] == ALG, - "Unexpected algorithm {0}".format(header["alg"])) - if "alg" in header["jwk"] and "kty" not in header["jwk"]: - header["jwk"]["kty"] = header["jwk"]["alg"] # b/w for JWK < -08 - assertTrue(header["jwk"]["kty"] == ALG, # true for Ed25519 - "Unexpected key type {0}".format(header["jwk"]["kty"])) - vk = urlsafe_b64decode(binary(header["jwk"]["vk"])) - secured_input = b".".join((h, encoded_payload)) - sig = urlsafe_b64decode(s) - sig_msg = sig+secured_input - verified_input = native(ed25519ll.crypto_sign_open(sig_msg, vk)) - verified_header, verified_payload = verified_input.split('.') - verified_header = binary(verified_header) - decoded_header = native(urlsafe_b64decode(verified_header)) - headers.append(json.loads(decoded_header)) - - verified_payload = binary(verified_payload) - - # only return header, payload that have passed through the crypto library. 
- payload = json.loads(native(urlsafe_b64decode(verified_payload))) - - return headers, payload - - -def test(): - kp = ed25519ll.crypto_sign_keypair() - payload = {'test': 'onstartup'} - jwsjs = json.loads(json.dumps(sign(payload, kp))) - verify(jwsjs) - jwsjs['payload'] += 'x' - try: - verify(jwsjs) - except ValueError: - pass - else: # pragma no cover - raise RuntimeError("No error from bad wheel.signatures payload.") diff --git a/lib/python3.4/site-packages/wheel/signatures/djbec.py b/lib/python3.4/site-packages/wheel/signatures/djbec.py deleted file mode 100644 index 87f72d4..0000000 --- a/lib/python3.4/site-packages/wheel/signatures/djbec.py +++ /dev/null @@ -1,323 +0,0 @@ -# Ed25519 digital signatures -# Based on http://ed25519.cr.yp.to/python/ed25519.py -# See also http://ed25519.cr.yp.to/software.html -# Adapted by Ron Garret -# Sped up considerably using coordinate transforms found on: -# http://www.hyperelliptic.org/EFD/g1p/auto-twisted-extended-1.html -# Specifically add-2008-hwcd-4 and dbl-2008-hwcd - -import hashlib -import random - -try: # pragma nocover - unicode - PY3 = False - - def asbytes(b): - """Convert array of integers to byte string""" - return ''.join(chr(x) for x in b) - - def joinbytes(b): - """Convert array of bytes to byte string""" - return ''.join(b) - - def bit(h, i): - """Return i'th bit of bytestring h""" - return (ord(h[i // 8]) >> (i % 8)) & 1 -except NameError: # pragma nocover - PY3 = True - asbytes = bytes - joinbytes = bytes - - def bit(h, i): - return (h[i // 8] >> (i % 8)) & 1 - -b = 256 -q = 2 ** 255 - 19 -l = 2 ** 252 + 27742317777372353535851937790883648493 - - -def H(m): - return hashlib.sha512(m).digest() - - -def expmod(b, e, m): - if e == 0: - return 1 - - t = expmod(b, e // 2, m) ** 2 % m - if e & 1: - t = (t * b) % m - - return t - - -# Can probably get some extra speedup here by replacing this with -# an extended-euclidean, but performance seems OK without that -def inv(x): - return expmod(x, q - 2, q) - - -d = 
-121665 * inv(121666) -I = expmod(2, (q - 1) // 4, q) - - -def xrecover(y): - xx = (y * y - 1) * inv(d * y * y + 1) - x = expmod(xx, (q + 3) // 8, q) - if (x * x - xx) % q != 0: - x = (x * I) % q - - if x % 2 != 0: - x = q - x - - return x - - -By = 4 * inv(5) -Bx = xrecover(By) -B = [Bx % q, By % q] - - -# def edwards(P,Q): -# x1 = P[0] -# y1 = P[1] -# x2 = Q[0] -# y2 = Q[1] -# x3 = (x1*y2+x2*y1) * inv(1+d*x1*x2*y1*y2) -# y3 = (y1*y2+x1*x2) * inv(1-d*x1*x2*y1*y2) -# return (x3 % q,y3 % q) - -# def scalarmult(P,e): -# if e == 0: return [0,1] -# Q = scalarmult(P,e/2) -# Q = edwards(Q,Q) -# if e & 1: Q = edwards(Q,P) -# return Q - -# Faster (!) version based on: -# http://www.hyperelliptic.org/EFD/g1p/auto-twisted-extended-1.html - -def xpt_add(pt1, pt2): - (X1, Y1, Z1, T1) = pt1 - (X2, Y2, Z2, T2) = pt2 - A = ((Y1 - X1) * (Y2 + X2)) % q - B = ((Y1 + X1) * (Y2 - X2)) % q - C = (Z1 * 2 * T2) % q - D = (T1 * 2 * Z2) % q - E = (D + C) % q - F = (B - A) % q - G = (B + A) % q - H = (D - C) % q - X3 = (E * F) % q - Y3 = (G * H) % q - Z3 = (F * G) % q - T3 = (E * H) % q - return (X3, Y3, Z3, T3) - - -def xpt_double(pt): - (X1, Y1, Z1, _) = pt - A = (X1 * X1) - B = (Y1 * Y1) - C = (2 * Z1 * Z1) - D = (-A) % q - J = (X1 + Y1) % q - E = (J * J - A - B) % q - G = (D + B) % q - F = (G - C) % q - H = (D - B) % q - X3 = (E * F) % q - Y3 = (G * H) % q - Z3 = (F * G) % q - T3 = (E * H) % q - return X3, Y3, Z3, T3 - - -def pt_xform(pt): - (x, y) = pt - return x, y, 1, (x * y) % q - - -def pt_unxform(pt): - (x, y, z, _) = pt - return (x * inv(z)) % q, (y * inv(z)) % q - - -def xpt_mult(pt, n): - if n == 0: - return pt_xform((0, 1)) - - _ = xpt_double(xpt_mult(pt, n >> 1)) - return xpt_add(_, pt) if n & 1 else _ - - -def scalarmult(pt, e): - return pt_unxform(xpt_mult(pt_xform(pt), e)) - - -def encodeint(y): - bits = [(y >> i) & 1 for i in range(b)] - e = [(sum([bits[i * 8 + j] << j for j in range(8)])) - for i in range(b // 8)] - return asbytes(e) - - -def encodepoint(P): - x = P[0] - 
y = P[1] - bits = [(y >> i) & 1 for i in range(b - 1)] + [x & 1] - e = [(sum([bits[i * 8 + j] << j for j in range(8)])) - for i in range(b // 8)] - return asbytes(e) - - -def publickey(sk): - h = H(sk) - a = 2 ** (b - 2) + sum(2 ** i * bit(h, i) for i in range(3, b - 2)) - A = scalarmult(B, a) - return encodepoint(A) - - -def Hint(m): - h = H(m) - return sum(2 ** i * bit(h, i) for i in range(2 * b)) - - -def signature(m, sk, pk): - h = H(sk) - a = 2 ** (b - 2) + sum(2 ** i * bit(h, i) for i in range(3, b - 2)) - inter = joinbytes([h[i] for i in range(b // 8, b // 4)]) - r = Hint(inter + m) - R = scalarmult(B, r) - S = (r + Hint(encodepoint(R) + pk + m) * a) % l - return encodepoint(R) + encodeint(S) - - -def isoncurve(P): - x = P[0] - y = P[1] - return (-x * x + y * y - 1 - d * x * x * y * y) % q == 0 - - -def decodeint(s): - return sum(2 ** i * bit(s, i) for i in range(0, b)) - - -def decodepoint(s): - y = sum(2 ** i * bit(s, i) for i in range(0, b - 1)) - x = xrecover(y) - if x & 1 != bit(s, b - 1): - x = q - x - - P = [x, y] - if not isoncurve(P): - raise Exception("decoding point that is not on curve") - - return P - - -def checkvalid(s, m, pk): - if len(s) != b // 4: - raise Exception("signature length is wrong") - if len(pk) != b // 8: - raise Exception("public-key length is wrong") - - R = decodepoint(s[0:b // 8]) - A = decodepoint(pk) - S = decodeint(s[b // 8:b // 4]) - h = Hint(encodepoint(R) + pk + m) - v1 = scalarmult(B, S) - # v2 = edwards(R,scalarmult(A,h)) - v2 = pt_unxform(xpt_add(pt_xform(R), pt_xform(scalarmult(A, h)))) - return v1 == v2 - - -########################################################## -# -# Curve25519 reference implementation by Matthew Dempsky, from: -# http://cr.yp.to/highspeed/naclcrypto-20090310.pdf - -# P = 2 ** 255 - 19 -P = q -A = 486662 - - -# def expmod(b, e, m): -# if e == 0: return 1 -# t = expmod(b, e / 2, m) ** 2 % m -# if e & 1: t = (t * b) % m -# return t - -# def inv(x): return expmod(x, P - 2, P) - - -def add(n, m, 
d): - (xn, zn) = n - (xm, zm) = m - (xd, zd) = d - x = 4 * (xm * xn - zm * zn) ** 2 * zd - z = 4 * (xm * zn - zm * xn) ** 2 * xd - return (x % P, z % P) - - -def double(n): - (xn, zn) = n - x = (xn ** 2 - zn ** 2) ** 2 - z = 4 * xn * zn * (xn ** 2 + A * xn * zn + zn ** 2) - return (x % P, z % P) - - -def curve25519(n, base=9): - one = (base, 1) - two = double(one) - - # f(m) evaluates to a tuple - # containing the mth multiple and the - # (m+1)th multiple of base. - def f(m): - if m == 1: - return (one, two) - - (pm, pm1) = f(m // 2) - if m & 1: - return (add(pm, pm1, one), double(pm1)) - - return (double(pm), add(pm, pm1, one)) - - ((x, z), _) = f(n) - return (x * inv(z)) % P - - -def genkey(n=0): - n = n or random.randint(0, P) - n &= ~7 - n &= ~(128 << 8 * 31) - n |= 64 << 8 * 31 - return n - - -# def str2int(s): -# return int(hexlify(s), 16) -# # return sum(ord(s[i]) << (8 * i) for i in range(32)) -# -# def int2str(n): -# return unhexlify("%x" % n) -# # return ''.join([chr((n >> (8 * i)) & 255) for i in range(32)]) - -################################################# - - -def dsa_test(): - import os - msg = str(random.randint(q, q + q)).encode('utf-8') - sk = os.urandom(32) - pk = publickey(sk) - sig = signature(msg, sk, pk) - return checkvalid(sig, msg, pk) - - -def dh_test(): - sk1 = genkey() - sk2 = genkey() - return curve25519(sk1, curve25519(sk2)) == curve25519(sk2, curve25519(sk1)) diff --git a/lib/python3.4/site-packages/wheel/signatures/ed25519py.py b/lib/python3.4/site-packages/wheel/signatures/ed25519py.py deleted file mode 100644 index 0c4ab8f..0000000 --- a/lib/python3.4/site-packages/wheel/signatures/ed25519py.py +++ /dev/null @@ -1,50 +0,0 @@ -import os -import warnings -from collections import namedtuple - -from . 
import djbec - -__all__ = ['crypto_sign', 'crypto_sign_open', 'crypto_sign_keypair', 'Keypair', - 'PUBLICKEYBYTES', 'SECRETKEYBYTES', 'SIGNATUREBYTES'] - -PUBLICKEYBYTES = 32 -SECRETKEYBYTES = 64 -SIGNATUREBYTES = 64 - -Keypair = namedtuple('Keypair', ('vk', 'sk')) # verifying key, secret key - - -def crypto_sign_keypair(seed=None): - """Return (verifying, secret) key from a given seed, or os.urandom(32)""" - if seed is None: - seed = os.urandom(PUBLICKEYBYTES) - else: - warnings.warn("ed25519ll should choose random seed.", - RuntimeWarning) - if len(seed) != 32: - raise ValueError("seed must be 32 random bytes or None.") - skbytes = seed - vkbytes = djbec.publickey(skbytes) - return Keypair(vkbytes, skbytes+vkbytes) - - -def crypto_sign(msg, sk): - """Return signature+message given message and secret key. - The signature is the first SIGNATUREBYTES bytes of the return value. - A copy of msg is in the remainder.""" - if len(sk) != SECRETKEYBYTES: - raise ValueError("Bad signing key length %d" % len(sk)) - vkbytes = sk[PUBLICKEYBYTES:] - skbytes = sk[:PUBLICKEYBYTES] - sig = djbec.signature(msg, skbytes, vkbytes) - return sig + msg - - -def crypto_sign_open(signed, vk): - """Return message given signature+message and the verifying key.""" - if len(vk) != PUBLICKEYBYTES: - raise ValueError("Bad verifying key length %d" % len(vk)) - rc = djbec.checkvalid(signed[:SIGNATUREBYTES], signed[SIGNATUREBYTES:], vk) - if not rc: - raise ValueError("rc != True", rc) - return signed[SIGNATUREBYTES:] diff --git a/lib/python3.4/site-packages/wheel/signatures/keys.py b/lib/python3.4/site-packages/wheel/signatures/keys.py deleted file mode 100644 index eb5d4ac..0000000 --- a/lib/python3.4/site-packages/wheel/signatures/keys.py +++ /dev/null @@ -1,101 +0,0 @@ -"""Store and retrieve wheel signing / verifying keys. - -Given a scope (a package name, + meaning "all packages", or - meaning -"no packages"), return a list of verifying keys that are trusted for that -scope. 
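The `crypto_sign` / `crypto_sign_open` pair in the ed25519py module removed above follow the NaCl convention: the signed blob is the 64-byte signature followed by a copy of the message, and verification slices them apart again. A standalone sketch of just that framing (the byte strings here are placeholders, not real signatures):

```python
# NaCl-style framing: signed blob = 64-byte signature + message copy.
SIGNATUREBYTES = 64

def frame(sig: bytes, msg: bytes) -> bytes:
    """Concatenate signature and message, as crypto_sign does."""
    assert len(sig) == SIGNATUREBYTES
    return sig + msg

def unframe(signed: bytes) -> tuple:
    """Split a signed blob back into (signature, message),
    as crypto_sign_open does before verifying."""
    return signed[:SIGNATUREBYTES], signed[SIGNATUREBYTES:]

sig = bytes(SIGNATUREBYTES)      # placeholder, all-zero "signature"
blob = frame(sig, b"hello")
assert unframe(blob) == (sig, b"hello")
```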
- -Given a package name, return a list of (scope, key) suggested keys to sign -that package (only the verifying keys; the private signing key is stored -elsewhere). - -Keys here are represented as urlsafe_b64encoded strings with no padding. - -Tentative command line interface: - -# list trusts -wheel trust -# trust a particular key for all -wheel trust + key -# trust key for beaglevote -wheel trust beaglevote key -# stop trusting a key for all -wheel untrust + key - -# generate a key pair -wheel keygen - -# import a signing key from a file -wheel import keyfile - -# export a signing key -wheel export key -""" - -import json -import os.path - -from ..util import native, load_config_paths, save_config_path - - -class WheelKeys(object): - SCHEMA = 1 - CONFIG_NAME = 'wheel.json' - - def __init__(self): - self.data = {'signers': [], 'verifiers': []} - - def load(self): - # XXX JSON is not a great database - for path in load_config_paths('wheel'): - conf = os.path.join(native(path), self.CONFIG_NAME) - if os.path.exists(conf): - with open(conf, 'r') as infile: - self.data = json.load(infile) - for x in ('signers', 'verifiers'): - if x not in self.data: - self.data[x] = [] - if 'schema' not in self.data: - self.data['schema'] = self.SCHEMA - elif self.data['schema'] != self.SCHEMA: - raise ValueError( - "Bad wheel.json version {0}, expected {1}".format( - self.data['schema'], self.SCHEMA)) - break - return self - - def save(self): - # Try not to call this a very long time after load() - path = save_config_path('wheel') - conf = os.path.join(native(path), self.CONFIG_NAME) - with open(conf, 'w+') as out: - json.dump(self.data, out, indent=2) - return self - - def trust(self, scope, vk): - """Start trusting a particular key for given scope.""" - self.data['verifiers'].append({'scope': scope, 'vk': vk}) - return self - - def untrust(self, scope, vk): - """Stop trusting a particular key for given scope.""" - self.data['verifiers'].remove({'scope': scope, 'vk': vk}) - return 
self - - def trusted(self, scope=None): - """Return list of [(scope, trusted key), ...] for given scope.""" - trust = [(x['scope'], x['vk']) for x in self.data['verifiers'] - if x['scope'] in (scope, '+')] - trust.sort(key=lambda x: x[0]) - trust.reverse() - return trust - - def signers(self, scope): - """Return list of signing key(s).""" - sign = [(x['scope'], x['vk']) for x in self.data['signers'] if x['scope'] in (scope, '+')] - sign.sort(key=lambda x: x[0]) - sign.reverse() - return sign - - def add_signer(self, scope, vk): - """Remember verifying key vk as being valid for signing in scope.""" - self.data['signers'].append({'scope': scope, 'vk': vk}) diff --git a/lib/python3.4/site-packages/wheel/tool/__init__.py b/lib/python3.4/site-packages/wheel/tool/__init__.py deleted file mode 100644 index d6b9893..0000000 --- a/lib/python3.4/site-packages/wheel/tool/__init__.py +++ /dev/null @@ -1,376 +0,0 @@ -""" -Wheel command-line utility. -""" - -import argparse -import hashlib -import json -import os -import sys -from glob import iglob - -from .. 
import signatures -from ..install import WheelFile, VerifyingZipFile -from ..paths import get_install_command -from ..util import urlsafe_b64decode, urlsafe_b64encode, native, binary, matches_requirement - - -def require_pkgresources(name): - try: - import pkg_resources # noqa: F401 - except ImportError: - raise RuntimeError("'{0}' needs pkg_resources (part of setuptools).".format(name)) - - -class WheelError(Exception): - pass - - -# For testability -def get_keyring(): - try: - from ..signatures import keys - import keyring - assert keyring.get_keyring().priority - except (ImportError, AssertionError): - raise WheelError( - "Install wheel[signatures] (requires keyring, keyrings.alt, pyxdg) for signatures.") - - return keys.WheelKeys, keyring - - -def keygen(get_keyring=get_keyring): - """Generate a public/private key pair.""" - WheelKeys, keyring = get_keyring() - - ed25519ll = signatures.get_ed25519ll() - - wk = WheelKeys().load() - - keypair = ed25519ll.crypto_sign_keypair() - vk = native(urlsafe_b64encode(keypair.vk)) - sk = native(urlsafe_b64encode(keypair.sk)) - kr = keyring.get_keyring() - kr.set_password("wheel", vk, sk) - sys.stdout.write("Created Ed25519 keypair with vk={0}\n".format(vk)) - sys.stdout.write("in {0!r}\n".format(kr)) - - sk2 = kr.get_password('wheel', vk) - if sk2 != sk: - raise WheelError("Keyring is broken. 
Could not retrieve secret key.") - - sys.stdout.write("Trusting {0} to sign and verify all packages.\n".format(vk)) - wk.add_signer('+', vk) - wk.trust('+', vk) - wk.save() - - -def sign(wheelfile, replace=False, get_keyring=get_keyring): - """Sign a wheel""" - WheelKeys, keyring = get_keyring() - - ed25519ll = signatures.get_ed25519ll() - - wf = WheelFile(wheelfile, append=True) - wk = WheelKeys().load() - - name = wf.parsed_filename.group('name') - sign_with = wk.signers(name)[0] - sys.stdout.write("Signing {0} with {1}\n".format(name, sign_with[1])) - - vk = sign_with[1] - kr = keyring.get_keyring() - sk = kr.get_password('wheel', vk) - keypair = ed25519ll.Keypair(urlsafe_b64decode(binary(vk)), - urlsafe_b64decode(binary(sk))) - - record_name = wf.distinfo_name + '/RECORD' - sig_name = wf.distinfo_name + '/RECORD.jws' - if sig_name in wf.zipfile.namelist(): - raise WheelError("Wheel is already signed.") - record_data = wf.zipfile.read(record_name) - payload = {"hash": "sha256=" + native(urlsafe_b64encode(hashlib.sha256(record_data).digest()))} - sig = signatures.sign(payload, keypair) - wf.zipfile.writestr(sig_name, json.dumps(sig, sort_keys=True)) - wf.zipfile.close() - - -def unsign(wheelfile): - """ - Remove RECORD.jws from a wheel by truncating the zip file. - - RECORD.jws must be at the end of the archive. The zip file must be an - ordinary archive, with the compressed files and the directory in the same - order, and without any non-zip content after the truncation point. - """ - vzf = VerifyingZipFile(wheelfile, "a") - info = vzf.infolist() - if not (len(info) and info[-1].filename.endswith('/RECORD.jws')): - raise WheelError('The wheel is not signed (RECORD.jws not found at end of the archive).') - vzf.pop() - vzf.close() - - -def verify(wheelfile): - """Verify a wheel. - - The signature will be verified for internal consistency ONLY and printed. - Wheel's own unpack/install commands verify the manifest against the - signature and file contents. 
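The `sign()` command above builds a payload whose `hash` field is the SHA-256 digest of the wheel's `RECORD`, urlsafe-base64 encoded with the `'='` padding stripped. That construction in isolation (using only the standard library, as the deleted `util` helpers do):

```python
import base64
import hashlib

def record_payload(record_data: bytes) -> dict:
    """Build the RECORD hash payload the way sign() above does:
    sha256 digest, urlsafe-base64 encoded, '=' padding stripped."""
    digest = hashlib.sha256(record_data).digest()
    b64 = base64.urlsafe_b64encode(digest).rstrip(b"=").decode("ascii")
    return {"hash": "sha256=" + b64}

payload = record_payload(b"example RECORD contents\n")
assert payload["hash"].startswith("sha256=")
assert "=" not in payload["hash"][len("sha256="):]   # padding stripped
```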
- """ - wf = WheelFile(wheelfile) - sig_name = wf.distinfo_name + '/RECORD.jws' - try: - sig = json.loads(native(wf.zipfile.open(sig_name).read())) - except KeyError: - raise WheelError('The wheel is not signed (RECORD.jws not found at end of the archive).') - - verified = signatures.verify(sig) - sys.stderr.write("Signatures are internally consistent.\n") - sys.stdout.write(json.dumps(verified, indent=2)) - sys.stdout.write('\n') - - -def unpack(wheelfile, dest='.'): - """Unpack a wheel. - - Wheel content will be unpacked to {dest}/{name}-{ver}, where {name} - is the package name and {ver} its version. - - :param wheelfile: The path to the wheel. - :param dest: Destination directory (default to current directory). - """ - wf = WheelFile(wheelfile) - namever = wf.parsed_filename.group('namever') - destination = os.path.join(dest, namever) - sys.stderr.write("Unpacking to: %s\n" % (destination)) - wf.zipfile.extractall(destination) - wf.zipfile.close() - - -def install(requirements, requirements_file=None, - wheel_dirs=None, force=False, list_files=False, - dry_run=False): - """Install wheels. - - :param requirements: A list of requirements or wheel files to install. - :param requirements_file: A file containing requirements to install. - :param wheel_dirs: A list of directories to search for wheels. - :param force: Install a wheel file even if it is not compatible. - :param list_files: Only list the files to install, don't install them. - :param dry_run: Do everything but the actual install. - """ - - # If no wheel directories specified, use the WHEELPATH environment - # variable, or the current directory if that is not set. 
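The `install()` function above resolves its wheel search directories from the `WHEELPATH` environment variable, falling back to the current directory when it is unset. A minimal standalone sketch of that lookup (the function name is illustrative, not part of the original API):

```python
import os

def wheel_dirs_from_env(environ=os.environ):
    """Resolve wheel search directories: WHEELPATH entries split on
    os.pathsep if set, otherwise the current directory, mirroring
    the logic in install() above."""
    wheelpath = environ.get("WHEELPATH")
    if wheelpath:
        return wheelpath.split(os.pathsep)
    return [os.path.curdir]

assert wheel_dirs_from_env({}) == [os.path.curdir]
assert wheel_dirs_from_env(
    {"WHEELPATH": os.pathsep.join(["a", "b"])}) == ["a", "b"]
```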
- if not wheel_dirs: - wheelpath = os.getenv("WHEELPATH") - if wheelpath: - wheel_dirs = wheelpath.split(os.pathsep) - else: - wheel_dirs = [os.path.curdir] - - # Get a list of all valid wheels in wheel_dirs - all_wheels = [] - for d in wheel_dirs: - for w in os.listdir(d): - if w.endswith('.whl'): - wf = WheelFile(os.path.join(d, w)) - if wf.compatible: - all_wheels.append(wf) - - # If there is a requirements file, add it to the list of requirements - if requirements_file: - # If the file doesn't exist, search for it in wheel_dirs - # This allows standard requirements files to be stored with the - # wheels. - if not os.path.exists(requirements_file): - for d in wheel_dirs: - name = os.path.join(d, requirements_file) - if os.path.exists(name): - requirements_file = name - break - - with open(requirements_file) as fd: - requirements.extend(fd) - - to_install = [] - for req in requirements: - if req.endswith('.whl'): - # Explicitly specified wheel filename - if os.path.exists(req): - wf = WheelFile(req) - if wf.compatible or force: - to_install.append(wf) - else: - msg = ("{0} is not compatible with this Python. " - "--force to install anyway.".format(req)) - raise WheelError(msg) - else: - # We could search on wheel_dirs, but it's probably OK to - # assume the user has made an error. 
- raise WheelError("No such wheel file: {}".format(req)) - continue - - # We have a requirement spec - # If we don't have pkg_resources, this will raise an exception - matches = matches_requirement(req, all_wheels) - if not matches: - raise WheelError("No match for requirement {}".format(req)) - to_install.append(max(matches)) - - # We now have a list of wheels to install - if list_files: - sys.stdout.write("Installing:\n") - - if dry_run: - return - - for wf in to_install: - if list_files: - sys.stdout.write(" {0}\n".format(wf.filename)) - continue - wf.install(force=force) - wf.zipfile.close() - - -def install_scripts(distributions): - """ - Regenerate the entry_points console_scripts for the named distribution. - """ - try: - from setuptools.command import easy_install - import pkg_resources - except ImportError: - raise RuntimeError("'wheel install_scripts' needs setuptools.") - - for dist in distributions: - pkg_resources_dist = pkg_resources.get_distribution(dist) - install = get_install_command(dist) - command = easy_install.easy_install(install.distribution) - command.args = ['wheel'] # dummy argument - command.finalize_options() - command.install_egg_scripts(pkg_resources_dist) - - -def convert(installers, dest_dir, verbose): - require_pkgresources('wheel convert') - - # Only support wheel convert if pkg_resources is present - from ..wininst2wheel import bdist_wininst2wheel - from ..egg2wheel import egg2wheel - - for pat in installers: - for installer in iglob(pat): - if os.path.splitext(installer)[1] == '.egg': - conv = egg2wheel - else: - conv = bdist_wininst2wheel - if verbose: - sys.stdout.write("{0}... 
".format(installer)) - sys.stdout.flush() - conv(installer, dest_dir) - if verbose: - sys.stdout.write("OK\n") - - -def parser(): - p = argparse.ArgumentParser() - s = p.add_subparsers(help="commands") - - def keygen_f(args): - keygen() - keygen_parser = s.add_parser('keygen', help='Generate signing key') - keygen_parser.set_defaults(func=keygen_f) - - def sign_f(args): - sign(args.wheelfile) - sign_parser = s.add_parser('sign', help='Sign wheel') - sign_parser.add_argument('wheelfile', help='Wheel file') - sign_parser.set_defaults(func=sign_f) - - def unsign_f(args): - unsign(args.wheelfile) - unsign_parser = s.add_parser('unsign', help=unsign.__doc__) - unsign_parser.add_argument('wheelfile', help='Wheel file') - unsign_parser.set_defaults(func=unsign_f) - - def verify_f(args): - verify(args.wheelfile) - verify_parser = s.add_parser('verify', help=verify.__doc__) - verify_parser.add_argument('wheelfile', help='Wheel file') - verify_parser.set_defaults(func=verify_f) - - def unpack_f(args): - unpack(args.wheelfile, args.dest) - unpack_parser = s.add_parser('unpack', help='Unpack wheel') - unpack_parser.add_argument('--dest', '-d', help='Destination directory', - default='.') - unpack_parser.add_argument('wheelfile', help='Wheel file') - unpack_parser.set_defaults(func=unpack_f) - - def install_f(args): - install(args.requirements, args.requirements_file, - args.wheel_dirs, args.force, args.list_files) - install_parser = s.add_parser('install', help='Install wheels') - install_parser.add_argument('requirements', nargs='*', - help='Requirements to install.') - install_parser.add_argument('--force', default=False, - action='store_true', - help='Install incompatible wheel files.') - install_parser.add_argument('--wheel-dir', '-d', action='append', - dest='wheel_dirs', - help='Directories containing wheels.') - install_parser.add_argument('--requirements-file', '-r', - help="A file containing requirements to " - "install.") - install_parser.add_argument('--list', '-l', 
default=False, - dest='list_files', - action='store_true', - help="List wheels which would be installed, " - "but don't actually install anything.") - install_parser.set_defaults(func=install_f) - - def install_scripts_f(args): - install_scripts(args.distributions) - install_scripts_parser = s.add_parser('install-scripts', help='Install console_scripts') - install_scripts_parser.add_argument('distributions', nargs='*', - help='Regenerate console_scripts for these distributions') - install_scripts_parser.set_defaults(func=install_scripts_f) - - def convert_f(args): - convert(args.installers, args.dest_dir, args.verbose) - convert_parser = s.add_parser('convert', help='Convert egg or wininst to wheel') - convert_parser.add_argument('installers', nargs='*', help='Installers to convert') - convert_parser.add_argument('--dest-dir', '-d', default=os.path.curdir, - help="Directory to store wheels (default %(default)s)") - convert_parser.add_argument('--verbose', '-v', action='store_true') - convert_parser.set_defaults(func=convert_f) - - def version_f(args): - from .. import __version__ - sys.stdout.write("wheel %s\n" % __version__) - version_parser = s.add_parser('version', help='Print version and exit') - version_parser.set_defaults(func=version_f) - - def help_f(args): - p.print_help() - help_parser = s.add_parser('help', help='Show this help') - help_parser.set_defaults(func=help_f) - - return p - - -def main(): - p = parser() - args = p.parse_args() - if not hasattr(args, 'func'): - p.print_help() - else: - # XXX on Python 3.3 we get 'args has no func' rather than short help. 
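The `parser()` / `main()` pair above rely on argparse's `set_defaults(func=...)` dispatch pattern, including the `hasattr(args, 'func')` guard needed because on Python 3.3+ a missing sub-command no longer raises an error. A self-contained sketch of the pattern:

```python
import argparse

# Each sub-command registers a callable via set_defaults(func=...),
# as keygen/sign/verify and friends do in parser() above.
p = argparse.ArgumentParser()
s = p.add_subparsers(help="commands")

hello = s.add_parser("hello")
hello.set_defaults(func=lambda args: "hi")

args = p.parse_args(["hello"])
assert hasattr(args, "func")          # the guard used in main() above
assert args.func(args) == "hi"

# With no sub-command, argparse (3.3+) leaves 'func' unset rather
# than printing an error, hence the explicit print_help() fallback.
assert not hasattr(p.parse_args([]), "func")
```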
- try: - args.func(args) - return 0 - except WheelError as e: - sys.stderr.write(e.message + "\n") - return 1 diff --git a/lib/python3.4/site-packages/wheel/util.py b/lib/python3.4/site-packages/wheel/util.py deleted file mode 100644 index c58d108..0000000 --- a/lib/python3.4/site-packages/wheel/util.py +++ /dev/null @@ -1,176 +0,0 @@ -"""Utility functions.""" - -import base64 -import hashlib -import json -import os -import sys -from collections import OrderedDict - -__all__ = ['urlsafe_b64encode', 'urlsafe_b64decode', 'utf8', - 'to_json', 'from_json', 'matches_requirement'] - - -# For encoding ascii back and forth between bytestrings, as is repeatedly -# necessary in JSON-based crypto under Python 3 -if sys.version_info[0] < 3: - text_type = unicode # noqa: F821 - - def native(s): - return s -else: - text_type = str - - def native(s): - if isinstance(s, bytes): - return s.decode('ascii') - return s - - -def urlsafe_b64encode(data): - """urlsafe_b64encode without padding""" - return base64.urlsafe_b64encode(data).rstrip(binary('=')) - - -def urlsafe_b64decode(data): - """urlsafe_b64decode without padding""" - pad = b'=' * (4 - (len(data) & 3)) - return base64.urlsafe_b64decode(data + pad) - - -def to_json(o): - """Convert given data to JSON.""" - return json.dumps(o, sort_keys=True) - - -def from_json(j): - """Decode a JSON payload.""" - return json.loads(j) - - -def open_for_csv(name, mode): - if sys.version_info[0] < 3: - nl = {} - bin = 'b' - else: - nl = {'newline': ''} - bin = '' - - return open(name, mode + bin, **nl) - - -def utf8(data): - """Utf-8 encode data.""" - if isinstance(data, text_type): - return data.encode('utf-8') - return data - - -def binary(s): - if isinstance(s, text_type): - return s.encode('ascii') - return s - - -class HashingFile(object): - def __init__(self, path, mode, hashtype='sha256'): - self.fd = open(path, mode) - self.hashtype = hashtype - self.hash = hashlib.new(hashtype) - self.length = 0 - - def write(self, data): - 
self.hash.update(data) - self.length += len(data) - self.fd.write(data) - - def close(self): - self.fd.close() - - def digest(self): - if self.hashtype == 'md5': - return self.hash.hexdigest() - digest = self.hash.digest() - return self.hashtype + '=' + native(urlsafe_b64encode(digest)) - - def __enter__(self): - return self - - def __exit__(self, exc_type, exc_val, exc_tb): - self.fd.close() - - -class OrderedDefaultDict(OrderedDict): - def __init__(self, *args, **kwargs): - if not args: - self.default_factory = None - else: - if not (args[0] is None or callable(args[0])): - raise TypeError('first argument must be callable or None') - self.default_factory = args[0] - args = args[1:] - super(OrderedDefaultDict, self).__init__(*args, **kwargs) - - def __missing__(self, key): - if self.default_factory is None: - raise KeyError(key) - self[key] = default = self.default_factory() - return default - - -if sys.platform == 'win32': - import ctypes.wintypes - # CSIDL_APPDATA for reference - not used here for compatibility with - # dirspec, which uses LOCAL_APPDATA and COMMON_APPDATA in that order - csidl = dict(CSIDL_APPDATA=26, CSIDL_LOCAL_APPDATA=28, CSIDL_COMMON_APPDATA=35) - - def get_path(name): - SHGFP_TYPE_CURRENT = 0 - buf = ctypes.create_unicode_buffer(ctypes.wintypes.MAX_PATH) - ctypes.windll.shell32.SHGetFolderPathW(0, csidl[name], 0, SHGFP_TYPE_CURRENT, buf) - return buf.value - - def save_config_path(*resource): - appdata = get_path("CSIDL_LOCAL_APPDATA") - path = os.path.join(appdata, *resource) - if not os.path.isdir(path): - os.makedirs(path) - return path - - def load_config_paths(*resource): - ids = ["CSIDL_LOCAL_APPDATA", "CSIDL_COMMON_APPDATA"] - for id in ids: - base = get_path(id) - path = os.path.join(base, *resource) - if os.path.exists(path): - yield path -else: - def save_config_path(*resource): - import xdg.BaseDirectory - return xdg.BaseDirectory.save_config_path(*resource) - - def load_config_paths(*resource): - import xdg.BaseDirectory - 
return xdg.BaseDirectory.load_config_paths(*resource) - - -def matches_requirement(req, wheels): - """List of wheels matching a requirement. - - :param req: The requirement to satisfy - :param wheels: List of wheels to search. - """ - try: - from pkg_resources import Distribution, Requirement - except ImportError: - raise RuntimeError("Cannot use requirements without pkg_resources") - - req = Requirement.parse(req) - - selected = [] - for wf in wheels: - f = wf.parsed_filename - dist = Distribution(project_name=f.group("name"), version=f.group("ver")) - if dist in req: - selected.append(wf) - return selected diff --git a/lib/python3.4/site-packages/wheel/wininst2wheel.py b/lib/python3.4/site-packages/wheel/wininst2wheel.py deleted file mode 100644 index b8a3469..0000000 --- a/lib/python3.4/site-packages/wheel/wininst2wheel.py +++ /dev/null @@ -1,217 +0,0 @@ -#!/usr/bin/env python -import distutils.dist -import os.path -import re -import sys -import tempfile -import zipfile -from argparse import ArgumentParser -from glob import iglob -from shutil import rmtree - -import wheel.bdist_wheel -from wheel.archive import archive_wheelfile - -egg_info_re = re.compile(r'''(^|/)(?P[^/]+?)-(?P.+?) - (-(?P.+?))?(-(?P.+?))?.egg-info(/|$)''', re.VERBOSE) - - -def parse_info(wininfo_name, egginfo_name): - """Extract metadata from filenames. - - Extracts the 4 metadataitems needed (name, version, pyversion, arch) from - the installer filename and the name of the egg-info directory embedded in - the zipfile (if any). - - The egginfo filename has the format:: - - name-ver(-pyver)(-arch).egg-info - - The installer filename has the format:: - - name-ver.arch(-pyver).exe - - Some things to note: - - 1. The installer filename is not definitive. An installer can be renamed - and work perfectly well as an installer. So more reliable data should - be used whenever possible. - 2. 
The egg-info data should be preferred for the name and version, because - these come straight from the distutils metadata, and are mandatory. - 3. The pyver from the egg-info data should be ignored, as it is - constructed from the version of Python used to build the installer, - which is irrelevant - the installer filename is correct here (even to - the point that when it's not there, any version is implied). - 4. The architecture must be taken from the installer filename, as it is - not included in the egg-info data. - 5. Architecture-neutral installers still have an architecture because the - installer format itself (being executable) is architecture-specific. We - should therefore ignore the architecture if the content is pure-python. - """ - - egginfo = None - if egginfo_name: - egginfo = egg_info_re.search(egginfo_name) - if not egginfo: - raise ValueError("Egg info filename %s is not valid" % (egginfo_name,)) - - # Parse the wininst filename - # 1. Distribution name (up to the first '-') - w_name, sep, rest = wininfo_name.partition('-') - if not sep: - raise ValueError("Installer filename %s is not valid" % (wininfo_name,)) - - # Strip '.exe' - rest = rest[:-4] - # 2. Python version (from the last '-', must start with 'py') - rest2, sep, w_pyver = rest.rpartition('-') - if sep and w_pyver.startswith('py'): - rest = rest2 - w_pyver = w_pyver.replace('.', '') - else: - # Not version specific - use py2.py3. While it is possible that - # pure-Python code is not compatible with both Python 2 and 3, there - # is no way of knowing from the wininst format, so we assume the best - # here (the user can always manually rename the wheel to be more - # restrictive if needed). - w_pyver = 'py2.py3' - # 3. 
Version and architecture - w_ver, sep, w_arch = rest.rpartition('.') - if not sep: - raise ValueError("Installer filename %s is not valid" % (wininfo_name,)) - - if egginfo: - w_name = egginfo.group('name') - w_ver = egginfo.group('ver') - - return dict(name=w_name, ver=w_ver, arch=w_arch, pyver=w_pyver) - - -def bdist_wininst2wheel(path, dest_dir=os.path.curdir): - bdw = zipfile.ZipFile(path) - - # Search for egg-info in the archive - egginfo_name = None - for filename in bdw.namelist(): - if '.egg-info' in filename: - egginfo_name = filename - break - - info = parse_info(os.path.basename(path), egginfo_name) - - root_is_purelib = True - for zipinfo in bdw.infolist(): - if zipinfo.filename.startswith('PLATLIB'): - root_is_purelib = False - break - if root_is_purelib: - paths = {'purelib': ''} - else: - paths = {'platlib': ''} - - dist_info = "%(name)s-%(ver)s" % info - datadir = "%s.data/" % dist_info - - # rewrite paths to trick ZipFile into extracting an egg - # XXX grab wininst .ini - between .exe, padding, and first zip file. 
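The filename rules spelled out in the `parse_info()` docstring above can be exercised in isolation. This sketch reproduces only the wininst-filename branch (it ignores the egg-info data, which the real function prefers for name and version when present):

```python
def parse_wininst_name(wininfo_name):
    """Parse 'name-ver.arch(-pyver).exe' following the same
    partition/rpartition steps as parse_info() above."""
    name, sep, rest = wininfo_name.partition('-')
    if not sep:
        raise ValueError("Installer filename %s is not valid" % wininfo_name)
    rest = rest[:-4]                       # strip '.exe'
    rest2, sep, pyver = rest.rpartition('-')
    if sep and pyver.startswith('py'):
        rest = rest2
        pyver = pyver.replace('.', '')     # 'py3.4' -> 'py34'
    else:
        pyver = 'py2.py3'                  # not version specific
    ver, sep, arch = rest.rpartition('.')
    if not sep:
        raise ValueError("Installer filename %s is not valid" % wininfo_name)
    return dict(name=name, ver=ver, arch=arch, pyver=pyver)

info = parse_wininst_name("foo-1.0.win32-py3.4.exe")
assert info == dict(name="foo", ver="1.0", arch="win32", pyver="py34")
```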
- members = [] - egginfo_name = '' - for zipinfo in bdw.infolist(): - key, basename = zipinfo.filename.split('/', 1) - key = key.lower() - basepath = paths.get(key, None) - if basepath is None: - basepath = datadir + key.lower() + '/' - oldname = zipinfo.filename - newname = basepath + basename - zipinfo.filename = newname - del bdw.NameToInfo[oldname] - bdw.NameToInfo[newname] = zipinfo - # Collect member names, but omit '' (from an entry like "PLATLIB/" - if newname: - members.append(newname) - # Remember egg-info name for the egg2dist call below - if not egginfo_name: - if newname.endswith('.egg-info'): - egginfo_name = newname - elif '.egg-info/' in newname: - egginfo_name, sep, _ = newname.rpartition('/') - dir = tempfile.mkdtemp(suffix="_b2w") - bdw.extractall(dir, members) - - # egg2wheel - abi = 'none' - pyver = info['pyver'] - arch = (info['arch'] or 'any').replace('.', '_').replace('-', '_') - # Wininst installers always have arch even if they are not - # architecture-specific (because the format itself is). - # So, assume the content is architecture-neutral if root is purelib. - if root_is_purelib: - arch = 'any' - # If the installer is architecture-specific, it's almost certainly also - # CPython-specific. 
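The wheel-tag assembly in `bdist_wininst2wheel()` above reduces to a small amount of string work: normalize the architecture, switch `py` to `cp` for architecture-specific installers (assumed CPython-specific), and join with the `none` ABI tag. A simplified sketch (the purelib override, where `arch` is forced to `any` first, is folded into the caller here):

```python
def wheel_tag(dist_info, pyver, arch, abi="none"):
    """Assemble the wheel filename stem as the code above does.
    Architecture-specific installers get 'py' -> 'cp'."""
    arch = (arch or "any").replace(".", "_").replace("-", "_")
    if arch != "any":
        pyver = pyver.replace("py", "cp")
    return "-".join((dist_info, pyver, abi, arch))

assert wheel_tag("foo-1.0", "py34", "win32") == "foo-1.0-cp34-none-win32"
assert wheel_tag("foo-1.0", "py2.py3", None) == "foo-1.0-py2.py3-none-any"
```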
- if arch != 'any': - pyver = pyver.replace('py', 'cp') - wheel_name = '-'.join(( - dist_info, - pyver, - abi, - arch - )) - if root_is_purelib: - bw = wheel.bdist_wheel.bdist_wheel(distutils.dist.Distribution()) - else: - bw = _bdist_wheel_tag(distutils.dist.Distribution()) - - bw.root_is_pure = root_is_purelib - bw.python_tag = pyver - bw.plat_name_supplied = True - bw.plat_name = info['arch'] or 'any' - - if not root_is_purelib: - bw.full_tag_supplied = True - bw.full_tag = (pyver, abi, arch) - - dist_info_dir = os.path.join(dir, '%s.dist-info' % dist_info) - bw.egg2dist(os.path.join(dir, egginfo_name), dist_info_dir) - bw.write_wheelfile(dist_info_dir, generator='wininst2wheel') - bw.write_record(dir, dist_info_dir) - - archive_wheelfile(os.path.join(dest_dir, wheel_name), dir) - rmtree(dir) - - -class _bdist_wheel_tag(wheel.bdist_wheel.bdist_wheel): - # allow the client to override the default generated wheel tag - # The default bdist_wheel implementation uses python and abi tags - # of the running python process. This is not suitable for - # generating/repackaging prebuild binaries. - - full_tag_supplied = False - full_tag = None # None or a (pytag, soabitag, plattag) triple - - def get_tag(self): - if self.full_tag_supplied and self.full_tag is not None: - return self.full_tag - else: - return super(_bdist_wheel_tag, self).get_tag() - - -def main(): - parser = ArgumentParser() - parser.add_argument('installers', nargs='*', help="Installers to convert") - parser.add_argument('--dest-dir', '-d', default=os.path.curdir, - help="Directory to store wheels (default %(default)s)") - parser.add_argument('--verbose', '-v', action='store_true') - args = parser.parse_args() - for pat in args.installers: - for installer in iglob(pat): - if args.verbose: - sys.stdout.write("{0}... 
".format(installer)) - bdist_wininst2wheel(installer, args.dest_dir) - if args.verbose: - sys.stdout.write("OK\n") - - -if __name__ == "__main__": - main() diff --git a/lib/python3.7/site-packages/SQLAlchemy-1.0.12.egg-info/PKG-INFO b/lib/python3.7/site-packages/SQLAlchemy-1.0.12.egg-info/PKG-INFO new file mode 100644 index 0000000..c95fb48 --- /dev/null +++ b/lib/python3.7/site-packages/SQLAlchemy-1.0.12.egg-info/PKG-INFO @@ -0,0 +1,155 @@ +Metadata-Version: 1.1 +Name: SQLAlchemy +Version: 1.0.12 +Summary: Database Abstraction Library +Home-page: http://www.sqlalchemy.org +Author: Mike Bayer +Author-email: mike_mp@zzzcomputing.com +License: MIT License +Description: SQLAlchemy + ========== + + The Python SQL Toolkit and Object Relational Mapper + + Introduction + ------------- + + SQLAlchemy is the Python SQL toolkit and Object Relational Mapper + that gives application developers the full power and + flexibility of SQL. SQLAlchemy provides a full suite + of well known enterprise-level persistence patterns, + designed for efficient and high-performing database + access, adapted into a simple and Pythonic domain + language. + + Major SQLAlchemy features include: + + * An industrial strength ORM, built + from the core on the identity map, unit of work, + and data mapper patterns. These patterns + allow transparent persistence of objects + using a declarative configuration system. + Domain models + can be constructed and manipulated naturally, + and changes are synchronized with the + current transaction automatically. + * A relationally-oriented query system, exposing + the full range of SQL's capabilities + explicitly, including joins, subqueries, + correlation, and most everything else, + in terms of the object model. + Writing queries with the ORM uses the same + techniques of relational composition you use + when writing SQL. While you can drop into + literal SQL at any time, it's virtually never + needed. 
+ * A comprehensive and flexible system + of eager loading for related collections and objects. + Collections are cached within a session, + and can be loaded on individual access, all + at once using joins, or by query per collection + across the full result set. + * A Core SQL construction system and DBAPI + interaction layer. The SQLAlchemy Core is + separate from the ORM and is a full database + abstraction layer in its own right, and includes + an extensible Python-based SQL expression + language, schema metadata, connection pooling, + type coercion, and custom types. + * All primary and foreign key constraints are + assumed to be composite and natural. Surrogate + integer primary keys are of course still the + norm, but SQLAlchemy never assumes or hardcodes + to this model. + * Database introspection and generation. Database + schemas can be "reflected" in one step into + Python structures representing database metadata; + those same structures can then generate + CREATE statements right back out - all within + the Core, independent of the ORM. + + SQLAlchemy's philosophy: + + * SQL databases behave less and less like object + collections the more size and performance start to + matter; object collections behave less and less like + tables and rows the more abstraction starts to matter. + SQLAlchemy aims to accommodate both of these + principles. + * An ORM doesn't need to hide the "R". A relational + database provides rich, set-based functionality + that should be fully exposed. SQLAlchemy's + ORM provides an open-ended set of patterns + that allow a developer to construct a custom + mediation layer between a domain model and + a relational schema, turning the so-called + "object relational impedance" issue into + a distant memory. + * The developer, in all cases, makes all decisions + regarding the design, structure, and naming conventions + of both the object model as well as the relational + schema. 
SQLAlchemy only provides the means + to automate the execution of these decisions. + * With SQLAlchemy, there's no such thing as + "the ORM generated a bad query" - you + retain full control over the structure of + queries, including how joins are organized, + how subqueries and correlation is used, what + columns are requested. Everything SQLAlchemy + does is ultimately the result of a developer- + initiated decision. + * Don't use an ORM if the problem doesn't need one. + SQLAlchemy consists of a Core and separate ORM + component. The Core offers a full SQL expression + language that allows Pythonic construction + of SQL constructs that render directly to SQL + strings for a target database, returning + result sets that are essentially enhanced DBAPI + cursors. + * Transactions should be the norm. With SQLAlchemy's + ORM, nothing goes to permanent storage until + commit() is called. SQLAlchemy encourages applications + to create a consistent means of delineating + the start and end of a series of operations. + * Never render a literal value in a SQL statement. + Bound parameters are used to the greatest degree + possible, allowing query optimizers to cache + query plans effectively and making SQL injection + attacks a non-issue. + + Documentation + ------------- + + Latest documentation is at: + + http://www.sqlalchemy.org/docs/ + + Installation / Requirements + --------------------------- + + Full documentation for installation is at + `Installation `_. + + Getting Help / Development / Bug reporting + ------------------------------------------ + + Please refer to the `SQLAlchemy Community Guide `_. + + License + ------- + + SQLAlchemy is distributed under the `MIT license + `_. 
+ + +Platform: UNKNOWN +Classifier: Development Status :: 5 - Production/Stable +Classifier: Intended Audience :: Developers +Classifier: License :: OSI Approved :: MIT License +Classifier: Programming Language :: Python +Classifier: Programming Language :: Python :: 3 +Classifier: Programming Language :: Python :: Implementation :: CPython +Classifier: Programming Language :: Python :: Implementation :: Jython +Classifier: Programming Language :: Python :: Implementation :: PyPy +Classifier: Topic :: Database :: Front-Ends +Classifier: Operating System :: OS Independent diff --git a/lib/python3.7/site-packages/SQLAlchemy-1.0.12.egg-info/SOURCES.txt b/lib/python3.7/site-packages/SQLAlchemy-1.0.12.egg-info/SOURCES.txt new file mode 100644 index 0000000..f69f69f --- /dev/null +++ b/lib/python3.7/site-packages/SQLAlchemy-1.0.12.egg-info/SOURCES.txt @@ -0,0 +1,786 @@ +AUTHORS +CHANGES +LICENSE +MANIFEST.in +README.dialects.rst +README.rst +README.unittests.rst +setup.cfg +setup.py +sqla_nose.py +tox.ini +doc/contents.html +doc/copyright.html +doc/genindex.html +doc/glossary.html +doc/index.html +doc/intro.html +doc/search.html +doc/searchindex.js +doc/_images/sqla_arch_small.png +doc/_images/sqla_engine_arch.png +doc/_modules/index.html +doc/_modules/examples/adjacency_list/adjacency_list.html +doc/_modules/examples/association/basic_association.html +doc/_modules/examples/association/dict_of_sets_with_default.html +doc/_modules/examples/association/proxied_association.html +doc/_modules/examples/custom_attributes/custom_management.html +doc/_modules/examples/custom_attributes/listen_for_events.html +doc/_modules/examples/dogpile_caching/advanced.html +doc/_modules/examples/dogpile_caching/caching_query.html +doc/_modules/examples/dogpile_caching/environment.html +doc/_modules/examples/dogpile_caching/fixture_data.html +doc/_modules/examples/dogpile_caching/helloworld.html +doc/_modules/examples/dogpile_caching/local_session_caching.html 
+doc/_modules/examples/dogpile_caching/model.html +doc/_modules/examples/dogpile_caching/relationship_caching.html +doc/_modules/examples/dynamic_dict/dynamic_dict.html +doc/_modules/examples/elementtree/adjacency_list.html +doc/_modules/examples/elementtree/optimized_al.html +doc/_modules/examples/elementtree/pickle.html +doc/_modules/examples/generic_associations/discriminator_on_association.html +doc/_modules/examples/generic_associations/generic_fk.html +doc/_modules/examples/generic_associations/table_per_association.html +doc/_modules/examples/generic_associations/table_per_related.html +doc/_modules/examples/graphs/directed_graph.html +doc/_modules/examples/inheritance/concrete.html +doc/_modules/examples/inheritance/joined.html +doc/_modules/examples/inheritance/single.html +doc/_modules/examples/join_conditions/cast.html +doc/_modules/examples/join_conditions/threeway.html +doc/_modules/examples/large_collection/large_collection.html +doc/_modules/examples/materialized_paths/materialized_paths.html +doc/_modules/examples/nested_sets/nested_sets.html +doc/_modules/examples/performance/__main__.html +doc/_modules/examples/performance/bulk_inserts.html +doc/_modules/examples/performance/bulk_updates.html +doc/_modules/examples/performance/large_resultsets.html +doc/_modules/examples/performance/short_selects.html +doc/_modules/examples/performance/single_inserts.html +doc/_modules/examples/postgis/postgis.html +doc/_modules/examples/sharding/attribute_shard.html +doc/_modules/examples/versioned_history/history_meta.html +doc/_modules/examples/versioned_history/test_versioning.html +doc/_modules/examples/versioned_rows/versioned_map.html +doc/_modules/examples/versioned_rows/versioned_rows.html +doc/_modules/examples/vertical/dictlike-polymorphic.html +doc/_modules/examples/vertical/dictlike.html +doc/_static/basic.css +doc/_static/changelog.css +doc/_static/comment-bright.png +doc/_static/comment-close.png +doc/_static/comment.png +doc/_static/detectmobile.js 
+doc/_static/docs.css +doc/_static/doctools.js +doc/_static/down-pressed.png +doc/_static/down.png +doc/_static/file.png +doc/_static/init.js +doc/_static/jquery-1.11.1.js +doc/_static/jquery.js +doc/_static/minus.png +doc/_static/plus.png +doc/_static/pygments.css +doc/_static/searchtools.js +doc/_static/sphinx_paramlinks.css +doc/_static/underscore-1.3.1.js +doc/_static/underscore.js +doc/_static/up-pressed.png +doc/_static/up.png +doc/_static/websupport.js +doc/build/Makefile +doc/build/conf.py +doc/build/contents.rst +doc/build/copyright.rst +doc/build/corrections.py +doc/build/glossary.rst +doc/build/index.rst +doc/build/intro.rst +doc/build/requirements.txt +doc/build/sqla_arch_small.png +doc/build/changelog/changelog_01.rst +doc/build/changelog/changelog_02.rst +doc/build/changelog/changelog_03.rst +doc/build/changelog/changelog_04.rst +doc/build/changelog/changelog_05.rst +doc/build/changelog/changelog_06.rst +doc/build/changelog/changelog_07.rst +doc/build/changelog/changelog_08.rst +doc/build/changelog/changelog_09.rst +doc/build/changelog/changelog_10.rst +doc/build/changelog/index.rst +doc/build/changelog/migration_04.rst +doc/build/changelog/migration_05.rst +doc/build/changelog/migration_06.rst +doc/build/changelog/migration_07.rst +doc/build/changelog/migration_08.rst +doc/build/changelog/migration_09.rst +doc/build/changelog/migration_10.rst +doc/build/core/api_basics.rst +doc/build/core/compiler.rst +doc/build/core/connections.rst +doc/build/core/constraints.rst +doc/build/core/custom_types.rst +doc/build/core/ddl.rst +doc/build/core/defaults.rst +doc/build/core/dml.rst +doc/build/core/engines.rst +doc/build/core/engines_connections.rst +doc/build/core/event.rst +doc/build/core/events.rst +doc/build/core/exceptions.rst +doc/build/core/expression_api.rst +doc/build/core/functions.rst +doc/build/core/index.rst +doc/build/core/inspection.rst +doc/build/core/interfaces.rst +doc/build/core/internals.rst +doc/build/core/metadata.rst 
+doc/build/core/pooling.rst +doc/build/core/reflection.rst +doc/build/core/schema.rst +doc/build/core/selectable.rst +doc/build/core/serializer.rst +doc/build/core/sqla_engine_arch.png +doc/build/core/sqlelement.rst +doc/build/core/tutorial.rst +doc/build/core/type_api.rst +doc/build/core/type_basics.rst +doc/build/core/types.rst +doc/build/dialects/firebird.rst +doc/build/dialects/index.rst +doc/build/dialects/mssql.rst +doc/build/dialects/mysql.rst +doc/build/dialects/oracle.rst +doc/build/dialects/postgresql.rst +doc/build/dialects/sqlite.rst +doc/build/dialects/sybase.rst +doc/build/faq/connections.rst +doc/build/faq/index.rst +doc/build/faq/metadata_schema.rst +doc/build/faq/ormconfiguration.rst +doc/build/faq/performance.rst +doc/build/faq/sessions.rst +doc/build/faq/sqlexpressions.rst +doc/build/orm/backref.rst +doc/build/orm/basic_relationships.rst +doc/build/orm/cascades.rst +doc/build/orm/classical.rst +doc/build/orm/collections.rst +doc/build/orm/composites.rst +doc/build/orm/constructors.rst +doc/build/orm/contextual.rst +doc/build/orm/deprecated.rst +doc/build/orm/events.rst +doc/build/orm/examples.rst +doc/build/orm/exceptions.rst +doc/build/orm/extending.rst +doc/build/orm/index.rst +doc/build/orm/inheritance.rst +doc/build/orm/internals.rst +doc/build/orm/join_conditions.rst +doc/build/orm/loading.rst +doc/build/orm/loading_columns.rst +doc/build/orm/loading_objects.rst +doc/build/orm/loading_relationships.rst +doc/build/orm/mapped_attributes.rst +doc/build/orm/mapped_sql_expr.rst +doc/build/orm/mapper_config.rst +doc/build/orm/mapping_api.rst +doc/build/orm/mapping_columns.rst +doc/build/orm/mapping_styles.rst +doc/build/orm/nonstandard_mappings.rst +doc/build/orm/persistence_techniques.rst +doc/build/orm/query.rst +doc/build/orm/relationship_api.rst +doc/build/orm/relationship_persistence.rst +doc/build/orm/relationships.rst +doc/build/orm/scalar_mapping.rst +doc/build/orm/self_referential.rst +doc/build/orm/session.rst 
+doc/build/orm/session_api.rst +doc/build/orm/session_basics.rst +doc/build/orm/session_events.rst +doc/build/orm/session_state_management.rst +doc/build/orm/session_transaction.rst +doc/build/orm/tutorial.rst +doc/build/orm/versioning.rst +doc/build/orm/extensions/associationproxy.rst +doc/build/orm/extensions/automap.rst +doc/build/orm/extensions/baked.rst +doc/build/orm/extensions/horizontal_shard.rst +doc/build/orm/extensions/hybrid.rst +doc/build/orm/extensions/index.rst +doc/build/orm/extensions/instrumentation.rst +doc/build/orm/extensions/mutable.rst +doc/build/orm/extensions/orderinglist.rst +doc/build/orm/extensions/declarative/api.rst +doc/build/orm/extensions/declarative/basic_use.rst +doc/build/orm/extensions/declarative/index.rst +doc/build/orm/extensions/declarative/inheritance.rst +doc/build/orm/extensions/declarative/mixins.rst +doc/build/orm/extensions/declarative/relationships.rst +doc/build/orm/extensions/declarative/table_config.rst +doc/build/texinputs/Makefile +doc/build/texinputs/sphinx.sty +doc/changelog/changelog_01.html +doc/changelog/changelog_02.html +doc/changelog/changelog_03.html +doc/changelog/changelog_04.html +doc/changelog/changelog_05.html +doc/changelog/changelog_06.html +doc/changelog/changelog_07.html +doc/changelog/changelog_08.html +doc/changelog/changelog_09.html +doc/changelog/changelog_10.html +doc/changelog/index.html +doc/changelog/migration_04.html +doc/changelog/migration_05.html +doc/changelog/migration_06.html +doc/changelog/migration_07.html +doc/changelog/migration_08.html +doc/changelog/migration_09.html +doc/changelog/migration_10.html +doc/core/api_basics.html +doc/core/compiler.html +doc/core/connections.html +doc/core/constraints.html +doc/core/custom_types.html +doc/core/ddl.html +doc/core/defaults.html +doc/core/dml.html +doc/core/engines.html +doc/core/engines_connections.html +doc/core/event.html +doc/core/events.html +doc/core/exceptions.html +doc/core/expression_api.html +doc/core/functions.html 
+doc/core/index.html +doc/core/inspection.html +doc/core/interfaces.html +doc/core/internals.html +doc/core/metadata.html +doc/core/pooling.html +doc/core/reflection.html +doc/core/schema.html +doc/core/selectable.html +doc/core/serializer.html +doc/core/sqlelement.html +doc/core/tutorial.html +doc/core/type_api.html +doc/core/type_basics.html +doc/core/types.html +doc/dialects/firebird.html +doc/dialects/index.html +doc/dialects/mssql.html +doc/dialects/mysql.html +doc/dialects/oracle.html +doc/dialects/postgresql.html +doc/dialects/sqlite.html +doc/dialects/sybase.html +doc/faq/connections.html +doc/faq/index.html +doc/faq/metadata_schema.html +doc/faq/ormconfiguration.html +doc/faq/performance.html +doc/faq/sessions.html +doc/faq/sqlexpressions.html +doc/orm/backref.html +doc/orm/basic_relationships.html +doc/orm/cascades.html +doc/orm/classical.html +doc/orm/collections.html +doc/orm/composites.html +doc/orm/constructors.html +doc/orm/contextual.html +doc/orm/deprecated.html +doc/orm/events.html +doc/orm/examples.html +doc/orm/exceptions.html +doc/orm/extending.html +doc/orm/index.html +doc/orm/inheritance.html +doc/orm/internals.html +doc/orm/join_conditions.html +doc/orm/loading.html +doc/orm/loading_columns.html +doc/orm/loading_objects.html +doc/orm/loading_relationships.html +doc/orm/mapped_attributes.html +doc/orm/mapped_sql_expr.html +doc/orm/mapper_config.html +doc/orm/mapping_api.html +doc/orm/mapping_columns.html +doc/orm/mapping_styles.html +doc/orm/nonstandard_mappings.html +doc/orm/persistence_techniques.html +doc/orm/query.html +doc/orm/relationship_api.html +doc/orm/relationship_persistence.html +doc/orm/relationships.html +doc/orm/scalar_mapping.html +doc/orm/self_referential.html +doc/orm/session.html +doc/orm/session_api.html +doc/orm/session_basics.html +doc/orm/session_events.html +doc/orm/session_state_management.html +doc/orm/session_transaction.html +doc/orm/tutorial.html +doc/orm/versioning.html +doc/orm/extensions/associationproxy.html 
+doc/orm/extensions/automap.html +doc/orm/extensions/baked.html +doc/orm/extensions/horizontal_shard.html +doc/orm/extensions/hybrid.html +doc/orm/extensions/index.html +doc/orm/extensions/instrumentation.html +doc/orm/extensions/mutable.html +doc/orm/extensions/orderinglist.html +doc/orm/extensions/declarative/api.html +doc/orm/extensions/declarative/basic_use.html +doc/orm/extensions/declarative/index.html +doc/orm/extensions/declarative/inheritance.html +doc/orm/extensions/declarative/mixins.html +doc/orm/extensions/declarative/relationships.html +doc/orm/extensions/declarative/table_config.html +examples/__init__.py +examples/adjacency_list/__init__.py +examples/adjacency_list/adjacency_list.py +examples/association/__init__.py +examples/association/basic_association.py +examples/association/dict_of_sets_with_default.py +examples/association/proxied_association.py +examples/custom_attributes/__init__.py +examples/custom_attributes/custom_management.py +examples/custom_attributes/listen_for_events.py +examples/dogpile_caching/__init__.py +examples/dogpile_caching/advanced.py +examples/dogpile_caching/caching_query.py +examples/dogpile_caching/environment.py +examples/dogpile_caching/fixture_data.py +examples/dogpile_caching/helloworld.py +examples/dogpile_caching/local_session_caching.py +examples/dogpile_caching/model.py +examples/dogpile_caching/relationship_caching.py +examples/dynamic_dict/__init__.py +examples/dynamic_dict/dynamic_dict.py +examples/elementtree/__init__.py +examples/elementtree/adjacency_list.py +examples/elementtree/optimized_al.py +examples/elementtree/pickle.py +examples/elementtree/test.xml +examples/elementtree/test2.xml +examples/elementtree/test3.xml +examples/generic_associations/__init__.py +examples/generic_associations/discriminator_on_association.py +examples/generic_associations/generic_fk.py +examples/generic_associations/table_per_association.py +examples/generic_associations/table_per_related.py +examples/graphs/__init__.py 
+examples/graphs/directed_graph.py +examples/inheritance/__init__.py +examples/inheritance/concrete.py +examples/inheritance/joined.py +examples/inheritance/single.py +examples/join_conditions/__init__.py +examples/join_conditions/cast.py +examples/join_conditions/threeway.py +examples/large_collection/__init__.py +examples/large_collection/large_collection.py +examples/materialized_paths/__init__.py +examples/materialized_paths/materialized_paths.py +examples/nested_sets/__init__.py +examples/nested_sets/nested_sets.py +examples/performance/__init__.py +examples/performance/__main__.py +examples/performance/bulk_inserts.py +examples/performance/bulk_updates.py +examples/performance/large_resultsets.py +examples/performance/short_selects.py +examples/performance/single_inserts.py +examples/postgis/__init__.py +examples/postgis/postgis.py +examples/sharding/__init__.py +examples/sharding/attribute_shard.py +examples/versioned_history/__init__.py +examples/versioned_history/history_meta.py +examples/versioned_history/test_versioning.py +examples/versioned_rows/__init__.py +examples/versioned_rows/versioned_map.py +examples/versioned_rows/versioned_rows.py +examples/vertical/__init__.py +examples/vertical/dictlike-polymorphic.py +examples/vertical/dictlike.py +lib/SQLAlchemy.egg-info/PKG-INFO +lib/SQLAlchemy.egg-info/SOURCES.txt +lib/SQLAlchemy.egg-info/dependency_links.txt +lib/SQLAlchemy.egg-info/top_level.txt +lib/sqlalchemy/__init__.py +lib/sqlalchemy/events.py +lib/sqlalchemy/exc.py +lib/sqlalchemy/inspection.py +lib/sqlalchemy/interfaces.py +lib/sqlalchemy/log.py +lib/sqlalchemy/pool.py +lib/sqlalchemy/processors.py +lib/sqlalchemy/schema.py +lib/sqlalchemy/types.py +lib/sqlalchemy/cextension/processors.c +lib/sqlalchemy/cextension/resultproxy.c +lib/sqlalchemy/cextension/utils.c +lib/sqlalchemy/connectors/__init__.py +lib/sqlalchemy/connectors/mxodbc.py +lib/sqlalchemy/connectors/pyodbc.py +lib/sqlalchemy/connectors/zxJDBC.py 
+lib/sqlalchemy/databases/__init__.py +lib/sqlalchemy/dialects/__init__.py +lib/sqlalchemy/dialects/postgres.py +lib/sqlalchemy/dialects/type_migration_guidelines.txt +lib/sqlalchemy/dialects/firebird/__init__.py +lib/sqlalchemy/dialects/firebird/base.py +lib/sqlalchemy/dialects/firebird/fdb.py +lib/sqlalchemy/dialects/firebird/kinterbasdb.py +lib/sqlalchemy/dialects/mssql/__init__.py +lib/sqlalchemy/dialects/mssql/adodbapi.py +lib/sqlalchemy/dialects/mssql/base.py +lib/sqlalchemy/dialects/mssql/information_schema.py +lib/sqlalchemy/dialects/mssql/mxodbc.py +lib/sqlalchemy/dialects/mssql/pymssql.py +lib/sqlalchemy/dialects/mssql/pyodbc.py +lib/sqlalchemy/dialects/mssql/zxjdbc.py +lib/sqlalchemy/dialects/mysql/__init__.py +lib/sqlalchemy/dialects/mysql/base.py +lib/sqlalchemy/dialects/mysql/cymysql.py +lib/sqlalchemy/dialects/mysql/gaerdbms.py +lib/sqlalchemy/dialects/mysql/mysqlconnector.py +lib/sqlalchemy/dialects/mysql/mysqldb.py +lib/sqlalchemy/dialects/mysql/oursql.py +lib/sqlalchemy/dialects/mysql/pymysql.py +lib/sqlalchemy/dialects/mysql/pyodbc.py +lib/sqlalchemy/dialects/mysql/zxjdbc.py +lib/sqlalchemy/dialects/oracle/__init__.py +lib/sqlalchemy/dialects/oracle/base.py +lib/sqlalchemy/dialects/oracle/cx_oracle.py +lib/sqlalchemy/dialects/oracle/zxjdbc.py +lib/sqlalchemy/dialects/postgresql/__init__.py +lib/sqlalchemy/dialects/postgresql/base.py +lib/sqlalchemy/dialects/postgresql/constraints.py +lib/sqlalchemy/dialects/postgresql/hstore.py +lib/sqlalchemy/dialects/postgresql/json.py +lib/sqlalchemy/dialects/postgresql/pg8000.py +lib/sqlalchemy/dialects/postgresql/psycopg2.py +lib/sqlalchemy/dialects/postgresql/psycopg2cffi.py +lib/sqlalchemy/dialects/postgresql/pypostgresql.py +lib/sqlalchemy/dialects/postgresql/ranges.py +lib/sqlalchemy/dialects/postgresql/zxjdbc.py +lib/sqlalchemy/dialects/sqlite/__init__.py +lib/sqlalchemy/dialects/sqlite/base.py +lib/sqlalchemy/dialects/sqlite/pysqlcipher.py +lib/sqlalchemy/dialects/sqlite/pysqlite.py 
+lib/sqlalchemy/dialects/sybase/__init__.py +lib/sqlalchemy/dialects/sybase/base.py +lib/sqlalchemy/dialects/sybase/mxodbc.py +lib/sqlalchemy/dialects/sybase/pyodbc.py +lib/sqlalchemy/dialects/sybase/pysybase.py +lib/sqlalchemy/engine/__init__.py +lib/sqlalchemy/engine/base.py +lib/sqlalchemy/engine/default.py +lib/sqlalchemy/engine/interfaces.py +lib/sqlalchemy/engine/reflection.py +lib/sqlalchemy/engine/result.py +lib/sqlalchemy/engine/strategies.py +lib/sqlalchemy/engine/threadlocal.py +lib/sqlalchemy/engine/url.py +lib/sqlalchemy/engine/util.py +lib/sqlalchemy/event/__init__.py +lib/sqlalchemy/event/api.py +lib/sqlalchemy/event/attr.py +lib/sqlalchemy/event/base.py +lib/sqlalchemy/event/legacy.py +lib/sqlalchemy/event/registry.py +lib/sqlalchemy/ext/__init__.py +lib/sqlalchemy/ext/associationproxy.py +lib/sqlalchemy/ext/automap.py +lib/sqlalchemy/ext/baked.py +lib/sqlalchemy/ext/compiler.py +lib/sqlalchemy/ext/horizontal_shard.py +lib/sqlalchemy/ext/hybrid.py +lib/sqlalchemy/ext/instrumentation.py +lib/sqlalchemy/ext/mutable.py +lib/sqlalchemy/ext/orderinglist.py +lib/sqlalchemy/ext/serializer.py +lib/sqlalchemy/ext/declarative/__init__.py +lib/sqlalchemy/ext/declarative/api.py +lib/sqlalchemy/ext/declarative/base.py +lib/sqlalchemy/ext/declarative/clsregistry.py +lib/sqlalchemy/orm/__init__.py +lib/sqlalchemy/orm/attributes.py +lib/sqlalchemy/orm/base.py +lib/sqlalchemy/orm/collections.py +lib/sqlalchemy/orm/dependency.py +lib/sqlalchemy/orm/deprecated_interfaces.py +lib/sqlalchemy/orm/descriptor_props.py +lib/sqlalchemy/orm/dynamic.py +lib/sqlalchemy/orm/evaluator.py +lib/sqlalchemy/orm/events.py +lib/sqlalchemy/orm/exc.py +lib/sqlalchemy/orm/identity.py +lib/sqlalchemy/orm/instrumentation.py +lib/sqlalchemy/orm/interfaces.py +lib/sqlalchemy/orm/loading.py +lib/sqlalchemy/orm/mapper.py +lib/sqlalchemy/orm/path_registry.py +lib/sqlalchemy/orm/persistence.py +lib/sqlalchemy/orm/properties.py +lib/sqlalchemy/orm/query.py +lib/sqlalchemy/orm/relationships.py 
+lib/sqlalchemy/orm/scoping.py +lib/sqlalchemy/orm/session.py +lib/sqlalchemy/orm/state.py +lib/sqlalchemy/orm/strategies.py +lib/sqlalchemy/orm/strategy_options.py +lib/sqlalchemy/orm/sync.py +lib/sqlalchemy/orm/unitofwork.py +lib/sqlalchemy/orm/util.py +lib/sqlalchemy/sql/__init__.py +lib/sqlalchemy/sql/annotation.py +lib/sqlalchemy/sql/base.py +lib/sqlalchemy/sql/compiler.py +lib/sqlalchemy/sql/crud.py +lib/sqlalchemy/sql/ddl.py +lib/sqlalchemy/sql/default_comparator.py +lib/sqlalchemy/sql/dml.py +lib/sqlalchemy/sql/elements.py +lib/sqlalchemy/sql/expression.py +lib/sqlalchemy/sql/functions.py +lib/sqlalchemy/sql/naming.py +lib/sqlalchemy/sql/operators.py +lib/sqlalchemy/sql/schema.py +lib/sqlalchemy/sql/selectable.py +lib/sqlalchemy/sql/sqltypes.py +lib/sqlalchemy/sql/type_api.py +lib/sqlalchemy/sql/util.py +lib/sqlalchemy/sql/visitors.py +lib/sqlalchemy/testing/__init__.py +lib/sqlalchemy/testing/assertions.py +lib/sqlalchemy/testing/assertsql.py +lib/sqlalchemy/testing/config.py +lib/sqlalchemy/testing/distutils_run.py +lib/sqlalchemy/testing/engines.py +lib/sqlalchemy/testing/entities.py +lib/sqlalchemy/testing/exclusions.py +lib/sqlalchemy/testing/fixtures.py +lib/sqlalchemy/testing/mock.py +lib/sqlalchemy/testing/pickleable.py +lib/sqlalchemy/testing/profiling.py +lib/sqlalchemy/testing/provision.py +lib/sqlalchemy/testing/replay_fixture.py +lib/sqlalchemy/testing/requirements.py +lib/sqlalchemy/testing/runner.py +lib/sqlalchemy/testing/schema.py +lib/sqlalchemy/testing/util.py +lib/sqlalchemy/testing/warnings.py +lib/sqlalchemy/testing/plugin/__init__.py +lib/sqlalchemy/testing/plugin/bootstrap.py +lib/sqlalchemy/testing/plugin/noseplugin.py +lib/sqlalchemy/testing/plugin/plugin_base.py +lib/sqlalchemy/testing/plugin/pytestplugin.py +lib/sqlalchemy/testing/suite/__init__.py +lib/sqlalchemy/testing/suite/test_ddl.py +lib/sqlalchemy/testing/suite/test_dialect.py +lib/sqlalchemy/testing/suite/test_insert.py +lib/sqlalchemy/testing/suite/test_reflection.py 
+lib/sqlalchemy/testing/suite/test_results.py +lib/sqlalchemy/testing/suite/test_select.py +lib/sqlalchemy/testing/suite/test_sequence.py +lib/sqlalchemy/testing/suite/test_types.py +lib/sqlalchemy/testing/suite/test_update_delete.py +lib/sqlalchemy/util/__init__.py +lib/sqlalchemy/util/_collections.py +lib/sqlalchemy/util/compat.py +lib/sqlalchemy/util/deprecations.py +lib/sqlalchemy/util/langhelpers.py +lib/sqlalchemy/util/queue.py +lib/sqlalchemy/util/topological.py +test/__init__.py +test/binary_data_one.dat +test/binary_data_two.dat +test/conftest.py +test/requirements.py +test/aaa_profiling/__init__.py +test/aaa_profiling/test_compiler.py +test/aaa_profiling/test_memusage.py +test/aaa_profiling/test_orm.py +test/aaa_profiling/test_pool.py +test/aaa_profiling/test_resultset.py +test/aaa_profiling/test_zoomark.py +test/aaa_profiling/test_zoomark_orm.py +test/base/__init__.py +test/base/test_dependency.py +test/base/test_events.py +test/base/test_except.py +test/base/test_inspect.py +test/base/test_tutorials.py +test/base/test_utils.py +test/dialect/__init__.py +test/dialect/test_firebird.py +test/dialect/test_mxodbc.py +test/dialect/test_oracle.py +test/dialect/test_pyodbc.py +test/dialect/test_sqlite.py +test/dialect/test_suite.py +test/dialect/test_sybase.py +test/dialect/mssql/__init__.py +test/dialect/mssql/test_compiler.py +test/dialect/mssql/test_engine.py +test/dialect/mssql/test_query.py +test/dialect/mssql/test_reflection.py +test/dialect/mssql/test_types.py +test/dialect/mysql/__init__.py +test/dialect/mysql/test_compiler.py +test/dialect/mysql/test_dialect.py +test/dialect/mysql/test_query.py +test/dialect/mysql/test_reflection.py +test/dialect/mysql/test_types.py +test/dialect/postgresql/__init__.py +test/dialect/postgresql/test_compiler.py +test/dialect/postgresql/test_dialect.py +test/dialect/postgresql/test_query.py +test/dialect/postgresql/test_reflection.py +test/dialect/postgresql/test_types.py +test/engine/__init__.py 
+test/engine/test_bind.py +test/engine/test_ddlevents.py +test/engine/test_execute.py +test/engine/test_logging.py +test/engine/test_parseconnect.py +test/engine/test_pool.py +test/engine/test_processors.py +test/engine/test_reconnect.py +test/engine/test_reflection.py +test/engine/test_transaction.py +test/ext/__init__.py +test/ext/test_associationproxy.py +test/ext/test_automap.py +test/ext/test_baked.py +test/ext/test_compiler.py +test/ext/test_extendedattr.py +test/ext/test_horizontal_shard.py +test/ext/test_hybrid.py +test/ext/test_mutable.py +test/ext/test_orderinglist.py +test/ext/test_serializer.py +test/ext/declarative/__init__.py +test/ext/declarative/test_basic.py +test/ext/declarative/test_clsregistry.py +test/ext/declarative/test_inheritance.py +test/ext/declarative/test_mixin.py +test/ext/declarative/test_reflection.py +test/orm/__init__.py +test/orm/_fixtures.py +test/orm/test_association.py +test/orm/test_assorted_eager.py +test/orm/test_attributes.py +test/orm/test_backref_mutations.py +test/orm/test_bind.py +test/orm/test_bulk.py +test/orm/test_bundle.py +test/orm/test_cascade.py +test/orm/test_collection.py +test/orm/test_compile.py +test/orm/test_composites.py +test/orm/test_cycles.py +test/orm/test_default_strategies.py +test/orm/test_defaults.py +test/orm/test_deferred.py +test/orm/test_deprecations.py +test/orm/test_descriptor.py +test/orm/test_dynamic.py +test/orm/test_eager_relations.py +test/orm/test_evaluator.py +test/orm/test_events.py +test/orm/test_expire.py +test/orm/test_froms.py +test/orm/test_generative.py +test/orm/test_hasparent.py +test/orm/test_immediate_load.py +test/orm/test_inspect.py +test/orm/test_instrumentation.py +test/orm/test_joins.py +test/orm/test_lazy_relations.py +test/orm/test_load_on_fks.py +test/orm/test_loading.py +test/orm/test_lockmode.py +test/orm/test_manytomany.py +test/orm/test_mapper.py +test/orm/test_merge.py +test/orm/test_naturalpks.py +test/orm/test_of_type.py +test/orm/test_onetoone.py 
+test/orm/test_options.py +test/orm/test_pickled.py +test/orm/test_query.py +test/orm/test_rel_fn.py +test/orm/test_relationships.py +test/orm/test_scoping.py +test/orm/test_selectable.py +test/orm/test_session.py +test/orm/test_subquery_relations.py +test/orm/test_sync.py +test/orm/test_transaction.py +test/orm/test_unitofwork.py +test/orm/test_unitofworkv2.py +test/orm/test_update_delete.py +test/orm/test_utils.py +test/orm/test_validators.py +test/orm/test_versioning.py +test/orm/inheritance/__init__.py +test/orm/inheritance/_poly_fixtures.py +test/orm/inheritance/test_abc_inheritance.py +test/orm/inheritance/test_abc_polymorphic.py +test/orm/inheritance/test_assorted_poly.py +test/orm/inheritance/test_basic.py +test/orm/inheritance/test_concrete.py +test/orm/inheritance/test_magazine.py +test/orm/inheritance/test_manytomany.py +test/orm/inheritance/test_poly_linked_list.py +test/orm/inheritance/test_poly_persistence.py +test/orm/inheritance/test_polymorphic_rel.py +test/orm/inheritance/test_productspec.py +test/orm/inheritance/test_relationship.py +test/orm/inheritance/test_selects.py +test/orm/inheritance/test_single.py +test/orm/inheritance/test_with_poly.py +test/perf/invalidate_stresstest.py +test/perf/orm2010.py +test/sql/__init__.py +test/sql/test_case_statement.py +test/sql/test_compiler.py +test/sql/test_constraints.py +test/sql/test_cte.py +test/sql/test_ddlemit.py +test/sql/test_defaults.py +test/sql/test_delete.py +test/sql/test_functions.py +test/sql/test_generative.py +test/sql/test_insert.py +test/sql/test_insert_exec.py +test/sql/test_inspect.py +test/sql/test_join_rewriting.py +test/sql/test_labels.py +test/sql/test_metadata.py +test/sql/test_operators.py +test/sql/test_query.py +test/sql/test_quote.py +test/sql/test_resultset.py +test/sql/test_returning.py +test/sql/test_rowcount.py +test/sql/test_selectable.py +test/sql/test_text.py +test/sql/test_type_expressions.py +test/sql/test_types.py +test/sql/test_unicode.py +test/sql/test_update.py \ 
No newline at end of file diff --git a/lib/python3.4/site-packages/netifaces-0.10.6.dist-info/zip-safe b/lib/python3.7/site-packages/SQLAlchemy-1.0.12.egg-info/dependency_links.txt similarity index 100% rename from lib/python3.4/site-packages/netifaces-0.10.6.dist-info/zip-safe rename to lib/python3.7/site-packages/SQLAlchemy-1.0.12.egg-info/dependency_links.txt diff --git a/lib/python3.7/site-packages/SQLAlchemy-1.0.12.egg-info/installed-files.txt b/lib/python3.7/site-packages/SQLAlchemy-1.0.12.egg-info/installed-files.txt new file mode 100644 index 0000000..2ef7e78 --- /dev/null +++ b/lib/python3.7/site-packages/SQLAlchemy-1.0.12.egg-info/installed-files.txt @@ -0,0 +1,373 @@ +../sqlalchemy/__init__.py +../sqlalchemy/__pycache__/__init__.cpython-37.pyc +../sqlalchemy/__pycache__/events.cpython-37.pyc +../sqlalchemy/__pycache__/exc.cpython-37.pyc +../sqlalchemy/__pycache__/inspection.cpython-37.pyc +../sqlalchemy/__pycache__/interfaces.cpython-37.pyc +../sqlalchemy/__pycache__/log.cpython-37.pyc +../sqlalchemy/__pycache__/pool.cpython-37.pyc +../sqlalchemy/__pycache__/processors.cpython-37.pyc +../sqlalchemy/__pycache__/schema.cpython-37.pyc +../sqlalchemy/__pycache__/types.cpython-37.pyc +../sqlalchemy/connectors/__init__.py +../sqlalchemy/connectors/__pycache__/__init__.cpython-37.pyc +../sqlalchemy/connectors/__pycache__/mxodbc.cpython-37.pyc +../sqlalchemy/connectors/__pycache__/pyodbc.cpython-37.pyc +../sqlalchemy/connectors/__pycache__/zxJDBC.cpython-37.pyc +../sqlalchemy/connectors/mxodbc.py +../sqlalchemy/connectors/pyodbc.py +../sqlalchemy/connectors/zxJDBC.py +../sqlalchemy/cprocessors.cpython-37m-x86_64-linux-gnu.so +../sqlalchemy/cresultproxy.cpython-37m-x86_64-linux-gnu.so +../sqlalchemy/cutils.cpython-37m-x86_64-linux-gnu.so +../sqlalchemy/databases/__init__.py +../sqlalchemy/databases/__pycache__/__init__.cpython-37.pyc +../sqlalchemy/dialects/__init__.py +../sqlalchemy/dialects/__pycache__/__init__.cpython-37.pyc 
+../sqlalchemy/dialects/__pycache__/postgres.cpython-37.pyc +../sqlalchemy/dialects/firebird/__init__.py +../sqlalchemy/dialects/firebird/__pycache__/__init__.cpython-37.pyc +../sqlalchemy/dialects/firebird/__pycache__/base.cpython-37.pyc +../sqlalchemy/dialects/firebird/__pycache__/fdb.cpython-37.pyc +../sqlalchemy/dialects/firebird/__pycache__/kinterbasdb.cpython-37.pyc +../sqlalchemy/dialects/firebird/base.py +../sqlalchemy/dialects/firebird/fdb.py +../sqlalchemy/dialects/firebird/kinterbasdb.py +../sqlalchemy/dialects/mssql/__init__.py +../sqlalchemy/dialects/mssql/__pycache__/__init__.cpython-37.pyc +../sqlalchemy/dialects/mssql/__pycache__/adodbapi.cpython-37.pyc +../sqlalchemy/dialects/mssql/__pycache__/base.cpython-37.pyc +../sqlalchemy/dialects/mssql/__pycache__/information_schema.cpython-37.pyc +../sqlalchemy/dialects/mssql/__pycache__/mxodbc.cpython-37.pyc +../sqlalchemy/dialects/mssql/__pycache__/pymssql.cpython-37.pyc +../sqlalchemy/dialects/mssql/__pycache__/pyodbc.cpython-37.pyc +../sqlalchemy/dialects/mssql/__pycache__/zxjdbc.cpython-37.pyc +../sqlalchemy/dialects/mssql/adodbapi.py +../sqlalchemy/dialects/mssql/base.py +../sqlalchemy/dialects/mssql/information_schema.py +../sqlalchemy/dialects/mssql/mxodbc.py +../sqlalchemy/dialects/mssql/pymssql.py +../sqlalchemy/dialects/mssql/pyodbc.py +../sqlalchemy/dialects/mssql/zxjdbc.py +../sqlalchemy/dialects/mysql/__init__.py +../sqlalchemy/dialects/mysql/__pycache__/__init__.cpython-37.pyc +../sqlalchemy/dialects/mysql/__pycache__/base.cpython-37.pyc +../sqlalchemy/dialects/mysql/__pycache__/cymysql.cpython-37.pyc +../sqlalchemy/dialects/mysql/__pycache__/gaerdbms.cpython-37.pyc +../sqlalchemy/dialects/mysql/__pycache__/mysqlconnector.cpython-37.pyc +../sqlalchemy/dialects/mysql/__pycache__/mysqldb.cpython-37.pyc +../sqlalchemy/dialects/mysql/__pycache__/oursql.cpython-37.pyc +../sqlalchemy/dialects/mysql/__pycache__/pymysql.cpython-37.pyc +../sqlalchemy/dialects/mysql/__pycache__/pyodbc.cpython-37.pyc 
+../sqlalchemy/dialects/mysql/__pycache__/zxjdbc.cpython-37.pyc +../sqlalchemy/dialects/mysql/base.py +../sqlalchemy/dialects/mysql/cymysql.py +../sqlalchemy/dialects/mysql/gaerdbms.py +../sqlalchemy/dialects/mysql/mysqlconnector.py +../sqlalchemy/dialects/mysql/mysqldb.py +../sqlalchemy/dialects/mysql/oursql.py +../sqlalchemy/dialects/mysql/pymysql.py +../sqlalchemy/dialects/mysql/pyodbc.py +../sqlalchemy/dialects/mysql/zxjdbc.py +../sqlalchemy/dialects/oracle/__init__.py +../sqlalchemy/dialects/oracle/__pycache__/__init__.cpython-37.pyc +../sqlalchemy/dialects/oracle/__pycache__/base.cpython-37.pyc +../sqlalchemy/dialects/oracle/__pycache__/cx_oracle.cpython-37.pyc +../sqlalchemy/dialects/oracle/__pycache__/zxjdbc.cpython-37.pyc +../sqlalchemy/dialects/oracle/base.py +../sqlalchemy/dialects/oracle/cx_oracle.py +../sqlalchemy/dialects/oracle/zxjdbc.py +../sqlalchemy/dialects/postgres.py +../sqlalchemy/dialects/postgresql/__init__.py +../sqlalchemy/dialects/postgresql/__pycache__/__init__.cpython-37.pyc +../sqlalchemy/dialects/postgresql/__pycache__/base.cpython-37.pyc +../sqlalchemy/dialects/postgresql/__pycache__/constraints.cpython-37.pyc +../sqlalchemy/dialects/postgresql/__pycache__/hstore.cpython-37.pyc +../sqlalchemy/dialects/postgresql/__pycache__/json.cpython-37.pyc +../sqlalchemy/dialects/postgresql/__pycache__/pg8000.cpython-37.pyc +../sqlalchemy/dialects/postgresql/__pycache__/psycopg2.cpython-37.pyc +../sqlalchemy/dialects/postgresql/__pycache__/psycopg2cffi.cpython-37.pyc +../sqlalchemy/dialects/postgresql/__pycache__/pypostgresql.cpython-37.pyc +../sqlalchemy/dialects/postgresql/__pycache__/ranges.cpython-37.pyc +../sqlalchemy/dialects/postgresql/__pycache__/zxjdbc.cpython-37.pyc +../sqlalchemy/dialects/postgresql/base.py +../sqlalchemy/dialects/postgresql/constraints.py +../sqlalchemy/dialects/postgresql/hstore.py +../sqlalchemy/dialects/postgresql/json.py +../sqlalchemy/dialects/postgresql/pg8000.py +../sqlalchemy/dialects/postgresql/psycopg2.py 
+../sqlalchemy/dialects/postgresql/psycopg2cffi.py +../sqlalchemy/dialects/postgresql/pypostgresql.py +../sqlalchemy/dialects/postgresql/ranges.py +../sqlalchemy/dialects/postgresql/zxjdbc.py +../sqlalchemy/dialects/sqlite/__init__.py +../sqlalchemy/dialects/sqlite/__pycache__/__init__.cpython-37.pyc +../sqlalchemy/dialects/sqlite/__pycache__/base.cpython-37.pyc +../sqlalchemy/dialects/sqlite/__pycache__/pysqlcipher.cpython-37.pyc +../sqlalchemy/dialects/sqlite/__pycache__/pysqlite.cpython-37.pyc +../sqlalchemy/dialects/sqlite/base.py +../sqlalchemy/dialects/sqlite/pysqlcipher.py +../sqlalchemy/dialects/sqlite/pysqlite.py +../sqlalchemy/dialects/sybase/__init__.py +../sqlalchemy/dialects/sybase/__pycache__/__init__.cpython-37.pyc +../sqlalchemy/dialects/sybase/__pycache__/base.cpython-37.pyc +../sqlalchemy/dialects/sybase/__pycache__/mxodbc.cpython-37.pyc +../sqlalchemy/dialects/sybase/__pycache__/pyodbc.cpython-37.pyc +../sqlalchemy/dialects/sybase/__pycache__/pysybase.cpython-37.pyc +../sqlalchemy/dialects/sybase/base.py +../sqlalchemy/dialects/sybase/mxodbc.py +../sqlalchemy/dialects/sybase/pyodbc.py +../sqlalchemy/dialects/sybase/pysybase.py +../sqlalchemy/engine/__init__.py +../sqlalchemy/engine/__pycache__/__init__.cpython-37.pyc +../sqlalchemy/engine/__pycache__/base.cpython-37.pyc +../sqlalchemy/engine/__pycache__/default.cpython-37.pyc +../sqlalchemy/engine/__pycache__/interfaces.cpython-37.pyc +../sqlalchemy/engine/__pycache__/reflection.cpython-37.pyc +../sqlalchemy/engine/__pycache__/result.cpython-37.pyc +../sqlalchemy/engine/__pycache__/strategies.cpython-37.pyc +../sqlalchemy/engine/__pycache__/threadlocal.cpython-37.pyc +../sqlalchemy/engine/__pycache__/url.cpython-37.pyc +../sqlalchemy/engine/__pycache__/util.cpython-37.pyc +../sqlalchemy/engine/base.py +../sqlalchemy/engine/default.py +../sqlalchemy/engine/interfaces.py +../sqlalchemy/engine/reflection.py +../sqlalchemy/engine/result.py +../sqlalchemy/engine/strategies.py 
+../sqlalchemy/engine/threadlocal.py +../sqlalchemy/engine/url.py +../sqlalchemy/engine/util.py +../sqlalchemy/event/__init__.py +../sqlalchemy/event/__pycache__/__init__.cpython-37.pyc +../sqlalchemy/event/__pycache__/api.cpython-37.pyc +../sqlalchemy/event/__pycache__/attr.cpython-37.pyc +../sqlalchemy/event/__pycache__/base.cpython-37.pyc +../sqlalchemy/event/__pycache__/legacy.cpython-37.pyc +../sqlalchemy/event/__pycache__/registry.cpython-37.pyc +../sqlalchemy/event/api.py +../sqlalchemy/event/attr.py +../sqlalchemy/event/base.py +../sqlalchemy/event/legacy.py +../sqlalchemy/event/registry.py +../sqlalchemy/events.py +../sqlalchemy/exc.py +../sqlalchemy/ext/__init__.py +../sqlalchemy/ext/__pycache__/__init__.cpython-37.pyc +../sqlalchemy/ext/__pycache__/associationproxy.cpython-37.pyc +../sqlalchemy/ext/__pycache__/automap.cpython-37.pyc +../sqlalchemy/ext/__pycache__/baked.cpython-37.pyc +../sqlalchemy/ext/__pycache__/compiler.cpython-37.pyc +../sqlalchemy/ext/__pycache__/horizontal_shard.cpython-37.pyc +../sqlalchemy/ext/__pycache__/hybrid.cpython-37.pyc +../sqlalchemy/ext/__pycache__/instrumentation.cpython-37.pyc +../sqlalchemy/ext/__pycache__/mutable.cpython-37.pyc +../sqlalchemy/ext/__pycache__/orderinglist.cpython-37.pyc +../sqlalchemy/ext/__pycache__/serializer.cpython-37.pyc +../sqlalchemy/ext/associationproxy.py +../sqlalchemy/ext/automap.py +../sqlalchemy/ext/baked.py +../sqlalchemy/ext/compiler.py +../sqlalchemy/ext/declarative/__init__.py +../sqlalchemy/ext/declarative/__pycache__/__init__.cpython-37.pyc +../sqlalchemy/ext/declarative/__pycache__/api.cpython-37.pyc +../sqlalchemy/ext/declarative/__pycache__/base.cpython-37.pyc +../sqlalchemy/ext/declarative/__pycache__/clsregistry.cpython-37.pyc +../sqlalchemy/ext/declarative/api.py +../sqlalchemy/ext/declarative/base.py +../sqlalchemy/ext/declarative/clsregistry.py +../sqlalchemy/ext/horizontal_shard.py +../sqlalchemy/ext/hybrid.py +../sqlalchemy/ext/instrumentation.py 
+../sqlalchemy/ext/mutable.py +../sqlalchemy/ext/orderinglist.py +../sqlalchemy/ext/serializer.py +../sqlalchemy/inspection.py +../sqlalchemy/interfaces.py +../sqlalchemy/log.py +../sqlalchemy/orm/__init__.py +../sqlalchemy/orm/__pycache__/__init__.cpython-37.pyc +../sqlalchemy/orm/__pycache__/attributes.cpython-37.pyc +../sqlalchemy/orm/__pycache__/base.cpython-37.pyc +../sqlalchemy/orm/__pycache__/collections.cpython-37.pyc +../sqlalchemy/orm/__pycache__/dependency.cpython-37.pyc +../sqlalchemy/orm/__pycache__/deprecated_interfaces.cpython-37.pyc +../sqlalchemy/orm/__pycache__/descriptor_props.cpython-37.pyc +../sqlalchemy/orm/__pycache__/dynamic.cpython-37.pyc +../sqlalchemy/orm/__pycache__/evaluator.cpython-37.pyc +../sqlalchemy/orm/__pycache__/events.cpython-37.pyc +../sqlalchemy/orm/__pycache__/exc.cpython-37.pyc +../sqlalchemy/orm/__pycache__/identity.cpython-37.pyc +../sqlalchemy/orm/__pycache__/instrumentation.cpython-37.pyc +../sqlalchemy/orm/__pycache__/interfaces.cpython-37.pyc +../sqlalchemy/orm/__pycache__/loading.cpython-37.pyc +../sqlalchemy/orm/__pycache__/mapper.cpython-37.pyc +../sqlalchemy/orm/__pycache__/path_registry.cpython-37.pyc +../sqlalchemy/orm/__pycache__/persistence.cpython-37.pyc +../sqlalchemy/orm/__pycache__/properties.cpython-37.pyc +../sqlalchemy/orm/__pycache__/query.cpython-37.pyc +../sqlalchemy/orm/__pycache__/relationships.cpython-37.pyc +../sqlalchemy/orm/__pycache__/scoping.cpython-37.pyc +../sqlalchemy/orm/__pycache__/session.cpython-37.pyc +../sqlalchemy/orm/__pycache__/state.cpython-37.pyc +../sqlalchemy/orm/__pycache__/strategies.cpython-37.pyc +../sqlalchemy/orm/__pycache__/strategy_options.cpython-37.pyc +../sqlalchemy/orm/__pycache__/sync.cpython-37.pyc +../sqlalchemy/orm/__pycache__/unitofwork.cpython-37.pyc +../sqlalchemy/orm/__pycache__/util.cpython-37.pyc +../sqlalchemy/orm/attributes.py +../sqlalchemy/orm/base.py +../sqlalchemy/orm/collections.py +../sqlalchemy/orm/dependency.py 
+../sqlalchemy/orm/deprecated_interfaces.py +../sqlalchemy/orm/descriptor_props.py +../sqlalchemy/orm/dynamic.py +../sqlalchemy/orm/evaluator.py +../sqlalchemy/orm/events.py +../sqlalchemy/orm/exc.py +../sqlalchemy/orm/identity.py +../sqlalchemy/orm/instrumentation.py +../sqlalchemy/orm/interfaces.py +../sqlalchemy/orm/loading.py +../sqlalchemy/orm/mapper.py +../sqlalchemy/orm/path_registry.py +../sqlalchemy/orm/persistence.py +../sqlalchemy/orm/properties.py +../sqlalchemy/orm/query.py +../sqlalchemy/orm/relationships.py +../sqlalchemy/orm/scoping.py +../sqlalchemy/orm/session.py +../sqlalchemy/orm/state.py +../sqlalchemy/orm/strategies.py +../sqlalchemy/orm/strategy_options.py +../sqlalchemy/orm/sync.py +../sqlalchemy/orm/unitofwork.py +../sqlalchemy/orm/util.py +../sqlalchemy/pool.py +../sqlalchemy/processors.py +../sqlalchemy/schema.py +../sqlalchemy/sql/__init__.py +../sqlalchemy/sql/__pycache__/__init__.cpython-37.pyc +../sqlalchemy/sql/__pycache__/annotation.cpython-37.pyc +../sqlalchemy/sql/__pycache__/base.cpython-37.pyc +../sqlalchemy/sql/__pycache__/compiler.cpython-37.pyc +../sqlalchemy/sql/__pycache__/crud.cpython-37.pyc +../sqlalchemy/sql/__pycache__/ddl.cpython-37.pyc +../sqlalchemy/sql/__pycache__/default_comparator.cpython-37.pyc +../sqlalchemy/sql/__pycache__/dml.cpython-37.pyc +../sqlalchemy/sql/__pycache__/elements.cpython-37.pyc +../sqlalchemy/sql/__pycache__/expression.cpython-37.pyc +../sqlalchemy/sql/__pycache__/functions.cpython-37.pyc +../sqlalchemy/sql/__pycache__/naming.cpython-37.pyc +../sqlalchemy/sql/__pycache__/operators.cpython-37.pyc +../sqlalchemy/sql/__pycache__/schema.cpython-37.pyc +../sqlalchemy/sql/__pycache__/selectable.cpython-37.pyc +../sqlalchemy/sql/__pycache__/sqltypes.cpython-37.pyc +../sqlalchemy/sql/__pycache__/type_api.cpython-37.pyc +../sqlalchemy/sql/__pycache__/util.cpython-37.pyc +../sqlalchemy/sql/__pycache__/visitors.cpython-37.pyc +../sqlalchemy/sql/annotation.py +../sqlalchemy/sql/base.py 
+../sqlalchemy/sql/compiler.py +../sqlalchemy/sql/crud.py +../sqlalchemy/sql/ddl.py +../sqlalchemy/sql/default_comparator.py +../sqlalchemy/sql/dml.py +../sqlalchemy/sql/elements.py +../sqlalchemy/sql/expression.py +../sqlalchemy/sql/functions.py +../sqlalchemy/sql/naming.py +../sqlalchemy/sql/operators.py +../sqlalchemy/sql/schema.py +../sqlalchemy/sql/selectable.py +../sqlalchemy/sql/sqltypes.py +../sqlalchemy/sql/type_api.py +../sqlalchemy/sql/util.py +../sqlalchemy/sql/visitors.py +../sqlalchemy/testing/__init__.py +../sqlalchemy/testing/__pycache__/__init__.cpython-37.pyc +../sqlalchemy/testing/__pycache__/assertions.cpython-37.pyc +../sqlalchemy/testing/__pycache__/assertsql.cpython-37.pyc +../sqlalchemy/testing/__pycache__/config.cpython-37.pyc +../sqlalchemy/testing/__pycache__/distutils_run.cpython-37.pyc +../sqlalchemy/testing/__pycache__/engines.cpython-37.pyc +../sqlalchemy/testing/__pycache__/entities.cpython-37.pyc +../sqlalchemy/testing/__pycache__/exclusions.cpython-37.pyc +../sqlalchemy/testing/__pycache__/fixtures.cpython-37.pyc +../sqlalchemy/testing/__pycache__/mock.cpython-37.pyc +../sqlalchemy/testing/__pycache__/pickleable.cpython-37.pyc +../sqlalchemy/testing/__pycache__/profiling.cpython-37.pyc +../sqlalchemy/testing/__pycache__/provision.cpython-37.pyc +../sqlalchemy/testing/__pycache__/replay_fixture.cpython-37.pyc +../sqlalchemy/testing/__pycache__/requirements.cpython-37.pyc +../sqlalchemy/testing/__pycache__/runner.cpython-37.pyc +../sqlalchemy/testing/__pycache__/schema.cpython-37.pyc +../sqlalchemy/testing/__pycache__/util.cpython-37.pyc +../sqlalchemy/testing/__pycache__/warnings.cpython-37.pyc +../sqlalchemy/testing/assertions.py +../sqlalchemy/testing/assertsql.py +../sqlalchemy/testing/config.py +../sqlalchemy/testing/distutils_run.py +../sqlalchemy/testing/engines.py +../sqlalchemy/testing/entities.py +../sqlalchemy/testing/exclusions.py +../sqlalchemy/testing/fixtures.py +../sqlalchemy/testing/mock.py 
+../sqlalchemy/testing/pickleable.py +../sqlalchemy/testing/plugin/__init__.py +../sqlalchemy/testing/plugin/__pycache__/__init__.cpython-37.pyc +../sqlalchemy/testing/plugin/__pycache__/bootstrap.cpython-37.pyc +../sqlalchemy/testing/plugin/__pycache__/noseplugin.cpython-37.pyc +../sqlalchemy/testing/plugin/__pycache__/plugin_base.cpython-37.pyc +../sqlalchemy/testing/plugin/__pycache__/pytestplugin.cpython-37.pyc +../sqlalchemy/testing/plugin/bootstrap.py +../sqlalchemy/testing/plugin/noseplugin.py +../sqlalchemy/testing/plugin/plugin_base.py +../sqlalchemy/testing/plugin/pytestplugin.py +../sqlalchemy/testing/profiling.py +../sqlalchemy/testing/provision.py +../sqlalchemy/testing/replay_fixture.py +../sqlalchemy/testing/requirements.py +../sqlalchemy/testing/runner.py +../sqlalchemy/testing/schema.py +../sqlalchemy/testing/suite/__init__.py +../sqlalchemy/testing/suite/__pycache__/__init__.cpython-37.pyc +../sqlalchemy/testing/suite/__pycache__/test_ddl.cpython-37.pyc +../sqlalchemy/testing/suite/__pycache__/test_dialect.cpython-37.pyc +../sqlalchemy/testing/suite/__pycache__/test_insert.cpython-37.pyc +../sqlalchemy/testing/suite/__pycache__/test_reflection.cpython-37.pyc +../sqlalchemy/testing/suite/__pycache__/test_results.cpython-37.pyc +../sqlalchemy/testing/suite/__pycache__/test_select.cpython-37.pyc +../sqlalchemy/testing/suite/__pycache__/test_sequence.cpython-37.pyc +../sqlalchemy/testing/suite/__pycache__/test_types.cpython-37.pyc +../sqlalchemy/testing/suite/__pycache__/test_update_delete.cpython-37.pyc +../sqlalchemy/testing/suite/test_ddl.py +../sqlalchemy/testing/suite/test_dialect.py +../sqlalchemy/testing/suite/test_insert.py +../sqlalchemy/testing/suite/test_reflection.py +../sqlalchemy/testing/suite/test_results.py +../sqlalchemy/testing/suite/test_select.py +../sqlalchemy/testing/suite/test_sequence.py +../sqlalchemy/testing/suite/test_types.py +../sqlalchemy/testing/suite/test_update_delete.py +../sqlalchemy/testing/util.py 
+../sqlalchemy/testing/warnings.py +../sqlalchemy/types.py +../sqlalchemy/util/__init__.py +../sqlalchemy/util/__pycache__/__init__.cpython-37.pyc +../sqlalchemy/util/__pycache__/_collections.cpython-37.pyc +../sqlalchemy/util/__pycache__/compat.cpython-37.pyc +../sqlalchemy/util/__pycache__/deprecations.cpython-37.pyc +../sqlalchemy/util/__pycache__/langhelpers.cpython-37.pyc +../sqlalchemy/util/__pycache__/queue.cpython-37.pyc +../sqlalchemy/util/__pycache__/topological.cpython-37.pyc +../sqlalchemy/util/_collections.py +../sqlalchemy/util/compat.py +../sqlalchemy/util/deprecations.py +../sqlalchemy/util/langhelpers.py +../sqlalchemy/util/queue.py +../sqlalchemy/util/topological.py +PKG-INFO +SOURCES.txt +dependency_links.txt +top_level.txt diff --git a/lib/python3.4/site-packages/SQLAlchemy-1.0.12.dist-info/top_level.txt b/lib/python3.7/site-packages/SQLAlchemy-1.0.12.egg-info/top_level.txt similarity index 100% rename from lib/python3.4/site-packages/SQLAlchemy-1.0.12.dist-info/top_level.txt rename to lib/python3.7/site-packages/SQLAlchemy-1.0.12.egg-info/top_level.txt diff --git a/lib/python3.4/site-packages/easy_install.py b/lib/python3.7/site-packages/easy_install.py similarity index 100% rename from lib/python3.4/site-packages/easy_install.py rename to lib/python3.7/site-packages/easy_install.py diff --git a/lib/python3.4/site-packages/ed25519-1.4.dist-info/METADATA b/lib/python3.7/site-packages/ed25519-1.4.egg-info/PKG-INFO similarity index 54% rename from lib/python3.4/site-packages/ed25519-1.4.dist-info/METADATA rename to lib/python3.7/site-packages/ed25519-1.4.egg-info/PKG-INFO index 1aee179..654981e 100644 --- a/lib/python3.4/site-packages/ed25519-1.4.dist-info/METADATA +++ b/lib/python3.7/site-packages/ed25519-1.4.egg-info/PKG-INFO @@ -1,4 +1,4 @@ -Metadata-Version: 2.0 +Metadata-Version: 1.1 Name: ed25519 Version: 1.4 Summary: Ed25519 public-key signatures @@ -6,7 +6,16 @@ Home-page: https://github.com/warner/python-ed25519 Author: Brian Warner 
Author-email: warner-python-ed25519@lothar.com License: MIT -Description-Content-Type: UNKNOWN +Description: Python bindings to the Ed25519 public-key signature system. + + This offers a comfortable python interface to a C implementation of the + Ed25519 public-key signature system (http://ed25519.cr.yp.to/), using the + portable 'ref' code from the 'SUPERCOP' benchmarking suite. + + This system provides high (128-bit) security, short (32-byte) keys, short + (64-byte) signatures, and fast (2-6ms) operation. Please see the README for + more details. + Platform: UNKNOWN Classifier: Development Status :: 5 - Production/Stable Classifier: Intended Audience :: Developers @@ -17,15 +26,3 @@ Classifier: Programming Language :: Python :: 2.7 Classifier: Programming Language :: Python :: 3.3 Classifier: Programming Language :: Python :: 3.4 Classifier: Topic :: Security :: Cryptography - -Python bindings to the Ed25519 public-key signature system. - -This offers a comfortable python interface to a C implementation of the -Ed25519 public-key signature system (http://ed25519.cr.yp.to/), using the -portable 'ref' code from the 'SUPERCOP' benchmarking suite. - -This system provides high (128-bit) security, short (32-byte) keys, short -(64-byte) signatures, and fast (2-6ms) operation. Please see the README for -more details. 
- - diff --git a/lib/python3.7/site-packages/ed25519-1.4.egg-info/SOURCES.txt b/lib/python3.7/site-packages/ed25519-1.4.egg-info/SOURCES.txt new file mode 100644 index 0000000..ebe0028 --- /dev/null +++ b/lib/python3.7/site-packages/ed25519-1.4.egg-info/SOURCES.txt @@ -0,0 +1,39 @@ +LICENSE +MANIFEST.in +Makefile +NEWS +README.md +kat-ed25519.txt +kat.py +setup.cfg +test_ed25519_kat.py +versioneer.py +bin/edsig +ed25519.egg-info/PKG-INFO +ed25519.egg-info/SOURCES.txt +ed25519.egg-info/dependency_links.txt +ed25519.egg-info/top_level.txt +src/ed25519/__init__.py +src/ed25519/_version.py +src/ed25519/keys.py +src/ed25519/test_ed25519.py +src/ed25519-glue/ed25519module.c +src/ed25519-supercop-ref/Makefile +src/ed25519-supercop-ref/api.h +src/ed25519-supercop-ref/crypto_int32.h +src/ed25519-supercop-ref/crypto_sign.h +src/ed25519-supercop-ref/crypto_uint32.h +src/ed25519-supercop-ref/crypto_verify_32.h +src/ed25519-supercop-ref/ed25519.c +src/ed25519-supercop-ref/fe25519.c +src/ed25519-supercop-ref/fe25519.h +src/ed25519-supercop-ref/ge25519.c +src/ed25519-supercop-ref/ge25519.h +src/ed25519-supercop-ref/ge25519_base.data +src/ed25519-supercop-ref/sc25519.c +src/ed25519-supercop-ref/sc25519.h +src/ed25519-supercop-ref/sha512-blocks.c +src/ed25519-supercop-ref/sha512-hash.c +src/ed25519-supercop-ref/sha512.h +src/ed25519-supercop-ref/test.c +src/ed25519-supercop-ref/verify.c \ No newline at end of file diff --git a/lib/python3.4/site-packages/setuptools-36.6.0.dist-info/zip-safe b/lib/python3.7/site-packages/ed25519-1.4.egg-info/dependency_links.txt similarity index 100% rename from lib/python3.4/site-packages/setuptools-36.6.0.dist-info/zip-safe rename to lib/python3.7/site-packages/ed25519-1.4.egg-info/dependency_links.txt diff --git a/lib/python3.7/site-packages/ed25519-1.4.egg-info/installed-files.txt b/lib/python3.7/site-packages/ed25519-1.4.egg-info/installed-files.txt new file mode 100644 index 0000000..ee1d87c --- /dev/null +++ 
b/lib/python3.7/site-packages/ed25519-1.4.egg-info/installed-files.txt @@ -0,0 +1,14 @@ +../../../../bin/edsig +../ed25519/__init__.py +../ed25519/__pycache__/__init__.cpython-37.pyc +../ed25519/__pycache__/_version.cpython-37.pyc +../ed25519/__pycache__/keys.cpython-37.pyc +../ed25519/__pycache__/test_ed25519.cpython-37.pyc +../ed25519/_ed25519.cpython-37m-x86_64-linux-gnu.so +../ed25519/_version.py +../ed25519/keys.py +../ed25519/test_ed25519.py +PKG-INFO +SOURCES.txt +dependency_links.txt +top_level.txt diff --git a/lib/python3.4/site-packages/ed25519-1.4.dist-info/top_level.txt b/lib/python3.7/site-packages/ed25519-1.4.egg-info/top_level.txt similarity index 100% rename from lib/python3.4/site-packages/ed25519-1.4.dist-info/top_level.txt rename to lib/python3.7/site-packages/ed25519-1.4.egg-info/top_level.txt diff --git a/lib/python3.4/site-packages/ed25519/__init__.py b/lib/python3.7/site-packages/ed25519/__init__.py similarity index 100% rename from lib/python3.4/site-packages/ed25519/__init__.py rename to lib/python3.7/site-packages/ed25519/__init__.py diff --git a/lib/python3.7/site-packages/ed25519/_ed25519.cpython-35m-x86_64-linux-gnu.so b/lib/python3.7/site-packages/ed25519/_ed25519.cpython-35m-x86_64-linux-gnu.so new file mode 100755 index 0000000..aa56d12 Binary files /dev/null and b/lib/python3.7/site-packages/ed25519/_ed25519.cpython-35m-x86_64-linux-gnu.so differ diff --git a/lib/python3.7/site-packages/ed25519/_ed25519.cpython-36m-x86_64-linux-gnu.so b/lib/python3.7/site-packages/ed25519/_ed25519.cpython-36m-x86_64-linux-gnu.so new file mode 100755 index 0000000..51568d4 Binary files /dev/null and b/lib/python3.7/site-packages/ed25519/_ed25519.cpython-36m-x86_64-linux-gnu.so differ diff --git a/lib/python3.7/site-packages/ed25519/_ed25519.cpython-37m-x86_64-linux-gnu.so b/lib/python3.7/site-packages/ed25519/_ed25519.cpython-37m-x86_64-linux-gnu.so new file mode 100755 index 0000000..ed1ccde Binary files /dev/null and 
b/lib/python3.7/site-packages/ed25519/_ed25519.cpython-37m-x86_64-linux-gnu.so differ diff --git a/lib/python3.4/site-packages/ed25519/_version.py b/lib/python3.7/site-packages/ed25519/_version.py similarity index 100% rename from lib/python3.4/site-packages/ed25519/_version.py rename to lib/python3.7/site-packages/ed25519/_version.py diff --git a/lib/python3.4/site-packages/ed25519/keys.py b/lib/python3.7/site-packages/ed25519/keys.py similarity index 100% rename from lib/python3.4/site-packages/ed25519/keys.py rename to lib/python3.7/site-packages/ed25519/keys.py diff --git a/lib/python3.4/site-packages/ed25519/test_ed25519.py b/lib/python3.7/site-packages/ed25519/test_ed25519.py similarity index 100% rename from lib/python3.4/site-packages/ed25519/test_ed25519.py rename to lib/python3.7/site-packages/ed25519/test_ed25519.py diff --git a/lib/python3.7/site-packages/netifaces-0.10.7.egg-info/PKG-INFO b/lib/python3.7/site-packages/netifaces-0.10.7.egg-info/PKG-INFO new file mode 100644 index 0000000..ac4b3ec --- /dev/null +++ b/lib/python3.7/site-packages/netifaces-0.10.7.egg-info/PKG-INFO @@ -0,0 +1,230 @@ +Metadata-Version: 1.1 +Name: netifaces +Version: 0.10.7 +Summary: Portable network interface information. +Home-page: https://github.com/al45tair/netifaces +Author: Alastair Houghton +Author-email: alastair@alastairs-place.net +License: MIT License +Description: netifaces 0.10.7 + ================ + + +-------------+------------------+ + | Linux/macOS | |BuildStatus| | + +-------------+------------------+ + | Windows | |WinBuildStatus| | + +-------------+------------------+ + + .. |BuildStatus| image:: https://travis-ci.org/al45tair/netifaces.svg?branch=master + :target: https://travis-ci.org/al45tair/dmgbuild + :alt: Build Status (Linux/Mac) + + .. 
|WinBuildStatus| image:: https://ci.appveyor.com/api/projects/status/3ctn1bl0aigpfjoo/branch/master?svg=true + :target: https://ci.appveyor.com/project/al45tair/netifaces/branch/master + :alt: Build Status (Windows) + + 1. What is this? + ---------------- + + It's been annoying me for some time that there's no easy way to get the + address(es) of the machine's network interfaces from Python. There is + a good reason for this difficulty, which is that it is virtually impossible + to do so in a portable manner. However, it seems to me that there should + be a package you can easy_install that will take care of working out the + details of doing so on the machine you're using, then you can get on with + writing Python code without concerning yourself with the nitty gritty of + system-dependent low-level networking APIs. + + This package attempts to solve that problem. + + 2. How do I use it? + ------------------- + + First you need to install it, which you can do by typing:: + + tar xvzf netifaces-0.10.7.tar.gz + cd netifaces-0.10.7 + python setup.py install + + **Note that you will need the relevant developer tools for your platform**, + as netifaces is written in C and installing this way will compile the extension. + + Once that's done, you'll need to start Python and do something like the + following:: + + >>> import netifaces + + Then if you enter + + >>> netifaces.interfaces() + ['lo0', 'gif0', 'stf0', 'en0', 'en1', 'fw0'] + + you'll see the list of interface identifiers for your machine. + + You can ask for the addresses of a particular interface by doing + + >>> netifaces.ifaddresses('lo0') + {18: [{'addr': ''}], 2: [{'peer': '127.0.0.1', 'netmask': '255.0.0.0', 'addr': '127.0.0.1'}], 30: [{'peer': '::1', 'netmask': 'ffff:ffff:ffff:ffff:ffff:ffff:ffff:ffff', 'addr': '::1'}, {'peer': '', 'netmask': 'ffff:ffff:ffff:ffff::', 'addr': 'fe80::1%lo0'}]} + + Hmmmm. That result looks a bit cryptic; let's break it apart and explain + what each piece means. 
It returned a dictionary, so let's look there first:: + + { 18: [...], 2: [...], 30: [...] } + + Each of the numbers refers to a particular address family. In this case, we + have three address families listed; on my system, 18 is ``AF_LINK`` (which means + the link layer interface, e.g. Ethernet), 2 is ``AF_INET`` (normal Internet + addresses), and 30 is ``AF_INET6`` (IPv6). + + But wait! Don't use these numbers in your code. The numeric values here are + system dependent; fortunately, I thought of that when writing netifaces, so + the module declares a range of values that you might need. e.g. + + >>> netifaces.AF_LINK + 18 + + Again, on your system, the number may be different. + + So, what we've established is that the dictionary that's returned has one + entry for each address family for which this interface has an address. Let's + take a look at the ``AF_INET`` addresses now: + + >>> addrs = netifaces.ifaddresses('lo0') + >>> addrs[netifaces.AF_INET] + [{'peer': '127.0.0.1', 'netmask': '255.0.0.0', 'addr': '127.0.0.1'}] + + You might be wondering why this value is a list. The reason is that it's + possible for an interface to have more than one address, even within the + same family. I'll say that again: *you can have more than one address of + the same type associated with each interface*. + + *Asking for "the" address of a particular interface doesn't make sense.* + + Right, so, we can see that this particular interface only has one address, + and, because it's a loopback interface, it's point-to-point and therefore + has a *peer* address rather than a broadcast address. + + Let's look at a more interesting interface. + + >>> addrs = netifaces.ifaddresses('en0') + >>> addrs[netifaces.AF_INET] + [{'broadcast': '10.15.255.255', 'netmask': '255.240.0.0', 'addr': '10.0.1.4'}, {'broadcast': '192.168.0.255', 'addr': '192.168.0.47'}] + + This interface has two addresses (see, I told you...) 
Both of them are + regular IPv4 addresses, although in one case the netmask has been changed + from its default. The netmask *may not* appear on your system if it's set + to the default for the address range. + + Because this interface isn't point-to-point, it also has broadcast addresses. + + Now, say we want, instead of the IP addresses, to get the MAC address; that + is, the hardware address of the Ethernet adapter running this interface. We + can do + + >>> addrs[netifaces.AF_LINK] + [{'addr': '00:12:34:56:78:9a'}] + + Note that this may not be available on platforms without getifaddrs(), unless + they happen to implement ``SIOCGIFHWADDR``. Note also that you just get the + address; it's unlikely that you'll see anything else with an ``AF_LINK`` address. + Oh, and don't assume that all ``AF_LINK`` addresses are Ethernet; you might, for + instance, be on a Mac, in which case: + + >>> addrs = netifaces.ifaddresses('fw0') + >>> addrs[netifaces.AF_LINK] + [{'addr': '00:12:34:56:78:9a:bc:de'}] + + No, that isn't an exceptionally long Ethernet MAC address---it's a FireWire + address. + + As of version 0.10.0, you can also obtain a list of gateways on your + machine: + + >>> netifaces.gateways() + {2: [('10.0.1.1', 'en0', True), ('10.2.1.1', 'en1', False)], 30: [('fe80::1', 'en0', True)], 'default': { 2: ('10.0.1.1', 'en0'), 30: ('fe80::1', 'en0') }} + + This dictionary is keyed on address family---in this case, ``AF_INET``---and + each entry is a list of gateways as ``(address, interface, is_default)`` tuples. + Notice that here we have two separate gateways for IPv4 (``AF_INET``); some + operating systems support configurations like this and can either route packets + based on their source, or based on administratively configured routing tables. + + For convenience, we also allow you to index the dictionary with the special + value ``'default'``, which returns a dictionary mapping address families to the + default gateway in each case. 
Thus you can get the default IPv4 gateway with + + >>> gws = netifaces.gateways() + >>> gws['default'][netifaces.AF_INET] + ('10.0.1.1', 'en0') + + Do note that there may be no default gateway for any given address family; + this is currently very common for IPv6 and much less common for IPv4 but it + can happen even for ``AF_INET``. + + BTW, if you're trying to configure your machine to have multiple gateways for + the same address family, it's a very good idea to check the documentation for + your operating system *very* carefully, as some systems become extremely + confused or route packets in a non-obvious manner. + + I'm very interested in hearing from anyone (on any platform) for whom the + ``gateways()`` method doesn't produce the expected results. It's quite + complicated extracting this information from the operating system (whichever + operating system we're talking about), and so I expect there's at least one + system out there where this just won't work. + + 3. This is great! What platforms does it work on? + -------------------------------------------------- + + It gets regular testing on OS X, Linux and Windows. It has also been used + successfully on Solaris, and it's expected to work properly on other UNIX-like + systems as well. If you are running something that is not supported, and + wish to contribute a patch, please use Github to send a pull request. + + 4. What license is this under? + ------------------------------ + + It's an MIT-style license. 
Here goes: + + Copyright (c) 2007-2018 Alastair Houghton + + Permission is hereby granted, free of charge, to any person obtaining a copy + of this software and associated documentation files (the "Software"), to deal + in the Software without restriction, including without limitation the rights + to use, copy, modify, merge, publish, distribute, sublicense, and/or sell + copies of the Software, and to permit persons to whom the Software is + furnished to do so, subject to the following conditions: + + The above copyright notice and this permission notice shall be included in all + copies or substantial portions of the Software. + + THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR + IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, + FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE + AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER + LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, + OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE + SOFTWARE. + + 5. Why the jump to 0.10.0? + -------------------------- + + Because someone released a fork of netifaces with the version 0.9.0. + Hopefully skipping the version number should remove any confusion. In + addition starting with 0.10.0 Python 3 is now supported and other + features/bugfixes have been included as well. See the CHANGELOG for a + more complete list of changes. 
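The `gateways()` walkthrough above (a family-keyed dict of `(address, interface, is_default)` tuples, plus a `'default'` entry) can be sketched as a small helper. This is a hypothetical convenience function over the dictionary shape shown in the README's example output, not part of the netifaces API itself; it uses the sample's numeric family key `2` (`AF_INET` on the README author's system), whereas real code should use `netifaces.AF_INET`, since the numeric values are system-dependent:

```python
def default_gateway(gateways, family):
    """Return (address, interface) for `family`, or None if absent.

    `gateways` is assumed to have the shape returned by
    netifaces.gateways(): numeric address-family keys mapping to lists
    of (address, interface, is_default) tuples, plus a 'default' key
    mapping families to (address, interface) pairs.
    """
    # Prefer the explicitly marked default gateway, if one exists.
    default = gateways.get('default', {}).get(family)
    if default is not None:
        return default
    # Otherwise fall back to the first gateway listed for this family.
    for addr, iface, _is_default in gateways.get(family, []):
        return (addr, iface)
    return None


# Sample data copied from the README's example output; 2 is AF_INET
# and 30 is AF_INET6 on that system (values vary across platforms).
sample = {
    2: [('10.0.1.1', 'en0', True), ('10.2.1.1', 'en1', False)],
    30: [('fe80::1', 'en0', True)],
    'default': {2: ('10.0.1.1', 'en0'), 30: ('fe80::1', 'en0')},
}

print(default_gateway(sample, 2))   # → ('10.0.1.1', 'en0')
print(default_gateway(sample, 99))  # → None (no such family)
```

The fallback branch matters because, as the README notes, an address family may have gateways listed without any default gateway configured — common for IPv6.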
+ +Platform: UNKNOWN +Classifier: Development Status :: 4 - Beta +Classifier: Intended Audience :: Developers +Classifier: License :: OSI Approved :: MIT License +Classifier: Topic :: System :: Networking +Classifier: Programming Language :: Python +Classifier: Programming Language :: Python :: 2 +Classifier: Programming Language :: Python :: 2.5 +Classifier: Programming Language :: Python :: 2.6 +Classifier: Programming Language :: Python :: 2.7 +Classifier: Programming Language :: Python :: 3 +Classifier: Programming Language :: Python :: 3.4 +Classifier: Programming Language :: Python :: 3.5 +Classifier: Programming Language :: Python :: 3.6 diff --git a/lib/python3.7/site-packages/netifaces-0.10.7.egg-info/SOURCES.txt b/lib/python3.7/site-packages/netifaces-0.10.7.egg-info/SOURCES.txt new file mode 100644 index 0000000..a382c6c --- /dev/null +++ b/lib/python3.7/site-packages/netifaces-0.10.7.egg-info/SOURCES.txt @@ -0,0 +1,9 @@ +README.rst +netifaces.c +setup.cfg +setup.py +netifaces.egg-info/PKG-INFO +netifaces.egg-info/SOURCES.txt +netifaces.egg-info/dependency_links.txt +netifaces.egg-info/top_level.txt +netifaces.egg-info/zip-safe \ No newline at end of file diff --git a/lib/python3.7/site-packages/netifaces-0.10.7.egg-info/dependency_links.txt b/lib/python3.7/site-packages/netifaces-0.10.7.egg-info/dependency_links.txt new file mode 100644 index 0000000..8b13789 --- /dev/null +++ b/lib/python3.7/site-packages/netifaces-0.10.7.egg-info/dependency_links.txt @@ -0,0 +1 @@ + diff --git a/lib/python3.7/site-packages/netifaces-0.10.7.egg-info/installed-files.txt b/lib/python3.7/site-packages/netifaces-0.10.7.egg-info/installed-files.txt new file mode 100644 index 0000000..d41042b --- /dev/null +++ b/lib/python3.7/site-packages/netifaces-0.10.7.egg-info/installed-files.txt @@ -0,0 +1,6 @@ +../netifaces.cpython-37m-x86_64-linux-gnu.so +PKG-INFO +SOURCES.txt +dependency_links.txt +top_level.txt +zip-safe diff --git 
a/lib/python3.4/site-packages/netifaces-0.10.6.dist-info/top_level.txt b/lib/python3.7/site-packages/netifaces-0.10.7.egg-info/top_level.txt similarity index 100% rename from lib/python3.4/site-packages/netifaces-0.10.6.dist-info/top_level.txt rename to lib/python3.7/site-packages/netifaces-0.10.7.egg-info/top_level.txt diff --git a/lib/python3.7/site-packages/netifaces-0.10.7.egg-info/zip-safe b/lib/python3.7/site-packages/netifaces-0.10.7.egg-info/zip-safe new file mode 100644 index 0000000..8b13789 --- /dev/null +++ b/lib/python3.7/site-packages/netifaces-0.10.7.egg-info/zip-safe @@ -0,0 +1 @@ + diff --git a/lib/python3.7/site-packages/netifaces.cpython-35m-x86_64-linux-gnu.so b/lib/python3.7/site-packages/netifaces.cpython-35m-x86_64-linux-gnu.so new file mode 100755 index 0000000..0117aeb Binary files /dev/null and b/lib/python3.7/site-packages/netifaces.cpython-35m-x86_64-linux-gnu.so differ diff --git a/lib/python3.7/site-packages/netifaces.cpython-36m-x86_64-linux-gnu.so b/lib/python3.7/site-packages/netifaces.cpython-36m-x86_64-linux-gnu.so new file mode 100755 index 0000000..ae02fe1 Binary files /dev/null and b/lib/python3.7/site-packages/netifaces.cpython-36m-x86_64-linux-gnu.so differ diff --git a/lib/python3.7/site-packages/netifaces.cpython-37m-x86_64-linux-gnu.so b/lib/python3.7/site-packages/netifaces.cpython-37m-x86_64-linux-gnu.so new file mode 100755 index 0000000..f561cae Binary files /dev/null and b/lib/python3.7/site-packages/netifaces.cpython-37m-x86_64-linux-gnu.so differ diff --git a/lib/python3.4/site-packages/SQLAlchemy-1.0.12.dist-info/INSTALLER b/lib/python3.7/site-packages/pip-18.1.dist-info/INSTALLER similarity index 100% rename from lib/python3.4/site-packages/SQLAlchemy-1.0.12.dist-info/INSTALLER rename to lib/python3.7/site-packages/pip-18.1.dist-info/INSTALLER diff --git a/lib/python3.7/site-packages/pip-18.1.dist-info/LICENSE.txt b/lib/python3.7/site-packages/pip-18.1.dist-info/LICENSE.txt new file mode 100644 index 
0000000..d3379fa --- /dev/null +++ b/lib/python3.7/site-packages/pip-18.1.dist-info/LICENSE.txt @@ -0,0 +1,20 @@ +Copyright (c) 2008-2018 The pip developers (see AUTHORS.txt file) + +Permission is hereby granted, free of charge, to any person obtaining +a copy of this software and associated documentation files (the +"Software"), to deal in the Software without restriction, including +without limitation the rights to use, copy, modify, merge, publish, +distribute, sublicense, and/or sell copies of the Software, and to +permit persons to whom the Software is furnished to do so, subject to +the following conditions: + +The above copyright notice and this permission notice shall be +included in all copies or substantial portions of the Software. + +THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, +EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF +MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND +NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE +LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION +OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION +WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. diff --git a/lib/python3.7/site-packages/pip-18.1.dist-info/METADATA b/lib/python3.7/site-packages/pip-18.1.dist-info/METADATA new file mode 100644 index 0000000..2e314aa --- /dev/null +++ b/lib/python3.7/site-packages/pip-18.1.dist-info/METADATA @@ -0,0 +1,70 @@ +Metadata-Version: 2.1 +Name: pip +Version: 18.1 +Summary: The PyPA recommended tool for installing Python packages. 
+Home-page: https://pip.pypa.io/ +Author: The pip developers +Author-email: pypa-dev@groups.google.com +License: MIT +Keywords: distutils easy_install egg setuptools wheel virtualenv +Platform: UNKNOWN +Classifier: Development Status :: 5 - Production/Stable +Classifier: Intended Audience :: Developers +Classifier: License :: OSI Approved :: MIT License +Classifier: Topic :: Software Development :: Build Tools +Classifier: Programming Language :: Python +Classifier: Programming Language :: Python :: 2 +Classifier: Programming Language :: Python :: 2.7 +Classifier: Programming Language :: Python :: 3 +Classifier: Programming Language :: Python :: 3.4 +Classifier: Programming Language :: Python :: 3.5 +Classifier: Programming Language :: Python :: 3.6 +Classifier: Programming Language :: Python :: 3.7 +Classifier: Programming Language :: Python :: Implementation :: CPython +Classifier: Programming Language :: Python :: Implementation :: PyPy +Requires-Python: >=2.7,!=3.0.*,!=3.1.*,!=3.2.*,!=3.3.* + +pip +=== + +The `PyPA recommended`_ tool for installing Python packages. + +.. image:: https://img.shields.io/pypi/v/pip.svg + :target: https://pypi.org/project/pip/ + +.. image:: https://img.shields.io/travis/pypa/pip/master.svg?label=travis-ci + :target: https://travis-ci.org/pypa/pip + +.. image:: https://img.shields.io/appveyor/ci/pypa/pip.svg?label=appveyor-ci + :target: https://ci.appveyor.com/project/pypa/pip/history + +.. image:: https://readthedocs.org/projects/pip/badge/?version=latest + :target: https://pip.pypa.io/en/latest + +* `Installation`_ +* `Documentation`_ +* `Changelog`_ +* `GitHub Page`_ +* `Issue Tracking`_ +* `User mailing list`_ +* `Dev mailing list`_ +* User IRC: #pypa on Freenode. +* Dev IRC: #pypa-dev on Freenode. + +Code of Conduct +--------------- + +Everyone interacting in the pip project's codebases, issue trackers, chat +rooms and mailing lists is expected to follow the `PyPA Code of Conduct`_. + +.. 
_PyPA recommended: https://packaging.python.org/en/latest/current/ +.. _Installation: https://pip.pypa.io/en/stable/installing.html +.. _Documentation: https://pip.pypa.io/en/stable/ +.. _Changelog: https://pip.pypa.io/en/stable/news.html +.. _GitHub Page: https://github.com/pypa/pip +.. _Issue Tracking: https://github.com/pypa/pip/issues +.. _User mailing list: https://groups.google.com/forum/#!forum/python-virtualenv +.. _Dev mailing list: https://groups.google.com/forum/#!forum/pypa-dev +.. _PyPA Code of Conduct: https://www.pypa.io/en/latest/code-of-conduct/ + + diff --git a/lib/python3.7/site-packages/pip-18.1.dist-info/RECORD b/lib/python3.7/site-packages/pip-18.1.dist-info/RECORD new file mode 100644 index 0000000..391cdb5 --- /dev/null +++ b/lib/python3.7/site-packages/pip-18.1.dist-info/RECORD @@ -0,0 +1,172 @@ +../../../bin/pip,sha256=nCPCPgV3qeLtiRQVkc3wEvfxCZQWWidXtPh3kZWQgD8,281 +../../../bin/pip3,sha256=nCPCPgV3qeLtiRQVkc3wEvfxCZQWWidXtPh3kZWQgD8,281 +../../../bin/pip3.7,sha256=nCPCPgV3qeLtiRQVkc3wEvfxCZQWWidXtPh3kZWQgD8,281 +pip-18.1.dist-info/INSTALLER,sha256=zuuue4knoyJ-UwPPXg8fezS7VCrXJQrAP7zeNuwvFQg,4 +pip-18.1.dist-info/LICENSE.txt,sha256=ORqHhOMZ2uVDFHfUzJvFBPxdcf2eieHIDxzThV9dfPo,1090 +pip-18.1.dist-info/METADATA,sha256=D7pqBJTuqM9w_HTW91a0XGjLT9vynlBAE4pPCt_W_UE,2588 +pip-18.1.dist-info/RECORD,, +pip-18.1.dist-info/WHEEL,sha256=_wJFdOYk7i3xxT8ElOkUJvOdOvfNGbR9g-bf6UQT6sU,110 +pip-18.1.dist-info/entry_points.txt,sha256=S_zfxY25QtQDVY1BiLAmOKSkkI5llzCKPLiYOSEupsY,98 +pip-18.1.dist-info/top_level.txt,sha256=zuuue4knoyJ-UwPPXg8fezS7VCrXJQrAP7zeNuwvFQg,4 +pip/__init__.py,sha256=nO-iphoXiDoci_ZAMl-PG2zdd4Y7m88jBDILTYzwGy4,21 +pip/__main__.py,sha256=L3IHqBeasELUHvwy5CT_izVEMhM12tve289qut49DvU,623 +pip/__pycache__/__init__.cpython-37.pyc,, +pip/__pycache__/__main__.cpython-37.pyc,, +pip/_internal/__init__.py,sha256=b0jSFCCViGhB1RWni35_NMkH3Y-mbZrV648DGMagDjs,2869 +pip/_internal/__pycache__/__init__.cpython-37.pyc,, 
+pip/_internal/__pycache__/build_env.cpython-37.pyc,, +pip/_internal/__pycache__/cache.cpython-37.pyc,, +pip/_internal/__pycache__/configuration.cpython-37.pyc,, +pip/_internal/__pycache__/download.cpython-37.pyc,, +pip/_internal/__pycache__/exceptions.cpython-37.pyc,, +pip/_internal/__pycache__/index.cpython-37.pyc,, +pip/_internal/__pycache__/locations.cpython-37.pyc,, +pip/_internal/__pycache__/pep425tags.cpython-37.pyc,, +pip/_internal/__pycache__/pyproject.cpython-37.pyc,, +pip/_internal/__pycache__/resolve.cpython-37.pyc,, +pip/_internal/__pycache__/wheel.cpython-37.pyc,, +pip/_internal/build_env.py,sha256=zKhqmDMnrX5OTSNQ4xBw-mN5mTGVu6wjiNFW-ajWYEI,4797 +pip/_internal/cache.py,sha256=96_aKtDbwgLEVNgNabOT8GrFCYZEACedoiucqU5ccg8,6829 +pip/_internal/cli/__init__.py,sha256=FkHBgpxxb-_gd6r1FjnNhfMOzAUYyXoXKJ6abijfcFU,132 +pip/_internal/cli/__pycache__/__init__.cpython-37.pyc,, +pip/_internal/cli/__pycache__/autocompletion.cpython-37.pyc,, +pip/_internal/cli/__pycache__/base_command.cpython-37.pyc,, +pip/_internal/cli/__pycache__/cmdoptions.cpython-37.pyc,, +pip/_internal/cli/__pycache__/main_parser.cpython-37.pyc,, +pip/_internal/cli/__pycache__/parser.cpython-37.pyc,, +pip/_internal/cli/__pycache__/status_codes.cpython-37.pyc,, +pip/_internal/cli/autocompletion.py,sha256=ptvsMdGjq42pzoY4skABVF43u2xAtLJlXAulPi-A10Y,6083 +pip/_internal/cli/base_command.py,sha256=ke6af4iWzrZoc3HtiPKnCZJvD6GlX8dRwBwpFCg1axc,9963 +pip/_internal/cli/cmdoptions.py,sha256=klAO3AxS0_xoZY_3LwwRjT4TbxtdIwBrmnLJvgG6sGI,19467 +pip/_internal/cli/main_parser.py,sha256=Ga_kT7if-Gg0rmmRqlGEHW6JWVm9zwzO7igJm6RE9EI,2763 +pip/_internal/cli/parser.py,sha256=VZKUKJPbU6I2cHPLDOikin-aCx7OvLcZ3fzYp3xytd8,9378 +pip/_internal/cli/status_codes.py,sha256=F6uDG6Gj7RNKQJUDnd87QKqI16Us-t-B0wPF_4QMpWc,156 +pip/_internal/commands/__init__.py,sha256=CQAzhVx9ViPtqLNUvAeqnKj5iWfFEcqMx5RlZWjJ30c,2251 +pip/_internal/commands/__pycache__/__init__.cpython-37.pyc,, 
+pip/_internal/commands/__pycache__/check.cpython-37.pyc,, +pip/_internal/commands/__pycache__/completion.cpython-37.pyc,, +pip/_internal/commands/__pycache__/configuration.cpython-37.pyc,, +pip/_internal/commands/__pycache__/download.cpython-37.pyc,, +pip/_internal/commands/__pycache__/freeze.cpython-37.pyc,, +pip/_internal/commands/__pycache__/hash.cpython-37.pyc,, +pip/_internal/commands/__pycache__/help.cpython-37.pyc,, +pip/_internal/commands/__pycache__/install.cpython-37.pyc,, +pip/_internal/commands/__pycache__/list.cpython-37.pyc,, +pip/_internal/commands/__pycache__/search.cpython-37.pyc,, +pip/_internal/commands/__pycache__/show.cpython-37.pyc,, +pip/_internal/commands/__pycache__/uninstall.cpython-37.pyc,, +pip/_internal/commands/__pycache__/wheel.cpython-37.pyc,, +pip/_internal/commands/check.py,sha256=CyeYH2kfDKSGSURoBfWtx-sTcZZQP-bK170NmKYlmsg,1398 +pip/_internal/commands/completion.py,sha256=hqvCvoxsIHjysiD7olHKTqK2lzE1_lS6LWn69kN5qyI,2929 +pip/_internal/commands/configuration.py,sha256=265HWuUxPggCNcIeWHA3p-LDDiRVnexwFgwmHGgWOHY,7125 +pip/_internal/commands/download.py,sha256=D_iGMp3xX2iD7KZYZAjXlYT3rf3xjwxyYe05KE-DVzE,6514 +pip/_internal/commands/freeze.py,sha256=VvS3G0wrm_9BH3B7Ex5msLL_1UQTtCq5G8dDI63Iemo,3259 +pip/_internal/commands/hash.py,sha256=K1JycsD-rpjqrRcL_ijacY9UKmI82pQcLYq4kCM4Pv0,1681 +pip/_internal/commands/help.py,sha256=MwBhPJpW1Dt3GfJV3V8V6kgAy_pXT0jGrZJB1wCTW-E,1090 +pip/_internal/commands/install.py,sha256=I_zZhkmIbDm_HqLI2WWC9vjXEnd5kNAdQ2k1xtU38zg,21874 +pip/_internal/commands/list.py,sha256=n740MsR0cG34EuvGWMzdVl0uIA3UIYx1_95FUsTktN0,10272 +pip/_internal/commands/search.py,sha256=sLZ9icKMEEGekHvzRRZMiTd1zCFIZeDptyyU1mQCYzk,4728 +pip/_internal/commands/show.py,sha256=9EVh86vY0NZdlhT-wsuV-zq_MAV6qqV4S1Akn3wkUuw,6289 +pip/_internal/commands/uninstall.py,sha256=h0gfPF5jylDESx_IHgF6bZME7QAEOHzQHdn65GP-jrE,2963 +pip/_internal/commands/wheel.py,sha256=ZuVf_DMpKCUzBVstolvQPAeajQRC51Oky5_hDHzhhFs,7020 
+pip/_internal/configuration.py,sha256=KMgG3ufFrUKX_QESi2cMVvFi47tl845Bg1ZkNthlWik,13243 +pip/_internal/download.py,sha256=c5Hkimq39eJdZ6DN0_0etjK43-0a5CK_W_3sVLqH87g,33300 +pip/_internal/exceptions.py,sha256=EIGotnq6qM2nbGtnlgZ8Xp5VfP2W4-9UOCzQGMwy5MY,8899 +pip/_internal/index.py,sha256=6CAtZ8QTLcpw0fJqQ9OPu-Os1ettLZtVY1pPSKia8r8,34789 +pip/_internal/locations.py,sha256=ujNrLnA04Y_EmSriO0nS6qkkw_BkPfobB_hdwIDPvpM,6307 +pip/_internal/models/__init__.py,sha256=3DHUd_qxpPozfzouoqa9g9ts1Czr5qaHfFxbnxriepM,63 +pip/_internal/models/__pycache__/__init__.cpython-37.pyc,, +pip/_internal/models/__pycache__/candidate.cpython-37.pyc,, +pip/_internal/models/__pycache__/format_control.cpython-37.pyc,, +pip/_internal/models/__pycache__/index.cpython-37.pyc,, +pip/_internal/models/__pycache__/link.cpython-37.pyc,, +pip/_internal/models/candidate.py,sha256=zq2Vb5l5JflrVX7smHTJHQciZWHyoJZuYTLeQa1G16c,741 +pip/_internal/models/format_control.py,sha256=aDbH4D2XuyaGjtRjTLQhNzClAcLZdJCKSHO8xbZSmFA,2202 +pip/_internal/models/index.py,sha256=YI1WlhWfS9mVPY0bIboA5la2pjJ2J0qgPJIbvdEjZBk,996 +pip/_internal/models/link.py,sha256=E61PvS2Wrmb9-zT-eAc_8_xI3C-89wJlpL8SL-mlQmg,3998 +pip/_internal/operations/__init__.py,sha256=47DEQpj8HBSa-_TImW-5JCeuQeRkm5NMpJWZG3hSuFU,0 +pip/_internal/operations/__pycache__/__init__.cpython-37.pyc,, +pip/_internal/operations/__pycache__/check.cpython-37.pyc,, +pip/_internal/operations/__pycache__/freeze.cpython-37.pyc,, +pip/_internal/operations/__pycache__/prepare.cpython-37.pyc,, +pip/_internal/operations/check.py,sha256=ahcOg5p68nNow6_wy5prYYK0KZq22lm0CsJn8AyDMCI,4937 +pip/_internal/operations/freeze.py,sha256=lskaBcqf3bPZupG032fuLf76QYv5wpAQ6jsiXac56Bg,10450 +pip/_internal/operations/prepare.py,sha256=atoLFj3OD5KfXsa5dYBMC_mI06l068F5yZhF4jle1JA,14280 +pip/_internal/pep425tags.py,sha256=TQhxOPss4RjxgyVgxpSRe31HaTcWmn-LVjWBbkvkjzk,10845 +pip/_internal/pyproject.py,sha256=fpO52MCa3w5xSlXIBXw39BDTGzP8G4570EW34hVvIKQ,5481 
+pip/_internal/req/__init__.py,sha256=JnNZWvKUQuqAwHh64LCD3zprzWIVQEXChTo2UGHzVqo,2093 +pip/_internal/req/__pycache__/__init__.cpython-37.pyc,, +pip/_internal/req/__pycache__/constructors.cpython-37.pyc,, +pip/_internal/req/__pycache__/req_file.cpython-37.pyc,, +pip/_internal/req/__pycache__/req_install.cpython-37.pyc,, +pip/_internal/req/__pycache__/req_set.cpython-37.pyc,, +pip/_internal/req/__pycache__/req_tracker.cpython-37.pyc,, +pip/_internal/req/__pycache__/req_uninstall.cpython-37.pyc,, +pip/_internal/req/constructors.py,sha256=97WQp9Svh-Jw3oLZL9_57gJ3zihm5LnWlSRjOwOorDU,9573 +pip/_internal/req/req_file.py,sha256=ORA0GKUjGd6vy7pmBwXR55FFj4h_OxYykFQ6gHuWvt0,11940 +pip/_internal/req/req_install.py,sha256=ry1RtNNCefDHAnf3EeGMpea-9pC6Yk1uHzP0Q5p2Un0,34046 +pip/_internal/req/req_set.py,sha256=nE6oagXJSiQREuuebX3oJO5OHSOVUIlvLLilodetBzc,7264 +pip/_internal/req/req_tracker.py,sha256=zH28YHV5TXAVh1ZOEZi6Z1Edkiu26dN2tXfR6VbQ3B4,2370 +pip/_internal/req/req_uninstall.py,sha256=ORSPah64KOVrKo-InMM3zgS5HQqbl5TLHFnE_Lxstq8,16737 +pip/_internal/resolve.py,sha256=tdepxCewsXXNFKSIYGSxiLvzi1xCv7UVFT9jRCDO90A,13578 +pip/_internal/utils/__init__.py,sha256=47DEQpj8HBSa-_TImW-5JCeuQeRkm5NMpJWZG3hSuFU,0 +pip/_internal/utils/__pycache__/__init__.cpython-37.pyc,, +pip/_internal/utils/__pycache__/appdirs.cpython-37.pyc,, +pip/_internal/utils/__pycache__/compat.cpython-37.pyc,, +pip/_internal/utils/__pycache__/deprecation.cpython-37.pyc,, +pip/_internal/utils/__pycache__/encoding.cpython-37.pyc,, +pip/_internal/utils/__pycache__/filesystem.cpython-37.pyc,, +pip/_internal/utils/__pycache__/glibc.cpython-37.pyc,, +pip/_internal/utils/__pycache__/hashes.cpython-37.pyc,, +pip/_internal/utils/__pycache__/logging.cpython-37.pyc,, +pip/_internal/utils/__pycache__/misc.cpython-37.pyc,, +pip/_internal/utils/__pycache__/models.cpython-37.pyc,, +pip/_internal/utils/__pycache__/outdated.cpython-37.pyc,, +pip/_internal/utils/__pycache__/packaging.cpython-37.pyc,, 
+pip/_internal/utils/__pycache__/setuptools_build.cpython-37.pyc,, +pip/_internal/utils/__pycache__/temp_dir.cpython-37.pyc,, +pip/_internal/utils/__pycache__/typing.cpython-37.pyc,, +pip/_internal/utils/__pycache__/ui.cpython-37.pyc,, +pip/_internal/utils/appdirs.py,sha256=SPfibHtvOKzD_sHrpEZ60HfLae3GharU4Tg7SB3c-XM,9120 +pip/_internal/utils/compat.py,sha256=LSAvzXcsGY2O2drKIPszR5Ja2G0kup__51l3bx1jR_Q,8015 +pip/_internal/utils/deprecation.py,sha256=yQTe6dyWlBfxSBrOv_MdRXF1RPLER_EWOp-pa2zLoZc,3021 +pip/_internal/utils/encoding.py,sha256=D8tmfStCah6xh9OLhH9mWLr77q4akhg580YHJMKpq3Y,1025 +pip/_internal/utils/filesystem.py,sha256=ZOIHbacJ-SJtuZru4GoA5DuSIYyeaE4G5kfZPf5cn1A,915 +pip/_internal/utils/glibc.py,sha256=prOrsBjmgkDE-hY4Pl120yF5MIlkkmGrFLs8XfIyT-w,3004 +pip/_internal/utils/hashes.py,sha256=rJk-gj6F-sHggXAG97dhynqUHFFgApyZLWgaG2xCHME,2900 +pip/_internal/utils/logging.py,sha256=BQeUDEER3zlK0O4yv6DBfz6TK3f9XoLXyDlnB0mZVf0,6295 +pip/_internal/utils/misc.py,sha256=K5ouAkGO96le5zhngk_hSo7eysD-vMRYMqmkWnEaIFc,30639 +pip/_internal/utils/models.py,sha256=DQYZSRhjvSdDTAaJLLCpDtxAn1S_-v_8nlNjv4T2jwY,1042 +pip/_internal/utils/outdated.py,sha256=BXtCMKR6gjTrvMfP3MWzZ1Y4ZU4qqoCfbRNqQCusVt8,5642 +pip/_internal/utils/packaging.py,sha256=Ru8ls_S8PPKR8RKEn7jMetENY_A9jPet1HlhTZwpFxU,2443 +pip/_internal/utils/setuptools_build.py,sha256=0blfscmNJW_iZ5DcswJeDB_PbtTEjfK9RL1R1WEDW2E,278 +pip/_internal/utils/temp_dir.py,sha256=n2FkVlwRX_hS61fYt3nSAh2e2V6CcZn_dfbPId1pAQE,2615 +pip/_internal/utils/typing.py,sha256=ztYtZAcqjCYDwP-WlF6EiAAskAsZBMMXtuqvfgZIlgQ,1139 +pip/_internal/utils/ui.py,sha256=FW8wdtc7DvNwJClGr_TvGZlqcoO482GYe0UY9nKmpso,13657 +pip/_internal/vcs/__init__.py,sha256=2Ct9ogOwzS6ZKKaEXKN2XDiBOiFHMcejnN1KM21mLrQ,16319 +pip/_internal/vcs/__pycache__/__init__.cpython-37.pyc,, +pip/_internal/vcs/__pycache__/bazaar.cpython-37.pyc,, +pip/_internal/vcs/__pycache__/git.cpython-37.pyc,, +pip/_internal/vcs/__pycache__/mercurial.cpython-37.pyc,, 
+pip/_internal/vcs/__pycache__/subversion.cpython-37.pyc,, +pip/_internal/vcs/bazaar.py,sha256=rjskVmSSn68O7lC5JrGmDTWXneXFMMJJvj_bbdSM8QA,3669 +pip/_internal/vcs/git.py,sha256=n1cFBqTnLIcxAOClZMgOBqELjEjygDBPZ9z-Q7g0qVQ,12580 +pip/_internal/vcs/mercurial.py,sha256=jVTa0XQpFR6EiBcaqW4E4JjTce_t1tFnKRaIhaIPlS8,3471 +pip/_internal/vcs/subversion.py,sha256=vDLTfcjj0kgqcEsbPBfveC4CRxyhWiOjke-qN0Zr8CE,7676 +pip/_internal/wheel.py,sha256=fg9E936DaI1LyrBPHqtzHG_WEVyuUwipHISkD6N3jNw,32007 +pip/_vendor/__init__.py,sha256=bdhl7DUZ1z7eukZLktoO1vhki9sC576gBWcFgel4684,4890 +pip/_vendor/__pycache__/__init__.cpython-37.pyc,, +pip/_vendor/pep517/__init__.py,sha256=GH4HshnLERtjAjkY0zHoz3f7-35UcIvr27iFWSOUazU,82 +pip/_vendor/pep517/__pycache__/__init__.cpython-37.pyc,, +pip/_vendor/pep517/__pycache__/_in_process.cpython-37.pyc,, +pip/_vendor/pep517/__pycache__/check.cpython-37.pyc,, +pip/_vendor/pep517/__pycache__/colorlog.cpython-37.pyc,, +pip/_vendor/pep517/__pycache__/compat.cpython-37.pyc,, +pip/_vendor/pep517/__pycache__/envbuild.cpython-37.pyc,, +pip/_vendor/pep517/__pycache__/wrappers.cpython-37.pyc,, +pip/_vendor/pep517/_in_process.py,sha256=iWpagFk2GhNBbvl-Ca2RagfD0ALuits4WWSM6nQMTdg,5831 +pip/_vendor/pep517/check.py,sha256=Yp2NHW71DIOCgkFb7HKJOzKmsum_s_OokRP6HnR3bTg,5761 +pip/_vendor/pep517/colorlog.py,sha256=2AJuPI_DHM5T9IDgcTwf0E8suyHAFnfsesogr0AB7RQ,4048 +pip/_vendor/pep517/compat.py,sha256=4SFG4QN-cNj8ebSa0wV0HUtEEQWwmbok2a0uk1gYEOM,631 +pip/_vendor/pep517/envbuild.py,sha256=osRsJVd7hir1w_uFXiVeeWxfJ3iYhwxsKRgNBWpqtCI,5672 +pip/_vendor/pep517/wrappers.py,sha256=RhgWm-MLxpYPgc9cZ3-A3ToN99ZzgM8-ia4FDB58koM,5018 diff --git a/lib/python3.4/site-packages/pkg_resources-0.0.0.dist-info/WHEEL b/lib/python3.7/site-packages/pip-18.1.dist-info/WHEEL similarity index 70% rename from lib/python3.4/site-packages/pkg_resources-0.0.0.dist-info/WHEEL rename to lib/python3.7/site-packages/pip-18.1.dist-info/WHEEL index 8b6dd1b..c4bde30 100644 --- 
a/lib/python3.4/site-packages/pkg_resources-0.0.0.dist-info/WHEEL +++ b/lib/python3.7/site-packages/pip-18.1.dist-info/WHEEL @@ -1,5 +1,5 @@ Wheel-Version: 1.0 -Generator: bdist_wheel (0.29.0) +Generator: bdist_wheel (0.32.3) Root-Is-Purelib: true Tag: py2-none-any Tag: py3-none-any diff --git a/lib/python3.7/site-packages/pip-18.1.dist-info/entry_points.txt b/lib/python3.7/site-packages/pip-18.1.dist-info/entry_points.txt new file mode 100644 index 0000000..f5809cb --- /dev/null +++ b/lib/python3.7/site-packages/pip-18.1.dist-info/entry_points.txt @@ -0,0 +1,5 @@ +[console_scripts] +pip = pip._internal:main +pip3 = pip._internal:main +pip3.7 = pip._internal:main + diff --git a/lib/python3.4/site-packages/pip-9.0.1.dist-info/top_level.txt b/lib/python3.7/site-packages/pip-18.1.dist-info/top_level.txt similarity index 100% rename from lib/python3.4/site-packages/pip-9.0.1.dist-info/top_level.txt rename to lib/python3.7/site-packages/pip-18.1.dist-info/top_level.txt diff --git a/lib/python3.7/site-packages/pip/__init__.py b/lib/python3.7/site-packages/pip/__init__.py new file mode 100644 index 0000000..ae265fa --- /dev/null +++ b/lib/python3.7/site-packages/pip/__init__.py @@ -0,0 +1 @@ +__version__ = "18.1" diff --git a/lib/python3.4/site-packages/pip/__main__.py b/lib/python3.7/site-packages/pip/__main__.py similarity index 86% rename from lib/python3.4/site-packages/pip/__main__.py rename to lib/python3.7/site-packages/pip/__main__.py index 5556539..0c223f8 100644 --- a/lib/python3.4/site-packages/pip/__main__.py +++ b/lib/python3.7/site-packages/pip/__main__.py @@ -13,7 +13,7 @@ if __package__ == '': path = os.path.dirname(os.path.dirname(__file__)) sys.path.insert(0, path) -import pip # noqa +from pip._internal import main as _main # isort:skip # noqa if __name__ == '__main__': - sys.exit(pip.main()) + sys.exit(_main()) diff --git a/lib/python3.7/site-packages/pip/_internal/__init__.py b/lib/python3.7/site-packages/pip/_internal/__init__.py new file mode 100644 
index 0000000..276124d --- /dev/null +++ b/lib/python3.7/site-packages/pip/_internal/__init__.py @@ -0,0 +1,78 @@ +#!/usr/bin/env python +from __future__ import absolute_import + +import locale +import logging +import os +import warnings + +import sys + +# 2016-06-17 barry@debian.org: urllib3 1.14 added optional support for socks, +# but if invoked (i.e. imported), it will issue a warning to stderr if socks +# isn't available. requests unconditionally imports urllib3's socks contrib +# module, triggering this warning. The warning breaks DEP-8 tests (because of +# the stderr output) and is just plain annoying in normal usage. I don't want +# to add socks as yet another dependency for pip, nor do I want to allow-stderr +# in the DEP-8 tests, so just suppress the warning. pdb tells me this has to +# be done before the import of pip.vcs. +from pip._vendor.urllib3.exceptions import DependencyWarning +warnings.filterwarnings("ignore", category=DependencyWarning) # noqa + +# We want to inject the use of SecureTransport as early as possible so that any +# references or sessions or what have you are ensured to have it, however we +# only want to do this in the case that we're running on macOS and the linked +# OpenSSL is too old to handle TLSv1.2 +try: + import ssl +except ImportError: + pass +else: + # Checks for OpenSSL 1.0.1 on MacOS + if sys.platform == "darwin" and ssl.OPENSSL_VERSION_NUMBER < 0x1000100f: + try: + from pip._vendor.urllib3.contrib import securetransport + except (ImportError, OSError): + pass + else: + securetransport.inject_into_urllib3() + +from pip._internal.cli.autocompletion import autocomplete +from pip._internal.cli.main_parser import parse_command +from pip._internal.commands import commands_dict +from pip._internal.exceptions import PipError +from pip._internal.utils import deprecation +from pip._internal.vcs import git, mercurial, subversion, bazaar # noqa +from pip._vendor.urllib3.exceptions import InsecureRequestWarning + +logger =
logging.getLogger(__name__) + +# Hide the InsecureRequestWarning from urllib3 +warnings.filterwarnings("ignore", category=InsecureRequestWarning) + + +def main(args=None): + if args is None: + args = sys.argv[1:] + + # Configure our deprecation warnings to be sent through loggers + deprecation.install_warning_logger() + + autocomplete() + + try: + cmd_name, cmd_args = parse_command(args) + except PipError as exc: + sys.stderr.write("ERROR: %s" % exc) + sys.stderr.write(os.linesep) + sys.exit(1) + + # Needed for locale.getpreferredencoding(False) to work + # in pip._internal.utils.encoding.auto_decode + try: + locale.setlocale(locale.LC_ALL, '') + except locale.Error as e: + # setlocale can apparently crash if locales are uninitialized + logger.debug("Ignoring error %s when setting locale", e) + command = commands_dict[cmd_name](isolated=("--isolated" in cmd_args)) + return command.main(cmd_args) diff --git a/lib/python3.7/site-packages/pip/_internal/build_env.py b/lib/python3.7/site-packages/pip/_internal/build_env.py new file mode 100644 index 0000000..673409d --- /dev/null +++ b/lib/python3.7/site-packages/pip/_internal/build_env.py @@ -0,0 +1,142 @@ +"""Build Environment used for isolation during sdist building +""" + +import logging +import os +import sys +from distutils.sysconfig import get_python_lib +from sysconfig import get_paths + +from pip._vendor.pkg_resources import Requirement, VersionConflict, WorkingSet + +from pip._internal.utils.misc import call_subprocess +from pip._internal.utils.temp_dir import TempDirectory +from pip._internal.utils.ui import open_spinner + +logger = logging.getLogger(__name__) + + +class BuildEnvironment(object): + """Creates and manages an isolated environment to install build deps + """ + + def __init__(self): + self._temp_dir = TempDirectory(kind="build-env") + self._temp_dir.create() + + @property + def path(self): + return self._temp_dir.path + + def __enter__(self): + self.save_path = os.environ.get('PATH', None) +
self.save_pythonpath = os.environ.get('PYTHONPATH', None) + self.save_nousersite = os.environ.get('PYTHONNOUSERSITE', None) + + install_scheme = 'nt' if (os.name == 'nt') else 'posix_prefix' + install_dirs = get_paths(install_scheme, vars={ + 'base': self.path, + 'platbase': self.path, + }) + + scripts = install_dirs['scripts'] + if self.save_path: + os.environ['PATH'] = scripts + os.pathsep + self.save_path + else: + os.environ['PATH'] = scripts + os.pathsep + os.defpath + + # Note: prefer distutils' sysconfig to get the + # library paths so PyPy is correctly supported. + purelib = get_python_lib(plat_specific=0, prefix=self.path) + platlib = get_python_lib(plat_specific=1, prefix=self.path) + if purelib == platlib: + lib_dirs = purelib + else: + lib_dirs = purelib + os.pathsep + platlib + if self.save_pythonpath: + os.environ['PYTHONPATH'] = lib_dirs + os.pathsep + \ + self.save_pythonpath + else: + os.environ['PYTHONPATH'] = lib_dirs + + os.environ['PYTHONNOUSERSITE'] = '1' + + return self.path + + def __exit__(self, exc_type, exc_val, exc_tb): + def restore_var(varname, old_value): + if old_value is None: + os.environ.pop(varname, None) + else: + os.environ[varname] = old_value + + restore_var('PATH', self.save_path) + restore_var('PYTHONPATH', self.save_pythonpath) + restore_var('PYTHONNOUSERSITE', self.save_nousersite) + + def cleanup(self): + self._temp_dir.cleanup() + + def missing_requirements(self, reqs): + """Return a list of the requirements from reqs that are not present + """ + missing = [] + with self: + ws = WorkingSet(os.environ["PYTHONPATH"].split(os.pathsep)) + for req in reqs: + try: + if ws.find(Requirement.parse(req)) is None: + missing.append(req) + except VersionConflict: + missing.append(req) + return missing + + def install_requirements(self, finder, requirements, message): + args = [ + sys.executable, '-m', 'pip', 'install', '--ignore-installed', + '--no-user', '--prefix', self.path, '--no-warn-script-location', + ] + if 
logger.getEffectiveLevel() <= logging.DEBUG: + args.append('-v') + for format_control in ('no_binary', 'only_binary'): + formats = getattr(finder.format_control, format_control) + args.extend(('--' + format_control.replace('_', '-'), + ','.join(sorted(formats or {':none:'})))) + if finder.index_urls: + args.extend(['-i', finder.index_urls[0]]) + for extra_index in finder.index_urls[1:]: + args.extend(['--extra-index-url', extra_index]) + else: + args.append('--no-index') + for link in finder.find_links: + args.extend(['--find-links', link]) + for _, host, _ in finder.secure_origins: + args.extend(['--trusted-host', host]) + if finder.allow_all_prereleases: + args.append('--pre') + if finder.process_dependency_links: + args.append('--process-dependency-links') + args.append('--') + args.extend(requirements) + with open_spinner(message) as spinner: + call_subprocess(args, show_stdout=False, spinner=spinner) + + +class NoOpBuildEnvironment(BuildEnvironment): + """A no-op drop-in replacement for BuildEnvironment + """ + + def __init__(self): + pass + + def __enter__(self): + pass + + def __exit__(self, exc_type, exc_val, exc_tb): + pass + + def cleanup(self): + pass + + def install_requirements(self, finder, requirements, message): + raise NotImplementedError() diff --git a/lib/python3.7/site-packages/pip/_internal/cache.py b/lib/python3.7/site-packages/pip/_internal/cache.py new file mode 100644 index 0000000..33bec97 --- /dev/null +++ b/lib/python3.7/site-packages/pip/_internal/cache.py @@ -0,0 +1,202 @@ +"""Cache Management +""" + +import errno +import hashlib +import logging +import os + +from pip._vendor.packaging.utils import canonicalize_name + +from pip._internal.download import path_to_url +from pip._internal.models.link import Link +from pip._internal.utils.compat import expanduser +from pip._internal.utils.temp_dir import TempDirectory +from pip._internal.wheel import InvalidWheelFilename, Wheel + +logger = logging.getLogger(__name__) + + +class 
Cache(object): + """An abstract class - provides cache directories for data from links + + + :param cache_dir: The root of the cache. + :param format_control: An object of FormatControl class to limit + binaries being read from the cache. + :param allowed_formats: which formats of files the cache should store. + ('binary' and 'source' are the only allowed values) + """ + + def __init__(self, cache_dir, format_control, allowed_formats): + super(Cache, self).__init__() + self.cache_dir = expanduser(cache_dir) if cache_dir else None + self.format_control = format_control + self.allowed_formats = allowed_formats + + _valid_formats = {"source", "binary"} + assert self.allowed_formats.union(_valid_formats) == _valid_formats + + def _get_cache_path_parts(self, link): + """Get parts of the path that must be os.path.joined with cache_dir + """ + + # We want to generate a URL to use as our cache key, we don't want to + # just re-use the URL because it might have other items in the fragment + # and we don't care about those. + key_parts = [link.url_without_fragment] + if link.hash_name is not None and link.hash is not None: + key_parts.append("=".join([link.hash_name, link.hash])) + key_url = "#".join(key_parts) + + # Encode our key url with sha224, we'll use this because it has similar + # security properties to sha256, but with a shorter total output (and + # thus less secure). However the differences don't make a lot of + # difference for our use case here. + hashed = hashlib.sha224(key_url.encode()).hexdigest() + + # We want to nest the directories some to prevent having a ton of top + # level directories where we might run out of sub directories on some + # FS.
+ parts = [hashed[:2], hashed[2:4], hashed[4:6], hashed[6:]] + + return parts + + def _get_candidates(self, link, package_name): + can_not_cache = ( + not self.cache_dir or + not package_name or + not link + ) + if can_not_cache: + return [] + + canonical_name = canonicalize_name(package_name) + formats = self.format_control.get_allowed_formats( + canonical_name + ) + if not self.allowed_formats.intersection(formats): + return [] + + root = self.get_path_for_link(link) + try: + return os.listdir(root) + except OSError as err: + if err.errno in {errno.ENOENT, errno.ENOTDIR}: + return [] + raise + + def get_path_for_link(self, link): + """Return a directory to store cached items in for link. + """ + raise NotImplementedError() + + def get(self, link, package_name): + """Returns a link to a cached item if it exists, otherwise returns the + passed link. + """ + raise NotImplementedError() + + def _link_for_candidate(self, link, candidate): + root = self.get_path_for_link(link) + path = os.path.join(root, candidate) + + return Link(path_to_url(path)) + + def cleanup(self): + pass + + +class SimpleWheelCache(Cache): + """A cache of wheels for future installs. + """ + + def __init__(self, cache_dir, format_control): + super(SimpleWheelCache, self).__init__( + cache_dir, format_control, {"binary"} + ) + + def get_path_for_link(self, link): + """Return a directory to store cached wheels for link + + Because there are M wheels for any one sdist, we provide a directory + to cache them in, and then consult that directory when looking up + cache hits. + + We only insert things into the cache if they have plausible version + numbers, so that we don't contaminate the cache with things that were + not unique. E.g. ./package might have dozens of installs done for it + and build a version of 0.0...and if we built and cached a wheel, we'd + end up using the same wheel even if the source has been edited. + + :param link: The link of the sdist for which this will cache wheels. 
+ """ + parts = self._get_cache_path_parts(link) + + # Store wheels within the root cache_dir + return os.path.join(self.cache_dir, "wheels", *parts) + + def get(self, link, package_name): + candidates = [] + + for wheel_name in self._get_candidates(link, package_name): + try: + wheel = Wheel(wheel_name) + except InvalidWheelFilename: + continue + if not wheel.supported(): + # Built for a different python/arch/etc + continue + candidates.append((wheel.support_index_min(), wheel_name)) + + if not candidates: + return link + + return self._link_for_candidate(link, min(candidates)[1]) + + +class EphemWheelCache(SimpleWheelCache): + """A SimpleWheelCache that creates it's own temporary cache directory + """ + + def __init__(self, format_control): + self._temp_dir = TempDirectory(kind="ephem-wheel-cache") + self._temp_dir.create() + + super(EphemWheelCache, self).__init__( + self._temp_dir.path, format_control + ) + + def cleanup(self): + self._temp_dir.cleanup() + + +class WheelCache(Cache): + """Wraps EphemWheelCache and SimpleWheelCache into a single Cache + + This Cache allows for gracefully degradation, using the ephem wheel cache + when a certain link is not found in the simple wheel cache first. 
+ """ + + def __init__(self, cache_dir, format_control): + super(WheelCache, self).__init__( + cache_dir, format_control, {'binary'} + ) + self._wheel_cache = SimpleWheelCache(cache_dir, format_control) + self._ephem_cache = EphemWheelCache(format_control) + + def get_path_for_link(self, link): + return self._wheel_cache.get_path_for_link(link) + + def get_ephem_path_for_link(self, link): + return self._ephem_cache.get_path_for_link(link) + + def get(self, link, package_name): + retval = self._wheel_cache.get(link, package_name) + if retval is link: + retval = self._ephem_cache.get(link, package_name) + return retval + + def cleanup(self): + self._wheel_cache.cleanup() + self._ephem_cache.cleanup() diff --git a/lib/python3.7/site-packages/pip/_internal/cli/__init__.py b/lib/python3.7/site-packages/pip/_internal/cli/__init__.py new file mode 100644 index 0000000..e589bb9 --- /dev/null +++ b/lib/python3.7/site-packages/pip/_internal/cli/__init__.py @@ -0,0 +1,4 @@ +"""Subpackage containing all of pip's command line interface related code +""" + +# This file intentionally does not import submodules diff --git a/lib/python3.7/site-packages/pip/_internal/cli/autocompletion.py b/lib/python3.7/site-packages/pip/_internal/cli/autocompletion.py new file mode 100644 index 0000000..0a04199 --- /dev/null +++ b/lib/python3.7/site-packages/pip/_internal/cli/autocompletion.py @@ -0,0 +1,152 @@ +"""Logic that powers autocompletion installed by ``pip completion``. +""" + +import optparse +import os +import sys + +from pip._internal.cli.main_parser import create_main_parser +from pip._internal.commands import commands_dict, get_summaries +from pip._internal.utils.misc import get_installed_distributions + + +def autocomplete(): + """Entry Point for completion of main and subcommand options. + """ + # Don't complete if user hasn't sourced bash_completion file. 
+ if 'PIP_AUTO_COMPLETE' not in os.environ: + return + cwords = os.environ['COMP_WORDS'].split()[1:] + cword = int(os.environ['COMP_CWORD']) + try: + current = cwords[cword - 1] + except IndexError: + current = '' + + subcommands = [cmd for cmd, summary in get_summaries()] + options = [] + # subcommand + try: + subcommand_name = [w for w in cwords if w in subcommands][0] + except IndexError: + subcommand_name = None + + parser = create_main_parser() + # subcommand options + if subcommand_name: + # special case: 'help' subcommand has no options + if subcommand_name == 'help': + sys.exit(1) + # special case: list locally installed dists for show and uninstall + should_list_installed = ( + subcommand_name in ['show', 'uninstall'] and + not current.startswith('-') + ) + if should_list_installed: + installed = [] + lc = current.lower() + for dist in get_installed_distributions(local_only=True): + if dist.key.startswith(lc) and dist.key not in cwords[1:]: + installed.append(dist.key) + # if there are no dists installed, fall back to option completion + if installed: + for dist in installed: + print(dist) + sys.exit(1) + + subcommand = commands_dict[subcommand_name]() + + for opt in subcommand.parser.option_list_all: + if opt.help != optparse.SUPPRESS_HELP: + for opt_str in opt._long_opts + opt._short_opts: + options.append((opt_str, opt.nargs)) + + # filter out previously specified options from available options + prev_opts = [x.split('=')[0] for x in cwords[1:cword - 1]] + options = [(x, v) for (x, v) in options if x not in prev_opts] + # filter options by current input + options = [(k, v) for k, v in options if k.startswith(current)] + # get completion type given cwords and available subcommand options + completion_type = get_path_completion_type( + cwords, cword, subcommand.parser.option_list_all, + ) + # get completion files and directories if ``completion_type`` is + # ``<file>``, ``<path>`` or ``<dir>`` + if completion_type: + options = auto_complete_paths(current, completion_type) 
+ options = ((opt, 0) for opt in options) + for option in options: + opt_label = option[0] + # append '=' to options which require args + if option[1] and option[0][:2] == "--": + opt_label += '=' + print(opt_label) + else: + # show main parser options only when necessary + + opts = [i.option_list for i in parser.option_groups] + opts.append(parser.option_list) + opts = (o for it in opts for o in it) + if current.startswith('-'): + for opt in opts: + if opt.help != optparse.SUPPRESS_HELP: + subcommands += opt._long_opts + opt._short_opts + else: + # get completion type given cwords and all available options + completion_type = get_path_completion_type(cwords, cword, opts) + if completion_type: + subcommands = auto_complete_paths(current, completion_type) + + print(' '.join([x for x in subcommands if x.startswith(current)])) + sys.exit(1) + + +def get_path_completion_type(cwords, cword, opts): + """Get the type of path completion (``file``, ``dir``, ``path`` or None) + + :param cwords: same as the environment variable ``COMP_WORDS`` + :param cword: same as the environment variable ``COMP_CWORD`` + :param opts: The available options to check + :return: path completion type (``file``, ``dir``, ``path`` or None) + """ + if cword < 2 or not cwords[cword - 2].startswith('-'): + return + for opt in opts: + if opt.help == optparse.SUPPRESS_HELP: + continue + for o in str(opt).split('/'): + if cwords[cword - 2].split('=')[0] == o: + if not opt.metavar or any( + x in ('path', 'file', 'dir') + for x in opt.metavar.split('/')): + return opt.metavar + + +def auto_complete_paths(current, completion_type): + """If ``completion_type`` is ``file`` or ``path``, list all regular files + and directories starting with ``current``; otherwise only list directories + starting with ``current``. 
+ + :param current: The word to be completed + :param completion_type: path completion type (``file``, ``path`` or ``dir``) + :return: A generator of regular files and/or directories + """ + directory, filename = os.path.split(current) + current_path = os.path.abspath(directory) + # Don't complete paths if they can't be accessed + if not os.access(current_path, os.R_OK): + return + filename = os.path.normcase(filename) + # list all files that start with ``filename`` + file_list = (x for x in os.listdir(current_path) + if os.path.normcase(x).startswith(filename)) + for f in file_list: + opt = os.path.join(current_path, f) + comp_file = os.path.normcase(os.path.join(directory, f)) + # complete regular files when there is not ``<dir>`` after option + # complete directories when there is ``<file>``, ``<path>`` or + # ``<dir>`` after option + if completion_type != 'dir' and os.path.isfile(opt): + yield comp_file + elif os.path.isdir(opt): + yield os.path.join(comp_file, '') diff --git a/lib/python3.4/site-packages/pip/basecommand.py b/lib/python3.7/site-packages/pip/_internal/cli/base_command.py similarity index 51% rename from lib/python3.4/site-packages/pip/basecommand.py rename to lib/python3.7/site-packages/pip/_internal/cli/base_command.py index 54c6706..dac4b05 100644 --- a/lib/python3.4/site-packages/pip/basecommand.py +++ b/lib/python3.7/site-packages/pip/_internal/cli/base_command.py @@ -2,41 +2,48 @@ from __future__ import absolute_import import logging +import logging.config +import optparse import os import sys -import optparse -import warnings -from pip import cmdoptions -from pip.index import PackageFinder -from pip.locations import running_under_virtualenv -from pip.download import PipSession -from pip.exceptions import (BadCommand, InstallationError, UninstallationError, - CommandError, PreviousBuildDirError) - -from pip.compat import logging_dictConfig -from pip.baseparser import ConfigOptionParser, UpdatingDefaultsHelpFormatter -from pip.req import InstallRequirement, 
parse_requirements -from pip.status_codes import ( - SUCCESS, ERROR, UNKNOWN_ERROR, VIRTUALENV_NOT_FOUND, - PREVIOUS_BUILD_DIR_ERROR, +from pip._internal.cli import cmdoptions +from pip._internal.cli.parser import ( + ConfigOptionParser, UpdatingDefaultsHelpFormatter, ) -from pip.utils import deprecation, get_prog, normalize_path -from pip.utils.logging import IndentingFormatter -from pip.utils.outdated import pip_version_check +from pip._internal.cli.status_codes import ( + ERROR, PREVIOUS_BUILD_DIR_ERROR, SUCCESS, UNKNOWN_ERROR, + VIRTUALENV_NOT_FOUND, +) +from pip._internal.download import PipSession +from pip._internal.exceptions import ( + BadCommand, CommandError, InstallationError, PreviousBuildDirError, + UninstallationError, +) +from pip._internal.index import PackageFinder +from pip._internal.locations import running_under_virtualenv +from pip._internal.req.constructors import ( + install_req_from_editable, install_req_from_line, +) +from pip._internal.req.req_file import parse_requirements +from pip._internal.utils.logging import setup_logging +from pip._internal.utils.misc import get_prog, normalize_path +from pip._internal.utils.outdated import pip_version_check +from pip._internal.utils.typing import MYPY_CHECK_RUNNING +if MYPY_CHECK_RUNNING: + from typing import Optional # noqa: F401 __all__ = ['Command'] - logger = logging.getLogger(__name__) class Command(object): - name = None - usage = None - hidden = False - log_streams = ("ext://sys.stdout", "ext://sys.stderr") + name = None # type: Optional[str] + usage = None # type: Optional[str] + hidden = False # type: bool + ignore_require_venv = False # type: bool def __init__(self, isolated=False): parser_kw = { @@ -105,97 +112,18 @@ class Command(object): def main(self, args): options, args = self.parse_args(args) - if options.quiet: - if options.quiet == 1: - level = "WARNING" - if options.quiet == 2: - level = "ERROR" - else: - level = "CRITICAL" - elif options.verbose: - level = "DEBUG" - else: - 
level = "INFO" + # Set verbosity so that it can be used elsewhere. + self.verbosity = options.verbose - options.quiet - # The root logger should match the "console" level *unless* we - # specified "--log" to send debug logs to a file. - root_level = level - if options.log: - root_level = "DEBUG" + setup_logging( + verbosity=self.verbosity, + no_color=options.no_color, + user_log_file=options.log, + ) - logging_dictConfig({ - "version": 1, - "disable_existing_loggers": False, - "filters": { - "exclude_warnings": { - "()": "pip.utils.logging.MaxLevelFilter", - "level": logging.WARNING, - }, - }, - "formatters": { - "indent": { - "()": IndentingFormatter, - "format": "%(message)s", - }, - }, - "handlers": { - "console": { - "level": level, - "class": "pip.utils.logging.ColorizedStreamHandler", - "stream": self.log_streams[0], - "filters": ["exclude_warnings"], - "formatter": "indent", - }, - "console_errors": { - "level": "WARNING", - "class": "pip.utils.logging.ColorizedStreamHandler", - "stream": self.log_streams[1], - "formatter": "indent", - }, - "user_log": { - "level": "DEBUG", - "class": "pip.utils.logging.BetterRotatingFileHandler", - "filename": options.log or "/dev/null", - "delay": True, - "formatter": "indent", - }, - }, - "root": { - "level": root_level, - "handlers": list(filter(None, [ - "console", - "console_errors", - "user_log" if options.log else None, - ])), - }, - # Disable any logging besides WARNING unless we have DEBUG level - # logging enabled. These use both pip._vendor and the bare names - # for the case where someone unbundles our libraries. - "loggers": dict( - ( - name, - { - "level": ( - "WARNING" - if level in ["INFO", "ERROR"] - else "DEBUG" - ), - }, - ) - for name in ["pip._vendor", "distlib", "requests", "urllib3"] - ), - }) - - if sys.version_info[:2] == (2, 6): - warnings.warn( - "Python 2.6 is no longer supported by the Python core team, " - "please upgrade your Python. 
A future version of pip will " - "drop support for Python 2.6", - deprecation.Python26DeprecationWarning - ) - - # TODO: try to get these passing down from the command? - # without resorting to os.environ to hold these. + # TODO: Try to get these passing down from the command? + # without resorting to os.environ to hold these. + # This also affects isolated builds and it should. if options.no_input: os.environ['PIP_NO_INPUT'] = '1' @@ -203,7 +131,7 @@ class Command(object): if options.exists_action: os.environ['PIP_EXISTS_ACTION'] = ' '.join(options.exists_action) - if options.require_venv: + if options.require_venv and not self.ignore_require_venv: # If a venv is required check if it can really be found if not running_under_virtualenv(): logger.critical( @@ -237,19 +165,29 @@ class Command(object): logger.debug('Exception information:', exc_info=True) return ERROR - except: + except BaseException: logger.critical('Exception:', exc_info=True) return UNKNOWN_ERROR finally: + allow_version_check = ( + # Does this command have the index_group options? + hasattr(options, "no_index") and + # Is this command allowed to perform this check? + not (options.disable_pip_version_check or options.no_index) + ) # Check if we're using the latest version of pip available - if (not options.disable_pip_version_check and not - getattr(options, "no_index", False)): - with self._build_session( - options, - retries=0, - timeout=min(5, options.timeout)) as session: - pip_version_check(session) + if allow_version_check: + session = self._build_session( + options, + retries=0, + timeout=min(5, options.timeout) + ) + with session: + pip_version_check(session, options) + + # Shutdown the logging module + logging.shutdown() return SUCCESS @@ -262,54 +200,56 @@ class RequirementCommand(Command): """ Marshal cmd line args into a requirement set. 
""" + # NOTE: As a side-effect, options.require_hashes and + # requirement_set.require_hashes may be updated + for filename in options.constraints: - for req in parse_requirements( + for req_to_add in parse_requirements( filename, constraint=True, finder=finder, options=options, session=session, wheel_cache=wheel_cache): - requirement_set.add_requirement(req) + req_to_add.is_direct = True + requirement_set.add_requirement(req_to_add) for req in args: - requirement_set.add_requirement( - InstallRequirement.from_line( - req, None, isolated=options.isolated_mode, - wheel_cache=wheel_cache - ) + req_to_add = install_req_from_line( + req, None, isolated=options.isolated_mode, + wheel_cache=wheel_cache ) + req_to_add.is_direct = True + requirement_set.add_requirement(req_to_add) for req in options.editables: - requirement_set.add_requirement( - InstallRequirement.from_editable( - req, - default_vcs=options.default_vcs, - isolated=options.isolated_mode, - wheel_cache=wheel_cache - ) + req_to_add = install_req_from_editable( + req, + isolated=options.isolated_mode, + wheel_cache=wheel_cache ) + req_to_add.is_direct = True + requirement_set.add_requirement(req_to_add) - found_req_in_file = False for filename in options.requirements: - for req in parse_requirements( + for req_to_add in parse_requirements( filename, finder=finder, options=options, session=session, wheel_cache=wheel_cache): - found_req_in_file = True - requirement_set.add_requirement(req) + req_to_add.is_direct = True + requirement_set.add_requirement(req_to_add) # If --require-hashes was a line in a requirements file, tell # RequirementSet about it: requirement_set.require_hashes = options.require_hashes - if not (args or options.editables or found_req_in_file): + if not (args or options.editables or options.requirements): opts = {'name': name} if options.find_links: - msg = ('You must give at least one requirement to ' - '%(name)s (maybe you meant "pip %(name)s ' - '%(links)s"?)' % - dict(opts, links=' 
'.join(options.find_links))) + raise CommandError( + 'You must give at least one requirement to %(name)s ' + '(maybe you meant "pip %(name)s %(links)s"?)' % + dict(opts, links=' '.join(options.find_links))) else: - msg = ('You must give at least one requirement ' - 'to %(name)s (see "pip help %(name)s")' % opts) - logger.warning(msg) + raise CommandError( + 'You must give at least one requirement to %(name)s ' + '(see "pip help %(name)s")' % opts) def _build_package_finder(self, options, session, platform=None, python_versions=None, @@ -334,4 +274,5 @@ class RequirementCommand(Command): versions=python_versions, abi=abi, implementation=implementation, + prefer_binary=options.prefer_binary, ) diff --git a/lib/python3.4/site-packages/pip/cmdoptions.py b/lib/python3.7/site-packages/pip/_internal/cli/cmdoptions.py similarity index 64% rename from lib/python3.4/site-packages/pip/cmdoptions.py rename to lib/python3.7/site-packages/pip/_internal/cli/cmdoptions.py index f75c093..29b758f 100644 --- a/lib/python3.4/site-packages/pip/cmdoptions.py +++ b/lib/python3.7/site-packages/pip/_internal/cli/cmdoptions.py @@ -9,16 +9,20 @@ pass on state. To be consistent, all options will follow this design. 
""" from __future__ import absolute_import -from functools import partial -from optparse import OptionGroup, SUPPRESS_HELP, Option import warnings +from functools import partial +from optparse import SUPPRESS_HELP, Option, OptionGroup -from pip.index import ( - FormatControl, fmt_ctl_handle_mutual_exclude, fmt_ctl_no_binary, - fmt_ctl_no_use_wheel) -from pip.models import PyPI -from pip.locations import USER_CACHE_DIR, src_prefix -from pip.utils.hashes import STRONG_HASHES +from pip._internal.exceptions import CommandError +from pip._internal.locations import USER_CACHE_DIR, src_prefix +from pip._internal.models.format_control import FormatControl +from pip._internal.models.index import PyPI +from pip._internal.utils.hashes import STRONG_HASHES +from pip._internal.utils.typing import MYPY_CHECK_RUNNING +from pip._internal.utils.ui import BAR_TYPES + +if MYPY_CHECK_RUNNING: + from typing import Any # noqa: F401 def make_option_group(group, parser): @@ -33,12 +37,6 @@ def make_option_group(group, parser): return option_group -def resolve_wheel_no_use_binary(options): - if not options.use_wheel: - control = options.format_control - fmt_ctl_no_use_wheel(control) - - def check_install_build_global(options, check_options=None): """Disable wheels if per-setup.py call options are set. @@ -54,10 +52,50 @@ def check_install_build_global(options, check_options=None): names = ["build_options", "global_options", "install_options"] if any(map(getname, names)): control = options.format_control - fmt_ctl_no_binary(control) + control.disallow_binaries() warnings.warn( 'Disabling all use of wheels due to the use of --build-options ' - '/ --global-options / --install-options.', stacklevel=2) + '/ --global-options / --install-options.', stacklevel=2, + ) + + +def check_dist_restriction(options, check_target=False): + """Function for determining if custom platform options are allowed. + + :param options: The OptionParser options. 
+ :param check_target: Whether or not to check if --target is being used. + """ + dist_restriction_set = any([ + options.python_version, + options.platform, + options.abi, + options.implementation, + ]) + + binary_only = FormatControl(set(), {':all:'}) + sdist_dependencies_allowed = ( + options.format_control != binary_only and + not options.ignore_dependencies + ) + + # Installations or downloads using dist restrictions must not combine + # source distributions and dist-specific wheels, as they are not + # guaranteed to be locally compatible. + if dist_restriction_set and sdist_dependencies_allowed: + raise CommandError( + "When restricting platform and interpreter constraints using " + "--python-version, --platform, --abi, or --implementation, " + "either --no-deps must be set, or --only-binary=:all: must be " + "set and --no-binary must not be set (or must be set to " + ":none:)." + ) + + if check_target: + if dist_restriction_set and not options.target_dir: + raise CommandError( + "Can not use any platform or abi specific options unless " + "installing via '--target'" + ) ########### @@ -69,7 +107,8 @@ help_ = partial( '-h', '--help', dest='help', action='help', - help='Show help.') + help='Show help.', +) # type: Any isolated_mode = partial( Option, @@ -90,7 +129,8 @@ require_virtualenv = partial( dest='require_venv', action='store_true', default=False, - help=SUPPRESS_HELP) + help=SUPPRESS_HELP +) # type: Any verbose = partial( Option, @@ -101,12 +141,22 @@ verbose = partial( help='Give more output. Option is additive, and can be used up to 3 times.' 
) +no_color = partial( + Option, + '--no-color', + dest='no_color', + action='store_true', + default=False, + help="Suppress colored output", +) + version = partial( Option, '-V', '--version', dest='version', action='store_true', - help='Show version and exit.') + help='Show version and exit.', +) # type: Any quiet = partial( Option, @@ -114,10 +164,25 @@ quiet = partial( dest='quiet', action='count', default=0, - help=('Give less output. Option is additive, and can be used up to 3' - ' times (corresponding to WARNING, ERROR, and CRITICAL logging' - ' levels).') -) + help=( + 'Give less output. Option is additive, and can be used up to 3' + ' times (corresponding to WARNING, ERROR, and CRITICAL logging' + ' levels).' + ), +) # type: Any + +progress_bar = partial( + Option, + '--progress-bar', + dest='progress_bar', + type='choice', + choices=list(BAR_TYPES.keys()), + default='on', + help=( + 'Specify type of progress to be displayed [' + + '|'.join(BAR_TYPES.keys()) + '] (default: %default)' + ), +) # type: Any log = partial( Option, @@ -125,7 +190,7 @@ log = partial( dest="log", metavar="path", help="Path to a verbose appending log." -) +) # type: Any no_input = partial( Option, @@ -134,7 +199,8 @@ no_input = partial( dest='no_input', action='store_true', default=False, - help=SUPPRESS_HELP) + help=SUPPRESS_HELP +) # type: Any proxy = partial( Option, @@ -142,7 +208,8 @@ proxy = partial( dest='proxy', type='str', default='', - help="Specify a proxy in the form [user:passwd@]proxy.server:port.") + help="Specify a proxy in the form [user:passwd@]proxy.server:port." 
+) # type: Any retries = partial( Option, @@ -151,7 +218,8 @@ retries = partial( type='int', default=5, help="Maximum number of retries each connection should attempt " - "(default %default times).") + "(default %default times).", +) # type: Any timeout = partial( Option, @@ -160,16 +228,8 @@ timeout = partial( dest='timeout', type='float', default=15, - help='Set the socket timeout (default %default seconds).') - -default_vcs = partial( - Option, - # The default version control system for editables, e.g. 'svn' - '--default-vcs', - dest='default_vcs', - type='str', - default='', - help=SUPPRESS_HELP) + help='Set the socket timeout (default %default seconds).', +) # type: Any skip_requirements_regex = partial( Option, @@ -178,7 +238,8 @@ skip_requirements_regex = partial( dest='skip_requirements_regex', type='str', default='', - help=SUPPRESS_HELP) + help=SUPPRESS_HELP, +) # type: Any def exists_action(): @@ -192,7 +253,8 @@ def exists_action(): action='append', metavar='action', help="Default action when a path already exists: " - "(s)witch, (i)gnore, (w)ipe, (b)ackup, (a)bort.") + "(s)witch, (i)gnore, (w)ipe, (b)ackup, (a)bort.", + ) cert = partial( @@ -201,7 +263,8 @@ cert = partial( dest='cert', type='str', metavar='path', - help="Path to alternate CA bundle.") + help="Path to alternate CA bundle.", +) # type: Any client_cert = partial( Option, @@ -211,7 +274,8 @@ client_cert = partial( default=None, metavar='path', help="Path to SSL client certificate, a single file containing the " - "private key and the certificate in PEM format.") + "private key and the certificate in PEM format.", +) # type: Any index_url = partial( Option, @@ -222,7 +286,8 @@ index_url = partial( help="Base URL of Python Package Index (default %default). 
" "This should point to a repository compliant with PEP 503 " "(the simple repository API) or a local directory laid out " - "in the same format.") + "in the same format.", +) # type: Any def extra_index_url(): @@ -234,7 +299,7 @@ def extra_index_url(): default=[], help="Extra URLs of package indexes to use in addition to " "--index-url. Should follow the same rules as " - "--index-url." + "--index-url.", ) @@ -244,7 +309,8 @@ no_index = partial( dest='no_index', action='store_true', default=False, - help='Ignore package index (only looking at --find-links URLs instead).') + help='Ignore package index (only looking at --find-links URLs instead).', +) # type: Any def find_links(): @@ -256,30 +322,10 @@ def find_links(): metavar='url', help="If a url or path to an html file, then parse for links to " "archives. If a local path or file:// url that's a directory, " - "then look for archives in the directory listing.") - - -def allow_external(): - return Option( - "--allow-external", - dest="allow_external", - action="append", - default=[], - metavar="PACKAGE", - help=SUPPRESS_HELP, + "then look for archives in the directory listing.", ) -allow_all_external = partial( - Option, - "--allow-all-external", - dest="allow_all_external", - action="store_true", - default=False, - help=SUPPRESS_HELP, -) - - def trusted_host(): return Option( "--trusted-host", @@ -292,38 +338,6 @@ def trusted_host(): ) -# Remove after 7.0 -no_allow_external = partial( - Option, - "--no-allow-external", - dest="allow_all_external", - action="store_false", - default=False, - help=SUPPRESS_HELP, -) - - -# Remove --allow-insecure after 7.0 -def allow_unsafe(): - return Option( - "--allow-unverified", "--allow-insecure", - dest="allow_unverified", - action="append", - default=[], - metavar="PACKAGE", - help=SUPPRESS_HELP, - ) - -# Remove after 7.0 -no_allow_unsafe = partial( - Option, - "--no-allow-insecure", - dest="allow_all_insecure", - action="store_false", - default=False, - help=SUPPRESS_HELP 
-) - # Remove after 1.5 process_dependency_links = partial( Option, @@ -332,7 +346,7 @@ process_dependency_links = partial( action="store_true", default=False, help="Enable the processing of dependency links.", -) +) # type: Any def constraints(): @@ -343,7 +357,8 @@ def constraints(): default=[], metavar='file', help='Constrain versions using the given constraints file. ' - 'This option can be used multiple times.') + 'This option can be used multiple times.' + ) def requirements(): @@ -354,7 +369,8 @@ def requirements(): default=[], metavar='file', help='Install from the given requirements file. ' - 'This option can be used multiple times.') + 'This option can be used multiple times.' + ) def editable(): @@ -368,6 +384,7 @@ def editable(): '"develop mode") from a local project path or a VCS url.'), ) + src = partial( Option, '--src', '--source', '--source-dir', '--source-directory', @@ -377,28 +394,7 @@ src = partial( help='Directory to check out editable projects into. ' 'The default in a virtualenv is "/src". ' 'The default for global installs is "/src".' -) - -# XXX: deprecated, remove in 9.0 -use_wheel = partial( - Option, - '--use-wheel', - dest='use_wheel', - action='store_true', - default=True, - help=SUPPRESS_HELP, -) - -# XXX: deprecated, remove in 9.0 -no_use_wheel = partial( - Option, - '--no-use-wheel', - dest='use_wheel', - action='store_false', - default=True, - help=('Do not Find and prefer wheel archives when searching indexes and ' - 'find-links locations. 
DEPRECATED in favour of --no-binary.'), -) +) # type: Any def _get_format_control(values, option): @@ -407,41 +403,112 @@ def _get_format_control(values, option): def _handle_no_binary(option, opt_str, value, parser): - existing = getattr(parser.values, option.dest) - fmt_ctl_handle_mutual_exclude( - value, existing.no_binary, existing.only_binary) + existing = _get_format_control(parser.values, option) + FormatControl.handle_mutual_excludes( + value, existing.no_binary, existing.only_binary, + ) def _handle_only_binary(option, opt_str, value, parser): - existing = getattr(parser.values, option.dest) - fmt_ctl_handle_mutual_exclude( - value, existing.only_binary, existing.no_binary) + existing = _get_format_control(parser.values, option) + FormatControl.handle_mutual_excludes( + value, existing.only_binary, existing.no_binary, + ) def no_binary(): + format_control = FormatControl(set(), set()) return Option( "--no-binary", dest="format_control", action="callback", callback=_handle_no_binary, type="str", - default=FormatControl(set(), set()), + default=format_control, help="Do not use binary packages. Can be supplied multiple times, and " "each time adds to the existing value. Accepts either :all: to " "disable all binary packages, :none: to empty the set, or one or " "more package names with commas between them. Note that some " "packages are tricky to compile and may fail to install when " - "this option is used on them.") + "this option is used on them.", + ) def only_binary(): + format_control = FormatControl(set(), set()) return Option( "--only-binary", dest="format_control", action="callback", callback=_handle_only_binary, type="str", - default=FormatControl(set(), set()), + default=format_control, help="Do not use source packages. Can be supplied multiple times, and " "each time adds to the existing value. Accepts either :all: to " "disable all source packages, :none: to empty the set, or one or " "more package names with commas between them. 
Packages without " "binary distributions will fail to install when this option is " - "used on them.") + "used on them.", + ) + + +platform = partial( + Option, + '--platform', + dest='platform', + metavar='platform', + default=None, + help=("Only use wheels compatible with . " + "Defaults to the platform of the running system."), +) + + +python_version = partial( + Option, + '--python-version', + dest='python_version', + metavar='python_version', + default=None, + help=("Only use wheels compatible with Python " + "interpreter version . If not specified, then the " + "current system interpreter minor version is used. A major " + "version (e.g. '2') can be specified to match all " + "minor revs of that major version. A minor version " + "(e.g. '34') can also be specified."), +) + + +implementation = partial( + Option, + '--implementation', + dest='implementation', + metavar='implementation', + default=None, + help=("Only use wheels compatible with Python " + "implementation , e.g. 'pp', 'jy', 'cp', " + " or 'ip'. If not specified, then the current " + "interpreter implementation is used. Use 'py' to force " + "implementation-agnostic wheels."), +) + + +abi = partial( + Option, + '--abi', + dest='abi', + metavar='abi', + default=None, + help=("Only use wheels compatible with Python " + "abi , e.g. 'pypy_41'. If not specified, then the " + "current interpreter abi tag is used. Generally " + "you will need to specify --implementation, " + "--platform, and --python-version when using " + "this option."), +) + + +def prefer_binary(): + return Option( + "--prefer-binary", + dest="prefer_binary", + action="store_true", + default=False, + help="Prefer older binary packages over newer source packages." 
+ ) cache_dir = partial( @@ -467,22 +534,39 @@ no_deps = partial( dest='ignore_dependencies', action='store_true', default=False, - help="Don't install package dependencies.") + help="Don't install package dependencies.", +) # type: Any build_dir = partial( Option, '-b', '--build', '--build-dir', '--build-directory', dest='build_dir', metavar='dir', - help='Directory to unpack packages into and build in.' -) + help='Directory to unpack packages into and build in. Note that ' + 'an initial build still takes place in a temporary directory. ' + 'The location of temporary directories can be controlled by setting ' + 'the TMPDIR environment variable (TEMP on Windows) appropriately. ' + 'When passed, build directories are not cleaned in case of failures.' +) # type: Any ignore_requires_python = partial( Option, '--ignore-requires-python', dest='ignore_requires_python', action='store_true', - help='Ignore the Requires-Python information.') + help='Ignore the Requires-Python information.' +) # type: Any + +no_build_isolation = partial( + Option, + '--no-build-isolation', + dest='build_isolation', + action='store_false', + default=True, + help='Disable isolation when building a modern source distribution. ' + 'Build dependencies specified by PEP 518 must be already installed ' + 'if this option is used.' +) # type: Any install_options = partial( Option, @@ -494,7 +578,8 @@ install_options = partial( "command (use like --install-option=\"--install-scripts=/usr/local/" "bin\"). Use multiple --install-option options to pass multiple " "options to setup.py install. 
If you are using an option with a " - "directory path, be sure to use absolute path.") + "directory path, be sure to use absolute path.", +) # type: Any global_options = partial( Option, @@ -503,14 +588,16 @@ global_options = partial( action='append', metavar='options', help="Extra global options to be supplied to the setup.py " - "call before the install command.") + "call before the install command.", +) # type: Any no_clean = partial( Option, '--no-clean', action='store_true', default=False, - help="Don't clean up build directories.") + help="Don't clean up build directories." +) # type: Any pre = partial( Option, @@ -518,7 +605,8 @@ pre = partial( action='store_true', default=False, help="Include pre-release and development versions. By default, " - "pip only finds stable versions.") + "pip only finds stable versions.", +) # type: Any disable_pip_version_check = partial( Option, @@ -527,7 +615,9 @@ disable_pip_version_check = partial( action="store_true", default=True, help="Don't periodically check PyPI to determine whether a new version " - "of pip is available for download. Implied with --no-index.") + "of pip is available for download. Implied with --no-index.", +) # type: Any + # Deprecated, Remove later always_unzip = partial( @@ -536,7 +626,7 @@ always_unzip = partial( dest='always_unzip', action='store_true', help=SUPPRESS_HELP, -) +) # type: Any def _merge_hash(option, opt_str, value, parser): @@ -566,7 +656,8 @@ hash = partial( callback=_merge_hash, type='string', help="Verify that the package's archive matches this " - 'hash before installing. Example: --hash=sha256:abcdef...') + 'hash before installing. Example: --hash=sha256:abcdef...', +) # type: Any require_hashes = partial( @@ -577,7 +668,8 @@ require_hashes = partial( default=False, help='Require a hash to check each requirement against, for ' 'repeatable installs. 
This option is implied when any package in a ' - 'requirements file has a --hash option.') + 'requirements file has a --hash option.', +) # type: Any ########## @@ -598,7 +690,6 @@ general_group = { proxy, retries, timeout, - default_vcs, skip_requirements_regex, exists_action, trusted_host, @@ -607,10 +698,11 @@ general_group = { cache_dir, no_cache, disable_pip_version_check, + no_color, ] } -non_deprecated_index_group = { +index_group = { 'name': 'Package Index Options', 'options': [ index_url, @@ -620,14 +712,3 @@ non_deprecated_index_group = { process_dependency_links, ] } - -index_group = { - 'name': 'Package Index Options (including deprecated options)', - 'options': non_deprecated_index_group['options'] + [ - allow_external, - allow_all_external, - no_allow_external, - allow_unsafe, - no_allow_unsafe, - ] -} diff --git a/lib/python3.7/site-packages/pip/_internal/cli/main_parser.py b/lib/python3.7/site-packages/pip/_internal/cli/main_parser.py new file mode 100644 index 0000000..1774a6b --- /dev/null +++ b/lib/python3.7/site-packages/pip/_internal/cli/main_parser.py @@ -0,0 +1,96 @@ +"""A single place for constructing and exposing the main parser +""" + +import os +import sys + +from pip import __version__ +from pip._internal.cli import cmdoptions +from pip._internal.cli.parser import ( + ConfigOptionParser, UpdatingDefaultsHelpFormatter, +) +from pip._internal.commands import ( + commands_dict, get_similar_commands, get_summaries, +) +from pip._internal.exceptions import CommandError +from pip._internal.utils.misc import get_prog + +__all__ = ["create_main_parser", "parse_command"] + + +def create_main_parser(): + """Creates and returns the main parser for pip's CLI + """ + + parser_kw = { + 'usage': '\n%prog <command> [options]', + 'add_help_option': False, + 'formatter': UpdatingDefaultsHelpFormatter(), + 'name': 'global', + 'prog': get_prog(), + } + + parser = ConfigOptionParser(**parser_kw) + parser.disable_interspersed_args() + + pip_pkg_dir = 
os.path.abspath(os.path.join( + os.path.dirname(__file__), "..", "..", + )) + parser.version = 'pip %s from %s (python %s)' % ( + __version__, pip_pkg_dir, sys.version[:3], + ) + + # add the general options + gen_opts = cmdoptions.make_option_group(cmdoptions.general_group, parser) + parser.add_option_group(gen_opts) + + parser.main = True # so the help formatter knows + + # create command listing for description + command_summaries = get_summaries() + description = [''] + ['%-27s %s' % (i, j) for i, j in command_summaries] + parser.description = '\n'.join(description) + + return parser + + +def parse_command(args): + parser = create_main_parser() + + # Note: parser calls disable_interspersed_args(), so the result of this + # call is to split the initial args into the general options before the + # subcommand and everything else. + # For example: + # args: ['--timeout=5', 'install', '--user', 'INITools'] + # general_options: ['--timeout=5'] + # args_else: ['install', '--user', 'INITools'] + general_options, args_else = parser.parse_args(args) + + # --version + if general_options.version: + sys.stdout.write(parser.version) + sys.stdout.write(os.linesep) + sys.exit() + + # pip || pip help -> print_help() + if not args_else or (args_else[0] == 'help' and len(args_else) == 1): + parser.print_help() + sys.exit() + + # the subcommand name + cmd_name = args_else[0] + + if cmd_name not in commands_dict: + guess = get_similar_commands(cmd_name) + + msg = ['unknown command "%s"' % cmd_name] + if guess: + msg.append('maybe you meant "%s"' % guess) + + raise CommandError(' - '.join(msg)) + + # all the args without the subcommand + cmd_args = args[:] + cmd_args.remove(cmd_name) + + return cmd_name, cmd_args diff --git a/lib/python3.4/site-packages/pip/baseparser.py b/lib/python3.7/site-packages/pip/_internal/cli/parser.py similarity index 67% rename from lib/python3.4/site-packages/pip/baseparser.py rename to lib/python3.7/site-packages/pip/_internal/cli/parser.py index 
2dd4533..e1eaac4 100644 --- a/lib/python3.4/site-packages/pip/baseparser.py +++ b/lib/python3.7/site-packages/pip/_internal/cli/parser.py @@ -1,23 +1,19 @@ """Base option parser setup""" from __future__ import absolute_import -import sys +import logging import optparse -import os -import re +import sys import textwrap from distutils.util import strtobool from pip._vendor.six import string_types -from pip._vendor.six.moves import configparser -from pip.locations import ( - legacy_config_file, config_basename, running_under_virtualenv, - site_config_files -) -from pip.utils import appdirs, get_terminal_size +from pip._internal.cli.status_codes import UNKNOWN_ERROR +from pip._internal.configuration import Configuration, ConfigurationError +from pip._internal.utils.compat import get_terminal_size -_environ_prefix_re = re.compile(r"^PIP_", re.I) +logger = logging.getLogger(__name__) class PrettyHelpFormatter(optparse.IndentedHelpFormatter): @@ -137,58 +133,15 @@ class ConfigOptionParser(CustomOptionParser): """Custom option parser which updates its defaults by checking the configuration files and environmental variables""" - isolated = False - def __init__(self, *args, **kwargs): - self.config = configparser.RawConfigParser() self.name = kwargs.pop('name') - self.isolated = kwargs.pop("isolated", False) - self.files = self.get_config_files() - if self.files: - self.config.read(self.files) + + isolated = kwargs.pop("isolated", False) + self.config = Configuration(isolated) + assert self.name optparse.OptionParser.__init__(self, *args, **kwargs) - def get_config_files(self): - # the files returned by this method will be parsed in order with the - # first files listed being overridden by later files in standard - # ConfigParser fashion - config_file = os.environ.get('PIP_CONFIG_FILE', False) - if config_file == os.devnull: - return [] - - # at the base we have any site-wide configuration - files = list(site_config_files) - - # per-user configuration next - if not 
self.isolated: - if config_file and os.path.exists(config_file): - files.append(config_file) - else: - # This is the legacy config file, we consider it to be a lower - # priority than the new file location. - files.append(legacy_config_file) - - # This is the new config file, we consider it to be a higher - # priority than the legacy file. - files.append( - os.path.join( - appdirs.user_config_dir("pip"), - config_basename, - ) - ) - - # finally virtualenv configuration first trumping others - if running_under_virtualenv(): - venv_config_file = os.path.join( - sys.prefix, - config_basename, - ) - if os.path.exists(venv_config_file): - files.append(venv_config_file) - - return files - def check_default(self, option, key, val): try: return option.check_value(key, val) @@ -196,30 +149,43 @@ class ConfigOptionParser(CustomOptionParser): print("An error occurred during configuration: %s" % exc) sys.exit(3) + def _get_ordered_configuration_items(self): + # Configuration gives keys in an unordered manner. Order them. + override_order = ["global", self.name, ":env:"] + + # Pool the options into different groups + section_items = {name: [] for name in override_order} + for section_key, val in self.config.items(): + # ignore empty values + if not val: + logger.debug( + "Ignoring configuration key '%s' as its value is empty.", + section_key + ) + continue + + section, key = section_key.split(".", 1) + if section in override_order: + section_items[section].append((key, val)) + + # Yield each group in their override order + for section in override_order: + for key, val in section_items[section]: + yield key, val + def _update_defaults(self, defaults): """Updates the given defaults with values from the config files and the environ. Does a little special handling for certain types of options (lists).""" - # Then go and look for the other sources of configuration: - config = {} - # 1. 
config files - for section in ('global', self.name): - config.update( - self.normalize_keys(self.get_config_section(section)) - ) - # 2. environmental variables - if not self.isolated: - config.update(self.normalize_keys(self.get_environ_vars())) + # Accumulate complex default state. self.values = optparse.Values(self.defaults) late_eval = set() # Then set the options with those values - for key, val in config.items(): - # ignore empty values - if not val: - continue + for key, val in self._get_ordered_configuration_items(): + # '--' because configuration supports only long names + option = self.get_option('--' + key) - option = self.get_option(key) # Ignore options not present in this parser. E.g. non-globals put # in [global] by users that want them to apply to all applicable # commands. @@ -227,7 +193,14 @@ class ConfigOptionParser(CustomOptionParser): continue if option.action in ('store_true', 'store_false', 'count'): - val = strtobool(val) + try: + val = strtobool(val) + except ValueError: + error_msg = invalid_config_error_message( + option.action, key, val + ) + self.error(error_msg) + elif option.action == 'append': val = val.split() val = [self.check_default(option, key, v) for v in val] @@ -249,30 +222,6 @@ class ConfigOptionParser(CustomOptionParser): self.values = None return defaults - def normalize_keys(self, items): - """Return a config dictionary with normalized keys regardless of - whether the keys were specified in environment variables or in config - files""" - normalized = {} - for key, val in items: - key = key.replace('_', '-') - if not key.startswith('--'): - key = '--%s' % key # only prefer long opts - normalized[key] = val - return normalized - - def get_config_section(self, name): - """Get a section of a configuration""" - if self.config.has_section(name): - return self.config.items(name) - return [] - - def get_environ_vars(self): - """Returns a generator with all environmental vars with prefix PIP_""" - for key, val in 
os.environ.items(): - if _environ_prefix_re.search(key): - yield (_environ_prefix_re.sub("", key).lower(), val) - def get_default_values(self): """Overriding to make updating the defaults after instantiation of the option parser possible, _update_defaults() does the dirty work.""" @@ -280,6 +229,12 @@ class ConfigOptionParser(CustomOptionParser): # Old, pre-Optik 1.5 behaviour. return optparse.Values(self.defaults) + # Load the configuration, or error out in case of an error + try: + self.config.load() + except ConfigurationError as err: + self.exit(UNKNOWN_ERROR, str(err)) + defaults = self._update_defaults(self.defaults.copy()) # ours for option in self._get_all_options(): default = defaults.get(option.dest) @@ -290,4 +245,17 @@ class ConfigOptionParser(CustomOptionParser): def error(self, msg): self.print_usage(sys.stderr) - self.exit(2, "%s\n" % msg) + self.exit(UNKNOWN_ERROR, "%s\n" % msg) + + +def invalid_config_error_message(action, key, val): + """Returns a better error message when invalid configuration option + is provided.""" + if action in ('store_true', 'store_false'): + return ("{0} is not a valid value for {1} option, " + "please specify a boolean value like yes/no, " + "true/false or 1/0 instead.").format(val, key) + + return ("{0} is not a valid value for {1} option, " + "please specify a numerical value like 1/0 " + "instead.").format(val, key) diff --git a/lib/python3.4/site-packages/pip/status_codes.py b/lib/python3.7/site-packages/pip/_internal/cli/status_codes.py similarity index 100% rename from lib/python3.4/site-packages/pip/status_codes.py rename to lib/python3.7/site-packages/pip/_internal/cli/status_codes.py diff --git a/lib/python3.4/site-packages/pip/commands/__init__.py b/lib/python3.7/site-packages/pip/_internal/commands/__init__.py similarity index 53% rename from lib/python3.4/site-packages/pip/commands/__init__.py rename to lib/python3.7/site-packages/pip/_internal/commands/__init__.py index 62c64eb..c7d1da3 100644 --- 
a/lib/python3.4/site-packages/pip/commands/__init__.py +++ b/lib/python3.7/site-packages/pip/_internal/commands/__init__.py @@ -3,35 +3,25 @@ Package containing all pip commands """ from __future__ import absolute_import -from pip.commands.completion import CompletionCommand -from pip.commands.download import DownloadCommand -from pip.commands.freeze import FreezeCommand -from pip.commands.hash import HashCommand -from pip.commands.help import HelpCommand -from pip.commands.list import ListCommand -from pip.commands.check import CheckCommand -from pip.commands.search import SearchCommand -from pip.commands.show import ShowCommand -from pip.commands.install import InstallCommand -from pip.commands.uninstall import UninstallCommand -from pip.commands.wheel import WheelCommand +from pip._internal.commands.completion import CompletionCommand +from pip._internal.commands.configuration import ConfigurationCommand +from pip._internal.commands.download import DownloadCommand +from pip._internal.commands.freeze import FreezeCommand +from pip._internal.commands.hash import HashCommand +from pip._internal.commands.help import HelpCommand +from pip._internal.commands.list import ListCommand +from pip._internal.commands.check import CheckCommand +from pip._internal.commands.search import SearchCommand +from pip._internal.commands.show import ShowCommand +from pip._internal.commands.install import InstallCommand +from pip._internal.commands.uninstall import UninstallCommand +from pip._internal.commands.wheel import WheelCommand +from pip._internal.utils.typing import MYPY_CHECK_RUNNING -commands_dict = { - CompletionCommand.name: CompletionCommand, - FreezeCommand.name: FreezeCommand, - HashCommand.name: HashCommand, - HelpCommand.name: HelpCommand, - SearchCommand.name: SearchCommand, - ShowCommand.name: ShowCommand, - InstallCommand.name: InstallCommand, - UninstallCommand.name: UninstallCommand, - DownloadCommand.name: DownloadCommand, - ListCommand.name: ListCommand, - 
CheckCommand.name: CheckCommand, - WheelCommand.name: WheelCommand, -} - +if MYPY_CHECK_RUNNING: + from typing import List, Type # noqa: F401 + from pip._internal.cli.base_command import Command # noqa: F401 commands_order = [ InstallCommand, @@ -41,12 +31,15 @@ commands_order = [ ListCommand, ShowCommand, CheckCommand, + ConfigurationCommand, SearchCommand, WheelCommand, HashCommand, CompletionCommand, HelpCommand, -] +] # type: List[Type[Command]] + +commands_dict = {c.name: c for c in commands_order} def get_summaries(ordered=True): diff --git a/lib/python3.7/site-packages/pip/_internal/commands/check.py b/lib/python3.7/site-packages/pip/_internal/commands/check.py new file mode 100644 index 0000000..1be3ec2 --- /dev/null +++ b/lib/python3.7/site-packages/pip/_internal/commands/check.py @@ -0,0 +1,41 @@ +import logging + +from pip._internal.cli.base_command import Command +from pip._internal.operations.check import ( + check_package_set, create_package_set_from_installed, +) + +logger = logging.getLogger(__name__) + + +class CheckCommand(Command): + """Verify installed packages have compatible dependencies.""" + name = 'check' + usage = """ + %prog [options]""" + summary = 'Verify installed packages have compatible dependencies.' 
+ + def run(self, options, args): + package_set = create_package_set_from_installed() + missing, conflicting = check_package_set(package_set) + + for project_name in missing: + version = package_set[project_name].version + for dependency in missing[project_name]: + logger.info( + "%s %s requires %s, which is not installed.", + project_name, version, dependency[0], + ) + + for project_name in conflicting: + version = package_set[project_name].version + for dep_name, dep_version, req in conflicting[project_name]: + logger.info( + "%s %s has requirement %s, but you have %s %s.", + project_name, version, req, dep_name, dep_version, + ) + + if missing or conflicting: + return 1 + else: + logger.info("No broken requirements found.") diff --git a/lib/python3.4/site-packages/pip/commands/completion.py b/lib/python3.7/site-packages/pip/_internal/commands/completion.py similarity index 54% rename from lib/python3.4/site-packages/pip/commands/completion.py rename to lib/python3.7/site-packages/pip/_internal/commands/completion.py index 66e41a6..2fcdd39 100644 --- a/lib/python3.4/site-packages/pip/commands/completion.py +++ b/lib/python3.7/site-packages/pip/_internal/commands/completion.py @@ -1,7 +1,10 @@ from __future__ import absolute_import import sys -from pip.basecommand import Command +import textwrap + +from pip._internal.cli.base_command import Command +from pip._internal.utils.misc import get_prog BASE_COMPLETION = """ # pip %(shell)s completion start%(script)s# pip %(shell)s completion end @@ -9,38 +12,44 @@ BASE_COMPLETION = """ COMPLETION_SCRIPTS = { 'bash': """ -_pip_completion() -{ - COMPREPLY=( $( COMP_WORDS="${COMP_WORDS[*]}" \\ - COMP_CWORD=$COMP_CWORD \\ - PIP_AUTO_COMPLETE=1 $1 ) ) + _pip_completion() + { + COMPREPLY=( $( COMP_WORDS="${COMP_WORDS[*]}" \\ + COMP_CWORD=$COMP_CWORD \\ + PIP_AUTO_COMPLETE=1 $1 ) ) + } + complete -o default -F _pip_completion %(prog)s + """, + 'zsh': """ + function _pip_completion { + local words cword + read -Ac words + read 
-cn cword + reply=( $( COMP_WORDS="$words[*]" \\ + COMP_CWORD=$(( cword-1 )) \\ + PIP_AUTO_COMPLETE=1 $words[1] ) ) + } + compctl -K _pip_completion %(prog)s + """, + 'fish': """ + function __fish_complete_pip + set -lx COMP_WORDS (commandline -o) "" + set -lx COMP_CWORD ( \\ + math (contains -i -- (commandline -t) $COMP_WORDS)-1 \\ + ) + set -lx PIP_AUTO_COMPLETE 1 + string split \\ -- (eval $COMP_WORDS[1]) + end + complete -fa "(__fish_complete_pip)" -c %(prog)s + """, } -complete -o default -F _pip_completion pip -""", 'zsh': """ -function _pip_completion { - local words cword - read -Ac words - read -cn cword - reply=( $( COMP_WORDS="$words[*]" \\ - COMP_CWORD=$(( cword-1 )) \\ - PIP_AUTO_COMPLETE=1 $words[1] ) ) -} -compctl -K _pip_completion pip -""", 'fish': """ -function __fish_complete_pip - set -lx COMP_WORDS (commandline -o) "" - set -lx COMP_CWORD (math (contains -i -- (commandline -t) $COMP_WORDS)-1) - set -lx PIP_AUTO_COMPLETE 1 - string split \ -- (eval $COMP_WORDS[1]) -end -complete -fa "(__fish_complete_pip)" -c pip -"""} class CompletionCommand(Command): """A helper command to be used for command completion.""" name = 'completion' summary = 'A helper command used for command completion.' 
+ ignore_require_venv = True def __init__(self, *args, **kw): super(CompletionCommand, self).__init__(*args, **kw) @@ -73,7 +82,11 @@ class CompletionCommand(Command): shells = COMPLETION_SCRIPTS.keys() shell_options = ['--' + shell for shell in sorted(shells)] if options.shell in shells: - script = COMPLETION_SCRIPTS.get(options.shell, '') + script = textwrap.dedent( + COMPLETION_SCRIPTS.get(options.shell, '') % { + 'prog': get_prog(), + } + ) print(BASE_COMPLETION % {'script': script, 'shell': options.shell}) else: sys.stderr.write( diff --git a/lib/python3.7/site-packages/pip/_internal/commands/configuration.py b/lib/python3.7/site-packages/pip/_internal/commands/configuration.py new file mode 100644 index 0000000..826c08d --- /dev/null +++ b/lib/python3.7/site-packages/pip/_internal/commands/configuration.py @@ -0,0 +1,227 @@ +import logging +import os +import subprocess + +from pip._internal.cli.base_command import Command +from pip._internal.cli.status_codes import ERROR, SUCCESS +from pip._internal.configuration import Configuration, kinds +from pip._internal.exceptions import PipError +from pip._internal.locations import venv_config_file +from pip._internal.utils.misc import get_prog + +logger = logging.getLogger(__name__) + + +class ConfigurationCommand(Command): + """Manage local and global configuration. + + Subcommands: + + list: List the active configuration (or from the file specified) + edit: Edit the configuration file in an editor + get: Get the value associated with name + set: Set the name=value + unset: Unset the value associated with name + + If none of --user, --global and --venv are passed, a virtual + environment configuration file is used if one is active and the file + exists. Otherwise, all modifications happen to the user file by + default. 
+ """ + + name = 'config' + usage = """ + %prog [<file-option>] list + %prog [<file-option>] [--editor <editor-path>] edit + + %prog [<file-option>] get name + %prog [<file-option>] set name value + %prog [<file-option>] unset name + """ + + summary = "Manage local and global configuration." + + def __init__(self, *args, **kwargs): + super(ConfigurationCommand, self).__init__(*args, **kwargs) + + self.configuration = None + + self.cmd_opts.add_option( + '--editor', + dest='editor', + action='store', + default=None, + help=( + 'Editor to use to edit the file. Uses VISUAL or EDITOR ' + 'environment variables if not provided.' + ) + ) + + self.cmd_opts.add_option( + '--global', + dest='global_file', + action='store_true', + default=False, + help='Use the system-wide configuration file only' + ) + + self.cmd_opts.add_option( + '--user', + dest='user_file', + action='store_true', + default=False, + help='Use the user configuration file only' + ) + + self.cmd_opts.add_option( + '--venv', + dest='venv_file', + action='store_true', + default=False, + help='Use the virtualenv configuration file only' + ) + + self.parser.insert_option_group(0, self.cmd_opts) + + def run(self, options, args): + handlers = { + "list": self.list_values, + "edit": self.open_in_editor, + "get": self.get_name, + "set": self.set_name_value, + "unset": self.unset_name + } + + # Determine action + if not args or args[0] not in handlers: + logger.error("Need an action ({}) to perform.".format( + ", ".join(sorted(handlers))) + ) + return ERROR + + action = args[0] + + # Determine which configuration files are to be loaded + # Depends on whether the command is modifying. + try: + load_only = self._determine_file( + options, need_value=(action in ["get", "set", "unset", "edit"]) + ) + except PipError as e: + logger.error(e.args[0]) + return ERROR + + # Load a new configuration + self.configuration = Configuration( + isolated=options.isolated_mode, load_only=load_only + ) + self.configuration.load() + + # Error handling happens here, not in the action-handlers. 
+ try: + handlers[action](options, args[1:]) + except PipError as e: + logger.error(e.args[0]) + return ERROR + + return SUCCESS + + def _determine_file(self, options, need_value): + file_options = { + kinds.USER: options.user_file, + kinds.GLOBAL: options.global_file, + kinds.VENV: options.venv_file + } + + if sum(file_options.values()) == 0: + if not need_value: + return None + # Default to user, unless there's a virtualenv file. + elif os.path.exists(venv_config_file): + return kinds.VENV + else: + return kinds.USER + elif sum(file_options.values()) == 1: + # There's probably a better expression for this. + return [key for key in file_options if file_options[key]][0] + + raise PipError( + "Need exactly one file to operate upon " + "(--user, --venv, --global) to perform." + ) + + def list_values(self, options, args): + self._get_n_args(args, "list", n=0) + + for key, value in sorted(self.configuration.items()): + logger.info("%s=%r", key, value) + + def get_name(self, options, args): + key = self._get_n_args(args, "get [name]", n=1) + value = self.configuration.get_value(key) + + logger.info("%s", value) + + def set_name_value(self, options, args): + key, value = self._get_n_args(args, "set [name] [value]", n=2) + self.configuration.set_value(key, value) + + self._save_configuration() + + def unset_name(self, options, args): + key = self._get_n_args(args, "unset [name]", n=1) + self.configuration.unset_value(key) + + self._save_configuration() + + def open_in_editor(self, options, args): + editor = self._determine_editor(options) + + fname = self.configuration.get_file_to_edit() + if fname is None: + raise PipError("Could not determine appropriate file.") + + try: + subprocess.check_call([editor, fname]) + except subprocess.CalledProcessError as e: + raise PipError( + "Editor Subprocess exited with exit code {}" + .format(e.returncode) + ) + + def _get_n_args(self, args, example, n): + """Helper to make sure the command got the right number of arguments + """ + 
if len(args) != n: + msg = ( + 'Got unexpected number of arguments, expected {}. ' + '(example: "{} config {}")' + ).format(n, get_prog(), example) + raise PipError(msg) + + if n == 1: + return args[0] + else: + return args + + def _save_configuration(self): + # We successfully ran a modifying command. Need to save the + # configuration. + try: + self.configuration.save() + except Exception: + logger.error( + "Unable to save configuration. Please report this as a bug.", + exc_info=1 + ) + raise PipError("Internal Error.") + + def _determine_editor(self, options): + if options.editor is not None: + return options.editor + elif "VISUAL" in os.environ: + return os.environ["VISUAL"] + elif "EDITOR" in os.environ: + return os.environ["EDITOR"] + else: + raise PipError("Could not determine editor to use.") diff --git a/lib/python3.4/site-packages/pip/commands/download.py b/lib/python3.7/site-packages/pip/_internal/commands/download.py similarity index 54% rename from lib/python3.4/site-packages/pip/commands/download.py rename to lib/python3.7/site-packages/pip/_internal/commands/download.py index 4bc0640..b3f3c6e 100644 --- a/lib/python3.4/site-packages/pip/commands/download.py +++ b/lib/python3.7/site-packages/pip/_internal/commands/download.py @@ -3,15 +3,15 @@ from __future__ import absolute_import import logging import os -from pip.exceptions import CommandError -from pip.index import FormatControl -from pip.req import RequirementSet -from pip.basecommand import RequirementCommand -from pip import cmdoptions -from pip.utils import ensure_dir, normalize_path -from pip.utils.build import BuildDirectory -from pip.utils.filesystem import check_path_owner - +from pip._internal.cli import cmdoptions +from pip._internal.cli.base_command import RequirementCommand +from pip._internal.operations.prepare import RequirementPreparer +from pip._internal.req import RequirementSet +from pip._internal.req.req_tracker import RequirementTracker +from pip._internal.resolve import 
Resolver +from pip._internal.utils.filesystem import check_path_owner +from pip._internal.utils.misc import ensure_dir, normalize_path +from pip._internal.utils.temp_dir import TempDirectory logger = logging.getLogger(__name__) @@ -33,8 +33,8 @@ class DownloadCommand(RequirementCommand): usage = """ %prog [options] <requirement specifier> [package-index-options] ... %prog [options] -r <requirements file> [package-index-options] ... - %prog [options] [-e] <vcs project url> ... - %prog [options] [-e] <local project path> ... + %prog [options] <vcs project url> ... + %prog [options] <local project path> ... %prog [options] <archive url/path> ...""" summary = 'Download packages.' @@ -45,17 +45,19 @@ class DownloadCommand(RequirementCommand): cmd_opts = self.cmd_opts cmd_opts.add_option(cmdoptions.constraints()) - cmd_opts.add_option(cmdoptions.editable()) cmd_opts.add_option(cmdoptions.requirements()) cmd_opts.add_option(cmdoptions.build_dir()) cmd_opts.add_option(cmdoptions.no_deps()) cmd_opts.add_option(cmdoptions.global_options()) cmd_opts.add_option(cmdoptions.no_binary()) cmd_opts.add_option(cmdoptions.only_binary()) + cmd_opts.add_option(cmdoptions.prefer_binary()) cmd_opts.add_option(cmdoptions.src()) cmd_opts.add_option(cmdoptions.pre()) cmd_opts.add_option(cmdoptions.no_clean()) cmd_opts.add_option(cmdoptions.require_hashes()) + cmd_opts.add_option(cmdoptions.progress_bar()) + cmd_opts.add_option(cmdoptions.no_build_isolation()) cmd_opts.add_option( '-d', '--dest', '--destination-dir', '--destination-directory', @@ -65,55 +67,13 @@ class DownloadCommand(RequirementCommand): help=("Download packages into <dir>."), ) - cmd_opts.add_option( - '--platform', - dest='platform', - metavar='platform', - default=None, - help=("Only download wheels compatible with <platform>. " - "Defaults to the platform of the running system."), - ) - - cmd_opts.add_option( - '--python-version', - dest='python_version', - metavar='python_version', - default=None, - help=("Only download wheels compatible with Python " - "interpreter version <version>. If not specified, then the " - "current system interpreter minor version is used. 
A major " - "version (e.g. '2') can be specified to match all " - "minor revs of that major version. A minor version " - "(e.g. '34') can also be specified."), - ) - - cmd_opts.add_option( - '--implementation', - dest='implementation', - metavar='implementation', - default=None, - help=("Only download wheels compatible with Python " - "implementation <implementation>, e.g. 'pp', 'jy', 'cp', " - " or 'ip'. If not specified, then the current " - "interpreter implementation is used. Use 'py' to force " - "implementation-agnostic wheels."), - ) - - cmd_opts.add_option( - '--abi', - dest='abi', - metavar='abi', - default=None, - help=("Only download wheels compatible with Python " - "abi <abi>, e.g. 'pypy_41'. If not specified, then the " - "current interpreter abi tag is used. Generally " - "you will need to specify --implementation, " - "--platform, and --python-version when using " - "this option."), - ) + cmd_opts.add_option(cmdoptions.platform()) + cmd_opts.add_option(cmdoptions.python_version()) + cmd_opts.add_option(cmdoptions.implementation()) + cmd_opts.add_option(cmdoptions.abi()) index_opts = cmdoptions.make_option_group( - cmdoptions.non_deprecated_index_group, + cmdoptions.index_group, self.parser, )
+ options.editables = [] if options.python_version: python_versions = [options.python_version] else: python_versions = None - dist_restriction_set = any([ - options.python_version, - options.platform, - options.abi, - options.implementation, - ]) - binary_only = FormatControl(set(), set([':all:'])) - if dist_restriction_set and options.format_control != binary_only: - raise CommandError( - "--only-binary=:all: must be set and --no-binary must not " - "be set (or must be set to :none:) when restricting platform " - "and interpreter constraints using --python-version, " - "--platform, --abi, or --implementation." - ) + cmdoptions.check_dist_restriction(options) options.src_dir = os.path.abspath(options.src_dir) options.download_dir = normalize_path(options.download_dir) @@ -169,18 +119,12 @@ class DownloadCommand(RequirementCommand): ) options.cache_dir = None - with BuildDirectory(options.build_dir, - delete=build_delete) as build_dir: + with RequirementTracker() as req_tracker, TempDirectory( + options.build_dir, delete=build_delete, kind="download" + ) as directory: requirement_set = RequirementSet( - build_dir=build_dir, - src_dir=options.src_dir, - download_dir=options.download_dir, - ignore_installed=True, - ignore_dependencies=options.ignore_dependencies, - session=session, - isolated=options.isolated_mode, - require_hashes=options.require_hashes + require_hashes=options.require_hashes, ) self.populate_requirement_set( requirement_set, @@ -192,18 +136,36 @@ class DownloadCommand(RequirementCommand): None ) - if not requirement_set.has_requirements: - return + preparer = RequirementPreparer( + build_dir=directory.path, + src_dir=options.src_dir, + download_dir=options.download_dir, + wheel_download_dir=None, + progress_bar=options.progress_bar, + build_isolation=options.build_isolation, + req_tracker=req_tracker, + ) - requirement_set.prepare_files(finder) + resolver = Resolver( + preparer=preparer, + finder=finder, + session=session, + wheel_cache=None, + 
use_user_site=False, + upgrade_strategy="to-satisfy-only", + force_reinstall=False, + ignore_dependencies=options.ignore_dependencies, + ignore_requires_python=False, + ignore_installed=True, + isolated=options.isolated_mode, + ) + resolver.resolve(requirement_set) downloaded = ' '.join([ req.name for req in requirement_set.successfully_downloaded ]) if downloaded: - logger.info( - 'Successfully downloaded %s', downloaded - ) + logger.info('Successfully downloaded %s', downloaded) # Clean up if not options.no_clean: diff --git a/lib/python3.4/site-packages/pip/commands/freeze.py b/lib/python3.7/site-packages/pip/_internal/commands/freeze.py similarity index 75% rename from lib/python3.4/site-packages/pip/commands/freeze.py rename to lib/python3.7/site-packages/pip/_internal/commands/freeze.py index c198796..dc9c53a 100644 --- a/lib/python3.4/site-packages/pip/commands/freeze.py +++ b/lib/python3.7/site-packages/pip/_internal/commands/freeze.py @@ -2,14 +2,13 @@ from __future__ import absolute_import import sys -import pip -from pip.compat import stdlib_pkgs -from pip.basecommand import Command -from pip.operations.freeze import freeze -from pip.wheel import WheelCache +from pip._internal.cache import WheelCache +from pip._internal.cli.base_command import Command +from pip._internal.models.format_control import FormatControl +from pip._internal.operations.freeze import freeze +from pip._internal.utils.compat import stdlib_pkgs - -DEV_PKGS = ('pip', 'setuptools', 'distribute', 'wheel') +DEV_PKGS = {'pip', 'setuptools', 'distribute', 'wheel'} class FreezeCommand(Command): @@ -63,11 +62,16 @@ class FreezeCommand(Command): action='store_true', help='Do not skip these packages in the output:' ' %s' % ', '.join(DEV_PKGS)) + self.cmd_opts.add_option( + '--exclude-editable', + dest='exclude_editable', + action='store_true', + help='Exclude editable packages from output.') self.parser.insert_option_group(0, self.cmd_opts) def run(self, options, args): - format_control =
pip.index.FormatControl(set(), set()) + format_control = FormatControl(set(), set()) wheel_cache = WheelCache(options.cache_dir, format_control) skip = set(stdlib_pkgs) if not options.freeze_all: @@ -81,7 +85,12 @@ class FreezeCommand(Command): skip_regex=options.skip_requirements_regex, isolated=options.isolated_mode, wheel_cache=wheel_cache, - skip=skip) + skip=skip, + exclude_editable=options.exclude_editable, + ) - for line in freeze(**freeze_kwargs): - sys.stdout.write(line + '\n') + try: + for line in freeze(**freeze_kwargs): + sys.stdout.write(line + '\n') + finally: + wheel_cache.cleanup() diff --git a/lib/python3.4/site-packages/pip/commands/hash.py b/lib/python3.7/site-packages/pip/_internal/commands/hash.py similarity index 85% rename from lib/python3.4/site-packages/pip/commands/hash.py rename to lib/python3.7/site-packages/pip/_internal/commands/hash.py index 27cca0b..423440e 100644 --- a/lib/python3.4/site-packages/pip/commands/hash.py +++ b/lib/python3.7/site-packages/pip/_internal/commands/hash.py @@ -4,11 +4,10 @@ import hashlib import logging import sys -from pip.basecommand import Command -from pip.status_codes import ERROR -from pip.utils import read_chunks -from pip.utils.hashes import FAVORITE_HASH, STRONG_HASHES - +from pip._internal.cli.base_command import Command +from pip._internal.cli.status_codes import ERROR +from pip._internal.utils.hashes import FAVORITE_HASH, STRONG_HASHES +from pip._internal.utils.misc import read_chunks logger = logging.getLogger(__name__) @@ -24,6 +23,7 @@ class HashCommand(Command): name = 'hash' usage = '%prog [options] <file> ...' summary = 'Compute hashes of package archives.'
+ ignore_require_venv = True def __init__(self, *args, **kw): super(HashCommand, self).__init__(*args, **kw) diff --git a/lib/python3.4/site-packages/pip/commands/help.py b/lib/python3.7/site-packages/pip/_internal/commands/help.py similarity index 75% rename from lib/python3.4/site-packages/pip/commands/help.py rename to lib/python3.7/site-packages/pip/_internal/commands/help.py index 11722f1..49a81cb 100644 --- a/lib/python3.4/site-packages/pip/commands/help.py +++ b/lib/python3.7/site-packages/pip/_internal/commands/help.py @@ -1,7 +1,8 @@ from __future__ import absolute_import -from pip.basecommand import Command, SUCCESS -from pip.exceptions import CommandError +from pip._internal.cli.base_command import Command +from pip._internal.cli.status_codes import SUCCESS +from pip._internal.exceptions import CommandError class HelpCommand(Command): @@ -10,9 +11,10 @@ class HelpCommand(Command): usage = """ %prog <command>""" summary = 'Show help for commands.' + ignore_require_venv = True def run(self, options, args): - from pip.commands import commands_dict, get_similar_commands + from pip._internal.commands import commands_dict, get_similar_commands try: # 'pip help' with no args is handled by pip.__init__.parseopt() diff --git a/lib/python3.7/site-packages/pip/_internal/commands/install.py b/lib/python3.7/site-packages/pip/_internal/commands/install.py new file mode 100644 index 0000000..c9ed3b4 --- /dev/null +++ b/lib/python3.7/site-packages/pip/_internal/commands/install.py @@ -0,0 +1,555 @@ +from __future__ import absolute_import + +import errno +import logging +import operator +import os +import shutil +from optparse import SUPPRESS_HELP + +from pip._vendor import pkg_resources + +from pip._internal.cache import WheelCache +from pip._internal.cli import cmdoptions +from pip._internal.cli.base_command import RequirementCommand +from pip._internal.cli.status_codes import ERROR +from pip._internal.exceptions import ( + CommandError, InstallationError,
PreviousBuildDirError, +) +from pip._internal.locations import distutils_scheme, virtualenv_no_global +from pip._internal.operations.check import check_install_conflicts +from pip._internal.operations.prepare import RequirementPreparer +from pip._internal.req import RequirementSet, install_given_reqs +from pip._internal.req.req_tracker import RequirementTracker +from pip._internal.resolve import Resolver +from pip._internal.utils.filesystem import check_path_owner +from pip._internal.utils.misc import ( + ensure_dir, get_installed_version, + protect_pip_from_modification_on_windows, +) +from pip._internal.utils.temp_dir import TempDirectory +from pip._internal.wheel import WheelBuilder + +try: + import wheel +except ImportError: + wheel = None + +from pip._internal.locations import running_under_virtualenv + +logger = logging.getLogger(__name__) + + +class InstallCommand(RequirementCommand): + """ + Install packages from: + + - PyPI (and other indexes) using requirement specifiers. + - VCS project urls. + - Local project directories. + - Local or remote source archives. + + pip also supports installing from "requirements files", which provide + an easy way to specify a whole environment to be installed. + """ + name = 'install' + + usage = """ + %prog [options] <requirement specifier> [package-index-options] ... + %prog [options] -r <requirements file> [package-index-options] ... + %prog [options] [-e] <vcs project url> ... + %prog [options] [-e] <local project path> ... + %prog [options] <archive url/path> ...""" + + summary = 'Install packages.' + + def __init__(self, *args, **kw): + super(InstallCommand, self).__init__(*args, **kw) + + cmd_opts = self.cmd_opts + + cmd_opts.add_option(cmdoptions.requirements()) + cmd_opts.add_option(cmdoptions.constraints()) + cmd_opts.add_option(cmdoptions.no_deps()) + cmd_opts.add_option(cmdoptions.pre()) + + cmd_opts.add_option(cmdoptions.editable()) + cmd_opts.add_option( + '-t', '--target', + dest='target_dir', + metavar='dir', + default=None, + help='Install packages into <dir>.
' + 'By default this will not replace existing files/folders in ' + '<dir>. Use --upgrade to replace existing packages in <dir> ' + 'with new versions.' + ) + cmd_opts.add_option(cmdoptions.platform()) + cmd_opts.add_option(cmdoptions.python_version()) + cmd_opts.add_option(cmdoptions.implementation()) + cmd_opts.add_option(cmdoptions.abi()) + + cmd_opts.add_option( + '--user', + dest='use_user_site', + action='store_true', + help="Install to the Python user install directory for your " + "platform. Typically ~/.local/, or %APPDATA%\\Python on " + "Windows. (See the Python documentation for site.USER_BASE " + "for full details.) On Debian systems, this is the " + "default when running outside of a virtual environment " + "and not as root.") + + cmd_opts.add_option( + '--no-user', + dest='use_system_location', + action='store_true', + help=SUPPRESS_HELP) + cmd_opts.add_option( + '--root', + dest='root_path', + metavar='dir', + default=None, + help="Install everything relative to this alternate root " + "directory.") + cmd_opts.add_option( + '--prefix', + dest='prefix_path', + metavar='dir', + default=None, + help="Installation prefix where lib, bin and other top-level " + "folders are placed") + + cmd_opts.add_option( + '--system', + dest='use_system_location', + action='store_true', + help="Install using the system scheme (overrides --user on " + "Debian systems)") + + cmd_opts.add_option(cmdoptions.build_dir()) + + cmd_opts.add_option(cmdoptions.src()) + + cmd_opts.add_option( + '-U', '--upgrade', + dest='upgrade', + action='store_true', + help='Upgrade all specified packages to the newest available ' + 'version. The handling of dependencies depends on the ' + 'upgrade-strategy used.' + ) + + cmd_opts.add_option( + '--upgrade-strategy', + dest='upgrade_strategy', + default='only-if-needed', + choices=['only-if-needed', 'eager'], + help='Determines how dependency upgrading should be handled ' + '[default: %default].
' + '"eager" - dependencies are upgraded regardless of ' + 'whether the currently installed version satisfies the ' + 'requirements of the upgraded package(s). ' + '"only-if-needed" - dependencies are upgraded only when they do not ' + 'satisfy the requirements of the upgraded package(s).' + ) + + cmd_opts.add_option( + '--force-reinstall', + dest='force_reinstall', + action='store_true', + help='Reinstall all packages even if they are already ' + 'up-to-date.') + + cmd_opts.add_option( + '-I', '--ignore-installed', + dest='ignore_installed', + action='store_true', + help='Ignore the installed packages (reinstalling instead).') + + cmd_opts.add_option(cmdoptions.ignore_requires_python()) + cmd_opts.add_option(cmdoptions.no_build_isolation()) + + cmd_opts.add_option(cmdoptions.install_options()) + cmd_opts.add_option(cmdoptions.global_options()) + + cmd_opts.add_option( + "--compile", + action="store_true", + dest="compile", + default=True, + help="Compile Python source files to bytecode", + ) + + cmd_opts.add_option( + "--no-compile", + action="store_false", + dest="compile", + help="Do not compile Python source files to bytecode", + ) + + cmd_opts.add_option( + "--no-warn-script-location", + action="store_false", + dest="warn_script_location", + default=True, + help="Do not warn when installing scripts outside PATH", + ) + cmd_opts.add_option( + "--no-warn-conflicts", + action="store_false", + dest="warn_about_conflicts", + default=True, + help="Do not warn about broken dependencies", + ) + + cmd_opts.add_option(cmdoptions.no_binary()) + cmd_opts.add_option(cmdoptions.only_binary()) + cmd_opts.add_option(cmdoptions.prefer_binary()) + cmd_opts.add_option(cmdoptions.no_clean()) + cmd_opts.add_option(cmdoptions.require_hashes()) + cmd_opts.add_option(cmdoptions.progress_bar()) + + index_opts = cmdoptions.make_option_group( + cmdoptions.index_group, + self.parser, + ) + + self.parser.insert_option_group(0, index_opts) + self.parser.insert_option_group(0, cmd_opts) + + def
run(self, options, args): + cmdoptions.check_install_build_global(options) + upgrade_strategy = "to-satisfy-only" + if options.upgrade: + upgrade_strategy = options.upgrade_strategy + + if options.build_dir: + options.build_dir = os.path.abspath(options.build_dir) + + cmdoptions.check_dist_restriction(options, check_target=True) + + if options.python_version: + python_versions = [options.python_version] + else: + python_versions = None + + # compute install location defaults + if (not options.use_user_site and not options.prefix_path and not + options.target_dir and not options.use_system_location): + if not running_under_virtualenv() and os.geteuid() != 0: + options.use_user_site = True + + if options.use_system_location: + options.use_user_site = False + + options.src_dir = os.path.abspath(options.src_dir) + install_options = options.install_options or [] + if options.use_user_site: + if options.prefix_path: + raise CommandError( + "Can not combine '--user' and '--prefix' as they imply " + "different installation locations" + ) + if virtualenv_no_global(): + raise InstallationError( + "Can not perform a '--user' install. User site-packages " + "are not visible in this virtualenv." + ) + install_options.append('--user') + install_options.append('--prefix=') + + target_temp_dir = TempDirectory(kind="target") + if options.target_dir: + options.ignore_installed = True + options.target_dir = os.path.abspath(options.target_dir) + if (os.path.exists(options.target_dir) and not + os.path.isdir(options.target_dir)): + raise CommandError( + "Target path exists but is not a directory, will not " + "continue." 
+ ) + + # Create a target directory for using with the target option + target_temp_dir.create() + install_options.append('--home=' + target_temp_dir.path) + + global_options = options.global_options or [] + + with self._build_session(options) as session: + finder = self._build_package_finder( + options=options, + session=session, + platform=options.platform, + python_versions=python_versions, + abi=options.abi, + implementation=options.implementation, + ) + build_delete = (not (options.no_clean or options.build_dir)) + wheel_cache = WheelCache(options.cache_dir, options.format_control) + + if options.cache_dir and not check_path_owner(options.cache_dir): + logger.warning( + "The directory '%s' or its parent directory is not owned " + "by the current user and caching wheels has been " + "disabled. Check the permissions and owner of that " + "directory. If executing pip with sudo, you may want " + "sudo's -H flag.", + options.cache_dir, + ) + options.cache_dir = None + + with RequirementTracker() as req_tracker, TempDirectory( + options.build_dir, delete=build_delete, kind="install" + ) as directory: + requirement_set = RequirementSet( + require_hashes=options.require_hashes, + check_supported_wheels=not options.target_dir, + ) + + try: + self.populate_requirement_set( + requirement_set, args, options, finder, session, + self.name, wheel_cache + ) + preparer = RequirementPreparer( + build_dir=directory.path, + src_dir=options.src_dir, + download_dir=None, + wheel_download_dir=None, + progress_bar=options.progress_bar, + build_isolation=options.build_isolation, + req_tracker=req_tracker, + ) + + resolver = Resolver( + preparer=preparer, + finder=finder, + session=session, + wheel_cache=wheel_cache, + use_user_site=options.use_user_site, + upgrade_strategy=upgrade_strategy, + force_reinstall=options.force_reinstall, + ignore_dependencies=options.ignore_dependencies, + ignore_requires_python=options.ignore_requires_python, + ignore_installed=options.ignore_installed, +
isolated=options.isolated_mode, + ) + resolver.resolve(requirement_set) + + protect_pip_from_modification_on_windows( + modifying_pip=requirement_set.has_requirement("pip") + ) + + # If caching is disabled or wheel is not installed don't + # try to build wheels. + if wheel and options.cache_dir: + # build wheels before install. + wb = WheelBuilder( + finder, preparer, wheel_cache, + build_options=[], global_options=[], + ) + # Ignore the result: a failed wheel will be + # installed from the sdist/vcs whatever. + wb.build( + requirement_set.requirements.values(), + session=session, autobuilding=True + ) + + to_install = resolver.get_installation_order( + requirement_set + ) + + # Consistency Checking of the package set we're installing. + should_warn_about_conflicts = ( + not options.ignore_dependencies and + options.warn_about_conflicts + ) + if should_warn_about_conflicts: + self._warn_about_conflicts(to_install) + + # Don't warn about script install locations if + # --target has been specified + warn_script_location = options.warn_script_location + if options.target_dir: + warn_script_location = False + + installed = install_given_reqs( + to_install, + install_options, + global_options, + root=options.root_path, + home=target_temp_dir.path, + prefix=options.prefix_path, + pycompile=options.compile, + warn_script_location=warn_script_location, + use_user_site=options.use_user_site, + ) + + lib_locations = get_lib_location_guesses( + user=options.use_user_site, + home=target_temp_dir.path, + root=options.root_path, + prefix=options.prefix_path, + isolated=options.isolated_mode, + ) + working_set = pkg_resources.WorkingSet(lib_locations) + + reqs = sorted(installed, key=operator.attrgetter('name')) + items = [] + for req in reqs: + item = req.name + try: + installed_version = get_installed_version( + req.name, working_set=working_set + ) + if installed_version: + item += '-' + installed_version + except Exception: + pass + items.append(item) + installed = ' 
'.join(items) + if installed: + logger.info('Successfully installed %s', installed) + except EnvironmentError as error: + show_traceback = (self.verbosity >= 1) + + message = create_env_error_message( + error, show_traceback, options.use_user_site, + ) + logger.error(message, exc_info=show_traceback) + + return ERROR + except PreviousBuildDirError: + options.no_clean = True + raise + finally: + # Clean up + if not options.no_clean: + requirement_set.cleanup_files() + wheel_cache.cleanup() + + if options.target_dir: + self._handle_target_dir( + options.target_dir, target_temp_dir, options.upgrade + ) + return requirement_set + + def _handle_target_dir(self, target_dir, target_temp_dir, upgrade): + ensure_dir(target_dir) + + # Checking both purelib and platlib directories for installed + # packages to be moved to target directory + lib_dir_list = [] + + with target_temp_dir: + # Checking both purelib and platlib directories for installed + # packages to be moved to target directory + scheme = distutils_scheme('', home=target_temp_dir.path) + purelib_dir = scheme['purelib'] + platlib_dir = scheme['platlib'] + data_dir = scheme['data'] + + if os.path.exists(purelib_dir): + lib_dir_list.append(purelib_dir) + if os.path.exists(platlib_dir) and platlib_dir != purelib_dir: + lib_dir_list.append(platlib_dir) + if os.path.exists(data_dir): + lib_dir_list.append(data_dir) + + for lib_dir in lib_dir_list: + for item in os.listdir(lib_dir): + if lib_dir == data_dir: + ddir = os.path.join(data_dir, item) + if any(s.startswith(ddir) for s in lib_dir_list[:-1]): + continue + target_item_dir = os.path.join(target_dir, item) + if os.path.exists(target_item_dir): + if not upgrade: + logger.warning( + 'Target directory %s already exists. Specify ' + '--upgrade to force replacement.', + target_item_dir + ) + continue + if os.path.islink(target_item_dir): + logger.warning( + 'Target directory %s already exists and is ' + 'a link. 
Pip will not automatically replace ' + 'links, please remove if replacement is ' + 'desired.', + target_item_dir + ) + continue + if os.path.isdir(target_item_dir): + shutil.rmtree(target_item_dir) + else: + os.remove(target_item_dir) + + shutil.move( + os.path.join(lib_dir, item), + target_item_dir + ) + + def _warn_about_conflicts(self, to_install): + package_set, _dep_info = check_install_conflicts(to_install) + missing, conflicting = _dep_info + + # NOTE: There is some duplication here from pip check + for project_name in missing: + version = package_set[project_name][0] + for dependency in missing[project_name]: + logger.critical( + "%s %s requires %s, which is not installed.", + project_name, version, dependency[1], + ) + + for project_name in conflicting: + version = package_set[project_name][0] + for dep_name, dep_version, req in conflicting[project_name]: + logger.critical( + "%s %s has requirement %s, but you'll have %s %s which is " + "incompatible.", + project_name, version, req, dep_name, dep_version, + ) + + +def get_lib_location_guesses(*args, **kwargs): + scheme = distutils_scheme('', *args, **kwargs) + return [scheme['purelib'], scheme['platlib']] + + +def create_env_error_message(error, show_traceback, using_user_site): + """Format an error message for an EnvironmentError + + It may occur anytime during the execution of the install command. 
+ """ + parts = [] + + # Mention the error if we are not going to show a traceback + parts.append("Could not install packages due to an EnvironmentError") + if not show_traceback: + parts.append(": ") + parts.append(str(error)) + else: + parts.append(".") + + # Split the error indication from a helper message (if any) + parts[-1] += "\n" + + # Suggest useful actions to the user: + # (1) using user site-packages or (2) verifying the permissions + if error.errno == errno.EACCES: + user_option_part = "Consider using the `--user` option" + permissions_part = "Check the permissions" + + if not using_user_site: + parts.extend([ + user_option_part, " or ", + permissions_part.lower(), + ]) + else: + parts.append(permissions_part) + parts.append(".\n") + + return "".join(parts).strip() + "\n" diff --git a/lib/python3.4/site-packages/pip/commands/list.py b/lib/python3.7/site-packages/pip/_internal/commands/list.py similarity index 73% rename from lib/python3.4/site-packages/pip/commands/list.py rename to lib/python3.7/site-packages/pip/_internal/commands/list.py index 6f6995d..c6eeca7 100644 --- a/lib/python3.4/site-packages/pip/commands/list.py +++ b/lib/python3.7/site-packages/pip/_internal/commands/list.py @@ -2,21 +2,18 @@ from __future__ import absolute_import import json import logging -import warnings -try: - from itertools import zip_longest -except ImportError: - from itertools import izip_longest as zip_longest from pip._vendor import six +from pip._vendor.six.moves import zip_longest -from pip.basecommand import Command -from pip.exceptions import CommandError -from pip.index import PackageFinder -from pip.utils import ( - get_installed_distributions, dist_is_editable) -from pip.utils.deprecation import RemovedInPip10Warning -from pip.cmdoptions import make_option_group, index_group +from pip._internal.cli import cmdoptions +from pip._internal.cli.base_command import Command +from pip._internal.exceptions import CommandError +from pip._internal.index import
PackageFinder +from pip._internal.utils.misc import ( + dist_is_editable, get_installed_distributions, +) +from pip._internal.utils.packaging import get_installer logger = logging.getLogger(__name__) @@ -78,9 +75,10 @@ class ListCommand(Command): '--format', action='store', dest='list_format', - choices=('legacy', 'columns', 'freeze', 'json'), - help="Select the output format among: legacy (default), columns, " - "freeze or json.", + default="columns", + choices=('columns', 'freeze', 'json'), + help="Select the output format among: columns (default), freeze, " + "or json", ) cmd_opts.add_option( @@ -91,7 +89,22 @@ class ListCommand(Command): "installed packages.", ) - index_opts = make_option_group(index_group, self.parser) + cmd_opts.add_option( + '--exclude-editable', + action='store_false', + dest='include_editable', + help='Exclude editable packages from output.', + ) + cmd_opts.add_option( + '--include-editable', + action='store_true', + dest='include_editable', + help='Include editable packages in output.', + default=True, + ) + index_opts = cmdoptions.make_option_group( + cmdoptions.index_group, self.parser + ) self.parser.insert_option_group(0, index_opts) self.parser.insert_option_group(0, cmd_opts) @@ -110,39 +123,6 @@ class ListCommand(Command): ) def run(self, options, args): - if options.allow_external: - warnings.warn( - "--allow-external has been deprecated and will be removed in " - "the future. Due to changes in the repository protocol, it no " - "longer has any effect.", - RemovedInPip10Warning, - ) - - if options.allow_all_external: - warnings.warn( - "--allow-all-external has been deprecated and will be removed " - "in the future. Due to changes in the repository protocol, it " - "no longer has any effect.", - RemovedInPip10Warning, - ) - - if options.allow_unverified: - warnings.warn( - "--allow-unverified has been deprecated and will be removed " - "in the future.
Due to changes in the repository protocol, it " - "no longer has any effect.", - RemovedInPip10Warning, - ) - - if options.list_format is None: - warnings.warn( - "The default format will switch to columns in the future. " - "You can use --format=(legacy|columns) (or define a " - "format=(legacy|columns) in your pip.conf under the [list] " - "section) to disable this warning.", - RemovedInPip10Warning, - ) - if options.outdated and options.uptodate: raise CommandError( "Options --outdated and --uptodate cannot be combined.") @@ -151,6 +131,7 @@ class ListCommand(Command): local_only=options.local, user_only=options.user, editables_only=options.editable, + include_editables=options.include_editable, ) if options.outdated: @@ -179,7 +160,7 @@ class ListCommand(Command): dep_keys = set() for dist in packages: dep_keys.update(requirement.key for requirement in dist.requires()) - return set(pkg for pkg in packages if pkg.key not in dep_keys) + return {pkg for pkg in packages if pkg.key not in dep_keys} def iter_packages_latest_infos(self, packages, options): index_urls = [options.index_url] + options.extra_index_urls @@ -220,23 +201,6 @@ class ListCommand(Command): dist.latest_filetype = typ yield dist - def output_legacy(self, dist): - if dist_is_editable(dist): - return '%s (%s, %s)' % ( - dist.project_name, - dist.version, - dist.location, - ) - else: - return '%s (%s)' % (dist.project_name, dist.version) - - def output_legacy_latest(self, dist): - return '%s - Latest: %s [%s]' % ( - self.output_legacy(dist), - dist.latest_version, - dist.latest_filetype, - ) - def output_package_listing(self, packages, options): packages = sorted( packages, @@ -247,15 +211,13 @@ class ListCommand(Command): self.output_package_listing_columns(data, header) elif options.list_format == 'freeze': for dist in packages: - logger.info("%s==%s", dist.project_name, dist.version) + if options.verbose >= 1: + logger.info("%s==%s (%s)", dist.project_name, + dist.version, dist.location) + else: 
+ logger.info("%s==%s", dist.project_name, dist.version) elif options.list_format == 'json': logger.info(format_for_json(packages, options)) - else: # legacy - for dist in packages: - if options.outdated: - logger.info(self.output_legacy_latest(dist)) - else: - logger.info(self.output_legacy(dist)) def output_package_listing_columns(self, data, header): # insert the header first: we need to know the size of column names @@ -303,8 +265,10 @@ def format_for_columns(pkgs, options): header = ["Package", "Version"] data = [] - if any(dist_is_editable(x) for x in pkgs): + if options.verbose >= 1 or any(dist_is_editable(x) for x in pkgs): header.append("Location") + if options.verbose >= 1: + header.append("Installer") for proj in pkgs: # if we're working on the 'outdated' list, separate out the @@ -315,8 +279,10 @@ def format_for_columns(pkgs, options): row.append(proj.latest_version) row.append(proj.latest_filetype) - if dist_is_editable(proj): + if options.verbose >= 1 or dist_is_editable(proj): row.append(proj.location) + if options.verbose >= 1: + row.append(get_installer(proj)) data.append(row) @@ -330,6 +296,9 @@ def format_for_json(packages, options): 'name': dist.project_name, 'version': six.text_type(dist.version), } + if options.verbose >= 1: + info['location'] = dist.location + info['installer'] = get_installer(dist) if options.outdated: info['latest_version'] = six.text_type(dist.latest_version) info['latest_filetype'] = dist.latest_filetype diff --git a/lib/python3.4/site-packages/pip/commands/search.py b/lib/python3.7/site-packages/pip/_internal/commands/search.py similarity index 82% rename from lib/python3.4/site-packages/pip/commands/search.py rename to lib/python3.7/site-packages/pip/_internal/commands/search.py index bd2ea8a..c157a31 100644 --- a/lib/python3.4/site-packages/pip/commands/search.py +++ b/lib/python3.7/site-packages/pip/_internal/commands/search.py @@ -3,19 +3,21 @@ from __future__ import absolute_import import logging import sys import 
textwrap +from collections import OrderedDict -from pip.basecommand import Command, SUCCESS -from pip.compat import OrderedDict -from pip.download import PipXmlrpcTransport -from pip.models import PyPI -from pip.utils import get_terminal_size -from pip.utils.logging import indent_log -from pip.exceptions import CommandError -from pip.status_codes import NO_MATCHES_FOUND -from pip._vendor.packaging.version import parse as parse_version from pip._vendor import pkg_resources -from pip._vendor.six.moves import xmlrpc_client +from pip._vendor.packaging.version import parse as parse_version +# NOTE: XMLRPC Client is not annotated in typeshed as of 2017-07-17, which is +# why we ignore the type on this import +from pip._vendor.six.moves import xmlrpc_client # type: ignore +from pip._internal.cli.base_command import Command +from pip._internal.cli.status_codes import NO_MATCHES_FOUND, SUCCESS +from pip._internal.download import PipXmlrpcTransport +from pip._internal.exceptions import CommandError +from pip._internal.models.index import PyPI +from pip._internal.utils.compat import get_terminal_size +from pip._internal.utils.logging import indent_log logger = logging.getLogger(__name__) @@ -26,6 +28,7 @@ class SearchCommand(Command): usage = """ %prog [options] <query>""" summary = 'Search PyPI for packages.'
+ ignore_require_venv = True def __init__(self, *args, **kw): super(SearchCommand, self).__init__(*args, **kw) @@ -96,7 +99,7 @@ def print_results(hits, name_column_width=None, terminal_width=None): return if name_column_width is None: name_column_width = max([ - len(hit['name']) + len(hit.get('versions', ['-'])[-1]) + len(hit['name']) + len(highest_version(hit.get('versions', ['-']))) for hit in hits ]) + 4 @@ -104,7 +107,7 @@ def print_results(hits, name_column_width=None, terminal_width=None): for hit in hits: name = hit['name'] summary = hit['summary'] or '' - version = hit.get('versions', ['-'])[-1] + latest = highest_version(hit.get('versions', ['-'])) if terminal_width is not None: target_width = terminal_width - name_column_width - 5 if target_width > 10: @@ -113,13 +116,12 @@ def print_results(hits, name_column_width=None, terminal_width=None): summary = ('\n' + ' ' * (name_column_width + 3)).join(summary) line = '%-*s - %s' % (name_column_width, - '%s (%s)' % (name, version), summary) + '%s (%s)' % (name, latest), summary) try: logger.info(line) if name in installed_packages: dist = pkg_resources.get_distribution(name) with indent_log(): - latest = highest_version(hit['versions']) if dist.version == latest: logger.info('INSTALLED: %s (latest)', dist.version) else: diff --git a/lib/python3.4/site-packages/pip/commands/show.py b/lib/python3.7/site-packages/pip/_internal/commands/show.py similarity index 89% rename from lib/python3.4/site-packages/pip/commands/show.py rename to lib/python3.7/site-packages/pip/_internal/commands/show.py index 111c16d..f92c9bc 100644 --- a/lib/python3.4/site-packages/pip/commands/show.py +++ b/lib/python3.7/site-packages/pip/_internal/commands/show.py @@ -1,24 +1,29 @@ from __future__ import absolute_import -from email.parser import FeedParser import logging import os +from email.parser import FeedParser # type: ignore -from pip.basecommand import Command -from pip.status_codes import SUCCESS, ERROR from pip._vendor import 
pkg_resources from pip._vendor.packaging.utils import canonicalize_name +from pip._internal.cli.base_command import Command +from pip._internal.cli.status_codes import ERROR, SUCCESS logger = logging.getLogger(__name__) class ShowCommand(Command): - """Show information about one or more installed packages.""" + """ + Show information about one or more installed packages. + + The output is in RFC-compliant mail header format. + """ name = 'show' usage = """ %prog [options] <package> ...""" summary = 'Show information about installed packages.' + ignore_require_venv = True def __init__(self, *args, **kw): super(ShowCommand, self).__init__(*args, **kw) @@ -126,7 +131,14 @@ def print_results(distributions, list_files=False, verbose=False): results_printed = True if i > 0: logger.info("---") - logger.info("Name: %s", dist.get('name', '')) + + name = dist.get('name', '') + required_by = [ + pkg.project_name for pkg in pkg_resources.working_set + if name in [required.name for required in pkg.requires()] + ] + + logger.info("Name: %s", name) logger.info("Version: %s", dist.get('version', '')) logger.info("Summary: %s", dist.get('summary', '')) logger.info("Home-page: %s", dist.get('home-page', '')) @@ -135,6 +147,8 @@ def print_results(distributions, list_files=False, verbose=False): logger.info("License: %s", dist.get('license', '')) logger.info("Location: %s", dist.get('location', '')) logger.info("Requires: %s", ', '.join(dist.get('requires', []))) + logger.info("Required-by: %s", ', '.join(required_by)) + if verbose: logger.info("Metadata-Version: %s", dist.get('metadata-version', '')) diff --git a/lib/python3.4/site-packages/pip/commands/uninstall.py b/lib/python3.7/site-packages/pip/_internal/commands/uninstall.py similarity index 60% rename from lib/python3.4/site-packages/pip/commands/uninstall.py rename to lib/python3.7/site-packages/pip/_internal/commands/uninstall.py index 8ba1a7c..0cd6f54 100644 --- a/lib/python3.4/site-packages/pip/commands/uninstall.py +++
b/lib/python3.7/site-packages/pip/_internal/commands/uninstall.py @@ -1,10 +1,12 @@ from __future__ import absolute_import -import pip -from pip.wheel import WheelCache -from pip.req import InstallRequirement, RequirementSet, parse_requirements -from pip.basecommand import Command -from pip.exceptions import InstallationError +from pip._vendor.packaging.utils import canonicalize_name + +from pip._internal.cli.base_command import Command +from pip._internal.exceptions import InstallationError +from pip._internal.req import parse_requirements +from pip._internal.req.constructors import install_req_from_line +from pip._internal.utils.misc import protect_pip_from_modification_on_windows class UninstallCommand(Command): @@ -44,33 +46,33 @@ class UninstallCommand(Command): def run(self, options, args): with self._build_session(options) as session: - format_control = pip.index.FormatControl(set(), set()) - wheel_cache = WheelCache(options.cache_dir, format_control) - requirement_set = RequirementSet( - build_dir=None, - src_dir=None, - download_dir=None, - isolated=options.isolated_mode, - session=session, - wheel_cache=wheel_cache, - ) + reqs_to_uninstall = {} for name in args: - requirement_set.add_requirement( - InstallRequirement.from_line( - name, isolated=options.isolated_mode, - wheel_cache=wheel_cache - ) + req = install_req_from_line( + name, isolated=options.isolated_mode, ) + if req.name: + reqs_to_uninstall[canonicalize_name(req.name)] = req for filename in options.requirements: for req in parse_requirements( filename, options=options, - session=session, - wheel_cache=wheel_cache): - requirement_set.add_requirement(req) - if not requirement_set.has_requirements: + session=session): + if req.name: + reqs_to_uninstall[canonicalize_name(req.name)] = req + if not reqs_to_uninstall: raise InstallationError( 'You must give at least one requirement to %(name)s (see ' '"pip help %(name)s")' % dict(name=self.name) ) - requirement_set.uninstall(auto_confirm=options.yes) 
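The replacement flow in this hunk keys requirements by canonicalized name so differently-spelled requests for the same package collapse into one uninstall. A minimal sketch of that dedup step, with `canonicalize_name` re-implemented per the PEP 503 rule for illustration (pip uses the vendored `packaging.utils` version):

```python
import re


def canonicalize_name(name):
    """PEP 503 name normalization: runs of -, _, . become -, lowercased."""
    return re.sub(r"[-_.]+", "-", name).lower()


# Mirrors reqs_to_uninstall in the diff: later spellings overwrite earlier
# ones under the same canonical key.
reqs_to_uninstall = {}
for spelling in ["Foo_Bar", "foo-bar", "requests"]:
    reqs_to_uninstall[canonicalize_name(spelling)] = spelling

print(sorted(reqs_to_uninstall))  # ['foo-bar', 'requests']
```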
+ + protect_pip_from_modification_on_windows( + modifying_pip="pip" in reqs_to_uninstall + ) + + for req in reqs_to_uninstall.values(): + uninstall_pathset = req.uninstall( + auto_confirm=options.yes, verbose=self.verbosity > 0, + ) + if uninstall_pathset: + uninstall_pathset.commit() diff --git a/lib/python3.4/site-packages/pip/commands/wheel.py b/lib/python3.7/site-packages/pip/_internal/commands/wheel.py similarity index 56% rename from lib/python3.4/site-packages/pip/commands/wheel.py rename to lib/python3.7/site-packages/pip/_internal/commands/wheel.py index 70e95eb..9c1f149 100644 --- a/lib/python3.4/site-packages/pip/commands/wheel.py +++ b/lib/python3.7/site-packages/pip/_internal/commands/wheel.py @@ -3,17 +3,17 @@ from __future__ import absolute_import import logging import os -import warnings - -from pip.basecommand import RequirementCommand -from pip.exceptions import CommandError, PreviousBuildDirError -from pip.req import RequirementSet -from pip.utils import import_or_raise -from pip.utils.build import BuildDirectory -from pip.utils.deprecation import RemovedInPip10Warning -from pip.wheel import WheelCache, WheelBuilder -from pip import cmdoptions +from pip._internal.cache import WheelCache +from pip._internal.cli import cmdoptions +from pip._internal.cli.base_command import RequirementCommand +from pip._internal.exceptions import CommandError, PreviousBuildDirError +from pip._internal.operations.prepare import RequirementPreparer +from pip._internal.req import RequirementSet +from pip._internal.req.req_tracker import RequirementTracker +from pip._internal.resolve import Resolver +from pip._internal.utils.temp_dir import TempDirectory +from pip._internal.wheel import WheelBuilder logger = logging.getLogger(__name__) @@ -56,16 +56,17 @@ class WheelCommand(RequirementCommand): help=("Build wheels into , where the default is the " "current working directory."), ) - cmd_opts.add_option(cmdoptions.use_wheel()) - 
cmd_opts.add_option(cmdoptions.no_use_wheel()) cmd_opts.add_option(cmdoptions.no_binary()) cmd_opts.add_option(cmdoptions.only_binary()) + cmd_opts.add_option(cmdoptions.prefer_binary()) cmd_opts.add_option( '--build-option', dest='build_options', metavar='options', action='append', - help="Extra arguments to be supplied to 'setup.py bdist_wheel'.") + help="Extra arguments to be supplied to 'setup.py bdist_wheel'.", + ) + cmd_opts.add_option(cmdoptions.no_build_isolation()) cmd_opts.add_option(cmdoptions.constraints()) cmd_opts.add_option(cmdoptions.editable()) cmd_opts.add_option(cmdoptions.requirements()) @@ -73,6 +74,7 @@ class WheelCommand(RequirementCommand): cmd_opts.add_option(cmdoptions.ignore_requires_python()) cmd_opts.add_option(cmdoptions.no_deps()) cmd_opts.add_option(cmdoptions.build_dir()) + cmd_opts.add_option(cmdoptions.progress_bar()) cmd_opts.add_option( '--global-option', @@ -101,55 +103,9 @@ class WheelCommand(RequirementCommand): self.parser.insert_option_group(0, index_opts) self.parser.insert_option_group(0, cmd_opts) - def check_required_packages(self): - import_or_raise( - 'wheel.bdist_wheel', - CommandError, - "'pip wheel' requires the 'wheel' package. To fix this, run: " - "pip install wheel" - ) - pkg_resources = import_or_raise( - 'pkg_resources', - CommandError, - "'pip wheel' requires setuptools >= 0.8 for dist-info support." - " To fix this, run: pip install --upgrade setuptools" - ) - if not hasattr(pkg_resources, 'DistInfoDistribution'): - raise CommandError( - "'pip wheel' requires setuptools >= 0.8 for dist-info " - "support. To fix this, run: pip install --upgrade " - "setuptools" - ) - def run(self, options, args): - self.check_required_packages() - cmdoptions.resolve_wheel_no_use_binary(options) cmdoptions.check_install_build_global(options) - if options.allow_external: - warnings.warn( - "--allow-external has been deprecated and will be removed in " - "the future. 
Due to changes in the repository protocol, it no " - "longer has any effect.", - RemovedInPip10Warning, - ) - - if options.allow_all_external: - warnings.warn( - "--allow-all-external has been deprecated and will be removed " - "in the future. Due to changes in the repository protocol, it " - "no longer has any effect.", - RemovedInPip10Warning, - ) - - if options.allow_unverified: - warnings.warn( - "--allow-unverified has been deprecated and will be removed " - "in the future. Due to changes in the repository protocol, it " - "no longer has any effect.", - RemovedInPip10Warning, - ) - index_urls = [options.index_url] + options.extra_index_urls if options.no_index: logger.debug('Ignoring indexes: %s', ','.join(index_urls)) @@ -164,39 +120,57 @@ class WheelCommand(RequirementCommand): finder = self._build_package_finder(options, session) build_delete = (not (options.no_clean or options.build_dir)) wheel_cache = WheelCache(options.cache_dir, options.format_control) - with BuildDirectory(options.build_dir, - delete=build_delete) as build_dir: + + with RequirementTracker() as req_tracker, TempDirectory( + options.build_dir, delete=build_delete, kind="wheel" + ) as directory: + requirement_set = RequirementSet( - build_dir=build_dir, - src_dir=options.src_dir, - download_dir=None, - ignore_dependencies=options.ignore_dependencies, - ignore_installed=True, - ignore_requires_python=options.ignore_requires_python, - isolated=options.isolated_mode, - session=session, - wheel_cache=wheel_cache, - wheel_download_dir=options.wheel_dir, - require_hashes=options.require_hashes + require_hashes=options.require_hashes, ) - self.populate_requirement_set( - requirement_set, args, options, finder, session, self.name, - wheel_cache - ) - - if not requirement_set.has_requirements: - return - try: + self.populate_requirement_set( + requirement_set, args, options, finder, session, + self.name, wheel_cache + ) + + preparer = RequirementPreparer( + build_dir=directory.path, + 
src_dir=options.src_dir, + download_dir=None, + wheel_download_dir=options.wheel_dir, + progress_bar=options.progress_bar, + build_isolation=options.build_isolation, + req_tracker=req_tracker, + ) + + resolver = Resolver( + preparer=preparer, + finder=finder, + session=session, + wheel_cache=wheel_cache, + use_user_site=False, + upgrade_strategy="to-satisfy-only", + force_reinstall=False, + ignore_dependencies=options.ignore_dependencies, + ignore_requires_python=options.ignore_requires_python, + ignore_installed=True, + isolated=options.isolated_mode, + ) + resolver.resolve(requirement_set) + # build wheels wb = WheelBuilder( - requirement_set, - finder, + finder, preparer, wheel_cache, build_options=options.build_options or [], global_options=options.global_options or [], + no_clean=options.no_clean, ) - if not wb.build(): + wheels_built_successfully = wb.build( + requirement_set.requirements.values(), session=session, + ) + if not wheels_built_successfully: raise CommandError( "Failed to build one or more wheels" ) @@ -206,3 +180,4 @@ class WheelCommand(RequirementCommand): finally: if not options.no_clean: requirement_set.cleanup_files() + wheel_cache.cleanup() diff --git a/lib/python3.7/site-packages/pip/_internal/configuration.py b/lib/python3.7/site-packages/pip/_internal/configuration.py new file mode 100644 index 0000000..fe6df9b --- /dev/null +++ b/lib/python3.7/site-packages/pip/_internal/configuration.py @@ -0,0 +1,387 @@ +"""Configuration management setup + +Some terminology: +- name + As written in config files. 
+- value + Value associated with a name +- key + Name combined with its section (section.name) +- variant + A single word describing where the configuration key-value pair came from +""" + +import locale +import logging +import os + +from pip._vendor import six +from pip._vendor.six.moves import configparser + +from pip._internal.exceptions import ( + ConfigurationError, ConfigurationFileCouldNotBeLoaded, +) +from pip._internal.locations import ( + legacy_config_file, new_config_file, running_under_virtualenv, + site_config_files, venv_config_file, +) +from pip._internal.utils.misc import ensure_dir, enum +from pip._internal.utils.typing import MYPY_CHECK_RUNNING + +if MYPY_CHECK_RUNNING: + from typing import ( # noqa: F401 + Any, Dict, Iterable, List, NewType, Optional, Tuple + ) + + RawConfigParser = configparser.RawConfigParser # Shorthand + Kind = NewType("Kind", str) + +logger = logging.getLogger(__name__) + + +# NOTE: Maybe use the optionx attribute to normalize keynames. +def _normalize_name(name): + # type: (str) -> str + """Make a name consistent regardless of source (environment or file) + """ + name = name.lower().replace('_', '-') + if name.startswith('--'): + name = name[2:] # only prefer long opts + return name + + +def _disassemble_key(name): + # type: (str) -> List[str] + return name.split(".", 1) + + +# The kinds of configurations there are. +kinds = enum( + USER="user", # User Specific + GLOBAL="global", # System Wide + VENV="venv", # Virtual Environment Specific + ENV="env", # from PIP_CONFIG_FILE + ENV_VAR="env-var", # from Environment Variables +) + + +class Configuration(object): + """Handles management of configuration. + + Provides an interface to accessing and managing configuration files. + + This class provides an API that takes "section.key-name" style + keys and stores the value associated with it as "key-name" under the + section "section". 
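The `_normalize_name`/`_disassemble_key` helpers added in this hunk are small enough to exercise standalone. Below is an illustrative re-implementation (not the vendored `pip._internal.configuration` module itself) showing how an environment-variable-style name and a long-option spelling normalize to the same key name:

```python
def normalize_name(name):
    """Make a name consistent regardless of source (env var or file)."""
    name = name.lower().replace('_', '-')
    if name.startswith('--'):
        name = name[2:]  # keep only the long-option spelling
    return name


def disassemble_key(key):
    """Split a "section.name" key into [section, name]."""
    return key.split(".", 1)


# "INDEX_URL" (env-var style) and "--index-url" (option style) converge:
print(normalize_name("INDEX_URL"))          # index-url
print(normalize_name("--index-url"))        # index-url
print(disassemble_key("global.index-url"))  # ['global', 'index-url']
```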
+ + This allows for a clean interface wherein both the section and the + key-name are preserved in an easy to manage form in the configuration files + and the data stored is also nice. + """ + + def __init__(self, isolated, load_only=None): + # type: (bool, Kind) -> None + super(Configuration, self).__init__() + + _valid_load_only = [kinds.USER, kinds.GLOBAL, kinds.VENV, None] + if load_only not in _valid_load_only: + raise ConfigurationError( + "Got invalid value for load_only - should be one of {}".format( + ", ".join(map(repr, _valid_load_only[:-1])) + ) + ) + self.isolated = isolated # type: bool + self.load_only = load_only # type: Optional[Kind] + + # The order here determines the override order. + self._override_order = [ + kinds.GLOBAL, kinds.USER, kinds.VENV, kinds.ENV, kinds.ENV_VAR + ] + + self._ignore_env_names = ["version", "help"] + + # Because we keep track of where we got the data from + self._parsers = { + variant: [] for variant in self._override_order + } # type: Dict[Kind, List[Tuple[str, RawConfigParser]]] + self._config = { + variant: {} for variant in self._override_order + } # type: Dict[Kind, Dict[str, Any]] + self._modified_parsers = [] # type: List[Tuple[str, RawConfigParser]] + + def load(self): + # type: () -> None + """Loads configuration from configuration files and environment + """ + self._load_config_files() + if not self.isolated: + self._load_environment_vars() + + def get_file_to_edit(self): + # type: () -> Optional[str] + """Returns the file with highest priority in configuration + """ + assert self.load_only is not None, \ + "Need to be specified a file to be editing" + + try: + return self._get_parser_to_modify()[0] + except IndexError: + return None + + def items(self): + # type: () -> Iterable[Tuple[str, Any]] + """Returns key-value pairs like dict.items() representing the loaded + configuration + """ + return self._dictionary.items() + + def get_value(self, key): + # type: (str) -> Any + """Get a value from the 
configuration. + """ + try: + return self._dictionary[key] + except KeyError: + raise ConfigurationError("No such key - {}".format(key)) + + def set_value(self, key, value): + # type: (str, Any) -> None + """Modify a value in the configuration. + """ + self._ensure_have_load_only() + + fname, parser = self._get_parser_to_modify() + + if parser is not None: + section, name = _disassemble_key(key) + + # Modify the parser and the configuration + if not parser.has_section(section): + parser.add_section(section) + parser.set(section, name, value) + + self._config[self.load_only][key] = value + self._mark_as_modified(fname, parser) + + def unset_value(self, key): + # type: (str) -> None + """Unset a value in the configuration. + """ + self._ensure_have_load_only() + + if key not in self._config[self.load_only]: + raise ConfigurationError("No such key - {}".format(key)) + + fname, parser = self._get_parser_to_modify() + + if parser is not None: + section, name = _disassemble_key(key) + + # Remove the key in the parser + modified_something = False + if parser.has_section(section): + # Returns whether the option was removed or not + modified_something = parser.remove_option(section, name) + + if modified_something: + # name removed from parser, section may now be empty + section_iter = iter(parser.items(section)) + try: + val = six.next(section_iter) + except StopIteration: + val = None + + if val is None: + parser.remove_section(section) + + self._mark_as_modified(fname, parser) + else: + raise ConfigurationError( + "Fatal Internal error [id=1]. Please report as a bug." + ) + + del self._config[self.load_only][key] + + def save(self): + # type: () -> None + """Save the current in-memory state. + """ + self._ensure_have_load_only() + + for fname, parser in self._modified_parsers: + logger.info("Writing to %s", fname) + + # Ensure directory exists. 
+ ensure_dir(os.path.dirname(fname)) + + with open(fname, "w") as f: + parser.write(f) # type: ignore + + # + # Private routines + # + + def _ensure_have_load_only(self): + # type: () -> None + if self.load_only is None: + raise ConfigurationError("Needed a specific file to be modifying.") + logger.debug("Will be working with %s variant only", self.load_only) + + @property + def _dictionary(self): + # type: () -> Dict[str, Any] + """A dictionary representing the loaded configuration. + """ + # NOTE: Dictionaries are not populated if not loaded. So, conditionals + # are not needed here. + retval = {} + + for variant in self._override_order: + retval.update(self._config[variant]) + + return retval + + def _load_config_files(self): + # type: () -> None + """Loads configuration from configuration files + """ + config_files = dict(self._iter_config_files()) + if config_files[kinds.ENV][0:1] == [os.devnull]: + logger.debug( + "Skipping loading configuration files due to " + "environment's PIP_CONFIG_FILE being os.devnull" + ) + return + + for variant, files in config_files.items(): + for fname in files: + # If there's specific variant set in `load_only`, load only + # that variant, not the others. 
+ if self.load_only is not None and variant != self.load_only: + logger.debug( + "Skipping file '%s' (variant: %s)", fname, variant + ) + continue + + parser = self._load_file(variant, fname) + + # Keeping track of the parsers used + self._parsers[variant].append((fname, parser)) + + def _load_file(self, variant, fname): + # type: (Kind, str) -> RawConfigParser + logger.debug("For variant '%s', will try loading '%s'", variant, fname) + parser = self._construct_parser(fname) + + for section in parser.sections(): + items = parser.items(section) + self._config[variant].update(self._normalized_keys(section, items)) + + return parser + + def _construct_parser(self, fname): + # type: (str) -> RawConfigParser + parser = configparser.RawConfigParser() + # If there is no such file, don't bother reading it but create the + # parser anyway, to hold the data. + # Doing this is useful when modifying and saving files, where we don't + # need to construct a parser. + if os.path.exists(fname): + try: + parser.read(fname) + except UnicodeDecodeError: + # See https://github.com/pypa/pip/issues/4963 + raise ConfigurationFileCouldNotBeLoaded( + reason="contains invalid {} characters".format( + locale.getpreferredencoding(False) + ), + fname=fname, + ) + except configparser.Error as error: + # See https://github.com/pypa/pip/issues/4893 + raise ConfigurationFileCouldNotBeLoaded(error=error) + return parser + + def _load_environment_vars(self): + # type: () -> None + """Loads configuration from environment variables + """ + self._config[kinds.ENV_VAR].update( + self._normalized_keys(":env:", self._get_environ_vars()) + ) + + def _normalized_keys(self, section, items): + # type: (str, Iterable[Tuple[str, Any]]) -> Dict[str, Any] + """Normalizes items to construct a dictionary with normalized keys. + + This routine is where the names become keys and are made the same + regardless of source - configuration files or environment. 
+ """ + normalized = {} + for name, val in items: + key = section + "." + _normalize_name(name) + normalized[key] = val + return normalized + + def _get_environ_vars(self): + # type: () -> Iterable[Tuple[str, str]] + """Returns a generator with all environmental vars with prefix PIP_""" + for key, val in os.environ.items(): + should_be_yielded = ( + key.startswith("PIP_") and + key[4:].lower() not in self._ignore_env_names + ) + if should_be_yielded: + yield key[4:].lower(), val + + # XXX: This is patched in the tests. + def _iter_config_files(self): + # type: () -> Iterable[Tuple[Kind, List[str]]] + """Yields variant and configuration files associated with it. + + This should be treated like items of a dictionary. + """ + # SMELL: Move the conditions out of this function + + # environment variables have the lowest priority + config_file = os.environ.get('PIP_CONFIG_FILE', None) + if config_file is not None: + yield kinds.ENV, [config_file] + else: + yield kinds.ENV, [] + + # at the base we have any global configuration + yield kinds.GLOBAL, list(site_config_files) + + # per-user configuration next + should_load_user_config = not self.isolated and not ( + config_file and os.path.exists(config_file) + ) + if should_load_user_config: + # The legacy config file is overridden by the new config file + yield kinds.USER, [legacy_config_file, new_config_file] + + # finally virtualenv configuration first trumping others + if running_under_virtualenv(): + yield kinds.VENV, [venv_config_file] + + def _get_parser_to_modify(self): + # type: () -> Tuple[str, RawConfigParser] + # Determine which parser to modify + parsers = self._parsers[self.load_only] + if not parsers: + # This should not happen if everything works correctly. + raise ConfigurationError( + "Fatal Internal error [id=2]. Please report as a bug." + ) + + # Use the highest priority parser. + return parsers[-1] + + # XXX: This is patched in the tests. 
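The environment-variable path above (`_get_environ_vars` feeding `_normalized_keys` with the `":env:"` section) can be sketched standalone. Names here are illustrative re-implementations, not the module's actual API surface; the ignore list mirrors `self._ignore_env_names`:

```python
IGNORED_ENV_NAMES = {"version", "help"}


def env_config_items(environ):
    """Yield (name, value) for PIP_* variables, minus PIP_VERSION/PIP_HELP."""
    for key, val in environ.items():
        if key.startswith("PIP_") and key[4:].lower() not in IGNORED_ENV_NAMES:
            yield key[4:].lower(), val


def normalized_keys(section, items):
    """Turn names into "section.name" keys, normalizing _ to -."""
    return {section + "." + name.replace('_', '-'): val for name, val in items}


environ = {"PIP_TIMEOUT": "60", "PIP_HELP": "1", "PATH": "/usr/bin"}
print(normalized_keys(":env:", env_config_items(environ)))
# {':env:.timeout': '60'}
```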
+ def _mark_as_modified(self, fname, parser): + # type: (str, RawConfigParser) -> None + file_parser_tuple = (fname, parser) + if file_parser_tuple not in self._modified_parsers: + self._modified_parsers.append(file_parser_tuple) diff --git a/lib/python3.4/site-packages/pip/download.py b/lib/python3.7/site-packages/pip/_internal/download.py similarity index 88% rename from lib/python3.4/site-packages/pip/download.py rename to lib/python3.7/site-packages/pip/_internal/download.py index 54d3131..96f3b65 100644 --- a/lib/python3.4/site-packages/pip/download.py +++ b/lib/python3.7/site-packages/pip/_internal/download.py @@ -11,44 +11,48 @@ import platform import re import shutil import sys -import tempfile -try: - import ssl # noqa - HAS_TLS = True -except ImportError: - HAS_TLS = False - -from pip._vendor.six.moves.urllib import parse as urllib_parse -from pip._vendor.six.moves.urllib import request as urllib_request - -import pip - -from pip.exceptions import InstallationError, HashMismatch -from pip.models import PyPI -from pip.utils import (splitext, rmtree, format_size, display_path, - backup_dir, ask_path_exists, unpack_file, - ARCHIVE_EXTENSIONS, consume, call_subprocess) -from pip.utils.encoding import auto_decode -from pip.utils.filesystem import check_path_owner -from pip.utils.logging import indent_log -from pip.utils.setuptools_build import SETUPTOOLS_SHIM -from pip.utils.glibc import libc_ver -from pip.utils.ui import DownloadProgressBar, DownloadProgressSpinner -from pip.locations import write_delete_marker_file -from pip.vcs import vcs -from pip._vendor import requests, six -from pip._vendor.requests.adapters import BaseAdapter, HTTPAdapter -from pip._vendor.requests.auth import AuthBase, HTTPBasicAuth -from pip._vendor.requests.models import CONTENT_CHUNK_SIZE, Response -from pip._vendor.requests.utils import get_netrc_auth -from pip._vendor.requests.structures import CaseInsensitiveDict -from pip._vendor.requests.packages import urllib3 +from 
pip._vendor import requests, six, urllib3 from pip._vendor.cachecontrol import CacheControlAdapter from pip._vendor.cachecontrol.caches import FileCache from pip._vendor.lockfile import LockError -from pip._vendor.six.moves import xmlrpc_client +from pip._vendor.requests.adapters import BaseAdapter, HTTPAdapter +from pip._vendor.requests.auth import AuthBase, HTTPBasicAuth +from pip._vendor.requests.models import CONTENT_CHUNK_SIZE, Response +from pip._vendor.requests.structures import CaseInsensitiveDict +from pip._vendor.requests.utils import get_netrc_auth +# NOTE: XMLRPC Client is not annotated in typeshed as on 2017-07-17, which is +# why we ignore the type on this import +from pip._vendor.six.moves import xmlrpc_client # type: ignore +from pip._vendor.six.moves.urllib import parse as urllib_parse +from pip._vendor.six.moves.urllib import request as urllib_request +from pip._vendor.six.moves.urllib.parse import unquote as urllib_unquote +from pip._vendor.urllib3.util import IS_PYOPENSSL +import pip +from pip._internal.exceptions import HashMismatch, InstallationError +from pip._internal.locations import write_delete_marker_file +from pip._internal.models.index import PyPI +from pip._internal.utils.encoding import auto_decode +from pip._internal.utils.filesystem import check_path_owner +from pip._internal.utils.glibc import libc_ver +from pip._internal.utils.logging import indent_log +from pip._internal.utils.misc import ( + ARCHIVE_EXTENSIONS, ask_path_exists, backup_dir, call_subprocess, consume, + display_path, format_size, get_installed_version, rmtree, splitext, + unpack_file, +) +from pip._internal.utils.setuptools_build import SETUPTOOLS_SHIM +from pip._internal.utils.temp_dir import TempDirectory +from pip._internal.utils.ui import DownloadProgressProvider +from pip._internal.vcs import vcs + +try: + import ssl # noqa +except ImportError: + ssl = None + +HAS_TLS = (ssl is not None) or IS_PYOPENSSL __all__ = ['get_file_content', 'is_url', 'url_to_path', 
'path_to_url', @@ -116,10 +120,13 @@ def user_agent(): if platform.machine(): data["cpu"] = platform.machine() - # Python 2.6 doesn't have ssl.OPENSSL_VERSION. - if HAS_TLS and sys.version_info[:2] > (2, 6): + if HAS_TLS: data["openssl_version"] = ssl.OPENSSL_VERSION + setuptools_version = get_installed_version("setuptools") + if setuptools_version is not None: + data["setuptools_version"] = setuptools_version + return "{data[installer][name]}/{data[installer][version]} {json}".format( data=data, json=json.dumps(data, separators=(",", ":"), sort_keys=True), @@ -203,8 +210,9 @@ class MultiDomainBasicAuth(AuthBase): if "@" in netloc: userinfo = netloc.rsplit("@", 1)[0] if ":" in userinfo: - return userinfo.split(":", 1) - return userinfo, None + user, pwd = userinfo.split(":", 1) + return (urllib_unquote(user), urllib_unquote(pwd)) + return urllib_unquote(userinfo), None return None, None @@ -342,7 +350,9 @@ class PipSession(requests.Session): # connection got interrupted in some way. A 503 error in general # is typically considered a transient error so we'll go ahead and # retry it. - status_forcelist=[503], + # A 500 may indicate transient error in Amazon S3 + # A 520 or 527 - may indicate transient error in CloudFlare + status_forcelist=[500, 503, 520, 527], # Add a small amount of back off between failed requests in # order to prevent hammering the service. @@ -376,7 +386,7 @@ class PipSession(requests.Session): # We want to use a non-validating adapter for any requests which are # deemed insecure. for host in insecure_hosts: - self.mount("https://{0}/".format(host), insecure_adapter) + self.mount("https://{}/".format(host), insecure_adapter) def request(self, method, url, *args, **kwargs): # Allow setting a default timeout on a session @@ -388,7 +398,12 @@ class PipSession(requests.Session): def get_file_content(url, comes_from=None, session=None): """Gets the content of a file; it may be a filename, file: URL, or - http: URL. Returns (location, content). 
Content is unicode.""" + http: URL. Returns (location, content). Content is unicode. + + :param url: File path or url. + :param comes_from: Origin description of requirements. + :param session: Instance of pip.download.PipSession. + """ if session is None: raise TypeError( "get_file_content() missing 1 required keyword argument: 'session'" @@ -509,14 +524,13 @@ def _progress_indicator(iterable, *args, **kwargs): return iterable -def _download_url(resp, link, content_file, hashes): +def _download_url(resp, link, content_file, hashes, progress_bar): try: total_length = int(resp.headers['content-length']) except (ValueError, KeyError, TypeError): total_length = 0 cached_resp = getattr(resp, "from_cache", False) - if logger.getEffectiveLevel() > logging.INFO: show_progress = False elif cached_resp: @@ -580,12 +594,12 @@ def _download_url(resp, link, content_file, hashes): url = link.url_without_fragment if show_progress: # We don't show progress on cached responses + progress_indicator = DownloadProgressProvider(progress_bar, + max=total_length) if total_length: logger.info("Downloading %s (%s)", url, format_size(total_length)) - progress_indicator = DownloadProgressBar(max=total_length).iter else: logger.info("Downloading %s", url) - progress_indicator = DownloadProgressSpinner().iter elif cached_resp: logger.info("Using cached %s", url) else: @@ -633,42 +647,41 @@ def _copy_file(filename, location, link): def unpack_http_url(link, location, download_dir=None, - session=None, hashes=None): + session=None, hashes=None, progress_bar="on"): if session is None: raise TypeError( "unpack_http_url() missing 1 required keyword argument: 'session'" ) - temp_dir = tempfile.mkdtemp('-unpack', 'pip-') + with TempDirectory(kind="unpack") as temp_dir: + # If a download dir is specified, is the file already downloaded there? 
+ already_downloaded_path = None + if download_dir: + already_downloaded_path = _check_download_dir(link, + download_dir, + hashes) - # If a download dir is specified, is the file already downloaded there? - already_downloaded_path = None - if download_dir: - already_downloaded_path = _check_download_dir(link, - download_dir, - hashes) + if already_downloaded_path: + from_path = already_downloaded_path + content_type = mimetypes.guess_type(from_path)[0] + else: + # let's download to a tmp dir + from_path, content_type = _download_http_url(link, + session, + temp_dir.path, + hashes, + progress_bar) - if already_downloaded_path: - from_path = already_downloaded_path - content_type = mimetypes.guess_type(from_path)[0] - else: - # let's download to a tmp dir - from_path, content_type = _download_http_url(link, - session, - temp_dir, - hashes) + # unpack the archive to the build dir location. even when only + # downloading archives, they have to be unpacked to parse dependencies + unpack_file(from_path, location, content_type, link) - # unpack the archive to the build dir location. 
even when only downloading - # archives, they have to be unpacked to parse dependencies - unpack_file(from_path, location, content_type, link) + # a download dir is specified; let's copy the archive there + if download_dir and not already_downloaded_path: + _copy_file(from_path, download_dir, link) - # a download dir is specified; let's copy the archive there - if download_dir and not already_downloaded_path: - _copy_file(from_path, download_dir, link) - - if not already_downloaded_path: - os.unlink(from_path) - rmtree(temp_dir) + if not already_downloaded_path: + os.unlink(from_path) def unpack_file_url(link, location, download_dir=None, hashes=None): @@ -785,7 +798,8 @@ class PipXmlrpcTransport(xmlrpc_client.Transport): def unpack_url(link, location, download_dir=None, - only_download=False, session=None, hashes=None): + only_download=False, session=None, hashes=None, + progress_bar="on"): """Unpack link. If link is a VCS link: if only_download, export into download_dir and ignore location @@ -818,13 +832,14 @@ def unpack_url(link, location, download_dir=None, location, download_dir, session, - hashes=hashes + hashes=hashes, + progress_bar=progress_bar ) if only_download: write_delete_marker_file(location) -def _download_http_url(link, session, temp_dir, hashes): +def _download_http_url(link, session, temp_dir, hashes, progress_bar): """Download link url into temp_dir using provided session""" target_url = link.url.split('#', 1)[0] try: @@ -879,7 +894,7 @@ def _download_http_url(link, session, temp_dir, hashes): filename += ext file_path = os.path.join(temp_dir, filename) with open(file_path, 'wb') as content_file: - _download_url(resp, link, content_file, hashes) + _download_url(resp, link, content_file, hashes, progress_bar) return file_path, content_type diff --git a/lib/python3.4/site-packages/pip/exceptions.py b/lib/python3.7/site-packages/pip/_internal/exceptions.py similarity index 90% rename from lib/python3.4/site-packages/pip/exceptions.py rename to 
lib/python3.7/site-packages/pip/_internal/exceptions.py index 50b527f..f1ca6f3 100644 --- a/lib/python3.4/site-packages/pip/exceptions.py +++ b/lib/python3.7/site-packages/pip/_internal/exceptions.py @@ -10,6 +10,10 @@ class PipError(Exception): """Base pip exception""" +class ConfigurationError(PipError): + """General exception in configuration""" + + class InstallationError(PipError): """General exception during installation""" @@ -158,7 +162,8 @@ class HashMissing(HashError): self.gotten_hash = gotten_hash def body(self): - from pip.utils.hashes import FAVORITE_HASH # Dodge circular import. + # Dodge circular import. + from pip._internal.utils.hashes import FAVORITE_HASH package = None if self.req: @@ -242,3 +247,22 @@ class HashMismatch(HashError): class UnsupportedPythonVersion(InstallationError): """Unsupported python version according to Requires-Python package metadata.""" + + +class ConfigurationFileCouldNotBeLoaded(ConfigurationError): + """When there are errors while loading a configuration file + """ + + def __init__(self, reason="could not be loaded", fname=None, error=None): + super(ConfigurationFileCouldNotBeLoaded, self).__init__(error) + self.reason = reason + self.fname = fname + self.error = error + + def __str__(self): + if self.fname is not None: + message_part = " in {}.".format(self.fname) + else: + assert self.error is not None + message_part = ".\n{}\n".format(self.error.message) + return "Configuration file {}{}".format(self.reason, message_part) diff --git a/lib/python3.4/site-packages/pip/index.py b/lib/python3.7/site-packages/pip/_internal/index.py similarity index 63% rename from lib/python3.4/site-packages/pip/index.py rename to lib/python3.7/site-packages/pip/_internal/index.py index f653f6e..8c2f24f 100644 --- a/lib/python3.4/site-packages/pip/index.py +++ b/lib/python3.7/site-packages/pip/_internal/index.py @@ -1,44 +1,46 @@ """Routines related to PyPI, indexes""" from __future__ import absolute_import -import logging import cgi 
-from collections import namedtuple import itertools -import sys -import os -import re +import logging import mimetypes +import os import posixpath -import warnings +import re +import sys +from collections import namedtuple +from pip._vendor import html5lib, requests, six +from pip._vendor.distlib.compat import unescape +from pip._vendor.packaging import specifiers +from pip._vendor.packaging.utils import canonicalize_name +from pip._vendor.packaging.version import parse as parse_version +from pip._vendor.requests.exceptions import SSLError from pip._vendor.six.moves.urllib import parse as urllib_parse from pip._vendor.six.moves.urllib import request as urllib_request -from pip.compat import ipaddress -from pip.utils import ( - cached_property, splitext, normalize_path, - ARCHIVE_EXTENSIONS, SUPPORTED_EXTENSIONS, -) -from pip.utils.deprecation import RemovedInPip10Warning -from pip.utils.logging import indent_log -from pip.utils.packaging import check_requires_python -from pip.exceptions import ( - DistributionNotFound, BestVersionAlreadyInstalled, InvalidWheelFilename, +from pip._internal.download import HAS_TLS, is_url, path_to_url, url_to_path +from pip._internal.exceptions import ( + BestVersionAlreadyInstalled, DistributionNotFound, InvalidWheelFilename, UnsupportedWheel, ) -from pip.download import HAS_TLS, is_url, path_to_url, url_to_path -from pip.wheel import Wheel, wheel_ext -from pip.pep425tags import get_supported -from pip._vendor import html5lib, requests, six -from pip._vendor.packaging.version import parse as parse_version -from pip._vendor.packaging.utils import canonicalize_name -from pip._vendor.packaging import specifiers -from pip._vendor.requests.exceptions import SSLError -from pip._vendor.distlib.compat import unescape +from pip._internal.models.candidate import InstallationCandidate +from pip._internal.models.format_control import FormatControl +from pip._internal.models.index import PyPI +from pip._internal.models.link import Link +from 
pip._internal.pep425tags import get_supported +from pip._internal.utils.compat import ipaddress +from pip._internal.utils.deprecation import deprecated +from pip._internal.utils.logging import indent_log +from pip._internal.utils.misc import ( + ARCHIVE_EXTENSIONS, SUPPORTED_EXTENSIONS, normalize_path, + remove_auth_from_url, +) +from pip._internal.utils.packaging import check_requires_python +from pip._internal.wheel import Wheel, wheel_ext - -__all__ = ['FormatControl', 'fmt_ctl_handle_mutual_exclude', 'PackageFinder'] +__all__ = ['FormatControl', 'PackageFinder'] SECURE_ORIGINS = [ @@ -57,45 +59,120 @@ SECURE_ORIGINS = [ logger = logging.getLogger(__name__) -class InstallationCandidate(object): +def _get_content_type(url, session): + """Get the Content-Type of the given url, using a HEAD request""" + scheme, netloc, path, query, fragment = urllib_parse.urlsplit(url) + if scheme not in {'http', 'https'}: + # FIXME: some warning or something? + # assertion error? + return '' - def __init__(self, project, version, location): - self.project = project - self.version = parse_version(version) - self.location = location - self._key = (self.project, self.version, self.location) + resp = session.head(url, allow_redirects=True) + resp.raise_for_status() - def __repr__(self): - return "".format( - self.project, self.version, self.location, + return resp.headers.get("Content-Type", "") + + +def _handle_get_page_fail(link, reason, url, meth=None): + if meth is None: + meth = logger.debug + meth("Could not fetch URL %s: %s - skipping", link, reason) + + +def _get_html_page(link, session=None): + if session is None: + raise TypeError( + "_get_html_page() missing 1 required keyword argument: 'session'" ) - def __hash__(self): - return hash(self._key) + url = link.url + url = url.split('#', 1)[0] - def __lt__(self, other): - return self._compare(other, lambda s, o: s < o) + # Check for VCS schemes that do not support lookup as web pages. 
+ from pip._internal.vcs import VcsSupport + for scheme in VcsSupport.schemes: + if url.lower().startswith(scheme) and url[len(scheme)] in '+:': + logger.debug('Cannot look at %s URL %s', scheme, link) + return None - def __le__(self, other): - return self._compare(other, lambda s, o: s <= o) + try: + filename = link.filename + for bad_ext in ARCHIVE_EXTENSIONS: + if filename.endswith(bad_ext): + content_type = _get_content_type(url, session=session) + if content_type.lower().startswith('text/html'): + break + else: + logger.debug( + 'Skipping page %s because of Content-Type: %s', + link, + content_type, + ) + return - def __eq__(self, other): - return self._compare(other, lambda s, o: s == o) + logger.debug('Getting page %s', url) - def __ge__(self, other): - return self._compare(other, lambda s, o: s >= o) + # Tack index.html onto file:// URLs that point to directories + (scheme, netloc, path, params, query, fragment) = \ + urllib_parse.urlparse(url) + if (scheme == 'file' and + os.path.isdir(urllib_request.url2pathname(path))): + # add trailing slash if not present so urljoin doesn't trim + # final segment + if not url.endswith('/'): + url += '/' + url = urllib_parse.urljoin(url, 'index.html') + logger.debug(' file: URL is directory, getting %s', url) - def __gt__(self, other): - return self._compare(other, lambda s, o: s > o) + resp = session.get( + url, + headers={ + "Accept": "text/html", + # We don't want to blindly returned cached data for + # /simple/, because authors generally expecting that + # twine upload && pip install will function, but if + # they've done a pip install in the last ~10 minutes + # it won't. 
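The archive-extension gating in `_get_html_page` above fetches an archive-looking URL as HTML only when a HEAD request reports a `text/html` Content-Type. A rough sketch of that decision, with the HEAD result passed in as a plain string (the function name and the extension subset are illustrative, not pip's API):

```python
import posixpath

# Illustrative subset of pip's ARCHIVE_EXTENSIONS.
ARCHIVE_EXTENSIONS = ('.zip', '.whl', '.tar.gz', '.tar.bz2', '.tgz')


def should_fetch_as_html(url, head_content_type):
    """Mimic _get_html_page's gating: URLs whose filename looks like an
    archive are fetched only if a HEAD request says they are HTML."""
    # Strip fragment and query before inspecting the filename.
    filename = posixpath.basename(url.split('#', 1)[0].split('?', 1)[0])
    if filename.endswith(ARCHIVE_EXTENSIONS):
        return head_content_type.lower().startswith('text/html')
    return True


print(should_fetch_as_html('https://example.com/simple/pkg/', ''))
# True
print(should_fetch_as_html('https://example.com/pkg-1.0.tar.gz',
                           'application/x-tar'))
# False
```

This avoids downloading large archives just to discover they are not index pages, while still allowing an index page whose URL merely ends in an archive-like suffix.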
Thus by setting this to zero we will not + # blindly use any cached data, however the benefit of + # using max-age=0 instead of no-cache, is that we will + # still support conditional requests, so we will still + # minimize traffic sent in cases where the page hasn't + # changed at all, we will just always incur the round + # trip for the conditional GET now instead of only + # once per 10 minutes. + # For more information, please see pypa/pip#5670. + "Cache-Control": "max-age=0", + }, + ) + resp.raise_for_status() - def __ne__(self, other): - return self._compare(other, lambda s, o: s != o) + # The check for archives above only works if the url ends with + # something that looks like an archive. However that is not a + # requirement of an url. Unless we issue a HEAD request on every + # url we cannot know ahead of time for sure if something is HTML + # or not. However we can check after we've downloaded it. + content_type = resp.headers.get('Content-Type', 'unknown') + if not content_type.lower().startswith("text/html"): + logger.debug( + 'Skipping page %s because of Content-Type: %s', + link, + content_type, + ) + return - def _compare(self, other, method): - if not isinstance(other, InstallationCandidate): - return NotImplemented - - return method(self._key, other._key) + inst = HTMLPage(resp.content, resp.url, resp.headers) + except requests.HTTPError as exc: + _handle_get_page_fail(link, exc, url) + except SSLError as exc: + reason = "There was a problem confirming the ssl certificate: " + reason += str(exc) + _handle_get_page_fail(link, reason, url, meth=logger.info) + except requests.ConnectionError as exc: + _handle_get_page_fail(link, "connection error: %s" % exc, url) + except requests.Timeout: + _handle_get_page_fail(link, "timed out", url) + else: + return inst class PackageFinder(object): @@ -108,7 +185,8 @@ class PackageFinder(object): def __init__(self, find_links, index_urls, allow_all_prereleases=False, trusted_hosts=None, 
process_dependency_links=False, session=None, format_control=None, platform=None, - versions=None, abi=None, implementation=None): + versions=None, abi=None, implementation=None, + prefer_binary=False): """Create a PackageFinder. :param format_control: A FormatControl object or None. Used to control @@ -176,6 +254,9 @@ class PackageFinder(object): impl=implementation, ) + # Do we prefer old, but valid, binary dist over new source dist + self.prefer_binary = prefer_binary + # If we don't have TLS enabled, then WARN if anyplace we're looking # relies on TLS. if not HAS_TLS: @@ -189,16 +270,31 @@ class PackageFinder(object): ) break + def get_formatted_locations(self): + lines = [] + if self.index_urls and self.index_urls != [PyPI.simple_url]: + lines.append( + "Looking in indexes: {}".format(", ".join( + remove_auth_from_url(url) for url in self.index_urls)) + ) + if self.find_links: + lines.append( + "Looking in links: {}".format(", ".join(self.find_links)) + ) + return "\n".join(lines) + def add_dependency_links(self, links): - # # FIXME: this shouldn't be global list this, it should only - # # apply to requirements of the package that specifies the - # # dependency_links value - # # FIXME: also, we should track comes_from (i.e., use Link) + # FIXME: this shouldn't be global list this, it should only + # apply to requirements of the package that specifies the + # dependency_links value + # FIXME: also, we should track comes_from (i.e., use Link) if self.process_dependency_links: - warnings.warn( + deprecated( "Dependency Links processing has been deprecated and will be " "removed in a future release.", - RemovedInPip10Warning, + replacement="PEP 508 URL dependencies", + gone_in="18.2", + issue=4187, ) self.dependency_links.extend(links) @@ -241,14 +337,16 @@ class PackageFinder(object): else: logger.warning( "Url '%s' is ignored: it is neither a file " - "nor a directory.", url) + "nor a directory.", url, + ) elif is_url(url): # Only add url with clear scheme 
urls.append(url) else: logger.warning( "Url '%s' is ignored. It is either a non-existing " - "path or lacks a specific scheme.", url) + "path or lacks a specific scheme.", url, + ) return files, urls @@ -261,11 +359,14 @@ class PackageFinder(object): 1. existing installs 2. wheels ordered via Wheel.support_index_min(self.valid_tags) 3. source archives + If prefer_binary was set, then all wheels are sorted above sources. Note: it was considered to embed this logic into the Link comparison operators, but then different sdist links with the same version, would have to be considered equal """ support_num = len(self.valid_tags) + build_tag = tuple() + binary_preference = 0 if candidate.location.is_wheel: # can raise InvalidWheelFilename wheel = Wheel(candidate.location.filename) @@ -274,10 +375,16 @@ class PackageFinder(object): "%s is not a supported wheel for this platform. It " "can't be sorted." % wheel.filename ) + if self.prefer_binary: + binary_preference = 1 pri = -(wheel.support_index_min(self.valid_tags)) + if wheel.build_tag is not None: + match = re.match(r'^(\d+)(.*)$', wheel.build_tag) + build_tag_groups = match.groups() + build_tag = (int(build_tag_groups[0]), build_tag_groups[1]) else: # sdist pri = -(support_num) - return (candidate.version, pri) + return (binary_preference, candidate.version, build_tag, pri) def _validate_secure_origin(self, logger, location): # Determine if this url used a secure transport mechanism @@ -341,9 +448,9 @@ class PackageFinder(object): # log a warning that we are ignoring it. logger.warning( "The repository located at %s is not a trusted or secure host and " - "is being ignored. If this repository is available via HTTPS it " - "is recommended to use HTTPS instead, otherwise you may silence " - "this warning and allow it anyways with '--trusted-host %s'.", + "is being ignored. 
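The `_candidate_sort_key` change above splits a wheel build tag such as `12abc` into `(12, "abc")` so that the numeric part compares numerically rather than lexically. A small sketch of just that piece (the helper name is made up; the regex is the one from the hunk):

```python
import re


def build_tag_key(build_tag):
    # Split leading digits from the rest, as in _candidate_sort_key,
    # so '2' sorts before '10' instead of after it.
    match = re.match(r'^(\d+)(.*)$', build_tag)
    digits, rest = match.groups()
    return (int(digits), rest)


print(build_tag_key('12abc'))
# (12, 'abc')
print(sorted(['10a', '2b', '2a'], key=build_tag_key))
# ['2a', '2b', '10a']
```

The resulting tuple is then slotted into the larger sort key `(binary_preference, version, build_tag, pri)`, so build tags only break ties between otherwise identical candidates.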
If this repository is available via HTTPS we " + "recommend you use HTTPS instead, otherwise you may silence " + "this warning and allow it anyway with '--trusted-host %s'.", parsed.hostname, parsed.hostname, ) @@ -383,13 +490,13 @@ class PackageFinder(object): index_locations = self._get_index_urls_locations(project_name) index_file_loc, index_url_loc = self._sort_locations(index_locations) fl_file_loc, fl_url_loc = self._sort_locations( - self.find_links, expand_dir=True) + self.find_links, expand_dir=True, + ) dep_file_loc, dep_url_loc = self._sort_locations(self.dependency_links) - file_locations = ( - Link(url) for url in itertools.chain( - index_file_loc, fl_file_loc, dep_file_loc) - ) + file_locations = (Link(url) for url in itertools.chain( + index_file_loc, fl_file_loc, dep_file_loc, + )) # We trust every url that the user has given us whether it was given # via --index-url or --find-links @@ -411,7 +518,7 @@ class PackageFinder(object): logger.debug('* %s', location) canonical_name = canonicalize_name(project_name) - formats = fmt_ctl_formats(self.format_control, canonical_name) + formats = self.format_control.get_allowed_formats(canonical_name) search = Search(project_name, canonical_name, formats) find_links_versions = self._package_versions( # We trust every directly linked archive in find_links @@ -424,7 +531,7 @@ class PackageFinder(object): logger.debug('Analyzing links from page %s', page.url) with indent_log(): page_versions.extend( - self._package_versions(page.links, search) + self._package_versions(page.iter_links(), search) ) dependency_versions = self._package_versions( @@ -504,7 +611,7 @@ class PackageFinder(object): req, ', '.join( sorted( - set(str(c.version) for c in all_candidates), + {str(c.version) for c in all_candidates}, key=parse_version, ) ) @@ -615,11 +722,13 @@ class PackageFinder(object): return if ext not in SUPPORTED_EXTENSIONS: self._log_skipped_link( - link, 'unsupported archive format: %s' % ext) + link, 'unsupported 
archive format: %s' % ext, + ) return if "binary" not in search.formats and ext == wheel_ext: self._log_skipped_link( - link, 'No binaries permitted for %s' % search.supplied) + link, 'No binaries permitted for %s' % search.supplied, + ) return if "macosx10" in link.path and ext == '.zip': self._log_skipped_link(link, 'macosx10 one') @@ -645,14 +754,15 @@ class PackageFinder(object): # This should be up by the search.ok_binary check, but see issue 2700. if "source" not in search.formats and ext != wheel_ext: self._log_skipped_link( - link, 'No sources permitted for %s' % search.supplied) + link, 'No sources permitted for %s' % search.supplied, + ) return if not version: version = egg_info_matches(egg_info, search.supplied, link) if version is None: self._log_skipped_link( - link, 'wrong project name (not %s)' % search.supplied) + link, 'Missing project version for %s' % search.supplied) return match = self._py_version_re.search(version) @@ -680,7 +790,7 @@ class PackageFinder(object): return InstallationCandidate(search.supplied, version, link) def _get_page(self, link): - return HTMLPage.get_page(link, session=self.session) + return _get_html_page(link, session=self.session) def egg_info_matches( @@ -700,7 +810,7 @@ def egg_info_matches( return None if search_name is None: full_match = match.group(0) - return full_match[full_match.index('-'):] + return full_match.split('-', 1)[-1] name = match.group(0).lower() # To match the "safe" name that pkg_resources creates: name = name.replace('_', '-') @@ -712,384 +822,71 @@ def egg_info_matches( return None +def _determine_base_url(document, page_url): + """Determine the HTML document's base URL. + + This looks for a ```` tag in the HTML document. If present, its href + attribute denotes the base URL of anchor tags in the document. If there is + no such tag (or if it does not have a valid href attribute), the HTML + file's URL is used as the base URL. + + :param document: An HTML document representation. 
The current + implementation expects the result of ``html5lib.parse()``. + :param page_url: The URL of the HTML document. + """ + for base in document.findall(".//base"): + href = base.get("href") + if href is not None: + return href + return page_url + + +def _get_encoding_from_headers(headers): + """Determine if we have any encoding information in our headers. + """ + if headers and "Content-Type" in headers: + content_type, params = cgi.parse_header(headers["Content-Type"]) + if "charset" in params: + return params['charset'] + return None + + +_CLEAN_LINK_RE = re.compile(r'[^a-z0-9$&+,/:;=?@.#%_\\|-]', re.I) + + +def _clean_link(url): + """Makes sure a link is fully encoded. That is, if a ' ' shows up in + the link, it will be rewritten to %20 (while not over-quoting + % or other characters).""" + return _CLEAN_LINK_RE.sub(lambda match: '%%%2x' % ord(match.group(0)), url) + + class HTMLPage(object): """Represents one page, along with its URL""" def __init__(self, content, url, headers=None): - # Determine if we have any encoding information in our headers - encoding = None - if headers and "Content-Type" in headers: - content_type, params = cgi.parse_header(headers["Content-Type"]) - - if "charset" in params: - encoding = params['charset'] - self.content = content - self.parsed = html5lib.parse( - self.content, - transport_encoding=encoding, - namespaceHTMLElements=False, - ) self.url = url self.headers = headers def __str__(self): return self.url - @classmethod - def get_page(cls, link, skip_archives=True, session=None): - if session is None: - raise TypeError( - "get_page() missing 1 required keyword argument: 'session'" - ) - - url = link.url - url = url.split('#', 1)[0] - - # Check for VCS schemes that do not support lookup as web pages. 
- from pip.vcs import VcsSupport - for scheme in VcsSupport.schemes: - if url.lower().startswith(scheme) and url[len(scheme)] in '+:': - logger.debug('Cannot look at %s URL %s', scheme, link) - return None - - try: - if skip_archives: - filename = link.filename - for bad_ext in ARCHIVE_EXTENSIONS: - if filename.endswith(bad_ext): - content_type = cls._get_content_type( - url, session=session, - ) - if content_type.lower().startswith('text/html'): - break - else: - logger.debug( - 'Skipping page %s because of Content-Type: %s', - link, - content_type, - ) - return - - logger.debug('Getting page %s', url) - - # Tack index.html onto file:// URLs that point to directories - (scheme, netloc, path, params, query, fragment) = \ - urllib_parse.urlparse(url) - if (scheme == 'file' and - os.path.isdir(urllib_request.url2pathname(path))): - # add trailing slash if not present so urljoin doesn't trim - # final segment - if not url.endswith('/'): - url += '/' - url = urllib_parse.urljoin(url, 'index.html') - logger.debug(' file: URL is directory, getting %s', url) - - resp = session.get( - url, - headers={ - "Accept": "text/html", - "Cache-Control": "max-age=600", - }, - ) - resp.raise_for_status() - - # The check for archives above only works if the url ends with - # something that looks like an archive. However that is not a - # requirement of an url. Unless we issue a HEAD request on every - # url we cannot know ahead of time for sure if something is HTML - # or not. However we can check after we've downloaded it. 
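The `_get_encoding_from_headers` helper factored out above pulls the `charset` parameter from a Content-Type header using `cgi.parse_header`. The same behavior can be sketched with the stdlib `email.message` API as a stand-in (the function name here is hypothetical; pip's actual code uses `cgi.parse_header`):

```python
from email.message import Message


def charset_from_headers(headers):
    """Return the charset declared in a Content-Type header, if any.

    Stand-in for pip's cgi.parse_header-based _get_encoding_from_headers.
    """
    if headers and "Content-Type" in headers:
        msg = Message()
        msg["Content-Type"] = headers["Content-Type"]
        # get_param parses "type/subtype; key=value" parameters for us.
        return msg.get_param("charset")
    return None


print(charset_from_headers({"Content-Type": "text/html; charset=utf-8"}))
# utf-8
```

The returned encoding is then handed to `html5lib.parse(..., transport_encoding=...)` in `iter_links`, so a page with a declared charset is decoded correctly before link extraction.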
- content_type = resp.headers.get('Content-Type', 'unknown') - if not content_type.lower().startswith("text/html"): - logger.debug( - 'Skipping page %s because of Content-Type: %s', - link, - content_type, - ) - return - - inst = cls(resp.content, resp.url, resp.headers) - except requests.HTTPError as exc: - cls._handle_fail(link, exc, url) - except SSLError as exc: - reason = ("There was a problem confirming the ssl certificate: " - "%s" % exc) - cls._handle_fail(link, reason, url, meth=logger.info) - except requests.ConnectionError as exc: - cls._handle_fail(link, "connection error: %s" % exc, url) - except requests.Timeout: - cls._handle_fail(link, "timed out", url) - else: - return inst - - @staticmethod - def _handle_fail(link, reason, url, meth=None): - if meth is None: - meth = logger.debug - - meth("Could not fetch URL %s: %s - skipping", link, reason) - - @staticmethod - def _get_content_type(url, session): - """Get the Content-Type of the given url, using a HEAD request""" - scheme, netloc, path, query, fragment = urllib_parse.urlsplit(url) - if scheme not in ('http', 'https'): - # FIXME: some warning or something? - # assertion error? 
- return '' - - resp = session.head(url, allow_redirects=True) - resp.raise_for_status() - - return resp.headers.get("Content-Type", "") - - @cached_property - def base_url(self): - bases = [ - x for x in self.parsed.findall(".//base") - if x.get("href") is not None - ] - if bases and bases[0].get("href"): - return bases[0].get("href") - else: - return self.url - - @property - def links(self): + def iter_links(self): """Yields all links in the page""" - for anchor in self.parsed.findall(".//a"): + document = html5lib.parse( + self.content, + transport_encoding=_get_encoding_from_headers(self.headers), + namespaceHTMLElements=False, + ) + base_url = _determine_base_url(document, self.url) + for anchor in document.findall(".//a"): if anchor.get("href"): href = anchor.get("href") - url = self.clean_link( - urllib_parse.urljoin(self.base_url, href) - ) + url = _clean_link(urllib_parse.urljoin(base_url, href)) pyrequire = anchor.get('data-requires-python') pyrequire = unescape(pyrequire) if pyrequire else None - yield Link(url, self, requires_python=pyrequire) - - _clean_re = re.compile(r'[^a-z0-9$&+,/:;=?@.#%_\\|-]', re.I) - - def clean_link(self, url): - """Makes sure a link is fully encoded. That is, if a ' ' shows up in - the link, it will be rewritten to %20 (while not over-quoting - % or other characters).""" - return self._clean_re.sub( - lambda match: '%%%2x' % ord(match.group(0)), url) - - -class Link(object): - - def __init__(self, url, comes_from=None, requires_python=None): - """ - Object representing a parsed link from https://pypi.python.org/simple/* - - url: - url of the resource pointed to (href of the link) - comes_from: - instance of HTMLPage where the link was found, or string. - requires_python: - String containing the `Requires-Python` metadata field, specified - in PEP 345. This may be specified by a data-requires-python - attribute in the HTML link tag, as described in PEP 503. 
- """ - - # url can be a UNC windows share - if url.startswith('\\\\'): - url = path_to_url(url) - - self.url = url - self.comes_from = comes_from - self.requires_python = requires_python if requires_python else None - - def __str__(self): - if self.requires_python: - rp = ' (requires-python:%s)' % self.requires_python - else: - rp = '' - if self.comes_from: - return '%s (from %s)%s' % (self.url, self.comes_from, rp) - else: - return str(self.url) - - def __repr__(self): - return '' % self - - def __eq__(self, other): - if not isinstance(other, Link): - return NotImplemented - return self.url == other.url - - def __ne__(self, other): - if not isinstance(other, Link): - return NotImplemented - return self.url != other.url - - def __lt__(self, other): - if not isinstance(other, Link): - return NotImplemented - return self.url < other.url - - def __le__(self, other): - if not isinstance(other, Link): - return NotImplemented - return self.url <= other.url - - def __gt__(self, other): - if not isinstance(other, Link): - return NotImplemented - return self.url > other.url - - def __ge__(self, other): - if not isinstance(other, Link): - return NotImplemented - return self.url >= other.url - - def __hash__(self): - return hash(self.url) - - @property - def filename(self): - _, netloc, path, _, _ = urllib_parse.urlsplit(self.url) - name = posixpath.basename(path.rstrip('/')) or netloc - name = urllib_parse.unquote(name) - assert name, ('URL %r produced no filename' % self.url) - return name - - @property - def scheme(self): - return urllib_parse.urlsplit(self.url)[0] - - @property - def netloc(self): - return urllib_parse.urlsplit(self.url)[1] - - @property - def path(self): - return urllib_parse.unquote(urllib_parse.urlsplit(self.url)[2]) - - def splitext(self): - return splitext(posixpath.basename(self.path.rstrip('/'))) - - @property - def ext(self): - return self.splitext()[1] - - @property - def url_without_fragment(self): - scheme, netloc, path, query, fragment = 
urllib_parse.urlsplit(self.url) - return urllib_parse.urlunsplit((scheme, netloc, path, query, None)) - - _egg_fragment_re = re.compile(r'[#&]egg=([^&]*)') - - @property - def egg_fragment(self): - match = self._egg_fragment_re.search(self.url) - if not match: - return None - return match.group(1) - - _subdirectory_fragment_re = re.compile(r'[#&]subdirectory=([^&]*)') - - @property - def subdirectory_fragment(self): - match = self._subdirectory_fragment_re.search(self.url) - if not match: - return None - return match.group(1) - - _hash_re = re.compile( - r'(sha1|sha224|sha384|sha256|sha512|md5)=([a-f0-9]+)' - ) - - @property - def hash(self): - match = self._hash_re.search(self.url) - if match: - return match.group(2) - return None - - @property - def hash_name(self): - match = self._hash_re.search(self.url) - if match: - return match.group(1) - return None - - @property - def show_url(self): - return posixpath.basename(self.url.split('#', 1)[0].split('?', 1)[0]) - - @property - def is_wheel(self): - return self.ext == wheel_ext - - @property - def is_artifact(self): - """ - Determines if this points to an actual artifact (e.g. a tarball) or if - it points to an "abstract" thing like a path or a VCS location. - """ - from pip.vcs import vcs - - if self.scheme in vcs.all_schemes: - return False - - return True - - -FormatControl = namedtuple('FormatControl', 'no_binary only_binary') -"""This object has two fields, no_binary and only_binary. - -If a field is falsy, it isn't set. If it is {':all:'}, it should match all -packages except those listed in the other field. Only one field can be set -to {':all:'} at a time. The rest of the time exact package name matches -are listed, with any given package only showing up in one field at a time. 
-""" - - -def fmt_ctl_handle_mutual_exclude(value, target, other): - new = value.split(',') - while ':all:' in new: - other.clear() - target.clear() - target.add(':all:') - del new[:new.index(':all:') + 1] - if ':none:' not in new: - # Without a none, we want to discard everything as :all: covers it - return - for name in new: - if name == ':none:': - target.clear() - continue - name = canonicalize_name(name) - other.discard(name) - target.add(name) - - -def fmt_ctl_formats(fmt_ctl, canonical_name): - result = set(["binary", "source"]) - if canonical_name in fmt_ctl.only_binary: - result.discard('source') - elif canonical_name in fmt_ctl.no_binary: - result.discard('binary') - elif ':all:' in fmt_ctl.only_binary: - result.discard('source') - elif ':all:' in fmt_ctl.no_binary: - result.discard('binary') - return frozenset(result) - - -def fmt_ctl_no_binary(fmt_ctl): - fmt_ctl_handle_mutual_exclude( - ':all:', fmt_ctl.no_binary, fmt_ctl.only_binary) - - -def fmt_ctl_no_use_wheel(fmt_ctl): - fmt_ctl_no_binary(fmt_ctl) - warnings.warn( - '--no-use-wheel is deprecated and will be removed in the future. 
' - ' Please use --no-binary :all: instead.', RemovedInPip10Warning, - stacklevel=2) + yield Link(url, self.url, requires_python=pyrequire) Search = namedtuple('Search', 'supplied canonical formats') diff --git a/lib/python3.4/site-packages/pip/locations.py b/lib/python3.7/site-packages/pip/_internal/locations.py similarity index 84% rename from lib/python3.4/site-packages/pip/locations.py rename to lib/python3.7/site-packages/pip/_internal/locations.py index e598ef1..183aaa3 100644 --- a/lib/python3.4/site-packages/pip/locations.py +++ b/lib/python3.7/site-packages/pip/_internal/locations.py @@ -3,15 +3,15 @@ from __future__ import absolute_import import os import os.path +import platform import site import sys +import sysconfig +from distutils import sysconfig as distutils_sysconfig +from distutils.command.install import SCHEME_KEYS # type: ignore -from distutils import sysconfig -from distutils.command.install import install, SCHEME_KEYS # noqa - -from pip.compat import WINDOWS, expanduser -from pip.utils import appdirs - +from pip._internal.utils import appdirs +from pip._internal.utils.compat import WINDOWS, expanduser # Application Directories USER_CACHE_DIR = appdirs.user_cache_dir("pip") @@ -80,8 +80,18 @@ src_prefix = os.path.abspath(src_prefix) # FIXME doesn't account for venv linked to global site-packages -site_packages = sysconfig.get_python_lib() -user_site = site.USER_SITE +site_packages = sysconfig.get_path("purelib") +# This is because of a bug in PyPy's sysconfig module, see +# https://bitbucket.org/pypy/pypy/issues/2506/sysconfig-returns-incorrect-paths +# for more information. +if platform.python_implementation().lower() == "pypy": + site_packages = distutils_sysconfig.get_python_lib() +try: + # Use getusersitepackages if this is present, as it ensures that the + # value is initialised properly. 
+ user_site = site.getusersitepackages() +except AttributeError: + user_site = site.USER_SITE user_dir = expanduser('~') if WINDOWS: bin_py = os.path.join(sys.prefix, 'Scripts') @@ -109,7 +119,6 @@ else: legacy_storage_dir, config_basename, ) - # Forcing to use /usr/local/bin for standard macOS framework installs # Also log to ~/Library/Logs/ for use with the Console.app log viewer if sys.platform[:6] == 'darwin' and sys.prefix[:16] == '/System/Library/': @@ -120,6 +129,9 @@ site_config_files = [ for path in appdirs.site_config_dirs('pip') ] +venv_config_file = os.path.join(sys.prefix, config_basename) +new_config_file = os.path.join(appdirs.user_config_dir("pip"), config_basename) + def distutils_scheme(dist_name, user=False, home=None, root=None, isolated=False, prefix=None): @@ -143,7 +155,7 @@ def distutils_scheme(dist_name, user=False, home=None, root=None, # NOTE: setting user or home has the side-effect of creating the home dir # or user base for installations during finalize_options() # ideally, we'd prefer a scheme class that has no side-effects. - assert not (user and prefix), "user={0} prefix={1}".format(user, prefix) + assert not (user and prefix), "user={} prefix={}".format(user, prefix) i.user = user or i.user if user: i.prefix = "" diff --git a/lib/python3.7/site-packages/pip/_internal/models/__init__.py b/lib/python3.7/site-packages/pip/_internal/models/__init__.py new file mode 100644 index 0000000..7855226 --- /dev/null +++ b/lib/python3.7/site-packages/pip/_internal/models/__init__.py @@ -0,0 +1,2 @@ +"""A package that contains models that represent entities. 
+""" diff --git a/lib/python3.7/site-packages/pip/_internal/models/candidate.py b/lib/python3.7/site-packages/pip/_internal/models/candidate.py new file mode 100644 index 0000000..c736de6 --- /dev/null +++ b/lib/python3.7/site-packages/pip/_internal/models/candidate.py @@ -0,0 +1,23 @@ +from pip._vendor.packaging.version import parse as parse_version + +from pip._internal.utils.models import KeyBasedCompareMixin + + +class InstallationCandidate(KeyBasedCompareMixin): + """Represents a potential "candidate" for installation. + """ + + def __init__(self, project, version, location): + self.project = project + self.version = parse_version(version) + self.location = location + + super(InstallationCandidate, self).__init__( + key=(self.project, self.version, self.location), + defining_class=InstallationCandidate + ) + + def __repr__(self): + return "".format( + self.project, self.version, self.location, + ) diff --git a/lib/python3.7/site-packages/pip/_internal/models/format_control.py b/lib/python3.7/site-packages/pip/_internal/models/format_control.py new file mode 100644 index 0000000..2748856 --- /dev/null +++ b/lib/python3.7/site-packages/pip/_internal/models/format_control.py @@ -0,0 +1,62 @@ +from pip._vendor.packaging.utils import canonicalize_name + + +class FormatControl(object): + """A helper class for controlling formats from which packages are installed. + If a field is falsy, it isn't set. If it is {':all:'}, it should match all + packages except those listed in the other field. Only one field can be set + to {':all:'} at a time. The rest of the time exact package name matches + are listed, with any given package only showing up in one field at a time. 
+ """ + def __init__(self, no_binary=None, only_binary=None): + self.no_binary = set() if no_binary is None else no_binary + self.only_binary = set() if only_binary is None else only_binary + + def __eq__(self, other): + return self.__dict__ == other.__dict__ + + def __ne__(self, other): + return not self.__eq__(other) + + def __repr__(self): + return "{}({}, {})".format( + self.__class__.__name__, + self.no_binary, + self.only_binary + ) + + @staticmethod + def handle_mutual_excludes(value, target, other): + new = value.split(',') + while ':all:' in new: + other.clear() + target.clear() + target.add(':all:') + del new[:new.index(':all:') + 1] + # Without a none, we want to discard everything as :all: covers it + if ':none:' not in new: + return + for name in new: + if name == ':none:': + target.clear() + continue + name = canonicalize_name(name) + other.discard(name) + target.add(name) + + def get_allowed_formats(self, canonical_name): + result = {"binary", "source"} + if canonical_name in self.only_binary: + result.discard('source') + elif canonical_name in self.no_binary: + result.discard('binary') + elif ':all:' in self.only_binary: + result.discard('source') + elif ':all:' in self.no_binary: + result.discard('binary') + return frozenset(result) + + def disallow_binaries(self): + self.handle_mutual_excludes( + ':all:', self.no_binary, self.only_binary, + ) diff --git a/lib/python3.7/site-packages/pip/_internal/models/index.py b/lib/python3.7/site-packages/pip/_internal/models/index.py new file mode 100644 index 0000000..870a315 --- /dev/null +++ b/lib/python3.7/site-packages/pip/_internal/models/index.py @@ -0,0 +1,29 @@ +from pip._vendor.six.moves.urllib import parse as urllib_parse + + +class PackageIndex(object): + """Represents a Package Index and provides easier access to endpoints + """ + + def __init__(self, url, file_storage_domain): + super(PackageIndex, self).__init__() + self.url = url + self.netloc = urllib_parse.urlsplit(url).netloc + 
self.simple_url = self._url_for_path('simple') + self.pypi_url = self._url_for_path('pypi') + + # This is part of a temporary hack used to block installs of PyPI + # packages which depend on external urls only necessary until PyPI can + # block such packages themselves + self.file_storage_domain = file_storage_domain + + def _url_for_path(self, path): + return urllib_parse.urljoin(self.url, path) + + +PyPI = PackageIndex( + 'https://pypi.org/', file_storage_domain='files.pythonhosted.org' +) +TestPyPI = PackageIndex( + 'https://test.pypi.org/', file_storage_domain='test-files.pythonhosted.org' +) diff --git a/lib/python3.7/site-packages/pip/_internal/models/link.py b/lib/python3.7/site-packages/pip/_internal/models/link.py new file mode 100644 index 0000000..5decb7c --- /dev/null +++ b/lib/python3.7/site-packages/pip/_internal/models/link.py @@ -0,0 +1,141 @@ +import posixpath +import re + +from pip._vendor.six.moves.urllib import parse as urllib_parse + +from pip._internal.download import path_to_url +from pip._internal.utils.misc import splitext +from pip._internal.utils.models import KeyBasedCompareMixin +from pip._internal.wheel import wheel_ext + + +class Link(KeyBasedCompareMixin): + """Represents a parsed link from a Package Index's simple URL + """ + + def __init__(self, url, comes_from=None, requires_python=None): + """ + url: + url of the resource pointed to (href of the link) + comes_from: + instance of HTMLPage where the link was found, or string. + requires_python: + String containing the `Requires-Python` metadata field, specified + in PEP 345. This may be specified by a data-requires-python + attribute in the HTML link tag, as described in PEP 503. 
+        """
+
+        # url can be a UNC windows share
+        if url.startswith('\\\\'):
+            url = path_to_url(url)
+
+        self.url = url
+        self.comes_from = comes_from
+        self.requires_python = requires_python if requires_python else None
+
+        super(Link, self).__init__(
+            key=(self.url),
+            defining_class=Link
+        )
+
+    def __str__(self):
+        if self.requires_python:
+            rp = ' (requires-python:%s)' % self.requires_python
+        else:
+            rp = ''
+        if self.comes_from:
+            return '%s (from %s)%s' % (self.url, self.comes_from, rp)
+        else:
+            return str(self.url)
+
+    def __repr__(self):
+        return '<Link %s>' % self
+
+    @property
+    def filename(self):
+        _, netloc, path, _, _ = urllib_parse.urlsplit(self.url)
+        name = posixpath.basename(path.rstrip('/')) or netloc
+        name = urllib_parse.unquote(name)
+        assert name, ('URL %r produced no filename' % self.url)
+        return name
+
+    @property
+    def scheme(self):
+        return urllib_parse.urlsplit(self.url)[0]
+
+    @property
+    def netloc(self):
+        return urllib_parse.urlsplit(self.url)[1]
+
+    @property
+    def path(self):
+        return urllib_parse.unquote(urllib_parse.urlsplit(self.url)[2])
+
+    def splitext(self):
+        return splitext(posixpath.basename(self.path.rstrip('/')))
+
+    @property
+    def ext(self):
+        return self.splitext()[1]
+
+    @property
+    def url_without_fragment(self):
+        scheme, netloc, path, query, fragment = urllib_parse.urlsplit(self.url)
+        return urllib_parse.urlunsplit((scheme, netloc, path, query, None))
+
+    _egg_fragment_re = re.compile(r'[#&]egg=([^&]*)')
+
+    @property
+    def egg_fragment(self):
+        match = self._egg_fragment_re.search(self.url)
+        if not match:
+            return None
+        return match.group(1)
+
+    _subdirectory_fragment_re = re.compile(r'[#&]subdirectory=([^&]*)')
+
+    @property
+    def subdirectory_fragment(self):
+        match = self._subdirectory_fragment_re.search(self.url)
+        if not match:
+            return None
+        return match.group(1)
+
+    _hash_re = re.compile(
+        r'(sha1|sha224|sha384|sha256|sha512|md5)=([a-f0-9]+)'
+    )
+
+    @property
+    def hash(self):
+        match = 
self._hash_re.search(self.url) + if match: + return match.group(2) + return None + + @property + def hash_name(self): + match = self._hash_re.search(self.url) + if match: + return match.group(1) + return None + + @property + def show_url(self): + return posixpath.basename(self.url.split('#', 1)[0].split('?', 1)[0]) + + @property + def is_wheel(self): + return self.ext == wheel_ext + + @property + def is_artifact(self): + """ + Determines if this points to an actual artifact (e.g. a tarball) or if + it points to an "abstract" thing like a path or a VCS location. + """ + from pip._internal.vcs import vcs + + if self.scheme in vcs.all_schemes: + return False + + return True diff --git a/lib/python3.4/site-packages/pip/operations/__init__.py b/lib/python3.7/site-packages/pip/_internal/operations/__init__.py similarity index 100% rename from lib/python3.4/site-packages/pip/operations/__init__.py rename to lib/python3.7/site-packages/pip/_internal/operations/__init__.py diff --git a/lib/python3.7/site-packages/pip/_internal/operations/check.py b/lib/python3.7/site-packages/pip/_internal/operations/check.py new file mode 100644 index 0000000..799257a --- /dev/null +++ b/lib/python3.7/site-packages/pip/_internal/operations/check.py @@ -0,0 +1,148 @@ +"""Validation of dependencies of packages +""" + +from collections import namedtuple + +from pip._vendor.packaging.utils import canonicalize_name + +from pip._internal.operations.prepare import make_abstract_dist +from pip._internal.utils.misc import get_installed_distributions +from pip._internal.utils.typing import MYPY_CHECK_RUNNING + +if MYPY_CHECK_RUNNING: + from pip._internal.req.req_install import InstallRequirement # noqa: F401 + from typing import ( # noqa: F401 + Any, Callable, Dict, Iterator, Optional, Set, Tuple, List + ) + + # Shorthands + PackageSet = Dict[str, 'PackageDetails'] + Missing = Tuple[str, Any] + Conflicting = Tuple[str, str, Any] + + MissingDict = Dict[str, List[Missing]] + ConflictingDict = 
Dict[str, List[Conflicting]] + CheckResult = Tuple[MissingDict, ConflictingDict] + +PackageDetails = namedtuple('PackageDetails', ['version', 'requires']) + + +def create_package_set_from_installed(**kwargs): + # type: (**Any) -> PackageSet + """Converts a list of distributions into a PackageSet. + """ + # Default to using all packages installed on the system + if kwargs == {}: + kwargs = {"local_only": False, "skip": ()} + + package_set = {} + for dist in get_installed_distributions(**kwargs): + name = canonicalize_name(dist.project_name) + package_set[name] = PackageDetails(dist.version, dist.requires()) + return package_set + + +def check_package_set(package_set, should_ignore=None): + # type: (PackageSet, Optional[Callable[[str], bool]]) -> CheckResult + """Check if a package set is consistent + + If should_ignore is passed, it should be a callable that takes a + package name and returns a boolean. + """ + if should_ignore is None: + def should_ignore(name): + return False + + missing = dict() + conflicting = dict() + + for package_name in package_set: + # Info about dependencies of package_name + missing_deps = set() # type: Set[Missing] + conflicting_deps = set() # type: Set[Conflicting] + + if should_ignore(package_name): + continue + + for req in package_set[package_name].requires: + name = canonicalize_name(req.project_name) # type: str + + # Check if it's missing + if name not in package_set: + missed = True + if req.marker is not None: + missed = req.marker.evaluate() + if missed: + missing_deps.add((name, req)) + continue + + # Check if there's a conflict + version = package_set[name].version # type: str + if not req.specifier.contains(version, prereleases=True): + conflicting_deps.add((name, version, req)) + + if missing_deps: + missing[package_name] = sorted(missing_deps, key=str) + if conflicting_deps: + conflicting[package_name] = sorted(conflicting_deps, key=str) + + return missing, conflicting + + +def check_install_conflicts(to_install): + # 
type: (List[InstallRequirement]) -> Tuple[PackageSet, CheckResult] + """For checking if the dependency graph would be consistent after \ + installing given requirements + """ + # Start from the current state + package_set = create_package_set_from_installed() + # Install packages + would_be_installed = _simulate_installation_of(to_install, package_set) + + # Only warn about directly-dependent packages; create a whitelist of them + whitelist = _create_whitelist(would_be_installed, package_set) + + return ( + package_set, + check_package_set( + package_set, should_ignore=lambda name: name not in whitelist + ) + ) + + +# NOTE from @pradyunsg +# This required a minor update in dependency link handling logic over at +# operations.prepare.IsSDist.dist() to get it working +def _simulate_installation_of(to_install, package_set): + # type: (List[InstallRequirement], PackageSet) -> Set[str] + """Computes the version of packages after installing to_install. + """ + + # Keep track of packages that were installed + installed = set() + + # Modify it as installing requirement_set would (assuming no errors) + for inst_req in to_install: + dist = make_abstract_dist(inst_req).dist(finder=None) + name = canonicalize_name(dist.key) + package_set[name] = PackageDetails(dist.version, dist.requires()) + + installed.add(name) + + return installed + + +def _create_whitelist(would_be_installed, package_set): + # type: (Set[str], PackageSet) -> Set[str] + packages_affected = set(would_be_installed) + + for package_name in package_set: + if package_name in packages_affected: + continue + + for req in package_set[package_name].requires: + if canonicalize_name(req.name) in packages_affected: + packages_affected.add(package_name) + break + + return packages_affected diff --git a/lib/python3.7/site-packages/pip/_internal/operations/freeze.py b/lib/python3.7/site-packages/pip/_internal/operations/freeze.py new file mode 100644 index 0000000..beb2feb --- /dev/null +++ 
b/lib/python3.7/site-packages/pip/_internal/operations/freeze.py @@ -0,0 +1,264 @@ +from __future__ import absolute_import + +import collections +import logging +import os +import re + +from pip._vendor import pkg_resources, six +from pip._vendor.packaging.utils import canonicalize_name +from pip._vendor.pkg_resources import RequirementParseError + +from pip._internal.exceptions import InstallationError +from pip._internal.req.constructors import ( + install_req_from_editable, install_req_from_line, +) +from pip._internal.req.req_file import COMMENT_RE +from pip._internal.utils.deprecation import deprecated +from pip._internal.utils.misc import ( + dist_is_editable, get_installed_distributions, make_vcs_requirement_url, +) + +logger = logging.getLogger(__name__) + + +def freeze( + requirement=None, + find_links=None, local_only=None, user_only=None, skip_regex=None, + isolated=False, + wheel_cache=None, + exclude_editable=False, + skip=()): + find_links = find_links or [] + skip_match = None + + if skip_regex: + skip_match = re.compile(skip_regex).search + + dependency_links = [] + + for dist in pkg_resources.working_set: + if dist.has_metadata('dependency_links.txt'): + dependency_links.extend( + dist.get_metadata_lines('dependency_links.txt') + ) + for link in find_links: + if '#egg=' in link: + dependency_links.append(link) + for link in find_links: + yield '-f %s' % link + installations = {} + for dist in get_installed_distributions(local_only=local_only, + skip=(), + user_only=user_only): + try: + req = FrozenRequirement.from_dist( + dist, + dependency_links + ) + except RequirementParseError: + logger.warning( + "Could not parse requirement: %s", + dist.project_name + ) + continue + if exclude_editable and req.editable: + continue + installations[req.name] = req + + if requirement: + # the options that don't get turned into an InstallRequirement + # should only be emitted once, even if the same option is in multiple + # requirements files, so we need to keep 
track of what has been emitted + # so that we don't emit it again if it's seen again + emitted_options = set() + # keep track of which files a requirement is in so that we can + # give an accurate warning if a requirement appears multiple times. + req_files = collections.defaultdict(list) + for req_file_path in requirement: + with open(req_file_path) as req_file: + for line in req_file: + if (not line.strip() or + line.strip().startswith('#') or + (skip_match and skip_match(line)) or + line.startswith(( + '-r', '--requirement', + '-Z', '--always-unzip', + '-f', '--find-links', + '-i', '--index-url', + '--pre', + '--trusted-host', + '--process-dependency-links', + '--extra-index-url'))): + line = line.rstrip() + if line not in emitted_options: + emitted_options.add(line) + yield line + continue + + if line.startswith('-e') or line.startswith('--editable'): + if line.startswith('-e'): + line = line[2:].strip() + else: + line = line[len('--editable'):].strip().lstrip('=') + line_req = install_req_from_editable( + line, + isolated=isolated, + wheel_cache=wheel_cache, + ) + else: + line_req = install_req_from_line( + COMMENT_RE.sub('', line).strip(), + isolated=isolated, + wheel_cache=wheel_cache, + ) + + if not line_req.name: + logger.info( + "Skipping line in requirement file [%s] because " + "it's not clear what it would install: %s", + req_file_path, line.strip(), + ) + logger.info( + " (add #egg=PackageName to the URL to avoid" + " this warning)" + ) + elif line_req.name not in installations: + # either it's not installed, or it is installed + # but has been processed already + if not req_files[line_req.name]: + logger.warning( + "Requirement file [%s] contains %s, but that " + "package is not installed", + req_file_path, + COMMENT_RE.sub('', line).strip(), + ) + else: + req_files[line_req.name].append(req_file_path) + else: + yield str(installations[line_req.name]).rstrip() + del installations[line_req.name] + req_files[line_req.name].append(req_file_path) + + # 
Warn about requirements that were included multiple times (in a + # single requirements file or in different requirements files). + for name, files in six.iteritems(req_files): + if len(files) > 1: + logger.warning("Requirement %s included multiple times [%s]", + name, ', '.join(sorted(set(files)))) + + yield( + '## The following requirements were added by ' + 'pip freeze:' + ) + for installation in sorted( + installations.values(), key=lambda x: x.name.lower()): + if canonicalize_name(installation.name) not in skip: + yield str(installation).rstrip() + + +class FrozenRequirement(object): + def __init__(self, name, req, editable, comments=()): + self.name = name + self.req = req + self.editable = editable + self.comments = comments + + _rev_re = re.compile(r'-r(\d+)$') + _date_re = re.compile(r'-(20\d\d\d\d\d\d)$') + + @classmethod + def _init_args_from_dist(cls, dist, dependency_links): + """ + Compute and return arguments (req, editable, comments) to pass to + FrozenRequirement.__init__(). + + This method is for use in FrozenRequirement.from_dist(). + """ + location = os.path.normcase(os.path.abspath(dist.location)) + comments = [] + from pip._internal.vcs import vcs, get_src_requirement + if dist_is_editable(dist) and vcs.get_backend_name(location): + editable = True + try: + req = get_src_requirement(dist, location) + except InstallationError as exc: + logger.warning( + "Error when trying to get requirement for VCS system %s, " + "falling back to uneditable format", exc + ) + req = None + if req is None: + logger.warning( + 'Could not determine repository location of %s', location + ) + comments.append( + '## !! 
Could not determine repository location' + ) + req = dist.as_requirement() + editable = False + else: + editable = False + req = dist.as_requirement() + specs = req.specs + assert len(specs) == 1 and specs[0][0] in ["==", "==="], \ + 'Expected 1 spec with == or ===; specs = %r; dist = %r' % \ + (specs, dist) + version = specs[0][1] + ver_match = cls._rev_re.search(version) + date_match = cls._date_re.search(version) + if ver_match or date_match: + svn_backend = vcs.get_backend('svn') + if svn_backend: + svn_location = svn_backend().get_location( + dist, + dependency_links, + ) + if not svn_location: + logger.warning( + 'Warning: cannot find svn location for %s', req, + ) + comments.append( + '## FIXME: could not find svn URL in dependency_links ' + 'for this package:' + ) + else: + deprecated( + "SVN editable detection based on dependency links " + "will be dropped in the future.", + replacement=None, + gone_in="18.2", + issue=4187, + ) + comments.append( + '# Installing as editable to satisfy requirement %s:' % + req + ) + if ver_match: + rev = ver_match.group(1) + else: + rev = '{%s}' % date_match.group(1) + editable = True + egg_name = cls.egg_name(dist) + req = make_vcs_requirement_url(svn_location, rev, egg_name) + + return (req, editable, comments) + + @classmethod + def from_dist(cls, dist, dependency_links): + args = cls._init_args_from_dist(dist, dependency_links) + return cls(dist.project_name, *args) + + @staticmethod + def egg_name(dist): + name = dist.egg_name() + match = re.search(r'-py\d\.\d$', name) + if match: + name = name[:match.start()] + return name + + def __str__(self): + req = self.req + if self.editable: + req = '-e %s' % req + return '\n'.join(list(self.comments) + [str(req)]) + '\n' diff --git a/lib/python3.7/site-packages/pip/_internal/operations/prepare.py b/lib/python3.7/site-packages/pip/_internal/operations/prepare.py new file mode 100644 index 0000000..104bea3 --- /dev/null +++ 
b/lib/python3.7/site-packages/pip/_internal/operations/prepare.py @@ -0,0 +1,355 @@ +"""Prepares a distribution for installation +""" + +import logging +import os + +from pip._vendor import pkg_resources, requests + +from pip._internal.build_env import BuildEnvironment +from pip._internal.download import ( + is_dir_url, is_file_url, is_vcs_url, unpack_url, url_to_path, +) +from pip._internal.exceptions import ( + DirectoryUrlHashUnsupported, HashUnpinned, InstallationError, + PreviousBuildDirError, VcsHashUnsupported, +) +from pip._internal.utils.compat import expanduser +from pip._internal.utils.hashes import MissingHashes +from pip._internal.utils.logging import indent_log +from pip._internal.utils.misc import display_path, normalize_path +from pip._internal.vcs import vcs + +logger = logging.getLogger(__name__) + + +def make_abstract_dist(req): + """Factory to make an abstract dist object. + + Preconditions: Either an editable req with a source_dir, or satisfied_by or + a wheel link, or a non-editable req with a source_dir. + + :return: A concrete DistAbstraction. + """ + if req.editable: + return IsSDist(req) + elif req.link and req.link.is_wheel: + return IsWheel(req) + else: + return IsSDist(req) + + +class DistAbstraction(object): + """Abstracts out the wheel vs non-wheel Resolver.resolve() logic. + + The requirements for anything installable are as follows: + - we must be able to determine the requirement name + (or we can't correctly handle the non-upgrade case). + - we must be able to generate a list of run-time dependencies + without installing any additional packages (or we would + have to either burn time by doing temporary isolated installs + or alternatively violate pips 'don't start installing unless + all requirements are available' rule - neither of which are + desirable). 
+ - for packages with setup requirements, we must also be able + to determine their requirements without installing additional + packages (for the same reason as run-time dependencies) + - we must be able to create a Distribution object exposing the + above metadata. + """ + + def __init__(self, req): + self.req = req + + def dist(self, finder): + """Return a setuptools Dist object.""" + raise NotImplementedError(self.dist) + + def prep_for_dist(self, finder, build_isolation): + """Ensure that we can get a Dist for this requirement.""" + raise NotImplementedError(self.dist) + + +class IsWheel(DistAbstraction): + + def dist(self, finder): + return list(pkg_resources.find_distributions( + self.req.source_dir))[0] + + def prep_for_dist(self, finder, build_isolation): + # FIXME:https://github.com/pypa/pip/issues/1112 + pass + + +class IsSDist(DistAbstraction): + + def dist(self, finder): + dist = self.req.get_dist() + # FIXME: shouldn't be globally added. + if finder and dist.has_metadata('dependency_links.txt'): + finder.add_dependency_links( + dist.get_metadata_lines('dependency_links.txt') + ) + return dist + + def prep_for_dist(self, finder, build_isolation): + # Prepare for building. We need to: + # 1. Load pyproject.toml (if it exists) + # 2. Set up the build environment + + self.req.load_pyproject_toml() + should_isolate = self.req.use_pep517 and build_isolation + + if should_isolate: + # Isolate in a BuildEnvironment and install the build-time + # requirements. 
+ self.req.build_env = BuildEnvironment() + self.req.build_env.install_requirements( + finder, self.req.pyproject_requires, + "Installing build dependencies" + ) + missing = [] + if self.req.requirements_to_check: + check = self.req.requirements_to_check + missing = self.req.build_env.missing_requirements(check) + if missing: + logger.warning( + "Missing build requirements in pyproject.toml for %s.", + self.req, + ) + logger.warning( + "The project does not specify a build backend, and pip " + "cannot fall back to setuptools without %s.", + " and ".join(map(repr, sorted(missing))) + ) + + self.req.run_egg_info() + self.req.assert_source_matches_version() + + +class Installed(DistAbstraction): + + def dist(self, finder): + return self.req.satisfied_by + + def prep_for_dist(self, finder, build_isolation): + pass + + +class RequirementPreparer(object): + """Prepares a Requirement + """ + + def __init__(self, build_dir, download_dir, src_dir, wheel_download_dir, + progress_bar, build_isolation, req_tracker): + super(RequirementPreparer, self).__init__() + + self.src_dir = src_dir + self.build_dir = build_dir + self.req_tracker = req_tracker + + # Where still packed archives should be written to. If None, they are + # not saved, and are deleted immediately after unpacking. + self.download_dir = download_dir + + # Where still-packed .whl files should be written to. If None, they are + # written to the download_dir parameter. Separate to download_dir to + # permit only keeping wheel archives for pip wheel. + if wheel_download_dir: + wheel_download_dir = normalize_path(wheel_download_dir) + self.wheel_download_dir = wheel_download_dir + + # NOTE + # download_dir and wheel_download_dir overlap semantically and may + # be combined if we're willing to have non-wheel archives present in + # the wheelhouse output by 'pip wheel'. + + self.progress_bar = progress_bar + + # Is build isolation allowed? 
+ self.build_isolation = build_isolation + + @property + def _download_should_save(self): + # TODO: Modify to reduce indentation needed + if self.download_dir: + self.download_dir = expanduser(self.download_dir) + if os.path.exists(self.download_dir): + return True + else: + logger.critical('Could not find download directory') + raise InstallationError( + "Could not find or access download directory '%s'" + % display_path(self.download_dir)) + return False + + def prepare_linked_requirement(self, req, session, finder, + upgrade_allowed, require_hashes): + """Prepare a requirement that would be obtained from req.link + """ + # TODO: Breakup into smaller functions + if req.link and req.link.scheme == 'file': + path = url_to_path(req.link.url) + logger.info('Processing %s', display_path(path)) + else: + logger.info('Collecting %s', req) + + with indent_log(): + # @@ if filesystem packages are not marked + # editable in a req, a non deterministic error + # occurs when the script attempts to unpack the + # build directory + req.ensure_has_source_dir(self.build_dir) + # If a checkout exists, it's unwise to keep going. version + # inconsistencies are logged later, but do not fail the + # installation. + # FIXME: this won't upgrade when there's an existing + # package unpacked in `req.source_dir` + # package unpacked in `req.source_dir` + if os.path.exists(os.path.join(req.source_dir, 'setup.py')): + raise PreviousBuildDirError( + "pip can't proceed with requirements '%s' due to a" + " pre-existing build directory (%s). This is " + "likely due to a previous installation that failed" + ". pip is being responsible and not assuming it " + "can delete this. Please delete it and try again." + % (req, req.source_dir) + ) + req.populate_link(finder, upgrade_allowed, require_hashes) + + # We can't hit this spot and have populate_link return None. + # req.satisfied_by is None here (because we're + # guarded) and upgrade has no impact except when satisfied_by + # is not None. 
+ # Then inside find_requirement existing_applicable -> False + # If no new versions are found, DistributionNotFound is raised, + # otherwise a result is guaranteed. + assert req.link + link = req.link + + # Now that we have the real link, we can tell what kind of + # requirements we have and raise some more informative errors + # than otherwise. (For example, we can raise VcsHashUnsupported + # for a VCS URL rather than HashMissing.) + if require_hashes: + # We could check these first 2 conditions inside + # unpack_url and save repetition of conditions, but then + # we would report less-useful error messages for + # unhashable requirements, complaining that there's no + # hash provided. + if is_vcs_url(link): + raise VcsHashUnsupported() + elif is_file_url(link) and is_dir_url(link): + raise DirectoryUrlHashUnsupported() + if not req.original_link and not req.is_pinned: + # Unpinned packages are asking for trouble when a new + # version is uploaded. This isn't a security check, but + # it saves users a surprising hash mismatch in the + # future. + # + # file:/// URLs aren't pinnable, so don't complain + # about them not being pinned. + raise HashUnpinned() + + hashes = req.hashes(trust_internet=not require_hashes) + if require_hashes and not hashes: + # Known-good hashes are missing for this requirement, so + # shim it with a facade object that will provoke hash + # computation and then raise a HashMissing exception + # showing the user what the hash should be. + hashes = MissingHashes() + + try: + download_dir = self.download_dir + # We always delete unpacked sdists after pip ran. + autodelete_unpacked = True + if req.link.is_wheel and self.wheel_download_dir: + # when doing 'pip wheel` we download wheels to a + # dedicated dir. + download_dir = self.wheel_download_dir + if req.link.is_wheel: + if download_dir: + # When downloading, we only unpack wheels to get + # metadata. 
+ autodelete_unpacked = True + else: + # When installing a wheel, we use the unpacked + # wheel. + autodelete_unpacked = False + unpack_url( + req.link, req.source_dir, + download_dir, autodelete_unpacked, + session=session, hashes=hashes, + progress_bar=self.progress_bar + ) + except requests.HTTPError as exc: + logger.critical( + 'Could not install requirement %s because of error %s', + req, + exc, + ) + raise InstallationError( + 'Could not install requirement %s because of HTTP ' + 'error %s for URL %s' % + (req, exc, req.link) + ) + abstract_dist = make_abstract_dist(req) + with self.req_tracker.track(req): + abstract_dist.prep_for_dist(finder, self.build_isolation) + if self._download_should_save: + # Make a .zip of the source_dir we already created. + if req.link.scheme in vcs.all_schemes: + req.archive(self.download_dir) + return abstract_dist + + def prepare_editable_requirement(self, req, require_hashes, use_user_site, + finder): + """Prepare an editable requirement + """ + assert req.editable, "cannot prepare a non-editable req as editable" + + logger.info('Obtaining %s', req) + + with indent_log(): + if require_hashes: + raise InstallationError( + 'The editable requirement %s cannot be installed when ' + 'requiring hashes, because there is no single file to ' + 'hash.' 
% req + ) + req.ensure_has_source_dir(self.src_dir) + req.update_editable(not self._download_should_save) + + abstract_dist = make_abstract_dist(req) + with self.req_tracker.track(req): + abstract_dist.prep_for_dist(finder, self.build_isolation) + + if self._download_should_save: + req.archive(self.download_dir) + req.check_if_exists(use_user_site) + + return abstract_dist + + def prepare_installed_requirement(self, req, require_hashes, skip_reason): + """Prepare an already-installed requirement + """ + assert req.satisfied_by, "req should have been satisfied but isn't" + assert skip_reason is not None, ( + "did not get skip reason skipped but req.satisfied_by " + "is set to %r" % (req.satisfied_by,) + ) + logger.info( + 'Requirement %s: %s (%s)', + skip_reason, req, req.satisfied_by.version + ) + with indent_log(): + if require_hashes: + logger.debug( + 'Since it is already installed, we are trusting this ' + 'package without checking its hash. To ensure a ' + 'completely repeatable environment, install into an ' + 'empty virtualenv.' 
+ ) + abstract_dist = Installed(req) + + return abstract_dist diff --git a/lib/python3.4/site-packages/pip/pep425tags.py b/lib/python3.7/site-packages/pip/_internal/pep425tags.py similarity index 91% rename from lib/python3.4/site-packages/pip/pep425tags.py rename to lib/python3.7/site-packages/pip/_internal/pep425tags.py index ad202ef..ab1a029 100644 --- a/lib/python3.4/site-packages/pip/pep425tags.py +++ b/lib/python3.7/site-packages/pip/_internal/pep425tags.py @@ -1,21 +1,17 @@ """Generate and work with PEP 425 Compatibility Tags.""" from __future__ import absolute_import +import distutils.util +import logging +import platform import re import sys +import sysconfig import warnings -import platform -import logging +from collections import OrderedDict -try: - import sysconfig -except ImportError: # pragma nocover - # Python < 2.7 - import distutils.sysconfig as sysconfig -import distutils.util - -from pip.compat import OrderedDict -import pip.utils.glibc +import pip._internal.utils.glibc +from pip._internal.utils.compat import get_extension_suffixes logger = logging.getLogger(__name__) @@ -26,7 +22,7 @@ def get_config_var(var): try: return sysconfig.get_config_var(var) except IOError as e: # Issue #1074 - warnings.warn("{0}".format(e), RuntimeWarning) + warnings.warn("{}".format(e), RuntimeWarning) return None @@ -66,7 +62,7 @@ def get_impl_tag(): """ Returns the Tag for this specific implementation. 
""" - return "{0}{1}".format(get_abbr_impl(), get_impl_ver()) + return "{}{}".format(get_abbr_impl(), get_impl_ver()) def get_flag(var, fallback, expected=True, warn=True): @@ -86,7 +82,7 @@ def get_abi_tag(): (CPython 2, PyPy).""" soabi = get_config_var('SOABI') impl = get_abbr_impl() - if not soabi and impl in ('cp', 'pp') and hasattr(sys, 'maxunicode'): + if not soabi and impl in {'cp', 'pp'} and hasattr(sys, 'maxunicode'): d = '' m = '' u = '' @@ -133,7 +129,7 @@ def get_platform(): elif machine == "ppc64" and _is_running_32bit(): machine = "ppc" - return 'macosx_{0}_{1}_{2}'.format(split_ver[0], split_ver[1], machine) + return 'macosx_{}_{}_{}'.format(split_ver[0], split_ver[1], machine) # XXX remove distutils dependency result = distutils.util.get_platform().replace('.', '_').replace('-', '_') @@ -147,7 +143,7 @@ def get_platform(): def is_manylinux1_compatible(): # Only Linux, and only x86-64 / i686 - if get_platform() not in ("linux_x86_64", "linux_i686"): + if get_platform() not in {"linux_x86_64", "linux_i686"}: return False # Check for presence of _manylinux module @@ -159,7 +155,7 @@ def is_manylinux1_compatible(): pass # Check glibc version. CentOS 5 uses glibc 2.5. 
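For context on the `have_compatible_glibc(2, 5)` call in the hunk above: pip's glibc helper reads the runtime glibc version string (via ctypes) and then does a major/minor comparison against the manylinux1 baseline. A rough standalone sketch of that comparison, with a helper name of our own choosing, looks like this:

```python
def glibc_version_ok(version_str, required_major, minimum_minor):
    # Rough sketch of the comparison pip's glibc helper performs: the
    # major version must match exactly and the minor version must be at
    # least the required minimum (manylinux1 requires glibc >= 2.5,
    # the version CentOS 5 ships).
    parts = version_str.split('.')
    try:
        major, minor = int(parts[0]), int(parts[1])
    except (IndexError, ValueError):
        # Malformed or non-glibc version string: treat as incompatible.
        return False
    return major == required_major and minor >= minimum_minor
```

Note that under this rule a hypothetical glibc 3.x would not count as compatible with a 2.x requirement, since the major version must match exactly.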
- return pip.utils.glibc.have_compatible_glibc(2, 5) + return pip._internal.utils.glibc.have_compatible_glibc(2, 5) def get_darwin_arches(major, minor, machine): @@ -257,10 +253,9 @@ def get_supported(versions=None, noarch=False, platform=None, abis[0:0] = [abi] abi3s = set() - import imp - for suffix in imp.get_suffixes(): - if suffix[0].startswith('.abi'): - abi3s.add(suffix[0].split('.', 2)[1]) + for suffix in get_extension_suffixes(): + if suffix.startswith('.abi'): + abi3s.add(suffix.split('.', 2)[1]) abis.extend(sorted(list(abi3s))) @@ -273,7 +268,7 @@ def get_supported(versions=None, noarch=False, platform=None, match = _osx_arch_pat.match(arch) if match: name, major, minor, actual_arch = match.groups() - tpl = '{0}_{1}_%i_%s'.format(name, major) + tpl = '{}_{}_%i_%s'.format(name, major) arches = [] for m in reversed(range(int(minor) + 1)): for a in get_darwin_arches(int(major), m, actual_arch): @@ -294,7 +289,7 @@ def get_supported(versions=None, noarch=False, platform=None, # abi3 modules compatible with older version of Python for version in versions[1:]: # abi3 was introduced in Python 3.2 - if version in ('31', '30'): + if version in {'31', '30'}: break for abi in abi3s: # empty set if not Python 3 for arch in arches: @@ -318,7 +313,5 @@ def get_supported(versions=None, noarch=False, platform=None, return supported -supported_tags = get_supported() -supported_tags_noarch = get_supported(noarch=True) implementation_tag = get_impl_tag() diff --git a/lib/python3.7/site-packages/pip/_internal/pyproject.py b/lib/python3.7/site-packages/pip/_internal/pyproject.py new file mode 100644 index 0000000..f938a76 --- /dev/null +++ b/lib/python3.7/site-packages/pip/_internal/pyproject.py @@ -0,0 +1,144 @@ +from __future__ import absolute_import + +import io +import os + +from pip._vendor import pytoml, six + +from pip._internal.exceptions import InstallationError + + +def _is_list_of_str(obj): + return ( + isinstance(obj, list) and + all(isinstance(item, 
six.string_types) for item in obj) + ) + + +def load_pyproject_toml(use_pep517, pyproject_toml, setup_py, req_name): + """Load the pyproject.toml file. + + Parameters: + use_pep517 - Has the user requested PEP 517 processing? None + means the user hasn't explicitly specified. + pyproject_toml - Location of the project's pyproject.toml file + setup_py - Location of the project's setup.py file + req_name - The name of the requirement we're processing (for + error reporting) + + Returns: + None if we should use the legacy code path, otherwise a tuple + ( + requirements from pyproject.toml, + name of PEP 517 backend, + requirements we should check are installed after setting + up the build environment + ) + """ + has_pyproject = os.path.isfile(pyproject_toml) + has_setup = os.path.isfile(setup_py) + + if has_pyproject: + with io.open(pyproject_toml, encoding="utf-8") as f: + pp_toml = pytoml.load(f) + build_system = pp_toml.get("build-system") + else: + build_system = None + + # The following cases must use PEP 517 + # We check for use_pep517 equalling False because that + # means the user explicitly requested --no-use-pep517 + if has_pyproject and not has_setup: + if use_pep517 is False: + raise InstallationError( + "Disabling PEP 517 processing is invalid: " + "project does not have a setup.py" + ) + use_pep517 = True + elif build_system and "build-backend" in build_system: + if use_pep517 is False: + raise InstallationError( + "Disabling PEP 517 processing is invalid: " + "project specifies a build backend of {} " + "in pyproject.toml".format( + build_system["build-backend"] + ) + ) + use_pep517 = True + + # If we haven't worked out whether to use PEP 517 yet, + # and the user hasn't explicitly stated a preference, + # we do so if the project has a pyproject.toml file. + elif use_pep517 is None: + use_pep517 = has_pyproject + + # At this point, we know whether we're going to use PEP 517. 
+ assert use_pep517 is not None + + # If we're using the legacy code path, there is nothing further + # for us to do here. + if not use_pep517: + return None + + if build_system is None: + # Either the user has a pyproject.toml with no build-system + # section, or the user has no pyproject.toml, but has opted in + # explicitly via --use-pep517. + # In the absence of any explicit backend specification, we + # assume the setuptools backend, and require wheel and a version + # of setuptools that supports that backend. + build_system = { + "requires": ["setuptools>=38.2.5", "wheel"], + "build-backend": "setuptools.build_meta", + } + + # If we're using PEP 517, we have build system information (either + # from pyproject.toml, or defaulted by the code above). + # Note that at this point, we do not know if the user has actually + # specified a backend, though. + assert build_system is not None + + # Ensure that the build-system section in pyproject.toml conforms + # to PEP 518. + error_template = ( + "{package} has a pyproject.toml file that does not comply " + "with PEP 518: {reason}" + ) + + # Specifying the build-system table but not the requires key is invalid + if "requires" not in build_system: + raise InstallationError( + error_template.format(package=req_name, reason=( + "it has a 'build-system' table but not " + "'build-system.requires' which is mandatory in the table" + )) + ) + + # Error out if requires is not a list of strings + requires = build_system["requires"] + if not _is_list_of_str(requires): + raise InstallationError(error_template.format( + package=req_name, + reason="'build-system.requires' is not a list of strings.", + )) + + backend = build_system.get("build-backend") + check = [] + if backend is None: + # If the user didn't specify a backend, we assume they want to use + # the setuptools backend. 
But we can't be sure they have included + a version of setuptools which supplies the backend, or wheel + (which is needed by the backend) in their requirements. So we + make a note to check that those requirements are present once + we have set up the environment. + TODO: Review this - it's quite a lot of work to check for a very + specific case. The problem is, that case is potentially quite + common - projects that adopted PEP 518 early for the ability to + specify requirements to execute setup.py, but never considered + needing to mention the build tools themselves. The original PEP + 518 code had a similar check (but implemented in a different + way). + backend = "setuptools.build_meta" + check = ["setuptools>=38.2.5", "wheel"] + + return (requires, backend, check) diff --git a/lib/python3.7/site-packages/pip/_internal/req/__init__.py b/lib/python3.7/site-packages/pip/_internal/req/__init__.py new file mode 100644 index 0000000..b270498 --- /dev/null +++ b/lib/python3.7/site-packages/pip/_internal/req/__init__.py @@ -0,0 +1,69 @@ +from __future__ import absolute_import + +import logging + +from .req_install import InstallRequirement +from .req_set import RequirementSet +from .req_file import parse_requirements +from pip._internal.utils.logging import indent_log + + +__all__ = [ + "RequirementSet", "InstallRequirement", + "parse_requirements", "install_given_reqs", +] + +logger = logging.getLogger(__name__) + + +def install_given_reqs(to_install, install_options, global_options=(), + *args, **kwargs): + """ + Install everything in the given list. 
+ + (to be called after having downloaded and unpacked the packages) + """ + + if to_install: + logger.info( + 'Installing collected packages: %s', + ', '.join([req.name for req in to_install]), + ) + + with indent_log(): + for requirement in to_install: + if requirement.conflicts_with: + logger.info( + 'Found existing installation: %s', + requirement.conflicts_with, + ) + with indent_log(): + uninstalled_pathset = requirement.uninstall( + auto_confirm=True + ) + try: + requirement.install( + install_options, + global_options, + *args, + **kwargs + ) + except Exception: + should_rollback = ( + requirement.conflicts_with and + not requirement.install_succeeded + ) + # if install did not succeed, rollback previous uninstall + if should_rollback: + uninstalled_pathset.rollback() + raise + else: + should_commit = ( + requirement.conflicts_with and + requirement.install_succeeded + ) + if should_commit: + uninstalled_pathset.commit() + requirement.remove_temporary_source() + + return to_install diff --git a/lib/python3.7/site-packages/pip/_internal/req/constructors.py b/lib/python3.7/site-packages/pip/_internal/req/constructors.py new file mode 100644 index 0000000..4c4641d --- /dev/null +++ b/lib/python3.7/site-packages/pip/_internal/req/constructors.py @@ -0,0 +1,298 @@ +"""Backing implementation for InstallRequirement's various constructors + +The idea here is that these formed a major chunk of InstallRequirement's size, +so moving them and support code dedicated to them outside of that class +helps create better understandability for the rest of the code. + +These are meant to be used elsewhere within pip to create instances of +InstallRequirement. 
+""" + +import logging +import os +import re +import traceback + +from pip._vendor.packaging.markers import Marker +from pip._vendor.packaging.requirements import InvalidRequirement, Requirement +from pip._vendor.packaging.specifiers import Specifier +from pip._vendor.pkg_resources import RequirementParseError, parse_requirements + +from pip._internal.download import ( + is_archive_file, is_url, path_to_url, url_to_path, +) +from pip._internal.exceptions import InstallationError +from pip._internal.models.index import PyPI, TestPyPI +from pip._internal.models.link import Link +from pip._internal.req.req_install import InstallRequirement +from pip._internal.utils.misc import is_installable_dir +from pip._internal.vcs import vcs +from pip._internal.wheel import Wheel + +__all__ = [ + "install_req_from_editable", "install_req_from_line", + "parse_editable" +] + +logger = logging.getLogger(__name__) +operators = Specifier._operators.keys() + + +def _strip_extras(path): + m = re.match(r'^(.+)(\[[^\]]+\])$', path) + extras = None + if m: + path_no_extras = m.group(1) + extras = m.group(2) + else: + path_no_extras = path + + return path_no_extras, extras + + +def parse_editable(editable_req): + """Parses an editable requirement into: + - a requirement name + - an URL + - extras + - editable options + Accepted requirements: + svn+http://blahblah@rev#egg=Foobar[baz]&subdirectory=version_subdir + .[some_extra] + """ + + url = editable_req + + # If a file path is specified with extras, strip off the extras. + url_no_extras, extras = _strip_extras(url) + + if os.path.isdir(url_no_extras): + if not os.path.exists(os.path.join(url_no_extras, 'setup.py')): + raise InstallationError( + "Directory %r is not installable. File 'setup.py' not found." 
% + url_no_extras + ) + # Treating it as code that has already been checked out + url_no_extras = path_to_url(url_no_extras) + + if url_no_extras.lower().startswith('file:'): + package_name = Link(url_no_extras).egg_fragment + if extras: + return ( + package_name, + url_no_extras, + Requirement("placeholder" + extras.lower()).extras, + ) + else: + return package_name, url_no_extras, None + + for version_control in vcs: + if url.lower().startswith('%s:' % version_control): + url = '%s+%s' % (version_control, url) + break + + if '+' not in url: + raise InstallationError( + '%s should either be a path to a local project or a VCS url ' + 'beginning with svn+, git+, hg+, or bzr+' % + editable_req + ) + + vc_type = url.split('+', 1)[0].lower() + + if not vcs.get_backend(vc_type): + error_message = 'For --editable=%s only ' % editable_req + \ + ', '.join([backend.name + '+URL' for backend in vcs.backends]) + \ + ' is currently supported' + raise InstallationError(error_message) + + package_name = Link(url).egg_fragment + if not package_name: + raise InstallationError( + "Could not detect requirement name for '%s', please specify one " + "with #egg=your_package_name" % editable_req + ) + return package_name, url, None + + +def deduce_helpful_msg(req): + """Returns helpful msg in case requirements file does not exist, + or cannot be parsed. + + :param req: Requirements file path + """ + msg = "" + if os.path.exists(req): + msg = " It does exist." + # Try to parse and check if it is a requirements file. + try: + with open(req, 'r') as fp: + # parse first line only + next(parse_requirements(fp.read())) + msg += " The argument you provided " + \ + "(%s) appears to be a" % (req) + \ + " requirements file. If that is the" + \ + " case, use the '-r' flag to install" + \ + " the packages specified within it." + except RequirementParseError: + logger.debug("Cannot parse '%s' as requirements \ + file" % (req), exc_info=1) + else: + msg += " File '%s' does not exist." 
% (req) + return msg + + +# ---- The actual constructors follow ---- + + +def install_req_from_editable( + editable_req, comes_from=None, isolated=False, options=None, + wheel_cache=None, constraint=False +): + name, url, extras_override = parse_editable(editable_req) + if url.startswith('file:'): + source_dir = url_to_path(url) + else: + source_dir = None + + if name is not None: + try: + req = Requirement(name) + except InvalidRequirement: + raise InstallationError("Invalid requirement: '%s'" % name) + else: + req = None + return InstallRequirement( + req, comes_from, source_dir=source_dir, + editable=True, + link=Link(url), + constraint=constraint, + isolated=isolated, + options=options if options else {}, + wheel_cache=wheel_cache, + extras=extras_override or (), + ) + + +def install_req_from_line( + name, comes_from=None, isolated=False, options=None, wheel_cache=None, + constraint=False +): + """Creates an InstallRequirement from a name, which might be a + requirement, directory containing 'setup.py', filename, or URL. + """ + if is_url(name): + marker_sep = '; ' + else: + marker_sep = ';' + if marker_sep in name: + name, markers = name.split(marker_sep, 1) + markers = markers.strip() + if not markers: + markers = None + else: + markers = Marker(markers) + else: + markers = None + name = name.strip() + req = None + path = os.path.normpath(os.path.abspath(name)) + link = None + extras = None + + if is_url(name): + link = Link(name) + else: + p, extras = _strip_extras(path) + looks_like_dir = os.path.isdir(p) and ( + os.path.sep in name or + (os.path.altsep is not None and os.path.altsep in name) or + name.startswith('.') + ) + if looks_like_dir: + if not is_installable_dir(p): + raise InstallationError( + "Directory %r is not installable. Neither 'setup.py' " + "nor 'pyproject.toml' found." 
% name + ) + link = Link(path_to_url(p)) + elif is_archive_file(p): + if not os.path.isfile(p): + logger.warning( + 'Requirement %r looks like a filename, but the ' + 'file does not exist', + name + ) + link = Link(path_to_url(p)) + + # it's a local file, dir, or url + if link: + # Handle relative file URLs + if link.scheme == 'file' and re.search(r'\.\./', link.url): + link = Link( + path_to_url(os.path.normpath(os.path.abspath(link.path)))) + # wheel file + if link.is_wheel: + wheel = Wheel(link.filename) # can raise InvalidWheelFilename + req = "%s==%s" % (wheel.name, wheel.version) + else: + # set the req to the egg fragment. when it's not there, this + # will become an 'unnamed' requirement + req = link.egg_fragment + + # a requirement specifier + else: + req = name + + if extras: + extras = Requirement("placeholder" + extras.lower()).extras + else: + extras = () + if req is not None: + try: + req = Requirement(req) + except InvalidRequirement: + if os.path.sep in req: + add_msg = "It looks like a path." + add_msg += deduce_helpful_msg(req) + elif '=' in req and not any(op in req for op in operators): + add_msg = "= is not a valid operator. Did you mean == ?" 
+ else: + add_msg = traceback.format_exc() + raise InstallationError( + "Invalid requirement: '%s'\n%s" % (req, add_msg) + ) + + return InstallRequirement( + req, comes_from, link=link, markers=markers, + isolated=isolated, + options=options if options else {}, + wheel_cache=wheel_cache, + constraint=constraint, + extras=extras, + ) + + +def install_req_from_req( + req, comes_from=None, isolated=False, wheel_cache=None +): + try: + req = Requirement(req) + except InvalidRequirement: + raise InstallationError("Invalid requirement: '%s'" % req) + + domains_not_allowed = [ + PyPI.file_storage_domain, + TestPyPI.file_storage_domain, + ] + if req.url and comes_from.link.netloc in domains_not_allowed: + # Explicitly disallow pypi packages that depend on external urls + raise InstallationError( + "Packages installed from PyPI cannot depend on packages " + "which are not also hosted on PyPI.\n" + "%s depends on %s " % (comes_from.name, req) + ) + + return InstallRequirement( + req, comes_from, isolated=isolated, wheel_cache=wheel_cache + ) diff --git a/lib/python3.4/site-packages/pip/req/req_file.py b/lib/python3.7/site-packages/pip/_internal/req/req_file.py similarity index 82% rename from lib/python3.4/site-packages/pip/req/req_file.py rename to lib/python3.7/site-packages/pip/_internal/req/req_file.py index 821df22..e7acf7c 100644 --- a/lib/python3.4/site-packages/pip/req/req_file.py +++ b/lib/python3.7/site-packages/pip/_internal/req/req_file.py @@ -4,28 +4,33 @@ Requirements file parsing from __future__ import absolute_import +import optparse import os import re import shlex import sys -import optparse -import warnings -from pip._vendor.six.moves.urllib import parse as urllib_parse from pip._vendor.six.moves import filterfalse +from pip._vendor.six.moves.urllib import parse as urllib_parse -import pip -from pip.download import get_file_content -from pip.req.req_install import InstallRequirement -from pip.exceptions import (RequirementsFileParseError) -from 
pip.utils.deprecation import RemovedInPip10Warning -from pip import cmdoptions +from pip._internal.cli import cmdoptions +from pip._internal.download import get_file_content +from pip._internal.exceptions import RequirementsFileParseError +from pip._internal.req.constructors import ( + install_req_from_editable, install_req_from_line, +) __all__ = ['parse_requirements'] SCHEME_RE = re.compile(r'^(http|https|file):', re.I) COMMENT_RE = re.compile(r'(^|\s)+#.*$') +# Matches environment variable-style values in '${MY_VARIABLE_1}' with the +# variable name consisting of only uppercase letters, digits or the '_' +# (underscore). This follows the POSIX standard defined in IEEE Std 1003.1, +# 2013 Edition. +ENV_VAR_RE = re.compile(r'(?P<var>\$\{(?P<name>[A-Z0-9_]+)\})') + SUPPORTED_OPTIONS = [ cmdoptions.constraints, cmdoptions.editable, @@ -34,13 +39,6 @@ SUPPORTED_OPTIONS = [ cmdoptions.index_url, cmdoptions.find_links, cmdoptions.extra_index_url, - cmdoptions.allow_external, - cmdoptions.allow_all_external, - cmdoptions.no_allow_external, - cmdoptions.allow_unsafe, - cmdoptions.no_allow_unsafe, - cmdoptions.use_wheel, - cmdoptions.no_use_wheel, cmdoptions.always_unzip, cmdoptions.no_binary, cmdoptions.only_binary, @@ -104,6 +102,7 @@ def preprocess(content, options): lines_enum = join_lines(lines_enum) lines_enum = ignore_comments(lines_enum) lines_enum = skip_regex(lines_enum, options) + lines_enum = expand_env_variables(lines_enum) return lines_enum @@ -127,7 +126,7 @@ def process_line(line, filename, line_number, finder=None, comes_from=None, :param constraint: If True, parsing a constraints file. 
:param options: OptionParser options that we may update """ - parser = build_parser() + parser = build_parser(line) defaults = parser.get_default_values() defaults.index_url = None if finder: @@ -141,7 +140,8 @@ def process_line(line, filename, line_number, finder=None, comes_from=None, # preserve for the nested code path line_comes_from = '%s %s (line %s)' % ( - '-c' if constraint else '-r', filename, line_number) + '-c' if constraint else '-r', filename, line_number, + ) # yield a line requirement if args_str: @@ -153,7 +153,7 @@ def process_line(line, filename, line_number, finder=None, comes_from=None, for dest in SUPPORTED_OPTIONS_REQ_DEST: if dest in opts.__dict__ and opts.__dict__[dest]: req_options[dest] = opts.__dict__[dest] - yield InstallRequirement.from_line( + yield install_req_from_line( args_str, line_comes_from, constraint=constraint, isolated=isolated, options=req_options, wheel_cache=wheel_cache ) @@ -161,11 +161,9 @@ def process_line(line, filename, line_number, finder=None, comes_from=None, # yield an editable requirement elif opts.editables: isolated = options.isolated_mode if options else False - default_vcs = options.default_vcs if options else None - yield InstallRequirement.from_editable( + yield install_req_from_editable( opts.editables[0], comes_from=line_comes_from, - constraint=constraint, default_vcs=default_vcs, isolated=isolated, - wheel_cache=wheel_cache + constraint=constraint, isolated=isolated, wheel_cache=wheel_cache ) # parse a nested requirements file @@ -198,35 +196,8 @@ def process_line(line, filename, line_number, finder=None, comes_from=None, # set finder options elif finder: - if opts.allow_external: - warnings.warn( - "--allow-external has been deprecated and will be removed in " - "the future. 
Due to changes in the repository protocol, it no " - "longer has any effect.", - RemovedInPip10Warning, - ) - - if opts.allow_all_external: - warnings.warn( - "--allow-all-external has been deprecated and will be removed " - "in the future. Due to changes in the repository protocol, it " - "no longer has any effect.", - RemovedInPip10Warning, - ) - - if opts.allow_unverified: - warnings.warn( - "--allow-unverified has been deprecated and will be removed " - "in the future. Due to changes in the repository protocol, it " - "no longer has any effect.", - RemovedInPip10Warning, - ) - if opts.index_url: finder.index_urls = [opts.index_url] - if opts.use_wheel is False: - finder.use_wheel = False - pip.index.fmt_ctl_no_use_wheel(finder.format_control) if opts.no_index is True: finder.index_urls = [] if opts.extra_index_urls: @@ -267,7 +238,7 @@ def break_args_options(line): return ' '.join(args), ' '.join(options) -def build_parser(): +def build_parser(line): """ Return a parser for parsing requirement lines """ @@ -281,6 +252,8 @@ def build_parser(): # By default optparse sys.exits on parsing errors. We want to wrap # that in our own exception. def parser_exit(self, msg): + # add offending line + msg = 'Invalid requirement: %s\n%s' % (line, msg) raise RequirementsFileParseError(msg) parser.exit = parser_exit @@ -336,7 +309,32 @@ def skip_regex(lines_enum, options): skip_regex = options.skip_requirements_regex if options else None if skip_regex: pattern = re.compile(skip_regex) - lines_enum = filterfalse( - lambda e: pattern.search(e[1]), - lines_enum) + lines_enum = filterfalse(lambda e: pattern.search(e[1]), lines_enum) return lines_enum + + +def expand_env_variables(lines_enum): + """Replace all environment variables that can be retrieved via `os.getenv`. + + The only allowed format for environment variables defined in the + requirement file is `${MY_VARIABLE_1}` to ensure two things: + + 1. Strings that contain a `$` aren't accidentally (partially) expanded. + 2. 
Ensure consistency across platforms for requirement files. + + These points are the result of a discussion on the `github pull + request #3514 `_. + + Valid characters in variable names follow the `POSIX standard + `_ and are limited + to uppercase letters, digits and the `_` (underscore). + """ + for line_number, line in lines_enum: + for env_var, var_name in ENV_VAR_RE.findall(line): + value = os.getenv(var_name) + if not value: + continue + + line = line.replace(env_var, value) + + yield line_number, line diff --git a/lib/python3.4/site-packages/pip/req/req_install.py b/lib/python3.7/site-packages/pip/_internal/req/req_install.py similarity index 51% rename from lib/python3.4/site-packages/pip/req/req_install.py rename to lib/python3.7/site-packages/pip/_internal/req/req_install.py index 1a98f37..c2624fe 100644 --- a/lib/python3.4/site-packages/pip/req/req_install.py +++ b/lib/python3.7/site-packages/pip/_internal/req/req_install.py @@ -2,104 +2,81 @@ from __future__ import absolute_import import logging import os -import re import shutil import sys -import tempfile -import traceback -import warnings +import sysconfig import zipfile - -from distutils import sysconfig from distutils.util import change_root -from email.parser import FeedParser from pip._vendor import pkg_resources, six -from pip._vendor.packaging import specifiers -from pip._vendor.packaging.markers import Marker -from pip._vendor.packaging.requirements import InvalidRequirement, Requirement +from pip._vendor.packaging.requirements import Requirement from pip._vendor.packaging.utils import canonicalize_name -from pip._vendor.packaging.version import Version, parse as parse_version +from pip._vendor.packaging.version import Version +from pip._vendor.packaging.version import parse as parse_version -from pip._vendor.six.moves import configparser +from pip._vendor.pep517.wrappers import Pep517HookCaller -import pip.wheel - -from pip.compat import native_str, get_stdlib, WINDOWS -from pip.download 
import is_url, url_to_path, path_to_url, is_archive_file -from pip.exceptions import ( - InstallationError, UninstallationError, +from pip._internal import wheel +from pip._internal.build_env import NoOpBuildEnvironment +from pip._internal.exceptions import InstallationError +from pip._internal.locations import ( + PIP_DELETE_MARKER_FILENAME, running_under_virtualenv, ) -from pip.locations import ( - bin_py, running_under_virtualenv, PIP_DELETE_MARKER_FILENAME, bin_user, +from pip._internal.models.link import Link +from pip._internal.pyproject import load_pyproject_toml +from pip._internal.req.req_uninstall import UninstallPathSet +from pip._internal.utils.compat import native_str +from pip._internal.utils.hashes import Hashes +from pip._internal.utils.logging import indent_log +from pip._internal.utils.misc import ( + _make_build_dir, ask_path_exists, backup_dir, call_subprocess, + display_path, dist_in_site_packages, dist_in_usersite, ensure_dir, + get_installed_version, rmtree, ) -from pip.utils import ( - display_path, rmtree, ask_path_exists, backup_dir, is_installable_dir, - dist_in_usersite, dist_in_site_packages, egg_link_path, - call_subprocess, read_text_file, FakeFile, _make_build_dir, ensure_dir, - get_installed_version, normalize_path, dist_is_local, -) - -from pip.utils.hashes import Hashes -from pip.utils.deprecation import RemovedInPip10Warning -from pip.utils.logging import indent_log -from pip.utils.setuptools_build import SETUPTOOLS_SHIM -from pip.utils.ui import open_spinner -from pip.req.req_uninstall import UninstallPathSet -from pip.vcs import vcs -from pip.wheel import move_wheel_files, Wheel - +from pip._internal.utils.packaging import get_metadata +from pip._internal.utils.setuptools_build import SETUPTOOLS_SHIM +from pip._internal.utils.temp_dir import TempDirectory +from pip._internal.utils.ui import open_spinner +from pip._internal.vcs import vcs +from pip._internal.wheel import move_wheel_files logger = logging.getLogger(__name__) 
-operators = specifiers.Specifier._operators.keys() - - -def _strip_extras(path): - m = re.match(r'^(.+)(\[[^\]]+\])$', path) - extras = None - if m: - path_no_extras = m.group(1) - extras = m.group(2) - else: - path_no_extras = path - - return path_no_extras, extras - - -def _safe_extras(extras): - return set(pkg_resources.safe_extra(extra) for extra in extras) - class InstallRequirement(object): + """ + Represents something that may be installed later on, may have information + about where to fetch the relevant requirement and also contains logic for + installing said requirement. + """ def __init__(self, req, comes_from, source_dir=None, editable=False, - link=None, as_egg=False, update=True, - pycompile=True, markers=None, isolated=False, options=None, - wheel_cache=None, constraint=False): - self.extras = () - if isinstance(req, six.string_types): - try: - req = Requirement(req) - except InvalidRequirement: - if os.path.sep in req: - add_msg = "It looks like a path. Does it exist ?" - elif '=' in req and not any(op in req for op in operators): - add_msg = "= is not a valid operator. Did you mean == ?" 
- else: - add_msg = traceback.format_exc() - raise InstallationError( - "Invalid requirement: '%s'\n%s" % (req, add_msg)) - self.extras = _safe_extras(req.extras) - + link=None, update=True, markers=None, + isolated=False, options=None, wheel_cache=None, + constraint=False, extras=()): + assert req is None or isinstance(req, Requirement), req self.req = req self.comes_from = comes_from self.constraint = constraint - self.source_dir = source_dir + if source_dir is not None: + self.source_dir = os.path.normpath(os.path.abspath(source_dir)) + else: + self.source_dir = None self.editable = editable self._wheel_cache = wheel_cache - self.link = self.original_link = link - self.as_egg = as_egg + if link is not None: + self.link = self.original_link = link + else: + self.link = self.original_link = req and req.url and Link(req.url) + + if extras: + self.extras = extras + elif req: + self.extras = { + pkg_resources.safe_extra(extra) for extra in req.extras + } + else: + self.extras = set() if markers is not None: self.markers = markers else: @@ -112,7 +89,7 @@ class InstallRequirement(object): # conflicts with another installed distribution: self.conflicts_with = None # Temporary build location - self._temp_build_dir = None + self._temp_build_dir = TempDirectory(kind="req-build") # Used to store the global directory where the _temp_build_dir should # have been created. Cf _correct_build_location method. self._ideal_build_dir = None @@ -121,132 +98,41 @@ class InstallRequirement(object): # Set to True after successful installation self.install_succeeded = None # UninstallPathSet of uninstalled distribution (for possible rollback) - self.uninstalled = None - # Set True if a legitimate do-nothing-on-uninstall has happened - e.g. - # system site packages, stdlib packages. 
- self.nothing_to_uninstall = False - self.use_user_site = False - self.target_dir = None + self.uninstalled_pathset = None self.options = options if options else {} - self.pycompile = pycompile # Set to True after successful preparation of this requirement self.prepared = False + self.is_direct = False self.isolated = isolated + self.build_env = NoOpBuildEnvironment() - @classmethod - def from_editable(cls, editable_req, comes_from=None, default_vcs=None, - isolated=False, options=None, wheel_cache=None, - constraint=False): - from pip.index import Link + # The static build requirements (from pyproject.toml) + self.pyproject_requires = None - name, url, extras_override = parse_editable( - editable_req, default_vcs) - if url.startswith('file:'): - source_dir = url_to_path(url) - else: - source_dir = None + # Build requirements that we will check are available + # TODO: We don't do this for --no-build-isolation. Should we? + self.requirements_to_check = [] - res = cls(name, comes_from, source_dir=source_dir, - editable=True, - link=Link(url), - constraint=constraint, - isolated=isolated, - options=options if options else {}, - wheel_cache=wheel_cache) + # The PEP 517 backend we should use to build the project + self.pep517_backend = None - if extras_override is not None: - res.extras = _safe_extras(extras_override) - - return res - - @classmethod - def from_line( - cls, name, comes_from=None, isolated=False, options=None, - wheel_cache=None, constraint=False): - """Creates an InstallRequirement from a name, which might be a - requirement, directory containing 'setup.py', filename, or URL. 
- """ - from pip.index import Link - - if is_url(name): - marker_sep = '; ' - else: - marker_sep = ';' - if marker_sep in name: - name, markers = name.split(marker_sep, 1) - markers = markers.strip() - if not markers: - markers = None - else: - markers = Marker(markers) - else: - markers = None - name = name.strip() - req = None - path = os.path.normpath(os.path.abspath(name)) - link = None - extras = None - - if is_url(name): - link = Link(name) - else: - p, extras = _strip_extras(path) - if (os.path.isdir(p) and - (os.path.sep in name or name.startswith('.'))): - - if not is_installable_dir(p): - raise InstallationError( - "Directory %r is not installable. File 'setup.py' " - "not found." % name - ) - link = Link(path_to_url(p)) - elif is_archive_file(p): - if not os.path.isfile(p): - logger.warning( - 'Requirement %r looks like a filename, but the ' - 'file does not exist', - name - ) - link = Link(path_to_url(p)) - - # it's a local file, dir, or url - if link: - # Handle relative file URLs - if link.scheme == 'file' and re.search(r'\.\./', link.url): - link = Link( - path_to_url(os.path.normpath(os.path.abspath(link.path)))) - # wheel file - if link.is_wheel: - wheel = Wheel(link.filename) # can raise InvalidWheelFilename - req = "%s==%s" % (wheel.name, wheel.version) - else: - # set the req to the egg fragment. when it's not there, this - # will become an 'unnamed' requirement - req = link.egg_fragment - - # a requirement specifier - else: - req = name - - options = options if options else {} - res = cls(req, comes_from, link=link, markers=markers, - isolated=isolated, options=options, - wheel_cache=wheel_cache, constraint=constraint) - - if extras: - res.extras = _safe_extras( - Requirement('placeholder' + extras).extras) - - return res + # Are we using PEP 517 for this requirement? + # After pyproject.toml has been loaded, the only valid values are True + # and False. Before loading, None is valid (meaning "use the default"). 
+ # Setting an explicit value before loading pyproject.toml is supported, + # but after loading this flag should be treated as read only. + self.use_pep517 = None def __str__(self): if self.req: s = str(self.req) if self.link: s += ' from %s' % self.link.url + elif self.link: + s = self.link.url else: - s = self.link.url if self.link else None + s = '' if self.satisfied_by is not None: s += ' in %s' % display_path(self.satisfied_by.location) if self.comes_from: @@ -278,10 +164,17 @@ class InstallRequirement(object): self.link = finder.find_requirement(self, upgrade) if self._wheel_cache is not None and not require_hashes: old_link = self.link - self.link = self._wheel_cache.cached_wheel(self.link, self.name) + self.link = self._wheel_cache.get(self.link, self.name) if old_link != self.link: logger.debug('Using cached wheel link: %s', self.link) + # Things that are valid for all kinds of requirements? + @property + def name(self): + if self.req is None: + return None + return native_str(pkg_resources.safe_name(self.req.name)) + @property def specifier(self): return self.req.specifier @@ -294,9 +187,58 @@ class InstallRequirement(object): """ specifiers = self.specifier return (len(specifiers) == 1 and - next(iter(specifiers)).operator in ('==', '===')) + next(iter(specifiers)).operator in {'==', '==='}) + + @property + def installed_version(self): + return get_installed_version(self.name) + + def match_markers(self, extras_requested=None): + if not extras_requested: + # Provide an extra to safely evaluate the markers + # without matching any extra + extras_requested = ('',) + if self.markers is not None: + return any( + self.markers.evaluate({'extra': extra}) + for extra in extras_requested) + else: + return True + + @property + def has_hash_options(self): + """Return whether any known-good hashes are specified as options. + + These activate --require-hashes mode; hashes specified as part of a + URL do not. 
+ + """ + return bool(self.options.get('hashes', {})) + + def hashes(self, trust_internet=True): + """Return a hash-comparer that considers my option- and URL-based + hashes to be known-good. + + Hashes in URLs--ones embedded in the requirements file, not ones + downloaded from an index server--are almost peers with ones from + flags. They satisfy --require-hashes (whether it was implicitly or + explicitly activated) but do not activate it. md5 and sha224 are not + allowed in flags, which should nudge people toward good algos. We + always OR all hashes together, even ones from URLs. + + :param trust_internet: Whether to trust URL-based (#md5=...) hashes + downloaded from the internet, as by populate_link() + + """ + good_hashes = self.options.get('hashes', {}).copy() + link = self.link if trust_internet else self.original_link + if link and link.hash: + good_hashes.setdefault(link.hash_name, []).append(link.hash) + return Hashes(good_hashes) def from_path(self): + """Format a nice indicator to show where this "comes from" + """ if self.req is None: return None s = str(self.req) @@ -310,8 +252,9 @@ class InstallRequirement(object): return s def build_location(self, build_dir): - if self._temp_build_dir is not None: - return self._temp_build_dir + assert build_dir is not None + if self._temp_build_dir.path is not None: + return self._temp_build_dir.path if self.req is None: # for requirement via a path to a directory: the name of the # package is not available yet so we create a temp directory @@ -320,11 +263,10 @@ class InstallRequirement(object): # Some systems have /tmp as a symlink which confuses custom # builds (such as numpy). Thus, we ensure that the real path # is returned. 
- self._temp_build_dir = os.path.realpath( - tempfile.mkdtemp('-build', 'pip-') - ) + self._temp_build_dir.create() self._ideal_build_dir = build_dir - return self._temp_build_dir + + return self._temp_build_dir.path if self.editable: name = self.name.lower() else: @@ -343,16 +285,17 @@ class InstallRequirement(object): package is not available until we run egg_info, so the build_location will return a temporary directory and store the _ideal_build_dir. - This is only called by self.egg_info_path to fix the temporary build + This is only called by self.run_egg_info to fix the temporary build directory. """ if self.source_dir is not None: return assert self.req is not None - assert self._temp_build_dir - assert self._ideal_build_dir - old_location = self._temp_build_dir - self._temp_build_dir = None + assert self._temp_build_dir.path + assert self._ideal_build_dir.path + old_location = self._temp_build_dir.path + self._temp_build_dir.path = None + new_location = self.build_location(self._ideal_build_dir) if os.path.exists(new_location): raise InstallationError( @@ -363,17 +306,83 @@ class InstallRequirement(object): self, display_path(old_location), display_path(new_location), ) shutil.move(old_location, new_location) - self._temp_build_dir = new_location + self._temp_build_dir.path = new_location self._ideal_build_dir = None - self.source_dir = new_location + self.source_dir = os.path.normpath(os.path.abspath(new_location)) self._egg_info_path = None - @property - def name(self): - if self.req is None: - return None - return native_str(pkg_resources.safe_name(self.req.name)) + def remove_temporary_source(self): + """Remove the source files from this requirement, if they are marked + for deletion""" + if self.source_dir and os.path.exists( + os.path.join(self.source_dir, PIP_DELETE_MARKER_FILENAME)): + logger.debug('Removing source in %s', self.source_dir) + rmtree(self.source_dir) + self.source_dir = None + self._temp_build_dir.cleanup() + self.build_env.cleanup() 
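The `hashes()` docstring above notes that pip always ORs all known-good hashes together: data is acceptable if it matches any allowed digest under any algorithm. A minimal stand-alone sketch of that comparer semantics (a hypothetical simplified `Hashes` class, not pip's actual `pip._internal.utils.hashes.Hashes`):

```python
import hashlib


class Hashes:
    """Sketch of pip's hash-comparer OR semantics: a mapping of algorithm
    name -> list of acceptable hex digests. Data is allowed if its digest
    under ANY algorithm matches ANY listed digest for that algorithm."""

    def __init__(self, allowed):
        self._allowed = allowed

    def is_allowed(self, data):
        for algorithm, digests in self._allowed.items():
            if hashlib.new(algorithm, data).hexdigest() in digests:
                return True
        return False


good = hashlib.sha256(b"payload").hexdigest()
# A second, non-matching entry shows that one match is enough.
hashes = Hashes({'sha256': [good], 'sha1': ['0' * 40]})
print(hashes.is_allowed(b"payload"))   # True
print(hashes.is_allowed(b"tampered"))  # False
```

This mirrors how `good_hashes.setdefault(link.hash_name, []).append(link.hash)` in the real method folds a URL-embedded `#sha256=...` fragment into the same pool as `--hash` options.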
+ def check_if_exists(self, use_user_site): + """Find an installed distribution that satisfies or conflicts + with this requirement, and set self.satisfied_by or + self.conflicts_with appropriately. + """ + if self.req is None: + return False + try: + # get_distribution() will resolve the entire list of requirements + # anyway, and we've already determined that we need the requirement + # in question, so strip the marker so that we don't try to + # evaluate it. + no_marker = Requirement(str(self.req)) + no_marker.marker = None + self.satisfied_by = pkg_resources.get_distribution(str(no_marker)) + if self.editable and self.satisfied_by: + self.conflicts_with = self.satisfied_by + # when installing editables, nothing pre-existing should ever + # satisfy + self.satisfied_by = None + return True + except pkg_resources.DistributionNotFound: + return False + except pkg_resources.VersionConflict: + existing_dist = pkg_resources.get_distribution( + self.req.name + ) + if use_user_site: + if dist_in_usersite(existing_dist): + self.conflicts_with = existing_dist + elif (running_under_virtualenv() and + dist_in_site_packages(existing_dist)): + raise InstallationError( + "Will not install to the user site because it will " + "lack sys.path precedence to %s in %s" % + (existing_dist.project_name, existing_dist.location) + ) + else: + self.conflicts_with = existing_dist + return True + + # Things valid for wheels + @property + def is_wheel(self): + return self.link and self.link.is_wheel + + def move_wheel_files(self, wheeldir, root=None, home=None, prefix=None, + warn_script_location=True, use_user_site=False, + pycompile=True): + move_wheel_files( + self.name, self.req, wheeldir, + user=use_user_site, + home=home, + root=root, + prefix=prefix, + pycompile=pycompile, + isolated=self.isolated, + warn_script_location=warn_script_location, + ) + + # Things valid for sdists @property def setup_py_dir(self): return os.path.join( @@ -383,18 +392,6 @@ class InstallRequirement(object): 
@property def setup_py(self): assert self.source_dir, "No source dir for %s" % self - try: - import setuptools # noqa - except ImportError: - if get_installed_version('setuptools') is None: - add_msg = "Please install setuptools." - else: - add_msg = traceback.format_exc() - # Setuptools is not available - raise InstallationError( - "Could not import setuptools which is required to " - "install from a source distribution.\n%s" % add_msg - ) setup_py = os.path.join(self.setup_py_dir, 'setup.py') @@ -404,6 +401,42 @@ class InstallRequirement(object): return setup_py + @property + def pyproject_toml(self): + assert self.source_dir, "No source dir for %s" % self + + pp_toml = os.path.join(self.setup_py_dir, 'pyproject.toml') + + # Python2 __file__ should not be unicode + if six.PY2 and isinstance(pp_toml, six.text_type): + pp_toml = pp_toml.encode(sys.getfilesystemencoding()) + + return pp_toml + + def load_pyproject_toml(self): + """Load the pyproject.toml file. + + After calling this routine, all of the attributes related to PEP 517 + processing for this requirement have been set. In particular, the + use_pep517 attribute can be used to determine whether we should + follow the PEP 517 or legacy (setup.py) code path. 
+ """ + pep517_data = load_pyproject_toml( + self.use_pep517, + self.pyproject_toml, + self.setup_py, + str(self) + ) + + if pep517_data is None: + self.use_pep517 = False + else: + self.use_pep517 = True + requires, backend, check = pep517_data + self.requirements_to_check = check + self.pyproject_requires = requires + self.pep517_backend = Pep517HookCaller(self.setup_py_dir, backend) + def run_egg_info(self): assert self.source_dir if self.name: @@ -432,27 +465,28 @@ class InstallRequirement(object): egg_info_dir = os.path.join(self.setup_py_dir, 'pip-egg-info') ensure_dir(egg_info_dir) egg_base_option = ['--egg-base', 'pip-egg-info'] - call_subprocess( - egg_info_cmd + egg_base_option, - cwd=self.setup_py_dir, - show_stdout=False, - command_desc='python setup.py egg_info') + with self.build_env: + call_subprocess( + egg_info_cmd + egg_base_option, + cwd=self.setup_py_dir, + show_stdout=False, + command_desc='python setup.py egg_info') if not self.req: - if isinstance(parse_version(self.pkg_info()["Version"]), Version): + if isinstance(parse_version(self.metadata["Version"]), Version): op = "==" else: op = "===" self.req = Requirement( "".join([ - self.pkg_info()["Name"], + self.metadata["Name"], op, - self.pkg_info()["Version"], + self.metadata["Version"], ]) ) self._correct_build_location() else: - metadata_name = canonicalize_name(self.pkg_info()["Name"]) + metadata_name = canonicalize_name(self.metadata["Name"]) if canonicalize_name(self.req.name) != metadata_name: logger.warning( 'Running setup.py (path:%s) egg_info for package %s ' @@ -462,19 +496,8 @@ class InstallRequirement(object): ) self.req = Requirement(metadata_name) - def egg_info_data(self, filename): - if self.satisfied_by is not None: - if not self.satisfied_by.has_metadata(filename): - return None - return self.satisfied_by.get_metadata(filename) - assert self.source_dir - filename = self.egg_info_path(filename) - if not os.path.exists(filename): - return None - data = read_text_file(filename) 
- return data - - def egg_info_path(self, filename): + @property + def egg_info_path(self): if self._egg_info_path is None: if self.editable: base = self.source_dir @@ -507,16 +530,13 @@ class InstallRequirement(object): elif dir == 'test' or dir == 'tests': dirs.remove(dir) filenames.extend([os.path.join(root, dir) - for dir in dirs]) + for dir in dirs]) filenames = [f for f in filenames if f.endswith('.egg-info')] if not filenames: raise InstallationError( - 'No files/directories in %s (from %s)' % (base, filename) + "Files/directories not found in %s" % base ) - assert filenames, \ - "No files/directories in %s (from %s)" % (base, filename) - # if we have more than one match, we pick the toplevel one. This # can easily be the case if there is a dist folder which contains # an extracted tarball for testing purposes. @@ -526,33 +546,35 @@ class InstallRequirement(object): (os.path.altsep and x.count(os.path.altsep) or 0) ) self._egg_info_path = os.path.join(base, filenames[0]) - return os.path.join(self._egg_info_path, filename) - - def pkg_info(self): - p = FeedParser() - data = self.egg_info_data('PKG-INFO') - if not data: - logger.warning( - 'No PKG-INFO file found in %s', - display_path(self.egg_info_path('PKG-INFO')), - ) - p.feed(data or '') - return p.close() - - _requirements_section_re = re.compile(r'\[(.*?)\]') + return self._egg_info_path @property - def installed_version(self): - return get_installed_version(self.name) + def metadata(self): + if not hasattr(self, '_metadata'): + self._metadata = get_metadata(self.get_dist()) + + return self._metadata + + def get_dist(self): + """Return a pkg_resources.Distribution built from self.egg_info_path""" + egg_info = self.egg_info_path.rstrip(os.path.sep) + base_dir = os.path.dirname(egg_info) + metadata = pkg_resources.PathMetadata(base_dir, egg_info) + dist_name = os.path.splitext(os.path.basename(egg_info))[0] + return pkg_resources.Distribution( + os.path.dirname(egg_info), + project_name=dist_name, + 
metadata=metadata, + ) def assert_source_matches_version(self): assert self.source_dir - version = self.pkg_info()['version'] + version = self.metadata['version'] if self.req.specifier and version not in self.req.specifier: logger.warning( 'Requested %s, but installing version %s', self, - self.installed_version, + version, ) else: logger.debug( @@ -562,6 +584,52 @@ class InstallRequirement(object): self, ) + # For both source distributions and editables + def ensure_has_source_dir(self, parent_dir): + """Ensure that a source_dir is set. + + This will create a temporary build dir if the name of the requirement + isn't known yet. + + :param parent_dir: The ideal pip parent_dir for the source_dir. + Generally src_dir for editables and build_dir for sdists. + :return: self.source_dir + """ + if self.source_dir is None: + self.source_dir = self.build_location(parent_dir) + return self.source_dir + + # For editable installations + def install_editable(self, install_options, + global_options=(), prefix=None): + logger.info('Running setup.py develop for %s', self.name) + + if self.isolated: + global_options = list(global_options) + ["--no-user-cfg"] + + if prefix: + prefix_param = ['--prefix={}'.format(prefix)] + install_options = list(install_options) + prefix_param + + with indent_log(): + # FIXME: should we do --install-headers here too? 
+ with self.build_env: + call_subprocess( + [ + sys.executable, + '-c', + SETUPTOOLS_SHIM % self.setup_py + ] + + list(global_options) + + ['develop', '--no-deps'] + + list(install_options), + + cwd=self.setup_py_dir, + show_stdout=False, + ) + + self.install_succeeded = True + def update_editable(self, obtain=True): if not self.link: logger.debug( @@ -591,7 +659,9 @@ class InstallRequirement(object): 'Unexpected version control type (in %s): %s' % (self.link, vc_type)) - def uninstall(self, auto_confirm=False): + # Top-level Actions + def uninstall(self, auto_confirm=False, verbose=False, + use_user_site=False): """ Uninstall the distribution currently satisfying this requirement. @@ -604,176 +674,29 @@ class InstallRequirement(object): linked to global site-packages. """ - if not self.check_if_exists(): - raise UninstallationError( - "Cannot uninstall requirement %s, not installed" % (self.name,) - ) + if not self.check_if_exists(use_user_site): + logger.warning("Skipping %s as it is not installed.", self.name) + return dist = self.satisfied_by or self.conflicts_with - dist_path = normalize_path(dist.location) - if not dist_is_local(dist): - logger.info( - "Not uninstalling %s at %s, outside environment %s", - dist.key, - dist_path, - sys.prefix, - ) - self.nothing_to_uninstall = True - return + uninstalled_pathset = UninstallPathSet.from_dist(dist) + uninstalled_pathset.remove(auto_confirm, verbose) + return uninstalled_pathset - if dist_path in get_stdlib(): - logger.info( - "Not uninstalling %s at %s, as it is in the standard library.", - dist.key, - dist_path, - ) - self.nothing_to_uninstall = True - return - - paths_to_remove = UninstallPathSet(dist) - develop_egg_link = egg_link_path(dist) - develop_egg_link_egg_info = '{0}.egg-info'.format( - pkg_resources.to_filename(dist.project_name)) - egg_info_exists = dist.egg_info and os.path.exists(dist.egg_info) - # Special case for distutils installed package - distutils_egg_info = getattr(dist._provider, 'path', 
None) - - # Uninstall cases order do matter as in the case of 2 installs of the - # same package, pip needs to uninstall the currently detected version - if (egg_info_exists and dist.egg_info.endswith('.egg-info') and - not dist.egg_info.endswith(develop_egg_link_egg_info)): - # if dist.egg_info.endswith(develop_egg_link_egg_info), we - # are in fact in the develop_egg_link case - paths_to_remove.add(dist.egg_info) - if dist.has_metadata('installed-files.txt'): - for installed_file in dist.get_metadata( - 'installed-files.txt').splitlines(): - path = os.path.normpath( - os.path.join(dist.egg_info, installed_file) - ) - paths_to_remove.add(path) - # FIXME: need a test for this elif block - # occurs with --single-version-externally-managed/--record outside - # of pip - elif dist.has_metadata('top_level.txt'): - if dist.has_metadata('namespace_packages.txt'): - namespaces = dist.get_metadata('namespace_packages.txt') - else: - namespaces = [] - for top_level_pkg in [ - p for p - in dist.get_metadata('top_level.txt').splitlines() - if p and p not in namespaces]: - path = os.path.join(dist.location, top_level_pkg) - paths_to_remove.add(path) - paths_to_remove.add(path + '.py') - paths_to_remove.add(path + '.pyc') - paths_to_remove.add(path + '.pyo') - - elif distutils_egg_info: - warnings.warn( - "Uninstalling a distutils installed project ({0}) has been " - "deprecated and will be removed in a future version. This is " - "due to the fact that uninstalling a distutils project will " - "only partially uninstall the project.".format(self.name), - RemovedInPip10Warning, - ) - paths_to_remove.add(distutils_egg_info) - - elif dist.location.endswith('.egg'): - # package installed by easy_install - # We cannot match on dist.egg_name because it can slightly vary - # i.e. 
setuptools-0.6c11-py2.6.egg vs setuptools-0.6rc11-py2.6.egg - paths_to_remove.add(dist.location) - easy_install_egg = os.path.split(dist.location)[1] - easy_install_pth = os.path.join(os.path.dirname(dist.location), - 'easy-install.pth') - paths_to_remove.add_pth(easy_install_pth, './' + easy_install_egg) - - elif egg_info_exists and dist.egg_info.endswith('.dist-info'): - for path in pip.wheel.uninstallation_paths(dist): - paths_to_remove.add(path) - - elif develop_egg_link: - # develop egg - with open(develop_egg_link, 'r') as fh: - link_pointer = os.path.normcase(fh.readline().strip()) - assert (link_pointer == dist.location), ( - 'Egg-link %s does not match installed location of %s ' - '(at %s)' % (link_pointer, self.name, dist.location) - ) - paths_to_remove.add(develop_egg_link) - easy_install_pth = os.path.join(os.path.dirname(develop_egg_link), - 'easy-install.pth') - paths_to_remove.add_pth(easy_install_pth, dist.location) - - else: - logger.debug( - 'Not sure how to uninstall: %s - Check: %s', - dist, dist.location) - - # find distutils scripts= scripts - if dist.has_metadata('scripts') and dist.metadata_isdir('scripts'): - for script in dist.metadata_listdir('scripts'): - if dist_in_usersite(dist): - bin_dir = bin_user - else: - bin_dir = bin_py - paths_to_remove.add(os.path.join(bin_dir, script)) - if WINDOWS: - paths_to_remove.add(os.path.join(bin_dir, script) + '.bat') - - # find console_scripts - if dist.has_metadata('entry_points.txt'): - if six.PY2: - options = {} - else: - options = {"delimiters": ('=', )} - config = configparser.SafeConfigParser(**options) - config.readfp( - FakeFile(dist.get_metadata_lines('entry_points.txt')) - ) - if config.has_section('console_scripts'): - for name, value in config.items('console_scripts'): - if dist_in_usersite(dist): - bin_dir = bin_user - else: - bin_dir = bin_py - paths_to_remove.add(os.path.join(bin_dir, name)) - if WINDOWS: - paths_to_remove.add( - os.path.join(bin_dir, name) + '.exe' - ) - 
paths_to_remove.add( - os.path.join(bin_dir, name) + '.exe.manifest' - ) - paths_to_remove.add( - os.path.join(bin_dir, name) + '-script.py' - ) - - paths_to_remove.remove(auto_confirm) - self.uninstalled = paths_to_remove - - def rollback_uninstall(self): - if self.uninstalled: - self.uninstalled.rollback() - else: - logger.error( - "Can't rollback %s, nothing uninstalled.", self.name, - ) - - def commit_uninstall(self): - if self.uninstalled: - self.uninstalled.commit() - elif not self.nothing_to_uninstall: - logger.error( - "Can't commit %s, nothing uninstalled.", self.name, - ) + def _clean_zip_name(self, name, prefix): # only used by archive. + assert name.startswith(prefix + os.path.sep), ( + "name %r doesn't start with prefix %r" % (name, prefix) + ) + name = name[len(prefix) + 1:] + name = name.replace(os.path.sep, '/') + return name + # TODO: Investigate if this should be kept in InstallRequirement + # Seems to be used only when VCS + downloads def archive(self, build_dir): assert self.source_dir create_archive = True - archive_name = '%s-%s.zip' % (self.name, self.pkg_info()["version"]) + archive_name = '%s-%s.zip' % (self.name, self.metadata["version"]) archive_path = os.path.join(build_dir, archive_name) if os.path.exists(archive_path): response = ask_path_exists( @@ -818,37 +741,24 @@ class InstallRequirement(object): zip.close() logger.info('Saved %s', display_path(archive_path)) - def _clean_zip_name(self, name, prefix): - assert name.startswith(prefix + os.path.sep), ( - "name %r doesn't start with prefix %r" % (name, prefix) - ) - name = name[len(prefix) + 1:] - name = name.replace(os.path.sep, '/') - return name - - def match_markers(self, extras_requested=None): - if not extras_requested: - # Provide an extra to safely evaluate the markers - # without matching any extra - extras_requested = ('',) - if self.markers is not None: - return any( - self.markers.evaluate({'extra': extra}) - for extra in extras_requested) - else: - return True - - def 
install(self, install_options, global_options=[], root=None, - prefix=None): + def install(self, install_options, global_options=None, root=None, + home=None, prefix=None, warn_script_location=True, + use_user_site=False, pycompile=True): + global_options = global_options if global_options is not None else [] if self.editable: self.install_editable( - install_options, global_options, prefix=prefix) + install_options, global_options, prefix=prefix, + ) return if self.is_wheel: - version = pip.wheel.wheel_version(self.source_dir) - pip.wheel.check_compatibility(version, self.name) + version = wheel.wheel_version(self.source_dir) + wheel.check_compatibility(version, self.name) - self.move_wheel_files(self.source_dir, root=root, prefix=prefix) + self.move_wheel_files( + self.source_dir, root=root, prefix=prefix, home=home, + warn_script_location=warn_script_location, + use_user_site=use_user_site, pycompile=pycompile, + ) self.install_succeeded = True return @@ -857,35 +767,34 @@ class InstallRequirement(object): # Options specified in requirements file override those # specified on the command line, since the last option given # to setup.py is the one that is used. 
- global_options += self.options.get('global_options', []) - install_options += self.options.get('install_options', []) + global_options = list(global_options) + \ + self.options.get('global_options', []) + install_options = list(install_options) + \ + self.options.get('install_options', []) if self.isolated: - global_options = list(global_options) + ["--no-user-cfg"] + global_options = global_options + ["--no-user-cfg"] - temp_location = tempfile.mkdtemp('-record', 'pip-') - record_filename = os.path.join(temp_location, 'install-record.txt') - try: + with TempDirectory(kind="record") as temp_dir: + record_filename = os.path.join(temp_dir.path, 'install-record.txt') install_args = self.get_install_args( - global_options, record_filename, root, prefix) + global_options, record_filename, root, prefix, pycompile, + ) msg = 'Running setup.py install for %s' % (self.name,) with open_spinner(msg) as spinner: with indent_log(): - call_subprocess( - install_args + install_options, - cwd=self.setup_py_dir, - show_stdout=False, - spinner=spinner, - ) + with self.build_env: + call_subprocess( + install_args + install_options, + cwd=self.setup_py_dir, + show_stdout=False, + spinner=spinner, + ) if not os.path.exists(record_filename): logger.debug('Record file %s not found', record_filename) return self.install_succeeded = True - if self.as_egg: - # there's no --always-unzip option we can pass to install - # command so we unable to save the installed-files.txt - return def prepend_root(path): if root is None or not os.path.isabs(path): @@ -915,47 +824,29 @@ class InstallRequirement(object): if os.path.isdir(filename): filename += os.path.sep new_lines.append( - os.path.relpath( - prepend_root(filename), egg_info_dir) + os.path.relpath(prepend_root(filename), egg_info_dir) ) + new_lines.sort() + ensure_dir(egg_info_dir) inst_files_path = os.path.join(egg_info_dir, 'installed-files.txt') with open(inst_files_path, 'w') as f: f.write('\n'.join(new_lines) + '\n') - finally: - if 
os.path.exists(record_filename): - os.remove(record_filename) - rmtree(temp_location) - def ensure_has_source_dir(self, parent_dir): - """Ensure that a source_dir is set. - - This will create a temporary build dir if the name of the requirement - isn't known yet. - - :param parent_dir: The ideal pip parent_dir for the source_dir. - Generally src_dir for editables and build_dir for sdists. - :return: self.source_dir - """ - if self.source_dir is None: - self.source_dir = self.build_location(parent_dir) - return self.source_dir - - def get_install_args(self, global_options, record_filename, root, prefix): + def get_install_args(self, global_options, record_filename, root, prefix, + pycompile): install_args = [sys.executable, "-u"] install_args.append('-c') install_args.append(SETUPTOOLS_SHIM % self.setup_py) install_args += list(global_options) + \ ['install', '--record', record_filename] - - if not self.as_egg: - install_args += ['--single-version-externally-managed'] + install_args += ['--single-version-externally-managed'] if root is not None: install_args += ['--root', root] if prefix is not None: install_args += ['--prefix', prefix] - if self.pycompile: + if pycompile: install_args += ["--compile"] else: install_args += ["--no-compile"] @@ -967,238 +858,3 @@ class InstallRequirement(object): py_ver_str, self.name)] return install_args - - def remove_temporary_source(self): - """Remove the source files from this requirement, if they are marked - for deletion""" - if self.source_dir and os.path.exists( - os.path.join(self.source_dir, PIP_DELETE_MARKER_FILENAME)): - logger.debug('Removing source in %s', self.source_dir) - rmtree(self.source_dir) - self.source_dir = None - if self._temp_build_dir and os.path.exists(self._temp_build_dir): - rmtree(self._temp_build_dir) - self._temp_build_dir = None - - def install_editable(self, install_options, - global_options=(), prefix=None): - logger.info('Running setup.py develop for %s', self.name) - - if self.isolated: - 
global_options = list(global_options) + ["--no-user-cfg"] - - if prefix: - prefix_param = ['--prefix={0}'.format(prefix)] - install_options = list(install_options) + prefix_param - - with indent_log(): - # FIXME: should we do --install-headers here too? - call_subprocess( - [ - sys.executable, - '-c', - SETUPTOOLS_SHIM % self.setup_py - ] + - list(global_options) + - ['develop', '--no-deps'] + - list(install_options), - - cwd=self.setup_py_dir, - show_stdout=False) - - self.install_succeeded = True - - def check_if_exists(self): - """Find an installed distribution that satisfies or conflicts - with this requirement, and set self.satisfied_by or - self.conflicts_with appropriately. - """ - if self.req is None: - return False - try: - # get_distribution() will resolve the entire list of requirements - # anyway, and we've already determined that we need the requirement - # in question, so strip the marker so that we don't try to - # evaluate it. - no_marker = Requirement(str(self.req)) - no_marker.marker = None - self.satisfied_by = pkg_resources.get_distribution(str(no_marker)) - if self.editable and self.satisfied_by: - self.conflicts_with = self.satisfied_by - # when installing editables, nothing pre-existing should ever - # satisfy - self.satisfied_by = None - return True - except pkg_resources.DistributionNotFound: - return False - except pkg_resources.VersionConflict: - existing_dist = pkg_resources.get_distribution( - self.req.name - ) - if self.use_user_site: - if dist_in_usersite(existing_dist): - self.conflicts_with = existing_dist - elif (running_under_virtualenv() and - dist_in_site_packages(existing_dist)): - raise InstallationError( - "Will not install to the user site because it will " - "lack sys.path precedence to %s in %s" % - (existing_dist.project_name, existing_dist.location) - ) - else: - self.conflicts_with = existing_dist - return True - - @property - def is_wheel(self): - return self.link and self.link.is_wheel - - def move_wheel_files(self, 
wheeldir, root=None, prefix=None): - move_wheel_files( - self.name, self.req, wheeldir, - user=self.use_user_site, - home=self.target_dir, - root=root, - prefix=prefix, - pycompile=self.pycompile, - isolated=self.isolated, - ) - - def get_dist(self): - """Return a pkg_resources.Distribution built from self.egg_info_path""" - egg_info = self.egg_info_path('').rstrip('/') - base_dir = os.path.dirname(egg_info) - metadata = pkg_resources.PathMetadata(base_dir, egg_info) - dist_name = os.path.splitext(os.path.basename(egg_info))[0] - return pkg_resources.Distribution( - os.path.dirname(egg_info), - project_name=dist_name, - metadata=metadata) - - @property - def has_hash_options(self): - """Return whether any known-good hashes are specified as options. - - These activate --require-hashes mode; hashes specified as part of a - URL do not. - - """ - return bool(self.options.get('hashes', {})) - - def hashes(self, trust_internet=True): - """Return a hash-comparer that considers my option- and URL-based - hashes to be known-good. - - Hashes in URLs--ones embedded in the requirements file, not ones - downloaded from an index server--are almost peers with ones from - flags. They satisfy --require-hashes (whether it was implicitly or - explicitly activated) but do not activate it. md5 and sha224 are not - allowed in flags, which should nudge people toward good algos. We - always OR all hashes together, even ones from URLs. - - :param trust_internet: Whether to trust URL-based (#md5=...) hashes - downloaded from the internet, as by populate_link() - - """ - good_hashes = self.options.get('hashes', {}).copy() - link = self.link if trust_internet else self.original_link - if link and link.hash: - good_hashes.setdefault(link.hash_name, []).append(link.hash) - return Hashes(good_hashes) - - -def _strip_postfix(req): - """ - Strip req postfix ( -dev, 0.2, etc ) - """ - # FIXME: use package_to_requirement? 
- match = re.search(r'^(.*?)(?:-dev|-\d.*)$', req) - if match: - # Strip off -dev, -0.2, etc. - req = match.group(1) - return req - - -def parse_editable(editable_req, default_vcs=None): - """Parses an editable requirement into: - - a requirement name - - an URL - - extras - - editable options - Accepted requirements: - svn+http://blahblah@rev#egg=Foobar[baz]&subdirectory=version_subdir - .[some_extra] - """ - - from pip.index import Link - - url = editable_req - extras = None - - # If a file path is specified with extras, strip off the extras. - m = re.match(r'^(.+)(\[[^\]]+\])$', url) - if m: - url_no_extras = m.group(1) - extras = m.group(2) - else: - url_no_extras = url - - if os.path.isdir(url_no_extras): - if not os.path.exists(os.path.join(url_no_extras, 'setup.py')): - raise InstallationError( - "Directory %r is not installable. File 'setup.py' not found." % - url_no_extras - ) - # Treating it as code that has already been checked out - url_no_extras = path_to_url(url_no_extras) - - if url_no_extras.lower().startswith('file:'): - package_name = Link(url_no_extras).egg_fragment - if extras: - return ( - package_name, - url_no_extras, - Requirement("placeholder" + extras.lower()).extras, - ) - else: - return package_name, url_no_extras, None - - for version_control in vcs: - if url.lower().startswith('%s:' % version_control): - url = '%s+%s' % (version_control, url) - break - - if '+' not in url: - if default_vcs: - warnings.warn( - "--default-vcs has been deprecated and will be removed in " - "the future.", - RemovedInPip10Warning, - ) - url = default_vcs + '+' + url - else: - raise InstallationError( - '%s should either be a path to a local project or a VCS url ' - 'beginning with svn+, git+, hg+, or bzr+' % - editable_req - ) - - vc_type = url.split('+', 1)[0].lower() - - if not vcs.get_backend(vc_type): - error_message = 'For --editable=%s only ' % editable_req + \ - ', '.join([backend.name + '+URL' for backend in vcs.backends]) + \ - ' is currently 
supported' - raise InstallationError(error_message) - - package_name = Link(url).egg_fragment - if not package_name: - raise InstallationError( - "Could not detect requirement name, please specify one with #egg=" - ) - if not package_name: - raise InstallationError( - '--editable=%s is not the right format; it must have ' - '#egg=Package' % editable_req - ) - return _strip_postfix(package_name), url, None diff --git a/lib/python3.7/site-packages/pip/_internal/req/req_set.py b/lib/python3.7/site-packages/pip/_internal/req/req_set.py new file mode 100644 index 0000000..b198317 --- /dev/null +++ b/lib/python3.7/site-packages/pip/_internal/req/req_set.py @@ -0,0 +1,181 @@ +from __future__ import absolute_import + +import logging +from collections import OrderedDict + +from pip._internal.exceptions import InstallationError +from pip._internal.utils.logging import indent_log +from pip._internal.wheel import Wheel + +logger = logging.getLogger(__name__) + + +class RequirementSet(object): + + def __init__(self, require_hashes=False, check_supported_wheels=True): + """Create a RequirementSet. 
+ """ + + self.requirements = OrderedDict() + self.require_hashes = require_hashes + self.check_supported_wheels = check_supported_wheels + + # Mapping of alias: real_name + self.requirement_aliases = {} + self.unnamed_requirements = [] + self.successfully_downloaded = [] + self.reqs_to_cleanup = [] + + def __str__(self): + reqs = [req for req in self.requirements.values() + if not req.comes_from] + reqs.sort(key=lambda req: req.name.lower()) + return ' '.join([str(req.req) for req in reqs]) + + def __repr__(self): + reqs = [req for req in self.requirements.values()] + reqs.sort(key=lambda req: req.name.lower()) + reqs_str = ', '.join([str(req.req) for req in reqs]) + return ('<%s object; %d requirement(s): %s>' + % (self.__class__.__name__, len(reqs), reqs_str)) + + def add_requirement(self, install_req, parent_req_name=None, + extras_requested=None): + """Add install_req as a requirement to install. + + :param parent_req_name: The name of the requirement that needed this + added. The name is used because when multiple unnamed requirements + resolve to the same name, we could otherwise end up with dependency + links that point outside the Requirements set. parent_req must + already be added. Note that None implies that this is a user + supplied requirement, vs an inferred one. + :param extras_requested: an iterable of extras used to evaluate the + environment markers. + :return: Additional requirements to scan. That is either [] if + the requirement is not applicable, or [install_req] if the + requirement is applicable and has just been added. + """ + name = install_req.name + + # If the markers do not match, ignore this requirement. + if not install_req.match_markers(extras_requested): + logger.info( + "Ignoring %s: markers '%s' don't match your environment", + name, install_req.markers, + ) + return [], None + + # If the wheel is not supported, raise an error. 
+        # Should check this after filtering out based on environment markers to
+        # allow specifying different wheels based on the environment/OS, in a
+        # single requirements file.
+        if install_req.link and install_req.link.is_wheel:
+            wheel = Wheel(install_req.link.filename)
+            if self.check_supported_wheels and not wheel.supported():
+                raise InstallationError(
+                    "%s is not a supported wheel on this platform." %
+                    wheel.filename
+                )
+
+        # This next bit is really a sanity check.
+        assert install_req.is_direct == (parent_req_name is None), (
+            "a direct req shouldn't have a parent and also, "
+            "a non direct req should have a parent"
+        )
+
+        # Unnamed requirements are scanned again and the requirement won't be
+        # added as a dependency until after scanning.
+        if not name:
+            # url or path requirement w/o an egg fragment
+            self.unnamed_requirements.append(install_req)
+            return [install_req], None
+
+        try:
+            existing_req = self.get_requirement(name)
+        except KeyError:
+            existing_req = None
+
+        has_conflicting_requirement = (
+            parent_req_name is None and
+            existing_req and
+            not existing_req.constraint and
+            existing_req.extras == install_req.extras and
+            existing_req.req.specifier != install_req.req.specifier
+        )
+        if has_conflicting_requirement:
+            raise InstallationError(
+                "Double requirement given: %s (already in %s, name=%r)"
+                % (install_req, existing_req, name)
+            )
+
+        # When no existing requirement exists, add the requirement as a
+        # dependency and it will be scanned again after.
+        if not existing_req:
+            self.requirements[name] = install_req
+            # FIXME: what about other normalizations? E.g., _ vs. -?
+            if name.lower() != name:
+                self.requirement_aliases[name.lower()] = name
+            # We'd want to rescan this requirement later
+            return [install_req], install_req
+
+        # Assume there's no need to scan, and that we've already
+        # encountered this for scanning.
+ if install_req.constraint or not existing_req.constraint: + return [], existing_req + + does_not_satisfy_constraint = ( + install_req.link and + not ( + existing_req.link and + install_req.link.path == existing_req.link.path + ) + ) + if does_not_satisfy_constraint: + self.reqs_to_cleanup.append(install_req) + raise InstallationError( + "Could not satisfy constraints for '%s': " + "installation from path or url cannot be " + "constrained to a version" % name, + ) + # If we're now installing a constraint, mark the existing + # object for real installation. + existing_req.constraint = False + existing_req.extras = tuple(sorted( + set(existing_req.extras) | set(install_req.extras) + )) + logger.debug( + "Setting %s extras to: %s", + existing_req, existing_req.extras, + ) + # Return the existing requirement for addition to the parent and + # scanning again. + return [existing_req], existing_req + + def has_requirement(self, project_name): + name = project_name.lower() + if (name in self.requirements and + not self.requirements[name].constraint or + name in self.requirement_aliases and + not self.requirements[self.requirement_aliases[name]].constraint): + return True + return False + + @property + def has_requirements(self): + return list(req for req in self.requirements.values() if not + req.constraint) or self.unnamed_requirements + + def get_requirement(self, project_name): + for name in project_name, project_name.lower(): + if name in self.requirements: + return self.requirements[name] + if name in self.requirement_aliases: + return self.requirements[self.requirement_aliases[name]] + raise KeyError("No project with the name %r" % project_name) + + def cleanup_files(self): + """Clean up files, remove builds.""" + logger.debug('Cleaning up...') + with indent_log(): + for req in self.reqs_to_cleanup: + req.remove_temporary_source() diff --git a/lib/python3.7/site-packages/pip/_internal/req/req_tracker.py b/lib/python3.7/site-packages/pip/_internal/req/req_tracker.py 
new file mode 100644
index 0000000..0a86f4c
--- /dev/null
+++ b/lib/python3.7/site-packages/pip/_internal/req/req_tracker.py
@@ -0,0 +1,76 @@
+from __future__ import absolute_import
+
+import contextlib
+import errno
+import hashlib
+import logging
+import os
+
+from pip._internal.utils.temp_dir import TempDirectory
+
+logger = logging.getLogger(__name__)
+
+
+class RequirementTracker(object):
+
+    def __init__(self):
+        self._root = os.environ.get('PIP_REQ_TRACKER')
+        if self._root is None:
+            self._temp_dir = TempDirectory(delete=False, kind='req-tracker')
+            self._temp_dir.create()
+            self._root = os.environ['PIP_REQ_TRACKER'] = self._temp_dir.path
+            logger.debug('Created requirements tracker %r', self._root)
+        else:
+            self._temp_dir = None
+            logger.debug('Re-using requirements tracker %r', self._root)
+        self._entries = set()
+
+    def __enter__(self):
+        return self
+
+    def __exit__(self, exc_type, exc_val, exc_tb):
+        self.cleanup()
+
+    def _entry_path(self, link):
+        hashed = hashlib.sha224(link.url_without_fragment.encode()).hexdigest()
+        return os.path.join(self._root, hashed)
+
+    def add(self, req):
+        link = req.link
+        info = str(req)
+        entry_path = self._entry_path(link)
+        try:
+            with open(entry_path) as fp:
+                # Error, there's already a build in progress.
+ raise LookupError('%s is already being built: %s' + % (link, fp.read())) + except IOError as e: + if e.errno != errno.ENOENT: + raise + assert req not in self._entries + with open(entry_path, 'w') as fp: + fp.write(info) + self._entries.add(req) + logger.debug('Added %s to build tracker %r', req, self._root) + + def remove(self, req): + link = req.link + self._entries.remove(req) + os.unlink(self._entry_path(link)) + logger.debug('Removed %s from build tracker %r', req, self._root) + + def cleanup(self): + for req in set(self._entries): + self.remove(req) + remove = self._temp_dir is not None + if remove: + self._temp_dir.cleanup() + logger.debug('%s build tracker %r', + 'Removed' if remove else 'Cleaned', + self._root) + + @contextlib.contextmanager + def track(self, req): + self.add(req) + yield + self.remove(req) diff --git a/lib/python3.7/site-packages/pip/_internal/req/req_uninstall.py b/lib/python3.7/site-packages/pip/_internal/req/req_uninstall.py new file mode 100644 index 0000000..a7d8230 --- /dev/null +++ b/lib/python3.7/site-packages/pip/_internal/req/req_uninstall.py @@ -0,0 +1,460 @@ +from __future__ import absolute_import + +import csv +import functools +import logging +import os +import sys +import sysconfig + +from pip._vendor import pkg_resources + +from pip._internal.exceptions import UninstallationError +from pip._internal.locations import bin_py, bin_user +from pip._internal.utils.compat import WINDOWS, cache_from_source, uses_pycache +from pip._internal.utils.logging import indent_log +from pip._internal.utils.misc import ( + FakeFile, ask, dist_in_usersite, dist_is_local, egg_link_path, is_local, + normalize_path, renames, +) +from pip._internal.utils.temp_dir import TempDirectory + +logger = logging.getLogger(__name__) + + +def _script_names(dist, script_name, is_gui): + """Create the fully qualified name of the files created by + {console,gui}_scripts for the given ``dist``. 
+ Returns the list of file names + """ + if dist_in_usersite(dist): + bin_dir = bin_user + else: + bin_dir = bin_py + exe_name = os.path.join(bin_dir, script_name) + paths_to_remove = [exe_name] + if WINDOWS: + paths_to_remove.append(exe_name + '.exe') + paths_to_remove.append(exe_name + '.exe.manifest') + if is_gui: + paths_to_remove.append(exe_name + '-script.pyw') + else: + paths_to_remove.append(exe_name + '-script.py') + return paths_to_remove + + +def _unique(fn): + @functools.wraps(fn) + def unique(*args, **kw): + seen = set() + for item in fn(*args, **kw): + if item not in seen: + seen.add(item) + yield item + return unique + + +@_unique +def uninstallation_paths(dist): + """ + Yield all the uninstallation paths for dist based on RECORD-without-.py[co] + + Yield paths to all the files in RECORD. For each .py file in RECORD, add + the .pyc and .pyo in the same directory. + + UninstallPathSet.add() takes care of the __pycache__ .py[co]. + """ + r = csv.reader(FakeFile(dist.get_metadata_lines('RECORD'))) + for row in r: + path = os.path.join(dist.location, row[0]) + yield path + if path.endswith('.py'): + dn, fn = os.path.split(path) + base = fn[:-3] + path = os.path.join(dn, base + '.pyc') + yield path + path = os.path.join(dn, base + '.pyo') + yield path + + +def compact(paths): + """Compact a path set to contain the minimal number of paths + necessary to contain all paths in the set. If /a/path/ and + /a/path/to/a/file.txt are both in the set, leave only the + shorter path.""" + + sep = os.path.sep + short_paths = set() + for path in sorted(paths, key=len): + should_add = any( + path.startswith(shortpath.rstrip("*")) and + path[len(shortpath.rstrip("*").rstrip(sep))] == sep + for shortpath in short_paths + ) + if not should_add: + short_paths.add(path) + return short_paths + + +def compress_for_output_listing(paths): + """Returns a tuple of 2 sets of which paths to display to user + + The first set contains paths that would be deleted. 
Files of a package
+    are not added and the top-level directory of the package has a '*' added
+    at the end - to signify that all its contents are removed.
+
+    The second set contains files that would have been skipped in the above
+    folders.
+    """
+
+    will_remove = list(paths)
+    will_skip = set()
+
+    # Determine folders and files
+    folders = set()
+    files = set()
+    for path in will_remove:
+        if path.endswith(".pyc"):
+            continue
+        if path.endswith("__init__.py") or ".dist-info" in path:
+            folders.add(os.path.dirname(path))
+        files.add(path)
+
+    _normcased_files = set(map(os.path.normcase, files))
+
+    folders = compact(folders)
+
+    # This walks the tree using os.walk to not miss extra folders
+    # that might get added.
+    for folder in folders:
+        for dirpath, _, dirfiles in os.walk(folder):
+            for fname in dirfiles:
+                if fname.endswith(".pyc"):
+                    continue
+
+                file_ = os.path.join(dirpath, fname)
+                if (os.path.isfile(file_) and
+                        os.path.normcase(file_) not in _normcased_files):
+                    # We are skipping this file. Add it to the set.
+                    will_skip.add(file_)
+
+    will_remove = files | {
+        os.path.join(folder, "*") for folder in folders
+    }
+
+    return will_remove, will_skip
+
+
+class UninstallPathSet(object):
+    """A set of file paths to be removed in the uninstallation of a
+    requirement."""
+    def __init__(self, dist):
+        self.paths = set()
+        self._refuse = set()
+        self.pth = {}
+        self.dist = dist
+        self.save_dir = TempDirectory(kind="uninstall")
+        self._moved_paths = []
+
+    def _permitted(self, path):
+        """
+        Return True if the given path is one we are permitted to
+        remove/modify, False otherwise.
+ + """ + return is_local(path) + + def add(self, path): + head, tail = os.path.split(path) + + # we normalize the head to resolve parent directory symlinks, but not + # the tail, since we only want to uninstall symlinks, not their targets + path = os.path.join(normalize_path(head), os.path.normcase(tail)) + + if not os.path.exists(path): + return + if self._permitted(path): + self.paths.add(path) + else: + self._refuse.add(path) + + # __pycache__ files can show up after 'installed-files.txt' is created, + # due to imports + if os.path.splitext(path)[1] == '.py' and uses_pycache: + self.add(cache_from_source(path)) + + def add_pth(self, pth_file, entry): + pth_file = normalize_path(pth_file) + if self._permitted(pth_file): + if pth_file not in self.pth: + self.pth[pth_file] = UninstallPthEntries(pth_file) + self.pth[pth_file].add(entry) + else: + self._refuse.add(pth_file) + + def _stash(self, path): + return os.path.join( + self.save_dir.path, os.path.splitdrive(path)[1].lstrip(os.path.sep) + ) + + def remove(self, auto_confirm=False, verbose=False): + """Remove paths in ``self.paths`` with confirmation (unless + ``auto_confirm`` is True).""" + + if not self.paths: + logger.info( + "Can't uninstall '%s'. 
No files were found to uninstall.", + self.dist.project_name, + ) + return + + dist_name_version = ( + self.dist.project_name + "-" + self.dist.version + ) + logger.info('Uninstalling %s:', dist_name_version) + + with indent_log(): + if auto_confirm or self._allowed_to_proceed(verbose): + self.save_dir.create() + + for path in sorted(compact(self.paths)): + new_path = self._stash(path) + logger.debug('Removing file or directory %s', path) + self._moved_paths.append(path) + renames(path, new_path) + for pth in self.pth.values(): + pth.remove() + + logger.info('Successfully uninstalled %s', dist_name_version) + + def _allowed_to_proceed(self, verbose): + """Display which files would be deleted and prompt for confirmation + """ + + def _display(msg, paths): + if not paths: + return + + logger.info(msg) + with indent_log(): + for path in sorted(compact(paths)): + logger.info(path) + + if not verbose: + will_remove, will_skip = compress_for_output_listing(self.paths) + else: + # In verbose mode, display all the files that are going to be + # deleted. + will_remove = list(self.paths) + will_skip = set() + + _display('Would remove:', will_remove) + _display('Would not remove (might be manually added):', will_skip) + _display('Would not remove (outside of prefix):', self._refuse) + + return ask('Proceed (y/n)? 
', ('y', 'n')) == 'y' + + def rollback(self): + """Rollback the changes previously made by remove().""" + if self.save_dir.path is None: + logger.error( + "Can't roll back %s; was not uninstalled", + self.dist.project_name, + ) + return False + logger.info('Rolling back uninstall of %s', self.dist.project_name) + for path in self._moved_paths: + tmp_path = self._stash(path) + logger.debug('Replacing %s', path) + renames(tmp_path, path) + for pth in self.pth.values(): + pth.rollback() + + def commit(self): + """Remove temporary save dir: rollback will no longer be possible.""" + self.save_dir.cleanup() + self._moved_paths = [] + + @classmethod + def from_dist(cls, dist): + dist_path = normalize_path(dist.location) + if not dist_is_local(dist): + logger.info( + "Not uninstalling %s at %s, outside environment %s", + dist.key, + dist_path, + sys.prefix, + ) + return cls(dist) + + if dist_path in {p for p in {sysconfig.get_path("stdlib"), + sysconfig.get_path("platstdlib")} + if p}: + logger.info( + "Not uninstalling %s at %s, as it is in the standard library.", + dist.key, + dist_path, + ) + return cls(dist) + + paths_to_remove = cls(dist) + develop_egg_link = egg_link_path(dist) + develop_egg_link_egg_info = '{}.egg-info'.format( + pkg_resources.to_filename(dist.project_name)) + egg_info_exists = dist.egg_info and os.path.exists(dist.egg_info) + # Special case for distutils installed package + distutils_egg_info = getattr(dist._provider, 'path', None) + + # Uninstall cases order do matter as in the case of 2 installs of the + # same package, pip needs to uninstall the currently detected version + if (egg_info_exists and dist.egg_info.endswith('.egg-info') and + not dist.egg_info.endswith(develop_egg_link_egg_info)): + # if dist.egg_info.endswith(develop_egg_link_egg_info), we + # are in fact in the develop_egg_link case + paths_to_remove.add(dist.egg_info) + if dist.has_metadata('installed-files.txt'): + for installed_file in dist.get_metadata( + 
'installed-files.txt').splitlines(): + path = os.path.normpath( + os.path.join(dist.egg_info, installed_file) + ) + paths_to_remove.add(path) + # FIXME: need a test for this elif block + # occurs with --single-version-externally-managed/--record outside + # of pip + elif dist.has_metadata('top_level.txt'): + if dist.has_metadata('namespace_packages.txt'): + namespaces = dist.get_metadata('namespace_packages.txt') + else: + namespaces = [] + for top_level_pkg in [ + p for p + in dist.get_metadata('top_level.txt').splitlines() + if p and p not in namespaces]: + path = os.path.join(dist.location, top_level_pkg) + paths_to_remove.add(path) + paths_to_remove.add(path + '.py') + paths_to_remove.add(path + '.pyc') + paths_to_remove.add(path + '.pyo') + + elif distutils_egg_info: + raise UninstallationError( + "Cannot uninstall {!r}. It is a distutils installed project " + "and thus we cannot accurately determine which files belong " + "to it which would lead to only a partial uninstall.".format( + dist.project_name, + ) + ) + + elif dist.location.endswith('.egg'): + # package installed by easy_install + # We cannot match on dist.egg_name because it can slightly vary + # i.e. 
setuptools-0.6c11-py2.6.egg vs setuptools-0.6rc11-py2.6.egg + paths_to_remove.add(dist.location) + easy_install_egg = os.path.split(dist.location)[1] + easy_install_pth = os.path.join(os.path.dirname(dist.location), + 'easy-install.pth') + paths_to_remove.add_pth(easy_install_pth, './' + easy_install_egg) + + elif egg_info_exists and dist.egg_info.endswith('.dist-info'): + for path in uninstallation_paths(dist): + paths_to_remove.add(path) + + elif develop_egg_link: + # develop egg + with open(develop_egg_link, 'r') as fh: + link_pointer = os.path.normcase(fh.readline().strip()) + assert (link_pointer == dist.location), ( + 'Egg-link %s does not match installed location of %s ' + '(at %s)' % (link_pointer, dist.project_name, dist.location) + ) + paths_to_remove.add(develop_egg_link) + easy_install_pth = os.path.join(os.path.dirname(develop_egg_link), + 'easy-install.pth') + paths_to_remove.add_pth(easy_install_pth, dist.location) + + else: + logger.debug( + 'Not sure how to uninstall: %s - Check: %s', + dist, dist.location, + ) + + # find distutils scripts= scripts + if dist.has_metadata('scripts') and dist.metadata_isdir('scripts'): + for script in dist.metadata_listdir('scripts'): + if dist_in_usersite(dist): + bin_dir = bin_user + else: + bin_dir = bin_py + paths_to_remove.add(os.path.join(bin_dir, script)) + if WINDOWS: + paths_to_remove.add(os.path.join(bin_dir, script) + '.bat') + + # find console_scripts + _scripts_to_remove = [] + console_scripts = dist.get_entry_map(group='console_scripts') + for name in console_scripts.keys(): + _scripts_to_remove.extend(_script_names(dist, name, False)) + # find gui_scripts + gui_scripts = dist.get_entry_map(group='gui_scripts') + for name in gui_scripts.keys(): + _scripts_to_remove.extend(_script_names(dist, name, True)) + + for s in _scripts_to_remove: + paths_to_remove.add(s) + + return paths_to_remove + + +class UninstallPthEntries(object): + def __init__(self, pth_file): + if not os.path.isfile(pth_file): + raise 
UninstallationError( + "Cannot remove entries from nonexistent file %s" % pth_file + ) + self.file = pth_file + self.entries = set() + self._saved_lines = None + + def add(self, entry): + entry = os.path.normcase(entry) + # On Windows, os.path.normcase converts the entry to use + # backslashes. This is correct for entries that describe absolute + # paths outside of site-packages, but all the others use forward + # slashes. + if WINDOWS and not os.path.splitdrive(entry)[0]: + entry = entry.replace('\\', '/') + self.entries.add(entry) + + def remove(self): + logger.debug('Removing pth entries from %s:', self.file) + with open(self.file, 'rb') as fh: + # windows uses '\r\n' with py3k, but uses '\n' with py2.x + lines = fh.readlines() + self._saved_lines = lines + if any(b'\r\n' in line for line in lines): + endline = '\r\n' + else: + endline = '\n' + # handle missing trailing newline + if lines and not lines[-1].endswith(endline.encode("utf-8")): + lines[-1] = lines[-1] + endline.encode("utf-8") + for entry in self.entries: + try: + logger.debug('Removing entry: %s', entry) + lines.remove((entry + endline).encode("utf-8")) + except ValueError: + pass + with open(self.file, 'wb') as fh: + fh.writelines(lines) + + def rollback(self): + if self._saved_lines is None: + logger.error( + 'Cannot roll back changes to %s, none were made', self.file + ) + return False + logger.debug('Rolling %s back to previous state', self.file) + with open(self.file, 'wb') as fh: + fh.writelines(self._saved_lines) + return True diff --git a/lib/python3.7/site-packages/pip/_internal/resolve.py b/lib/python3.7/site-packages/pip/_internal/resolve.py new file mode 100644 index 0000000..2d9f1c5 --- /dev/null +++ b/lib/python3.7/site-packages/pip/_internal/resolve.py @@ -0,0 +1,353 @@ +"""Dependency Resolution + +The dependency resolution in pip is performed as follows: + +for top-level requirements: + a. only one spec allowed per project, regardless of conflicts or not. 
+ otherwise a "double requirement" exception is raised + b. they override sub-dependency requirements. +for sub-dependencies + a. "first found, wins" (where the order is breadth first) +""" + +import logging +from collections import defaultdict +from itertools import chain + +from pip._internal.exceptions import ( + BestVersionAlreadyInstalled, DistributionNotFound, HashError, HashErrors, + UnsupportedPythonVersion, +) +from pip._internal.req.constructors import install_req_from_req +from pip._internal.utils.logging import indent_log +from pip._internal.utils.misc import dist_in_usersite, ensure_dir +from pip._internal.utils.packaging import check_dist_requires_python + +logger = logging.getLogger(__name__) + + +class Resolver(object): + """Resolves which packages need to be installed/uninstalled to perform \ + the requested operation without breaking the requirements of any package. + """ + + _allowed_strategies = {"eager", "only-if-needed", "to-satisfy-only"} + + def __init__(self, preparer, session, finder, wheel_cache, use_user_site, + ignore_dependencies, ignore_installed, ignore_requires_python, + force_reinstall, isolated, upgrade_strategy): + super(Resolver, self).__init__() + assert upgrade_strategy in self._allowed_strategies + + self.preparer = preparer + self.finder = finder + self.session = session + + # NOTE: This would eventually be replaced with a cache that can give + # information about both sdist and wheels transparently. 
+ self.wheel_cache = wheel_cache + + self.require_hashes = None # This is set in resolve + + self.upgrade_strategy = upgrade_strategy + self.force_reinstall = force_reinstall + self.isolated = isolated + self.ignore_dependencies = ignore_dependencies + self.ignore_installed = ignore_installed + self.ignore_requires_python = ignore_requires_python + self.use_user_site = use_user_site + + self._discovered_dependencies = defaultdict(list) + + def resolve(self, requirement_set): + """Resolve what operations need to be done + + As a side-effect of this method, the packages (and their dependencies) + are downloaded, unpacked and prepared for installation. This + preparation is done by ``pip.operations.prepare``. + + Once PyPI has static dependency metadata available, it would be + possible to move the preparation to become a step separated from + dependency resolution. + """ + # make the wheelhouse + if self.preparer.wheel_download_dir: + ensure_dir(self.preparer.wheel_download_dir) + + # If any top-level requirement has a hash specified, enter + # hash-checking mode, which requires hashes from all. + root_reqs = ( + requirement_set.unnamed_requirements + + list(requirement_set.requirements.values()) + ) + self.require_hashes = ( + requirement_set.require_hashes or + any(req.has_hash_options for req in root_reqs) + ) + + # Display where finder is looking for packages + locations = self.finder.get_formatted_locations() + if locations: + logger.info(locations) + + # Actually prepare the files, and collect any exceptions. Most hash + # exceptions cannot be checked ahead of time, because + # req.populate_link() needs to be called before we can make decisions + # based on link type. 
+ discovered_reqs = [] + hash_errors = HashErrors() + for req in chain(root_reqs, discovered_reqs): + try: + discovered_reqs.extend( + self._resolve_one(requirement_set, req) + ) + except HashError as exc: + exc.req = req + hash_errors.append(exc) + + if hash_errors: + raise hash_errors + + def _is_upgrade_allowed(self, req): + if self.upgrade_strategy == "to-satisfy-only": + return False + elif self.upgrade_strategy == "eager": + return True + else: + assert self.upgrade_strategy == "only-if-needed" + return req.is_direct + + def _set_req_to_reinstall(self, req): + """ + Set a requirement to be installed. + """ + # Don't uninstall the conflict if doing a user install and the + # conflict is not a user install. + if not self.use_user_site or dist_in_usersite(req.satisfied_by): + req.conflicts_with = req.satisfied_by + req.satisfied_by = None + + # XXX: Stop passing requirement_set for options + def _check_skip_installed(self, req_to_install): + """Check if req_to_install should be skipped. + + This will check if the req is installed, and whether we should upgrade + or reinstall it, taking into account all the relevant user options. + + After calling this req_to_install will only have satisfied_by set to + None if the req_to_install is to be upgraded/reinstalled etc. Any + other value will be a dist recording the current thing installed that + satisfies the requirement. + + Note that for vcs urls and the like we can't assess skipping in this + routine - we simply identify that we need to pull the thing down, + then later on it is pulled down and introspected to assess upgrade/ + reinstalls etc. + + :return: A text reason for why it was skipped, or None. 
+ """ + if self.ignore_installed: + return None + + req_to_install.check_if_exists(self.use_user_site) + if not req_to_install.satisfied_by: + return None + + if self.force_reinstall: + self._set_req_to_reinstall(req_to_install) + return None + + if not self._is_upgrade_allowed(req_to_install): + if self.upgrade_strategy == "only-if-needed": + return 'already satisfied, skipping upgrade' + return 'already satisfied' + + # Check for the possibility of an upgrade. For link-based + # requirements we have to pull the tree down and inspect to assess + # the version #, so it's handled way down. + if not req_to_install.link: + try: + self.finder.find_requirement(req_to_install, upgrade=True) + except BestVersionAlreadyInstalled: + # Then the best version is installed. + return 'already up-to-date' + except DistributionNotFound: + # No distribution found, so we squash the error. It will + # be raised later when we re-try later to do the install. + # Why don't we just raise here? + pass + + self._set_req_to_reinstall(req_to_install) + return None + + def _get_abstract_dist_for(self, req): + """Takes a InstallRequirement and returns a single AbstractDist \ + representing a prepared variant of the same. + """ + assert self.require_hashes is not None, ( + "require_hashes should have been set in Resolver.resolve()" + ) + + if req.editable: + return self.preparer.prepare_editable_requirement( + req, self.require_hashes, self.use_user_site, self.finder, + ) + + # satisfied_by is only evaluated by calling _check_skip_installed, + # so it must be None here. 
+ assert req.satisfied_by is None + skip_reason = self._check_skip_installed(req) + + if req.satisfied_by: + return self.preparer.prepare_installed_requirement( + req, self.require_hashes, skip_reason + ) + + upgrade_allowed = self._is_upgrade_allowed(req) + abstract_dist = self.preparer.prepare_linked_requirement( + req, self.session, self.finder, upgrade_allowed, + self.require_hashes + ) + + # NOTE + # The following portion is for determining if a certain package is + # going to be re-installed/upgraded or not and reporting to the user. + # This should probably get cleaned up in a future refactor. + + # req.req is only avail after unpack for URL + # pkgs repeat check_if_exists to uninstall-on-upgrade + # (#14) + if not self.ignore_installed: + req.check_if_exists(self.use_user_site) + + if req.satisfied_by: + should_modify = ( + self.upgrade_strategy != "to-satisfy-only" or + self.force_reinstall or + self.ignore_installed or + req.link.scheme == 'file' + ) + if should_modify: + self._set_req_to_reinstall(req) + else: + logger.info( + 'Requirement already satisfied (use --upgrade to upgrade):' + ' %s', req, + ) + + return abstract_dist + + def _resolve_one(self, requirement_set, req_to_install): + """Prepare a single requirements file. + + :return: A list of additional InstallRequirements to also install. 
+ """ + # Tell user what we are doing for this requirement: + # obtain (editable), skipping, processing (local url), collecting + # (remote url or package name) + if req_to_install.constraint or req_to_install.prepared: + return [] + + req_to_install.prepared = True + + # register tmp src for cleanup in case something goes wrong + requirement_set.reqs_to_cleanup.append(req_to_install) + + abstract_dist = self._get_abstract_dist_for(req_to_install) + + # Parse and return dependencies + dist = abstract_dist.dist(self.finder) + try: + check_dist_requires_python(dist) + except UnsupportedPythonVersion as err: + if self.ignore_requires_python: + logger.warning(err.args[0]) + else: + raise + + more_reqs = [] + + def add_req(subreq, extras_requested): + sub_install_req = install_req_from_req( + str(subreq), + req_to_install, + isolated=self.isolated, + wheel_cache=self.wheel_cache, + ) + parent_req_name = req_to_install.name + to_scan_again, add_to_parent = requirement_set.add_requirement( + sub_install_req, + parent_req_name=parent_req_name, + extras_requested=extras_requested, + ) + if parent_req_name and add_to_parent: + self._discovered_dependencies[parent_req_name].append( + add_to_parent + ) + more_reqs.extend(to_scan_again) + + with indent_log(): + # We add req_to_install before its dependencies, so that we + # can refer to it when adding dependencies. 
+ if not requirement_set.has_requirement(req_to_install.name): + # 'unnamed' requirements will get added here + req_to_install.is_direct = True + requirement_set.add_requirement( + req_to_install, parent_req_name=None, + ) + + if not self.ignore_dependencies: + if req_to_install.extras: + logger.debug( + "Installing extra requirements: %r", + ','.join(req_to_install.extras), + ) + missing_requested = sorted( + set(req_to_install.extras) - set(dist.extras) + ) + for missing in missing_requested: + logger.warning( + '%s does not provide the extra \'%s\'', + dist, missing + ) + + available_requested = sorted( + set(dist.extras) & set(req_to_install.extras) + ) + for subreq in dist.requires(available_requested): + add_req(subreq, extras_requested=available_requested) + + if not req_to_install.editable and not req_to_install.satisfied_by: + # XXX: --no-install leads this to report 'Successfully + # downloaded' for only non-editable reqs, even though we took + # action on them. + requirement_set.successfully_downloaded.append(req_to_install) + + return more_reqs + + def get_installation_order(self, req_set): + """Create the installation order. + + The installation order is topological - requirements are installed + before the requiring thing. We break cycles at an arbitrary point, + and make no other guarantees. + """ + # The current implementation, which we may change at any point + # installs the user specified things in the order given, except when + # dependencies must come earlier to achieve topological order. 
+ order = [] + ordered_reqs = set() + + def schedule(req): + if req.satisfied_by or req in ordered_reqs: + return + if req.constraint: + return + ordered_reqs.add(req) + for dep in self._discovered_dependencies[req.name]: + schedule(dep) + order.append(req) + + for install_req in req_set.requirements.values(): + schedule(install_req) + return order diff --git a/lib/python3.4/site-packages/pkg_resources/_vendor/__init__.py b/lib/python3.7/site-packages/pip/_internal/utils/__init__.py similarity index 100% rename from lib/python3.4/site-packages/pkg_resources/_vendor/__init__.py rename to lib/python3.7/site-packages/pip/_internal/utils/__init__.py diff --git a/lib/python3.4/site-packages/pip/utils/appdirs.py b/lib/python3.7/site-packages/pip/_internal/utils/appdirs.py similarity index 95% rename from lib/python3.4/site-packages/pip/utils/appdirs.py rename to lib/python3.7/site-packages/pip/_internal/utils/appdirs.py index 9b82801..cc96f98 100644 --- a/lib/python3.4/site-packages/pip/utils/appdirs.py +++ b/lib/python3.7/site-packages/pip/_internal/utils/appdirs.py @@ -7,9 +7,10 @@ from __future__ import absolute_import import os import sys -from pip.compat import WINDOWS, expanduser from pip._vendor.six import PY2, text_type +from pip._internal.utils.compat import WINDOWS, expanduser + def user_cache_dir(appname): r""" @@ -60,7 +61,7 @@ def user_cache_dir(appname): def user_data_dir(appname, roaming=False): - """ + r""" Return full path to the user-specific data dir for this application. "appname" is the name of application. @@ -74,6 +75,7 @@ def user_data_dir(appname, roaming=False): Typical user data directories are: macOS: ~/Library/Application Support/ + if it exists, else ~/.config/ Unix: ~/.local/share/ # or in $XDG_DATA_HOME, if defined Win XP (not roaming): C:\Documents and Settings\\ ... 
@@ -93,6 +95,13 @@ def user_data_dir(appname, roaming=False): path = os.path.join( expanduser('~/Library/Application Support/'), appname, + ) if os.path.isdir(os.path.join( + expanduser('~/Library/Application Support/'), + appname, + ) + ) else os.path.join( + expanduser('~/.config/'), + appname, ) else: path = os.path.join( @@ -137,7 +146,7 @@ def user_config_dir(appname, roaming=True): # for the discussion regarding site_config_dirs locations # see def site_config_dirs(appname): - """Return a list of potential user-shared config dirs for this application. + r"""Return a list of potential user-shared config dirs for this application. "appname" is the name of application. @@ -222,6 +231,7 @@ def _get_win_folder_with_ctypes(csidl_name): return buf.value + if WINDOWS: try: import ctypes diff --git a/lib/python3.7/site-packages/pip/_internal/utils/compat.py b/lib/python3.7/site-packages/pip/_internal/utils/compat.py new file mode 100644 index 0000000..3114f2d --- /dev/null +++ b/lib/python3.7/site-packages/pip/_internal/utils/compat.py @@ -0,0 +1,248 @@ +"""Stuff that differs in different Python versions and platform +distributions.""" +from __future__ import absolute_import, division + +import codecs +import locale +import logging +import os +import shutil +import sys + +from pip._vendor.six import text_type + +try: + import ipaddress +except ImportError: + try: + from pip._vendor import ipaddress # type: ignore + except ImportError: + import ipaddr as ipaddress # type: ignore + ipaddress.ip_address = ipaddress.IPAddress + ipaddress.ip_network = ipaddress.IPNetwork + + +__all__ = [ + "ipaddress", "uses_pycache", "console_to_str", "native_str", + "get_path_uid", "stdlib_pkgs", "WINDOWS", "samefile", "get_terminal_size", + "get_extension_suffixes", +] + + +logger = logging.getLogger(__name__) + +if sys.version_info >= (3, 4): + uses_pycache = True + from importlib.util import cache_from_source +else: + import imp + + try: + cache_from_source = imp.cache_from_source # 
type: ignore + except AttributeError: + # does not use __pycache__ + cache_from_source = None + + uses_pycache = cache_from_source is not None + + +if sys.version_info >= (3, 5): + backslashreplace_decode = "backslashreplace" +else: + # In version 3.4 and older, backslashreplace exists + # but does not support use for decoding. + # We implement our own replace handler for this + # situation, so that we can consistently use + # backslash replacement for all versions. + def backslashreplace_decode_fn(err): + raw_bytes = (err.object[i] for i in range(err.start, err.end)) + if sys.version_info[0] == 2: + # Python 2 gave us characters - convert to numeric bytes + raw_bytes = (ord(b) for b in raw_bytes) + return u"".join(u"\\x%x" % c for c in raw_bytes), err.end + codecs.register_error( + "backslashreplace_decode", + backslashreplace_decode_fn, + ) + backslashreplace_decode = "backslashreplace_decode" + + +def console_to_str(data): + """Return a string, safe for output, of subprocess output. + + We assume the data is in the locale preferred encoding. + If it won't decode properly, we warn the user but decode as + best we can. + + We also ensure that the output can be safely written to + standard output without encoding errors. + """ + + # First, get the encoding we assume. This is the preferred + # encoding for the locale, unless that is not found, or + # it is ASCII, in which case assume UTF-8 + encoding = locale.getpreferredencoding() + if (not encoding) or codecs.lookup(encoding).name == "ascii": + encoding = "utf-8" + + # Now try to decode the data - if we fail, warn the user and + # decode with replacement. 
+ try: + s = data.decode(encoding) + except UnicodeDecodeError: + logger.warning( + "Subprocess output does not appear to be encoded as %s", + encoding, + ) + s = data.decode(encoding, errors=backslashreplace_decode) + + # Make sure we can print the output, by encoding it to the output + # encoding with replacement of unencodable characters, and then + # decoding again. + # We use stderr's encoding because it's less likely to be + # redirected and if we don't find an encoding we skip this + # step (on the assumption that output is wrapped by something + # that won't fail). + # The double getattr is to deal with the possibility that we're + # being called in a situation where sys.__stderr__ doesn't exist, + # or doesn't have an encoding attribute. Neither of these cases + # should occur in normal pip use, but there's no harm in checking + # in case people use pip in (unsupported) unusual situations. + output_encoding = getattr(getattr(sys, "__stderr__", None), + "encoding", None) + + if output_encoding: + s = s.encode(output_encoding, errors="backslashreplace") + s = s.decode(output_encoding) + + return s + + +if sys.version_info >= (3,): + def native_str(s, replace=False): + if isinstance(s, bytes): + return s.decode('utf-8', 'replace' if replace else 'strict') + return s + +else: + def native_str(s, replace=False): + # Replace is ignored -- unicode to UTF-8 can't fail + if isinstance(s, text_type): + return s.encode('utf-8') + return s + + +def get_path_uid(path): + """ + Return path's uid. + + Does not follow symlinks: + https://github.com/pypa/pip/pull/935#discussion_r5307003 + + Placed this function in compat due to differences on AIX and + Jython, that should eventually go away. + + :raises OSError: When path is a symlink or can't be read. 
+ """ + if hasattr(os, 'O_NOFOLLOW'): + fd = os.open(path, os.O_RDONLY | os.O_NOFOLLOW) + file_uid = os.fstat(fd).st_uid + os.close(fd) + else: # AIX and Jython + # WARNING: time of check vulnerability, but best we can do w/o NOFOLLOW + if not os.path.islink(path): + # older versions of Jython don't have `os.fstat` + file_uid = os.stat(path).st_uid + else: + # raise OSError for parity with os.O_NOFOLLOW above + raise OSError( + "%s is a symlink; Will not return uid for symlinks" % path + ) + return file_uid + + +if sys.version_info >= (3, 4): + from importlib.machinery import EXTENSION_SUFFIXES + + def get_extension_suffixes(): + return EXTENSION_SUFFIXES +else: + from imp import get_suffixes + + def get_extension_suffixes(): + return [suffix[0] for suffix in get_suffixes()] + + +def expanduser(path): + """ + Expand ~ and ~user constructions. + + Includes a workaround for https://bugs.python.org/issue14768 + """ + expanded = os.path.expanduser(path) + if path.startswith('~/') and expanded.startswith('//'): + expanded = expanded[1:] + return expanded + + +# packages in the stdlib that may have installation metadata, but should not be +# considered 'installed'. 
this theoretically could be determined based on +# dist.location (py27:`sysconfig.get_paths()['stdlib']`, +# py26:sysconfig.get_config_vars('LIBDEST')), but fear platform variation may +# make this ineffective, so hard-coding +stdlib_pkgs = {"python", "wsgiref", "argparse"} + + +# windows detection, covers cpython and ironpython +WINDOWS = (sys.platform.startswith("win") or + (sys.platform == 'cli' and os.name == 'nt')) + + +def samefile(file1, file2): + """Provide an alternative for os.path.samefile on Windows/Python2""" + if hasattr(os.path, 'samefile'): + return os.path.samefile(file1, file2) + else: + path1 = os.path.normcase(os.path.abspath(file1)) + path2 = os.path.normcase(os.path.abspath(file2)) + return path1 == path2 + + +if hasattr(shutil, 'get_terminal_size'): + def get_terminal_size(): + """ + Returns a tuple (x, y) representing the width(x) and the height(y) + in characters of the terminal window. + """ + return tuple(shutil.get_terminal_size()) +else: + def get_terminal_size(): + """ + Returns a tuple (x, y) representing the width(x) and the height(y) + in characters of the terminal window. 
+ """ + def ioctl_GWINSZ(fd): + try: + import fcntl + import termios + import struct + cr = struct.unpack_from( + 'hh', + fcntl.ioctl(fd, termios.TIOCGWINSZ, '12345678') + ) + except Exception: + return None + if cr == (0, 0): + return None + return cr + cr = ioctl_GWINSZ(0) or ioctl_GWINSZ(1) or ioctl_GWINSZ(2) + if not cr: + try: + fd = os.open(os.ctermid(), os.O_RDONLY) + cr = ioctl_GWINSZ(fd) + os.close(fd) + except Exception: + pass + if not cr: + cr = (os.environ.get('LINES', 25), os.environ.get('COLUMNS', 80)) + return int(cr[1]), int(cr[0]) diff --git a/lib/python3.7/site-packages/pip/_internal/utils/deprecation.py b/lib/python3.7/site-packages/pip/_internal/utils/deprecation.py new file mode 100644 index 0000000..bd744cf --- /dev/null +++ b/lib/python3.7/site-packages/pip/_internal/utils/deprecation.py @@ -0,0 +1,89 @@ +""" +A module that implements tooling to enable easy warnings about deprecations. +""" +from __future__ import absolute_import + +import logging +import warnings + +from pip._vendor.packaging.version import parse + +from pip import __version__ as current_version +from pip._internal.utils.typing import MYPY_CHECK_RUNNING + +if MYPY_CHECK_RUNNING: + from typing import Any, Optional # noqa: F401 + + +class PipDeprecationWarning(Warning): + pass + + +_original_showwarning = None # type: Any + + +# Warnings <-> Logging Integration +def _showwarning(message, category, filename, lineno, file=None, line=None): + if file is not None: + if _original_showwarning is not None: + _original_showwarning( + message, category, filename, lineno, file, line, + ) + elif issubclass(category, PipDeprecationWarning): + # We use a specially named logger which will handle all of the + # deprecation messages for pip. 
+ logger = logging.getLogger("pip._internal.deprecations") + logger.warning(message) + else: + _original_showwarning( + message, category, filename, lineno, file, line, + ) + + + def install_warning_logger(): + # Enable our Deprecation Warnings + warnings.simplefilter("default", PipDeprecationWarning, append=True) + + global _original_showwarning + + if _original_showwarning is None: + _original_showwarning = warnings.showwarning + warnings.showwarning = _showwarning + + + def deprecated(reason, replacement, gone_in, issue=None): + # type: (str, Optional[str], Optional[str], Optional[int]) -> None + """Helper to deprecate existing functionality. + + reason: + Textual reason shown to the user about why this functionality has + been deprecated. + replacement: + Textual suggestion shown to the user about what alternative + functionality they can use. + gone_in: + The version of pip in which this functionality should be removed. + Raises errors if pip's current version is greater than or equal to + this. + issue: + Issue number on the tracker that would serve as a useful place for + users to find related discussion and provide feedback. + + Always pass replacement, gone_in and issue as keyword arguments for clarity + at the call site. + """ + + # Construct a nice message. + # This is purposely eagerly formatted as we want it to appear as if someone + # typed this entire message out. + message = "DEPRECATION: " + reason + if replacement is not None: + message += " A possible replacement is {}.".format(replacement) + if issue is not None: + url = "https://github.com/pypa/pip/issues/" + str(issue) + message += " You can find discussion regarding this at {}.".format(url) + + # Raise as an error if it has to be removed.
+ if gone_in is not None and parse(current_version) >= parse(gone_in): + raise PipDeprecationWarning(message) + warnings.warn(message, category=PipDeprecationWarning, stacklevel=2) diff --git a/lib/python3.4/site-packages/pip/utils/encoding.py b/lib/python3.7/site-packages/pip/_internal/utils/encoding.py similarity index 83% rename from lib/python3.4/site-packages/pip/utils/encoding.py rename to lib/python3.7/site-packages/pip/_internal/utils/encoding.py index 2483168..56f6036 100644 --- a/lib/python3.4/site-packages/pip/utils/encoding.py +++ b/lib/python3.7/site-packages/pip/_internal/utils/encoding.py @@ -1,7 +1,7 @@ import codecs import locale import re - +import sys BOMS = [ (codecs.BOM_UTF8, 'utf8'), @@ -13,7 +13,7 @@ BOMS = [ (codecs.BOM_UTF32_LE, 'utf32-le'), ] -ENCODING_RE = re.compile(b'coding[:=]\s*([-\w.]+)') +ENCODING_RE = re.compile(br'coding[:=]\s*([-\w.]+)') def auto_decode(data): @@ -28,4 +28,6 @@ def auto_decode(data): if line[0:1] == b'#' and ENCODING_RE.search(line): encoding = ENCODING_RE.search(line).groups()[0].decode('ascii') return data.decode(encoding) - return data.decode(locale.getpreferredencoding(False)) + return data.decode( + locale.getpreferredencoding(False) or sys.getdefaultencoding(), + ) diff --git a/lib/python3.4/site-packages/pip/utils/filesystem.py b/lib/python3.7/site-packages/pip/_internal/utils/filesystem.py similarity index 94% rename from lib/python3.4/site-packages/pip/utils/filesystem.py rename to lib/python3.7/site-packages/pip/_internal/utils/filesystem.py index 25ad516..1e9cebd 100644 --- a/lib/python3.4/site-packages/pip/utils/filesystem.py +++ b/lib/python3.7/site-packages/pip/_internal/utils/filesystem.py @@ -1,7 +1,7 @@ import os import os.path -from pip.compat import get_path_uid +from pip._internal.utils.compat import get_path_uid def check_path_owner(path): diff --git a/lib/python3.4/site-packages/pip/utils/glibc.py b/lib/python3.7/site-packages/pip/_internal/utils/glibc.py similarity index 93% rename from 
lib/python3.4/site-packages/pip/utils/glibc.py rename to lib/python3.7/site-packages/pip/_internal/utils/glibc.py index 7847885..ebcfc5b 100644 --- a/lib/python3.4/site-packages/pip/utils/glibc.py +++ b/lib/python3.7/site-packages/pip/_internal/utils/glibc.py @@ -1,8 +1,7 @@ from __future__ import absolute_import -import re import ctypes -import platform +import re import warnings @@ -73,9 +72,13 @@ def have_compatible_glibc(required_major, minimum_minor): # misleading. Solution: instead of using platform, use our code that actually # works. def libc_ver(): + """Try to determine the glibc version + + Returns a tuple of strings (lib, version) which default to empty strings + in case the lookup fails. + """ glibc_version = glibc_version_string() if glibc_version is None: - # For non-glibc platforms, fall back on platform.libc_ver - return platform.libc_ver() + return ("", "") else: return ("glibc", glibc_version) diff --git a/lib/python3.4/site-packages/pip/utils/hashes.py b/lib/python3.7/site-packages/pip/_internal/utils/hashes.py similarity index 95% rename from lib/python3.4/site-packages/pip/utils/hashes.py rename to lib/python3.7/site-packages/pip/_internal/utils/hashes.py index 9602970..8b909ba 100644 --- a/lib/python3.4/site-packages/pip/utils/hashes.py +++ b/lib/python3.7/site-packages/pip/_internal/utils/hashes.py @@ -2,10 +2,12 @@ from __future__ import absolute_import import hashlib -from pip.exceptions import HashMismatch, HashMissing, InstallationError -from pip.utils import read_chunks from pip._vendor.six import iteritems, iterkeys, itervalues +from pip._internal.exceptions import ( + HashMismatch, HashMissing, InstallationError, +) +from pip._internal.utils.misc import read_chunks # The recommended hash algo of the moment. Change this whenever the state of # the art changes; it won't hurt backward compatibility. 
diff --git a/lib/python3.7/site-packages/pip/_internal/utils/logging.py b/lib/python3.7/site-packages/pip/_internal/utils/logging.py new file mode 100644 index 0000000..d9b9541 --- /dev/null +++ b/lib/python3.7/site-packages/pip/_internal/utils/logging.py @@ -0,0 +1,225 @@ +from __future__ import absolute_import + +import contextlib +import logging +import logging.handlers +import os + +from pip._internal.utils.compat import WINDOWS +from pip._internal.utils.misc import ensure_dir + +try: + import threading +except ImportError: + import dummy_threading as threading # type: ignore + + +try: + from pip._vendor import colorama +# Lots of different errors can come from this, including SystemError and +# ImportError. +except Exception: + colorama = None + + +_log_state = threading.local() +_log_state.indentation = 0 + + +@contextlib.contextmanager +def indent_log(num=2): + """ + A context manager which will cause the log output to be indented for any + log messages emitted inside it. + """ + _log_state.indentation += num + try: + yield + finally: + _log_state.indentation -= num + + +def get_indentation(): + return getattr(_log_state, 'indentation', 0) + + +class IndentingFormatter(logging.Formatter): + + def format(self, record): + """ + Calls the standard formatter, but will indent all of the log messages + by our current indentation level. + """ + formatted = logging.Formatter.format(self, record) + formatted = "".join([ + (" " * get_indentation()) + line + for line in formatted.splitlines(True) + ]) + return formatted + + +def _color_wrap(*colors): + def wrapped(inp): + return "".join(list(colors) + [inp, colorama.Style.RESET_ALL]) + return wrapped + + +class ColorizedStreamHandler(logging.StreamHandler): + + # Don't build up a list of colors if we don't have colorama + if colorama: + COLORS = [ + # This needs to be in order from highest logging level to lowest. 
+ (logging.ERROR, _color_wrap(colorama.Fore.RED)), + (logging.WARNING, _color_wrap(colorama.Fore.YELLOW)), + ] + else: + COLORS = [] + + def __init__(self, stream=None, no_color=None): + logging.StreamHandler.__init__(self, stream) + self._no_color = no_color + + if WINDOWS and colorama: + self.stream = colorama.AnsiToWin32(self.stream) + + def should_color(self): + # Don't colorize things if we do not have colorama or if told not to + if not colorama or self._no_color: + return False + + real_stream = ( + self.stream if not isinstance(self.stream, colorama.AnsiToWin32) + else self.stream.wrapped + ) + + # If the stream is a tty we should color it + if hasattr(real_stream, "isatty") and real_stream.isatty(): + return True + + # If we have an ANSI term we should color it + if os.environ.get("TERM") == "ANSI": + return True + + # If anything else we should not color it + return False + + def format(self, record): + msg = logging.StreamHandler.format(self, record) + + if self.should_color(): + for level, color in self.COLORS: + if record.levelno >= level: + msg = color(msg) + break + + return msg + + +class BetterRotatingFileHandler(logging.handlers.RotatingFileHandler): + + def _open(self): + ensure_dir(os.path.dirname(self.baseFilename)) + return logging.handlers.RotatingFileHandler._open(self) + + +class MaxLevelFilter(logging.Filter): + + def __init__(self, level): + self.level = level + + def filter(self, record): + return record.levelno < self.level + + +def setup_logging(verbosity, no_color, user_log_file): + """Configures and sets up all of the logging + """ + + # Determine the level to be logging at. + if verbosity >= 1: + level = "DEBUG" + elif verbosity == -1: + level = "WARNING" + elif verbosity == -2: + level = "ERROR" + elif verbosity <= -3: + level = "CRITICAL" + else: + level = "INFO" + + # The "root" logger should match the "console" level *unless* we also need + # to log to a user log file. 
+ include_user_log = user_log_file is not None + if include_user_log: + additional_log_file = user_log_file + root_level = "DEBUG" + else: + additional_log_file = "/dev/null" + root_level = level + + # Disable any logging besides WARNING unless we have DEBUG level logging + # enabled for vendored libraries. + vendored_log_level = "WARNING" if level in ["INFO", "ERROR"] else "DEBUG" + + # Shorthands for clarity + log_streams = { + "stdout": "ext://sys.stdout", + "stderr": "ext://sys.stderr", + } + handler_classes = { + "stream": "pip._internal.utils.logging.ColorizedStreamHandler", + "file": "pip._internal.utils.logging.BetterRotatingFileHandler", + } + + logging.config.dictConfig({ + "version": 1, + "disable_existing_loggers": False, + "filters": { + "exclude_warnings": { + "()": "pip._internal.utils.logging.MaxLevelFilter", + "level": logging.WARNING, + }, + }, + "formatters": { + "indent": { + "()": IndentingFormatter, + "format": "%(message)s", + }, + }, + "handlers": { + "console": { + "level": level, + "class": handler_classes["stream"], + "no_color": no_color, + "stream": log_streams["stdout"], + "filters": ["exclude_warnings"], + "formatter": "indent", + }, + "console_errors": { + "level": "WARNING", + "class": handler_classes["stream"], + "no_color": no_color, + "stream": log_streams["stderr"], + "formatter": "indent", + }, + "user_log": { + "level": "DEBUG", + "class": handler_classes["file"], + "filename": additional_log_file, + "delay": True, + "formatter": "indent", + }, + }, + "root": { + "level": root_level, + "handlers": ["console", "console_errors"] + ( + ["user_log"] if include_user_log else [] + ), + }, + "loggers": { + "pip._vendor": { + "level": vendored_log_level + } + }, + }) diff --git a/lib/python3.4/site-packages/pip/utils/__init__.py b/lib/python3.7/site-packages/pip/_internal/utils/misc.py similarity index 84% rename from lib/python3.4/site-packages/pip/utils/__init__.py rename to lib/python3.7/site-packages/pip/_internal/utils/misc.py 
index 0d25d91..e9e552e 100644 --- a/lib/python3.4/site-packages/pip/utils/__init__.py +++ b/lib/python3.7/site-packages/pip/_internal/utils/misc.py @@ -1,6 +1,5 @@ from __future__ import absolute_import -from collections import deque import contextlib import errno import io @@ -8,26 +7,33 @@ import locale # we have a submodule named 'logging' which would shadow this if we used the # regular name: import logging as std_logging -import re import os import posixpath +import re import shutil import stat import subprocess import sys import tarfile import zipfile +from collections import deque -from pip.exceptions import InstallationError -from pip.compat import console_to_str, expanduser, stdlib_pkgs -from pip.locations import ( - site_packages, user_site, running_under_virtualenv, virtualenv_no_global, +from pip._vendor import pkg_resources +# NOTE: retrying is not annotated in typeshed as on 2017-07-17, which is +# why we ignore the type on this import. +from pip._vendor.retrying import retry # type: ignore +from pip._vendor.six import PY2 +from pip._vendor.six.moves import input +from pip._vendor.six.moves.urllib import parse as urllib_parse + +from pip._internal.exceptions import CommandError, InstallationError +from pip._internal.locations import ( + running_under_virtualenv, site_packages, user_site, virtualenv_no_global, write_delete_marker_file, ) -from pip._vendor import pkg_resources -from pip._vendor.six.moves import input -from pip._vendor.six import PY2 -from pip._vendor.retrying import retry +from pip._internal.utils.compat import ( + WINDOWS, console_to_str, expanduser, stdlib_pkgs, +) if PY2: from io import BytesIO as StringIO @@ -40,11 +46,11 @@ __all__ = ['rmtree', 'display_path', 'backup_dir', 'is_svn_page', 'file_contents', 'split_leading_dir', 'has_leading_dir', 'normalize_path', - 'renames', 'get_terminal_size', 'get_prog', + 'renames', 'get_prog', 'unzip_file', 'untar_file', 'unpack_file', 'call_subprocess', 'captured_stdout', 'ensure_dir', 
'ARCHIVE_EXTENSIONS', 'SUPPORTED_EXTENSIONS', - 'get_installed_version'] + 'get_installed_version', 'remove_auth_from_url'] logger = std_logging.getLogger(__name__) @@ -88,8 +94,11 @@ def ensure_dir(path): def get_prog(): try: - if os.path.basename(sys.argv[0]) in ('__main__.py', '-c'): + prog = os.path.basename(sys.argv[0]) + if prog in ('__main__.py', '-c'): return "%s -m pip" % sys.executable + else: + return prog except (AttributeError, TypeError, IndexError): pass return 'pip' @@ -178,12 +187,16 @@ def format_size(bytes): def is_installable_dir(path): - """Return True if `path` is a directory containing a setup.py file.""" + """Is path a directory containing setup.py or pyproject.toml? + """ if not os.path.isdir(path): return False setup_py = os.path.join(path, 'setup.py') if os.path.isfile(setup_py): return True + pyproject_toml = os.path.join(path, 'pyproject.toml') + if os.path.isfile(pyproject_toml): + return True return False @@ -296,7 +309,7 @@ def is_local(path): if running_under_virtualenv(): return path.startswith(normalize_path(sys.prefix)) else: - from pip.locations import distutils_scheme + from pip._internal.locations import distutils_scheme if path.startswith(prefix): for local_path in distutils_scheme("").values(): if path.startswith(normalize_path(local_path)): @@ -326,7 +339,7 @@ def dist_in_usersite(dist): def dist_in_site_packages(dist): """ Return True if given Distribution is installed in - distutils.sysconfig.get_python_lib(). + sysconfig.get_python_lib(). """ return normalize_path( dist_location(dist) @@ -356,7 +369,7 @@ def get_installed_distributions(local_only=True, ``skip`` argument is an iterable of lower-case project names to ignore; defaults to stdlib_pkgs - If ``editables`` is False, don't report editables. + If ``include_editables`` is False, don't report editables. If ``editables_only`` is True , only report editables.
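[editor's note: the `is_installable_dir` change in the hunk above extends the check from setup.py alone to setup.py or pyproject.toml. A minimal standalone sketch of the same check, runnable outside pip:]

```python
import os
import tempfile


def is_installable_dir(path):
    """Return True if path is a directory containing setup.py or pyproject.toml."""
    if not os.path.isdir(path):
        return False
    return any(
        os.path.isfile(os.path.join(path, marker))
        for marker in ("setup.py", "pyproject.toml")
    )


# Quick demonstration with a throwaway directory:
with tempfile.TemporaryDirectory() as tmp:
    assert not is_installable_dir(tmp)      # empty dir: nothing to build from
    open(os.path.join(tmp, "pyproject.toml"), "w").close()
    assert is_installable_dir(tmp)          # pyproject.toml alone now suffices
```

Accepting pyproject.toml without setup.py is what lets PEP 517/518-style source trees install at all.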
@@ -450,36 +463,6 @@ def dist_location(dist): return dist.location -def get_terminal_size(): - """Returns a tuple (x, y) representing the width(x) and the height(x) - in characters of the terminal window.""" - def ioctl_GWINSZ(fd): - try: - import fcntl - import termios - import struct - cr = struct.unpack( - 'hh', - fcntl.ioctl(fd, termios.TIOCGWINSZ, '1234') - ) - except: - return None - if cr == (0, 0): - return None - return cr - cr = ioctl_GWINSZ(0) or ioctl_GWINSZ(1) or ioctl_GWINSZ(2) - if not cr: - try: - fd = os.open(os.ctermid(), os.O_RDONLY) - cr = ioctl_GWINSZ(fd) - os.close(fd) - except: - pass - if not cr: - cr = (os.environ.get('LINES', 25), os.environ.get('COLUMNS', 80)) - return int(cr[1]), int(cr[0]) - - def current_umask(): """Get the current umask which involves having to set it temporarily.""" mask = os.umask(0) @@ -624,7 +607,7 @@ def unpack_file(filename, location, content_type, link): elif (content_type and content_type.startswith('text/html') and is_svn_page(file_contents(filename))): # We don't really care about this - from pip.vcs.subversion import Subversion + from pip._internal.vcs.subversion import Subversion Subversion('svn+' + link.url).unpack(location) else: # FIXME: handle? @@ -642,7 +625,14 @@ def unpack_file(filename, location, content_type, link): def call_subprocess(cmd, show_stdout=True, cwd=None, on_returncode='raise', command_desc=None, - extra_environ=None, spinner=None): + extra_environ=None, unset_environ=None, spinner=None): + """ + Args: + unset_environ: an iterable of environment variable names to unset + prior to calling subprocess.Popen(). + """ + if unset_environ is None: + unset_environ = [] # This function's handling of subprocess output is confusing and I # previously broke it terribly, so as penance I will write a long comment # explaining things. 
@@ -679,17 +669,21 @@ def call_subprocess(cmd, show_stdout=True, cwd=None, env = os.environ.copy() if extra_environ: env.update(extra_environ) + for name in unset_environ: + env.pop(name, None) try: proc = subprocess.Popen( - cmd, stderr=subprocess.STDOUT, stdin=None, stdout=stdout, - cwd=cwd, env=env) + cmd, stderr=subprocess.STDOUT, stdin=subprocess.PIPE, + stdout=stdout, cwd=cwd, env=env, + ) + proc.stdin.close() except Exception as exc: logger.critical( "Error %s while executing command %s", exc, command_desc, ) raise + all_output = [] if stdout is not None: - all_output = [] while True: line = console_to_str(proc.stdout.readline()) if not line: @@ -703,7 +697,11 @@ def call_subprocess(cmd, show_stdout=True, cwd=None, # Update the spinner if spinner is not None: spinner.spin() - proc.wait() + try: + proc.wait() + finally: + if proc.stdout: + proc.stdout.close() if spinner is not None: if proc.returncode: spinner.finish("error") @@ -845,17 +843,15 @@ class cached_property(object): return value -def get_installed_version(dist_name, lookup_dirs=None): +def get_installed_version(dist_name, working_set=None): """Get the installed version of dist_name avoiding pkg_resources cache""" # Create a requirement that we'll look for inside of setuptools. req = pkg_resources.Requirement.parse(dist_name) - # We want to avoid having this cached, so we need to construct a new - # working set each time. - if lookup_dirs is None: + if working_set is None: + # We want to avoid having this cached, so we need to construct a new + # working set each time. 
working_set = pkg_resources.WorkingSet() - else: - working_set = pkg_resources.WorkingSet(lookup_dirs) # Get the installed distribution from our working set dist = working_set.find(req) @@ -868,3 +864,95 @@ def get_installed_version(dist_name, lookup_dirs=None): def consume(iterator): """Consume an iterable at C speed.""" deque(iterator, maxlen=0) + + +# Simulates an enum +def enum(*sequential, **named): + enums = dict(zip(sequential, range(len(sequential))), **named) + reverse = {value: key for key, value in enums.items()} + enums['reverse_mapping'] = reverse + return type('Enum', (), enums) + + +def make_vcs_requirement_url(repo_url, rev, egg_project_name, subdir=None): + """ + Return the URL for a VCS requirement. + + Args: + repo_url: the remote VCS url, with any needed VCS prefix (e.g. "git+"). + """ + req = '{}@{}#egg={}'.format(repo_url, rev, egg_project_name) + if subdir: + req += '&subdirectory={}'.format(subdir) + + return req + + +def split_auth_from_netloc(netloc): + """ + Parse out and remove the auth information from a netloc. + + Returns: (netloc, (username, password)). + """ + if '@' not in netloc: + return netloc, (None, None) + + # Split from the right because that's how urllib.parse.urlsplit() + # behaves if more than one @ is present (which can be checked using + # the password attribute of urlsplit()'s return value). + auth, netloc = netloc.rsplit('@', 1) + if ':' in auth: + # Split from the left because that's how urllib.parse.urlsplit() + # behaves if more than one : is present (which again can be checked + # using the password attribute of the return value) + user_pass = tuple(auth.split(':', 1)) + else: + user_pass = auth, None + + return netloc, user_pass + + +def remove_auth_from_url(url): + # Return a copy of url with 'username:password@' removed. + # username/pass params are passed to subversion through flags + # and are not recognized in the url. 
+ + # parsed url + purl = urllib_parse.urlsplit(url) + netloc, user_pass = split_auth_from_netloc(purl.netloc) + + # stripped url + url_pieces = ( + purl.scheme, netloc, purl.path, purl.query, purl.fragment + ) + surl = urllib_parse.urlunsplit(url_pieces) + return surl + + +def protect_pip_from_modification_on_windows(modifying_pip): + """Protection of pip.exe from modification on Windows + + On Windows, any operation modifying pip should be run as: + python -m pip ... + """ + pip_names = [ + "pip.exe", + "pip{}.exe".format(sys.version_info[0]), + "pip{}.{}.exe".format(*sys.version_info[:2]) + ] + + # See https://github.com/pypa/pip/issues/1299 for more discussion + should_show_use_python_msg = ( + modifying_pip and + WINDOWS and + os.path.basename(sys.argv[0]) in pip_names + ) + + if should_show_use_python_msg: + new_command = [ + sys.executable, "-m", "pip" + ] + sys.argv[1:] + raise CommandError( + 'To modify pip, please run the following command:\n{}' + .format(" ".join(new_command)) + ) diff --git a/lib/python3.7/site-packages/pip/_internal/utils/models.py b/lib/python3.7/site-packages/pip/_internal/utils/models.py new file mode 100644 index 0000000..d5cb80a --- /dev/null +++ b/lib/python3.7/site-packages/pip/_internal/utils/models.py @@ -0,0 +1,40 @@ +"""Utilities for defining models +""" + +import operator + + +class KeyBasedCompareMixin(object): + """Provides comparision capabilities that is based on a key + """ + + def __init__(self, key, defining_class): + self._compare_key = key + self._defining_class = defining_class + + def __hash__(self): + return hash(self._compare_key) + + def __lt__(self, other): + return self._compare(other, operator.__lt__) + + def __le__(self, other): + return self._compare(other, operator.__le__) + + def __gt__(self, other): + return self._compare(other, operator.__gt__) + + def __ge__(self, other): + return self._compare(other, operator.__ge__) + + def __eq__(self, other): + return self._compare(other, operator.__eq__) + + def 
__ne__(self, other): + return self._compare(other, operator.__ne__) + + def _compare(self, other, method): + if not isinstance(other, self._defining_class): + return NotImplemented + + return method(self._compare_key, other._compare_key) diff --git a/lib/python3.4/site-packages/pip/utils/outdated.py b/lib/python3.7/site-packages/pip/_internal/utils/outdated.py similarity index 56% rename from lib/python3.4/site-packages/pip/utils/outdated.py rename to lib/python3.7/site-packages/pip/_internal/utils/outdated.py index 2164cc3..5bfbfe1 100644 --- a/lib/python3.4/site-packages/pip/utils/outdated.py +++ b/lib/python3.7/site-packages/pip/_internal/utils/outdated.py @@ -6,15 +6,13 @@ import logging import os.path import sys -from pip._vendor import lockfile +from pip._vendor import lockfile, pkg_resources from pip._vendor.packaging import version as packaging_version -from pip.compat import total_seconds, WINDOWS -from pip.models import PyPI -from pip.locations import USER_CACHE_DIR, running_under_virtualenv -from pip.utils import ensure_dir, get_installed_version -from pip.utils.filesystem import check_path_owner - +from pip._internal.index import PackageFinder +from pip._internal.utils.compat import WINDOWS +from pip._internal.utils.filesystem import check_path_owner +from pip._internal.utils.misc import ensure_dir, get_installed_version SELFCHECK_DATE_FMT = "%Y-%m-%dT%H:%M:%SZ" @@ -22,43 +20,27 @@ SELFCHECK_DATE_FMT = "%Y-%m-%dT%H:%M:%SZ" logger = logging.getLogger(__name__) -class VirtualenvSelfCheckState(object): - def __init__(self): - self.statefile_path = os.path.join(sys.prefix, "pip-selfcheck.json") +class SelfCheckState(object): + def __init__(self, cache_dir): + self.state = {} + self.statefile_path = None - # Load the existing state - try: - with open(self.statefile_path) as statefile: - self.state = json.load(statefile) - except (IOError, ValueError): - self.state = {} + # Try to load the existing state + if cache_dir: + self.statefile_path = 
os.path.join(cache_dir, "selfcheck.json") + try: + with open(self.statefile_path) as statefile: + self.state = json.load(statefile)[sys.prefix] + except (IOError, ValueError, KeyError): + # Explicitly suppressing exceptions, since we don't want to + # error out if the cache file is invalid. + pass def save(self, pypi_version, current_time): - # Attempt to write out our version check file - with open(self.statefile_path, "w") as statefile: - json.dump( - { - "last_check": current_time.strftime(SELFCHECK_DATE_FMT), - "pypi_version": pypi_version, - }, - statefile, - sort_keys=True, - separators=(",", ":") - ) + # If we do not have a path to cache in, don't bother saving. + if not self.statefile_path: + return - -class GlobalSelfCheckState(object): - def __init__(self): - self.statefile_path = os.path.join(USER_CACHE_DIR, "selfcheck.json") - - # Load the existing state - try: - with open(self.statefile_path) as statefile: - self.state = json.load(statefile)[sys.prefix] - except (IOError, ValueError, KeyError): - self.state = {} - - def save(self, pypi_version, current_time): # Check to make sure that we own the directory if not check_path_owner(os.path.dirname(self.statefile_path)): return @@ -85,14 +67,21 @@ class GlobalSelfCheckState(object): separators=(",", ":")) -def load_selfcheck_statefile(): - if running_under_virtualenv(): - return VirtualenvSelfCheckState() - else: - return GlobalSelfCheckState() +def was_installed_by_pip(pkg): + """Checks whether pkg was installed by pip + + This is used not to display the upgrade message when pip is in fact + installed by system package manager, such as dnf on Fedora. + """ + try: + dist = pkg_resources.get_distribution(pkg) + return (dist.has_metadata('INSTALLER') and + 'pip' in dist.get_metadata_lines('INSTALLER')) + except pkg_resources.DistributionNotFound: + return False -def pip_version_check(session): +def pip_version_check(session, options): """Check for an update for pip. 
Limit the frequency of checks to once per week. State is stored either in @@ -100,14 +89,14 @@ of the pip script path. """ installed_version = get_installed_version("pip") - if installed_version is None: + if not installed_version: return pip_version = packaging_version.parse(installed_version) pypi_version = None try: - state = load_selfcheck_statefile() + state = SelfCheckState(cache_dir=options.cache_dir) current_time = datetime.datetime.utcnow() # Determine if we need to refresh the state @@ -116,23 +105,26 @@ state.state["last_check"], SELFCHECK_DATE_FMT ) - if total_seconds(current_time - last_check) < 7 * 24 * 60 * 60: + if (current_time - last_check).total_seconds() < 7 * 24 * 60 * 60: pypi_version = state.state["pypi_version"] # Refresh the version if we need to or just see if we need to warn if pypi_version is None: - resp = session.get( - PyPI.pip_json_url, - headers={"Accept": "application/json"}, + # Let's use PackageFinder to see what the latest pip version is + finder = PackageFinder( + find_links=options.find_links, + index_urls=[options.index_url] + options.extra_index_urls, + allow_all_prereleases=False, # Explicitly set to False + trusted_hosts=options.trusted_hosts, + process_dependency_links=options.process_dependency_links, + session=session, + ) + all_candidates = finder.find_all_candidates("pip") + if not all_candidates: + return + pypi_version = str( + max(all_candidates, key=lambda c: c.version).version ) - resp.raise_for_status() - pypi_version = [ - v for v in sorted( - list(resp.json()["releases"]), - key=packaging_version.parse, - ) - if not packaging_version.parse(v).is_prerelease - ][-1] # save that we've performed a check state.save(pypi_version, current_time) @@ -141,7 +133,8 @@ # Determine if our pypi_version is older if (pip_version < remote_version and - pip_version.base_version != remote_version.base_version): + 
pip_version.base_version != remote_version.base_version and + was_installed_by_pip('pip')): # Advise "python -m pip" on Windows to avoid issues # with overwriting pip.exe. if WINDOWS: @@ -154,7 +147,6 @@ def pip_version_check(session): "'%s install --upgrade pip' command.", pip_version, pypi_version, pip_cmd ) - except Exception: logger.debug( "There was an error checking the latest version of pip", diff --git a/lib/python3.4/site-packages/pip/utils/packaging.py b/lib/python3.7/site-packages/pip/_internal/utils/packaging.py similarity index 69% rename from lib/python3.4/site-packages/pip/utils/packaging.py rename to lib/python3.7/site-packages/pip/_internal/utils/packaging.py index e93b20d..c43142f 100644 --- a/lib/python3.4/site-packages/pip/utils/packaging.py +++ b/lib/python3.7/site-packages/pip/_internal/utils/packaging.py @@ -1,15 +1,14 @@ from __future__ import absolute_import -from email.parser import FeedParser - import logging import sys +from email.parser import FeedParser # type: ignore -from pip._vendor.packaging import specifiers -from pip._vendor.packaging import version from pip._vendor import pkg_resources +from pip._vendor.packaging import specifiers, version -from pip import exceptions +from pip._internal import exceptions +from pip._internal.utils.misc import display_path logger = logging.getLogger(__name__) @@ -37,16 +36,20 @@ def check_requires_python(requires_python): def get_metadata(dist): if (isinstance(dist, pkg_resources.DistInfoDistribution) and dist.has_metadata('METADATA')): - return dist.get_metadata('METADATA') + metadata = dist.get_metadata('METADATA') elif dist.has_metadata('PKG-INFO'): - return dist.get_metadata('PKG-INFO') + metadata = dist.get_metadata('PKG-INFO') + else: + logger.warning("No metadata found in %s", display_path(dist.location)) + metadata = '' + + feed_parser = FeedParser() + feed_parser.feed(metadata) + return feed_parser.close() def check_dist_requires_python(dist): - metadata = get_metadata(dist) - feed_parser 
= FeedParser() - feed_parser.feed(metadata) - pkg_info_dict = feed_parser.close() + pkg_info_dict = get_metadata(dist) requires_python = pkg_info_dict.get('Requires-Python') try: if not check_requires_python(requires_python): @@ -58,6 +61,15 @@ def check_dist_requires_python(dist): ) except specifiers.InvalidSpecifier as e: logger.warning( - "Package %s has an invalid Requires-Python entry %s - %s" % ( - dist.project_name, requires_python, e)) + "Package %s has an invalid Requires-Python entry %s - %s", + dist.project_name, requires_python, e, + ) return + + +def get_installer(dist): + if dist.has_metadata('INSTALLER'): + for line in dist.get_metadata_lines('INSTALLER'): + if line.strip(): + return line.strip() + return '' diff --git a/lib/python3.4/site-packages/pip/utils/setuptools_build.py b/lib/python3.7/site-packages/pip/_internal/utils/setuptools_build.py similarity index 100% rename from lib/python3.4/site-packages/pip/utils/setuptools_build.py rename to lib/python3.7/site-packages/pip/_internal/utils/setuptools_build.py diff --git a/lib/python3.7/site-packages/pip/_internal/utils/temp_dir.py b/lib/python3.7/site-packages/pip/_internal/utils/temp_dir.py new file mode 100644 index 0000000..edc506b --- /dev/null +++ b/lib/python3.7/site-packages/pip/_internal/utils/temp_dir.py @@ -0,0 +1,82 @@ +from __future__ import absolute_import + +import logging +import os.path +import tempfile + +from pip._internal.utils.misc import rmtree + +logger = logging.getLogger(__name__) + + +class TempDirectory(object): + """Helper class that owns and cleans up a temporary directory. + + This class can be used as a context manager or as an OO representation of a + temporary directory. + + Attributes: + path + Location to the created temporary directory or None + delete + Whether the directory should be deleted when exiting + (when used as a contextmanager) + + Methods: + create() + Creates a temporary directory and stores its path in the path + attribute. 
+ cleanup() + Deletes the temporary directory and sets path attribute to None + + When used as a context manager, a temporary directory is created on + entering the context and, if the delete attribute is True, on exiting the + context the created directory is deleted. + """ + + def __init__(self, path=None, delete=None, kind="temp"): + super(TempDirectory, self).__init__() + + if path is None and delete is None: + # If we were not given an explicit directory, and we were not given + # an explicit delete option, then we'll default to deleting. + delete = True + + self.path = path + self.delete = delete + self.kind = kind + + def __repr__(self): + return "<{} {!r}>".format(self.__class__.__name__, self.path) + + def __enter__(self): + self.create() + return self + + def __exit__(self, exc, value, tb): + if self.delete: + self.cleanup() + + def create(self): + """Create a temporary directory and store its path in self.path + """ + if self.path is not None: + logger.debug( + "Skipped creation of temporary directory: {}".format(self.path) + ) + return + # We realpath here because some systems have their default tmpdir + # symlinked to another directory. This tends to confuse build + # scripts, so we canonicalize the path by traversing potential + # symlinks here. + self.path = os.path.realpath( + tempfile.mkdtemp(prefix="pip-{}-".format(self.kind)) + ) + logger.debug("Created temporary directory: {}".format(self.path)) + + def cleanup(self): + """Remove the temporary directory created and reset state + """ + if self.path is not None and os.path.exists(self.path): + rmtree(self.path) + self.path = None diff --git a/lib/python3.7/site-packages/pip/_internal/utils/typing.py b/lib/python3.7/site-packages/pip/_internal/utils/typing.py new file mode 100644 index 0000000..e085cdf --- /dev/null +++ b/lib/python3.7/site-packages/pip/_internal/utils/typing.py @@ -0,0 +1,29 @@ +"""For neatly implementing static typing in pip. 
+ +`mypy` - the static type analysis tool we use - uses the `typing` module, which +provides core functionality fundamental to mypy's functioning. + +Generally, `typing` would be imported at runtime and used in that fashion - +it acts as a no-op at runtime and does not have any run-time overhead by +design. + +As it turns out, `typing` is not vendorable - it uses separate sources for +Python 2/Python 3. Thus, this codebase can not expect it to be present. +To work around this, mypy allows the typing import to be behind a False-y +optional to prevent it from running at runtime and type-comments can be used +to remove the need for the types to be accessible directly during runtime. + +This module provides the False-y guard in a nicely named fashion so that a +curious maintainer can reach here to read this. + +In pip, all static-typing related imports should be guarded as follows: + + from pip._internal.utils.typing import MYPY_CHECK_RUNNING + + if MYPY_CHECK_RUNNING: + from typing import ... 
# noqa: F401 + +Ref: https://github.com/python/mypy/issues/3216 +""" + +MYPY_CHECK_RUNNING = False diff --git a/lib/python3.4/site-packages/pip/utils/ui.py b/lib/python3.7/site-packages/pip/_internal/utils/ui.py similarity index 80% rename from lib/python3.4/site-packages/pip/utils/ui.py rename to lib/python3.7/site-packages/pip/_internal/utils/ui.py index bba73e3..6bab904 100644 --- a/lib/python3.4/site-packages/pip/utils/ui.py +++ b/lib/python3.7/site-packages/pip/_internal/utils/ui.py @@ -1,22 +1,28 @@ -from __future__ import absolute_import -from __future__ import division +from __future__ import absolute_import, division -import itertools -import sys -from signal import signal, SIGINT, default_int_handler -import time import contextlib +import itertools import logging +import sys +import time +from signal import SIGINT, default_int_handler, signal -from pip.compat import WINDOWS -from pip.utils import format_size -from pip.utils.logging import get_indentation from pip._vendor import six -from pip._vendor.progress.bar import Bar, IncrementalBar -from pip._vendor.progress.helpers import (WritelnMixin, - HIDE_CURSOR, SHOW_CURSOR) +from pip._vendor.progress.bar import ( + Bar, ChargingBar, FillingCirclesBar, FillingSquaresBar, IncrementalBar, + ShadyBar, +) +from pip._vendor.progress.helpers import HIDE_CURSOR, SHOW_CURSOR, WritelnMixin from pip._vendor.progress.spinner import Spinner +from pip._internal.utils.compat import WINDOWS +from pip._internal.utils.logging import get_indentation +from pip._internal.utils.misc import format_size +from pip._internal.utils.typing import MYPY_CHECK_RUNNING + +if MYPY_CHECK_RUNNING: + from typing import Any # noqa: F401 + try: from pip._vendor import colorama # Lots of different errors can come from this, including SystemError and @@ -54,7 +60,7 @@ def _select_progress_class(preferred, fallback): return preferred -_BaseBar = _select_progress_class(IncrementalBar, Bar) +_BaseBar = _select_progress_class(IncrementalBar, Bar) # 
type: Any class InterruptibleMixin(object): @@ -112,6 +118,20 @@ class InterruptibleMixin(object): self.original_handler(signum, frame) +class SilentBar(Bar): + + def update(self): + pass + + +class BlueEmojiBar(IncrementalBar): + + suffix = "%(percent)d%%" + bar_prefix = " " + bar_suffix = " " + phases = (u"\U0001F539", u"\U0001F537", u"\U0001F535") # type: Any + + class DownloadProgressMixin(object): def __init__(self, *args, **kwargs): @@ -171,13 +191,54 @@ class WindowsMixin(object): self.file.flush = lambda: self.file.wrapped.flush() -class DownloadProgressBar(WindowsMixin, InterruptibleMixin, - DownloadProgressMixin, _BaseBar): +class BaseDownloadProgressBar(WindowsMixin, InterruptibleMixin, + DownloadProgressMixin): file = sys.stdout message = "%(percent)d%%" suffix = "%(downloaded)s %(download_speed)s %(pretty_eta)s" +# NOTE: The "type: ignore" comments on the following classes are there to +# work around https://github.com/python/typing/issues/241 + + +class DefaultDownloadProgressBar(BaseDownloadProgressBar, + _BaseBar): # type: ignore + pass + + +class DownloadSilentBar(BaseDownloadProgressBar, SilentBar): # type: ignore + pass + + +class DownloadIncrementalBar(BaseDownloadProgressBar, # type: ignore + IncrementalBar): + pass + + +class DownloadChargingBar(BaseDownloadProgressBar, # type: ignore + ChargingBar): + pass + + +class DownloadShadyBar(BaseDownloadProgressBar, ShadyBar): # type: ignore + pass + + +class DownloadFillingSquaresBar(BaseDownloadProgressBar, # type: ignore + FillingSquaresBar): + pass + + +class DownloadFillingCirclesBar(BaseDownloadProgressBar, # type: ignore + FillingCirclesBar): + pass + + +class DownloadBlueEmojiProgressBar(BaseDownloadProgressBar, # type: ignore + BlueEmojiBar): + pass + class DownloadProgressSpinner(WindowsMixin, InterruptibleMixin, DownloadProgressMixin, WritelnMixin, Spinner): @@ -205,6 +266,22 @@ class DownloadProgressSpinner(WindowsMixin, InterruptibleMixin, self.writeln(line) +BAR_TYPES = { + "off": 
(DownloadSilentBar, DownloadSilentBar), + "on": (DefaultDownloadProgressBar, DownloadProgressSpinner), + "ascii": (DownloadIncrementalBar, DownloadProgressSpinner), + "pretty": (DownloadFillingCirclesBar, DownloadProgressSpinner), + "emoji": (DownloadBlueEmojiProgressBar, DownloadProgressSpinner) +} + + +def DownloadProgressProvider(progress_bar, max=None): + if max is None or max == 0: + return BAR_TYPES[progress_bar][1]().iter + else: + return BAR_TYPES[progress_bar][0](max=max).iter + + ################################################################ # Generic "something is happening" spinners # diff --git a/lib/python3.7/site-packages/pip/_internal/vcs/__init__.py b/lib/python3.7/site-packages/pip/_internal/vcs/__init__.py new file mode 100644 index 0000000..794b35d --- /dev/null +++ b/lib/python3.7/site-packages/pip/_internal/vcs/__init__.py @@ -0,0 +1,509 @@ +"""Handles all VCS (version control) support""" +from __future__ import absolute_import + +import errno +import logging +import os +import shutil +import sys + +from pip._vendor.six.moves.urllib import parse as urllib_parse + +from pip._internal.exceptions import BadCommand +from pip._internal.utils.misc import ( + display_path, backup_dir, call_subprocess, rmtree, ask_path_exists, +) +from pip._internal.utils.typing import MYPY_CHECK_RUNNING + +if MYPY_CHECK_RUNNING: + from typing import Dict, Optional, Tuple # noqa: F401 + from pip._internal.cli.base_command import Command # noqa: F401 + +__all__ = ['vcs', 'get_src_requirement'] + + +logger = logging.getLogger(__name__) + + +class RevOptions(object): + + """ + Encapsulates a VCS-specific revision to install, along with any VCS + install options. + + Instances of this class should be treated as if immutable. + """ + + def __init__(self, vcs, rev=None, extra_args=None): + """ + Args: + vcs: a VersionControl object. + rev: the name of the revision to install. + extra_args: a list of extra options. 
+ """ + if extra_args is None: + extra_args = [] + + self.extra_args = extra_args + self.rev = rev + self.vcs = vcs + + def __repr__(self): + return '<RevOptions {}: rev={!r}>'.format(self.vcs.name, self.rev) + + @property + def arg_rev(self): + if self.rev is None: + return self.vcs.default_arg_rev + + return self.rev + + def to_args(self): + """ + Return the VCS-specific command arguments. + """ + args = [] + rev = self.arg_rev + if rev is not None: + args += self.vcs.get_base_rev_args(rev) + args += self.extra_args + + return args + + def to_display(self): + if not self.rev: + return '' + + return ' (to revision {})'.format(self.rev) + + def make_new(self, rev): + """ + Make a copy of the current instance, but with a new rev. + + Args: + rev: the name of the revision for the new object. + """ + return self.vcs.make_rev_options(rev, extra_args=self.extra_args) + + +class VcsSupport(object): + _registry = {}  # type: Dict[str, Command] + schemes = ['ssh', 'git', 'hg', 'bzr', 'sftp', 'svn'] + + def __init__(self): + # Register more schemes with urlparse for various version control + # systems + urllib_parse.uses_netloc.extend(self.schemes) + # Python >= 2.7.4, 3.3 doesn't have uses_fragment + if getattr(urllib_parse, 'uses_fragment', None): + urllib_parse.uses_fragment.extend(self.schemes) + super(VcsSupport, self).__init__() + + def __iter__(self): + return self._registry.__iter__() + + @property + def backends(self): + return list(self._registry.values()) + + @property + def dirnames(self): + return [backend.dirname for backend in self.backends] + + @property + def all_schemes(self): + schemes = [] + for backend in self.backends: + schemes.extend(backend.schemes) + return schemes + + def register(self, cls): + if not hasattr(cls, 'name'): + logger.warning('Cannot register VCS %s', cls.__name__) + return + if cls.name not in self._registry: + self._registry[cls.name] = cls + logger.debug('Registered VCS backend: %s', cls.name) + + def unregister(self, cls=None, name=None): + if name in 
self._registry: + del self._registry[name] + elif cls in self._registry.values(): + del self._registry[cls.name] + else: + logger.warning('Cannot unregister because no class or name given') + + def get_backend_name(self, location): + """ + Return the name of the version control backend if found at given + location, e.g. vcs.get_backend_name('/path/to/vcs/checkout') + """ + for vc_type in self._registry.values(): + if vc_type.controls_location(location): + logger.debug('Determine that %s uses VCS: %s', + location, vc_type.name) + return vc_type.name + return None + + def get_backend(self, name): + name = name.lower() + if name in self._registry: + return self._registry[name] + + def get_backend_from_location(self, location): + vc_type = self.get_backend_name(location) + if vc_type: + return self.get_backend(vc_type) + return None + + +vcs = VcsSupport() + + +class VersionControl(object): + name = '' + dirname = '' + # List of supported schemes for this Version Control + schemes = () # type: Tuple[str, ...] + # Iterable of environment variable names to pass to call_subprocess(). + unset_environ = () # type: Tuple[str, ...] + default_arg_rev = None # type: Optional[str] + + def __init__(self, url=None, *args, **kwargs): + self.url = url + super(VersionControl, self).__init__(*args, **kwargs) + + def get_base_rev_args(self, rev): + """ + Return the base revision arguments for a vcs command. + + Args: + rev: the name of a revision to install. Cannot be None. + """ + raise NotImplementedError + + def make_rev_options(self, rev=None, extra_args=None): + """ + Return a RevOptions object. + + Args: + rev: the name of a revision to install. + extra_args: a list of extra options. 
+ """ + return RevOptions(self, rev, extra_args=extra_args) + + def _is_local_repository(self, repo): + """ + posix absolute paths start with os.path.sep, + win32 ones start with drive (like c:\\folder) + """ + drive, tail = os.path.splitdrive(repo) + return repo.startswith(os.path.sep) or drive + + def export(self, location): + """ + Export the repository at the url to the destination location + i.e. only download the files, without vcs information + """ + raise NotImplementedError + + def get_netloc_and_auth(self, netloc, scheme): + """ + Parse the repository URL's netloc, and return the new netloc to use + along with auth information. + + Args: + netloc: the original repository URL netloc. + scheme: the repository URL's scheme without the vcs prefix. + + This is mainly for the Subversion class to override, so that auth + information can be provided via the --username and --password options + instead of through the URL. For other subclasses like Git without + such an option, auth information must stay in the URL. + + Returns: (netloc, (username, password)). + """ + return netloc, (None, None) + + def get_url_rev_and_auth(self, url): + """ + Parse the repository URL to use, and return the URL, revision, + and auth info to use. + + Returns: (url, rev, (username, password)). + """ + scheme, netloc, path, query, frag = urllib_parse.urlsplit(url) + if '+' not in scheme: + raise ValueError( + "Sorry, {!r} is a malformed VCS url. " + "The format is <vcs>+<protocol>://<url>, " + "e.g. svn+http://myrepo/svn/MyApp#egg=MyApp".format(url) + ) + # Remove the vcs prefix. + scheme = scheme.split('+', 1)[1] + netloc, user_pass = self.get_netloc_and_auth(netloc, scheme) + rev = None + if '@' in path: + path, rev = path.rsplit('@', 1) + url = urllib_parse.urlunsplit((scheme, netloc, path, query, '')) + return url, rev, user_pass + + def make_rev_args(self, username, password): + """ + Return the RevOptions "extra arguments" to use in obtain(). 
+ """ + return [] + + def get_url_rev_options(self, url): + """ + Return the URL and RevOptions object to use in obtain() and in + some cases export(), as a tuple (url, rev_options). + """ + url, rev, user_pass = self.get_url_rev_and_auth(url) + username, password = user_pass + extra_args = self.make_rev_args(username, password) + rev_options = self.make_rev_options(rev, extra_args=extra_args) + + return url, rev_options + + def normalize_url(self, url): + """ + Normalize a URL for comparison by unquoting it and removing any + trailing slash. + """ + return urllib_parse.unquote(url).rstrip('/') + + def compare_urls(self, url1, url2): + """ + Compare two repo URLs for identity, ignoring incidental differences. + """ + return (self.normalize_url(url1) == self.normalize_url(url2)) + + def fetch_new(self, dest, url, rev_options): + """ + Fetch a revision from a repository, in the case that this is the + first fetch from the repository. + + Args: + dest: the directory to fetch the repository to. + rev_options: a RevOptions object. + """ + raise NotImplementedError + + def switch(self, dest, url, rev_options): + """ + Switch the repo at ``dest`` to point to ``URL``. + + Args: + rev_options: a RevOptions object. + """ + raise NotImplementedError + + def update(self, dest, url, rev_options): + """ + Update an already-existing repo to the given ``rev_options``. + + Args: + rev_options: a RevOptions object. + """ + raise NotImplementedError + + def is_commit_id_equal(self, dest, name): + """ + Return whether the id of the current commit equals the given name. + + Args: + dest: the repository directory. + name: a string name. + """ + raise NotImplementedError + + def obtain(self, dest): + """ + Install or update in editable mode the package represented by this + VersionControl object. + + Args: + dest: the repository directory in which to install or update. 
+ """ + url, rev_options = self.get_url_rev_options(self.url) + + if not os.path.exists(dest): + self.fetch_new(dest, url, rev_options) + return + + rev_display = rev_options.to_display() + if self.is_repository_directory(dest): + existing_url = self.get_url(dest) + if self.compare_urls(existing_url, url): + logger.debug( + '%s in %s exists, and has correct URL (%s)', + self.repo_name.title(), + display_path(dest), + url, + ) + if not self.is_commit_id_equal(dest, rev_options.rev): + logger.info( + 'Updating %s %s%s', + display_path(dest), + self.repo_name, + rev_display, + ) + self.update(dest, url, rev_options) + else: + logger.info('Skipping because already up-to-date.') + return + + logger.warning( + '%s %s in %s exists with URL %s', + self.name, + self.repo_name, + display_path(dest), + existing_url, + ) + prompt = ('(s)witch, (i)gnore, (w)ipe, (b)ackup ', + ('s', 'i', 'w', 'b')) + else: + logger.warning( + 'Directory %s already exists, and is not a %s %s.', + dest, + self.name, + self.repo_name, + ) + prompt = ('(i)gnore, (w)ipe, (b)ackup ', ('i', 'w', 'b')) + + logger.warning( + 'The plan is to install the %s repository %s', + self.name, + url, + ) + response = ask_path_exists('What to do? %s' % prompt[0], prompt[1]) + + if response == 'a': + sys.exit(-1) + + if response == 'w': + logger.warning('Deleting %s', display_path(dest)) + rmtree(dest) + self.fetch_new(dest, url, rev_options) + return + + if response == 'b': + dest_dir = backup_dir(dest) + logger.warning( + 'Backing up %s to %s', display_path(dest), dest_dir, + ) + shutil.move(dest, dest_dir) + self.fetch_new(dest, url, rev_options) + return + + # Do nothing if the response is "i". 
+ if response == 's': + logger.info( + 'Switching %s %s to %s%s', + self.repo_name, + display_path(dest), + url, + rev_display, + ) + self.switch(dest, url, rev_options) + + def unpack(self, location): + """ + Clean up current location and download the url repository + (and vcs infos) into location + """ + if os.path.exists(location): + rmtree(location) + self.obtain(location) + + def get_src_requirement(self, dist, location): + """ + Return a string representing the requirement needed to + redownload the files currently present in location, something + like: + {repository_url}@{revision}#egg={project_name}-{version_identifier} + """ + raise NotImplementedError + + def get_url(self, location): + """ + Return the url used at location + """ + raise NotImplementedError + + def get_revision(self, location): + """ + Return the current commit id of the files at the given location. + """ + raise NotImplementedError + + def run_command(self, cmd, show_stdout=True, cwd=None, + on_returncode='raise', + command_desc=None, + extra_environ=None, spinner=None): + """ + Run a VCS subcommand + This is simply a wrapper around call_subprocess that adds the VCS + command name, and checks that the VCS is available + """ + cmd = [self.name] + cmd + try: + return call_subprocess(cmd, show_stdout, cwd, + on_returncode, + command_desc, extra_environ, + unset_environ=self.unset_environ, + spinner=spinner) + except OSError as e: + # errno.ENOENT = no such file or directory + # In other words, the VCS executable isn't available + if e.errno == errno.ENOENT: + raise BadCommand( + 'Cannot find command %r - do you have ' + '%r installed and in your ' + 'PATH?' % (self.name, self.name)) + else: + raise # re-raise exception if a different error occurred + + @classmethod + def is_repository_directory(cls, path): + """ + Return whether a directory path is a repository directory. 
+ """ + logger.debug('Checking in %s for %s (%s)...', + path, cls.dirname, cls.name) + return os.path.exists(os.path.join(path, cls.dirname)) + + @classmethod + def controls_location(cls, location): + """ + Check if a location is controlled by the vcs. + It is meant to be overridden to implement smarter detection + mechanisms for specific vcs. + + This can do more than is_repository_directory() alone. For example, + the Git override checks that Git is actually available. + """ + return cls.is_repository_directory(location) + + +def get_src_requirement(dist, location): + version_control = vcs.get_backend_from_location(location) + if version_control: + try: + return version_control().get_src_requirement(dist, + location) + except BadCommand: + logger.warning( + 'cannot determine version of editable source in %s ' + '(%s command not found in path)', + location, + version_control.name, + ) + return dist.as_requirement() + logger.warning( + 'cannot determine version of editable source in %s (is not SVN ' + 'checkout, Git clone, Mercurial clone or Bazaar branch)', + location, + ) + return dist.as_requirement() diff --git a/lib/python3.4/site-packages/pip/vcs/bazaar.py b/lib/python3.7/site-packages/pip/_internal/vcs/bazaar.py similarity index 55% rename from lib/python3.4/site-packages/pip/vcs/bazaar.py rename to lib/python3.7/site-packages/pip/_internal/vcs/bazaar.py index 0f09584..3cc66c9 100644 --- a/lib/python3.4/site-packages/pip/vcs/bazaar.py +++ b/lib/python3.7/site-packages/pip/_internal/vcs/bazaar.py @@ -2,18 +2,15 @@ from __future__ import absolute_import import logging import os -import tempfile -# TODO: Get this into six.moves.urllib.parse -try: - from urllib import parse as urllib_parse -except ImportError: - import urlparse as urllib_parse - -from pip.utils import rmtree, display_path -from pip.vcs import vcs, VersionControl -from pip.download import path_to_url +from pip._vendor.six.moves.urllib import parse as urllib_parse +from pip._internal.download 
import path_to_url +from pip._internal.utils.misc import ( + display_path, make_vcs_requirement_url, rmtree, +) +from pip._internal.utils.temp_dir import TempDirectory +from pip._internal.vcs import VersionControl, vcs logger = logging.getLogger(__name__) @@ -29,56 +26,54 @@ class Bazaar(VersionControl): def __init__(self, url=None, *args, **kwargs): super(Bazaar, self).__init__(url, *args, **kwargs) - # Python >= 2.7.4, 3.3 doesn't have uses_fragment or non_hierarchical + # This is only needed for python <2.7.5 # Register lp but do not expose as a scheme to support bzr+lp. if getattr(urllib_parse, 'uses_fragment', None): urllib_parse.uses_fragment.extend(['lp']) - urllib_parse.non_hierarchical.extend(['lp']) + + def get_base_rev_args(self, rev): + return ['-r', rev] def export(self, location): """ Export the Bazaar repository at the url to the destination location """ - temp_dir = tempfile.mkdtemp('-export', 'pip-') - self.unpack(temp_dir) + # Remove the location to make sure Bazaar can export it correctly if os.path.exists(location): - # Remove the location to make sure Bazaar can export it correctly rmtree(location) - try: - self.run_command(['export', location], cwd=temp_dir, - show_stdout=False) - finally: - rmtree(temp_dir) + + with TempDirectory(kind="export") as temp_dir: + self.unpack(temp_dir.path) + + self.run_command( + ['export', location], + cwd=temp_dir.path, show_stdout=False, + ) + + def fetch_new(self, dest, url, rev_options): + rev_display = rev_options.to_display() + logger.info( + 'Checking out %s%s to %s', + url, + rev_display, + display_path(dest), + ) + cmd_args = ['branch', '-q'] + rev_options.to_args() + [url, dest] + self.run_command(cmd_args) def switch(self, dest, url, rev_options): self.run_command(['switch', url], cwd=dest) - def update(self, dest, rev_options): - self.run_command(['pull', '-q'] + rev_options, cwd=dest) + def update(self, dest, url, rev_options): + cmd_args = ['pull', '-q'] + rev_options.to_args() + 
self.run_command(cmd_args, cwd=dest) - def obtain(self, dest): - url, rev = self.get_url_rev() - if rev: - rev_options = ['-r', rev] - rev_display = ' (to revision %s)' % rev - else: - rev_options = [] - rev_display = '' - if self.check_destination(dest, url, rev_options, rev_display): - logger.info( - 'Checking out %s%s to %s', - url, - rev_display, - display_path(dest), - ) - self.run_command(['branch', '-q'] + rev_options + [url, dest]) - - def get_url_rev(self): + def get_url_rev_and_auth(self, url): # hotfix the URL scheme after removing bzr+ from bzr+ssh:// readd it - url, rev = super(Bazaar, self).get_url_rev() + url, rev, user_pass = super(Bazaar, self).get_url_rev_and_auth(url) if url.startswith('ssh://'): url = 'bzr+' + url - return url, rev + return url, rev, user_pass def get_url(self, location): urls = self.run_command(['info'], show_stdout=False, cwd=location) @@ -95,7 +90,8 @@ class Bazaar(VersionControl): def get_revision(self, location): revision = self.run_command( - ['revno'], show_stdout=False, cwd=location) + ['revno'], show_stdout=False, cwd=location, + ) return revision.splitlines()[-1] def get_src_requirement(self, dist, location): @@ -104,11 +100,11 @@ class Bazaar(VersionControl): return None if not repo.lower().startswith('bzr:'): repo = 'bzr+' + repo - egg_project_name = dist.egg_name().split('-', 1)[0] current_rev = self.get_revision(location) - return '%s@%s#egg=%s' % (repo, current_rev, egg_project_name) + egg_project_name = dist.egg_name().split('-', 1)[0] + return make_vcs_requirement_url(repo, current_rev, egg_project_name) - def check_version(self, dest, rev_options): + def is_commit_id_equal(self, dest, name): """Always assume the versions don't match""" return False diff --git a/lib/python3.7/site-packages/pip/_internal/vcs/git.py b/lib/python3.7/site-packages/pip/_internal/vcs/git.py new file mode 100644 index 0000000..9778539 --- /dev/null +++ b/lib/python3.7/site-packages/pip/_internal/vcs/git.py @@ -0,0 +1,346 @@ +from 
__future__ import absolute_import + +import logging +import os.path +import re + +from pip._vendor.packaging.version import parse as parse_version +from pip._vendor.six.moves.urllib import parse as urllib_parse +from pip._vendor.six.moves.urllib import request as urllib_request + +from pip._internal.exceptions import BadCommand +from pip._internal.utils.compat import samefile +from pip._internal.utils.misc import display_path, make_vcs_requirement_url +from pip._internal.utils.temp_dir import TempDirectory +from pip._internal.vcs import VersionControl, vcs + +urlsplit = urllib_parse.urlsplit +urlunsplit = urllib_parse.urlunsplit + + +logger = logging.getLogger(__name__) + + +HASH_REGEX = re.compile('[a-fA-F0-9]{40}') + + +def looks_like_hash(sha): + return bool(HASH_REGEX.match(sha)) + + +class Git(VersionControl): + name = 'git' + dirname = '.git' + repo_name = 'clone' + schemes = ( + 'git', 'git+http', 'git+https', 'git+ssh', 'git+git', 'git+file', + ) + # Prevent the user's environment variables from interfering with pip: + # https://github.com/pypa/pip/issues/1130 + unset_environ = ('GIT_DIR', 'GIT_WORK_TREE') + default_arg_rev = 'HEAD' + + def __init__(self, url=None, *args, **kwargs): + + # Works around an apparent Git bug + # (see https://article.gmane.org/gmane.comp.version-control.git/146500) + if url: + scheme, netloc, path, query, fragment = urlsplit(url) + if scheme.endswith('file'): + initial_slashes = path[:-len(path.lstrip('/'))] + newpath = ( + initial_slashes + + urllib_request.url2pathname(path) + .replace('\\', '/').lstrip('/') + ) + url = urlunsplit((scheme, netloc, newpath, query, fragment)) + after_plus = scheme.find('+') + 1 + url = scheme[:after_plus] + urlunsplit( + (scheme[after_plus:], netloc, newpath, query, fragment), + ) + + super(Git, self).__init__(url, *args, **kwargs) + + def get_base_rev_args(self, rev): + return [rev] + + def get_git_version(self): + VERSION_PFX = 'git version ' + version = self.run_command(['version'], 
show_stdout=False) + if version.startswith(VERSION_PFX): + version = version[len(VERSION_PFX):].split()[0] + else: + version = '' + # get first 3 positions of the git version because + # on windows it is x.y.z.windows.t, and this parses as + # LegacyVersion which is always smaller than a Version. + version = '.'.join(version.split('.')[:3]) + return parse_version(version) + + def get_branch(self, location): + """ + Return the current branch, or None if HEAD isn't at a branch + (e.g. detached HEAD). + """ + args = ['rev-parse', '--abbrev-ref', 'HEAD'] + output = self.run_command(args, show_stdout=False, cwd=location) + branch = output.strip() + + if branch == 'HEAD': + return None + + return branch + + def export(self, location): + """Export the Git repository at the url to the destination location""" + if not location.endswith('/'): + location = location + '/' + + with TempDirectory(kind="export") as temp_dir: + self.unpack(temp_dir.path) + self.run_command( + ['checkout-index', '-a', '-f', '--prefix', location], + show_stdout=False, cwd=temp_dir.path + ) + + def get_revision_sha(self, dest, rev): + """ + Return (sha_or_none, is_branch), where sha_or_none is a commit hash + if the revision names a remote branch or tag, otherwise None. + + Args: + dest: the repository directory. + rev: the revision name. + """ + # Pass rev to pre-filter the list. + output = self.run_command(['show-ref', rev], cwd=dest, + show_stdout=False, on_returncode='ignore') + refs = {} + for line in output.strip().splitlines(): + try: + sha, ref = line.split() + except ValueError: + # Include the offending line to simplify troubleshooting if + # this error ever occurs.
+ raise ValueError('unexpected show-ref line: {!r}'.format(line)) + + refs[ref] = sha + + branch_ref = 'refs/remotes/origin/{}'.format(rev) + tag_ref = 'refs/tags/{}'.format(rev) + + sha = refs.get(branch_ref) + if sha is not None: + return (sha, True) + + sha = refs.get(tag_ref) + + return (sha, False) + + def resolve_revision(self, dest, url, rev_options): + """ + Resolve a revision to a new RevOptions object with the SHA1 of the + branch, tag, or ref if found. + + Args: + rev_options: a RevOptions object. + """ + rev = rev_options.arg_rev + sha, is_branch = self.get_revision_sha(dest, rev) + + if sha is not None: + rev_options = rev_options.make_new(sha) + rev_options.branch_name = rev if is_branch else None + + return rev_options + + # Do not show a warning for the common case of something that has + # the form of a Git commit hash. + if not looks_like_hash(rev): + logger.warning( + "Did not find branch or tag '%s', assuming revision or ref.", + rev, + ) + + if not rev.startswith('refs/'): + return rev_options + + # If it looks like a ref, we have to fetch it explicitly. + self.run_command( + ['fetch', '-q', url] + rev_options.to_args(), + cwd=dest, + ) + # Change the revision to the SHA of the ref we fetched + sha = self.get_revision(dest, rev='FETCH_HEAD') + rev_options = rev_options.make_new(sha) + + return rev_options + + def is_commit_id_equal(self, dest, name): + """ + Return whether the current commit hash equals the given name. + + Args: + dest: the repository directory. + name: a string name. + """ + if not name: + # Then avoid an unnecessary subprocess call. + return False + + return self.get_revision(dest) == name + + def fetch_new(self, dest, url, rev_options): + rev_display = rev_options.to_display() + logger.info( + 'Cloning %s%s to %s', url, rev_display, display_path(dest), + ) + self.run_command(['clone', '-q', url, dest]) + + if rev_options.rev: + # Then a specific revision was requested. 
+ rev_options = self.resolve_revision(dest, url, rev_options) + branch_name = getattr(rev_options, 'branch_name', None) + if branch_name is None: + # Only do a checkout if the current commit id doesn't match + # the requested revision. + if not self.is_commit_id_equal(dest, rev_options.rev): + cmd_args = ['checkout', '-q'] + rev_options.to_args() + self.run_command(cmd_args, cwd=dest) + elif self.get_branch(dest) != branch_name: + # Then a specific branch was requested, and that branch + # is not yet checked out. + track_branch = 'origin/{}'.format(branch_name) + cmd_args = [ + 'checkout', '-b', branch_name, '--track', track_branch, + ] + self.run_command(cmd_args, cwd=dest) + + #: repo may contain submodules + self.update_submodules(dest) + + def switch(self, dest, url, rev_options): + self.run_command(['config', 'remote.origin.url', url], cwd=dest) + cmd_args = ['checkout', '-q'] + rev_options.to_args() + self.run_command(cmd_args, cwd=dest) + + self.update_submodules(dest) + + def update(self, dest, url, rev_options): + # First fetch changes from the default remote + if self.get_git_version() >= parse_version('1.9.0'): + # fetch tags in addition to everything else + self.run_command(['fetch', '-q', '--tags'], cwd=dest) + else: + self.run_command(['fetch', '-q'], cwd=dest) + # Then reset to wanted revision (maybe even origin/master) + rev_options = self.resolve_revision(dest, url, rev_options) + cmd_args = ['reset', '--hard', '-q'] + rev_options.to_args() + self.run_command(cmd_args, cwd=dest) + #: update submodules + self.update_submodules(dest) + + def get_url(self, location): + """Return URL of the first remote encountered.""" + remotes = self.run_command( + ['config', '--get-regexp', r'remote\..*\.url'], + show_stdout=False, cwd=location, + ) + remotes = remotes.splitlines() + found_remote = remotes[0] + for remote in remotes: + if remote.startswith('remote.origin.url '): + found_remote = remote + break + url = found_remote.split(' ')[1] + return url.strip() 
+ + def get_revision(self, location, rev=None): + if rev is None: + rev = 'HEAD' + current_rev = self.run_command( + ['rev-parse', rev], show_stdout=False, cwd=location, + ) + return current_rev.strip() + + def _get_subdirectory(self, location): + """Return the relative path of setup.py to the git repo root.""" + # find the repo root + git_dir = self.run_command(['rev-parse', '--git-dir'], + show_stdout=False, cwd=location).strip() + if not os.path.isabs(git_dir): + git_dir = os.path.join(location, git_dir) + root_dir = os.path.join(git_dir, '..') + # find setup.py + orig_location = location + while not os.path.exists(os.path.join(location, 'setup.py')): + last_location = location + location = os.path.dirname(location) + if location == last_location: + # We've traversed up to the root of the filesystem without + # finding setup.py + logger.warning( + "Could not find setup.py for directory %s (tried all " + "parent directories)", + orig_location, + ) + return None + # relative path of setup.py to repo root + if samefile(root_dir, location): + return None + return os.path.relpath(location, root_dir) + + def get_src_requirement(self, dist, location): + repo = self.get_url(location) + if not repo.lower().startswith('git:'): + repo = 'git+' + repo + current_rev = self.get_revision(location) + egg_project_name = dist.egg_name().split('-', 1)[0] + subdir = self._get_subdirectory(location) + req = make_vcs_requirement_url(repo, current_rev, egg_project_name, + subdir=subdir) + + return req + + def get_url_rev_and_auth(self, url): + """ + Prefixes stub URLs like 'user@hostname:user/repo.git' with 'ssh://'. + That's required because although they use SSH they sometimes don't + work with an ssh:// scheme (e.g. GitHub). But we need a scheme for + parsing. Hence we remove it again afterwards and return it as a stub.
+ """ + if '://' not in url: + assert 'file:' not in url + url = url.replace('git+', 'git+ssh://') + url, rev, user_pass = super(Git, self).get_url_rev_and_auth(url) + url = url.replace('ssh://', '') + else: + url, rev, user_pass = super(Git, self).get_url_rev_and_auth(url) + + return url, rev, user_pass + + def update_submodules(self, location): + if not os.path.exists(os.path.join(location, '.gitmodules')): + return + self.run_command( + ['submodule', 'update', '--init', '--recursive', '-q'], + cwd=location, + ) + + @classmethod + def controls_location(cls, location): + if super(Git, cls).controls_location(location): + return True + try: + r = cls().run_command(['rev-parse'], + cwd=location, + show_stdout=False, + on_returncode='ignore') + return not r + except BadCommand: + logger.debug("could not determine if %s is under git control " + "because git is not available", location) + return False + + +vcs.register(Git) diff --git a/lib/python3.4/site-packages/pip/vcs/mercurial.py b/lib/python3.7/site-packages/pip/_internal/vcs/mercurial.py similarity index 64% rename from lib/python3.4/site-packages/pip/vcs/mercurial.py rename to lib/python3.7/site-packages/pip/_internal/vcs/mercurial.py index 1aa83b9..17cfb67 100644 --- a/lib/python3.4/site-packages/pip/vcs/mercurial.py +++ b/lib/python3.7/site-packages/pip/_internal/vcs/mercurial.py @@ -2,13 +2,13 @@ from __future__ import absolute_import import logging import os -import tempfile -from pip.utils import display_path, rmtree -from pip.vcs import vcs, VersionControl -from pip.download import path_to_url from pip._vendor.six.moves import configparser +from pip._internal.download import path_to_url +from pip._internal.utils.misc import display_path, make_vcs_requirement_url +from pip._internal.utils.temp_dir import TempDirectory +from pip._internal.vcs import VersionControl, vcs logger = logging.getLogger(__name__) @@ -19,15 +19,29 @@ class Mercurial(VersionControl): repo_name = 'clone' schemes = ('hg', 'hg+http', 
'hg+https', 'hg+ssh', 'hg+static-http') + def get_base_rev_args(self, rev): + return [rev] + def export(self, location): """Export the Hg repository at the url to the destination location""" - temp_dir = tempfile.mkdtemp('-export', 'pip-') - self.unpack(temp_dir) - try: + with TempDirectory(kind="export") as temp_dir: + self.unpack(temp_dir.path) + self.run_command( - ['archive', location], show_stdout=False, cwd=temp_dir) - finally: - rmtree(temp_dir) + ['archive', location], show_stdout=False, cwd=temp_dir.path + ) + + def fetch_new(self, dest, url, rev_options): + rev_display = rev_options.to_display() + logger.info( + 'Cloning hg %s%s to %s', + url, + rev_display, + display_path(dest), + ) + self.run_command(['clone', '--noupdate', '-q', url, dest]) + cmd_args = ['update', '-q'] + rev_options.to_args() + self.run_command(cmd_args, cwd=dest) def switch(self, dest, url, rev_options): repo_config = os.path.join(dest, self.dirname, 'hgrc') @@ -42,29 +56,13 @@ class Mercurial(VersionControl): 'Could not switch Mercurial repository to %s: %s', url, exc, ) else: - self.run_command(['update', '-q'] + rev_options, cwd=dest) + cmd_args = ['update', '-q'] + rev_options.to_args() + self.run_command(cmd_args, cwd=dest) - def update(self, dest, rev_options): + def update(self, dest, url, rev_options): self.run_command(['pull', '-q'], cwd=dest) - self.run_command(['update', '-q'] + rev_options, cwd=dest) - - def obtain(self, dest): - url, rev = self.get_url_rev() - if rev: - rev_options = [rev] - rev_display = ' (to revision %s)' % rev - else: - rev_options = [] - rev_display = '' - if self.check_destination(dest, url, rev_options, rev_display): - logger.info( - 'Cloning hg %s%s to %s', - url, - rev_display, - display_path(dest), - ) - self.run_command(['clone', '--noupdate', '-q', url, dest]) - self.run_command(['update', '-q'] + rev_options, cwd=dest) + cmd_args = ['update', '-q'] + rev_options.to_args() + self.run_command(cmd_args, cwd=dest) def get_url(self, location): 
url = self.run_command( @@ -90,14 +88,14 @@ class Mercurial(VersionControl): repo = self.get_url(location) if not repo.lower().startswith('hg:'): repo = 'hg+' + repo - egg_project_name = dist.egg_name().split('-', 1)[0] - if not repo: - return None current_rev_hash = self.get_revision_hash(location) - return '%s@%s#egg=%s' % (repo, current_rev_hash, egg_project_name) + egg_project_name = dist.egg_name().split('-', 1)[0] + return make_vcs_requirement_url(repo, current_rev_hash, + egg_project_name) - def check_version(self, dest, rev_options): + def is_commit_id_equal(self, dest, name): """Always assume the versions don't match""" return False + vcs.register(Mercurial) diff --git a/lib/python3.4/site-packages/pip/vcs/subversion.py b/lib/python3.7/site-packages/pip/_internal/vcs/subversion.py similarity index 56% rename from lib/python3.4/site-packages/pip/vcs/subversion.py rename to lib/python3.7/site-packages/pip/_internal/vcs/subversion.py index 4b23156..6f7cb5d 100644 --- a/lib/python3.4/site-packages/pip/vcs/subversion.py +++ b/lib/python3.7/site-packages/pip/_internal/vcs/subversion.py @@ -4,17 +4,15 @@ import logging import os import re -from pip._vendor.six.moves.urllib import parse as urllib_parse - -from pip.index import Link -from pip.utils import rmtree, display_path -from pip.utils.logging import indent_log -from pip.vcs import vcs, VersionControl +from pip._internal.models.link import Link +from pip._internal.utils.logging import indent_log +from pip._internal.utils.misc import ( + display_path, make_vcs_requirement_url, rmtree, split_auth_from_netloc, +) +from pip._internal.vcs import VersionControl, vcs _svn_xml_url_re = re.compile('url="([^"]+)"') -_svn_rev_re = re.compile('committed-rev="(\d+)"') -_svn_url_re = re.compile(r'URL: (.+)') -_svn_revision_re = re.compile(r'Revision: (.+)') +_svn_rev_re = re.compile(r'committed-rev="(\d+)"') _svn_info_xml_rev_re = re.compile(r'\s*revision="(\d+)"') _svn_info_xml_url_re = re.compile(r'<url>(.*)</url>') @@ -28,71
+26,40 @@ class Subversion(VersionControl): repo_name = 'checkout' schemes = ('svn', 'svn+ssh', 'svn+http', 'svn+https', 'svn+svn') - def get_info(self, location): - """Returns (url, revision), where both are strings""" - assert not location.rstrip('/').endswith(self.dirname), \ - 'Bad directory: %s' % location - output = self.run_command( - ['info', location], - show_stdout=False, - extra_environ={'LANG': 'C'}, - ) - match = _svn_url_re.search(output) - if not match: - logger.warning( - 'Cannot determine URL of svn checkout %s', - display_path(location), - ) - logger.debug('Output that cannot be parsed: \n%s', output) - return None, None - url = match.group(1).strip() - match = _svn_revision_re.search(output) - if not match: - logger.warning( - 'Cannot determine revision of svn checkout %s', - display_path(location), - ) - logger.debug('Output that cannot be parsed: \n%s', output) - return url, None - return url, match.group(1) + def get_base_rev_args(self, rev): + return ['-r', rev] def export(self, location): """Export the svn repository at the url to the destination location""" - url, rev = self.get_url_rev() - rev_options = get_rev_options(url, rev) - url = self.remove_auth_from_url(url) + url, rev_options = self.get_url_rev_options(self.url) + logger.info('Exporting svn repository %s to %s', url, location) with indent_log(): if os.path.exists(location): # Subversion doesn't like to check out over an existing # directory --force fixes this, but was only added in svn 1.5 rmtree(location) - self.run_command( - ['export'] + rev_options + [url, location], - show_stdout=False) + cmd_args = ['export'] + rev_options.to_args() + [url, location] + self.run_command(cmd_args, show_stdout=False) + + def fetch_new(self, dest, url, rev_options): + rev_display = rev_options.to_display() + logger.info( + 'Checking out %s%s to %s', + url, + rev_display, + display_path(dest), + ) + cmd_args = ['checkout', '-q'] + rev_options.to_args() + [url, dest] + self.run_command(cmd_args) 
def switch(self, dest, url, rev_options): - self.run_command(['switch'] + rev_options + [url, dest]) + cmd_args = ['switch'] + rev_options.to_args() + [url, dest] + self.run_command(cmd_args) - def update(self, dest, rev_options): - self.run_command(['update'] + rev_options + [dest]) - - def obtain(self, dest): - url, rev = self.get_url_rev() - rev_options = get_rev_options(url, rev) - url = self.remove_auth_from_url(url) - if rev: - rev_display = ' (to revision %s)' % rev - else: - rev_display = '' - if self.check_destination(dest, url, rev_options, rev_display): - logger.info( - 'Checking out %s%s to %s', - url, - rev_display, - display_path(dest), - ) - self.run_command(['checkout', '-q'] + rev_options + [url, dest]) + def update(self, dest, url, rev_options): + cmd_args = ['update'] + rev_options.to_args() + [dest] + self.run_command(cmd_args) def get_location(self, dist, dependency_links): for url in dependency_links: @@ -128,19 +95,41 @@ class Subversion(VersionControl): dirurl, localrev = self._get_svn_url_rev(base) if base == location: - base_url = dirurl + '/' # save the root url - elif not dirurl or not dirurl.startswith(base_url): + base = dirurl + '/' # save the root url + elif not dirurl or not dirurl.startswith(base): dirs[:] = [] continue # not part of the same svn tree, skip it revision = max(revision, localrev) return revision - def get_url_rev(self): + def get_netloc_and_auth(self, netloc, scheme): + """ + This override allows the auth information to be passed to svn via the + --username and --password options instead of via the URL. + """ + if scheme == 'ssh': + # The --username and --password options can't be used for + # svn+ssh URLs, so keep the auth information in the URL. 
+ return super(Subversion, self).get_netloc_and_auth( + netloc, scheme) + + return split_auth_from_netloc(netloc) + + def get_url_rev_and_auth(self, url): # hotfix the URL scheme after removing svn+ from svn+ssh:// readd it - url, rev = super(Subversion, self).get_url_rev() + url, rev, user_pass = super(Subversion, self).get_url_rev_and_auth(url) if url.startswith('ssh://'): url = 'svn+' + url - return url, rev + return url, rev, user_pass + + def make_rev_args(self, username, password): + extra_args = [] + if username: + extra_args += ['--username', username] + if password: + extra_args += ['--password', password] + + return extra_args def get_url(self, location): # In cases where the source is in a subdirectory, not alongside @@ -163,7 +152,7 @@ class Subversion(VersionControl): return self._get_svn_url_rev(location)[0] def _get_svn_url_rev(self, location): - from pip.exceptions import InstallationError + from pip._internal.exceptions import InstallationError entries_path = os.path.join(location, self.dirname, 'entries') if os.path.exists(entries_path): @@ -210,60 +199,15 @@ class Subversion(VersionControl): repo = self.get_url(location) if repo is None: return None + repo = 'svn+' + repo + rev = self.get_revision(location) # FIXME: why not project name? egg_project_name = dist.egg_name().split('-', 1)[0] - rev = self.get_revision(location) - return 'svn+%s@%s#egg=%s' % (repo, rev, egg_project_name) + return make_vcs_requirement_url(repo, rev, egg_project_name) - def check_version(self, dest, rev_options): + def is_commit_id_equal(self, dest, name): """Always assume the versions don't match""" return False - @staticmethod - def remove_auth_from_url(url): - # Return a copy of url with 'username:password@' removed. - # username/pass params are passed to subversion through flags - # and are not recognized in the url. 
- - # parsed url - purl = urllib_parse.urlsplit(url) - stripped_netloc = \ - purl.netloc.split('@')[-1] - - # stripped url - url_pieces = ( - purl.scheme, stripped_netloc, purl.path, purl.query, purl.fragment - ) - surl = urllib_parse.urlunsplit(url_pieces) - return surl - - -def get_rev_options(url, rev): - if rev: - rev_options = ['-r', rev] - else: - rev_options = [] - - r = urllib_parse.urlsplit(url) - if hasattr(r, 'username'): - # >= Python-2.5 - username, password = r.username, r.password - else: - netloc = r[1] - if '@' in netloc: - auth = netloc.split('@')[0] - if ':' in auth: - username, password = auth.split(':', 1) - else: - username, password = auth, None - else: - username, password = None, None - - if username: - rev_options += ['--username', username] - if password: - rev_options += ['--password', password] - return rev_options - vcs.register(Subversion) diff --git a/lib/python3.4/site-packages/pip/wheel.py b/lib/python3.7/site-packages/pip/_internal/wheel.py similarity index 74% rename from lib/python3.4/site-packages/pip/wheel.py rename to lib/python3.7/site-packages/pip/_internal/wheel.py index 9ac9dff..5ce890e 100644 --- a/lib/python3.4/site-packages/pip/wheel.py +++ b/lib/python3.7/site-packages/pip/_internal/wheel.py @@ -3,44 +3,44 @@ Support for installing and building the "wheel" binary package format. 
""" from __future__ import absolute_import +import collections import compileall import csv -import errno -import functools import hashlib import logging -import os import os.path import re import shutil import stat import sys -import tempfile import warnings - from base64 import urlsafe_b64encode from email.parser import Parser +from pip._vendor import pkg_resources +from pip._vendor.distlib.scripts import ScriptMaker +from pip._vendor.packaging.utils import canonicalize_name from pip._vendor.six import StringIO -import pip -from pip.compat import expanduser -from pip.download import path_to_url, unpack_url -from pip.exceptions import ( - InstallationError, InvalidWheelFilename, UnsupportedWheel) -from pip.locations import distutils_scheme, PIP_DELETE_MARKER_FILENAME -from pip import pep425tags -from pip.utils import ( - call_subprocess, ensure_dir, captured_stdout, rmtree, read_chunks, +from pip._internal import pep425tags +from pip._internal.download import path_to_url, unpack_url +from pip._internal.exceptions import ( + InstallationError, InvalidWheelFilename, UnsupportedWheel, ) -from pip.utils.ui import open_spinner -from pip.utils.logging import indent_log -from pip.utils.setuptools_build import SETUPTOOLS_SHIM -from pip._vendor.distlib.scripts import ScriptMaker -from pip._vendor import pkg_resources -from pip._vendor.packaging.utils import canonicalize_name -from pip._vendor.six.moves import configparser +from pip._internal.locations import ( + PIP_DELETE_MARKER_FILENAME, distutils_scheme, +) +from pip._internal.utils.logging import indent_log +from pip._internal.utils.misc import ( + call_subprocess, captured_stdout, ensure_dir, read_chunks, +) +from pip._internal.utils.setuptools_build import SETUPTOOLS_SHIM +from pip._internal.utils.temp_dir import TempDirectory +from pip._internal.utils.typing import MYPY_CHECK_RUNNING +from pip._internal.utils.ui import open_spinner +if MYPY_CHECK_RUNNING: + from typing import Dict, List, Optional # noqa: F401 
wheel_ext = '.whl' @@ -50,107 +50,9 @@ VERSION_COMPATIBLE = (1, 0) logger = logging.getLogger(__name__) -class WheelCache(object): - """A cache of wheels for future installs.""" - - def __init__(self, cache_dir, format_control): - """Create a wheel cache. - - :param cache_dir: The root of the cache. - :param format_control: A pip.index.FormatControl object to limit - binaries being read from the cache. - """ - self._cache_dir = expanduser(cache_dir) if cache_dir else None - self._format_control = format_control - - def cached_wheel(self, link, package_name): - return cached_wheel( - self._cache_dir, link, self._format_control, package_name) - - -def _cache_for_link(cache_dir, link): - """ - Return a directory to store cached wheels in for link. - - Because there are M wheels for any one sdist, we provide a directory - to cache them in, and then consult that directory when looking up - cache hits. - - We only insert things into the cache if they have plausible version - numbers, so that we don't contaminate the cache with things that were not - unique. E.g. ./package might have dozens of installs done for it and build - a version of 0.0...and if we built and cached a wheel, we'd end up using - the same wheel even if the source has been edited. - - :param cache_dir: The cache_dir being used by pip. - :param link: The link of the sdist for which this will cache wheels. - """ - - # We want to generate an url to use as our cache key, we don't want to just - # re-use the URL because it might have other items in the fragment and we - # don't care about those. - key_parts = [link.url_without_fragment] - if link.hash_name is not None and link.hash is not None: - key_parts.append("=".join([link.hash_name, link.hash])) - key_url = "#".join(key_parts) - - # Encode our key url with sha224, we'll use this because it has similar - # security properties to sha256, but with a shorter total output (and thus - # less secure). 
However the differences don't make a lot of difference for - our use case here. - hashed = hashlib.sha224(key_url.encode()).hexdigest() - - # We want to nest the directories some to prevent having a ton of top level - # directories where we might run out of sub directories on some FS. - parts = [hashed[:2], hashed[2:4], hashed[4:6], hashed[6:]] - - # Inside of the base location for cached wheels, expand our parts and join - # them all together. - return os.path.join(cache_dir, "wheels", *parts) - - -def cached_wheel(cache_dir, link, format_control, package_name): - if not cache_dir: - return link - if not link: - return link - if link.is_wheel: - return link - if not link.is_artifact: - return link - if not package_name: - return link - canonical_name = canonicalize_name(package_name) - formats = pip.index.fmt_ctl_formats(format_control, canonical_name) - if "binary" not in formats: - return link - root = _cache_for_link(cache_dir, link) - try: - wheel_names = os.listdir(root) - except OSError as e: - if e.errno in (errno.ENOENT, errno.ENOTDIR): - return link - raise - candidates = [] - for wheel_name in wheel_names: - try: - wheel = Wheel(wheel_name) - except InvalidWheelFilename: - continue - if not wheel.supported(): - # Built for a different python/arch/etc - continue - candidates.append((wheel.support_index_min(), wheel_name)) - if not candidates: - return link - candidates.sort() - path = os.path.join(root, candidates[0][1]) - return pip.index.Link(path_to_url(path)) - - -def rehash(path, algo='sha256', blocksize=1 << 20): - """Return (hash, length) for path using hashlib.new(algo)""" - h = hashlib.new(algo) +def rehash(path, blocksize=1 << 20): + """Return (hash, length) for path using hashlib.sha256()""" + h = hashlib.sha256() length = 0 with open(path, 'rb') as f: for block in read_chunks(f, size=blocksize): @@ -189,7 +91,8 @@ def fix_script(path): script.write(rest) return True -dist_info_re = re.compile(r"""^(?P<namever>(?P<name>.+?)(-(?P<ver>\d.+?))?)
+ +dist_info_re = re.compile(r"""^(?P<namever>(?P<name>.+?)(-(?P<ver>.+?))?) \.dist-info$""", re.VERBOSE) @@ -224,21 +127,86 @@ def get_entrypoints(filename): data.write("\n") data.seek(0) - cp = configparser.RawConfigParser() - cp.optionxform = lambda option: option - cp.readfp(data) + # get the entry points and then the script names + entry_points = pkg_resources.EntryPoint.parse_map(data) + console = entry_points.get('console_scripts', {}) + gui = entry_points.get('gui_scripts', {}) - console = {} - gui = {} - if cp.has_section('console_scripts'): - console = dict(cp.items('console_scripts')) - if cp.has_section('gui_scripts'): - gui = dict(cp.items('gui_scripts')) + def _split_ep(s): + """get the string representation of EntryPoint, remove space and split + on '='""" + return str(s).replace(" ", "").split("=") + + # convert the EntryPoint objects into strings with module:function + console = dict(_split_ep(v) for v in console.values()) + gui = dict(_split_ep(v) for v in gui.values()) return console, gui +def message_about_scripts_not_on_PATH(scripts): + # type: (List[str]) -> Optional[str] + """Determine if any scripts are not on PATH and format a warning. + + Returns a warning message if one or more scripts are not on PATH, + otherwise None. + """ + if not scripts: + return None + + # Group scripts by the path they were installed in + grouped_by_dir = collections.defaultdict(set) # type: Dict[str, set] + for destfile in scripts: + parent_dir = os.path.dirname(destfile) + script_name = os.path.basename(destfile) + grouped_by_dir[parent_dir].add(script_name) + + # We don't want to warn for directories that are on PATH. + not_warn_dirs = [ + os.path.normcase(i).rstrip(os.sep) for i in + os.environ.get("PATH", "").split(os.pathsep) + ] + # If an executable sits with sys.executable, we don't warn for it. + # This covers the case of venv invocations without activating the venv.
+ not_warn_dirs.append(os.path.normcase(os.path.dirname(sys.executable))) + warn_for = { + parent_dir: scripts for parent_dir, scripts in grouped_by_dir.items() + if os.path.normcase(parent_dir) not in not_warn_dirs + } + if not warn_for: + return None + + # Format a message + msg_lines = [] + for parent_dir, scripts in warn_for.items(): + scripts = sorted(scripts) + if len(scripts) == 1: + start_text = "script {} is".format(scripts[0]) + else: + start_text = "scripts {} are".format( + ", ".join(scripts[:-1]) + " and " + scripts[-1] + ) + + msg_lines.append( + "The {} installed in '{}' which is not on PATH." + .format(start_text, parent_dir) + ) + + last_line_fmt = ( + "Consider adding {} to PATH or, if you prefer " + "to suppress this warning, use --no-warn-script-location." + ) + if len(msg_lines) == 1: + msg_lines.append(last_line_fmt.format("this directory")) + else: + msg_lines.append(last_line_fmt.format("these directories")) + + # Returns the formatted multiline message + return "\n".join(msg_lines) + + def move_wheel_files(name, req, wheeldir, user=False, home=None, root=None, - pycompile=True, scheme=None, isolated=False, prefix=None): + pycompile=True, scheme=None, isolated=False, prefix=None, + warn_script_location=True): """Install a wheel""" if not scheme: @@ -315,6 +283,17 @@ def move_wheel_files(name, req, wheeldir, user=False, home=None, root=None, # uninstalled. ensure_dir(destdir) + # copyfile (called below) truncates the destination if it + # exists and then writes the new contents. This is fine in most + # cases, but can cause a segfault if pip has loaded a shared + # object (e.g. from pyopenssl through its vendored urllib3) + # Since the shared object is mmap'd an attempt to call a + # symbol in it will then cause a segfault. Unlinking the file + # allows writing of new contents while allowing the process to + # continue to use the old copy. 
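The unlink-before-overwrite pattern described in the comment above can be shown in isolation. The `replace_file` helper below is a hypothetical sketch of the idea, not pip's code: unlinking first means existing readers (including a process with the old file mmap'd) keep the old inode, while the copy creates a fresh one.

```python
import os
import shutil
import tempfile

def replace_file(src, destfile):
    # copyfile() truncates the destination in place, which can crash a
    # process that has the old file mmap'd. Unlink first so the old inode
    # survives for existing users and the new contents get a new inode.
    if os.path.exists(destfile):
        os.unlink(destfile)
    shutil.copyfile(src, destfile)

# Demo in a scratch directory (hypothetical file names).
with tempfile.TemporaryDirectory() as td:
    src = os.path.join(td, "new.so")
    dest = os.path.join(td, "old.so")
    with open(src, "w") as f:
        f.write("new contents")
    with open(dest, "w") as f:
        f.write("old contents")
    replace_file(src, dest)
    with open(dest) as f:
        result = f.read()
```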
+ if os.path.exists(destfile): + os.unlink(destfile) + # We use copyfile (not move, copy, or copy2) to be extra sure # that we are not moving directories over (copyfile fails for # directories) as well as to ensure that we are not copying @@ -385,7 +364,7 @@ def move_wheel_files(name, req, wheeldir, user=False, home=None, root=None, # Ensure we don't generate any variants for scripts because this is almost # never what somebody wants. # See https://bitbucket.org/pypa/distlib/issue/35/ - maker.variants = set(('', )) + maker.variants = {''} # This is required because otherwise distlib creates scripts that are not # executable. @@ -411,7 +390,7 @@ def move_wheel_files(name, req, wheeldir, user=False, home=None, root=None, } maker._get_script_text = _get_script_text - maker.script_template = """# -*- coding: utf-8 -*- + maker.script_template = r"""# -*- coding: utf-8 -*- import re import sys @@ -488,9 +467,16 @@ if __name__ == '__main__': # Generate the console and GUI entry points specified in the wheel if len(console) > 0: - generated.extend( - maker.make_multiple(['%s = %s' % kv for kv in console.items()]) + generated_console_scripts = maker.make_multiple( + ['%s = %s' % kv for kv in console.items()] ) + generated.extend(generated_console_scripts) + + if warn_script_location: + msg = message_about_scripts_not_on_PATH(generated_console_scripts) + if msg is not None: + logger.warning(msg) + if len(gui) > 0: generated.extend( maker.make_multiple( @@ -514,53 +500,22 @@ if __name__ == '__main__': with open_for_csv(temp_record, 'w+') as record_out: reader = csv.reader(record_in) writer = csv.writer(record_out) + outrows = [] for row in reader: row[0] = installed.pop(row[0], row[0]) if row[0] in changed: row[1], row[2] = rehash(row[0]) - writer.writerow(row) + outrows.append(tuple(row)) for f in generated: - h, l = rehash(f) - writer.writerow((normpath(f, lib_dir), h, l)) + digest, length = rehash(f) + outrows.append((normpath(f, lib_dir), digest, length)) for f in 
installed:
-                writer.writerow((installed[f], '', ''))
+                outrows.append((installed[f], '', ''))
+            for row in sorted(outrows):
+                writer.writerow(row)
     shutil.move(temp_record, record)
 
 
-def _unique(fn):
-    @functools.wraps(fn)
-    def unique(*args, **kw):
-        seen = set()
-        for item in fn(*args, **kw):
-            if item not in seen:
-                seen.add(item)
-                yield item
-    return unique
-
-
-# TODO: this goes somewhere besides the wheel module
-@_unique
-def uninstallation_paths(dist):
-    """
-    Yield all the uninstallation paths for dist based on RECORD-without-.pyc
-
-    Yield paths to all the files in RECORD. For each .py file in RECORD, add
-    the .pyc in the same directory.
-
-    UninstallPathSet.add() takes care of the __pycache__ .pyc.
-    """
-    from pip.utils import FakeFile  # circular import
-    r = csv.reader(FakeFile(dist.get_metadata_lines('RECORD')))
-    for row in r:
-        path = os.path.join(dist.location, row[0])
-        yield path
-        if path.endswith('.py'):
-            dn, fn = os.path.split(path)
-            base = fn[:-3]
-            path = os.path.join(dn, base + '.pyc')
-            yield path
-
-
 def wheel_version(source_dir):
     """
     Return the Wheel-Version of an extracted wheel, if possible.
@@ -576,7 +531,7 @@ def wheel_version(source_dir):
         version = wheel_data['Wheel-Version'].strip()
         version = tuple(map(int, version.split('.')))
         return version
-    except:
+    except Exception:
         return False
 
 
@@ -615,8 +570,8 @@ class Wheel(object):
     # TODO: maybe move the install code into this class
 
    wheel_file_re = re.compile(
-        r"""^(?P<namever>(?P<name>.+?)-(?P<ver>\d.*?))
-        ((-(?P<build>\d.*?))?-(?P<pyver>.+?)-(?P<abi>.+?)-(?P<plat>.+?)
+        r"""^(?P<namever>(?P<name>.+?)-(?P<ver>.*?))
+        ((-(?P<build>\d[^-]*?))?-(?P<pyver>.+?)-(?P<abi>.+?)-(?P<plat>.+?)
\.whl|\.dist-info)$""", re.VERBOSE ) @@ -635,15 +590,16 @@ class Wheel(object): # we'll assume "_" means "-" due to wheel naming scheme # (https://github.com/pypa/pip/issues/1150) self.version = wheel_info.group('ver').replace('_', '-') + self.build_tag = wheel_info.group('build') self.pyversions = wheel_info.group('pyver').split('.') self.abis = wheel_info.group('abi').split('.') self.plats = wheel_info.group('plat').split('.') # All the tag combinations from this file - self.file_tags = set( + self.file_tags = { (x, y, z) for x in self.pyversions for y in self.abis for z in self.plats - ) + } def support_index_min(self, tags=None): """ @@ -653,54 +609,66 @@ class Wheel(object): None is the wheel is not supported. """ if tags is None: # for mock - tags = pep425tags.supported_tags + tags = pep425tags.get_supported() indexes = [tags.index(c) for c in self.file_tags if c in tags] return min(indexes) if indexes else None def supported(self, tags=None): """Is this wheel supported on this system?""" if tags is None: # for mock - tags = pep425tags.supported_tags + tags = pep425tags.get_supported() return bool(set(tags).intersection(self.file_tags)) class WheelBuilder(object): """Build wheels from a RequirementSet.""" - def __init__(self, requirement_set, finder, build_options=None, - global_options=None): - self.requirement_set = requirement_set + def __init__(self, finder, preparer, wheel_cache, + build_options=None, global_options=None, no_clean=False): self.finder = finder - self._cache_root = requirement_set._wheel_cache._cache_dir - self._wheel_dir = requirement_set.wheel_download_dir + self.preparer = preparer + self.wheel_cache = wheel_cache + + self._wheel_dir = preparer.wheel_download_dir + self.build_options = build_options or [] self.global_options = global_options or [] + self.no_clean = no_clean def _build_one(self, req, output_dir, python_tag=None): """Build one wheel. :return: The filename of the built wheel, or None if the build failed. 
""" - tempd = tempfile.mkdtemp('pip-wheel-') - try: - if self.__build_one(req, tempd, python_tag=python_tag): + # Install build deps into temporary directory (PEP 518) + with req.build_env: + return self._build_one_inside_env(req, output_dir, + python_tag=python_tag) + + def _build_one_inside_env(self, req, output_dir, python_tag=None): + with TempDirectory(kind="wheel") as temp_dir: + if self.__build_one(req, temp_dir.path, python_tag=python_tag): try: - wheel_name = os.listdir(tempd)[0] + wheel_name = os.listdir(temp_dir.path)[0] wheel_path = os.path.join(output_dir, wheel_name) - shutil.move(os.path.join(tempd, wheel_name), wheel_path) + shutil.move( + os.path.join(temp_dir.path, wheel_name), wheel_path + ) logger.info('Stored in directory: %s', output_dir) return wheel_path - except: + except Exception: pass # Ignore return, we can't do anything else useful. self._clean_one(req) return None - finally: - rmtree(tempd) def _base_setup_args(self, req): + # NOTE: Eventually, we'd want to also -S to the flags here, when we're + # isolating. Currently, it breaks Python in virtualenvs, because it + # relies on site.py to find parts of the standard library outside the + # virtualenv. return [ - sys.executable, "-u", '-c', + sys.executable, '-u', '-c', SETUPTOOLS_SHIM % req.setup_py ] + list(self.global_options) @@ -720,7 +688,7 @@ class WheelBuilder(object): call_subprocess(wheel_args, cwd=req.setup_py_dir, show_stdout=False, spinner=spinner) return True - except: + except Exception: spinner.finish("error") logger.error('Failed building wheel for %s', req.name) return False @@ -733,53 +701,58 @@ class WheelBuilder(object): try: call_subprocess(clean_args, cwd=req.source_dir, show_stdout=False) return True - except: + except Exception: logger.error('Failed cleaning build dir for %s', req.name) return False - def build(self, autobuilding=False): + def build(self, requirements, session, autobuilding=False): """Build wheels. 
:param unpack: If True, replace the sdist we built from with the newly built wheel, in preparation for installation. :return: True if all the wheels built correctly. """ - assert self._wheel_dir or (autobuilding and self._cache_root) - # unpack sdists and constructs req set - self.requirement_set.prepare_files(self.finder) + from pip._internal import index + from pip._internal.models.link import Link - reqset = self.requirement_set.requirements.values() + building_is_possible = self._wheel_dir or ( + autobuilding and self.wheel_cache.cache_dir + ) + assert building_is_possible buildset = [] - for req in reqset: + format_control = self.finder.format_control + for req in requirements: if req.constraint: continue if req.is_wheel: if not autobuilding: logger.info( - 'Skipping %s, due to already being wheel.', req.name) + 'Skipping %s, due to already being wheel.', req.name, + ) elif autobuilding and req.editable: pass - elif autobuilding and req.link and not req.link.is_artifact: - pass elif autobuilding and not req.source_dir: pass + elif autobuilding and req.link and not req.link.is_artifact: + # VCS checkout. Build wheel just for this run. + buildset.append((req, True)) else: + ephem_cache = False if autobuilding: link = req.link base, ext = link.splitext() - if pip.index.egg_info_matches(base, None, link) is None: - # Doesn't look like a package - don't autobuild a wheel - # because we'll have no way to lookup the result sanely - continue - if "binary" not in pip.index.fmt_ctl_formats( - self.finder.format_control, + if index.egg_info_matches(base, None, link) is None: + # E.g. local directory. Build wheel just for this run. 
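The buildset loop above decides, per requirement, whether a built wheel may go into the persistent wheel cache or only an ephemeral per-run cache. That rule can be distilled into a tiny decision function; `choose_cache` and its arguments are hypothetical names for a sketch of the rule, not pip's implementation:

```python
def choose_cache(is_artifact, looks_like_package):
    """Pick a wheel cache for an autobuilt requirement (sketch).

    is_artifact: False for VCS checkouts (non-artifact links).
    looks_like_package: False when the link has no recognizable
    name/version, e.g. a plain local directory.
    """
    if not is_artifact:
        # VCS checkout: the wheel is only valid for this run.
        return "ephemeral"
    if not looks_like_package:
        # E.g. a local directory: no sane cache key, build per run.
        return "ephemeral"
    # A normal named artifact may be cached persistently.
    return "persistent"
```

A VCS checkout or an unnamed local directory gets an ephemeral cache path, while a regular sdist link is cached for reuse across runs.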
+ ephem_cache = True + if "binary" not in format_control.get_allowed_formats( canonicalize_name(req.name)): logger.info( "Skipping bdist_wheel for %s, due to binaries " - "being disabled for it.", req.name) + "being disabled for it.", req.name, + ) continue - buildset.append(req) + buildset.append((req, ephem_cache)) if not buildset: return True @@ -787,15 +760,19 @@ class WheelBuilder(object): # Build the wheels. logger.info( 'Building wheels for collected packages: %s', - ', '.join([req.name for req in buildset]), + ', '.join([req.name for (req, _) in buildset]), ) + _cache = self.wheel_cache # shorter name with indent_log(): build_success, build_failure = [], [] - for req in buildset: + for req, ephem in buildset: python_tag = None if autobuilding: python_tag = pep425tags.implementation_tag - output_dir = _cache_for_link(self._cache_root, req.link) + if ephem: + output_dir = _cache.get_ephem_path_for_link(req.link) + else: + output_dir = _cache.get_path_for_link(req.link) try: ensure_dir(output_dir) except OSError as e: @@ -826,15 +803,16 @@ class WheelBuilder(object): # set the build directory again - name is known from # the work prepare_files did. req.source_dir = req.build_location( - self.requirement_set.build_dir) + self.preparer.build_dir + ) # Update the link for this. 
- req.link = pip.index.Link( - path_to_url(wheel_file)) + req.link = Link(path_to_url(wheel_file)) assert req.link.is_wheel # extract the wheel into the dir unpack_url( req.link, req.source_dir, None, False, - session=self.requirement_set.session) + session=session, + ) else: build_failure.append(req) diff --git a/lib/python3.4/site-packages/pip/_vendor/__init__.py b/lib/python3.7/site-packages/pip/_vendor/__init__.py similarity index 98% rename from lib/python3.4/site-packages/pip/_vendor/__init__.py rename to lib/python3.7/site-packages/pip/_vendor/__init__.py index 8e76ab8..07db110 100644 --- a/lib/python3.4/site-packages/pip/_vendor/__init__.py +++ b/lib/python3.7/site-packages/pip/_vendor/__init__.py @@ -70,11 +70,13 @@ if DEBUNDLED: vendored("six") vendored("six.moves") vendored("six.moves.urllib") + vendored("six.moves.urllib.parse") vendored("packaging") vendored("packaging.version") vendored("packaging.specifiers") vendored("pkg_resources") vendored("progress") + vendored("pytoml") vendored("retrying") vendored("requests") vendored("requests.packages") @@ -109,3 +111,4 @@ if DEBUNDLED: vendored("requests.packages.urllib3.util.ssl_") vendored("requests.packages.urllib3.util.timeout") vendored("requests.packages.urllib3.util.url") + vendored("urllib3") diff --git a/lib/python3.7/site-packages/pip/_vendor/pep517/__init__.py b/lib/python3.7/site-packages/pip/_vendor/pep517/__init__.py new file mode 100644 index 0000000..8beedea --- /dev/null +++ b/lib/python3.7/site-packages/pip/_vendor/pep517/__init__.py @@ -0,0 +1,4 @@ +"""Wrappers to build Python packages using PEP 517 hooks +""" + +__version__ = '0.2' diff --git a/lib/python3.7/site-packages/pip/_vendor/pep517/_in_process.py b/lib/python3.7/site-packages/pip/_vendor/pep517/_in_process.py new file mode 100644 index 0000000..baa14d3 --- /dev/null +++ b/lib/python3.7/site-packages/pip/_vendor/pep517/_in_process.py @@ -0,0 +1,182 @@ +"""This is invoked in a subprocess to call the build backend hooks. 
+ +It expects: +- Command line args: hook_name, control_dir +- Environment variable: PEP517_BUILD_BACKEND=entry.point:spec +- control_dir/input.json: + - {"kwargs": {...}} + +Results: +- control_dir/output.json + - {"return_val": ...} +""" +from glob import glob +from importlib import import_module +import os +from os.path import join as pjoin +import re +import shutil +import sys + +# This is run as a script, not a module, so it can't do a relative import +import compat + +def _build_backend(): + """Find and load the build backend""" + ep = os.environ['PEP517_BUILD_BACKEND'] + mod_path, _, obj_path = ep.partition(':') + obj = import_module(mod_path) + if obj_path: + for path_part in obj_path.split('.'): + obj = getattr(obj, path_part) + return obj + +def get_requires_for_build_wheel(config_settings): + """Invoke the optional get_requires_for_build_wheel hook + + Returns [] if the hook is not defined. + """ + backend = _build_backend() + try: + hook = backend.get_requires_for_build_wheel + except AttributeError: + return [] + else: + return hook(config_settings) + +def prepare_metadata_for_build_wheel(metadata_directory, config_settings): + """Invoke optional prepare_metadata_for_build_wheel + + Implements a fallback by building a wheel if the hook isn't defined. 
+ """ + backend = _build_backend() + try: + hook = backend.prepare_metadata_for_build_wheel + except AttributeError: + return _get_wheel_metadata_from_wheel(backend, metadata_directory, + config_settings) + else: + return hook(metadata_directory, config_settings) + +WHEEL_BUILT_MARKER = 'PEP517_ALREADY_BUILT_WHEEL' + +def _dist_info_files(whl_zip): + """Identify the .dist-info folder inside a wheel ZipFile.""" + res = [] + for path in whl_zip.namelist(): + m = re.match(r'[^/\\]+-[^/\\]+\.dist-info/', path) + if m: + res.append(path) + if res: + return res + raise Exception("No .dist-info folder found in wheel") + +def _get_wheel_metadata_from_wheel(backend, metadata_directory, config_settings): + """Build a wheel and extract the metadata from it. + + Fallback for when the build backend does not define the 'get_wheel_metadata' + hook. + """ + from zipfile import ZipFile + whl_basename = backend.build_wheel(metadata_directory, config_settings) + with open(os.path.join(metadata_directory, WHEEL_BUILT_MARKER), 'wb'): + pass # Touch marker file + + whl_file = os.path.join(metadata_directory, whl_basename) + with ZipFile(whl_file) as zipf: + dist_info = _dist_info_files(zipf) + zipf.extractall(path=metadata_directory, members=dist_info) + return dist_info[0].split('/')[0] + +def _find_already_built_wheel(metadata_directory): + """Check for a wheel already built during the get_wheel_metadata hook. + """ + if not metadata_directory: + return None + metadata_parent = os.path.dirname(metadata_directory) + if not os.path.isfile(pjoin(metadata_parent, WHEEL_BUILT_MARKER)): + return None + + whl_files = glob(os.path.join(metadata_parent, '*.whl')) + if not whl_files: + print('Found wheel built marker, but no .whl files') + return None + if len(whl_files) > 1: + print('Found multiple .whl files; unspecified behaviour. 
' + 'Will call build_wheel.') + return None + + # Exactly one .whl file + return whl_files[0] + +def build_wheel(wheel_directory, config_settings, metadata_directory=None): + """Invoke the mandatory build_wheel hook. + + If a wheel was already built in the prepare_metadata_for_build_wheel fallback, this + will copy it rather than rebuilding the wheel. + """ + prebuilt_whl = _find_already_built_wheel(metadata_directory) + if prebuilt_whl: + shutil.copy2(prebuilt_whl, wheel_directory) + return os.path.basename(prebuilt_whl) + + return _build_backend().build_wheel(wheel_directory, config_settings, + metadata_directory) + + +def get_requires_for_build_sdist(config_settings): + """Invoke the optional get_requires_for_build_wheel hook + + Returns [] if the hook is not defined. + """ + backend = _build_backend() + try: + hook = backend.get_requires_for_build_sdist + except AttributeError: + return [] + else: + return hook(config_settings) + +class _DummyException(Exception): + """Nothing should ever raise this exception""" + +class GotUnsupportedOperation(Exception): + """For internal use when backend raises UnsupportedOperation""" + +def build_sdist(sdist_directory, config_settings): + """Invoke the mandatory build_sdist hook.""" + backend = _build_backend() + try: + return backend.build_sdist(sdist_directory, config_settings) + except getattr(backend, 'UnsupportedOperation', _DummyException): + raise GotUnsupportedOperation + +HOOK_NAMES = { + 'get_requires_for_build_wheel', + 'prepare_metadata_for_build_wheel', + 'build_wheel', + 'get_requires_for_build_sdist', + 'build_sdist', +} + +def main(): + if len(sys.argv) < 3: + sys.exit("Needs args: hook_name, control_dir") + hook_name = sys.argv[1] + control_dir = sys.argv[2] + if hook_name not in HOOK_NAMES: + sys.exit("Unknown hook: %s" % hook_name) + hook = globals()[hook_name] + + hook_input = compat.read_json(pjoin(control_dir, 'input.json')) + + json_out = {'unsupported': False, 'return_val': None} + try: + 
json_out['return_val'] = hook(**hook_input['kwargs']) + except GotUnsupportedOperation: + json_out['unsupported'] = True + + compat.write_json(json_out, pjoin(control_dir, 'output.json'), indent=2) + +if __name__ == '__main__': + main() diff --git a/lib/python3.7/site-packages/pip/_vendor/pep517/check.py b/lib/python3.7/site-packages/pip/_vendor/pep517/check.py new file mode 100644 index 0000000..c65d51c --- /dev/null +++ b/lib/python3.7/site-packages/pip/_vendor/pep517/check.py @@ -0,0 +1,194 @@ +"""Check a project and backend by attempting to build using PEP 517 hooks. +""" +import argparse +import logging +import os +from os.path import isfile, join as pjoin +from pip._vendor.pytoml import TomlError, load as toml_load +import shutil +from subprocess import CalledProcessError +import sys +import tarfile +from tempfile import mkdtemp +import zipfile + +from .colorlog import enable_colourful_output +from .envbuild import BuildEnvironment +from .wrappers import Pep517HookCaller + +log = logging.getLogger(__name__) + +def check_build_sdist(hooks): + with BuildEnvironment() as env: + try: + env.pip_install(hooks.build_sys_requires) + log.info('Installed static build dependencies') + except CalledProcessError: + log.error('Failed to install static build dependencies') + return False + + try: + reqs = hooks.get_requires_for_build_sdist({}) + log.info('Got build requires: %s', reqs) + except: + log.error('Failure in get_requires_for_build_sdist', exc_info=True) + return False + + try: + env.pip_install(reqs) + log.info('Installed dynamic build dependencies') + except CalledProcessError: + log.error('Failed to install dynamic build dependencies') + return False + + td = mkdtemp() + log.info('Trying to build sdist in %s', td) + try: + try: + filename = hooks.build_sdist(td, {}) + log.info('build_sdist returned %r', filename) + except: + log.info('Failure in build_sdist', exc_info=True) + return False + + if not filename.endswith('.tar.gz'): + log.error("Filename %s doesn't 
have .tar.gz extension", filename) + return False + + path = pjoin(td, filename) + if isfile(path): + log.info("Output file %s exists", path) + else: + log.error("Output file %s does not exist", path) + return False + + if tarfile.is_tarfile(path): + log.info("Output file is a tar file") + else: + log.error("Output file is not a tar file") + return False + + finally: + shutil.rmtree(td) + + return True + +def check_build_wheel(hooks): + with BuildEnvironment() as env: + try: + env.pip_install(hooks.build_sys_requires) + log.info('Installed static build dependencies') + except CalledProcessError: + log.error('Failed to install static build dependencies') + return False + + try: + reqs = hooks.get_requires_for_build_wheel({}) + log.info('Got build requires: %s', reqs) + except: + log.error('Failure in get_requires_for_build_sdist', exc_info=True) + return False + + try: + env.pip_install(reqs) + log.info('Installed dynamic build dependencies') + except CalledProcessError: + log.error('Failed to install dynamic build dependencies') + return False + + td = mkdtemp() + log.info('Trying to build wheel in %s', td) + try: + try: + filename = hooks.build_wheel(td, {}) + log.info('build_wheel returned %r', filename) + except: + log.info('Failure in build_wheel', exc_info=True) + return False + + if not filename.endswith('.whl'): + log.error("Filename %s doesn't have .whl extension", filename) + return False + + path = pjoin(td, filename) + if isfile(path): + log.info("Output file %s exists", path) + else: + log.error("Output file %s does not exist", path) + return False + + if zipfile.is_zipfile(path): + log.info("Output file is a zip file") + else: + log.error("Output file is not a zip file") + return False + + finally: + shutil.rmtree(td) + + return True + + +def check(source_dir): + pyproject = pjoin(source_dir, 'pyproject.toml') + if isfile(pyproject): + log.info('Found pyproject.toml') + else: + log.error('Missing pyproject.toml') + return False + + try: + with 
open(pyproject) as f: + pyproject_data = toml_load(f) + # Ensure the mandatory data can be loaded + buildsys = pyproject_data['build-system'] + requires = buildsys['requires'] + backend = buildsys['build-backend'] + log.info('Loaded pyproject.toml') + except (TomlError, KeyError): + log.error("Invalid pyproject.toml", exc_info=True) + return False + + hooks = Pep517HookCaller(source_dir, backend) + + sdist_ok = check_build_sdist(hooks) + wheel_ok = check_build_wheel(hooks) + + if not sdist_ok: + log.warning('Sdist checks failed; scroll up to see') + if not wheel_ok: + log.warning('Wheel checks failed') + + return sdist_ok + + +def main(argv=None): + ap = argparse.ArgumentParser() + ap.add_argument('source_dir', + help="A directory containing pyproject.toml") + args = ap.parse_args(argv) + + enable_colourful_output() + + ok = check(args.source_dir) + + if ok: + print(ansi('Checks passed', 'green')) + else: + print(ansi('Checks failed', 'red')) + sys.exit(1) + +ansi_codes = { + 'reset': '\x1b[0m', + 'bold': '\x1b[1m', + 'red': '\x1b[31m', + 'green': '\x1b[32m', +} +def ansi(s, attr): + if os.name != 'nt' and sys.stdout.isatty(): + return ansi_codes[attr] + str(s) + ansi_codes['reset'] + else: + return str(s) + +if __name__ == '__main__': + main() diff --git a/lib/python3.7/site-packages/pip/_vendor/pep517/colorlog.py b/lib/python3.7/site-packages/pip/_vendor/pep517/colorlog.py new file mode 100644 index 0000000..26cf748 --- /dev/null +++ b/lib/python3.7/site-packages/pip/_vendor/pep517/colorlog.py @@ -0,0 +1,110 @@ +"""Nicer log formatting with colours. + +Code copied from Tornado, Apache licensed. +""" +# Copyright 2012 Facebook +# +# Licensed under the Apache License, Version 2.0 (the "License"); you may +# not use this file except in compliance with the License. 
You may obtain +# a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT +# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the +# License for the specific language governing permissions and limitations +# under the License. + +import logging +import sys + +try: + import curses +except ImportError: + curses = None + +def _stderr_supports_color(): + color = False + if curses and hasattr(sys.stderr, 'isatty') and sys.stderr.isatty(): + try: + curses.setupterm() + if curses.tigetnum("colors") > 0: + color = True + except Exception: + pass + return color + +class LogFormatter(logging.Formatter): + """Log formatter with colour support + """ + DEFAULT_COLORS = { + logging.INFO: 2, # Green + logging.WARNING: 3, # Yellow + logging.ERROR: 1, # Red + logging.CRITICAL: 1, + } + + def __init__(self, color=True, datefmt=None): + r""" + :arg bool color: Enables color support. + :arg string fmt: Log message format. + It will be applied to the attributes dict of log records. The + text between ``%(color)s`` and ``%(end_color)s`` will be colored + depending on the level if color support is on. + :arg dict colors: color mappings from logging level to terminal color + code + :arg string datefmt: Datetime format. + Used for formatting ``(asctime)`` placeholder in ``prefix_fmt``. + .. versionchanged:: 3.2 + Added ``fmt`` and ``datefmt`` arguments. + """ + logging.Formatter.__init__(self, datefmt=datefmt) + self._colors = {} + if color and _stderr_supports_color(): + # The curses module has some str/bytes confusion in + # python3. Until version 3.2.3, most methods return + # bytes, but only accept strings. In addition, we want to + # output these strings with the logging module, which + # works with unicode strings. 
The explicit calls to + # unicode() below are harmless in python2 but will do the + # right conversion in python 3. + fg_color = (curses.tigetstr("setaf") or + curses.tigetstr("setf") or "") + if (3, 0) < sys.version_info < (3, 2, 3): + fg_color = str(fg_color, "ascii") + + for levelno, code in self.DEFAULT_COLORS.items(): + self._colors[levelno] = str(curses.tparm(fg_color, code), "ascii") + self._normal = str(curses.tigetstr("sgr0"), "ascii") + + scr = curses.initscr() + self.termwidth = scr.getmaxyx()[1] + curses.endwin() + else: + self._normal = '' + # Default width is usually 80, but too wide is worse than too narrow + self.termwidth = 70 + + def formatMessage(self, record): + l = len(record.message) + right_text = '{initial}-{name}'.format(initial=record.levelname[0], + name=record.name) + if l + len(right_text) < self.termwidth: + space = ' ' * (self.termwidth - (l + len(right_text))) + else: + space = ' ' + + if record.levelno in self._colors: + start_color = self._colors[record.levelno] + end_color = self._normal + else: + start_color = end_color = '' + + return record.message + space + start_color + right_text + end_color + +def enable_colourful_output(level=logging.INFO): + handler = logging.StreamHandler() + handler.setFormatter(LogFormatter()) + logging.root.addHandler(handler) + logging.root.setLevel(level) diff --git a/lib/python3.7/site-packages/pip/_vendor/pep517/compat.py b/lib/python3.7/site-packages/pip/_vendor/pep517/compat.py new file mode 100644 index 0000000..01c66fc --- /dev/null +++ b/lib/python3.7/site-packages/pip/_vendor/pep517/compat.py @@ -0,0 +1,23 @@ +"""Handle reading and writing JSON in UTF-8, on Python 3 and 2.""" +import json +import sys + +if sys.version_info[0] >= 3: + # Python 3 + def write_json(obj, path, **kwargs): + with open(path, 'w', encoding='utf-8') as f: + json.dump(obj, f, **kwargs) + + def read_json(path): + with open(path, 'r', encoding='utf-8') as f: + return json.load(f) + +else: + # Python 2 + def 
write_json(obj, path, **kwargs): + with open(path, 'wb') as f: + json.dump(obj, f, encoding='utf-8', **kwargs) + + def read_json(path): + with open(path, 'rb') as f: + return json.load(f) diff --git a/lib/python3.7/site-packages/pip/_vendor/pep517/envbuild.py b/lib/python3.7/site-packages/pip/_vendor/pep517/envbuild.py new file mode 100644 index 0000000..c264f46 --- /dev/null +++ b/lib/python3.7/site-packages/pip/_vendor/pep517/envbuild.py @@ -0,0 +1,150 @@ +"""Build wheels/sdists by installing build deps to a temporary environment. +""" + +import os +import logging +from pip._vendor import pytoml +import shutil +from subprocess import check_call +import sys +from sysconfig import get_paths +from tempfile import mkdtemp + +from .wrappers import Pep517HookCaller + +log = logging.getLogger(__name__) + +def _load_pyproject(source_dir): + with open(os.path.join(source_dir, 'pyproject.toml')) as f: + pyproject_data = pytoml.load(f) + buildsys = pyproject_data['build-system'] + return buildsys['requires'], buildsys['build-backend'] + + +class BuildEnvironment(object): + """Context manager to install build deps in a simple temporary environment + + Based on code I wrote for pip, which is MIT licensed. + """ + # Copyright (c) 2008-2016 The pip developers (see AUTHORS.txt file) + # + # Permission is hereby granted, free of charge, to any person obtaining + # a copy of this software and associated documentation files (the + # "Software"), to deal in the Software without restriction, including + # without limitation the rights to use, copy, modify, merge, publish, + # distribute, sublicense, and/or sell copies of the Software, and to + # permit persons to whom the Software is furnished to do so, subject to + # the following conditions: + # + # The above copyright notice and this permission notice shall be + # included in all copies or substantial portions of the Software. 
+ # + # THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, + # EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF + # MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND + # NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE + # LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION + # OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION + # WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. + + path = None + + def __init__(self, cleanup=True): + self._cleanup = cleanup + + def __enter__(self): + self.path = mkdtemp(prefix='pep517-build-env-') + log.info('Temporary build environment: %s', self.path) + + self.save_path = os.environ.get('PATH', None) + self.save_pythonpath = os.environ.get('PYTHONPATH', None) + + install_scheme = 'nt' if (os.name == 'nt') else 'posix_prefix' + install_dirs = get_paths(install_scheme, vars={ + 'base': self.path, + 'platbase': self.path, + }) + + scripts = install_dirs['scripts'] + if self.save_path: + os.environ['PATH'] = scripts + os.pathsep + self.save_path + else: + os.environ['PATH'] = scripts + os.pathsep + os.defpath + + if install_dirs['purelib'] == install_dirs['platlib']: + lib_dirs = install_dirs['purelib'] + else: + lib_dirs = install_dirs['purelib'] + os.pathsep + \ + install_dirs['platlib'] + if self.save_pythonpath: + os.environ['PYTHONPATH'] = lib_dirs + os.pathsep + \ + self.save_pythonpath + else: + os.environ['PYTHONPATH'] = lib_dirs + + return self + + def pip_install(self, reqs): + """Install dependencies into this env by calling pip in a subprocess""" + if not reqs: + return + log.info('Calling pip to install %s', reqs) + check_call([sys.executable, '-m', 'pip', 'install', '--ignore-installed', + '--prefix', self.path] + list(reqs)) + + def __exit__(self, exc_type, exc_val, exc_tb): + if self._cleanup and (self.path is not None) and os.path.isdir(self.path): + shutil.rmtree(self.path) + + if self.save_path is None: + 
os.environ.pop('PATH', None) + else: + os.environ['PATH'] = self.save_path + + if self.save_pythonpath is None: + os.environ.pop('PYTHONPATH', None) + else: + os.environ['PYTHONPATH'] = self.save_pythonpath + +def build_wheel(source_dir, wheel_dir, config_settings=None): + """Build a wheel from a source directory using PEP 517 hooks. + + :param str source_dir: Source directory containing pyproject.toml + :param str wheel_dir: Target directory to create wheel in + :param dict config_settings: Options to pass to build backend + + This is a blocking function which will run pip in a subprocess to install + build requirements. + """ + if config_settings is None: + config_settings = {} + requires, backend = _load_pyproject(source_dir) + hooks = Pep517HookCaller(source_dir, backend) + + with BuildEnvironment() as env: + env.pip_install(requires) + reqs = hooks.get_requires_for_build_wheel(config_settings) + env.pip_install(reqs) + return hooks.build_wheel(wheel_dir, config_settings) + + +def build_sdist(source_dir, sdist_dir, config_settings=None): + """Build an sdist from a source directory using PEP 517 hooks. + + :param str source_dir: Source directory containing pyproject.toml + :param str sdist_dir: Target directory to place sdist in + :param dict config_settings: Options to pass to build backend + + This is a blocking function which will run pip in a subprocess to install + build requirements. 
+ """ + if config_settings is None: + config_settings = {} + requires, backend = _load_pyproject(source_dir) + hooks = Pep517HookCaller(source_dir, backend) + + with BuildEnvironment() as env: + env.pip_install(requires) + reqs = hooks.get_requires_for_build_sdist(config_settings) + env.pip_install(reqs) + return hooks.build_sdist(sdist_dir, config_settings) diff --git a/lib/python3.7/site-packages/pip/_vendor/pep517/wrappers.py b/lib/python3.7/site-packages/pip/_vendor/pep517/wrappers.py new file mode 100644 index 0000000..28260f3 --- /dev/null +++ b/lib/python3.7/site-packages/pip/_vendor/pep517/wrappers.py @@ -0,0 +1,134 @@ +from contextlib import contextmanager +import os +from os.path import dirname, abspath, join as pjoin +import shutil +from subprocess import check_call +import sys +from tempfile import mkdtemp + +from . import compat + +_in_proc_script = pjoin(dirname(abspath(__file__)), '_in_process.py') + +@contextmanager +def tempdir(): + td = mkdtemp() + try: + yield td + finally: + shutil.rmtree(td) + +class UnsupportedOperation(Exception): + """May be raised by build_sdist if the backend indicates that it can't.""" + +class Pep517HookCaller(object): + """A wrapper around a source directory to be built with a PEP 517 backend. + + source_dir : The path to the source directory, containing pyproject.toml. + backend : The build backend spec, as per PEP 517, from pyproject.toml. + """ + def __init__(self, source_dir, build_backend): + self.source_dir = abspath(source_dir) + self.build_backend = build_backend + + def get_requires_for_build_wheel(self, config_settings=None): + """Identify packages required for building a wheel + + Returns a list of dependency specifications, e.g.: + ["wheel >= 0.25", "setuptools"] + + This does not include requirements specified in pyproject.toml. + It returns the result of calling the equivalently named hook in a + subprocess. 
+ """ + return self._call_hook('get_requires_for_build_wheel', { + 'config_settings': config_settings + }) + + def prepare_metadata_for_build_wheel(self, metadata_directory, config_settings=None): + """Prepare a *.dist-info folder with metadata for this project. + + Returns the name of the newly created folder. + + If the build backend defines a hook with this name, it will be called + in a subprocess. If not, the backend will be asked to build a wheel, + and the dist-info extracted from that. + """ + return self._call_hook('prepare_metadata_for_build_wheel', { + 'metadata_directory': abspath(metadata_directory), + 'config_settings': config_settings, + }) + + def build_wheel(self, wheel_directory, config_settings=None, metadata_directory=None): + """Build a wheel from this project. + + Returns the name of the newly created file. + + In general, this will call the 'build_wheel' hook in the backend. + However, if that was previously called by + 'prepare_metadata_for_build_wheel', and the same metadata_directory is + used, the previously built wheel will be copied to wheel_directory. + """ + if metadata_directory is not None: + metadata_directory = abspath(metadata_directory) + return self._call_hook('build_wheel', { + 'wheel_directory': abspath(wheel_directory), + 'config_settings': config_settings, + 'metadata_directory': metadata_directory, + }) + + def get_requires_for_build_sdist(self, config_settings=None): + """Identify packages required for building an sdist + + Returns a list of dependency specifications, e.g.: + ["setuptools >= 26"] + + This does not include requirements specified in pyproject.toml. + It returns the result of calling the equivalently named hook in a + subprocess. + """ + return self._call_hook('get_requires_for_build_sdist', { + 'config_settings': config_settings + }) + + def build_sdist(self, sdist_directory, config_settings=None): + """Build an sdist from this project. + + Returns the name of the newly created file.
+ + This calls the 'build_sdist' backend hook in a subprocess. + """ + return self._call_hook('build_sdist', { + 'sdist_directory': abspath(sdist_directory), + 'config_settings': config_settings, + }) + + + def _call_hook(self, hook_name, kwargs): + env = os.environ.copy() + + # On Python 2, pytoml returns Unicode values (which is correct) but the + # environment passed to check_call needs to contain string values. We + # convert here by encoding using ASCII (the backend can only contain + # letters, digits and _, . and : characters, and will be used as a + # Python identifier, so non-ASCII content is wrong on Python 2 in + # any case). + if sys.version_info[0] == 2: + build_backend = self.build_backend.encode('ASCII') + else: + build_backend = self.build_backend + + env['PEP517_BUILD_BACKEND'] = build_backend + with tempdir() as td: + compat.write_json({'kwargs': kwargs}, pjoin(td, 'input.json'), + indent=2) + + # Run the hook in a subprocess + check_call([sys.executable, _in_proc_script, hook_name, td], + cwd=self.source_dir, env=env) + + data = compat.read_json(pjoin(td, 'output.json')) + if data.get('unsupported'): + raise UnsupportedOperation + return data['return_val'] + diff --git a/lib/python3.7/site-packages/pkg_resources-0.0.0.dist-info/AUTHORS.txt b/lib/python3.7/site-packages/pkg_resources-0.0.0.dist-info/AUTHORS.txt new file mode 100644 index 0000000..e845ac7 --- /dev/null +++ b/lib/python3.7/site-packages/pkg_resources-0.0.0.dist-info/AUTHORS.txt @@ -0,0 +1,421 @@ +Adam Chainz +Adam Wentz +Adrien Morison +Alan Yee +Aleks Bunin +Alex Gaynor +Alex Grönholm +Alex Morega +Alex Stachowiak +Alexander Shtyrov +Alexandre Conrad +Alli +Anatoly Techtonik +Andrei Geacar +Andrew Gaul +Andrey Bulgakov +Andrés Delfino <34587441+andresdelfino@users.noreply.github.com> +Andrés Delfino +Andy Freeland +Andy Kluger +Anish Tambe +Anrs Hu +Anthony Sottile +Antoine Musso +Anton Ovchinnikov +Anton Patrushev +Antonio Alvarado Hernandez +Antony Lee +Antti Kaihola +Anubhav 
Patel +Anuj Godase +AQNOUCH Mohammed +AraHaan +Arindam Choudhury +Armin Ronacher +Ashley Manton +Atsushi Odagiri +Avner Cohen +Baptiste Mispelon +Barney Gale +barneygale +Bartek Ogryczak +Bastian Venthur +Ben Darnell +Ben Hoyt +Ben Rosser +Bence Nagy +Benjamin VanEvery +Benoit Pierre +Berker Peksag +Bernardo B. Marques +Bernhard M. Wiedemann +Bogdan Opanchuk +Brad Erickson +Bradley Ayers +Brandon L. Reiss +Brett Randall +Brian Rosner +BrownTruck +Bruno Oliveira +Bruno Renié +Bstrdsmkr +Buck Golemon +burrows +Bussonnier Matthias +c22 +Calvin Smith +Carl Meyer +Carlos Liam +Carol Willing +Carter Thayer +Cass +Chandrasekhar Atina +Chris Brinker +Chris Jerdonek +Chris McDonough +Chris Wolfe +Christian Heimes +Christian Oudard +Christopher Snyder +Clark Boylan +Clay McClure +Cody +Cody Soyland +Colin Watson +Connor Osborn +Cooper Lees +Cooper Ry Lees +Cory Benfield +Cory Wright +Craig Kerstiens +Cristian Sorinel +Curtis Doty +Damian Quiroga +Dan Black +Dan Savilonis +Dan Sully +daniel +Daniel Collins +Daniel Hahler +Daniel Holth +Daniel Jost +Daniel Shaulov +Daniele Procida +Danny Hermes +Dav Clark +Dave Abrahams +David Aguilar +David Black +David Caro +David Evans +David Linke +David Pursehouse +David Tucker +David Wales +Davidovich +derwolfe +Dmitry Gladkov +Domen Kožar +Donald Stufft +Dongweiming +Douglas Thor +DrFeathers +Dustin Ingram +Dwayne Bailey +Ed Morley <501702+edmorley@users.noreply.github.com> +Ed Morley +Eli Schwartz +Emil Styrke +Endoh Takanao +enoch +Eric Gillingham +Eric Hanchrow +Eric Hopper +Erik M. Bray +Erik Rose +Ernest W Durbin III +Ernest W. 
Durbin III +Erwin Janssen +Eugene Vereshchagin +fiber-space +Filip Kokosiński +Florian Briand +Francesco +Francesco Montesano +Gabriel Curio +Gabriel de Perthuis +Garry Polley +gdanielson +Geoffrey Lehée +Geoffrey Sneddon +George Song +Georgi Valkov +Giftlin Rajaiah +gizmoguy1 +gkdoc <40815324+gkdoc@users.noreply.github.com> +GOTO Hayato <3532528+gh640@users.noreply.github.com> +Guilherme Espada +Guy Rozendorn +Hari Charan +Herbert Pfennig +Hsiaoming Yang +Hugo +Hugo Lopes Tavares +hugovk +Hynek Schlawack +Ian Bicking +Ian Cordasco +Ian Lee +Ian Stapleton Cordasco +Ian Wienand +Ian Wienand +Igor Kuzmitshov +Igor Sobreira +Ilya Baryshev +INADA Naoki +Ionel Cristian Mărieș +Ionel Maries Cristian +Jakub Stasiak +Jakub Vysoky +Jakub Wilk +James Cleveland +James Cleveland +James Firth +James Polley +Jan Pokorný +Jannis Leidel +jarondl +Jason R. Coombs +Jay Graves +Jean-Christophe Fillion-Robin +Jeff Barber +Jeff Dairiki +Jeremy Stanley +Jeremy Zafran +Jim Garrison +Jivan Amara +John-Scott Atlakson +Jon Banafato +Jon Dufresne +Jon Parise +Jon Wayne Parrott +Jonas Nockert +Jonathan Herbert +Joost Molenaar +Jorge Niedbalski +Joseph Long +Josh Bronson +Josh Hansen +Josh Schneier +Julien Demoor +jwg4 +Jyrki Pulliainen +Kamal Bin Mustafa +kaustav haldar +keanemind +Kelsey Hightower +Kenneth Belitzky +Kenneth Reitz +Kenneth Reitz +Kevin Burke +Kevin Carter +Kevin Frommelt +Kexuan Sun +Kit Randel +kpinc +Kumar McMillan +Kyle Persohn +Laurent Bristiel +Laurie Opperman +Leon Sasson +Lev Givon +Lincoln de Sousa +Lipis +Loren Carvalho +Lucas Cimon +Ludovic Gasc +Luke Macken +Luo Jiebin +luojiebin +luz.paz +Marc Abramowitz +Marc Tamlyn +Marcus Smith +Mariatta +Mark Kohler +Markus Hametner +Masklinn +Matej Stuchlik +Mathew Jennings +Mathieu Bridon +Matt Good +Matt Maker +Matt Robenolt +matthew +Matthew Einhorn +Matthew Gilliard +Matthew Iversen +Matthew Trumbell +Matthew Willson +Matthias Bussonnier +mattip +Maxim Kurnikov +Maxime Rouyrre +memoselyk +Michael +Michael Aquilina 
+Michael E. Karpeles +Michael Klich +Michael Williamson +michaelpacer +Mickaël Schoentgen +Miguel Araujo Perez +Mihir Singh +Min RK +MinRK +Miro Hrončok +montefra +Monty Taylor +Nate Coraor +Nathaniel J. Smith +Nehal J Wani +Nick Coghlan +Nick Stenning +Nikhil Benesch +Nitesh Sharma +Nowell Strite +nvdv +Ofekmeister +Oliver Jeeves +Oliver Tonnhofer +Olivier Girardot +Olivier Grisel +Ollie Rutherfurd +OMOTO Kenji +Oren Held +Oscar Benjamin +Oz N Tiram +Patrick Dubroy +Patrick Jenkins +Patrick Lawson +patricktokeeffe +Paul Kehrer +Paul Moore +Paul Nasrat +Paul Oswald +Paul van der Linden +Paulus Schoutsen +Pawel Jasinski +Pekka Klärck +Peter Waller +Phaneendra Chiruvella +Phil Freo +Phil Pennock +Phil Whelan +Philip Molloy +Philippe Ombredanne +Pi Delport +Pierre-Yves Rofes +pip +Pradyun Gedam +Pratik Mallya +Preston Holmes +Przemek Wrzos +Qiangning Hong +R. David Murray +Rafael Caricio +Ralf Schmitt +Razzi Abuissa +Remi Rampin +Rene Dudfield +Richard Jones +RobberPhex +Robert Collins +Robert McGibbon +Robert T. McGibbon +Roey Berman +Rohan Jain +Rohan Jain +Rohan Jain +Roman Bogorodskiy +Romuald Brunet +Ronny Pfannschmidt +Rory McCann +Ross Brattain +Roy Wellington Ⅳ +Roy Wellington Ⅳ +Ryan Wooden +ryneeverett +Sachi King +Salvatore Rinchiera +schlamar +Scott Kitterman +seanj +Sebastian Schaetz +Segev Finer +Sergey Vasilyev +Seth Woodworth +Shlomi Fish +Simeon Visser +Simon Cross +Simon Pichugin +Sorin Sbarnea +Stavros Korokithakis +Stefan Scherfke +Stephan Erb +stepshal +Steve (Gadget) Barnes +Steve Barnes +Steve Kowalik +Steven Myint +stonebig +Stéphane Bidoul (ACSONE) +Stéphane Bidoul +Stéphane Klein +Takayuki SHIMIZUKAWA +Thijs Triemstra +Thomas Fenzl +Thomas Grainger +Thomas Guettler +Thomas Johansson +Thomas Kluyver +Thomas Smith +Tim D. 
Smith +Tim Harder +Tim Heap +tim smith +tinruufu +Tom Freudenheim +Tom V +Tomer Chachamu +Tony Zhaocheng Tan +Toshio Kuratomi +Travis Swicegood +Tzu-ping Chung +Valentin Haenel +Victor Stinner +Viktor Szépe +Ville Skyttä +Vinay Sajip +Vincent Philippon +Vitaly Babiy +Vladimir Rutsky +W. Trevor King +Wil Tan +Wilfred Hughes +William ML Leslie +Wolfgang Maier +Xavier Fernandez +Xavier Fernandez +xoviat +YAMAMOTO Takashi +Yen Chi Hsuan +Yoval P +Yu Jian +Zearin +Zearin +Zhiping Deng +Zvezdan Petkovic +Łukasz Langa +Семён Марьясин diff --git a/lib/python3.4/site-packages/ed25519-1.4.dist-info/INSTALLER b/lib/python3.7/site-packages/pkg_resources-0.0.0.dist-info/INSTALLER similarity index 100% rename from lib/python3.4/site-packages/ed25519-1.4.dist-info/INSTALLER rename to lib/python3.7/site-packages/pkg_resources-0.0.0.dist-info/INSTALLER diff --git a/lib/python3.7/site-packages/pkg_resources-0.0.0.dist-info/LICENSE.txt b/lib/python3.7/site-packages/pkg_resources-0.0.0.dist-info/LICENSE.txt new file mode 100644 index 0000000..d3379fa --- /dev/null +++ b/lib/python3.7/site-packages/pkg_resources-0.0.0.dist-info/LICENSE.txt @@ -0,0 +1,20 @@ +Copyright (c) 2008-2018 The pip developers (see AUTHORS.txt file) + +Permission is hereby granted, free of charge, to any person obtaining +a copy of this software and associated documentation files (the +"Software"), to deal in the Software without restriction, including +without limitation the rights to use, copy, modify, merge, publish, +distribute, sublicense, and/or sell copies of the Software, and to +permit persons to whom the Software is furnished to do so, subject to +the following conditions: + +The above copyright notice and this permission notice shall be +included in all copies or substantial portions of the Software. 
+ +THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, +EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF +MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND +NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE +LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION +OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION +WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. diff --git a/lib/python3.4/site-packages/pkg_resources-0.0.0.dist-info/METADATA b/lib/python3.7/site-packages/pkg_resources-0.0.0.dist-info/METADATA similarity index 87% rename from lib/python3.4/site-packages/pkg_resources-0.0.0.dist-info/METADATA rename to lib/python3.7/site-packages/pkg_resources-0.0.0.dist-info/METADATA index 7a50487..cf6c930 100644 --- a/lib/python3.4/site-packages/pkg_resources-0.0.0.dist-info/METADATA +++ b/lib/python3.7/site-packages/pkg_resources-0.0.0.dist-info/METADATA @@ -1,4 +1,4 @@ -Metadata-Version: 2.0 +Metadata-Version: 2.1 Name: pkg_resources Version: 0.0.0 Summary: UNKNOWN diff --git a/lib/python3.7/site-packages/pkg_resources-0.0.0.dist-info/RECORD b/lib/python3.7/site-packages/pkg_resources-0.0.0.dist-info/RECORD new file mode 100644 index 0000000..7c72d6a --- /dev/null +++ b/lib/python3.7/site-packages/pkg_resources-0.0.0.dist-info/RECORD @@ -0,0 +1,38 @@ +pkg_resources-0.0.0.dist-info/AUTHORS.txt,sha256=Pu4WdZapZ2U2wKwWxd830ZxnROCHwmV_TpWoL9dqJ-M,15880 +pkg_resources-0.0.0.dist-info/INSTALLER,sha256=zuuue4knoyJ-UwPPXg8fezS7VCrXJQrAP7zeNuwvFQg,4 +pkg_resources-0.0.0.dist-info/LICENSE.txt,sha256=ORqHhOMZ2uVDFHfUzJvFBPxdcf2eieHIDxzThV9dfPo,1090 +pkg_resources-0.0.0.dist-info/METADATA,sha256=V9_WPOtD1FnuKrTGv6Ique7kAOn2lasvT8W0_iMCCCk,177 +pkg_resources-0.0.0.dist-info/RECORD,, +pkg_resources-0.0.0.dist-info/WHEEL,sha256=_wJFdOYk7i3xxT8ElOkUJvOdOvfNGbR9g-bf6UQT6sU,110 +pkg_resources/__init__.py,sha256=1CH-AzmMwXmdx_7bCm03hV11azPdW64rzVum2ylDE7k,104406 
+pkg_resources/__pycache__/__init__.cpython-37.pyc,, +pkg_resources/__pycache__/py31compat.cpython-37.pyc,, +pkg_resources/_vendor/__init__.py,sha256=47DEQpj8HBSa-_TImW-5JCeuQeRkm5NMpJWZG3hSuFU,0 +pkg_resources/_vendor/__pycache__/__init__.cpython-37.pyc,, +pkg_resources/_vendor/__pycache__/appdirs.cpython-37.pyc,, +pkg_resources/_vendor/__pycache__/pyparsing.cpython-37.pyc,, +pkg_resources/_vendor/__pycache__/six.cpython-37.pyc,, +pkg_resources/_vendor/appdirs.py,sha256=MievUEuv3l_mQISH5SF0shDk_BNhHHzYiAPrT3ITN4I,24701 +pkg_resources/_vendor/packaging/__about__.py,sha256=zkcCPTN_6TcLW0Nrlg0176-R1QQ_WVPTm8sz1R4-HjM,720 +pkg_resources/_vendor/packaging/__init__.py,sha256=_vNac5TrzwsrzbOFIbF-5cHqc_Y2aPT2D7zrIR06BOo,513 +pkg_resources/_vendor/packaging/__pycache__/__about__.cpython-37.pyc,, +pkg_resources/_vendor/packaging/__pycache__/__init__.cpython-37.pyc,, +pkg_resources/_vendor/packaging/__pycache__/_compat.cpython-37.pyc,, +pkg_resources/_vendor/packaging/__pycache__/_structures.cpython-37.pyc,, +pkg_resources/_vendor/packaging/__pycache__/markers.cpython-37.pyc,, +pkg_resources/_vendor/packaging/__pycache__/requirements.cpython-37.pyc,, +pkg_resources/_vendor/packaging/__pycache__/specifiers.cpython-37.pyc,, +pkg_resources/_vendor/packaging/__pycache__/utils.cpython-37.pyc,, +pkg_resources/_vendor/packaging/__pycache__/version.cpython-37.pyc,, +pkg_resources/_vendor/packaging/_compat.py,sha256=Vi_A0rAQeHbU-a9X0tt1yQm9RqkgQbDSxzRw8WlU9kA,860 +pkg_resources/_vendor/packaging/_structures.py,sha256=RImECJ4c_wTlaTYYwZYLHEiebDMaAJmK1oPARhw1T5o,1416 +pkg_resources/_vendor/packaging/markers.py,sha256=uEcBBtGvzqltgnArqb9c4RrcInXezDLos14zbBHhWJo,8248 +pkg_resources/_vendor/packaging/requirements.py,sha256=SikL2UynbsT0qtY9ltqngndha_sfo0w6XGFhAhoSoaQ,4355 +pkg_resources/_vendor/packaging/specifiers.py,sha256=SAMRerzO3fK2IkFZCaZkuwZaL_EGqHNOz4pni4vhnN0,28025 +pkg_resources/_vendor/packaging/utils.py,sha256=3m6WvPm6NNxE8rkTGmn0r75B_GZSGg7ikafxHsBN1WA,421 
+pkg_resources/_vendor/packaging/version.py,sha256=OwGnxYfr2ghNzYx59qWIBkrK3SnB6n-Zfd1XaLpnnM0,11556 +pkg_resources/_vendor/pyparsing.py,sha256=tmrp-lu-qO1i75ZzIN5A12nKRRD1Cm4Vpk-5LR9rims,232055 +pkg_resources/_vendor/six.py,sha256=A6hdJZVjI3t_geebZ9BzUvwRrIXo0lfwzQlM2LcKyas,30098 +pkg_resources/extern/__init__.py,sha256=cHiEfHuLmm6rs5Ve_ztBfMI7Lr31vss-D4wkqF5xzlI,2498 +pkg_resources/extern/__pycache__/__init__.cpython-37.pyc,, +pkg_resources/py31compat.py,sha256=-WQ0e4c3RG_acdhwC3gLiXhP_lg4G5q7XYkZkQg0gxU,558 diff --git a/lib/python3.4/site-packages/setuptools-36.6.0.dist-info/WHEEL b/lib/python3.7/site-packages/pkg_resources-0.0.0.dist-info/WHEEL similarity index 70% rename from lib/python3.4/site-packages/setuptools-36.6.0.dist-info/WHEEL rename to lib/python3.7/site-packages/pkg_resources-0.0.0.dist-info/WHEEL index 7332a41..c4bde30 100644 --- a/lib/python3.4/site-packages/setuptools-36.6.0.dist-info/WHEEL +++ b/lib/python3.7/site-packages/pkg_resources-0.0.0.dist-info/WHEEL @@ -1,5 +1,5 @@ Wheel-Version: 1.0 -Generator: bdist_wheel (0.30.0) +Generator: bdist_wheel (0.32.3) Root-Is-Purelib: true Tag: py2-none-any Tag: py3-none-any diff --git a/lib/python3.4/site-packages/pkg_resources/__init__.py b/lib/python3.7/site-packages/pkg_resources/__init__.py similarity index 88% rename from lib/python3.4/site-packages/pkg_resources/__init__.py rename to lib/python3.7/site-packages/pkg_resources/__init__.py index 4c9868c..7413470 100644 --- a/lib/python3.4/site-packages/pkg_resources/__init__.py +++ b/lib/python3.7/site-packages/pkg_resources/__init__.py @@ -34,9 +34,11 @@ import platform import collections import plistlib import email.parser +import errno import tempfile import textwrap import itertools +import inspect from pkgutil import get_importer try: @@ -45,6 +47,11 @@ except ImportError: # Python 3.2 compatibility import imp as _imp +try: + FileExistsError +except NameError: + FileExistsError = OSError + from pkg_resources.extern import six from 
pkg_resources.extern.six.moves import urllib, map, filter @@ -67,6 +74,7 @@ try: except ImportError: importlib_machinery = None +from . import py31compat from pkg_resources.extern import appdirs from pkg_resources.extern import packaging __import__('pkg_resources.extern.packaging.version') @@ -74,13 +82,37 @@ __import__('pkg_resources.extern.packaging.specifiers') __import__('pkg_resources.extern.packaging.requirements') __import__('pkg_resources.extern.packaging.markers') -if (3, 0) < sys.version_info < (3, 3): - raise RuntimeError("Python 3.3 or later is required") + +__metaclass__ = type + + +if (3, 0) < sys.version_info < (3, 4): + raise RuntimeError("Python 3.4 or later is required") + +if six.PY2: + # Those builtin exceptions are only defined in Python 3 + PermissionError = None + NotADirectoryError = None # declare some globals that will be defined later to # satisfy the linters. require = None working_set = None +add_activation_listener = None +resources_stream = None +cleanup_resources = None +resource_dir = None +resource_stream = None +set_extraction_path = None +resource_isdir = None +resource_string = None +iter_entry_points = None +resource_listdir = None +resource_filename = None +resource_exists = None +_distribution_finders = None +_namespace_handlers = None +_namespace_packages = None class PEP440Warning(RuntimeWarning): @@ -90,118 +122,11 @@ class PEP440Warning(RuntimeWarning): """ -class _SetuptoolsVersionMixin(object): - def __hash__(self): - return super(_SetuptoolsVersionMixin, self).__hash__() - - def __lt__(self, other): - if isinstance(other, tuple): - return tuple(self) < other - else: - return super(_SetuptoolsVersionMixin, self).__lt__(other) - - def __le__(self, other): - if isinstance(other, tuple): - return tuple(self) <= other - else: - return super(_SetuptoolsVersionMixin, self).__le__(other) - - def __eq__(self, other): - if isinstance(other, tuple): - return tuple(self) == other - else: - return super(_SetuptoolsVersionMixin, 
self).__eq__(other) - - def __ge__(self, other): - if isinstance(other, tuple): - return tuple(self) >= other - else: - return super(_SetuptoolsVersionMixin, self).__ge__(other) - - def __gt__(self, other): - if isinstance(other, tuple): - return tuple(self) > other - else: - return super(_SetuptoolsVersionMixin, self).__gt__(other) - - def __ne__(self, other): - if isinstance(other, tuple): - return tuple(self) != other - else: - return super(_SetuptoolsVersionMixin, self).__ne__(other) - - def __getitem__(self, key): - return tuple(self)[key] - - def __iter__(self): - component_re = re.compile(r'(\d+ | [a-z]+ | \.| -)', re.VERBOSE) - replace = { - 'pre': 'c', - 'preview': 'c', - '-': 'final-', - 'rc': 'c', - 'dev': '@', - }.get - - def _parse_version_parts(s): - for part in component_re.split(s): - part = replace(part, part) - if not part or part == '.': - continue - if part[:1] in '0123456789': - # pad for numeric comparison - yield part.zfill(8) - else: - yield '*' + part - - # ensure that alpha/beta/candidate are before final - yield '*final' - - def old_parse_version(s): - parts = [] - for part in _parse_version_parts(s.lower()): - if part.startswith('*'): - # remove '-' before a prerelease tag - if part < '*final': - while parts and parts[-1] == '*final-': - parts.pop() - # remove trailing zeros from each series of numeric parts - while parts and parts[-1] == '00000000': - parts.pop() - parts.append(part) - return tuple(parts) - - # Warn for use of this function - warnings.warn( - "You have iterated over the result of " - "pkg_resources.parse_version. This is a legacy behavior which is " - "inconsistent with the new version class introduced in setuptools " - "8.0. In most cases, conversion to a tuple is unnecessary. For " - "comparison of versions, sort the Version instances directly. 
If " - "you have another use case requiring the tuple, please file a " - "bug with the setuptools project describing that need.", - RuntimeWarning, - stacklevel=1, - ) - - for part in old_parse_version(str(self)): - yield part - - -class SetuptoolsVersion(_SetuptoolsVersionMixin, packaging.version.Version): - pass - - -class SetuptoolsLegacyVersion(_SetuptoolsVersionMixin, - packaging.version.LegacyVersion): - pass - - def parse_version(v): try: - return SetuptoolsVersion(v) + return packaging.version.Version(v) except packaging.version.InvalidVersion: - return SetuptoolsLegacyVersion(v) + return packaging.version.LegacyVersion(v) _state_vars = {} @@ -460,19 +385,17 @@ def get_build_platform(): XXX Currently this is the same as ``distutils.util.get_platform()``, but it needs some hacks for Linux and Mac OS X. """ - try: - # Python 2.7 or >=3.2 - from sysconfig import get_platform - except ImportError: - from distutils.util import get_platform + from sysconfig import get_platform plat = get_platform() if sys.platform == "darwin" and not plat.startswith('macosx-'): try: version = _macosx_vers() machine = os.uname()[4].replace(" ", "_") - return "macosx-%d.%d-%s" % (int(version[0]), int(version[1]), - _macosx_arch(machine)) + return "macosx-%d.%d-%s" % ( + int(version[0]), int(version[1]), + _macosx_arch(machine), + ) except ValueError: # if someone is running a non-Mac darwin system, this will fall # through to the default implementation @@ -622,7 +545,7 @@ class IResourceProvider(IMetadataProvider): """List of resource names in the directory (like ``os.listdir()``)""" -class WorkingSet(object): +class WorkingSet: """A collection of active distributions on sys.path (or a similar list)""" def __init__(self, entries=None): @@ -722,13 +645,12 @@ class WorkingSet(object): distributions in the working set, otherwise only ones matching both `group` and `name` are yielded (in distribution order). 
""" - for dist in self: - entries = dist.get_entry_map(group) - if name is None: - for ep in entries.values(): - yield ep - elif name in entries: - yield entries[name] + return ( + entry + for dist in self + for entry in dist.get_entry_map(group).values() + if name is None or name == entry.name + ) def run_script(self, requires, script_name): """Locate distribution for `requires` and run `script_name` script""" @@ -786,7 +708,7 @@ class WorkingSet(object): self._added_new(dist) def resolve(self, requirements, env=None, installer=None, - replace_conflicting=False): + replace_conflicting=False, extras=None): """List all distributions needed to (recursively) meet `requirements` `requirements` must be a sequence of ``Requirement`` objects. `env`, @@ -797,11 +719,18 @@ class WorkingSet(object): already-installed distribution; it should return a ``Distribution`` or ``None``. - Unless `replace_conflicting=True`, raises a VersionConflict exception if + Unless `replace_conflicting=True`, raises a VersionConflict exception + if any requirements are found on the path that have the correct name but the wrong version. Otherwise, if an `installer` is supplied it will be invoked to obtain the correct version of the requirement and activate it. + + `extras` is a list of the extras to be used with these requirements. + This is important because extra requirements may look like `my_req; + extra = "my_extra"`, which would otherwise be interpreted as a purely + optional requirement. Instead, we want to be able to assert that these + requirements are truly required. 
""" # set up the stack @@ -825,7 +754,7 @@ class WorkingSet(object): # Ignore cyclic or redundant dependencies continue - if not req_extras.markers_pass(req): + if not req_extras.markers_pass(req, extras): continue dist = best.get(req.key) @@ -843,7 +772,10 @@ class WorkingSet(object): # distribution env = Environment([]) ws = WorkingSet([]) - dist = best[req.key] = env.best_match(req, ws, installer) + dist = best[req.key] = env.best_match( + req, ws, installer, + replace_conflicting=replace_conflicting + ) if dist is None: requirers = required_by.get(req, None) raise DistributionNotFound(req, requirers) @@ -867,8 +799,8 @@ class WorkingSet(object): # return list of distros to activate return to_activate - def find_plugins(self, plugin_env, full_env=None, installer=None, - fallback=True): + def find_plugins( + self, plugin_env, full_env=None, installer=None, fallback=True): """Find all activatable distributions in `plugin_env` Example usage:: @@ -1004,7 +936,7 @@ class _ReqExtras(dict): Map each requirement to the extras that demanded it. """ - def markers_pass(self, req): + def markers_pass(self, req, extras=None): """ Evaluate markers for req against each extra that demanded it. @@ -1014,15 +946,16 @@ class _ReqExtras(dict): """ extra_evals = ( req.marker.evaluate({'extra': extra}) - for extra in self.get(req, ()) + (None,) + for extra in self.get(req, ()) + (extras or (None,)) ) return not req.marker or any(extra_evals) -class Environment(object): +class Environment: """Searchable snapshot of distributions on a search path""" - def __init__(self, search_path=None, platform=get_supported_platform(), + def __init__( + self, search_path=None, platform=get_supported_platform(), python=PY_MAJOR): """Snapshot distributions available on a search path @@ -1033,7 +966,7 @@ class Environment(object): `platform` is an optional string specifying the name of the platform that platform-specific distributions must be compatible with. 
If unspecified, it defaults to the current platform. `python` is an - optional string naming the desired version of Python (e.g. ``'3.3'``); + optional string naming the desired version of Python (e.g. ``'3.6'``); it defaults to the current version. You may explicitly set `platform` (and/or `python`) to ``None`` if you @@ -1052,9 +985,12 @@ class Environment(object): requirements specified when this environment was created, or False is returned. """ - return (self.python is None or dist.py_version is None - or dist.py_version == self.python) \ - and compatible_platforms(dist.platform, self.platform) + py_compat = ( + self.python is None + or dist.py_version is None + or dist.py_version == self.python + ) + return py_compat and compatible_platforms(dist.platform, self.platform) def remove(self, dist): """Remove `dist` from the environment""" @@ -1095,7 +1031,8 @@ class Environment(object): dists.append(dist) dists.sort(key=operator.attrgetter('hashcmp'), reverse=True) - def best_match(self, req, working_set, installer=None): + def best_match( + self, req, working_set, installer=None, replace_conflicting=False): """Find distribution best matching `req` and usable on `working_set` This calls the ``find(req)`` method of the `working_set` to see if a @@ -1108,7 +1045,12 @@ class Environment(object): calling the environment's ``obtain(req, installer)`` method will be returned. 
""" - dist = working_set.find(req) + try: + dist = working_set.find(req) + except VersionConflict: + if not replace_conflicting: + raise + dist = None if dist is not None: return dist for dist in self[req.key]: @@ -1225,8 +1167,8 @@ class ResourceManager: tmpl = textwrap.dedent(""" Can't extract file(s) to egg cache - The following error occurred while trying to extract file(s) to the Python egg - cache: + The following error occurred while trying to extract file(s) + to the Python egg cache: {old_exc} @@ -1234,9 +1176,9 @@ class ResourceManager: {cache_path} - Perhaps your account does not have write access to this directory? You can - change the cache directory by setting the PYTHON_EGG_CACHE environment - variable to point to an accessible directory. + Perhaps your account does not have write access to this directory? + You can change the cache directory by setting the PYTHON_EGG_CACHE + environment variable to point to an accessible directory. """).lstrip() err = ExtractionError(tmpl.format(**locals())) err.manager = self @@ -1261,7 +1203,7 @@ class ResourceManager: target_path = os.path.join(extract_path, archive_name + '-tmp', *names) try: _bypass_ensure_directory(target_path) - except: + except Exception: self.extraction_error() self._warn_unsafe_extraction_path(extract_path) @@ -1286,11 +1228,13 @@ class ResourceManager: return mode = os.stat(path).st_mode if mode & stat.S_IWOTH or mode & stat.S_IWGRP: - msg = ("%s is writable by group/others and vulnerable to attack " + msg = ( + "%s is writable by group/others and vulnerable to attack " "when " "used with get_resource_filename. Consider a more secure " "location (set with .set_extraction_path or the " - "PYTHON_EGG_CACHE environment variable)." % path) + "PYTHON_EGG_CACHE environment variable)." 
% path + ) warnings.warn(msg, UserWarning) def postprocess(self, tempname, filename): @@ -1483,7 +1427,10 @@ class NullProvider: def run_script(self, script_name, namespace): script = 'scripts/' + script_name if not self.has_metadata(script): - raise ResolutionError("No script named %r" % script_name) + raise ResolutionError( + "Script {script!r} not found in metadata at {self.egg_info!r}" + .format(**locals()), + ) script_text = self.get_metadata(script).replace('\r\n', '\n') script_text = script_text.replace('\r', '\n') script_filename = self._fn(self.egg_info, script) @@ -1544,7 +1491,7 @@ class EggProvider(NullProvider): path = self.module_path old = None while path != old: - if _is_unpacked_egg(path): + if _is_egg_path(path): self.egg_name = os.path.basename(path) self.egg_info = os.path.join(path, 'EGG-INFO') self.egg_root = path @@ -1574,9 +1521,10 @@ class DefaultProvider(EggProvider): @classmethod def _register(cls): - loader_cls = getattr(importlib_machinery, 'SourceFileLoader', - type(None)) - register_loader_type(loader_cls, cls) + loader_names = 'SourceFileLoader', 'SourcelessFileLoader', + for name in loader_names: + loader_cls = getattr(importlib_machinery, name, type(None)) + register_loader_type(loader_cls, cls) DefaultProvider._register() @@ -1585,11 +1533,16 @@ DefaultProvider._register() class EmptyProvider(NullProvider): """Provider that returns nothing for all requests""" - _isdir = _has = lambda self, path: False - _get = lambda self, path: '' - _listdir = lambda self, path: [] module_path = None + _isdir = _has = lambda self, path: False + + def _get(self, path): + return '' + + def _listdir(self, path): + return [] + def __init__(self): pass @@ -1611,7 +1564,7 @@ class ZipManifests(dict): Use a platform-specific path separator (os.sep) for the path keys for compatibility with pypy on Windows. 
""" - with ContextualZipFile(path) as zfile: + with zipfile.ZipFile(path) as zfile: items = ( ( name.replace('/', os.sep), @@ -1644,26 +1597,6 @@ class MemoizedZipManifests(ZipManifests): return self[path].manifest -class ContextualZipFile(zipfile.ZipFile): - """ - Supplement ZipFile class to support context manager for Python 2.6 - """ - - def __enter__(self): - return self - - def __exit__(self, type, value, traceback): - self.close() - - def __new__(cls, *args, **kwargs): - """ - Construct a ZipFile or ContextualZipFile as appropriate - """ - if hasattr(zipfile.ZipFile, '__exit__'): - return zipfile.ZipFile(*args, **kwargs) - return super(ContextualZipFile, cls).__new__(cls) - - class ZipProvider(EggProvider): """Resource support for zips and eggs""" @@ -1677,6 +1610,9 @@ class ZipProvider(EggProvider): def _zipinfo_name(self, fspath): # Convert a virtual filename (full path to file) into a zipfile subpath # usable with the zipimport directory cache for our target archive + fspath = fspath.rstrip(os.sep) + if fspath == self.loader.archive: + return '' if fspath.startswith(self.zip_pre): return fspath[len(self.zip_pre):] raise AssertionError( @@ -1743,7 +1679,10 @@ class ZipProvider(EggProvider): if self._is_current(real_path, zip_path): return real_path - outf, tmpnam = _mkstemp(".$extract", dir=os.path.dirname(real_path)) + outf, tmpnam = _mkstemp( + ".$extract", + dir=os.path.dirname(real_path), + ) os.write(outf, self.loader.get_data(zip_path)) os.close(outf) utime(tmpnam, (timestamp, timestamp)) @@ -1861,7 +1800,7 @@ class FileMetadata(EmptyProvider): return metadata def _warn_on_replacement(self, metadata): - # Python 2.6 and 3.2 compat for: replacement_char = '�' + # Python 2.7 compat for: replacement_char = '�' replacement_char = b'\xef\xbf\xbd'.decode('utf-8') if replacement_char in metadata: tmpl = "{self.path} could not be properly decoded in UTF-8" @@ -1947,10 +1886,16 @@ def find_eggs_in_zip(importer, path_item, only=False): # don't yield nested 
distros return for subitem in metadata.resource_listdir('/'): - if _is_unpacked_egg(subitem): + if _is_egg_path(subitem): subpath = os.path.join(path_item, subitem) - for dist in find_eggs_in_zip(zipimport.zipimporter(subpath), subpath): + dists = find_eggs_in_zip(zipimport.zipimporter(subpath), subpath) + for dist in dists: yield dist + elif subitem.lower().endswith('.dist-info'): + subpath = os.path.join(path_item, subitem) + submeta = EggMetadata(zipimport.zipimporter(subpath)) + submeta.egg_info = subpath + yield Distribution.from_location(path_item, subitem, submeta) register_finder(zipimport.zipimporter, find_eggs_in_zip) @@ -1993,46 +1938,127 @@ def find_on_path(importer, path_item, only=False): """Yield distributions accessible on a sys.path directory""" path_item = _normalize_cached(path_item) - if os.path.isdir(path_item) and os.access(path_item, os.R_OK): - if _is_unpacked_egg(path_item): - yield Distribution.from_filename( - path_item, metadata=PathMetadata( - path_item, os.path.join(path_item, 'EGG-INFO') - ) + if _is_unpacked_egg(path_item): + yield Distribution.from_filename( + path_item, metadata=PathMetadata( + path_item, os.path.join(path_item, 'EGG-INFO') ) - else: - # scan for .egg and .egg-info in directory - path_item_entries = _by_version_descending(os.listdir(path_item)) - for entry in path_item_entries: - lower = entry.lower() - if lower.endswith('.egg-info') or lower.endswith('.dist-info'): - fullpath = os.path.join(path_item, entry) - if os.path.isdir(fullpath): - # egg-info directory, allow getting metadata - if len(os.listdir(fullpath)) == 0: - # Empty egg directory, skip. 
- continue - metadata = PathMetadata(path_item, fullpath) - else: - metadata = FileMetadata(fullpath) - yield Distribution.from_location( - path_item, entry, metadata, precedence=DEVELOP_DIST - ) - elif not only and _is_unpacked_egg(entry): - dists = find_distributions(os.path.join(path_item, entry)) - for dist in dists: - yield dist - elif not only and lower.endswith('.egg-link'): - with open(os.path.join(path_item, entry)) as entry_file: - entry_lines = entry_file.readlines() - for line in entry_lines: - if not line.strip(): - continue - path = os.path.join(path_item, line.rstrip()) - dists = find_distributions(path) - for item in dists: - yield item - break + ) + return + + entries = safe_listdir(path_item) + + # for performance, before sorting by version, + # screen entries for only those that will yield + # distributions + filtered = ( + entry + for entry in entries + if dist_factory(path_item, entry, only) + ) + + # scan for .egg and .egg-info in directory + path_item_entries = _by_version_descending(filtered) + for entry in path_item_entries: + fullpath = os.path.join(path_item, entry) + factory = dist_factory(path_item, entry, only) + for dist in factory(fullpath): + yield dist + + +def dist_factory(path_item, entry, only): + """ + Return a dist_factory for a path_item and entry + """ + lower = entry.lower() + is_meta = any(map(lower.endswith, ('.egg-info', '.dist-info'))) + return ( + distributions_from_metadata + if is_meta else + find_distributions + if not only and _is_egg_path(entry) else + resolve_egg_link + if not only and lower.endswith('.egg-link') else + NoDists() + ) + + +class NoDists: + """ + >>> bool(NoDists()) + False + + >>> list(NoDists()('anything')) + [] + """ + def __bool__(self): + return False + if six.PY2: + __nonzero__ = __bool__ + + def __call__(self, fullpath): + return iter(()) + + +def safe_listdir(path): + """ + Attempt to list contents of path, but suppress some exceptions. 
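The `NoDists` helper introduced in the hunk above relies on a small pattern worth calling out: an object that is falsey (so `dist_factory` results can screen directory entries cheaply) yet callable (so the caller invokes every factory uniformly and simply gets no distributions back). A minimal standalone sketch of that pattern, independent of pkg_resources:

```python
# Sketch of the NoDists pattern from the hunk above: falsey for
# screening, callable (yielding nothing) for uniform invocation.
class NoDists:
    def __bool__(self):
        # Entries whose factory is falsey are filtered out before the
        # version sort, so no work is done for non-distribution files.
        return False

    def __call__(self, fullpath):
        # Calling the factory anyway is harmless: it yields nothing.
        return iter(())


factory = NoDists()
assert not factory
assert list(factory('anything')) == []
```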
+ """ + try: + return os.listdir(path) + except (PermissionError, NotADirectoryError): + pass + except OSError as e: + # Ignore the directory if does not exist, not a directory or + # permission denied + ignorable = ( + e.errno in (errno.ENOTDIR, errno.EACCES, errno.ENOENT) + # Python 2 on Windows needs to be handled this way :( + or getattr(e, "winerror", None) == 267 + ) + if not ignorable: + raise + return () + + +def distributions_from_metadata(path): + root = os.path.dirname(path) + if os.path.isdir(path): + if len(os.listdir(path)) == 0: + # empty metadata dir; skip + return + metadata = PathMetadata(root, path) + else: + metadata = FileMetadata(path) + entry = os.path.basename(path) + yield Distribution.from_location( + root, entry, metadata, precedence=DEVELOP_DIST, + ) + + +def non_empty_lines(path): + """ + Yield non-empty lines from file at path + """ + with open(path) as f: + for line in f: + line = line.strip() + if line: + yield line + + +def resolve_egg_link(path): + """ + Given a path to an .egg-link, resolve distributions + present in the referenced path. 
+ """ + referenced_paths = non_empty_lines(path) + resolved_paths = ( + os.path.join(os.path.dirname(path), ref) + for ref in referenced_paths + ) + dist_groups = map(find_distributions, resolved_paths) + return next(dist_groups, ()) register_finder(pkgutil.ImpImporter, find_on_path) @@ -2068,7 +2094,12 @@ def _handle_ns(packageName, path_item): importer = get_importer(path_item) if importer is None: return None - loader = importer.find_module(packageName) + + # capture warnings due to #1111 + with warnings.catch_warnings(): + warnings.simplefilter("ignore") + loader = importer.find_module(packageName) + if loader is None: return None module = sys.modules.get(packageName) @@ -2113,12 +2144,13 @@ def _rebuild_mod_path(orig_path, package_name, module): parts = path_parts[:-module_parts] return safe_sys_path_index(_normalize_cached(os.sep.join(parts))) - if not isinstance(orig_path, list): - # Is this behavior useful when module.__path__ is not a list? - return + new_path = sorted(orig_path, key=position_in_sys_path) + new_path = [_normalize_cached(p) for p in new_path] - orig_path.sort(key=position_in_sys_path) - module.__path__[:] = [_normalize_cached(p) for p in orig_path] + if isinstance(module.__path__, list): + module.__path__[:] = new_path + else: + module.__path__ = new_path def declare_namespace(packageName): @@ -2129,9 +2161,10 @@ def declare_namespace(packageName): if packageName in _namespace_packages: return - path, parent = sys.path, None - if '.' 
in packageName: - parent = '.'.join(packageName.split('.')[:-1]) + path = sys.path + parent, _, _ = packageName.rpartition('.') + + if parent: declare_namespace(parent) if parent not in _namespace_packages: __import__(parent) @@ -2142,7 +2175,7 @@ def declare_namespace(packageName): # Track what packages are namespaces, so when new path items are added, # they can be updated - _namespace_packages.setdefault(parent, []).append(packageName) + _namespace_packages.setdefault(parent or None, []).append(packageName) _namespace_packages.setdefault(packageName, []) for path_item in path: @@ -2195,7 +2228,18 @@ register_namespace_handler(object, null_ns_handler) def normalize_path(filename): """Normalize a file/dir name for comparison purposes""" - return os.path.normcase(os.path.realpath(filename)) + return os.path.normcase(os.path.realpath(_cygwin_patch(filename))) + + +def _cygwin_patch(filename): # pragma: nocover + """ + Contrary to POSIX 2008, on Cygwin, getcwd (3) contains + symlink components. Using + os.path.abspath() works around this limitation. A fix in os.getcwd() + would probably better, in Cygwin even more so, except + that this seems to be by design... + """ + return os.path.abspath(filename) if sys.platform == 'cygwin' else filename def _normalize_cached(filename, _cache={}): @@ -2206,12 +2250,20 @@ def _normalize_cached(filename, _cache={}): return result +def _is_egg_path(path): + """ + Determine if given path appears to be an egg. + """ + return path.lower().endswith('.egg') + + def _is_unpacked_egg(path): """ Determine if given path appears to be an unpacked egg. 
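The `declare_namespace` hunk above replaces a split/join dance with `str.rpartition`, which returns an empty (falsey) parent when the name contains no dot, so the `if parent:` guard covers both cases in one step. A quick illustration of that behavior:

```python
# rpartition('.') splits on the *last* dot; with no dot present the
# first element is '' (falsey), which the rewritten guard relies on.
parent, _, child = 'zope.interface'.rpartition('.')
assert (parent, child) == ('zope', 'interface')

parent, _, child = 'toplevel'.rpartition('.')
assert (parent, child) == ('', 'toplevel')
assert not parent  # no parent namespace to declare
```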
""" return ( - path.lower().endswith('.egg') + _is_egg_path(path) and + os.path.isfile(os.path.join(path, 'EGG-INFO', 'PKG-INFO')) ) @@ -2252,7 +2304,7 @@ EGG_NAME = re.compile( ).match -class EntryPoint(object): +class EntryPoint: """Object representing an advertised importable object""" def __init__(self, name, module_name, attrs=(), extras=(), dist=None): @@ -2261,7 +2313,7 @@ class EntryPoint(object): self.name = name self.module_name = module_name self.attrs = tuple(attrs) - self.extras = Requirement.parse(("x[%s]" % ','.join(extras))).extras + self.extras = tuple(extras) self.dist = dist def __str__(self): @@ -2303,8 +2355,14 @@ class EntryPoint(object): def require(self, env=None, installer=None): if self.extras and not self.dist: raise UnknownExtra("Can't require() without a distribution", self) + + # Get the requirements for this entry point with all its extras and + # then resolve them. We have to pass `extras` along when resolving so + # that the working set knows what extras we want. Otherwise, for + # dist-info distributions, the working set will assume that the + # requirements for that extra are purely optional and skip over them. reqs = self.dist.requires(self.extras) - items = working_set.resolve(reqs, env, installer) + items = working_set.resolve(reqs, env, installer, extras=self.extras) list(map(working_set.add, items)) pattern = re.compile( @@ -2392,18 +2450,20 @@ def _version_from_file(lines): Given an iterable of lines from a Metadata file, return the value of the Version field, if present, or None otherwise. 
""" - is_version_line = lambda line: line.lower().startswith('version:') + def is_version_line(line): + return line.lower().startswith('version:') version_lines = filter(is_version_line, lines) line = next(iter(version_lines), '') _, _, value = line.partition(':') return safe_version(value.strip()) or None -class Distribution(object): +class Distribution: """Wrap an actual or potential sys.path entry w/metadata""" PKG_INFO = 'PKG-INFO' - def __init__(self, location=None, metadata=None, project_name=None, + def __init__( + self, location=None, metadata=None, project_name=None, version=None, py_version=PY_MAJOR, platform=None, precedence=EGG_DIST): self.project_name = safe_name(project_name or 'Unknown') @@ -2528,23 +2588,44 @@ class Distribution(object): @property def _dep_map(self): + """ + A map of extra to its list of (direct) requirements + for this distribution, including the null extra. + """ try: return self.__dep_map except AttributeError: - dm = self.__dep_map = {None: []} - for name in 'requires.txt', 'depends.txt': - for extra, reqs in split_sections(self._get_metadata(name)): - if extra: - if ':' in extra: - extra, marker = extra.split(':', 1) - if invalid_marker(marker): - # XXX warn - reqs = [] - elif not evaluate_marker(marker): - reqs = [] - extra = safe_extra(extra) or None - dm.setdefault(extra, []).extend(parse_requirements(reqs)) - return dm + self.__dep_map = self._filter_extras(self._build_dep_map()) + return self.__dep_map + + @staticmethod + def _filter_extras(dm): + """ + Given a mapping of extras to dependencies, strip off + environment markers and filter out any dependencies + not matching the markers. 
+ """ + for extra in list(filter(None, dm)): + new_extra = extra + reqs = dm.pop(extra) + new_extra, _, marker = extra.partition(':') + fails_marker = marker and ( + invalid_marker(marker) + or not evaluate_marker(marker) + ) + if fails_marker: + reqs = [] + new_extra = safe_extra(new_extra) or None + + dm.setdefault(new_extra, []).extend(reqs) + return dm + + def _build_dep_map(self): + dm = {} + for name in 'requires.txt', 'depends.txt': + for extra, reqs in split_sections(self._get_metadata(name)): + dm.setdefault(extra, []).extend(parse_requirements(reqs)) + return dm def requires(self, extras=()): """List of Requirements needed for this distro if `extras` are used""" @@ -2607,6 +2688,19 @@ class Distribution(object): raise AttributeError(attr) return getattr(self._provider, attr) + def __dir__(self): + return list( + set(super(Distribution, self).__dir__()) + | set( + attr for attr in self._provider.__dir__() + if not attr.startswith('_') + ) + ) + + if not hasattr(object, '__dir__'): + # python 2.7 not supported + del __dir__ + @classmethod def from_filename(cls, filename, metadata=None, **kw): return cls.from_location( @@ -2679,7 +2773,8 @@ class Distribution(object): if replace: break else: - # don't modify path (even removing duplicates) if found and not replace + # don't modify path (even removing duplicates) if + # found and not replace return elif item == bdir and self.precedence == EGG_DIST: # if it's an .egg, give it precedence over its directory @@ -2776,7 +2871,10 @@ class EggInfoDistribution(Distribution): class DistInfoDistribution(Distribution): - """Wrap an actual or potential sys.path entry w/metadata, .dist-info style""" + """ + Wrap an actual or potential sys.path entry + w/metadata, .dist-info style. 
+ """ PKG_INFO = 'METADATA' EQEQ = re.compile(r"([\(,])\s*(\d.*?)\s*([,\)])") @@ -2826,7 +2924,7 @@ _distributionImpl = { '.egg': Distribution, '.egg-info': EggInfoDistribution, '.dist-info': DistInfoDistribution, - } +} def issue_warning(*args, **kw): @@ -2862,7 +2960,10 @@ def parse_requirements(strs): # If there is a line continuation, drop it, and append the next line. if line.endswith('\\'): line = line[:-2].strip() - line += next(lines) + try: + line += next(lines) + except StopIteration: + return yield Requirement(line) @@ -2911,7 +3012,8 @@ class Requirement(packaging.requirements.Requirement): def __hash__(self): return self.__hash - def __repr__(self): return "Requirement.parse(%r)" % str(self) + def __repr__(self): + return "Requirement.parse(%r)" % str(self) @staticmethod def parse(s): @@ -2919,20 +3021,20 @@ class Requirement(packaging.requirements.Requirement): return req -def _get_mro(cls): - """Get an mro for a type or classic class""" - if not isinstance(cls, type): - - class cls(cls, object): - pass - - return cls.__mro__[1:] - return cls.__mro__ +def _always_object(classes): + """ + Ensure object appears in the mro even + for old-style classes. 
+ """ + if object not in classes: + return classes + (object,) + return classes def _find_adapter(registry, ob): """Return an adapter factory for `ob` from `registry`""" - for t in _get_mro(getattr(ob, '__class__', type(ob))): + types = _always_object(inspect.getmro(getattr(ob, '__class__', type(ob)))) + for t in types: if t in registry: return registry[t] @@ -2940,8 +3042,7 @@ def _find_adapter(registry, ob): def ensure_directory(path): """Ensure that the parent directory of `path` exists""" dirname = os.path.dirname(path) - if not os.path.isdir(dirname): - os.makedirs(dirname) + py31compat.makedirs(dirname, exist_ok=True) def _bypass_ensure_directory(path): @@ -2951,7 +3052,10 @@ def _bypass_ensure_directory(path): dirname, filename = split(path) if dirname and filename and not isdir(dirname): _bypass_ensure_directory(dirname) - mkdir(dirname, 0o755) + try: + mkdir(dirname, 0o755) + except FileExistsError: + pass def split_sections(s): @@ -3046,7 +3150,10 @@ def _initialize_master_working_set(): dist.activate(replace=False) for dist in working_set ) - add_activation_listener(lambda dist: dist.activate(replace=True), existing=False) + add_activation_listener( + lambda dist: dist.activate(replace=True), + existing=False, + ) working_set.entries = [] # match order list(map(working_set.add_entry, sys.path)) diff --git a/lib/python3.4/site-packages/sqlalchemy/testing/plugin/__init__.py b/lib/python3.7/site-packages/pkg_resources/_vendor/__init__.py similarity index 100% rename from lib/python3.4/site-packages/sqlalchemy/testing/plugin/__init__.py rename to lib/python3.7/site-packages/pkg_resources/_vendor/__init__.py diff --git a/lib/python3.4/site-packages/pkg_resources/_vendor/appdirs.py b/lib/python3.7/site-packages/pkg_resources/_vendor/appdirs.py similarity index 87% rename from lib/python3.4/site-packages/pkg_resources/_vendor/appdirs.py rename to lib/python3.7/site-packages/pkg_resources/_vendor/appdirs.py index f4dba09..ae67001 100644 --- 
a/lib/python3.4/site-packages/pkg_resources/_vendor/appdirs.py +++ b/lib/python3.7/site-packages/pkg_resources/_vendor/appdirs.py @@ -13,7 +13,7 @@ See for details and usage. # - Mac OS X: http://developer.apple.com/documentation/MacOSX/Conceptual/BPFileSystem/index.html # - XDG spec for Un*x: http://standards.freedesktop.org/basedir-spec/basedir-spec-latest.html -__version_info__ = (1, 4, 0) +__version_info__ = (1, 4, 3) __version__ = '.'.join(map(str, __version_info__)) @@ -98,7 +98,7 @@ def user_data_dir(appname=None, appauthor=None, version=None, roaming=False): def site_data_dir(appname=None, appauthor=None, version=None, multipath=False): - """Return full path to the user-shared data dir for this application. + r"""Return full path to the user-shared data dir for this application. "appname" is the name of application. If None, just the system directory is returned. @@ -117,7 +117,7 @@ def site_data_dir(appname=None, appauthor=None, version=None, multipath=False): returned, or '/usr/local/share/', if XDG_DATA_DIRS is not set - Typical user data directories are: + Typical site data directories are: Mac OS X: /Library/Application Support/ Unix: /usr/local/share/ or /usr/share/ Win XP: C:\Documents and Settings\All Users\Application Data\\ @@ -184,13 +184,13 @@ def user_config_dir(appname=None, appauthor=None, version=None, roaming=False): for a discussion of issues. - Typical user data directories are: + Typical user config directories are: Mac OS X: same as user_data_dir Unix: ~/.config/ # or in $XDG_CONFIG_HOME, if defined Win *: same as user_data_dir For Unix, we follow the XDG spec and support $XDG_CONFIG_HOME. - That means, by deafult "~/.config/". + That means, by default "~/.config/". 
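The `user_config_dir` docstring fixes above sit alongside the Unix branch, which follows the XDG spec: honor `$XDG_CONFIG_HOME` and fall back to `~/.config`. A hedged sketch of just that branch, with a `config_home` parameter added here purely for testability (it is not part of the appdirs API):

```python
import os


def user_config_dir(appname, config_home=None):
    # Unix branch only: $XDG_CONFIG_HOME, else ~/.config, then appname.
    # config_home is a hypothetical override for demonstration.
    base = config_home or os.getenv(
        'XDG_CONFIG_HOME', os.path.expanduser('~/.config'))
    return os.path.join(base, appname)


assert user_config_dir('MyApp', '/tmp/cfg') == os.path.join('/tmp/cfg', 'MyApp')
```

The real function additionally special-cases win32 and darwin by delegating to `user_data_dir`, as the hunk shows.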
""" if system in ["win32", "darwin"]: path = user_data_dir(appname, appauthor, None, roaming) @@ -204,7 +204,7 @@ def user_config_dir(appname=None, appauthor=None, version=None, roaming=False): def site_config_dir(appname=None, appauthor=None, version=None, multipath=False): - """Return full path to the user-shared data dir for this application. + r"""Return full path to the user-shared data dir for this application. "appname" is the name of application. If None, just the system directory is returned. @@ -222,7 +222,7 @@ def site_config_dir(appname=None, appauthor=None, version=None, multipath=False) returned. By default, the first item from XDG_CONFIG_DIRS is returned, or '/etc/xdg/', if XDG_CONFIG_DIRS is not set - Typical user data directories are: + Typical site config directories are: Mac OS X: same as site_data_dir Unix: /etc/xdg/ or $XDG_CONFIG_DIRS[i]/ for each value in $XDG_CONFIG_DIRS @@ -311,6 +311,48 @@ def user_cache_dir(appname=None, appauthor=None, version=None, opinion=True): return path +def user_state_dir(appname=None, appauthor=None, version=None, roaming=False): + r"""Return full path to the user-specific state dir for this application. + + "appname" is the name of application. + If None, just the system directory is returned. + "appauthor" (only used on Windows) is the name of the + appauthor or distributing body for this application. Typically + it is the owning company name. This falls back to appname. You may + pass False to disable it. + "version" is an optional version path element to append to the + path. You might want to use this if you want multiple versions + of your app to be able to run independently. If used, this + would typically be ".". + Only applied when appname is present. + "roaming" (boolean, default False) can be set True to use the Windows + roaming appdata directory. That means that for users on a Windows + network setup for roaming profiles, this user data will be + sync'd on login. See + + for a discussion of issues. 
+ + Typical user state directories are: + Mac OS X: same as user_data_dir + Unix: ~/.local/state/ # or in $XDG_STATE_HOME, if defined + Win *: same as user_data_dir + + For Unix, we follow this Debian proposal + to extend the XDG spec and support $XDG_STATE_HOME. + + That means, by default "~/.local/state/". + """ + if system in ["win32", "darwin"]: + path = user_data_dir(appname, appauthor, None, roaming) + else: + path = os.getenv('XDG_STATE_HOME', os.path.expanduser("~/.local/state")) + if appname: + path = os.path.join(path, appname) + if appname and version: + path = os.path.join(path, version) + return path + + def user_log_dir(appname=None, appauthor=None, version=None, opinion=True): r"""Return full path to the user-specific log dir for this application. @@ -329,7 +371,7 @@ def user_log_dir(appname=None, appauthor=None, version=None, opinion=True): "Logs" to the base app data dir for Windows, and "log" to the base cache dir for Unix. See discussion below. - Typical user cache directories are: + Typical user log directories are: Mac OS X: ~/Library/Logs/ Unix: ~/.cache//log # or under $XDG_CACHE_HOME if defined Win XP: C:\Documents and Settings\\Local Settings\Application Data\\\Logs @@ -364,8 +406,8 @@ def user_log_dir(appname=None, appauthor=None, version=None, opinion=True): class AppDirs(object): """Convenience wrapper for getting application dirs.""" - def __init__(self, appname, appauthor=None, version=None, roaming=False, - multipath=False): + def __init__(self, appname=None, appauthor=None, version=None, + roaming=False, multipath=False): self.appname = appname self.appauthor = appauthor self.version = version @@ -397,6 +439,11 @@ class AppDirs(object): return user_cache_dir(self.appname, self.appauthor, version=self.version) + @property + def user_state_dir(self): + return user_state_dir(self.appname, self.appauthor, + version=self.version) + @property def user_log_dir(self): return user_log_dir(self.appname, self.appauthor, @@ -410,7 +457,10 @@ def 
_get_win_folder_from_registry(csidl_name): registry for this guarantees us the correct answer for all CSIDL_* names. """ - import _winreg + if PY3: + import winreg as _winreg + else: + import _winreg shell_folder_name = { "CSIDL_APPDATA": "AppData", @@ -500,7 +550,7 @@ def _get_win_folder_with_jna(csidl_name): if has_high_char: buf = array.zeros('c', buf_size) kernel = win32.Kernel32.INSTANCE - if kernal.GetShortPathName(dir, buf, buf_size): + if kernel.GetShortPathName(dir, buf, buf_size): dir = jna.Native.toString(buf.tostring()).rstrip("\0") return dir @@ -527,9 +577,15 @@ if __name__ == "__main__": appname = "MyApp" appauthor = "MyCompany" - props = ("user_data_dir", "site_data_dir", - "user_config_dir", "site_config_dir", - "user_cache_dir", "user_log_dir") + props = ("user_data_dir", + "user_config_dir", + "user_cache_dir", + "user_state_dir", + "user_log_dir", + "site_data_dir", + "site_config_dir") + + print("-- app dirs %s --" % __version__) print("-- app dirs (with optional 'version')") dirs = AppDirs(appname, appauthor, version="1.0") diff --git a/lib/python3.4/site-packages/pkg_resources/_vendor/packaging/__about__.py b/lib/python3.7/site-packages/pkg_resources/_vendor/packaging/__about__.py similarity index 100% rename from lib/python3.4/site-packages/pkg_resources/_vendor/packaging/__about__.py rename to lib/python3.7/site-packages/pkg_resources/_vendor/packaging/__about__.py diff --git a/lib/python3.4/site-packages/pkg_resources/_vendor/packaging/__init__.py b/lib/python3.7/site-packages/pkg_resources/_vendor/packaging/__init__.py similarity index 100% rename from lib/python3.4/site-packages/pkg_resources/_vendor/packaging/__init__.py rename to lib/python3.7/site-packages/pkg_resources/_vendor/packaging/__init__.py diff --git a/lib/python3.4/site-packages/pkg_resources/_vendor/packaging/_compat.py b/lib/python3.7/site-packages/pkg_resources/_vendor/packaging/_compat.py similarity index 100% rename from 
lib/python3.4/site-packages/pkg_resources/_vendor/packaging/_compat.py rename to lib/python3.7/site-packages/pkg_resources/_vendor/packaging/_compat.py diff --git a/lib/python3.4/site-packages/pkg_resources/_vendor/packaging/_structures.py b/lib/python3.7/site-packages/pkg_resources/_vendor/packaging/_structures.py similarity index 100% rename from lib/python3.4/site-packages/pkg_resources/_vendor/packaging/_structures.py rename to lib/python3.7/site-packages/pkg_resources/_vendor/packaging/_structures.py diff --git a/lib/python3.4/site-packages/pkg_resources/_vendor/packaging/markers.py b/lib/python3.7/site-packages/pkg_resources/_vendor/packaging/markers.py similarity index 100% rename from lib/python3.4/site-packages/pkg_resources/_vendor/packaging/markers.py rename to lib/python3.7/site-packages/pkg_resources/_vendor/packaging/markers.py diff --git a/lib/python3.4/site-packages/pkg_resources/_vendor/packaging/requirements.py b/lib/python3.7/site-packages/pkg_resources/_vendor/packaging/requirements.py similarity index 100% rename from lib/python3.4/site-packages/pkg_resources/_vendor/packaging/requirements.py rename to lib/python3.7/site-packages/pkg_resources/_vendor/packaging/requirements.py diff --git a/lib/python3.4/site-packages/pkg_resources/_vendor/packaging/specifiers.py b/lib/python3.7/site-packages/pkg_resources/_vendor/packaging/specifiers.py similarity index 100% rename from lib/python3.4/site-packages/pkg_resources/_vendor/packaging/specifiers.py rename to lib/python3.7/site-packages/pkg_resources/_vendor/packaging/specifiers.py diff --git a/lib/python3.4/site-packages/pkg_resources/_vendor/packaging/utils.py b/lib/python3.7/site-packages/pkg_resources/_vendor/packaging/utils.py similarity index 100% rename from lib/python3.4/site-packages/pkg_resources/_vendor/packaging/utils.py rename to lib/python3.7/site-packages/pkg_resources/_vendor/packaging/utils.py diff --git a/lib/python3.4/site-packages/pkg_resources/_vendor/packaging/version.py 
b/lib/python3.7/site-packages/pkg_resources/_vendor/packaging/version.py similarity index 100% rename from lib/python3.4/site-packages/pkg_resources/_vendor/packaging/version.py rename to lib/python3.7/site-packages/pkg_resources/_vendor/packaging/version.py diff --git a/lib/python3.4/site-packages/pkg_resources/_vendor/pyparsing.py b/lib/python3.7/site-packages/pkg_resources/_vendor/pyparsing.py similarity index 96% rename from lib/python3.4/site-packages/pkg_resources/_vendor/pyparsing.py rename to lib/python3.7/site-packages/pkg_resources/_vendor/pyparsing.py index a212243..cf75e1e 100644 --- a/lib/python3.4/site-packages/pkg_resources/_vendor/pyparsing.py +++ b/lib/python3.7/site-packages/pkg_resources/_vendor/pyparsing.py @@ -1,6 +1,6 @@ # module pyparsing.py # -# Copyright (c) 2003-2016 Paul T. McGuire +# Copyright (c) 2003-2018 Paul T. McGuire # # Permission is hereby granted, free of charge, to any person obtaining # a copy of this software and associated documentation files (the @@ -25,6 +25,7 @@ __doc__ = \ """ pyparsing module - Classes and methods to define and execute parsing grammars +============================================================================= The pyparsing module is an alternative approach to creating and executing simple grammars, vs. the traditional lex/yacc approach, or the use of regular expressions. With pyparsing, you @@ -58,10 +59,23 @@ The pyparsing module handles some of the problems that are typically vexing when - extra or missing whitespace (the above program will also handle "Hello,World!", "Hello , World !", etc.) - quoted strings - embedded comments + + +Getting Started - +----------------- +Visit the classes L{ParserElement} and L{ParseResults} to see the base classes that most other pyparsing +classes inherit from. 
Use the docstrings for examples of how to: + - construct literal match expressions from L{Literal} and L{CaselessLiteral} classes + - construct character word-group expressions using the L{Word} class + - see how to create repetitive expressions using L{ZeroOrMore} and L{OneOrMore} classes + - use L{'+'}, L{'|'}, L{'^'}, and L{'&'} operators to combine simple expressions into more complex ones + - associate names with your parsed results using L{ParserElement.setResultsName} + - find some helpful expression short-cuts like L{delimitedList} and L{oneOf} + - find more useful common expressions in the L{pyparsing_common} namespace class """ -__version__ = "2.1.10" -__versionTime__ = "07 Oct 2016 01:31 UTC" +__version__ = "2.2.1" +__versionTime__ = "18 Sep 2018 00:49 UTC" __author__ = "Paul McGuire " import string @@ -82,6 +96,15 @@ try: except ImportError: from threading import RLock +try: + # Python 3 + from collections.abc import Iterable + from collections.abc import MutableMapping +except ImportError: + # Python 2.7 + from collections import Iterable + from collections import MutableMapping + try: from collections import OrderedDict as _OrderedDict except ImportError: @@ -144,7 +167,7 @@ else: except UnicodeEncodeError: # Else encode it ret = unicode(obj).encode(sys.getdefaultencoding(), 'xmlcharrefreplace') - xmlcharref = Regex('&#\d+;') + xmlcharref = Regex(r'&#\d+;') xmlcharref.setParseAction(lambda t: '\\u' + hex(int(t[0][2:-1]))[2:]) return xmlcharref.transformString(ret) @@ -809,7 +832,7 @@ class ParseResults(object): return None def getName(self): - """ + r""" Returns the results name for this token expression. Useful when several different expressions might match at a particular location. 
@@ -940,7 +963,7 @@ class ParseResults(object): def __dir__(self): return (dir(type(self)) + list(self.keys())) -collections.MutableMapping.register(ParseResults) +MutableMapping.register(ParseResults) def col (loc,strg): """Returns current column within a string, counting newlines as line separators. @@ -1025,11 +1048,11 @@ def _trim_arity(func, maxargs=2): # special handling for Python 3.5.0 - extra deep call stack by 1 offset = -3 if system_version == (3,5,0) else -2 frame_summary = traceback.extract_stack(limit=-offset+limit-1)[offset] - return [(frame_summary.filename, frame_summary.lineno)] + return [frame_summary[:2]] def extract_tb(tb, limit=0): frames = traceback.extract_tb(tb, limit=limit) frame_summary = frames[-1] - return [(frame_summary.filename, frame_summary.lineno)] + return [frame_summary[:2]] else: extract_stack = traceback.extract_stack extract_tb = traceback.extract_tb @@ -1226,7 +1249,7 @@ class ParserElement(object): def setParseAction( self, *fns, **kwargs ): """ - Define action to perform when successfully matching parse element definition. + Define one or more actions to perform when successfully matching parse element definition. Parse action fn is a callable method with 0-3 arguments, called as C{fn(s,loc,toks)}, C{fn(loc,toks)}, C{fn(toks)}, or just C{fn()}, where: - s = the original string being parsed (see note below) @@ -1264,7 +1287,7 @@ class ParserElement(object): def addParseAction( self, *fns, **kwargs ): """ - Add parse action to expression's list of parse actions. See L{I{setParseAction}}. + Add one or more parse actions to expression's list of parse actions. See L{I{setParseAction}}. See examples in L{I{copy}}. 
""" @@ -1374,7 +1397,7 @@ class ParserElement(object): else: preloc = loc tokensStart = preloc - if self.mayIndexError or loc >= len(instring): + if self.mayIndexError or preloc >= len(instring): try: loc,tokens = self.parseImpl( instring, preloc, doActions ) except IndexError: @@ -1408,7 +1431,6 @@ class ParserElement(object): self.resultsName, asList=self.saveAsList and isinstance(tokens,(ParseResults,list)), modal=self.modalResults ) - if debugging: #~ print ("Matched",self,"->",retTokens.asList()) if (self.debugActions[1] ): @@ -1443,10 +1465,14 @@ class ParserElement(object): def clear(self): cache.clear() + + def cache_len(self): + return len(cache) self.get = types.MethodType(get, self) self.set = types.MethodType(set, self) self.clear = types.MethodType(clear, self) + self.__len__ = types.MethodType(cache_len, self) if _OrderedDict is not None: class _FifoCache(object): @@ -1460,15 +1486,22 @@ class ParserElement(object): def set(self, key, value): cache[key] = value - if len(cache) > size: - cache.popitem(False) + while len(cache) > size: + try: + cache.popitem(False) + except KeyError: + pass def clear(self): cache.clear() + def cache_len(self): + return len(cache) + self.get = types.MethodType(get, self) self.set = types.MethodType(set, self) self.clear = types.MethodType(clear, self) + self.__len__ = types.MethodType(cache_len, self) else: class _FifoCache(object): @@ -1483,7 +1516,7 @@ class ParserElement(object): def set(self, key, value): cache[key] = value - if len(cache) > size: + while len(key_fifo) > size: cache.pop(key_fifo.popleft(), None) key_fifo.append(key) @@ -1491,9 +1524,13 @@ class ParserElement(object): cache.clear() key_fifo.clear() + def cache_len(self): + return len(cache) + self.get = types.MethodType(get, self) self.set = types.MethodType(set, self) self.clear = types.MethodType(clear, self) + self.__len__ = types.MethodType(cache_len, self) # argument cache for optimizing repeated calls when backtracking through recursive 
expressions packrat_cache = {} # this is set later by enabledPackrat(); this is here so that resetCache() doesn't fail @@ -1743,8 +1780,12 @@ class ParserElement(object): cap_word = Word(alphas.upper(), alphas.lower()) print(cap_word.searchString("More than Iron, more than Lead, more than Gold I need Electricity")) + + # the sum() builtin can be used to merge results into a single ParseResults object + print(sum(cap_word.searchString("More than Iron, more than Lead, more than Gold I need Electricity"))) prints:: - ['More', 'Iron', 'Lead', 'Gold', 'I'] + [['More'], ['Iron'], ['Lead'], ['Gold'], ['I'], ['Electricity']] + ['More', 'Iron', 'Lead', 'Gold', 'I', 'Electricity'] """ try: return ParseResults([ t for t,s,e in self.scanString( instring, maxMatches ) ]) @@ -1819,7 +1860,7 @@ class ParserElement(object): warnings.warn("Cannot combine element of type %s with ParserElement" % type(other), SyntaxWarning, stacklevel=2) return None - return And( [ self, And._ErrorStop(), other ] ) + return self + And._ErrorStop() + other def __rsub__(self, other ): """ @@ -2722,7 +2763,7 @@ class Word(Token): class Regex(Token): - """ + r""" Token for matching strings that match a given regular expression. Defined with string specifying the regular expression in a form recognized by the inbuilt Python re module. 
If the given regex contains named groups (defined using C{(?P...)}), these will be preserved as @@ -2911,7 +2952,7 @@ class QuotedString(Token): # replace escaped characters if self.escChar: - ret = re.sub(self.escCharReplacePattern,"\g<1>",ret) + ret = re.sub(self.escCharReplacePattern, r"\g<1>", ret) # replace escaped quotes if self.escQuote: @@ -3223,7 +3264,7 @@ class ParseExpression(ParserElement): if isinstance( exprs, basestring ): self.exprs = [ ParserElement._literalStringClass( exprs ) ] - elif isinstance( exprs, collections.Iterable ): + elif isinstance( exprs, Iterable ): exprs = list(exprs) # if sequence of strings provided, wrap with Literal if all(isinstance(expr, basestring) for expr in exprs): @@ -4374,7 +4415,7 @@ def traceParseAction(f): @traceParseAction def remove_duplicate_chars(tokens): - return ''.join(sorted(set(''.join(tokens))) + return ''.join(sorted(set(''.join(tokens)))) wds = OneOrMore(wd).setParseAction(remove_duplicate_chars) print(wds.parseString("slkdjs sld sldd sdlf sdljf")) @@ -4564,7 +4605,7 @@ def oneOf( strs, caseless=False, useRegex=True ): symbols = [] if isinstance(strs,basestring): symbols = strs.split() - elif isinstance(strs, collections.Iterable): + elif isinstance(strs, Iterable): symbols = list(strs) else: warnings.warn("Invalid argument to oneOf, expected string or iterable", @@ -4715,7 +4756,7 @@ stringEnd = StringEnd().setName("stringEnd") _escapedPunc = Word( _bslash, r"\[]-*.$+^?()~ ", exact=2 ).setParseAction(lambda s,l,t:t[0][1]) _escapedHexChar = Regex(r"\\0?[xX][0-9a-fA-F]+").setParseAction(lambda s,l,t:unichr(int(t[0].lstrip(r'\0x'),16))) _escapedOctChar = Regex(r"\\0[0-7]+").setParseAction(lambda s,l,t:unichr(int(t[0][1:],8))) -_singleChar = _escapedPunc | _escapedHexChar | _escapedOctChar | Word(printables, excludeChars=r'\]', exact=1) | Regex(r"\w", re.UNICODE) +_singleChar = _escapedPunc | _escapedHexChar | _escapedOctChar | CharsNotIn(r'\]', exact=1) _charRange = Group(_singleChar + Suppress("-") + 
_singleChar) _reBracketExpr = Literal("[") + Optional("^").setResultsName("negate") + Group( OneOrMore( _charRange | _singleChar ) ).setResultsName("body") + "]" @@ -5020,7 +5061,9 @@ def infixNotation( baseExpr, opList, lpar=Suppress('('), rpar=Suppress(')') ): constants C{opAssoc.RIGHT} and C{opAssoc.LEFT}. - parseAction is the parse action to be associated with expressions matching this operator expression (the - parse action tuple member may be omitted) + parse action tuple member may be omitted); if the parse action + is passed a tuple or list of functions, this is equivalent to + calling C{setParseAction(*fn)} (L{ParserElement.setParseAction}) - lpar - expression for matching left-parentheses (default=C{Suppress('(')}) - rpar - expression for matching right-parentheses (default=C{Suppress(')')}) @@ -5093,7 +5136,10 @@ def infixNotation( baseExpr, opList, lpar=Suppress('('), rpar=Suppress(')') ): else: raise ValueError("operator must indicate right or left associativity") if pa: - matchExpr.setParseAction( pa ) + if isinstance(pa, (tuple, list)): + matchExpr.setParseAction(*pa) + else: + matchExpr.setParseAction(pa) thisExpr <<= ( matchExpr.setName(termName) | lastExpr ) lastExpr = thisExpr ret <<= lastExpr diff --git a/lib/python3.4/site-packages/pkg_resources/_vendor/six.py b/lib/python3.7/site-packages/pkg_resources/_vendor/six.py similarity index 100% rename from lib/python3.4/site-packages/pkg_resources/_vendor/six.py rename to lib/python3.7/site-packages/pkg_resources/_vendor/six.py diff --git a/lib/python3.4/site-packages/pkg_resources/extern/__init__.py b/lib/python3.7/site-packages/pkg_resources/extern/__init__.py similarity index 97% rename from lib/python3.4/site-packages/pkg_resources/extern/__init__.py rename to lib/python3.7/site-packages/pkg_resources/extern/__init__.py index b4156fe..c1eb9e9 100644 --- a/lib/python3.4/site-packages/pkg_resources/extern/__init__.py +++ b/lib/python3.7/site-packages/pkg_resources/extern/__init__.py @@ -48,7 +48,7 
@@ class VendorImporter: # on later Python versions to cause relative imports # in the vendor package to resolve the same modules # as those going through this importer. - if sys.version_info > (3, 3): + if prefix and sys.version_info > (3, 3): del sys.modules[extant] return mod except ImportError: diff --git a/lib/python3.4/site-packages/pkg_resources/py31compat.py b/lib/python3.7/site-packages/pkg_resources/py31compat.py similarity index 86% rename from lib/python3.4/site-packages/pkg_resources/py31compat.py rename to lib/python3.7/site-packages/pkg_resources/py31compat.py index 331a51b..a381c42 100644 --- a/lib/python3.4/site-packages/pkg_resources/py31compat.py +++ b/lib/python3.7/site-packages/pkg_resources/py31compat.py @@ -2,6 +2,8 @@ import os import errno import sys +from .extern import six + def _makedirs_31(path, exist_ok=False): try: @@ -15,8 +17,7 @@ def _makedirs_31(path, exist_ok=False): # and exists_ok considerations are disentangled. # See https://github.com/pypa/setuptools/pull/1083#issuecomment-315168663 needs_makedirs = ( - sys.version_info < (3, 2, 5) or - (3, 3) <= sys.version_info < (3, 3, 6) or + six.PY2 or (3, 4) <= sys.version_info < (3, 4, 1) ) makedirs = _makedirs_31 if needs_makedirs else os.makedirs diff --git a/lib/python3.7/site-packages/setuptools-40.5.0.dist-info/AUTHORS.txt b/lib/python3.7/site-packages/setuptools-40.5.0.dist-info/AUTHORS.txt new file mode 100644 index 0000000..e845ac7 --- /dev/null +++ b/lib/python3.7/site-packages/setuptools-40.5.0.dist-info/AUTHORS.txt @@ -0,0 +1,421 @@ +Adam Chainz +Adam Wentz +Adrien Morison +Alan Yee +Aleks Bunin +Alex Gaynor +Alex Grönholm +Alex Morega +Alex Stachowiak +Alexander Shtyrov +Alexandre Conrad +Alli +Anatoly Techtonik +Andrei Geacar +Andrew Gaul +Andrey Bulgakov +Andrés Delfino <34587441+andresdelfino@users.noreply.github.com> +Andrés Delfino +Andy Freeland +Andy Kluger +Anish Tambe +Anrs Hu +Anthony Sottile +Antoine Musso +Anton Ovchinnikov +Anton Patrushev +Antonio Alvarado 
Hernandez +Antony Lee +Antti Kaihola +Anubhav Patel +Anuj Godase +AQNOUCH Mohammed +AraHaan +Arindam Choudhury +Armin Ronacher +Ashley Manton +Atsushi Odagiri +Avner Cohen +Baptiste Mispelon +Barney Gale +barneygale +Bartek Ogryczak +Bastian Venthur +Ben Darnell +Ben Hoyt +Ben Rosser +Bence Nagy +Benjamin VanEvery +Benoit Pierre +Berker Peksag +Bernardo B. Marques +Bernhard M. Wiedemann +Bogdan Opanchuk +Brad Erickson +Bradley Ayers +Brandon L. Reiss +Brett Randall +Brian Rosner +BrownTruck +Bruno Oliveira +Bruno Renié +Bstrdsmkr +Buck Golemon +burrows +Bussonnier Matthias +c22 +Calvin Smith +Carl Meyer +Carlos Liam +Carol Willing +Carter Thayer +Cass +Chandrasekhar Atina +Chris Brinker +Chris Jerdonek +Chris McDonough +Chris Wolfe +Christian Heimes +Christian Oudard +Christopher Snyder +Clark Boylan +Clay McClure +Cody +Cody Soyland +Colin Watson +Connor Osborn +Cooper Lees +Cooper Ry Lees +Cory Benfield +Cory Wright +Craig Kerstiens +Cristian Sorinel +Curtis Doty +Damian Quiroga +Dan Black +Dan Savilonis +Dan Sully +daniel +Daniel Collins +Daniel Hahler +Daniel Holth +Daniel Jost +Daniel Shaulov +Daniele Procida +Danny Hermes +Dav Clark +Dave Abrahams +David Aguilar +David Black +David Caro +David Evans +David Linke +David Pursehouse +David Tucker +David Wales +Davidovich +derwolfe +Dmitry Gladkov +Domen Kožar +Donald Stufft +Dongweiming +Douglas Thor +DrFeathers +Dustin Ingram +Dwayne Bailey +Ed Morley <501702+edmorley@users.noreply.github.com> +Ed Morley +Eli Schwartz +Emil Styrke +Endoh Takanao +enoch +Eric Gillingham +Eric Hanchrow +Eric Hopper +Erik M. Bray +Erik Rose +Ernest W Durbin III +Ernest W. 
Durbin III +Erwin Janssen +Eugene Vereshchagin +fiber-space +Filip Kokosiński +Florian Briand +Francesco +Francesco Montesano +Gabriel Curio +Gabriel de Perthuis +Garry Polley +gdanielson +Geoffrey Lehée +Geoffrey Sneddon +George Song +Georgi Valkov +Giftlin Rajaiah +gizmoguy1 +gkdoc <40815324+gkdoc@users.noreply.github.com> +GOTO Hayato <3532528+gh640@users.noreply.github.com> +Guilherme Espada +Guy Rozendorn +Hari Charan +Herbert Pfennig +Hsiaoming Yang +Hugo +Hugo Lopes Tavares +hugovk +Hynek Schlawack +Ian Bicking +Ian Cordasco +Ian Lee +Ian Stapleton Cordasco +Ian Wienand +Ian Wienand +Igor Kuzmitshov +Igor Sobreira +Ilya Baryshev +INADA Naoki +Ionel Cristian Mărieș +Ionel Maries Cristian +Jakub Stasiak +Jakub Vysoky +Jakub Wilk +James Cleveland +James Cleveland +James Firth +James Polley +Jan Pokorný +Jannis Leidel +jarondl +Jason R. Coombs +Jay Graves +Jean-Christophe Fillion-Robin +Jeff Barber +Jeff Dairiki +Jeremy Stanley +Jeremy Zafran +Jim Garrison +Jivan Amara +John-Scott Atlakson +Jon Banafato +Jon Dufresne +Jon Parise +Jon Wayne Parrott +Jonas Nockert +Jonathan Herbert +Joost Molenaar +Jorge Niedbalski +Joseph Long +Josh Bronson +Josh Hansen +Josh Schneier +Julien Demoor +jwg4 +Jyrki Pulliainen +Kamal Bin Mustafa +kaustav haldar +keanemind +Kelsey Hightower +Kenneth Belitzky +Kenneth Reitz +Kenneth Reitz +Kevin Burke +Kevin Carter +Kevin Frommelt +Kexuan Sun +Kit Randel +kpinc +Kumar McMillan +Kyle Persohn +Laurent Bristiel +Laurie Opperman +Leon Sasson +Lev Givon +Lincoln de Sousa +Lipis +Loren Carvalho +Lucas Cimon +Ludovic Gasc +Luke Macken +Luo Jiebin +luojiebin +luz.paz +Marc Abramowitz +Marc Tamlyn +Marcus Smith +Mariatta +Mark Kohler +Markus Hametner +Masklinn +Matej Stuchlik +Mathew Jennings +Mathieu Bridon +Matt Good +Matt Maker +Matt Robenolt +matthew +Matthew Einhorn +Matthew Gilliard +Matthew Iversen +Matthew Trumbell +Matthew Willson +Matthias Bussonnier +mattip +Maxim Kurnikov +Maxime Rouyrre +memoselyk +Michael +Michael Aquilina 
+Michael E. Karpeles +Michael Klich +Michael Williamson +michaelpacer +Mickaël Schoentgen +Miguel Araujo Perez +Mihir Singh +Min RK +MinRK +Miro Hrončok +montefra +Monty Taylor +Nate Coraor +Nathaniel J. Smith +Nehal J Wani +Nick Coghlan +Nick Stenning +Nikhil Benesch +Nitesh Sharma +Nowell Strite +nvdv +Ofekmeister +Oliver Jeeves +Oliver Tonnhofer +Olivier Girardot +Olivier Grisel +Ollie Rutherfurd +OMOTO Kenji +Oren Held +Oscar Benjamin +Oz N Tiram +Patrick Dubroy +Patrick Jenkins +Patrick Lawson +patricktokeeffe +Paul Kehrer +Paul Moore +Paul Nasrat +Paul Oswald +Paul van der Linden +Paulus Schoutsen +Pawel Jasinski +Pekka Klärck +Peter Waller +Phaneendra Chiruvella +Phil Freo +Phil Pennock +Phil Whelan +Philip Molloy +Philippe Ombredanne +Pi Delport +Pierre-Yves Rofes +pip +Pradyun Gedam +Pratik Mallya +Preston Holmes +Przemek Wrzos +Qiangning Hong +R. David Murray +Rafael Caricio +Ralf Schmitt +Razzi Abuissa +Remi Rampin +Rene Dudfield +Richard Jones +RobberPhex +Robert Collins +Robert McGibbon +Robert T. McGibbon +Roey Berman +Rohan Jain +Rohan Jain +Rohan Jain +Roman Bogorodskiy +Romuald Brunet +Ronny Pfannschmidt +Rory McCann +Ross Brattain +Roy Wellington Ⅳ +Roy Wellington Ⅳ +Ryan Wooden +ryneeverett +Sachi King +Salvatore Rinchiera +schlamar +Scott Kitterman +seanj +Sebastian Schaetz +Segev Finer +Sergey Vasilyev +Seth Woodworth +Shlomi Fish +Simeon Visser +Simon Cross +Simon Pichugin +Sorin Sbarnea +Stavros Korokithakis +Stefan Scherfke +Stephan Erb +stepshal +Steve (Gadget) Barnes +Steve Barnes +Steve Kowalik +Steven Myint +stonebig +Stéphane Bidoul (ACSONE) +Stéphane Bidoul +Stéphane Klein +Takayuki SHIMIZUKAWA +Thijs Triemstra +Thomas Fenzl +Thomas Grainger +Thomas Guettler +Thomas Johansson +Thomas Kluyver +Thomas Smith +Tim D. 
Smith +Tim Harder +Tim Heap +tim smith +tinruufu +Tom Freudenheim +Tom V +Tomer Chachamu +Tony Zhaocheng Tan +Toshio Kuratomi +Travis Swicegood +Tzu-ping Chung +Valentin Haenel +Victor Stinner +Viktor Szépe +Ville Skyttä +Vinay Sajip +Vincent Philippon +Vitaly Babiy +Vladimir Rutsky +W. Trevor King +Wil Tan +Wilfred Hughes +William ML Leslie +Wolfgang Maier +Xavier Fernandez +Xavier Fernandez +xoviat +YAMAMOTO Takashi +Yen Chi Hsuan +Yoval P +Yu Jian +Zearin +Zearin +Zhiping Deng +Zvezdan Petkovic +Łukasz Langa +Семён Марьясин diff --git a/lib/python3.4/site-packages/netifaces-0.10.6.dist-info/INSTALLER b/lib/python3.7/site-packages/setuptools-40.5.0.dist-info/INSTALLER similarity index 100% rename from lib/python3.4/site-packages/netifaces-0.10.6.dist-info/INSTALLER rename to lib/python3.7/site-packages/setuptools-40.5.0.dist-info/INSTALLER diff --git a/lib/python3.7/site-packages/setuptools-40.5.0.dist-info/LICENSE.txt b/lib/python3.7/site-packages/setuptools-40.5.0.dist-info/LICENSE.txt new file mode 100644 index 0000000..d3379fa --- /dev/null +++ b/lib/python3.7/site-packages/setuptools-40.5.0.dist-info/LICENSE.txt @@ -0,0 +1,20 @@ +Copyright (c) 2008-2018 The pip developers (see AUTHORS.txt file) + +Permission is hereby granted, free of charge, to any person obtaining +a copy of this software and associated documentation files (the +"Software"), to deal in the Software without restriction, including +without limitation the rights to use, copy, modify, merge, publish, +distribute, sublicense, and/or sell copies of the Software, and to +permit persons to whom the Software is furnished to do so, subject to +the following conditions: + +The above copyright notice and this permission notice shall be +included in all copies or substantial portions of the Software. 
+ +THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, +EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF +MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND +NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE +LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION +OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION +WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. diff --git a/lib/python3.4/site-packages/setuptools-36.6.0.dist-info/METADATA b/lib/python3.7/site-packages/setuptools-40.5.0.dist-info/METADATA similarity index 73% rename from lib/python3.4/site-packages/setuptools-36.6.0.dist-info/METADATA rename to lib/python3.7/site-packages/setuptools-40.5.0.dist-info/METADATA index 725401a..84ad80f 100644 --- a/lib/python3.4/site-packages/setuptools-36.6.0.dist-info/METADATA +++ b/lib/python3.7/site-packages/setuptools-40.5.0.dist-info/METADATA @@ -1,12 +1,12 @@ -Metadata-Version: 2.0 +Metadata-Version: 2.1 Name: setuptools -Version: 36.6.0 +Version: 40.5.0 Summary: Easily download, build, install, upgrade, and uninstall Python packages Home-page: https://github.com/pypa/setuptools Author: Python Packaging Authority Author-email: distutils-sig@python.org License: UNKNOWN -Description-Content-Type: text/x-rst; charset=UTF-8 +Project-URL: Documentation, https://setuptools.readthedocs.io/ Keywords: CPAN PyPI distutils eggs package management Platform: UNKNOWN Classifier: Development Status :: 5 - Production/Stable @@ -14,10 +14,8 @@ Classifier: Intended Audience :: Developers Classifier: License :: OSI Approved :: MIT License Classifier: Operating System :: OS Independent Classifier: Programming Language :: Python :: 2 -Classifier: Programming Language :: Python :: 2.6 Classifier: Programming Language :: Python :: 2.7 Classifier: Programming Language :: Python :: 3 -Classifier: Programming Language :: Python :: 3.3 Classifier: Programming Language :: Python :: 3.4 
Classifier: Programming Language :: Python :: 3.5 Classifier: Programming Language :: Python :: 3.6 @@ -25,11 +23,8 @@ Classifier: Topic :: Software Development :: Libraries :: Python Modules Classifier: Topic :: System :: Archiving :: Packaging Classifier: Topic :: System :: Systems Administration Classifier: Topic :: Utilities -Requires-Python: >=2.6,!=3.0.*,!=3.1.*,!=3.2.* -Provides-Extra: certs -Requires-Dist: certifi (==2016.9.26); extra == 'certs' -Provides-Extra: ssl -Requires-Dist: wincertstore (==0.2); sys_platform=='win32' and extra == 'ssl' +Requires-Python: >=2.7,!=3.0.*,!=3.1.*,!=3.2.*,!=3.3.* +Description-Content-Type: text/x-rst; charset=UTF-8 .. image:: https://img.shields.io/pypi/v/setuptools.svg :target: https://pypi.org/project/setuptools @@ -38,19 +33,27 @@ Requires-Dist: wincertstore (==0.2); sys_platform=='win32' and extra == 'ssl' :target: https://setuptools.readthedocs.io .. image:: https://img.shields.io/travis/pypa/setuptools/master.svg?label=Linux%20build%20%40%20Travis%20CI - :target: http://travis-ci.org/pypa/setuptools + :target: https://travis-ci.org/pypa/setuptools -.. image:: https://img.shields.io/appveyor/ci/jaraco/setuptools/master.svg?label=Windows%20build%20%40%20Appveyor - :target: https://ci.appveyor.com/project/jaraco/setuptools/branch/master +.. image:: https://img.shields.io/appveyor/ci/pypa/setuptools/master.svg?label=Windows%20build%20%40%20Appveyor + :target: https://ci.appveyor.com/project/pypa/setuptools/branch/master + +.. image:: https://img.shields.io/codecov/c/github/pypa/setuptools/master.svg + :target: https://codecov.io/gh/pypa/setuptools .. image:: https://img.shields.io/pypi/pyversions/setuptools.svg +.. 
image:: https://tidelift.com/badges/github/pypa/setuptools + :target: https://tidelift.com/subscription/pkg/pypi-setuptools?utm_source=pypi-setuptools&utm_medium=readme + See the `Installation Instructions `_ in the Python Packaging User's Guide for instructions on installing, upgrading, and uninstalling Setuptools. -The project is `maintained at GitHub `_. +The project is `maintained at GitHub `_ +by the `Setuptools Developers +`_. Questions and comments should be directed to the `distutils-sig mailing list `_. diff --git a/lib/python3.7/site-packages/setuptools-40.5.0.dist-info/RECORD b/lib/python3.7/site-packages/setuptools-40.5.0.dist-info/RECORD new file mode 100644 index 0000000..21301e8 --- /dev/null +++ b/lib/python3.7/site-packages/setuptools-40.5.0.dist-info/RECORD @@ -0,0 +1,155 @@ +../../../bin/easy_install,sha256=QKLJp-fRWSJhUGhPBKFVkN9ioVk34DIKoM0VngCXKKw,299 +../../../bin/easy_install-3.7,sha256=QKLJp-fRWSJhUGhPBKFVkN9ioVk34DIKoM0VngCXKKw,299 +__pycache__/easy_install.cpython-37.pyc,, +easy_install.py,sha256=MDC9vt5AxDsXX5qcKlBz2TnW6Tpuv_AobnfhCJ9X3PM,126 +setuptools-40.5.0.dist-info/AUTHORS.txt,sha256=Pu4WdZapZ2U2wKwWxd830ZxnROCHwmV_TpWoL9dqJ-M,15880 +setuptools-40.5.0.dist-info/INSTALLER,sha256=zuuue4knoyJ-UwPPXg8fezS7VCrXJQrAP7zeNuwvFQg,4 +setuptools-40.5.0.dist-info/LICENSE.txt,sha256=ORqHhOMZ2uVDFHfUzJvFBPxdcf2eieHIDxzThV9dfPo,1090 +setuptools-40.5.0.dist-info/METADATA,sha256=x8fnpPGZbNTSJjrYCx5Fj9YRrWWdArxr20vscZ6_S-o,3034 +setuptools-40.5.0.dist-info/RECORD,, +setuptools-40.5.0.dist-info/WHEEL,sha256=_wJFdOYk7i3xxT8ElOkUJvOdOvfNGbR9g-bf6UQT6sU,110 +setuptools-40.5.0.dist-info/dependency_links.txt,sha256=HlkCFkoK5TbZ5EMLbLKYhLcY_E31kBWD8TqW2EgmatQ,239 +setuptools-40.5.0.dist-info/entry_points.txt,sha256=s4ibTr5_v_-uWueemgrdzLUIL_ageOMqsgCAKZDkY2E,2934 +setuptools-40.5.0.dist-info/top_level.txt,sha256=2HUXVVwA4Pff1xgTFr3GsTXXKaPaO6vlG6oNJ_4u4Tg,38 +setuptools-40.5.0.dist-info/zip-safe,sha256=AbpHGcgLb-kRsJGnwFEktk7uzpZOCcBY74-YBdrKVGs,1 
+setuptools/__init__.py,sha256=dsZD3T-_2htjtVAELRWeu83BFxjGaTFB0h3IO7PGi3U,5878 +setuptools/__pycache__/__init__.cpython-37.pyc,, +setuptools/__pycache__/archive_util.cpython-37.pyc,, +setuptools/__pycache__/build_meta.cpython-37.pyc,, +setuptools/__pycache__/config.cpython-37.pyc,, +setuptools/__pycache__/dep_util.cpython-37.pyc,, +setuptools/__pycache__/depends.cpython-37.pyc,, +setuptools/__pycache__/dist.cpython-37.pyc,, +setuptools/__pycache__/extension.cpython-37.pyc,, +setuptools/__pycache__/glibc.cpython-37.pyc,, +setuptools/__pycache__/glob.cpython-37.pyc,, +setuptools/__pycache__/launch.cpython-37.pyc,, +setuptools/__pycache__/lib2to3_ex.cpython-37.pyc,, +setuptools/__pycache__/monkey.cpython-37.pyc,, +setuptools/__pycache__/msvc.cpython-37.pyc,, +setuptools/__pycache__/namespaces.cpython-37.pyc,, +setuptools/__pycache__/package_index.cpython-37.pyc,, +setuptools/__pycache__/pep425tags.cpython-37.pyc,, +setuptools/__pycache__/py27compat.cpython-37.pyc,, +setuptools/__pycache__/py31compat.cpython-37.pyc,, +setuptools/__pycache__/py33compat.cpython-37.pyc,, +setuptools/__pycache__/py36compat.cpython-37.pyc,, +setuptools/__pycache__/sandbox.cpython-37.pyc,, +setuptools/__pycache__/site-patch.cpython-37.pyc,, +setuptools/__pycache__/ssl_support.cpython-37.pyc,, +setuptools/__pycache__/unicode_utils.cpython-37.pyc,, +setuptools/__pycache__/version.cpython-37.pyc,, +setuptools/__pycache__/wheel.cpython-37.pyc,, +setuptools/__pycache__/windows_support.cpython-37.pyc,, +setuptools/_vendor/__init__.py,sha256=47DEQpj8HBSa-_TImW-5JCeuQeRkm5NMpJWZG3hSuFU,0 +setuptools/_vendor/__pycache__/__init__.cpython-37.pyc,, +setuptools/_vendor/__pycache__/pyparsing.cpython-37.pyc,, +setuptools/_vendor/__pycache__/six.cpython-37.pyc,, +setuptools/_vendor/packaging/__about__.py,sha256=zkcCPTN_6TcLW0Nrlg0176-R1QQ_WVPTm8sz1R4-HjM,720 +setuptools/_vendor/packaging/__init__.py,sha256=_vNac5TrzwsrzbOFIbF-5cHqc_Y2aPT2D7zrIR06BOo,513 
+setuptools/_vendor/packaging/__pycache__/__about__.cpython-37.pyc,, +setuptools/_vendor/packaging/__pycache__/__init__.cpython-37.pyc,, +setuptools/_vendor/packaging/__pycache__/_compat.cpython-37.pyc,, +setuptools/_vendor/packaging/__pycache__/_structures.cpython-37.pyc,, +setuptools/_vendor/packaging/__pycache__/markers.cpython-37.pyc,, +setuptools/_vendor/packaging/__pycache__/requirements.cpython-37.pyc,, +setuptools/_vendor/packaging/__pycache__/specifiers.cpython-37.pyc,, +setuptools/_vendor/packaging/__pycache__/utils.cpython-37.pyc,, +setuptools/_vendor/packaging/__pycache__/version.cpython-37.pyc,, +setuptools/_vendor/packaging/_compat.py,sha256=Vi_A0rAQeHbU-a9X0tt1yQm9RqkgQbDSxzRw8WlU9kA,860 +setuptools/_vendor/packaging/_structures.py,sha256=RImECJ4c_wTlaTYYwZYLHEiebDMaAJmK1oPARhw1T5o,1416 +setuptools/_vendor/packaging/markers.py,sha256=Gvpk9EY20yKaMTiKgQZ8yFEEpodqVgVYtfekoic1Yts,8239 +setuptools/_vendor/packaging/requirements.py,sha256=t44M2HVWtr8phIz2OhnILzuGT3rTATaovctV1dpnVIg,4343 +setuptools/_vendor/packaging/specifiers.py,sha256=SAMRerzO3fK2IkFZCaZkuwZaL_EGqHNOz4pni4vhnN0,28025 +setuptools/_vendor/packaging/utils.py,sha256=3m6WvPm6NNxE8rkTGmn0r75B_GZSGg7ikafxHsBN1WA,421 +setuptools/_vendor/packaging/version.py,sha256=OwGnxYfr2ghNzYx59qWIBkrK3SnB6n-Zfd1XaLpnnM0,11556 +setuptools/_vendor/pyparsing.py,sha256=tmrp-lu-qO1i75ZzIN5A12nKRRD1Cm4Vpk-5LR9rims,232055 +setuptools/_vendor/six.py,sha256=A6hdJZVjI3t_geebZ9BzUvwRrIXo0lfwzQlM2LcKyas,30098 +setuptools/archive_util.py,sha256=kw8Ib_lKjCcnPKNbS7h8HztRVK0d5RacU3r_KRdVnmM,6592 +setuptools/build_meta.py,sha256=qg4RfvgZF1uZPuO1VMioG8JRhNMp5fHrwgpgkYpnzc8,6021 +setuptools/cli-32.exe,sha256=dfEuovMNnA2HLa3jRfMPVi5tk4R7alCbpTvuxtCyw0Y,65536 +setuptools/cli-64.exe,sha256=KLABu5pyrnokJCv6skjXZ6GsXeyYHGcqOUT3oHI3Xpo,74752 +setuptools/cli.exe,sha256=dfEuovMNnA2HLa3jRfMPVi5tk4R7alCbpTvuxtCyw0Y,65536 +setuptools/command/__init__.py,sha256=NWzJ0A1BEengZpVeqUyWLNm2bk4P3F4iL5QUErHy7kA,594 
+setuptools/command/__pycache__/__init__.cpython-37.pyc,, +setuptools/command/__pycache__/alias.cpython-37.pyc,, +setuptools/command/__pycache__/bdist_egg.cpython-37.pyc,, +setuptools/command/__pycache__/bdist_rpm.cpython-37.pyc,, +setuptools/command/__pycache__/bdist_wininst.cpython-37.pyc,, +setuptools/command/__pycache__/build_clib.cpython-37.pyc,, +setuptools/command/__pycache__/build_ext.cpython-37.pyc,, +setuptools/command/__pycache__/build_py.cpython-37.pyc,, +setuptools/command/__pycache__/develop.cpython-37.pyc,, +setuptools/command/__pycache__/dist_info.cpython-37.pyc,, +setuptools/command/__pycache__/easy_install.cpython-37.pyc,, +setuptools/command/__pycache__/egg_info.cpython-37.pyc,, +setuptools/command/__pycache__/install.cpython-37.pyc,, +setuptools/command/__pycache__/install_egg_info.cpython-37.pyc,, +setuptools/command/__pycache__/install_lib.cpython-37.pyc,, +setuptools/command/__pycache__/install_scripts.cpython-37.pyc,, +setuptools/command/__pycache__/py36compat.cpython-37.pyc,, +setuptools/command/__pycache__/register.cpython-37.pyc,, +setuptools/command/__pycache__/rotate.cpython-37.pyc,, +setuptools/command/__pycache__/saveopts.cpython-37.pyc,, +setuptools/command/__pycache__/sdist.cpython-37.pyc,, +setuptools/command/__pycache__/setopt.cpython-37.pyc,, +setuptools/command/__pycache__/test.cpython-37.pyc,, +setuptools/command/__pycache__/upload.cpython-37.pyc,, +setuptools/command/__pycache__/upload_docs.cpython-37.pyc,, +setuptools/command/alias.py,sha256=KjpE0sz_SDIHv3fpZcIQK-sCkJz-SrC6Gmug6b9Nkc8,2426 +setuptools/command/bdist_egg.py,sha256=be-IBpr1zhS9i6GjKANJgzkbH3ChImdWY7S-j0r2BK8,18167 +setuptools/command/bdist_rpm.py,sha256=B7l0TnzCGb-0nLlm6rS00jWLkojASwVmdhW2w5Qz_Ak,1508 +setuptools/command/bdist_wininst.py,sha256=_6dz3lpB1tY200LxKPLM7qgwTCceOMgaWFF-jW2-pm0,637 +setuptools/command/build_clib.py,sha256=bQ9aBr-5ZSO-9fGsGsDLz0mnnFteHUZnftVLkhvHDq0,4484 
+setuptools/command/build_ext.py,sha256=81CTgsqjBjNl_HOgCJ1lQ5vv1NIM3RBpcoVGpqT4N1M,12897 +setuptools/command/build_py.py,sha256=yWyYaaS9F3o9JbIczn064A5g1C5_UiKRDxGaTqYbtLE,9596 +setuptools/command/develop.py,sha256=Sl1iMOORbAnp5BqiXmyMBD0uuvEnhSfOCqbxIPRiJPc,8060 +setuptools/command/dist_info.py,sha256=5t6kOfrdgALT-P3ogss6PF9k-Leyesueycuk3dUyZnI,960 +setuptools/command/easy_install.py,sha256=qVo2Ju2TLg6gIu48SrM3tG8fHXFLtsMcQMu9-hAz-y8,89333 +setuptools/command/egg_info.py,sha256=HCc6PW4SrjaWtxy_YbXw34YwTcNdqUpv6n7QjL4qHgk,25093 +setuptools/command/install.py,sha256=a0EZpL_A866KEdhicTGbuyD_TYl1sykfzdrri-zazT4,4683 +setuptools/command/install_egg_info.py,sha256=4zq_Ad3jE-EffParuyDEnvxU6efB-Xhrzdr8aB6Ln_8,3195 +setuptools/command/install_lib.py,sha256=n2iLR8f1MlYeGHtV2oFxDpUiL-wyLaQgwSAFX-YIEv4,5012 +setuptools/command/install_scripts.py,sha256=UD0rEZ6861mTYhIdzcsqKnUl8PozocXWl9VBQ1VTWnc,2439 +setuptools/command/launcher manifest.xml,sha256=xlLbjWrB01tKC0-hlVkOKkiSPbzMml2eOPtJ_ucCnbE,628 +setuptools/command/py36compat.py,sha256=SzjZcOxF7zdFUT47Zv2n7AM3H8koDys_0OpS-n9gIfc,4986 +setuptools/command/register.py,sha256=LO3MvYKPE8dN1m-KkrBRHC68ZFoPvA_vI8Xgp7vv6zI,534 +setuptools/command/rotate.py,sha256=co5C1EkI7P0GGT6Tqz-T2SIj2LBJTZXYELpmao6d4KQ,2164 +setuptools/command/saveopts.py,sha256=za7QCBcQimKKriWcoCcbhxPjUz30gSB74zuTL47xpP4,658 +setuptools/command/sdist.py,sha256=obDTe2BmWt2PlnFPZZh7e0LWvemEsbCCO9MzhrTZjm8,6711 +setuptools/command/setopt.py,sha256=NTWDyx-gjDF-txf4dO577s7LOzHVoKR0Mq33rFxaRr8,5085 +setuptools/command/test.py,sha256=fSl5OsZWSmFR3QJRvyy2OxbcYkuIkPvykWNOhFvAcUA,9228 +setuptools/command/upload.py,sha256=unktlo8fqx8yXU7F5hKkshNhQVG1tTIN3ObD9ERD0KE,1493 +setuptools/command/upload_docs.py,sha256=oXiGplM_cUKLwE4CWWw98RzCufAu8tBhMC97GegFcms,7311 +setuptools/config.py,sha256=tqFgKh3PYAIqkNgmotUSQHBTylRHJoh7mt8w0g82ax0,18695 +setuptools/dep_util.py,sha256=fgixvC1R7sH3r13ktyf7N0FALoqEXL1cBarmNpSEoWg,935 
+setuptools/depends.py,sha256=hC8QIDcM3VDpRXvRVA6OfL9AaQfxvhxHcN_w6sAyNq8,5837 +setuptools/dist.py,sha256=lN_1YtfOsPg6hLVaOCDCPOlgTSoIL1FRu5jCNJuXmSg,42621 +setuptools/extension.py,sha256=uc6nHI-MxwmNCNPbUiBnybSyqhpJqjbhvOQ-emdvt_E,1729 +setuptools/extern/__init__.py,sha256=TxeNKFMSfBMzBpBDiHx8Dh3RzsdVmvWaXhtZ03DZMs0,2499 +setuptools/extern/__pycache__/__init__.cpython-37.pyc,, +setuptools/glibc.py,sha256=X64VvGPL2AbURKwYRsWJOXXGAYOiF_v2qixeTkAULuU,3146 +setuptools/glob.py,sha256=o75cHrOxYsvn854thSxE0x9k8JrKDuhP_rRXlVB00Q4,5084 +setuptools/gui-32.exe,sha256=XBr0bHMA6Hpz2s9s9Bzjl-PwXfa9nH4ie0rFn4V2kWA,65536 +setuptools/gui-64.exe,sha256=aYKMhX1IJLn4ULHgWX0sE0yREUt6B3TEHf_jOw6yNyE,75264 +setuptools/gui.exe,sha256=XBr0bHMA6Hpz2s9s9Bzjl-PwXfa9nH4ie0rFn4V2kWA,65536 +setuptools/launch.py,sha256=sd7ejwhBocCDx_wG9rIs0OaZ8HtmmFU8ZC6IR_S0Lvg,787 +setuptools/lib2to3_ex.py,sha256=t5e12hbR2pi9V4ezWDTB4JM-AISUnGOkmcnYHek3xjg,2013 +setuptools/monkey.py,sha256=_WJYLhz9FhwvpF5dDQKjcsiXmOvH0tb51ut5RdD5i4c,5204 +setuptools/msvc.py,sha256=uuRFaZzjJt5Fv3ZmyKUUuLtjx12_8G9RILigGec4irI,40838 +setuptools/namespaces.py,sha256=F0Nrbv8KCT2OrO7rwa03om4N4GZKAlnce-rr-cgDQa8,3199 +setuptools/package_index.py,sha256=yeifZQhJVRwPSaQmRrVPxbXRy-1lF5KdTFV8NAb3YcE,40342 +setuptools/pep425tags.py,sha256=bSGwlybcIpssx9kAv_hqAUJzfEpXSzYRp2u-nDYPdbk,10862 +setuptools/py27compat.py,sha256=3mwxRMDk5Q5O1rSXOERbQDXhFqwDJhhUitfMW_qpUCo,536 +setuptools/py31compat.py,sha256=REvrUBibUHgqI9S-ww0C9bhU-n8PyaQ8Slr1_NRxaaE,820 +setuptools/py33compat.py,sha256=OubjldHJH1KGE1CKt1kRU-Q55keftHT3ea1YoL0ZSco,1195 +setuptools/py36compat.py,sha256=VUDWxmu5rt4QHlGTRtAFu6W5jvfL6WBjeDAzeoBy0OM,2891 +setuptools/sandbox.py,sha256=9UbwfEL5QY436oMI1LtFWohhoZ-UzwHvGyZjUH_qhkw,14276 +setuptools/script (dev).tmpl,sha256=RUzQzCQUaXtwdLtYHWYbIQmOaES5Brqq1FvUA_tu-5I,218 +setuptools/script.tmpl,sha256=WGTt5piezO27c-Dbx6l5Q4T3Ff20A5z7872hv3aAhYY,138 +setuptools/site-patch.py,sha256=OumkIHMuoSenRSW1382kKWI1VAwxNE86E5W8iDd34FY,2302 
+setuptools/ssl_support.py,sha256=YBDJsCZjSp62CWjxmSkke9kn9rhHHj25Cus6zhJRW3c,8492 +setuptools/unicode_utils.py,sha256=NOiZ_5hD72A6w-4wVj8awHFM3n51Kmw1Ic_vx15XFqw,996 +setuptools/version.py,sha256=og_cuZQb0QI6ukKZFfZWPlr1HgJBPPn2vO2m_bI9ZTE,144 +setuptools/wheel.py,sha256=A8hKSqHWZ5KM0-VP_DtptxpMxVF9pQwjWZcHGklxq2o,8102 +setuptools/windows_support.py,sha256=5GrfqSP2-dLGJoZTq2g6dCKkyQxxa2n5IQiXlJCoYEE,714 diff --git a/lib/python3.4/site-packages/pip-9.0.1.dist-info/WHEEL b/lib/python3.7/site-packages/setuptools-40.5.0.dist-info/WHEEL similarity index 70% rename from lib/python3.4/site-packages/pip-9.0.1.dist-info/WHEEL rename to lib/python3.7/site-packages/setuptools-40.5.0.dist-info/WHEEL index 8b6dd1b..c4bde30 100644 --- a/lib/python3.4/site-packages/pip-9.0.1.dist-info/WHEEL +++ b/lib/python3.7/site-packages/setuptools-40.5.0.dist-info/WHEEL @@ -1,5 +1,5 @@ Wheel-Version: 1.0 -Generator: bdist_wheel (0.29.0) +Generator: bdist_wheel (0.32.3) Root-Is-Purelib: true Tag: py2-none-any Tag: py3-none-any diff --git a/lib/python3.4/site-packages/setuptools-36.6.0.dist-info/dependency_links.txt b/lib/python3.7/site-packages/setuptools-40.5.0.dist-info/dependency_links.txt similarity index 100% rename from lib/python3.4/site-packages/setuptools-36.6.0.dist-info/dependency_links.txt rename to lib/python3.7/site-packages/setuptools-40.5.0.dist-info/dependency_links.txt diff --git a/lib/python3.4/site-packages/setuptools-36.6.0.dist-info/entry_points.txt b/lib/python3.7/site-packages/setuptools-40.5.0.dist-info/entry_points.txt similarity index 98% rename from lib/python3.4/site-packages/setuptools-36.6.0.dist-info/entry_points.txt rename to lib/python3.7/site-packages/setuptools-40.5.0.dist-info/entry_points.txt index 4159fd0..b429e52 100644 --- a/lib/python3.4/site-packages/setuptools-36.6.0.dist-info/entry_points.txt +++ b/lib/python3.7/site-packages/setuptools-40.5.0.dist-info/entry_points.txt @@ -1,6 +1,5 @@ [console_scripts] easy_install = 
setuptools.command.easy_install:main -easy_install-3.6 = setuptools.command.easy_install:main [distutils.commands] alias = setuptools.command.alias:alias diff --git a/lib/python3.4/site-packages/setuptools-36.6.0.dist-info/top_level.txt b/lib/python3.7/site-packages/setuptools-40.5.0.dist-info/top_level.txt similarity index 100% rename from lib/python3.4/site-packages/setuptools-36.6.0.dist-info/top_level.txt rename to lib/python3.7/site-packages/setuptools-40.5.0.dist-info/top_level.txt diff --git a/lib/python3.7/site-packages/setuptools-40.5.0.dist-info/zip-safe b/lib/python3.7/site-packages/setuptools-40.5.0.dist-info/zip-safe new file mode 100644 index 0000000..8b13789 --- /dev/null +++ b/lib/python3.7/site-packages/setuptools-40.5.0.dist-info/zip-safe @@ -0,0 +1 @@ + diff --git a/lib/python3.4/site-packages/setuptools/__init__.py b/lib/python3.7/site-packages/setuptools/__init__.py similarity index 84% rename from lib/python3.4/site-packages/setuptools/__init__.py rename to lib/python3.7/site-packages/setuptools/__init__.py index 04f7674..54309b5 100644 --- a/lib/python3.4/site-packages/setuptools/__init__.py +++ b/lib/python3.7/site-packages/setuptools/__init__.py @@ -1,12 +1,14 @@ """Extensions to the 'distutils' for large or complex distributions""" import os +import sys import functools import distutils.core import distutils.filelist from distutils.util import convert_path from fnmatch import fnmatchcase +from setuptools.extern.six import PY3 from setuptools.extern.six.moves import filter, map import setuptools.version @@ -15,11 +17,17 @@ from setuptools.dist import Distribution, Feature from setuptools.depends import Require from . 
import monkey +__metaclass__ = type + + __all__ = [ 'setup', 'Distribution', 'Feature', 'Command', 'Extension', 'Require', - 'find_packages', + 'find_packages' ] +if PY3: + __all__.append('find_namespace_packages') + __version__ = setuptools.version.__version__ bootstrap_install_from = None @@ -31,7 +39,7 @@ run_2to3_on_doctests = True lib2to3_fixer_packages = ['lib2to3.fixes'] -class PackageFinder(object): +class PackageFinder: """ Generate a list of all Python packages found within a directory """ @@ -109,7 +117,30 @@ class PEP420PackageFinder(PackageFinder): find_packages = PackageFinder.find -setup = distutils.core.setup +if PY3: + find_namespace_packages = PEP420PackageFinder.find + + +def _install_setup_requires(attrs): + # Note: do not use `setuptools.Distribution` directly, as + # our PEP 517 backend patch `distutils.core.Distribution`. + dist = distutils.core.Distribution(dict( + (k, v) for k, v in attrs.items() + if k in ('dependency_links', 'setup_requires') + )) + # Honor setup.cfg's options. + dist.parse_config_files(ignore_option_errors=True) + if dist.setup_requires: + dist.fetch_build_eggs(dist.setup_requires) + + +def setup(**attrs): + # Make sure we have any requirements needed to interpret 'attrs'. 
+ _install_setup_requires(attrs) + return distutils.core.setup(**attrs) + +setup.__doc__ = distutils.core.setup.__doc__ + _Command = monkey.get_unpatched(distutils.core.Command) diff --git a/lib/python3.7/site-packages/setuptools/_vendor/__init__.py b/lib/python3.7/site-packages/setuptools/_vendor/__init__.py new file mode 100644 index 0000000..e69de29 diff --git a/lib/python3.7/site-packages/setuptools/_vendor/packaging/__about__.py b/lib/python3.7/site-packages/setuptools/_vendor/packaging/__about__.py new file mode 100644 index 0000000..95d330e --- /dev/null +++ b/lib/python3.7/site-packages/setuptools/_vendor/packaging/__about__.py @@ -0,0 +1,21 @@ +# This file is dual licensed under the terms of the Apache License, Version +# 2.0, and the BSD License. See the LICENSE file in the root of this repository +# for complete details. +from __future__ import absolute_import, division, print_function + +__all__ = [ + "__title__", "__summary__", "__uri__", "__version__", "__author__", + "__email__", "__license__", "__copyright__", +] + +__title__ = "packaging" +__summary__ = "Core utilities for Python packages" +__uri__ = "https://github.com/pypa/packaging" + +__version__ = "16.8" + +__author__ = "Donald Stufft and individual contributors" +__email__ = "donald@stufft.io" + +__license__ = "BSD or Apache License, Version 2.0" +__copyright__ = "Copyright 2014-2016 %s" % __author__ diff --git a/lib/python3.7/site-packages/setuptools/_vendor/packaging/__init__.py b/lib/python3.7/site-packages/setuptools/_vendor/packaging/__init__.py new file mode 100644 index 0000000..5ee6220 --- /dev/null +++ b/lib/python3.7/site-packages/setuptools/_vendor/packaging/__init__.py @@ -0,0 +1,14 @@ +# This file is dual licensed under the terms of the Apache License, Version +# 2.0, and the BSD License. See the LICENSE file in the root of this repository +# for complete details. 
+from __future__ import absolute_import, division, print_function + +from .__about__ import ( + __author__, __copyright__, __email__, __license__, __summary__, __title__, + __uri__, __version__ +) + +__all__ = [ + "__title__", "__summary__", "__uri__", "__version__", "__author__", + "__email__", "__license__", "__copyright__", +] diff --git a/lib/python3.7/site-packages/setuptools/_vendor/packaging/_compat.py b/lib/python3.7/site-packages/setuptools/_vendor/packaging/_compat.py new file mode 100644 index 0000000..210bb80 --- /dev/null +++ b/lib/python3.7/site-packages/setuptools/_vendor/packaging/_compat.py @@ -0,0 +1,30 @@ +# This file is dual licensed under the terms of the Apache License, Version +# 2.0, and the BSD License. See the LICENSE file in the root of this repository +# for complete details. +from __future__ import absolute_import, division, print_function + +import sys + + +PY2 = sys.version_info[0] == 2 +PY3 = sys.version_info[0] == 3 + +# flake8: noqa + +if PY3: + string_types = str, +else: + string_types = basestring, + + +def with_metaclass(meta, *bases): + """ + Create a base class with a metaclass. + """ + # This requires a bit of explanation: the basic idea is to make a dummy + # metaclass for one level of class instantiation that replaces itself with + # the actual metaclass. + class metaclass(meta): + def __new__(cls, name, this_bases, d): + return meta(name, bases, d) + return type.__new__(metaclass, 'temporary_class', (), {}) diff --git a/lib/python3.7/site-packages/setuptools/_vendor/packaging/_structures.py b/lib/python3.7/site-packages/setuptools/_vendor/packaging/_structures.py new file mode 100644 index 0000000..ccc2786 --- /dev/null +++ b/lib/python3.7/site-packages/setuptools/_vendor/packaging/_structures.py @@ -0,0 +1,68 @@ +# This file is dual licensed under the terms of the Apache License, Version +# 2.0, and the BSD License. See the LICENSE file in the root of this repository +# for complete details. 
+from __future__ import absolute_import, division, print_function + + +class Infinity(object): + + def __repr__(self): + return "Infinity" + + def __hash__(self): + return hash(repr(self)) + + def __lt__(self, other): + return False + + def __le__(self, other): + return False + + def __eq__(self, other): + return isinstance(other, self.__class__) + + def __ne__(self, other): + return not isinstance(other, self.__class__) + + def __gt__(self, other): + return True + + def __ge__(self, other): + return True + + def __neg__(self): + return NegativeInfinity + +Infinity = Infinity() + + +class NegativeInfinity(object): + + def __repr__(self): + return "-Infinity" + + def __hash__(self): + return hash(repr(self)) + + def __lt__(self, other): + return True + + def __le__(self, other): + return True + + def __eq__(self, other): + return isinstance(other, self.__class__) + + def __ne__(self, other): + return not isinstance(other, self.__class__) + + def __gt__(self, other): + return False + + def __ge__(self, other): + return False + + def __neg__(self): + return Infinity + +NegativeInfinity = NegativeInfinity() diff --git a/lib/python3.7/site-packages/setuptools/_vendor/packaging/markers.py b/lib/python3.7/site-packages/setuptools/_vendor/packaging/markers.py new file mode 100644 index 0000000..031332a --- /dev/null +++ b/lib/python3.7/site-packages/setuptools/_vendor/packaging/markers.py @@ -0,0 +1,301 @@ +# This file is dual licensed under the terms of the Apache License, Version +# 2.0, and the BSD License. See the LICENSE file in the root of this repository +# for complete details. 
+from __future__ import absolute_import, division, print_function + +import operator +import os +import platform +import sys + +from setuptools.extern.pyparsing import ParseException, ParseResults, stringStart, stringEnd +from setuptools.extern.pyparsing import ZeroOrMore, Group, Forward, QuotedString +from setuptools.extern.pyparsing import Literal as L # noqa + +from ._compat import string_types +from .specifiers import Specifier, InvalidSpecifier + + +__all__ = [ + "InvalidMarker", "UndefinedComparison", "UndefinedEnvironmentName", + "Marker", "default_environment", +] + + +class InvalidMarker(ValueError): + """ + An invalid marker was found, users should refer to PEP 508. + """ + + +class UndefinedComparison(ValueError): + """ + An invalid operation was attempted on a value that doesn't support it. + """ + + +class UndefinedEnvironmentName(ValueError): + """ + A name was attempted to be used that does not exist inside of the + environment. + """ + + +class Node(object): + + def __init__(self, value): + self.value = value + + def __str__(self): + return str(self.value) + + def __repr__(self): + return "<{0}({1!r})>".format(self.__class__.__name__, str(self)) + + def serialize(self): + raise NotImplementedError + + +class Variable(Node): + + def serialize(self): + return str(self) + + +class Value(Node): + + def serialize(self): + return '"{0}"'.format(self) + + +class Op(Node): + + def serialize(self): + return str(self) + + +VARIABLE = ( + L("implementation_version") | + L("platform_python_implementation") | + L("implementation_name") | + L("python_full_version") | + L("platform_release") | + L("platform_version") | + L("platform_machine") | + L("platform_system") | + L("python_version") | + L("sys_platform") | + L("os_name") | + L("os.name") | # PEP-345 + L("sys.platform") | # PEP-345 + L("platform.version") | # PEP-345 + L("platform.machine") | # PEP-345 + L("platform.python_implementation") | # PEP-345 + L("python_implementation") | # undocumented setuptools 
legacy + L("extra") +) +ALIASES = { + 'os.name': 'os_name', + 'sys.platform': 'sys_platform', + 'platform.version': 'platform_version', + 'platform.machine': 'platform_machine', + 'platform.python_implementation': 'platform_python_implementation', + 'python_implementation': 'platform_python_implementation' +} +VARIABLE.setParseAction(lambda s, l, t: Variable(ALIASES.get(t[0], t[0]))) + +VERSION_CMP = ( + L("===") | + L("==") | + L(">=") | + L("<=") | + L("!=") | + L("~=") | + L(">") | + L("<") +) + +MARKER_OP = VERSION_CMP | L("not in") | L("in") +MARKER_OP.setParseAction(lambda s, l, t: Op(t[0])) + +MARKER_VALUE = QuotedString("'") | QuotedString('"') +MARKER_VALUE.setParseAction(lambda s, l, t: Value(t[0])) + +BOOLOP = L("and") | L("or") + +MARKER_VAR = VARIABLE | MARKER_VALUE + +MARKER_ITEM = Group(MARKER_VAR + MARKER_OP + MARKER_VAR) +MARKER_ITEM.setParseAction(lambda s, l, t: tuple(t[0])) + +LPAREN = L("(").suppress() +RPAREN = L(")").suppress() + +MARKER_EXPR = Forward() +MARKER_ATOM = MARKER_ITEM | Group(LPAREN + MARKER_EXPR + RPAREN) +MARKER_EXPR << MARKER_ATOM + ZeroOrMore(BOOLOP + MARKER_EXPR) + +MARKER = stringStart + MARKER_EXPR + stringEnd + + +def _coerce_parse_result(results): + if isinstance(results, ParseResults): + return [_coerce_parse_result(i) for i in results] + else: + return results + + +def _format_marker(marker, first=True): + assert isinstance(marker, (list, tuple, string_types)) + + # Sometimes we have a structure like [[...]] which is a single item list + # where the single item is itself it's own list. In that case we want skip + # the rest of this function so that we don't get extraneous () on the + # outside. 
+ if (isinstance(marker, list) and len(marker) == 1 and + isinstance(marker[0], (list, tuple))): + return _format_marker(marker[0]) + + if isinstance(marker, list): + inner = (_format_marker(m, first=False) for m in marker) + if first: + return " ".join(inner) + else: + return "(" + " ".join(inner) + ")" + elif isinstance(marker, tuple): + return " ".join([m.serialize() for m in marker]) + else: + return marker + + +_operators = { + "in": lambda lhs, rhs: lhs in rhs, + "not in": lambda lhs, rhs: lhs not in rhs, + "<": operator.lt, + "<=": operator.le, + "==": operator.eq, + "!=": operator.ne, + ">=": operator.ge, + ">": operator.gt, +} + + +def _eval_op(lhs, op, rhs): + try: + spec = Specifier("".join([op.serialize(), rhs])) + except InvalidSpecifier: + pass + else: + return spec.contains(lhs) + + oper = _operators.get(op.serialize()) + if oper is None: + raise UndefinedComparison( + "Undefined {0!r} on {1!r} and {2!r}.".format(op, lhs, rhs) + ) + + return oper(lhs, rhs) + + +_undefined = object() + + +def _get_env(environment, name): + value = environment.get(name, _undefined) + + if value is _undefined: + raise UndefinedEnvironmentName( + "{0!r} does not exist in evaluation environment.".format(name) + ) + + return value + + +def _evaluate_markers(markers, environment): + groups = [[]] + + for marker in markers: + assert isinstance(marker, (list, tuple, string_types)) + + if isinstance(marker, list): + groups[-1].append(_evaluate_markers(marker, environment)) + elif isinstance(marker, tuple): + lhs, op, rhs = marker + + if isinstance(lhs, Variable): + lhs_value = _get_env(environment, lhs.value) + rhs_value = rhs.value + else: + lhs_value = lhs.value + rhs_value = _get_env(environment, rhs.value) + + groups[-1].append(_eval_op(lhs_value, op, rhs_value)) + else: + assert marker in ["and", "or"] + if marker == "or": + groups.append([]) + + return any(all(item) for item in groups) + + +def format_full_version(info): + version = 
'{0.major}.{0.minor}.{0.micro}'.format(info)
+    kind = info.releaselevel
+    if kind != 'final':
+        version += kind[0] + str(info.serial)
+    return version
+
+
+def default_environment():
+    if hasattr(sys, 'implementation'):
+        iver = format_full_version(sys.implementation.version)
+        implementation_name = sys.implementation.name
+    else:
+        iver = '0'
+        implementation_name = ''
+
+    return {
+        "implementation_name": implementation_name,
+        "implementation_version": iver,
+        "os_name": os.name,
+        "platform_machine": platform.machine(),
+        "platform_release": platform.release(),
+        "platform_system": platform.system(),
+        "platform_version": platform.version(),
+        "python_full_version": platform.python_version(),
+        "platform_python_implementation": platform.python_implementation(),
+        "python_version": platform.python_version()[:3],
+        "sys_platform": sys.platform,
+    }
+
+
+class Marker(object):
+
+    def __init__(self, marker):
+        try:
+            self._markers = _coerce_parse_result(MARKER.parseString(marker))
+        except ParseException as e:
+            err_str = "Invalid marker: {0!r}, parse error at {1!r}".format(
+                marker, marker[e.loc:e.loc + 8])
+            raise InvalidMarker(err_str)
+
+    def __str__(self):
+        return _format_marker(self._markers)
+
+    def __repr__(self):
+        return "<Marker({0!r})>".format(str(self))
+
+    def evaluate(self, environment=None):
+        """Evaluate a marker.
+
+        Return the boolean from evaluating the given marker against the
+        environment. environment is an optional argument to override all or
+        part of the determined environment.
+
+        The environment is determined from the current Python process.
+ """ + current_environment = default_environment() + if environment is not None: + current_environment.update(environment) + + return _evaluate_markers(self._markers, current_environment) diff --git a/lib/python3.7/site-packages/setuptools/_vendor/packaging/requirements.py b/lib/python3.7/site-packages/setuptools/_vendor/packaging/requirements.py new file mode 100644 index 0000000..5b49341 --- /dev/null +++ b/lib/python3.7/site-packages/setuptools/_vendor/packaging/requirements.py @@ -0,0 +1,127 @@ +# This file is dual licensed under the terms of the Apache License, Version +# 2.0, and the BSD License. See the LICENSE file in the root of this repository +# for complete details. +from __future__ import absolute_import, division, print_function + +import string +import re + +from setuptools.extern.pyparsing import stringStart, stringEnd, originalTextFor, ParseException +from setuptools.extern.pyparsing import ZeroOrMore, Word, Optional, Regex, Combine +from setuptools.extern.pyparsing import Literal as L # noqa +from setuptools.extern.six.moves.urllib import parse as urlparse + +from .markers import MARKER_EXPR, Marker +from .specifiers import LegacySpecifier, Specifier, SpecifierSet + + +class InvalidRequirement(ValueError): + """ + An invalid requirement was found, users should refer to PEP 508. 
+ """ + + +ALPHANUM = Word(string.ascii_letters + string.digits) + +LBRACKET = L("[").suppress() +RBRACKET = L("]").suppress() +LPAREN = L("(").suppress() +RPAREN = L(")").suppress() +COMMA = L(",").suppress() +SEMICOLON = L(";").suppress() +AT = L("@").suppress() + +PUNCTUATION = Word("-_.") +IDENTIFIER_END = ALPHANUM | (ZeroOrMore(PUNCTUATION) + ALPHANUM) +IDENTIFIER = Combine(ALPHANUM + ZeroOrMore(IDENTIFIER_END)) + +NAME = IDENTIFIER("name") +EXTRA = IDENTIFIER + +URI = Regex(r'[^ ]+')("url") +URL = (AT + URI) + +EXTRAS_LIST = EXTRA + ZeroOrMore(COMMA + EXTRA) +EXTRAS = (LBRACKET + Optional(EXTRAS_LIST) + RBRACKET)("extras") + +VERSION_PEP440 = Regex(Specifier._regex_str, re.VERBOSE | re.IGNORECASE) +VERSION_LEGACY = Regex(LegacySpecifier._regex_str, re.VERBOSE | re.IGNORECASE) + +VERSION_ONE = VERSION_PEP440 ^ VERSION_LEGACY +VERSION_MANY = Combine(VERSION_ONE + ZeroOrMore(COMMA + VERSION_ONE), + joinString=",", adjacent=False)("_raw_spec") +_VERSION_SPEC = Optional(((LPAREN + VERSION_MANY + RPAREN) | VERSION_MANY)) +_VERSION_SPEC.setParseAction(lambda s, l, t: t._raw_spec or '') + +VERSION_SPEC = originalTextFor(_VERSION_SPEC)("specifier") +VERSION_SPEC.setParseAction(lambda s, l, t: t[1]) + +MARKER_EXPR = originalTextFor(MARKER_EXPR())("marker") +MARKER_EXPR.setParseAction( + lambda s, l, t: Marker(s[t._original_start:t._original_end]) +) +MARKER_SEPERATOR = SEMICOLON +MARKER = MARKER_SEPERATOR + MARKER_EXPR + +VERSION_AND_MARKER = VERSION_SPEC + Optional(MARKER) +URL_AND_MARKER = URL + Optional(MARKER) + +NAMED_REQUIREMENT = \ + NAME + Optional(EXTRAS) + (URL_AND_MARKER | VERSION_AND_MARKER) + +REQUIREMENT = stringStart + NAMED_REQUIREMENT + stringEnd + + +class Requirement(object): + """Parse a requirement. + + Parse a given requirement string into its parts, such as name, specifier, + URL, and extras. Raises InvalidRequirement on a badly-formed requirement + string. + """ + + # TODO: Can we test whether something is contained within a requirement? 
+    #       If so how do we do that? Do we need to test against the _name_ of
+    #       the thing as well as the version? What about the markers?
+    # TODO: Can we normalize the name and extra name?
+
+    def __init__(self, requirement_string):
+        try:
+            req = REQUIREMENT.parseString(requirement_string)
+        except ParseException as e:
+            raise InvalidRequirement(
+                "Invalid requirement, parse error at \"{0!r}\"".format(
+                    requirement_string[e.loc:e.loc + 8]))
+
+        self.name = req.name
+        if req.url:
+            parsed_url = urlparse.urlparse(req.url)
+            if not (parsed_url.scheme and parsed_url.netloc) or (
+                    not parsed_url.scheme and not parsed_url.netloc):
+                raise InvalidRequirement("Invalid URL given")
+            self.url = req.url
+        else:
+            self.url = None
+        self.extras = set(req.extras.asList() if req.extras else [])
+        self.specifier = SpecifierSet(req.specifier)
+        self.marker = req.marker if req.marker else None
+
+    def __str__(self):
+        parts = [self.name]
+
+        if self.extras:
+            parts.append("[{0}]".format(",".join(sorted(self.extras))))
+
+        if self.specifier:
+            parts.append(str(self.specifier))
+
+        if self.url:
+            parts.append("@ {0}".format(self.url))
+
+        if self.marker:
+            parts.append("; {0}".format(self.marker))
+
+        return "".join(parts)
+
+    def __repr__(self):
+        return "<Requirement({0!r})>".format(str(self))
diff --git a/lib/python3.7/site-packages/setuptools/_vendor/packaging/specifiers.py b/lib/python3.7/site-packages/setuptools/_vendor/packaging/specifiers.py
new file mode 100644
index 0000000..7f5a76c
--- /dev/null
+++ b/lib/python3.7/site-packages/setuptools/_vendor/packaging/specifiers.py
@@ -0,0 +1,774 @@
+# This file is dual licensed under the terms of the Apache License, Version
+# 2.0, and the BSD License. See the LICENSE file in the root of this repository
+# for complete details.
+from __future__ import absolute_import, division, print_function + +import abc +import functools +import itertools +import re + +from ._compat import string_types, with_metaclass +from .version import Version, LegacyVersion, parse + + +class InvalidSpecifier(ValueError): + """ + An invalid specifier was found, users should refer to PEP 440. + """ + + +class BaseSpecifier(with_metaclass(abc.ABCMeta, object)): + + @abc.abstractmethod + def __str__(self): + """ + Returns the str representation of this Specifier like object. This + should be representative of the Specifier itself. + """ + + @abc.abstractmethod + def __hash__(self): + """ + Returns a hash value for this Specifier like object. + """ + + @abc.abstractmethod + def __eq__(self, other): + """ + Returns a boolean representing whether or not the two Specifier like + objects are equal. + """ + + @abc.abstractmethod + def __ne__(self, other): + """ + Returns a boolean representing whether or not the two Specifier like + objects are not equal. + """ + + @abc.abstractproperty + def prereleases(self): + """ + Returns whether or not pre-releases as a whole are allowed by this + specifier. + """ + + @prereleases.setter + def prereleases(self, value): + """ + Sets whether or not pre-releases as a whole are allowed by this + specifier. + """ + + @abc.abstractmethod + def contains(self, item, prereleases=None): + """ + Determines if the given item is contained within this specifier. + """ + + @abc.abstractmethod + def filter(self, iterable, prereleases=None): + """ + Takes an iterable of items and filters them so that only items which + are contained within this specifier are allowed in it. 
+ """ + + +class _IndividualSpecifier(BaseSpecifier): + + _operators = {} + + def __init__(self, spec="", prereleases=None): + match = self._regex.search(spec) + if not match: + raise InvalidSpecifier("Invalid specifier: '{0}'".format(spec)) + + self._spec = ( + match.group("operator").strip(), + match.group("version").strip(), + ) + + # Store whether or not this Specifier should accept prereleases + self._prereleases = prereleases + + def __repr__(self): + pre = ( + ", prereleases={0!r}".format(self.prereleases) + if self._prereleases is not None + else "" + ) + + return "<{0}({1!r}{2})>".format( + self.__class__.__name__, + str(self), + pre, + ) + + def __str__(self): + return "{0}{1}".format(*self._spec) + + def __hash__(self): + return hash(self._spec) + + def __eq__(self, other): + if isinstance(other, string_types): + try: + other = self.__class__(other) + except InvalidSpecifier: + return NotImplemented + elif not isinstance(other, self.__class__): + return NotImplemented + + return self._spec == other._spec + + def __ne__(self, other): + if isinstance(other, string_types): + try: + other = self.__class__(other) + except InvalidSpecifier: + return NotImplemented + elif not isinstance(other, self.__class__): + return NotImplemented + + return self._spec != other._spec + + def _get_operator(self, op): + return getattr(self, "_compare_{0}".format(self._operators[op])) + + def _coerce_version(self, version): + if not isinstance(version, (LegacyVersion, Version)): + version = parse(version) + return version + + @property + def operator(self): + return self._spec[0] + + @property + def version(self): + return self._spec[1] + + @property + def prereleases(self): + return self._prereleases + + @prereleases.setter + def prereleases(self, value): + self._prereleases = value + + def __contains__(self, item): + return self.contains(item) + + def contains(self, item, prereleases=None): + # Determine if prereleases are to be allowed or not. 
+ if prereleases is None: + prereleases = self.prereleases + + # Normalize item to a Version or LegacyVersion, this allows us to have + # a shortcut for ``"2.0" in Specifier(">=2") + item = self._coerce_version(item) + + # Determine if we should be supporting prereleases in this specifier + # or not, if we do not support prereleases than we can short circuit + # logic if this version is a prereleases. + if item.is_prerelease and not prereleases: + return False + + # Actually do the comparison to determine if this item is contained + # within this Specifier or not. + return self._get_operator(self.operator)(item, self.version) + + def filter(self, iterable, prereleases=None): + yielded = False + found_prereleases = [] + + kw = {"prereleases": prereleases if prereleases is not None else True} + + # Attempt to iterate over all the values in the iterable and if any of + # them match, yield them. + for version in iterable: + parsed_version = self._coerce_version(version) + + if self.contains(parsed_version, **kw): + # If our version is a prerelease, and we were not set to allow + # prereleases, then we'll store it for later incase nothing + # else matches this specifier. + if (parsed_version.is_prerelease and not + (prereleases or self.prereleases)): + found_prereleases.append(version) + # Either this is not a prerelease, or we should have been + # accepting prereleases from the begining. + else: + yielded = True + yield version + + # Now that we've iterated over everything, determine if we've yielded + # any values, and if we have not and we have any prereleases stored up + # then we will go ahead and yield the prereleases. 
+        if not yielded and found_prereleases:
+            for version in found_prereleases:
+                yield version
+
+
+class LegacySpecifier(_IndividualSpecifier):
+
+    _regex_str = (
+        r"""
+        (?P<operator>(==|!=|<=|>=|<|>))
+        \s*
+        (?P<version>
+            [^,;\s)]* # Since this is a "legacy" specifier, and the version
+                      # string can be just about anything, we match everything
+                      # except for whitespace, a semi-colon for marker support,
+                      # a closing paren since versions can be enclosed in
+                      # them, and a comma since it's a version separator.
+        )
+        """
+    )
+
+    _regex = re.compile(
+        r"^\s*" + _regex_str + r"\s*$", re.VERBOSE | re.IGNORECASE)
+
+    _operators = {
+        "==": "equal",
+        "!=": "not_equal",
+        "<=": "less_than_equal",
+        ">=": "greater_than_equal",
+        "<": "less_than",
+        ">": "greater_than",
+    }
+
+    def _coerce_version(self, version):
+        if not isinstance(version, LegacyVersion):
+            version = LegacyVersion(str(version))
+        return version
+
+    def _compare_equal(self, prospective, spec):
+        return prospective == self._coerce_version(spec)
+
+    def _compare_not_equal(self, prospective, spec):
+        return prospective != self._coerce_version(spec)
+
+    def _compare_less_than_equal(self, prospective, spec):
+        return prospective <= self._coerce_version(spec)
+
+    def _compare_greater_than_equal(self, prospective, spec):
+        return prospective >= self._coerce_version(spec)
+
+    def _compare_less_than(self, prospective, spec):
+        return prospective < self._coerce_version(spec)
+
+    def _compare_greater_than(self, prospective, spec):
+        return prospective > self._coerce_version(spec)
+
+
+def _require_version_compare(fn):
+    @functools.wraps(fn)
+    def wrapped(self, prospective, spec):
+        if not isinstance(prospective, Version):
+            return False
+        return fn(self, prospective, spec)
+    return wrapped
+
+
+class Specifier(_IndividualSpecifier):
+
+    _regex_str = (
+        r"""
+        (?P<operator>(~=|==|!=|<=|>=|<|>|===))
+        (?P<version>
+            (?:
+                # The identity operators allow for an escape hatch that will
+                # do an exact string match of the version you wish to install.
+                # This will not be parsed by PEP 440 and we cannot determine
+                # any semantic meaning from it. This operator is discouraged
+                # but included entirely as an escape hatch.
+                (?<====)  # Only match for the identity operator
+                \s*
+                [^\s]*    # We just match everything, except for whitespace
+                          # since we are only testing for strict identity.
+            )
+            |
+            (?:
+                # The (non)equality operators allow for wild card and local
+                # versions to be specified so we have to define these two
+                # operators separately to enable that.
+                (?<===|!=)            # Only match for equals and not equals
+
+                \s*
+                v?
+                (?:[0-9]+!)?          # epoch
+                [0-9]+(?:\.[0-9]+)*   # release
+                (?:                   # pre release
+                    [-_\.]?
+                    (a|b|c|rc|alpha|beta|pre|preview)
+                    [-_\.]?
+                    [0-9]*
+                )?
+                (?:                                   # post release
+                    (?:-[0-9]+)|(?:[-_\.]?(post|rev|r)[-_\.]?[0-9]*)
+                )?
+
+                # You cannot use a wild card and a dev or local version
+                # together so group them with a | and make them optional.
+                (?:
+                    (?:[-_\.]?dev[-_\.]?[0-9]*)?         # dev release
+                    (?:\+[a-z0-9]+(?:[-_\.][a-z0-9]+)*)? # local
+                    |
+                    \.\*  # Wild card syntax of .*
+                )?
+            )
+            |
+            (?:
+                # The compatible operator requires at least two digits in the
+                # release segment.
+                (?<=~=)               # Only match for the compatible operator
+
+                \s*
+                v?
+                (?:[0-9]+!)?          # epoch
+                [0-9]+(?:\.[0-9]+)+   # release  (We have a + instead of a *)
+                (?:                   # pre release
+                    [-_\.]?
+                    (a|b|c|rc|alpha|beta|pre|preview)
+                    [-_\.]?
+                    [0-9]*
+                )?
+                (?:                                   # post release
+                    (?:-[0-9]+)|(?:[-_\.]?(post|rev|r)[-_\.]?[0-9]*)
+                )?
+                (?:[-_\.]?dev[-_\.]?[0-9]*)?          # dev release
+            )
+            |
+            (?:
+                # All other operators only allow a sub set of what the
+                # (non)equality operators do. Specifically they do not allow
+                # local versions to be specified nor do they allow the prefix
+                # matching wild cards.
+                (?<!==|!=|~=)         # We have special cases for these
+                                      # operators so we want to make sure they
+                                      # don't match here.
+
+                \s*
+                v?
+                (?:[0-9]+!)?          # epoch
+                [0-9]+(?:\.[0-9]+)*   # release
+                (?:                   # pre release
+                    [-_\.]?
+                    (a|b|c|rc|alpha|beta|pre|preview)
+                    [-_\.]?
+                    [0-9]*
+                )?
+                (?:                                   # post release
+                    (?:-[0-9]+)|(?:[-_\.]?(post|rev|r)[-_\.]?[0-9]*)
+                )?
+                (?:[-_\.]?dev[-_\.]?[0-9]*)?          # dev release
+            )
+        )
+        """
+    )
+
+    _regex = re.compile(
+        r"^\s*" + _regex_str + r"\s*$", re.VERBOSE | re.IGNORECASE)
+
+    _operators = {
+        "~=": "compatible",
+        "==": "equal",
+        "!=": "not_equal",
+        "<=": "less_than_equal",
+        ">=": "greater_than_equal",
+        "<": "less_than",
+        ">": "greater_than",
+        "===": "arbitrary",
+    }
+
+    @_require_version_compare
+    def _compare_compatible(self, prospective, spec):
+        # Compatible releases have an equivalent combination of >= and ==.
That + # is that ~=2.2 is equivalent to >=2.2,==2.*. This allows us to + # implement this in terms of the other specifiers instead of + # implementing it ourselves. The only thing we need to do is construct + # the other specifiers. + + # We want everything but the last item in the version, but we want to + # ignore post and dev releases and we want to treat the pre-release as + # it's own separate segment. + prefix = ".".join( + list( + itertools.takewhile( + lambda x: (not x.startswith("post") and not + x.startswith("dev")), + _version_split(spec), + ) + )[:-1] + ) + + # Add the prefix notation to the end of our string + prefix += ".*" + + return (self._get_operator(">=")(prospective, spec) and + self._get_operator("==")(prospective, prefix)) + + @_require_version_compare + def _compare_equal(self, prospective, spec): + # We need special logic to handle prefix matching + if spec.endswith(".*"): + # In the case of prefix matching we want to ignore local segment. + prospective = Version(prospective.public) + # Split the spec out by dots, and pretend that there is an implicit + # dot in between a release segment and a pre-release segment. + spec = _version_split(spec[:-2]) # Remove the trailing .* + + # Split the prospective version out by dots, and pretend that there + # is an implicit dot in between a release segment and a pre-release + # segment. + prospective = _version_split(str(prospective)) + + # Shorten the prospective version to be the same length as the spec + # so that we can determine if the specifier is a prefix of the + # prospective version or not. + prospective = prospective[:len(spec)] + + # Pad out our two sides with zeros so that they both equal the same + # length. + spec, prospective = _pad_version(spec, prospective) + else: + # Convert our spec string into a Version + spec = Version(spec) + + # If the specifier does not have a local segment, then we want to + # act as if the prospective version also does not have a local + # segment. 
+ if not spec.local: + prospective = Version(prospective.public) + + return prospective == spec + + @_require_version_compare + def _compare_not_equal(self, prospective, spec): + return not self._compare_equal(prospective, spec) + + @_require_version_compare + def _compare_less_than_equal(self, prospective, spec): + return prospective <= Version(spec) + + @_require_version_compare + def _compare_greater_than_equal(self, prospective, spec): + return prospective >= Version(spec) + + @_require_version_compare + def _compare_less_than(self, prospective, spec): + # Convert our spec to a Version instance, since we'll want to work with + # it as a version. + spec = Version(spec) + + # Check to see if the prospective version is less than the spec + # version. If it's not we can short circuit and just return False now + # instead of doing extra unneeded work. + if not prospective < spec: + return False + + # This special case is here so that, unless the specifier itself + # includes is a pre-release version, that we do not accept pre-release + # versions for the version mentioned in the specifier (e.g. <3.1 should + # not match 3.1.dev0, but should match 3.0.dev0). + if not spec.is_prerelease and prospective.is_prerelease: + if Version(prospective.base_version) == Version(spec.base_version): + return False + + # If we've gotten to here, it means that prospective version is both + # less than the spec version *and* it's not a pre-release of the same + # version in the spec. + return True + + @_require_version_compare + def _compare_greater_than(self, prospective, spec): + # Convert our spec to a Version instance, since we'll want to work with + # it as a version. + spec = Version(spec) + + # Check to see if the prospective version is greater than the spec + # version. If it's not we can short circuit and just return False now + # instead of doing extra unneeded work. 
+        if not prospective > spec:
+            return False
+
+        # This special case is here so that, unless the specifier itself
+        # includes a post-release version, we do not accept post-release
+        # versions for the version mentioned in the specifier
+        # (e.g. >3.1 should not match 3.0.post0, but should match 3.2.post0).
+        if not spec.is_postrelease and prospective.is_postrelease:
+            if Version(prospective.base_version) == Version(spec.base_version):
+                return False
+
+        # Ensure that we do not allow a local version of the version mentioned
+        # in the specifier, which is technically greater than, to match.
+        if prospective.local is not None:
+            if Version(prospective.base_version) == Version(spec.base_version):
+                return False
+
+        # If we've gotten to here, it means that the prospective version is
+        # both greater than the spec version *and* not a post-release or local
+        # version of the same version in the spec.
+        return True
+
+    def _compare_arbitrary(self, prospective, spec):
+        return str(prospective).lower() == str(spec).lower()
+
+    @property
+    def prereleases(self):
+        # If there is an explicit prereleases set for this, then we'll just
+        # blindly use that.
+        if self._prereleases is not None:
+            return self._prereleases
+
+        # Look at all of our specifiers and determine if they are inclusive
+        # operators, and if they are, whether they include an explicit
+        # prerelease.
+        operator, version = self._spec
+        if operator in ["==", ">=", "<=", "~=", "==="]:
+            # The == specifier can include a trailing .*; if it does we
+            # want to remove it before parsing.
+            if operator == "==" and version.endswith(".*"):
+                version = version[:-2]
+
+            # Parse the version, and if it is a pre-release then this
+            # specifier allows pre-releases.
+ if parse(version).is_prerelease: + return True + + return False + + @prereleases.setter + def prereleases(self, value): + self._prereleases = value + + +_prefix_regex = re.compile(r"^([0-9]+)((?:a|b|c|rc)[0-9]+)$") + + +def _version_split(version): + result = [] + for item in version.split("."): + match = _prefix_regex.search(item) + if match: + result.extend(match.groups()) + else: + result.append(item) + return result + + +def _pad_version(left, right): + left_split, right_split = [], [] + + # Get the release segment of our versions + left_split.append(list(itertools.takewhile(lambda x: x.isdigit(), left))) + right_split.append(list(itertools.takewhile(lambda x: x.isdigit(), right))) + + # Get the rest of our versions + left_split.append(left[len(left_split[0]):]) + right_split.append(right[len(right_split[0]):]) + + # Insert our padding + left_split.insert( + 1, + ["0"] * max(0, len(right_split[0]) - len(left_split[0])), + ) + right_split.insert( + 1, + ["0"] * max(0, len(left_split[0]) - len(right_split[0])), + ) + + return ( + list(itertools.chain(*left_split)), + list(itertools.chain(*right_split)), + ) + + +class SpecifierSet(BaseSpecifier): + + def __init__(self, specifiers="", prereleases=None): + # Split on , to break each indidivual specifier into it's own item, and + # strip each item to remove leading/trailing whitespace. + specifiers = [s.strip() for s in specifiers.split(",") if s.strip()] + + # Parsed each individual specifier, attempting first to make it a + # Specifier and falling back to a LegacySpecifier. + parsed = set() + for specifier in specifiers: + try: + parsed.add(Specifier(specifier)) + except InvalidSpecifier: + parsed.add(LegacySpecifier(specifier)) + + # Turn our parsed specifiers into a frozen set and save them for later. + self._specs = frozenset(parsed) + + # Store our prereleases value so we can use it later to determine if + # we accept prereleases or not. 
+        self._prereleases = prereleases
+
+    def __repr__(self):
+        pre = (
+            ", prereleases={0!r}".format(self.prereleases)
+            if self._prereleases is not None
+            else ""
+        )
+
+        return "<SpecifierSet({0!r}{1})>".format(str(self), pre)
+
+    def __str__(self):
+        return ",".join(sorted(str(s) for s in self._specs))
+
+    def __hash__(self):
+        return hash(self._specs)
+
+    def __and__(self, other):
+        if isinstance(other, string_types):
+            other = SpecifierSet(other)
+        elif not isinstance(other, SpecifierSet):
+            return NotImplemented
+
+        specifier = SpecifierSet()
+        specifier._specs = frozenset(self._specs | other._specs)
+
+        if self._prereleases is None and other._prereleases is not None:
+            specifier._prereleases = other._prereleases
+        elif self._prereleases is not None and other._prereleases is None:
+            specifier._prereleases = self._prereleases
+        elif self._prereleases == other._prereleases:
+            specifier._prereleases = self._prereleases
+        else:
+            raise ValueError(
+                "Cannot combine SpecifierSets with True and False prerelease "
+                "overrides."
+            )
+
+        return specifier
+
+    def __eq__(self, other):
+        if isinstance(other, string_types):
+            other = SpecifierSet(other)
+        elif isinstance(other, _IndividualSpecifier):
+            other = SpecifierSet(str(other))
+        elif not isinstance(other, SpecifierSet):
+            return NotImplemented
+
+        return self._specs == other._specs
+
+    def __ne__(self, other):
+        if isinstance(other, string_types):
+            other = SpecifierSet(other)
+        elif isinstance(other, _IndividualSpecifier):
+            other = SpecifierSet(str(other))
+        elif not isinstance(other, SpecifierSet):
+            return NotImplemented
+
+        return self._specs != other._specs
+
+    def __len__(self):
+        return len(self._specs)
+
+    def __iter__(self):
+        return iter(self._specs)
+
+    @property
+    def prereleases(self):
+        # If we have been given an explicit prerelease modifier, then we'll
+        # pass that through here.
+ if self._prereleases is not None: + return self._prereleases + + # If we don't have any specifiers, and we don't have a forced value, + # then we'll just return None since we don't know if this should have + # pre-releases or not. + if not self._specs: + return None + + # Otherwise we'll see if any of the given specifiers accept + # prereleases, if any of them do we'll return True, otherwise False. + return any(s.prereleases for s in self._specs) + + @prereleases.setter + def prereleases(self, value): + self._prereleases = value + + def __contains__(self, item): + return self.contains(item) + + def contains(self, item, prereleases=None): + # Ensure that our item is a Version or LegacyVersion instance. + if not isinstance(item, (LegacyVersion, Version)): + item = parse(item) + + # Determine if we're forcing a prerelease or not, if we're not forcing + # one for this particular filter call, then we'll use whatever the + # SpecifierSet thinks for whether or not we should support prereleases. + if prereleases is None: + prereleases = self.prereleases + + # We can determine if we're going to allow pre-releases by looking to + # see if any of the underlying items supports them. If none of them do + # and this item is a pre-release then we do not allow it and we can + # short circuit that here. + # Note: This means that 1.0.dev1 would not be contained in something + # like >=1.0.devabc however it would be in >=1.0.debabc,>0.0.dev0 + if not prereleases and item.is_prerelease: + return False + + # We simply dispatch to the underlying specs here to make sure that the + # given version is contained within all of them. + # Note: This use of all() here means that an empty set of specifiers + # will always return True, this is an explicit design decision. 
+ return all( + s.contains(item, prereleases=prereleases) + for s in self._specs + ) + + def filter(self, iterable, prereleases=None): + # Determine if we're forcing a prerelease or not, if we're not forcing + # one for this particular filter call, then we'll use whatever the + # SpecifierSet thinks for whether or not we should support prereleases. + if prereleases is None: + prereleases = self.prereleases + + # If we have any specifiers, then we want to wrap our iterable in the + # filter method for each one, this will act as a logical AND amongst + # each specifier. + if self._specs: + for spec in self._specs: + iterable = spec.filter(iterable, prereleases=bool(prereleases)) + return iterable + # If we do not have any specifiers, then we need to have a rough filter + # which will filter out any pre-releases, unless there are no final + # releases, and which will filter out LegacyVersion in general. + else: + filtered = [] + found_prereleases = [] + + for item in iterable: + # Ensure that we some kind of Version class for this item. 
+ if not isinstance(item, (LegacyVersion, Version)): + parsed_version = parse(item) + else: + parsed_version = item + + # Filter out any item which is parsed as a LegacyVersion + if isinstance(parsed_version, LegacyVersion): + continue + + # Store any item which is a pre-release for later unless we've + # already found a final version or we are accepting prereleases + if parsed_version.is_prerelease and not prereleases: + if not filtered: + found_prereleases.append(item) + else: + filtered.append(item) + + # If we've found no items except for pre-releases, then we'll go + # ahead and use the pre-releases + if not filtered and found_prereleases and prereleases is None: + return found_prereleases + + return filtered diff --git a/lib/python3.7/site-packages/setuptools/_vendor/packaging/utils.py b/lib/python3.7/site-packages/setuptools/_vendor/packaging/utils.py new file mode 100644 index 0000000..942387c --- /dev/null +++ b/lib/python3.7/site-packages/setuptools/_vendor/packaging/utils.py @@ -0,0 +1,14 @@ +# This file is dual licensed under the terms of the Apache License, Version +# 2.0, and the BSD License. See the LICENSE file in the root of this repository +# for complete details. +from __future__ import absolute_import, division, print_function + +import re + + +_canonicalize_regex = re.compile(r"[-_.]+") + + +def canonicalize_name(name): + # This is taken from PEP 503. + return _canonicalize_regex.sub("-", name).lower() diff --git a/lib/python3.7/site-packages/setuptools/_vendor/packaging/version.py b/lib/python3.7/site-packages/setuptools/_vendor/packaging/version.py new file mode 100644 index 0000000..83b5ee8 --- /dev/null +++ b/lib/python3.7/site-packages/setuptools/_vendor/packaging/version.py @@ -0,0 +1,393 @@ +# This file is dual licensed under the terms of the Apache License, Version +# 2.0, and the BSD License. See the LICENSE file in the root of this repository +# for complete details. 
+from __future__ import absolute_import, division, print_function
+
+import collections
+import itertools
+import re
+
+from ._structures import Infinity
+
+
+__all__ = [
+    "parse", "Version", "LegacyVersion", "InvalidVersion", "VERSION_PATTERN"
+]
+
+
+_Version = collections.namedtuple(
+    "_Version",
+    ["epoch", "release", "dev", "pre", "post", "local"],
+)
+
+
+def parse(version):
+    """
+    Parse the given version string and return either a :class:`Version` object
+    or a :class:`LegacyVersion` object depending on if the given version is
+    a valid PEP 440 version or a legacy version.
+    """
+    try:
+        return Version(version)
+    except InvalidVersion:
+        return LegacyVersion(version)
+
+
+class InvalidVersion(ValueError):
+    """
+    An invalid version was found, users should refer to PEP 440.
+    """
+
+
+class _BaseVersion(object):
+
+    def __hash__(self):
+        return hash(self._key)
+
+    def __lt__(self, other):
+        return self._compare(other, lambda s, o: s < o)
+
+    def __le__(self, other):
+        return self._compare(other, lambda s, o: s <= o)
+
+    def __eq__(self, other):
+        return self._compare(other, lambda s, o: s == o)
+
+    def __ge__(self, other):
+        return self._compare(other, lambda s, o: s >= o)
+
+    def __gt__(self, other):
+        return self._compare(other, lambda s, o: s > o)
+
+    def __ne__(self, other):
+        return self._compare(other, lambda s, o: s != o)
+
+    def _compare(self, other, method):
+        if not isinstance(other, _BaseVersion):
+            return NotImplemented
+
+        return method(self._key, other._key)
+
+
+class LegacyVersion(_BaseVersion):
+
+    def __init__(self, version):
+        self._version = str(version)
+        self._key = _legacy_cmpkey(self._version)
+
+    def __str__(self):
+        return self._version
+
+    def __repr__(self):
+        return "<LegacyVersion({0})>".format(repr(str(self)))
+
+    @property
+    def public(self):
+        return self._version
+
+    @property
+    def base_version(self):
+        return self._version
+
+    @property
+    def local(self):
+        return None
+
+    @property
+    def is_prerelease(self):
+        return False
+
+    @property
+    def is_postrelease(self):
+        return False
+
+
+_legacy_version_component_re = re.compile(
+    r"(\d+ | [a-z]+ | \.| -)", re.VERBOSE,
+)
+
+_legacy_version_replacement_map = {
+    "pre": "c", "preview": "c", "-": "final-", "rc": "c", "dev": "@",
+}
+
+
+def _parse_version_parts(s):
+    for part in _legacy_version_component_re.split(s):
+        part = _legacy_version_replacement_map.get(part, part)
+
+        if not part or part == ".":
+            continue
+
+        if part[:1] in "0123456789":
+            # pad for numeric comparison
+            yield part.zfill(8)
+        else:
+            yield "*" + part
+
+    # ensure that alpha/beta/candidate are before final
+    yield "*final"
+
+
+def _legacy_cmpkey(version):
+    # We hardcode an epoch of -1 here. A PEP 440 version can only have an
+    # epoch greater than or equal to 0. This will effectively put the
+    # LegacyVersion, which uses the de facto standard originally implemented
+    # by setuptools, as before all PEP 440 versions.
+    epoch = -1
+
+    # This scheme is taken from pkg_resources.parse_version of setuptools
+    # prior to its adoption of the packaging library.
+    parts = []
+    for part in _parse_version_parts(version.lower()):
+        if part.startswith("*"):
+            # remove "-" before a prerelease tag
+            if part < "*final":
+                while parts and parts[-1] == "*final-":
+                    parts.pop()
+
+            # remove trailing zeros from each series of numeric parts
+            while parts and parts[-1] == "00000000":
+                parts.pop()
+
+        parts.append(part)
+    parts = tuple(parts)
+
+    return epoch, parts
+
+# Deliberately not anchored to the start and end of the string, to make it
+# easier for 3rd party code to reuse
+VERSION_PATTERN = r"""
+    v?
+    (?:
+        (?:(?P<epoch>[0-9]+)!)?                           # epoch
+        (?P<release>[0-9]+(?:\.[0-9]+)*)                  # release segment
+        (?P<pre>                                          # pre-release
+            [-_\.]?
+            (?P<pre_l>(a|b|c|rc|alpha|beta|pre|preview))
+            [-_\.]?
+            (?P<pre_n>[0-9]+)?
+        )?
+        (?P<post>                                         # post release
+            (?:-(?P<post_n1>[0-9]+))
+            |
+            (?:
+                [-_\.]?
+                (?P<post_l>post|rev|r)
+                [-_\.]?
+                (?P<post_n2>[0-9]+)?
+            )
+        )?
+        (?P<dev>                                          # dev release
+            [-_\.]?
+            (?P<dev_l>dev)
+            [-_\.]?
+            (?P<dev_n>[0-9]+)?
+        )?
+    )
+    (?:\+(?P<local>[a-z0-9]+(?:[-_\.][a-z0-9]+)*))?       # local version
+"""
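
Since the pattern is deliberately unanchored, third-party code is expected to add its own anchors before compiling. A quick sketch of that reuse, assuming an installed `packaging` distribution rather than this vendored copy:

```python
import re

from packaging.version import VERSION_PATTERN

# Anchor the pattern ourselves, exactly as the Version class below does.
version_re = re.compile(
    r"^\s*" + VERSION_PATTERN + r"\s*$",
    re.VERBOSE | re.IGNORECASE,
)

match = version_re.match("1!2.0.post3.dev4+ubuntu.1")
assert match.group("epoch") == "1"
assert match.group("release") == "2.0"
assert match.group("post_n2") == "3"
assert match.group("dev_n") == "4"
assert match.group("local") == "ubuntu.1"
```

The named groups map directly onto the fields of the `_Version` namedtuple that `Version.__init__` fills in below.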
+
+
+class Version(_BaseVersion):
+
+    _regex = re.compile(
+        r"^\s*" + VERSION_PATTERN + r"\s*$",
+        re.VERBOSE | re.IGNORECASE,
+    )
+
+    def __init__(self, version):
+        # Validate the version and parse it into pieces
+        match = self._regex.search(version)
+        if not match:
+            raise InvalidVersion("Invalid version: '{0}'".format(version))
+
+        # Store the parsed out pieces of the version
+        self._version = _Version(
+            epoch=int(match.group("epoch")) if match.group("epoch") else 0,
+            release=tuple(int(i) for i in match.group("release").split(".")),
+            pre=_parse_letter_version(
+                match.group("pre_l"),
+                match.group("pre_n"),
+            ),
+            post=_parse_letter_version(
+                match.group("post_l"),
+                match.group("post_n1") or match.group("post_n2"),
+            ),
+            dev=_parse_letter_version(
+                match.group("dev_l"),
+                match.group("dev_n"),
+            ),
+            local=_parse_local_version(match.group("local")),
+        )
+
+        # Generate a key which will be used for sorting
+        self._key = _cmpkey(
+            self._version.epoch,
+            self._version.release,
+            self._version.pre,
+            self._version.post,
+            self._version.dev,
+            self._version.local,
+        )
+
+    def __repr__(self):
+        return "<Version({0})>".format(repr(str(self)))
+
+    def __str__(self):
+        parts = []
+
+        # Epoch
+        if self._version.epoch != 0:
+            parts.append("{0}!".format(self._version.epoch))
+
+        # Release segment
+        parts.append(".".join(str(x) for x in self._version.release))
+
+        # Pre-release
+        if self._version.pre is not None:
+            parts.append("".join(str(x) for x in self._version.pre))
+
+        # Post-release
+        if self._version.post is not None:
+            parts.append(".post{0}".format(self._version.post[1]))
+
+        # Development release
+        if self._version.dev is not None:
+            parts.append(".dev{0}".format(self._version.dev[1]))
+
+        # Local version segment
+        if self._version.local is not None:
+            parts.append(
+                "+{0}".format(".".join(str(x) for x in self._version.local))
+            )
+
+        return "".join(parts)
+
+    @property
+    def public(self):
+        return str(self).split("+", 1)[0]
+
+    @property
+    def base_version(self):
+        parts = []
+
+        # Epoch
+        if self._version.epoch != 0:
+            parts.append("{0}!".format(self._version.epoch))
+
+        # Release segment
+        parts.append(".".join(str(x) for x in self._version.release))
+
+        return "".join(parts)
+
+    @property
+    def local(self):
+        version_string = str(self)
+        if "+" in version_string:
+            return version_string.split("+", 1)[1]
+
+    @property
+    def is_prerelease(self):
+        return bool(self._version.dev or self._version.pre)
+
+    @property
+    def is_postrelease(self):
+        return bool(self._version.post)
+
+
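
The `public`, `base_version`, and `local` properties above slice the normalized string in predictable ways. A brief illustration against an installed `packaging` (the version string is an arbitrary example):

```python
from packaging.version import Version

v = Version("1.0rc1+local.1")

assert str(v) == "1.0rc1+local.1"  # already in normalized form
assert v.public == "1.0rc1"        # everything before the "+"
assert v.base_version == "1.0"     # epoch and release segment only
assert v.local == "local.1"
assert v.is_prerelease and not v.is_postrelease
```
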
+def _parse_letter_version(letter, number):
+    if letter:
+        # We consider there to be an implicit 0 in a pre-release if there is
+        # not a numeral associated with it.
+        if number is None:
+            number = 0
+
+        # We normalize any letters to their lower case form
+        letter = letter.lower()
+
+        # We consider some words to be alternate spellings of other words and
+        # in those cases we want to normalize the spellings to our preferred
+        # spelling.
+        if letter == "alpha":
+            letter = "a"
+        elif letter == "beta":
+            letter = "b"
+        elif letter in ["c", "pre", "preview"]:
+            letter = "rc"
+        elif letter in ["rev", "r"]:
+            letter = "post"
+
+        return letter, int(number)
+    if not letter and number:
+        # We assume if we are given a number, but we are not given a letter
+        # then this is using the implicit post release syntax (e.g. 1.0-1)
+        letter = "post"
+
+        return letter, int(number)
+
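
`_parse_letter_version` is where the alternate spellings collapse to the canonical ones; the effect shows up in `Version` normalization. A sketch against an installed `packaging`:

```python
from packaging.version import Version

# "alpha" -> "a", "c"/"pre"/"preview" -> "rc", "rev"/"r" -> "post",
# and a bare trailing "-N" becomes an implicit post release.
assert str(Version("1.0alpha1")) == "1.0a1"
assert str(Version("1.0pre2")) == "1.0rc2"
assert str(Version("1.0rev3")) == "1.0.post3"
assert str(Version("1.0-4")) == "1.0.post4"
```
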
+
+_local_version_seperators = re.compile(r"[\._-]")
+
+
+def _parse_local_version(local):
+    """
+    Takes a string like abc.1.twelve and turns it into ("abc", 1, "twelve").
+    """
+    if local is not None:
+        return tuple(
+            part.lower() if not part.isdigit() else int(part)
+            for part in _local_version_seperators.split(local)
+        )
+
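
Because numeric parts are converted to `int` and every separator is normalized to `.`, local segments compare the way PEP 440 requires. For example (a sketch, using an installed `packaging`):

```python
from packaging.version import Version

# Separators are normalized to "." and numeric parts compare numerically.
assert str(Version("1.0+foo_bar-2")) == "1.0+foo.bar.2"
assert Version("1.0+2") < Version("1.0+10")   # numeric: 2 < 10
assert Version("1.0+abc") < Version("1.0+2")  # alphanumeric before numeric
```
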
+
+def _cmpkey(epoch, release, pre, post, dev, local):
+    # When we compare a release version, we want to compare it with all of the
+    # trailing zeros removed. So we'll use a reverse the list, drop all the now
+    # leading zeros until we come to something non zero, then take the rest
+    # re-reverse it back into the correct order and make it a tuple and use
+    # that for our sorting key.
+    release = tuple(
+        reversed(list(
+            itertools.dropwhile(
+                lambda x: x == 0,
+                reversed(release),
+            )
+        ))
+    )
+
+    # We need to "trick" the sorting algorithm to put 1.0.dev0 before 1.0a0.
+    # We'll do this by abusing the pre segment, but we _only_ want to do this
+    # if there is not a pre or a post segment. If we have one of those then
+    # the normal sorting rules will handle this case correctly.
+    if pre is None and post is None and dev is not None:
+        pre = -Infinity
+    # Versions without a pre-release (except as noted above) should sort after
+    # those with one.
+    elif pre is None:
+        pre = Infinity
+
+    # Versions without a post segment should sort before those with one.
+    if post is None:
+        post = -Infinity
+
+    # Versions without a development segment should sort after those with one.
+    if dev is None:
+        dev = Infinity
+
+    if local is None:
+        # Versions without a local segment should sort before those with one.
+        local = -Infinity
+    else:
+        # Versions with a local segment need that segment parsed to implement
+        # the sorting rules in PEP440.
+        # - Alpha numeric segments sort before numeric segments
+        # - Alpha numeric segments sort lexicographically
+        # - Numeric segments sort numerically
+        # - Shorter versions sort before longer versions when the prefixes
+        #   match exactly
+        local = tuple(
+            (i, "") if isinstance(i, int) else (-Infinity, i)
+            for i in local
+        )
+
+    return epoch, release, pre, post, dev, local
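
Put together, `_cmpkey` yields the full PEP 440 total ordering within a release number: dev releases first, then pre-releases, the final release, local variants, and post releases. A sketch:

```python
from packaging.version import Version

versions = ["1.0.post1", "1.0", "1.0a1", "1.0.dev0", "1.0+local"]
assert sorted(versions, key=Version) == [
    "1.0.dev0", "1.0a1", "1.0", "1.0+local", "1.0.post1",
]
```
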
diff --git a/lib/python3.7/site-packages/setuptools/_vendor/pyparsing.py b/lib/python3.7/site-packages/setuptools/_vendor/pyparsing.py
new file mode 100644
index 0000000..cf75e1e
--- /dev/null
+++ b/lib/python3.7/site-packages/setuptools/_vendor/pyparsing.py
@@ -0,0 +1,5742 @@
+# module pyparsing.py
+#
+# Copyright (c) 2003-2018  Paul T. McGuire
+#
+# Permission is hereby granted, free of charge, to any person obtaining
+# a copy of this software and associated documentation files (the
+# "Software"), to deal in the Software without restriction, including
+# without limitation the rights to use, copy, modify, merge, publish,
+# distribute, sublicense, and/or sell copies of the Software, and to
+# permit persons to whom the Software is furnished to do so, subject to
+# the following conditions:
+#
+# The above copyright notice and this permission notice shall be
+# included in all copies or substantial portions of the Software.
+#
+# THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND,
+# EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF
+# MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT.
+# IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY
+# CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT,
+# TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE
+# SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
+#
+
+__doc__ = \
+"""
+pyparsing module - Classes and methods to define and execute parsing grammars
+=============================================================================
+
+The pyparsing module is an alternative approach to creating and executing simple grammars,
+vs. the traditional lex/yacc approach, or the use of regular expressions.  With pyparsing, you
+don't need to learn a new syntax for defining grammars or matching expressions - the parsing module
+provides a library of classes that you use to construct the grammar directly in Python.
+
+Here is a program to parse "Hello, World!" (or any greeting of the form 
+C{"<salutation>, <addressee>!"}), built up using L{Word}, L{Literal}, and L{And} elements
+(L{'+'} operator gives L{And} expressions, strings are auto-converted to
+L{Literal} expressions)::
+
+    from pyparsing import Word, alphas
+
+    # define grammar of a greeting
+    greet = Word(alphas) + "," + Word(alphas) + "!"
+
+    hello = "Hello, World!"
+    print (hello, "->", greet.parseString(hello))
+
+The program outputs the following::
+
+    Hello, World! -> ['Hello', ',', 'World', '!']
+
+The Python representation of the grammar is quite readable, owing to the self-explanatory
+class names, and the use of '+', '|' and '^' operators.
+
+The L{ParseResults} object returned from L{ParserElement.parseString} can be accessed as a nested list, a dictionary, or an
+object with named attributes.
+
+The pyparsing module handles some of the problems that are typically vexing when writing text parsers:
+ - extra or missing whitespace (the above program will also handle "Hello,World!", "Hello  ,  World  !", etc.)
+ - quoted strings
+ - embedded comments
+
+
+Getting Started -
+-----------------
+Visit the classes L{ParserElement} and L{ParseResults} to see the base classes that most other pyparsing
+classes inherit from. Use the docstrings for examples of how to:
+ - construct literal match expressions from L{Literal} and L{CaselessLiteral} classes
+ - construct character word-group expressions using the L{Word} class
+ - see how to create repetitive expressions using L{ZeroOrMore} and L{OneOrMore} classes
+ - use L{'+'}, L{'|'}, L{'^'}, and L{'&'} operators to combine simple expressions into more complex ones
+ - associate names with your parsed results using L{ParserElement.setResultsName}
+ - find some helpful expression short-cuts like L{delimitedList} and L{oneOf}
+ - find more useful common expressions in the L{pyparsing_common} namespace class
+"""
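
To complement the positional access shown in the docstring above, the same grammar can attach results names so the `ParseResults` can be read like an object (the names here are purely illustrative):

```python
from pyparsing import Word, alphas

# Attach results names to query the parse tree by attribute.
greet = Word(alphas)("salutation") + "," + Word(alphas)("addressee") + "!"
result = greet.parseString("Hello, World!")

assert result.asList() == ["Hello", ",", "World", "!"]
assert result.salutation == "Hello"
assert result.addressee == "World"
```
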
+
+__version__ = "2.2.1"
+__versionTime__ = "18 Sep 2018 00:49 UTC"
+__author__ = "Paul McGuire <ptmcg@users.sourceforge.net>"
+
+import string
+from weakref import ref as wkref
+import copy
+import sys
+import warnings
+import re
+import sre_constants
+import collections
+import pprint
+import traceback
+import types
+from datetime import datetime
+
+try:
+    from _thread import RLock
+except ImportError:
+    from threading import RLock
+
+try:
+    # Python 3
+    from collections.abc import Iterable
+    from collections.abc import MutableMapping
+except ImportError:
+    # Python 2.7
+    from collections import Iterable
+    from collections import MutableMapping
+
+try:
+    from collections import OrderedDict as _OrderedDict
+except ImportError:
+    try:
+        from ordereddict import OrderedDict as _OrderedDict
+    except ImportError:
+        _OrderedDict = None
+
+#~ sys.stderr.write( "testing pyparsing module, version %s, %s\n" % (__version__,__versionTime__ ) )
+
+__all__ = [
+'And', 'CaselessKeyword', 'CaselessLiteral', 'CharsNotIn', 'Combine', 'Dict', 'Each', 'Empty',
+'FollowedBy', 'Forward', 'GoToColumn', 'Group', 'Keyword', 'LineEnd', 'LineStart', 'Literal',
+'MatchFirst', 'NoMatch', 'NotAny', 'OneOrMore', 'OnlyOnce', 'Optional', 'Or',
+'ParseBaseException', 'ParseElementEnhance', 'ParseException', 'ParseExpression', 'ParseFatalException',
+'ParseResults', 'ParseSyntaxException', 'ParserElement', 'QuotedString', 'RecursiveGrammarException',
+'Regex', 'SkipTo', 'StringEnd', 'StringStart', 'Suppress', 'Token', 'TokenConverter', 
+'White', 'Word', 'WordEnd', 'WordStart', 'ZeroOrMore',
+'alphanums', 'alphas', 'alphas8bit', 'anyCloseTag', 'anyOpenTag', 'cStyleComment', 'col',
+'commaSeparatedList', 'commonHTMLEntity', 'countedArray', 'cppStyleComment', 'dblQuotedString',
+'dblSlashComment', 'delimitedList', 'dictOf', 'downcaseTokens', 'empty', 'hexnums',
+'htmlComment', 'javaStyleComment', 'line', 'lineEnd', 'lineStart', 'lineno',
+'makeHTMLTags', 'makeXMLTags', 'matchOnlyAtCol', 'matchPreviousExpr', 'matchPreviousLiteral',
+'nestedExpr', 'nullDebugAction', 'nums', 'oneOf', 'opAssoc', 'operatorPrecedence', 'printables',
+'punc8bit', 'pythonStyleComment', 'quotedString', 'removeQuotes', 'replaceHTMLEntity', 
+'replaceWith', 'restOfLine', 'sglQuotedString', 'srange', 'stringEnd',
+'stringStart', 'traceParseAction', 'unicodeString', 'upcaseTokens', 'withAttribute',
+'indentedBlock', 'originalTextFor', 'ungroup', 'infixNotation','locatedExpr', 'withClass',
+'CloseMatch', 'tokenMap', 'pyparsing_common',
+]
+
+system_version = tuple(sys.version_info)[:3]
+PY_3 = system_version[0] == 3
+if PY_3:
+    _MAX_INT = sys.maxsize
+    basestring = str
+    unichr = chr
+    _ustr = str
+
+    # build list of single arg builtins, that can be used as parse actions
+    singleArgBuiltins = [sum, len, sorted, reversed, list, tuple, set, any, all, min, max]
+
+else:
+    _MAX_INT = sys.maxint
+    range = xrange
+
+    def _ustr(obj):
+        """Drop-in replacement for str(obj) that tries to be Unicode friendly. It first tries
+           str(obj). If that fails with a UnicodeEncodeError, then it tries unicode(obj). It
+           then < returns the unicode object | encodes it with the default encoding | ... >.
+        """
+        if isinstance(obj,unicode):
+            return obj
+
+        try:
+            # If this works, then _ustr(obj) has the same behaviour as str(obj), so
+            # it won't break any existing code.
+            return str(obj)
+
+        except UnicodeEncodeError:
+            # Else encode it
+            ret = unicode(obj).encode(sys.getdefaultencoding(), 'xmlcharrefreplace')
+            xmlcharref = Regex(r'&#\d+;')
+            xmlcharref.setParseAction(lambda t: '\\u' + hex(int(t[0][2:-1]))[2:])
+            return xmlcharref.transformString(ret)
+
+    # build list of single arg builtins, tolerant of Python version, that can be used as parse actions
+    singleArgBuiltins = []
+    import __builtin__
+    for fname in "sum len sorted reversed list tuple set any all min max".split():
+        try:
+            singleArgBuiltins.append(getattr(__builtin__,fname))
+        except AttributeError:
+            continue
+            
+_generatorType = type((y for y in range(1)))
+ 
+def _xml_escape(data):
+    """Escape &, <, >, ", ', etc. in a string of data."""
+
+    # ampersand must be replaced first
+    from_symbols = '&><"\''
+    to_symbols = ('&'+s+';' for s in "amp gt lt quot apos".split())
+    for from_,to_ in zip(from_symbols, to_symbols):
+        data = data.replace(from_, to_)
+    return data
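
Note that the ampersand really must be replaced first; otherwise the `&` characters introduced by the later replacements would themselves be escaped again. A standalone copy of the same mapping demonstrates:

```python
def xml_escape(data):
    """Escape &, >, <, ", and ' in a string of data (ampersand first)."""
    from_symbols = '&><"\''
    to_symbols = ('&' + s + ';' for s in "amp gt lt quot apos".split())
    for from_, to_ in zip(from_symbols, to_symbols):
        data = data.replace(from_, to_)
    return data

assert xml_escape('a < b & "c"') == 'a &lt; b &amp; &quot;c&quot;'
```
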
+
+class _Constants(object):
+    pass
+
+alphas     = string.ascii_uppercase + string.ascii_lowercase
+nums       = "0123456789"
+hexnums    = nums + "ABCDEFabcdef"
+alphanums  = alphas + nums
+_bslash    = chr(92)
+printables = "".join(c for c in string.printable if c not in string.whitespace)
+
+class ParseBaseException(Exception):
+    """base exception class for all parsing runtime exceptions"""
+    # Performance tuning: we construct a *lot* of these, so keep this
+    # constructor as small and fast as possible
+    def __init__( self, pstr, loc=0, msg=None, elem=None ):
+        self.loc = loc
+        if msg is None:
+            self.msg = pstr
+            self.pstr = ""
+        else:
+            self.msg = msg
+            self.pstr = pstr
+        self.parserElement = elem
+        self.args = (pstr, loc, msg)
+
+    @classmethod
+    def _from_exception(cls, pe):
+        """
+        internal factory method to simplify creating one type of ParseException 
+        from another - avoids having __init__ signature conflicts among subclasses
+        """
+        return cls(pe.pstr, pe.loc, pe.msg, pe.parserElement)
+
+    def __getattr__( self, aname ):
+        """supported attributes by name are:
+            - lineno - returns the line number of the exception text
+            - col - returns the column number of the exception text
+            - line - returns the line containing the exception text
+        """
+        if( aname == "lineno" ):
+            return lineno( self.loc, self.pstr )
+        elif( aname in ("col", "column") ):
+            return col( self.loc, self.pstr )
+        elif( aname == "line" ):
+            return line( self.loc, self.pstr )
+        else:
+            raise AttributeError(aname)
+
+    def __str__( self ):
+        return "%s (at char %d), (line:%d, col:%d)" % \
+                ( self.msg, self.loc, self.lineno, self.column )
+    def __repr__( self ):
+        return _ustr(self)
+    def markInputline( self, markerString = ">!<" ):
+        """Extracts the exception line from the input string, and marks
+           the location of the exception with a special symbol.
+        """
+        line_str = self.line
+        line_column = self.column - 1
+        if markerString:
+            line_str = "".join((line_str[:line_column],
+                                markerString, line_str[line_column:]))
+        return line_str.strip()
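`markInputline` simply splices the marker string into the offending line at the (1-based) error column. A self-contained sketch of that splice, using plain arguments in place of the exception's `line`/`col` attributes (the function name and signature here are illustrative only):

```python
# Sketch of the markInputline splice: insert a marker just before the
# 1-based error column, then strip surrounding whitespace.
def mark_input_line(line_str, column, marker=">!<"):
    loc = column - 1  # convert 1-based column to 0-based string index
    return (line_str[:loc] + marker + line_str[loc:]).strip()

print(mark_input_line("price: abc", 8))  # -> price: >!<abc
```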
+    def __dir__(self):
+        return "lineno col line".split() + dir(type(self))
+
+class ParseException(ParseBaseException):
+    """
+    Exception thrown when a parse expression doesn't match the input string;
+    supported attributes by name are:
+     - lineno - returns the line number of the exception text
+     - col - returns the column number of the exception text
+     - line - returns the line containing the exception text
+        
+    Example::
+        try:
+            Word(nums).setName("integer").parseString("ABC")
+        except ParseException as pe:
+            print(pe)
+            print("column: {}".format(pe.col))
+            
+    prints::
+        Expected integer (at char 0), (line:1, col:1)
+        column: 1
+    """
+    pass
+
+class ParseFatalException(ParseBaseException):
+    """user-throwable exception thrown when inconsistent parse content
+       is found; stops all parsing immediately"""
+    pass
+
+class ParseSyntaxException(ParseFatalException):
+    """just like L{ParseFatalException}, but thrown internally when an
+       L{ErrorStop} ('-' operator) indicates that parsing is to stop 
+       immediately because an unbacktrackable syntax error has been found"""
+    pass
+
+#~ class ReparseException(ParseBaseException):
+    #~ """Experimental class - parse actions can raise this exception to cause
+       #~ pyparsing to reparse the input string:
+        #~ - with a modified input string, and/or
+        #~ - with a modified start location
+       #~ Set the values of the ReparseException in the constructor, and raise the
+       #~ exception in a parse action to cause pyparsing to use the new string/location.
+       #~ Setting the values as None causes no change to be made.
+       #~ """
+    #~ def __init_( self, newstring, restartLoc ):
+        #~ self.newParseText = newstring
+        #~ self.reparseLoc = restartLoc
+
+class RecursiveGrammarException(Exception):
+    """exception thrown by L{ParserElement.validate} if the grammar could be improperly recursive"""
+    def __init__( self, parseElementList ):
+        self.parseElementTrace = parseElementList
+
+    def __str__( self ):
+        return "RecursiveGrammarException: %s" % self.parseElementTrace
+
+class _ParseResultsWithOffset(object):
+    def __init__(self,p1,p2):
+        self.tup = (p1,p2)
+    def __getitem__(self,i):
+        return self.tup[i]
+    def __repr__(self):
+        return repr(self.tup[0])
+    def setOffset(self,i):
+        self.tup = (self.tup[0],i)
+
+class ParseResults(object):
+    """
+    Structured parse results, to provide multiple means of access to the parsed data:
+       - as a list (C{len(results)})
+       - by list index (C{results[0], results[1]}, etc.)
+       - by attribute (C{results.<resultsName>} - see L{ParserElement.setResultsName})
+
+    Example::
+        integer = Word(nums)
+        date_str = (integer.setResultsName("year") + '/' 
+                        + integer.setResultsName("month") + '/' 
+                        + integer.setResultsName("day"))
+        # equivalent form:
+        # date_str = integer("year") + '/' + integer("month") + '/' + integer("day")
+
+        # parseString returns a ParseResults object
+        result = date_str.parseString("1999/12/31")
+
+        def test(s, fn=repr):
+            print("%s -> %s" % (s, fn(eval(s))))
+        test("list(result)")
+        test("result[0]")
+        test("result['month']")
+        test("result.day")
+        test("'month' in result")
+        test("'minutes' in result")
+        test("result.dump()", str)
+    prints::
+        list(result) -> ['1999', '/', '12', '/', '31']
+        result[0] -> '1999'
+        result['month'] -> '12'
+        result.day -> '31'
+        'month' in result -> True
+        'minutes' in result -> False
+        result.dump() -> ['1999', '/', '12', '/', '31']
+        - day: 31
+        - month: 12
+        - year: 1999
+    """
+    def __new__(cls, toklist=None, name=None, asList=True, modal=True ):
+        if isinstance(toklist, cls):
+            return toklist
+        retobj = object.__new__(cls)
+        retobj.__doinit = True
+        return retobj
+
+    # Performance tuning: we construct a *lot* of these, so keep this
+    # constructor as small and fast as possible
+    def __init__( self, toklist=None, name=None, asList=True, modal=True, isinstance=isinstance ):
+        if self.__doinit:
+            self.__doinit = False
+            self.__name = None
+            self.__parent = None
+            self.__accumNames = {}
+            self.__asList = asList
+            self.__modal = modal
+            if toklist is None:
+                toklist = []
+            if isinstance(toklist, list):
+                self.__toklist = toklist[:]
+            elif isinstance(toklist, _generatorType):
+                self.__toklist = list(toklist)
+            else:
+                self.__toklist = [toklist]
+            self.__tokdict = dict()
+
+        if name is not None and name:
+            if not modal:
+                self.__accumNames[name] = 0
+            if isinstance(name,int):
+                name = _ustr(name) # will always return a str, but use _ustr for consistency
+            self.__name = name
+            if not (isinstance(toklist, (type(None), basestring, list)) and toklist in (None,'',[])):
+                if isinstance(toklist,basestring):
+                    toklist = [ toklist ]
+                if asList:
+                    if isinstance(toklist,ParseResults):
+                        self[name] = _ParseResultsWithOffset(toklist.copy(),0)
+                    else:
+                        self[name] = _ParseResultsWithOffset(ParseResults(toklist[0]),0)
+                    self[name].__name = name
+                else:
+                    try:
+                        self[name] = toklist[0]
+                    except (KeyError,TypeError,IndexError):
+                        self[name] = toklist
+
+    def __getitem__( self, i ):
+        if isinstance( i, (int,slice) ):
+            return self.__toklist[i]
+        else:
+            if i not in self.__accumNames:
+                return self.__tokdict[i][-1][0]
+            else:
+                return ParseResults([ v[0] for v in self.__tokdict[i] ])
+
+    def __setitem__( self, k, v, isinstance=isinstance ):
+        if isinstance(v,_ParseResultsWithOffset):
+            self.__tokdict[k] = self.__tokdict.get(k,list()) + [v]
+            sub = v[0]
+        elif isinstance(k,(int,slice)):
+            self.__toklist[k] = v
+            sub = v
+        else:
+            self.__tokdict[k] = self.__tokdict.get(k,list()) + [_ParseResultsWithOffset(v,0)]
+            sub = v
+        if isinstance(sub,ParseResults):
+            sub.__parent = wkref(self)
+
+    def __delitem__( self, i ):
+        if isinstance(i,(int,slice)):
+            mylen = len( self.__toklist )
+            del self.__toklist[i]
+
+            # convert int to slice
+            if isinstance(i, int):
+                if i < 0:
+                    i += mylen
+                i = slice(i, i+1)
+            # get removed indices
+            removed = list(range(*i.indices(mylen)))
+            removed.reverse()
+            # fixup indices in token dictionary
+            for name,occurrences in self.__tokdict.items():
+                for j in removed:
+                    for k, (value, position) in enumerate(occurrences):
+                        occurrences[k] = _ParseResultsWithOffset(value, position - (position > j))
+        else:
+            del self.__tokdict[i]
+
+    def __contains__( self, k ):
+        return k in self.__tokdict
+
+    def __len__( self ): return len( self.__toklist )
+    def __bool__(self): return ( not not self.__toklist )
+    __nonzero__ = __bool__
+    def __iter__( self ): return iter( self.__toklist )
+    def __reversed__( self ): return iter( self.__toklist[::-1] )
+    def _iterkeys( self ):
+        if hasattr(self.__tokdict, "iterkeys"):
+            return self.__tokdict.iterkeys()
+        else:
+            return iter(self.__tokdict)
+
+    def _itervalues( self ):
+        return (self[k] for k in self._iterkeys())
+            
+    def _iteritems( self ):
+        return ((k, self[k]) for k in self._iterkeys())
+
+    if PY_3:
+        keys = _iterkeys       
+        """Returns an iterator of all named result keys (Python 3.x only)."""
+
+        values = _itervalues
+        """Returns an iterator of all named result values (Python 3.x only)."""
+
+        items = _iteritems
+        """Returns an iterator of all named result key-value tuples (Python 3.x only)."""
+
+    else:
+        iterkeys = _iterkeys
+        """Returns an iterator of all named result keys (Python 2.x only)."""
+
+        itervalues = _itervalues
+        """Returns an iterator of all named result values (Python 2.x only)."""
+
+        iteritems = _iteritems
+        """Returns an iterator of all named result key-value tuples (Python 2.x only)."""
+
+        def keys( self ):
+            """Returns all named result keys (as a list in Python 2.x, as an iterator in Python 3.x)."""
+            return list(self.iterkeys())
+
+        def values( self ):
+            """Returns all named result values (as a list in Python 2.x, as an iterator in Python 3.x)."""
+            return list(self.itervalues())
+                
+        def items( self ):
+            """Returns all named result key-values (as a list of tuples in Python 2.x, as an iterator in Python 3.x)."""
+            return list(self.iteritems())
+
+    def haskeys( self ):
+        """Since keys() returns an iterator, this method is helpful in bypassing
+           code that looks for the existence of any defined results names."""
+        return bool(self.__tokdict)
+        
+    def pop( self, *args, **kwargs):
+        """
+        Removes and returns item at specified index (default=C{last}).
+        Supports both C{list} and C{dict} semantics for C{pop()}. If passed no
+        argument or an integer argument, it will use C{list} semantics
+        and pop tokens from the list of parsed tokens. If passed a 
+        non-integer argument (most likely a string), it will use C{dict}
+        semantics and pop the corresponding value from any defined 
+        results names. A second default return value argument is 
+        supported, just as in C{dict.pop()}.
+
+        Example::
+            def remove_first(tokens):
+                tokens.pop(0)
+            print(OneOrMore(Word(nums)).parseString("0 123 321")) # -> ['0', '123', '321']
+            print(OneOrMore(Word(nums)).addParseAction(remove_first).parseString("0 123 321")) # -> ['123', '321']
+
+            label = Word(alphas)
+            patt = label("LABEL") + OneOrMore(Word(nums))
+            print(patt.parseString("AAB 123 321").dump())
+
+            # Use pop() in a parse action to remove named result (note that corresponding value is not
+            # removed from list form of results)
+            def remove_LABEL(tokens):
+                tokens.pop("LABEL")
+                return tokens
+            patt.addParseAction(remove_LABEL)
+            print(patt.parseString("AAB 123 321").dump())
+        prints::
+            ['AAB', '123', '321']
+            - LABEL: AAB
+
+            ['AAB', '123', '321']
+        """
+        if not args:
+            args = [-1]
+        for k,v in kwargs.items():
+            if k == 'default':
+                args = (args[0], v)
+            else:
+                raise TypeError("pop() got an unexpected keyword argument '%s'" % k)
+        if (isinstance(args[0], int) or 
+                        len(args) == 1 or 
+                        args[0] in self):
+            index = args[0]
+            ret = self[index]
+            del self[index]
+            return ret
+        else:
+            defaultvalue = args[1]
+            return defaultvalue
+
+    def get(self, key, defaultValue=None):
+        """
+        Returns named result matching the given key, or if there is no
+        such name, then returns the given C{defaultValue} or C{None} if no
+        C{defaultValue} is specified.
+
+        Similar to C{dict.get()}.
+        
+        Example::
+            integer = Word(nums)
+            date_str = integer("year") + '/' + integer("month") + '/' + integer("day")           
+
+            result = date_str.parseString("1999/12/31")
+            print(result.get("year")) # -> '1999'
+            print(result.get("hour", "not specified")) # -> 'not specified'
+            print(result.get("hour")) # -> None
+        """
+        if key in self:
+            return self[key]
+        else:
+            return defaultValue
+
+    def insert( self, index, insStr ):
+        """
+        Inserts new element at location index in the list of parsed tokens.
+        
+        Similar to C{list.insert()}.
+
+        Example::
+            print(OneOrMore(Word(nums)).parseString("0 123 321")) # -> ['0', '123', '321']
+
+            # use a parse action to insert the parse location in the front of the parsed results
+            def insert_locn(locn, tokens):
+                tokens.insert(0, locn)
+            print(OneOrMore(Word(nums)).addParseAction(insert_locn).parseString("0 123 321")) # -> [0, '0', '123', '321']
+        """
+        self.__toklist.insert(index, insStr)
+        # fixup indices in token dictionary
+        for name,occurrences in self.__tokdict.items():
+            for k, (value, position) in enumerate(occurrences):
+                occurrences[k] = _ParseResultsWithOffset(value, position + (position > index))
+
+    def append( self, item ):
+        """
+        Add single element to end of ParseResults list of elements.
+
+        Example::
+            print(OneOrMore(Word(nums)).parseString("0 123 321")) # -> ['0', '123', '321']
+            
+            # use a parse action to compute the sum of the parsed integers, and add it to the end
+            def append_sum(tokens):
+                tokens.append(sum(map(int, tokens)))
+            print(OneOrMore(Word(nums)).addParseAction(append_sum).parseString("0 123 321")) # -> ['0', '123', '321', 444]
+        """
+        self.__toklist.append(item)
+
+    def extend( self, itemseq ):
+        """
+        Add sequence of elements to end of ParseResults list of elements.
+
+        Example::
+            patt = OneOrMore(Word(alphas))
+            
+            # use a parse action to append the reverse of the matched strings, to make a palindrome
+            def make_palindrome(tokens):
+                tokens.extend(reversed([t[::-1] for t in tokens]))
+                return ''.join(tokens)
+            print(patt.addParseAction(make_palindrome).parseString("lskdj sdlkjf lksd")) # -> 'lskdjsdlkjflksddsklfjkldsjdksl'
+        """
+        if isinstance(itemseq, ParseResults):
+            self += itemseq
+        else:
+            self.__toklist.extend(itemseq)
+
+    def clear( self ):
+        """
+        Clear all elements and results names.
+        """
+        del self.__toklist[:]
+        self.__tokdict.clear()
+
+    def __getattr__( self, name ):
+        try:
+            return self[name]
+        except KeyError:
+            return ""
+
+    def __add__( self, other ):
+        ret = self.copy()
+        ret += other
+        return ret
+
+    def __iadd__( self, other ):
+        if other.__tokdict:
+            offset = len(self.__toklist)
+            addoffset = lambda a: offset if a<0 else a+offset
+            otheritems = other.__tokdict.items()
+            otherdictitems = [(k, _ParseResultsWithOffset(v[0],addoffset(v[1])) )
+                                for (k,vlist) in otheritems for v in vlist]
+            for k,v in otherdictitems:
+                self[k] = v
+                if isinstance(v[0],ParseResults):
+                    v[0].__parent = wkref(self)
+            
+        self.__toklist += other.__toklist
+        self.__accumNames.update( other.__accumNames )
+        return self
+
+    def __radd__(self, other):
+        if isinstance(other,int) and other == 0:
+            # useful for merging many ParseResults using sum() builtin
+            return self.copy()
+        else:
+            # this may raise a TypeError - so be it
+            return other + self
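The `__radd__` special case above exists so that `sum()` (whose implicit start value is the integer `0`) can merge a sequence of results. The same convention can be shown with a toy class (illustrative only, not part of pyparsing):

```python
# Toy illustration of the __radd__ convention used by ParseResults:
# returning a copy when the left operand is 0 makes sum() work, since
# sum() begins by evaluating 0 + first_element.
class Bag:
    def __init__(self, items=()):
        self.items = list(items)
    def __add__(self, other):
        return Bag(self.items + other.items)
    def __radd__(self, other):
        if isinstance(other, int) and other == 0:
            return Bag(self.items)  # sum()'s implicit starting 0
        return NotImplemented

total = sum([Bag([1]), Bag([2, 3])])
print(total.items)  # -> [1, 2, 3]
```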
+        
+    def __repr__( self ):
+        return "(%s, %s)" % ( repr( self.__toklist ), repr( self.__tokdict ) )
+
+    def __str__( self ):
+        return '[' + ', '.join(_ustr(i) if isinstance(i, ParseResults) else repr(i) for i in self.__toklist) + ']'
+
+    def _asStringList( self, sep='' ):
+        out = []
+        for item in self.__toklist:
+            if out and sep:
+                out.append(sep)
+            if isinstance( item, ParseResults ):
+                out += item._asStringList()
+            else:
+                out.append( _ustr(item) )
+        return out
+
+    def asList( self ):
+        """
+        Returns the parse results as a nested list of matching tokens, all converted to strings.
+
+        Example::
+            patt = OneOrMore(Word(alphas))
+            result = patt.parseString("sldkj lsdkj sldkj")
+            # even though the result prints in string-like form, it is actually a pyparsing ParseResults
+            print(type(result), result) # -> <class 'pyparsing.ParseResults'> ['sldkj', 'lsdkj', 'sldkj']
+            
+            # Use asList() to create an actual list
+            result_list = result.asList()
+            print(type(result_list), result_list) # -> <class 'list'> ['sldkj', 'lsdkj', 'sldkj']
+        """
+        return [res.asList() if isinstance(res,ParseResults) else res for res in self.__toklist]
+
+    def asDict( self ):
+        """
+        Returns the named parse results as a nested dictionary.
+
+        Example::
+            integer = Word(nums)
+            date_str = integer("year") + '/' + integer("month") + '/' + integer("day")
+            
+            result = date_str.parseString('12/31/1999')
+            print(type(result), repr(result)) # -> <class 'pyparsing.ParseResults'> (['12', '/', '31', '/', '1999'], {'day': [('1999', 4)], 'year': [('12', 0)], 'month': [('31', 2)]})
+            
+            result_dict = result.asDict()
+            print(type(result_dict), repr(result_dict)) # -> <class 'dict'> {'day': '1999', 'year': '12', 'month': '31'}
+
+            # even though a ParseResults supports dict-like access, sometime you just need to have a dict
+            import json
+            print(json.dumps(result)) # -> Exception: TypeError: ... is not JSON serializable
+            print(json.dumps(result.asDict())) # -> {"month": "31", "day": "1999", "year": "12"}
+        """
+        if PY_3:
+            item_fn = self.items
+        else:
+            item_fn = self.iteritems
+            
+        def toItem(obj):
+            if isinstance(obj, ParseResults):
+                if obj.haskeys():
+                    return obj.asDict()
+                else:
+                    return [toItem(v) for v in obj]
+            else:
+                return obj
+                
+        return dict((k,toItem(v)) for k,v in item_fn())
+
+    def copy( self ):
+        """
+        Returns a new copy of a C{ParseResults} object.
+        """
+        ret = ParseResults( self.__toklist )
+        ret.__tokdict = self.__tokdict.copy()
+        ret.__parent = self.__parent
+        ret.__accumNames.update( self.__accumNames )
+        ret.__name = self.__name
+        return ret
+
+    def asXML( self, doctag=None, namedItemsOnly=False, indent="", formatted=True ):
+        """
+        (Deprecated) Returns the parse results as XML. Tags are created for tokens and lists that have defined results names.
+        """
+        nl = "\n"
+        out = []
+        namedItems = dict((v[1],k) for (k,vlist) in self.__tokdict.items()
+                                                            for v in vlist)
+        nextLevelIndent = indent + "  "
+
+        # collapse out indents if formatting is not desired
+        if not formatted:
+            indent = ""
+            nextLevelIndent = ""
+            nl = ""
+
+        selfTag = None
+        if doctag is not None:
+            selfTag = doctag
+        else:
+            if self.__name:
+                selfTag = self.__name
+
+        if not selfTag:
+            if namedItemsOnly:
+                return ""
+            else:
+                selfTag = "ITEM"
+
+        out += [ nl, indent, "<", selfTag, ">" ]
+
+        for i,res in enumerate(self.__toklist):
+            if isinstance(res,ParseResults):
+                if i in namedItems:
+                    out += [ res.asXML(namedItems[i],
+                                        namedItemsOnly and doctag is None,
+                                        nextLevelIndent,
+                                        formatted)]
+                else:
+                    out += [ res.asXML(None,
+                                        namedItemsOnly and doctag is None,
+                                        nextLevelIndent,
+                                        formatted)]
+            else:
+                # individual token, see if there is a name for it
+                resTag = None
+                if i in namedItems:
+                    resTag = namedItems[i]
+                if not resTag:
+                    if namedItemsOnly:
+                        continue
+                    else:
+                        resTag = "ITEM"
+                xmlBodyText = _xml_escape(_ustr(res))
+                out += [ nl, nextLevelIndent, "<", resTag, ">",
+                                                xmlBodyText,
+                                                "</", resTag, ">" ]
+
+        out += [ nl, indent, "</", selfTag, ">" ]
+        return "".join(out)
+
+    def __lookup(self,sub):
+        for k,vlist in self.__tokdict.items():
+            for v,loc in vlist:
+                if sub is v:
+                    return k
+        return None
+
+    def getName(self):
+        r"""
+        Returns the results name for this token expression. Useful when several 
+        different expressions might match at a particular location.
+
+        Example::
+            integer = Word(nums)
+            ssn_expr = Regex(r"\d\d\d-\d\d-\d\d\d\d")
+            house_number_expr = Suppress('#') + Word(nums, alphanums)
+            user_data = (Group(house_number_expr)("house_number") 
+                        | Group(ssn_expr)("ssn")
+                        | Group(integer)("age"))
+            user_info = OneOrMore(user_data)
+            
+            result = user_info.parseString("22 111-22-3333 #221B")
+            for item in result:
+                print(item.getName(), ':', item[0])
+        prints::
+            age : 22
+            ssn : 111-22-3333
+            house_number : 221B
+        """
+        if self.__name:
+            return self.__name
+        elif self.__parent:
+            par = self.__parent()
+            if par:
+                return par.__lookup(self)
+            else:
+                return None
+        elif (len(self) == 1 and
+               len(self.__tokdict) == 1 and
+               next(iter(self.__tokdict.values()))[0][1] in (0,-1)):
+            return next(iter(self.__tokdict.keys()))
+        else:
+            return None
+
+    def dump(self, indent='', depth=0, full=True):
+        """
+        Diagnostic method for listing out the contents of a C{ParseResults}.
+        Accepts an optional C{indent} argument so that this string can be embedded
+        in a nested display of other data.
+
+        Example::
+            integer = Word(nums)
+            date_str = integer("year") + '/' + integer("month") + '/' + integer("day")
+            
+            result = date_str.parseString('12/31/1999')
+            print(result.dump())
+        prints::
+            ['12', '/', '31', '/', '1999']
+            - day: 1999
+            - month: 31
+            - year: 12
+        """
+        out = []
+        NL = '\n'
+        out.append( indent+_ustr(self.asList()) )
+        if full:
+            if self.haskeys():
+                items = sorted((str(k), v) for k,v in self.items())
+                for k,v in items:
+                    if out:
+                        out.append(NL)
+                    out.append( "%s%s- %s: " % (indent,('  '*depth), k) )
+                    if isinstance(v,ParseResults):
+                        if v:
+                            out.append( v.dump(indent,depth+1) )
+                        else:
+                            out.append(_ustr(v))
+                    else:
+                        out.append(repr(v))
+            elif any(isinstance(vv,ParseResults) for vv in self):
+                v = self
+                for i,vv in enumerate(v):
+                    if isinstance(vv,ParseResults):
+                        out.append("\n%s%s[%d]:\n%s%s%s" % (indent,('  '*(depth)),i,indent,('  '*(depth+1)),vv.dump(indent,depth+1) ))
+                    else:
+                        out.append("\n%s%s[%d]:\n%s%s%s" % (indent,('  '*(depth)),i,indent,('  '*(depth+1)),_ustr(vv)))
+            
+        return "".join(out)
+
+    def pprint(self, *args, **kwargs):
+        """
+        Pretty-printer for parsed results as a list, using the C{pprint} module.
+        Accepts additional positional or keyword args as defined for the 
+        C{pprint.pprint} method. (U{http://docs.python.org/3/library/pprint.html#pprint.pprint})
+
+        Example::
+            ident = Word(alphas, alphanums)
+            num = Word(nums)
+            func = Forward()
+            term = ident | num | Group('(' + func + ')')
+            func <<= ident + Group(Optional(delimitedList(term)))
+            result = func.parseString("fna a,b,(fnb c,d,200),100")
+            result.pprint(width=40)
+        prints::
+            ['fna',
+             ['a',
+              'b',
+              ['(', 'fnb', ['c', 'd', '200'], ')'],
+              '100']]
+        """
+        pprint.pprint(self.asList(), *args, **kwargs)
+
+    # add support for pickle protocol
+    def __getstate__(self):
+        return ( self.__toklist,
+                 ( self.__tokdict.copy(),
+                   self.__parent is not None and self.__parent() or None,
+                   self.__accumNames,
+                   self.__name ) )
+
+    def __setstate__(self,state):
+        self.__toklist = state[0]
+        (self.__tokdict,
+         par,
+         inAccumNames,
+         self.__name) = state[1]
+        self.__accumNames = {}
+        self.__accumNames.update(inAccumNames)
+        if par is not None:
+            self.__parent = wkref(par)
+        else:
+            self.__parent = None
+
+    def __getnewargs__(self):
+        return self.__toklist, self.__name, self.__asList, self.__modal
+
+    def __dir__(self):
+        return (dir(type(self)) + list(self.keys()))
+
+MutableMapping.register(ParseResults)
+
+def col (loc,strg):
+    """Returns current column within a string, counting newlines as line separators.
+   The first column is number 1.
+
+   Note: the default parsing behavior is to expand tabs in the input string
+   before starting the parsing process.  See L{I{ParserElement.parseString}<ParserElement.parseString>} for more information
+   on parsing strings containing C{<TAB>}s, and suggested methods to maintain a
+   consistent view of the parsed string, the parse location, and line and column
+   positions within the parsed string.
+   """
+    s = strg
+    return 1 if 0<loc<len(s) and s[loc-1] == '\n' else loc - s.rfind("\n", 0, loc)
+
+def lineno(loc,strg):
+    """Returns current line number within a string, counting newlines as line separators.
+   The first line is number 1.
+
+   Note: the default parsing behavior is to expand tabs in the input string
+   before starting the parsing process.  See L{I{ParserElement.parseString}<ParserElement.parseString>} for more information
+   on parsing strings containing C{<TAB>}s, and suggested methods to maintain a
+   consistent view of the parsed string, the parse location, and line and column
+   positions within the parsed string.
+   """
+    return strg.count("\n",0,loc) + 1
+
+def line( loc, strg ):
+    """Returns the line of text containing loc within a string, counting newlines as line separators.
+       """
+    lastCR = strg.rfind("\n", 0, loc)
+    nextCR = strg.find("\n", loc)
+    if nextCR >= 0:
+        return strg[lastCR+1:nextCR]
+    else:
+        return strg[lastCR+1:]
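The three location helpers share a 1-based line/column convention. A self-contained demonstration with standalone copies of the same one-liners on a small input:

```python
# Standalone copies of pyparsing's location helpers, to show the
# 1-based line and column convention on a two-line input.
def col(loc, strg):
    s = strg
    return 1 if 0 < loc < len(s) and s[loc-1] == '\n' else loc - s.rfind("\n", 0, loc)

def lineno(loc, strg):
    return strg.count("\n", 0, loc) + 1

def line(loc, strg):
    lastCR = strg.rfind("\n", 0, loc)
    nextCR = strg.find("\n", loc)
    return strg[lastCR+1:nextCR] if nextCR >= 0 else strg[lastCR+1:]

text = "abc\ndef"
# loc 5 points at 'e': second line, second column
print(lineno(5, text), col(5, text), line(5, text))  # -> 2 2 def
```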
+
+def _defaultStartDebugAction( instring, loc, expr ):
+    print (("Match " + _ustr(expr) + " at loc " + _ustr(loc) + "(%d,%d)" % ( lineno(loc,instring), col(loc,instring) )))
+
+def _defaultSuccessDebugAction( instring, startloc, endloc, expr, toks ):
+    print ("Matched " + _ustr(expr) + " -> " + str(toks.asList()))
+
+def _defaultExceptionDebugAction( instring, loc, expr, exc ):
+    print ("Exception raised:" + _ustr(exc))
+
+def nullDebugAction(*args):
+    """'Do-nothing' debug action, to suppress debugging output during parsing."""
+    pass
+
+# Only works on Python 3.x - nonlocal is toxic to Python 2 installs
+#~ 'decorator to trim function calls to match the arity of the target'
+#~ def _trim_arity(func, maxargs=3):
+    #~ if func in singleArgBuiltins:
+        #~ return lambda s,l,t: func(t)
+    #~ limit = 0
+    #~ foundArity = False
+    #~ def wrapper(*args):
+        #~ nonlocal limit,foundArity
+        #~ while 1:
+            #~ try:
+                #~ ret = func(*args[limit:])
+                #~ foundArity = True
+                #~ return ret
+            #~ except TypeError:
+                #~ if limit == maxargs or foundArity:
+                    #~ raise
+                #~ limit += 1
+                #~ continue
+    #~ return wrapper
+
+# this version is Python 2.x-3.x cross-compatible
+'decorator to trim function calls to match the arity of the target'
+def _trim_arity(func, maxargs=2):
+    if func in singleArgBuiltins:
+        return lambda s,l,t: func(t)
+    limit = [0]
+    foundArity = [False]
+    
+    # traceback return data structure changed in Py3.5 - normalize back to plain tuples
+    if system_version[:2] >= (3,5):
+        def extract_stack(limit=0):
+            # special handling for Python 3.5.0 - extra deep call stack by 1
+            offset = -3 if system_version == (3,5,0) else -2
+            frame_summary = traceback.extract_stack(limit=-offset+limit-1)[offset]
+            return [frame_summary[:2]]
+        def extract_tb(tb, limit=0):
+            frames = traceback.extract_tb(tb, limit=limit)
+            frame_summary = frames[-1]
+            return [frame_summary[:2]]
+    else:
+        extract_stack = traceback.extract_stack
+        extract_tb = traceback.extract_tb
+    
+    # synthesize what would be returned by traceback.extract_stack at the call to 
+    # user's parse action 'func', so that we don't incur call penalty at parse time
+    
+    LINE_DIFF = 6
+    # IF ANY CODE CHANGES, EVEN JUST COMMENTS OR BLANK LINES, BETWEEN THE NEXT LINE AND 
+    # THE CALL TO FUNC INSIDE WRAPPER, LINE_DIFF MUST BE MODIFIED!!!!
+    this_line = extract_stack(limit=2)[-1]
+    pa_call_line_synth = (this_line[0], this_line[1]+LINE_DIFF)
+
+    def wrapper(*args):
+        while 1:
+            try:
+                ret = func(*args[limit[0]:])
+                foundArity[0] = True
+                return ret
+            except TypeError:
+                # re-raise TypeErrors if they did not come from our arity testing
+                if foundArity[0]:
+                    raise
+                else:
+                    try:
+                        tb = sys.exc_info()[-1]
+                        if not extract_tb(tb, limit=2)[-1][:2] == pa_call_line_synth:
+                            raise
+                    finally:
+                        del tb
+
+                if limit[0] <= maxargs:
+                    limit[0] += 1
+                    continue
+                raise
+
+    # copy func name to wrapper for sensible debug output
+    func_name = ""
+    try:
+        func_name = getattr(func, '__name__', 
+                            getattr(func, '__class__').__name__)
+    except Exception:
+        func_name = str(func)
+    wrapper.__name__ = func_name
+
+    return wrapper
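The retry loop at the heart of `_trim_arity` can be shown on its own. This sketch is a simplified reimplementation for illustration only: it keeps the mutable-cell state and the `limit <= maxargs` retry rule, but omits the traceback filtering above, so a `TypeError` raised *inside* the user function would also (incorrectly) trigger a retry here:

```python
def trim_arity(func, maxargs=2):
    # Retry func with progressively fewer leading arguments until a call
    # succeeds; remember the working offset in a mutable cell.
    limit = [0]
    found_arity = [False]

    def wrapper(*args):
        while 1:
            try:
                ret = func(*args[limit[0]:])
                found_arity[0] = True
                return ret
            except TypeError:
                # NOTE: unlike the real _trim_arity, no traceback inspection
                # is done here, so TypeErrors from inside func are treated
                # as arity mismatches too.
                if found_arity[0]:
                    raise
                if limit[0] <= maxargs:
                    limit[0] += 1
                    continue
                raise
    return wrapper

# a one-argument parse action called with the full (s, loc, toks) signature
to_int = trim_arity(lambda toks: int(toks[0]))
print(to_int("1999/12/31", 0, ["1999"]))  # 1999
```

After the first successful call, `limit` stays fixed, so subsequent calls pay no retry cost.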
+
+class ParserElement(object):
+    """Abstract base level parser element class."""
+    DEFAULT_WHITE_CHARS = " \n\t\r"
+    verbose_stacktrace = False
+
+    @staticmethod
+    def setDefaultWhitespaceChars( chars ):
+        r"""
+        Overrides the default whitespace chars
+
+        Example::
+            # default whitespace chars are space, <TAB> and newline
+            OneOrMore(Word(alphas)).parseString("abc def\nghi jkl")  # -> ['abc', 'def', 'ghi', 'jkl']
+            
+            # change to just treat newline as significant
+            ParserElement.setDefaultWhitespaceChars(" \t")
+            OneOrMore(Word(alphas)).parseString("abc def\nghi jkl")  # -> ['abc', 'def']
+        """
+        ParserElement.DEFAULT_WHITE_CHARS = chars
+
+    @staticmethod
+    def inlineLiteralsUsing(cls):
+        """
+        Set class to be used for inclusion of string literals into a parser.
+        
+        Example::
+            # default literal class used is Literal
+            integer = Word(nums)
+            date_str = integer("year") + '/' + integer("month") + '/' + integer("day")           
+
+            date_str.parseString("1999/12/31")  # -> ['1999', '/', '12', '/', '31']
+
+
+            # change to Suppress
+            ParserElement.inlineLiteralsUsing(Suppress)
+            date_str = integer("year") + '/' + integer("month") + '/' + integer("day")           
+
+            date_str.parseString("1999/12/31")  # -> ['1999', '12', '31']
+        """
+        ParserElement._literalStringClass = cls
+
+    def __init__( self, savelist=False ):
+        self.parseAction = list()
+        self.failAction = None
+        #~ self.name = ""  # don't define self.name, let subclasses try/except upcall
+        self.strRepr = None
+        self.resultsName = None
+        self.saveAsList = savelist
+        self.skipWhitespace = True
+        self.whiteChars = ParserElement.DEFAULT_WHITE_CHARS
+        self.copyDefaultWhiteChars = True
+        self.mayReturnEmpty = False # used when checking for left-recursion
+        self.keepTabs = False
+        self.ignoreExprs = list()
+        self.debug = False
+        self.streamlined = False
+        self.mayIndexError = True # used to optimize exception handling for subclasses that don't advance parse index
+        self.errmsg = ""
+        self.modalResults = True # used to mark results names as modal (report only last) or cumulative (list all)
+        self.debugActions = ( None, None, None ) #custom debug actions
+        self.re = None
+        self.callPreparse = True # used to avoid redundant calls to preParse
+        self.callDuringTry = False
+
+    def copy( self ):
+        """
+        Make a copy of this C{ParserElement}.  Useful for defining different parse actions
+        for the same parsing pattern, using copies of the original parse element.
+        
+        Example::
+            integer = Word(nums).setParseAction(lambda toks: int(toks[0]))
+            integerK = integer.copy().addParseAction(lambda toks: toks[0]*1024) + Suppress("K")
+            integerM = integer.copy().addParseAction(lambda toks: toks[0]*1024*1024) + Suppress("M")
+            
+            print(OneOrMore(integerK | integerM | integer).parseString("5K 100 640K 256M"))
+        prints::
+            [5120, 100, 655360, 268435456]
+        Equivalent form of C{expr.copy()} is just C{expr()}::
+            integerM = integer().addParseAction(lambda toks: toks[0]*1024*1024) + Suppress("M")
+        """
+        cpy = copy.copy( self )
+        cpy.parseAction = self.parseAction[:]
+        cpy.ignoreExprs = self.ignoreExprs[:]
+        if self.copyDefaultWhiteChars:
+            cpy.whiteChars = ParserElement.DEFAULT_WHITE_CHARS
+        return cpy
+
+    def setName( self, name ):
+        """
+        Define name for this expression, makes debugging and exception messages clearer.
+        
+        Example::
+            Word(nums).parseString("ABC")  # -> Exception: Expected W:(0123...) (at char 0), (line:1, col:1)
+            Word(nums).setName("integer").parseString("ABC")  # -> Exception: Expected integer (at char 0), (line:1, col:1)
+        """
+        self.name = name
+        self.errmsg = "Expected " + self.name
+        if hasattr(self,"exception"):
+            self.exception.msg = self.errmsg
+        return self
+
+    def setResultsName( self, name, listAllMatches=False ):
+        """
+        Define name for referencing matching tokens as a nested attribute
+        of the returned parse results.
+        NOTE: this returns a *copy* of the original C{ParserElement} object;
+        this is so that the client can define a basic element, such as an
+        integer, and reference it in multiple places with different names.
+
+        You can also set results names using the abbreviated syntax,
+        C{expr("name")} in place of C{expr.setResultsName("name")} - 
+        see L{I{__call__}<__call__>}.
+
+        Example::
+            date_str = (integer.setResultsName("year") + '/' 
+                        + integer.setResultsName("month") + '/' 
+                        + integer.setResultsName("day"))
+
+            # equivalent form:
+            date_str = integer("year") + '/' + integer("month") + '/' + integer("day")
+        """
+        newself = self.copy()
+        if name.endswith("*"):
+            name = name[:-1]
+            listAllMatches=True
+        newself.resultsName = name
+        newself.modalResults = not listAllMatches
+        return newself
+
+    def setBreak(self,breakFlag = True):
+        """Method to invoke the Python pdb debugger when this element is
+           about to be parsed. Set C{breakFlag} to True to enable, False to
+           disable.
+        """
+        if breakFlag:
+            _parseMethod = self._parse
+            def breaker(instring, loc, doActions=True, callPreParse=True):
+                import pdb
+                pdb.set_trace()
+                return _parseMethod( instring, loc, doActions, callPreParse )
+            breaker._originalParseMethod = _parseMethod
+            self._parse = breaker
+        else:
+            if hasattr(self._parse,"_originalParseMethod"):
+                self._parse = self._parse._originalParseMethod
+        return self
+
+    def setParseAction( self, *fns, **kwargs ):
+        """
+        Define one or more actions to perform when successfully matching parse element definition.
+        Parse action fn is a callable method with 0-3 arguments, called as C{fn(s,loc,toks)},
+        C{fn(loc,toks)}, C{fn(toks)}, or just C{fn()}, where:
+         - s   = the original string being parsed (see note below)
+         - loc = the location of the matching substring
+         - toks = a list of the matched tokens, packaged as a C{L{ParseResults}} object
+        If the functions in fns modify the tokens, they can return them as the return
+        value from fn, and the modified list of tokens will replace the original.
+        Otherwise, fn does not need to return any value.
+
+        Optional keyword arguments:
+         - callDuringTry = (default=C{False}) indicate if parse action should be run during lookaheads and alternate testing
+
+        Note: the default parsing behavior is to expand tabs in the input string
+        before starting the parsing process.  See L{I{parseString}<parseString>} for more information
+        on parsing strings containing C{<TAB>}s, and suggested methods to maintain a
+        consistent view of the parsed string, the parse location, and line and column
+        positions within the parsed string.
+        
+        Example::
+            integer = Word(nums)
+            date_str = integer + '/' + integer + '/' + integer
+
+            date_str.parseString("1999/12/31")  # -> ['1999', '/', '12', '/', '31']
+
+            # use parse action to convert to ints at parse time
+            integer = Word(nums).setParseAction(lambda toks: int(toks[0]))
+            date_str = integer + '/' + integer + '/' + integer
+
+            # note that integer fields are now ints, not strings
+            date_str.parseString("1999/12/31")  # -> [1999, '/', 12, '/', 31]
+        """
+        self.parseAction = list(map(_trim_arity, list(fns)))
+        self.callDuringTry = kwargs.get("callDuringTry", False)
+        return self
+
+    def addParseAction( self, *fns, **kwargs ):
+        """
+        Add one or more parse actions to expression's list of parse actions. See L{I{setParseAction}<setParseAction>}.
+        
+        See examples in L{I{copy}<copy>}.
+        """
+        self.parseAction += list(map(_trim_arity, list(fns)))
+        self.callDuringTry = self.callDuringTry or kwargs.get("callDuringTry", False)
+        return self
+
+    def addCondition(self, *fns, **kwargs):
+        """Add a boolean predicate function to expression's list of parse actions. See 
+        L{I{setParseAction}<setParseAction>} for function call signatures. Unlike C{setParseAction}, 
+        functions passed to C{addCondition} need to return boolean success/fail of the condition.
+
+        Optional keyword arguments:
+         - message = define a custom message to be used in the raised exception
+         - fatal   = if True, will raise ParseFatalException to stop parsing immediately; otherwise will raise ParseException
+         
+        Example::
+            integer = Word(nums).setParseAction(lambda toks: int(toks[0]))
+            year_int = integer.copy()
+            year_int.addCondition(lambda toks: toks[0] >= 2000, message="Only support years 2000 and later")
+            date_str = year_int + '/' + integer + '/' + integer
+
+            result = date_str.parseString("1999/12/31")  # -> Exception: Only support years 2000 and later (at char 0), (line:1, col:1)
+        """
+        msg = kwargs.get("message", "failed user-defined condition")
+        exc_type = ParseFatalException if kwargs.get("fatal", False) else ParseException
+        for fn in fns:
+            fn = _trim_arity(fn)
+            # bind fn as a default argument so each pa keeps its own function;
+            # a bare closure here would late-bind every pa to the last fn in fns
+            def pa(s,l,t, fn=fn):
+                if not bool(fn(s,l,t)):
+                    raise exc_type(s,l,msg)
+            self.parseAction.append(pa)
+        self.callDuringTry = self.callDuringTry or kwargs.get("callDuringTry", False)
+        return self
+
+    def setFailAction( self, fn ):
+        """Define action to perform if parsing fails at this expression.
+           Fail action fn is a callable function that takes the arguments
+           C{fn(s,loc,expr,err)} where:
+            - s = string being parsed
+            - loc = location where expression match was attempted and failed
+            - expr = the parse expression that failed
+            - err = the exception thrown
+           The function returns no value.  It may throw C{L{ParseFatalException}}
+           if it is desired to stop parsing immediately."""
+        self.failAction = fn
+        return self
+
+    def _skipIgnorables( self, instring, loc ):
+        exprsFound = True
+        while exprsFound:
+            exprsFound = False
+            for e in self.ignoreExprs:
+                try:
+                    while 1:
+                        loc,dummy = e._parse( instring, loc )
+                        exprsFound = True
+                except ParseException:
+                    pass
+        return loc
+
+    def preParse( self, instring, loc ):
+        if self.ignoreExprs:
+            loc = self._skipIgnorables( instring, loc )
+
+        if self.skipWhitespace:
+            wt = self.whiteChars
+            instrlen = len(instring)
+            while loc < instrlen and instring[loc] in wt:
+                loc += 1
+
+        return loc
+
+    def parseImpl( self, instring, loc, doActions=True ):
+        return loc, []
+
+    def postParse( self, instring, loc, tokenlist ):
+        return tokenlist
+
+    #~ @profile
+    def _parseNoCache( self, instring, loc, doActions=True, callPreParse=True ):
+        debugging = ( self.debug ) #and doActions )
+
+        if debugging or self.failAction:
+            #~ print ("Match",self,"at loc",loc,"(%d,%d)" % ( lineno(loc,instring), col(loc,instring) ))
+            if (self.debugActions[0] ):
+                self.debugActions[0]( instring, loc, self )
+            if callPreParse and self.callPreparse:
+                preloc = self.preParse( instring, loc )
+            else:
+                preloc = loc
+            tokensStart = preloc
+            try:
+                try:
+                    loc,tokens = self.parseImpl( instring, preloc, doActions )
+                except IndexError:
+                    raise ParseException( instring, len(instring), self.errmsg, self )
+            except ParseBaseException as err:
+                #~ print ("Exception raised:", err)
+                if self.debugActions[2]:
+                    self.debugActions[2]( instring, tokensStart, self, err )
+                if self.failAction:
+                    self.failAction( instring, tokensStart, self, err )
+                raise
+        else:
+            if callPreParse and self.callPreparse:
+                preloc = self.preParse( instring, loc )
+            else:
+                preloc = loc
+            tokensStart = preloc
+            if self.mayIndexError or preloc >= len(instring):
+                try:
+                    loc,tokens = self.parseImpl( instring, preloc, doActions )
+                except IndexError:
+                    raise ParseException( instring, len(instring), self.errmsg, self )
+            else:
+                loc,tokens = self.parseImpl( instring, preloc, doActions )
+
+        tokens = self.postParse( instring, loc, tokens )
+
+        retTokens = ParseResults( tokens, self.resultsName, asList=self.saveAsList, modal=self.modalResults )
+        if self.parseAction and (doActions or self.callDuringTry):
+            if debugging:
+                try:
+                    for fn in self.parseAction:
+                        tokens = fn( instring, tokensStart, retTokens )
+                        if tokens is not None:
+                            retTokens = ParseResults( tokens,
+                                                      self.resultsName,
+                                                      asList=self.saveAsList and isinstance(tokens,(ParseResults,list)),
+                                                      modal=self.modalResults )
+                except ParseBaseException as err:
+                    #~ print "Exception raised in user parse action:", err
+                    if (self.debugActions[2] ):
+                        self.debugActions[2]( instring, tokensStart, self, err )
+                    raise
+            else:
+                for fn in self.parseAction:
+                    tokens = fn( instring, tokensStart, retTokens )
+                    if tokens is not None:
+                        retTokens = ParseResults( tokens,
+                                                  self.resultsName,
+                                                  asList=self.saveAsList and isinstance(tokens,(ParseResults,list)),
+                                                  modal=self.modalResults )
+        if debugging:
+            #~ print ("Matched",self,"->",retTokens.asList())
+            if (self.debugActions[1] ):
+                self.debugActions[1]( instring, tokensStart, loc, self, retTokens )
+
+        return loc, retTokens
+
+    def tryParse( self, instring, loc ):
+        try:
+            return self._parse( instring, loc, doActions=False )[0]
+        except ParseFatalException:
+            raise ParseException( instring, loc, self.errmsg, self)
+    
+    def canParseNext(self, instring, loc):
+        try:
+            self.tryParse(instring, loc)
+        except (ParseException, IndexError):
+            return False
+        else:
+            return True
+
+    class _UnboundedCache(object):
+        def __init__(self):
+            cache = {}
+            self.not_in_cache = not_in_cache = object()
+
+            def get(self, key):
+                return cache.get(key, not_in_cache)
+
+            def set(self, key, value):
+                cache[key] = value
+
+            def clear(self):
+                cache.clear()
+                
+            def cache_len(self):
+                return len(cache)
+
+            self.get = types.MethodType(get, self)
+            self.set = types.MethodType(set, self)
+            self.clear = types.MethodType(clear, self)
+            self.__len__ = types.MethodType(cache_len, self)
+
+    if _OrderedDict is not None:
+        class _FifoCache(object):
+            def __init__(self, size):
+                self.not_in_cache = not_in_cache = object()
+
+                cache = _OrderedDict()
+
+                def get(self, key):
+                    return cache.get(key, not_in_cache)
+
+                def set(self, key, value):
+                    cache[key] = value
+                    while len(cache) > size:
+                        try:
+                            cache.popitem(False)
+                        except KeyError:
+                            pass
+
+                def clear(self):
+                    cache.clear()
+
+                def cache_len(self):
+                    return len(cache)
+
+                self.get = types.MethodType(get, self)
+                self.set = types.MethodType(set, self)
+                self.clear = types.MethodType(clear, self)
+                self.__len__ = types.MethodType(cache_len, self)
+
+    else:
+        class _FifoCache(object):
+            def __init__(self, size):
+                self.not_in_cache = not_in_cache = object()
+
+                cache = {}
+                key_fifo = collections.deque([], size)
+
+                def get(self, key):
+                    return cache.get(key, not_in_cache)
+
+                def set(self, key, value):
+                    cache[key] = value
+                    while len(key_fifo) > size:
+                        cache.pop(key_fifo.popleft(), None)
+                    key_fifo.append(key)
+
+                def clear(self):
+                    cache.clear()
+                    key_fifo.clear()
+
+                def cache_len(self):
+                    return len(cache)
+
+                self.get = types.MethodType(get, self)
+                self.set = types.MethodType(set, self)
+                self.clear = types.MethodType(clear, self)
+                self.__len__ = types.MethodType(cache_len, self)
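Both `_FifoCache` variants implement the same contract: a bounded mapping that evicts its oldest entry once the size limit is exceeded, with a private sentinel object for cache misses. A standalone sketch of the deque-based fallback, simplified to plain closures instead of `types.MethodType` binding:

```python
import collections

class FifoCache:
    def __init__(self, size):
        self.not_in_cache = not_in_cache = object()
        cache = {}
        key_fifo = collections.deque()

        def get(key):
            # sentinel (not None) marks a miss, so None is a cacheable value
            return cache.get(key, not_in_cache)

        def set_(key, value):
            cache[key] = value
            key_fifo.append(key)
            while len(key_fifo) > size:
                # evict in insertion order (oldest first)
                cache.pop(key_fifo.popleft(), None)

        self.get = get
        self.set = set_

c = FifoCache(2)
c.set("a", 1); c.set("b", 2); c.set("c", 3)
print(c.get("a") is c.not_in_cache)  # True: "a" was evicted
print(c.get("c"))  # 3
```

The closure-over-locals style used above (and in the classes it sketches) avoids attribute lookups on `self` inside the hot cache methods.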
+
+    # argument cache for optimizing repeated calls when backtracking through recursive expressions
+    packrat_cache = {} # this is set later by enablePackrat(); this is here so that resetCache() doesn't fail
+    packrat_cache_lock = RLock()
+    packrat_cache_stats = [0, 0]
+
+    # this method gets repeatedly called during backtracking with the same arguments -
+    # we can cache these arguments and save ourselves the trouble of re-parsing the contained expression
+    def _parseCache( self, instring, loc, doActions=True, callPreParse=True ):
+        HIT, MISS = 0, 1
+        lookup = (self, instring, loc, callPreParse, doActions)
+        with ParserElement.packrat_cache_lock:
+            cache = ParserElement.packrat_cache
+            value = cache.get(lookup)
+            if value is cache.not_in_cache:
+                ParserElement.packrat_cache_stats[MISS] += 1
+                try:
+                    value = self._parseNoCache(instring, loc, doActions, callPreParse)
+                except ParseBaseException as pe:
+                    # cache a copy of the exception, without the traceback
+                    cache.set(lookup, pe.__class__(*pe.args))
+                    raise
+                else:
+                    cache.set(lookup, (value[0], value[1].copy()))
+                    return value
+            else:
+                ParserElement.packrat_cache_stats[HIT] += 1
+                if isinstance(value, Exception):
+                    raise value
+                return (value[0], value[1].copy())
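`_parseCache` memoizes both successful results and parse failures, keyed on the expression and location. A reduced sketch of that idea, using a hypothetical rule function and `ValueError` standing in for `ParseException`:

```python
calls = {"n": 0}
cache = {}

def digit_rule(s, loc):
    # hypothetical rule: match one ASCII digit at loc
    calls["n"] += 1
    if loc < len(s) and s[loc].isdigit():
        return loc + 1, s[loc]
    raise ValueError("expected digit at %d" % loc)

def parse_cached(rule, s, loc):
    key = (rule, s, loc)
    if key in cache:
        value = cache[key]
        if isinstance(value, Exception):
            raise value          # replay the cached failure
        return value
    try:
        value = rule(s, loc)
    except ValueError as e:
        cache[key] = e           # failures are cached too
        raise
    cache[key] = value
    return value

print(parse_cached(digit_rule, "7a", 0))  # (1, '7')
print(parse_cached(digit_rule, "7a", 0))  # (1, '7') -- served from cache
print(calls["n"])  # 1
```

Caching failures is what makes packrat parsing effective: during backtracking, the same expression is commonly re-tried at the same location and fails the same way.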
+
+    _parse = _parseNoCache
+
+    @staticmethod
+    def resetCache():
+        ParserElement.packrat_cache.clear()
+        ParserElement.packrat_cache_stats[:] = [0] * len(ParserElement.packrat_cache_stats)
+
+    _packratEnabled = False
+    @staticmethod
+    def enablePackrat(cache_size_limit=128):
+        """Enables "packrat" parsing, which adds memoizing to the parsing logic.
+           Repeated parse attempts at the same string location (which happens
+           often in many complex grammars) can immediately return a cached value,
+           instead of re-executing parsing/validating code.  Both valid results
+           and parsing exceptions are memoized.
+           
+           Parameters:
+            - cache_size_limit - (default=C{128}) - if an integer value is provided
+              will limit the size of the packrat cache; if None is passed, then
+              the cache size will be unbounded; if 0 is passed, the cache will
+              be effectively disabled.
+            
+           This speedup may break existing programs that use parse actions that
+           have side-effects.  For this reason, packrat parsing is disabled when
+           you first import pyparsing.  To activate the packrat feature, your
+           program must call the class method C{ParserElement.enablePackrat()}.  If
+           your program uses C{psyco} to "compile as you go", you must call
+           C{enablePackrat} before calling C{psyco.full()}.  If you do not do this,
+           Python will crash.  For best results, call C{enablePackrat()} immediately
+           after importing pyparsing.
+           
+           Example::
+               import pyparsing
+               pyparsing.ParserElement.enablePackrat()
+        """
+        if not ParserElement._packratEnabled:
+            ParserElement._packratEnabled = True
+            if cache_size_limit is None:
+                ParserElement.packrat_cache = ParserElement._UnboundedCache()
+            else:
+                ParserElement.packrat_cache = ParserElement._FifoCache(cache_size_limit)
+            ParserElement._parse = ParserElement._parseCache
+
+    def parseString( self, instring, parseAll=False ):
+        """
+        Execute the parse expression with the given string.
+        This is the main interface to the client code, once the complete
+        expression has been built.
+
+        If you want the grammar to require that the entire input string be
+        successfully parsed, then set C{parseAll} to True (equivalent to ending
+        the grammar with C{L{StringEnd()}}).
+
+        Note: C{parseString} implicitly calls C{expandtabs()} on the input string,
+        in order to report proper column numbers in parse actions.
+        If the input string contains tabs and
+        the grammar uses parse actions that use the C{loc} argument to index into the
+        string being parsed, you can ensure you have a consistent view of the input
+        string by:
+         - calling C{parseWithTabs} on your grammar before calling C{parseString}
+           (see L{I{parseWithTabs}<parseWithTabs>})
+         - define your parse action using the full C{(s,loc,toks)} signature, and
+           reference the input string using the parse action's C{s} argument
+         - explicitly expand the tabs in your input string before calling
+           C{parseString}
+        
+        Example::
+            Word('a').parseString('aaaaabaaa')  # -> ['aaaaa']
+            Word('a').parseString('aaaaabaaa', parseAll=True)  # -> Exception: Expected end of text
+        """
+        ParserElement.resetCache()
+        if not self.streamlined:
+            self.streamline()
+            #~ self.saveAsList = True
+        for e in self.ignoreExprs:
+            e.streamline()
+        if not self.keepTabs:
+            instring = instring.expandtabs()
+        try:
+            loc, tokens = self._parse( instring, 0 )
+            if parseAll:
+                loc = self.preParse( instring, loc )
+                se = Empty() + StringEnd()
+                se._parse( instring, loc )
+        except ParseBaseException as exc:
+            if ParserElement.verbose_stacktrace:
+                raise
+            else:
+                # catch and re-raise exception from here, clears out pyparsing internal stack trace
+                raise exc
+        else:
+            return tokens
+
+    def scanString( self, instring, maxMatches=_MAX_INT, overlap=False ):
+        """
+        Scan the input string for expression matches.  Each match will return the
+        matching tokens, start location, and end location.  May be called with optional
+        C{maxMatches} argument, to clip scanning after 'n' matches are found.  If
+        C{overlap} is specified, then overlapping matches will be reported.
+
+        Note that the start and end locations are reported relative to the string
+        being parsed.  See L{I{parseString}<parseString>} for more information on parsing
+        strings with embedded tabs.
+
+        Example::
+            source = "sldjf123lsdjjkf345sldkjf879lkjsfd987"
+            print(source)
+            for tokens,start,end in Word(alphas).scanString(source):
+                print(' '*start + '^'*(end-start))
+                print(' '*start + tokens[0])
+        
+        prints::
+        
+            sldjf123lsdjjkf345sldkjf879lkjsfd987
+            ^^^^^
+            sldjf
+                    ^^^^^^^
+                    lsdjjkf
+                              ^^^^^^
+                              sldkjf
+                                       ^^^^^^
+                                       lkjsfd
+        """
+        if not self.streamlined:
+            self.streamline()
+        for e in self.ignoreExprs:
+            e.streamline()
+
+        if not self.keepTabs:
+            instring = _ustr(instring).expandtabs()
+        instrlen = len(instring)
+        loc = 0
+        preparseFn = self.preParse
+        parseFn = self._parse
+        ParserElement.resetCache()
+        matches = 0
+        try:
+            while loc <= instrlen and matches < maxMatches:
+                try:
+                    preloc = preparseFn( instring, loc )
+                    nextLoc,tokens = parseFn( instring, preloc, callPreParse=False )
+                except ParseException:
+                    loc = preloc+1
+                else:
+                    if nextLoc > loc:
+                        matches += 1
+                        yield tokens, preloc, nextLoc
+                        if overlap:
+                            nextloc = preparseFn( instring, loc )
+                            if nextloc > loc:
+                                loc = nextLoc
+                            else:
+                                loc += 1
+                        else:
+                            loc = nextLoc
+                    else:
+                        loc = preloc+1
+        except ParseBaseException as exc:
+            if ParserElement.verbose_stacktrace:
+                raise
+            else:
+                # catch and re-raise exception from here, clears out pyparsing internal stack trace
+                raise exc
+
+    def transformString( self, instring ):
+        """
+        Extension to C{L{scanString}}, to modify matching text with modified tokens that may
+        be returned from a parse action.  To use C{transformString}, define a grammar and
+        attach a parse action to it that modifies the returned token list.
+        Invoking C{transformString()} on a target string will then scan for matches,
+        and replace the matched text patterns according to the logic in the parse
+        action.  C{transformString()} returns the resulting transformed string.
+        
+        Example::
+            wd = Word(alphas)
+            wd.setParseAction(lambda toks: toks[0].title())
+            
+            print(wd.transformString("now is the winter of our discontent made glorious summer by this sun of york."))
+        Prints::
+            Now Is The Winter Of Our Discontent Made Glorious Summer By This Sun Of York.
+        """
+        out = []
+        lastE = 0
+        # force preservation of s, to minimize unwanted transformation of string, and to
+        # keep string locs straight between transformString and scanString
+        self.keepTabs = True
+        try:
+            for t,s,e in self.scanString( instring ):
+                out.append( instring[lastE:s] )
+                if t:
+                    if isinstance(t,ParseResults):
+                        out += t.asList()
+                    elif isinstance(t,list):
+                        out += t
+                    else:
+                        out.append(t)
+                lastE = e
+            out.append(instring[lastE:])
+            out = [o for o in out if o]
+            return "".join(map(_ustr,_flatten(out)))
+        except ParseBaseException as exc:
+            if ParserElement.verbose_stacktrace:
+                raise
+            else:
+                # catch and re-raise exception from here, clears out pyparsing internal stack trace
+                raise exc
+
+    def searchString( self, instring, maxMatches=_MAX_INT ):
+        """
+        Another extension to C{L{scanString}}, simplifying the access to the tokens found
+        to match the given parse expression.  May be called with optional
+        C{maxMatches} argument, to clip searching after 'n' matches are found.
+        
+        Example::
+            # a capitalized word starts with an uppercase letter, followed by zero or more lowercase letters
+            cap_word = Word(alphas.upper(), alphas.lower())
+            
+            print(cap_word.searchString("More than Iron, more than Lead, more than Gold I need Electricity"))
+
+            # the sum() builtin can be used to merge results into a single ParseResults object
+            print(sum(cap_word.searchString("More than Iron, more than Lead, more than Gold I need Electricity")))
+        prints::
+            [['More'], ['Iron'], ['Lead'], ['Gold'], ['I'], ['Electricity']]
+            ['More', 'Iron', 'Lead', 'Gold', 'I', 'Electricity']
+        """
+        try:
+            return ParseResults([ t for t,s,e in self.scanString( instring, maxMatches ) ])
+        except ParseBaseException as exc:
+            if ParserElement.verbose_stacktrace:
+                raise
+            else:
+                # catch and re-raise exception from here, clears out pyparsing internal stack trace
+                raise exc
+
+    def split(self, instring, maxsplit=_MAX_INT, includeSeparators=False):
+        """
+        Generator method to split a string using the given expression as a separator.
+        May be called with optional C{maxsplit} argument, to limit the number of splits;
+        and the optional C{includeSeparators} argument (default=C{False}), indicating
+        whether the separating matched text should be included in the split results.
+        
+        Example::        
+            punc = oneOf(list(".,;:/-!?"))
+            print(list(punc.split("This, this?, this sentence, is badly punctuated!")))
+        prints::
+            ['This', ' this', '', ' this sentence', ' is badly punctuated', '']
+        """
+        splits = 0
+        last = 0
+        for t,s,e in self.scanString(instring, maxMatches=maxsplit):
+            yield instring[last:s]
+            if includeSeparators:
+                yield t[0]
+            last = e
+        yield instring[last:]
+
+    def __add__(self, other ):
+        """
+        Implementation of + operator - returns C{L{And}}. Adding strings to a ParserElement
+        converts them to L{Literal}s by default.
+        
+        Example::
+            greet = Word(alphas) + "," + Word(alphas) + "!"
+            hello = "Hello, World!"
+            print (hello, "->", greet.parseString(hello))
+        Prints::
+            Hello, World! -> ['Hello', ',', 'World', '!']
+        """
+        if isinstance( other, basestring ):
+            other = ParserElement._literalStringClass( other )
+        if not isinstance( other, ParserElement ):
+            warnings.warn("Cannot combine element of type %s with ParserElement" % type(other),
+                    SyntaxWarning, stacklevel=2)
+            return None
+        return And( [ self, other ] )
+
+    def __radd__(self, other ):
+        """
+        Implementation of + operator when left operand is not a C{L{ParserElement}}
+        """
+        if isinstance( other, basestring ):
+            other = ParserElement._literalStringClass( other )
+        if not isinstance( other, ParserElement ):
+            warnings.warn("Cannot combine element of type %s with ParserElement" % type(other),
+                    SyntaxWarning, stacklevel=2)
+            return None
+        return other + self
+
+    def __sub__(self, other):
+        """
+        Implementation of - operator, returns C{L{And}} with error stop
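+
+        Example (illustrative sketch)::
+            # once "port" is seen, a numeric value is required - a missing
+            # value raises an unrecoverable error instead of backtracking
+            port_defn = Keyword("port") - Word(nums)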
+        """
+        if isinstance( other, basestring ):
+            other = ParserElement._literalStringClass( other )
+        if not isinstance( other, ParserElement ):
+            warnings.warn("Cannot combine element of type %s with ParserElement" % type(other),
+                    SyntaxWarning, stacklevel=2)
+            return None
+        return self + And._ErrorStop() + other
+
+    def __rsub__(self, other ):
+        """
+        Implementation of - operator when left operand is not a C{L{ParserElement}}
+        """
+        if isinstance( other, basestring ):
+            other = ParserElement._literalStringClass( other )
+        if not isinstance( other, ParserElement ):
+            warnings.warn("Cannot combine element of type %s with ParserElement" % type(other),
+                    SyntaxWarning, stacklevel=2)
+            return None
+        return other - self
+
+    def __mul__(self,other):
+        """
+        Implementation of * operator, allows use of C{expr * 3} in place of
+        C{expr + expr + expr}.  Expressions may also be multiplied by a 2-integer
+        tuple, similar to C{{min,max}} multipliers in regular expressions.  Tuples
+        may also include C{None} as in:
+         - C{expr*(n,None)} or C{expr*(n,)} is equivalent
+              to C{expr*n + L{ZeroOrMore}(expr)}
+              (read as "at least n instances of C{expr}")
+         - C{expr*(None,n)} is equivalent to C{expr*(0,n)}
+              (read as "0 to n instances of C{expr}")
+         - C{expr*(None,None)} is equivalent to C{L{ZeroOrMore}(expr)}
+         - C{expr*(1,None)} is equivalent to C{L{OneOrMore}(expr)}
+
+        Note that C{expr*(None,n)} does not raise an exception if
+        more than n exprs exist in the input stream; that is,
+        C{expr*(None,n)} does not enforce a maximum number of expr
+        occurrences.  If this behavior is desired, then write
+        C{expr*(None,n) + ~expr}
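+
+        Example (illustrative sketch)::
+            label = Word(alphas)
+            label * 3         # same as label + label + label
+            label * (2,4)     # match 2, 3, or 4 occurrences
+            label * (2,None)  # match 2 or more occurrences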
+        """
+        if isinstance(other,int):
+            minElements, optElements = other,0
+        elif isinstance(other,tuple):
+            other = (other + (None, None))[:2]
+            if other[0] is None:
+                other = (0, other[1])
+            if isinstance(other[0],int) and other[1] is None:
+                if other[0] == 0:
+                    return ZeroOrMore(self)
+                if other[0] == 1:
+                    return OneOrMore(self)
+                else:
+                    return self*other[0] + ZeroOrMore(self)
+            elif isinstance(other[0],int) and isinstance(other[1],int):
+                minElements, optElements = other
+                optElements -= minElements
+            else:
+                raise TypeError("cannot multiply 'ParserElement' and ('%s','%s') objects" % (type(other[0]), type(other[1])))
+        else:
+            raise TypeError("cannot multiply 'ParserElement' and '%s' objects" % type(other))
+
+        if minElements < 0:
+            raise ValueError("cannot multiply ParserElement by negative value")
+        if optElements < 0:
+            raise ValueError("second tuple value must be greater or equal to first tuple value")
+        if minElements == optElements == 0:
+            raise ValueError("cannot multiply ParserElement by 0 or (0,0)")
+
+        if (optElements):
+            def makeOptionalList(n):
+                if n>1:
+                    return Optional(self + makeOptionalList(n-1))
+                else:
+                    return Optional(self)
+            if minElements:
+                if minElements == 1:
+                    ret = self + makeOptionalList(optElements)
+                else:
+                    ret = And([self]*minElements) + makeOptionalList(optElements)
+            else:
+                ret = makeOptionalList(optElements)
+        else:
+            if minElements == 1:
+                ret = self
+            else:
+                ret = And([self]*minElements)
+        return ret
+
+    def __rmul__(self, other):
+        return self.__mul__(other)
+
+    def __or__(self, other ):
+        """
+        Implementation of | operator - returns C{L{MatchFirst}}
+        """
+        if isinstance( other, basestring ):
+            other = ParserElement._literalStringClass( other )
+        if not isinstance( other, ParserElement ):
+            warnings.warn("Cannot combine element of type %s with ParserElement" % type(other),
+                    SyntaxWarning, stacklevel=2)
+            return None
+        return MatchFirst( [ self, other ] )
+
+    def __ror__(self, other ):
+        """
+        Implementation of | operator when left operand is not a C{L{ParserElement}}
+        """
+        if isinstance( other, basestring ):
+            other = ParserElement._literalStringClass( other )
+        if not isinstance( other, ParserElement ):
+            warnings.warn("Cannot combine element of type %s with ParserElement" % type(other),
+                    SyntaxWarning, stacklevel=2)
+            return None
+        return other | self
+
+    def __xor__(self, other ):
+        """
+        Implementation of ^ operator - returns C{L{Or}}
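+
+        Example (illustrative sketch)::
+            # Or tries all alternatives and chooses the longest match,
+            # so "3.1416" matches the combined alternative, not just "3"
+            number = Word(nums) ^ Combine(Word(nums) + "." + Word(nums))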
+        """
+        if isinstance( other, basestring ):
+            other = ParserElement._literalStringClass( other )
+        if not isinstance( other, ParserElement ):
+            warnings.warn("Cannot combine element of type %s with ParserElement" % type(other),
+                    SyntaxWarning, stacklevel=2)
+            return None
+        return Or( [ self, other ] )
+
+    def __rxor__(self, other ):
+        """
+        Implementation of ^ operator when left operand is not a C{L{ParserElement}}
+        """
+        if isinstance( other, basestring ):
+            other = ParserElement._literalStringClass( other )
+        if not isinstance( other, ParserElement ):
+            warnings.warn("Cannot combine element of type %s with ParserElement" % type(other),
+                    SyntaxWarning, stacklevel=2)
+            return None
+        return other ^ self
+
+    def __and__(self, other ):
+        """
+        Implementation of & operator - returns C{L{Each}}
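+
+        Example (illustrative sketch)::
+            # color and size may appear in either order, but each exactly once
+            shape_attrs = (Keyword("color") + Word(alphas)) & (Keyword("size") + Word(nums))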
+        """
+        if isinstance( other, basestring ):
+            other = ParserElement._literalStringClass( other )
+        if not isinstance( other, ParserElement ):
+            warnings.warn("Cannot combine element of type %s with ParserElement" % type(other),
+                    SyntaxWarning, stacklevel=2)
+            return None
+        return Each( [ self, other ] )
+
+    def __rand__(self, other ):
+        """
+        Implementation of & operator when left operand is not a C{L{ParserElement}}
+        """
+        if isinstance( other, basestring ):
+            other = ParserElement._literalStringClass( other )
+        if not isinstance( other, ParserElement ):
+            warnings.warn("Cannot combine element of type %s with ParserElement" % type(other),
+                    SyntaxWarning, stacklevel=2)
+            return None
+        return other & self
+
+    def __invert__( self ):
+        """
+        Implementation of ~ operator - returns C{L{NotAny}}
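+
+        Example (illustrative sketch)::
+            # match any word except the reserved word 'end'
+            ident = ~Keyword("end") + Word(alphas)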
+        """
+        return NotAny( self )
+
+    def __call__(self, name=None):
+        """
+        Shortcut for C{L{setResultsName}}, with C{listAllMatches=False}.
+        
+        If C{name} is given with a trailing C{'*'} character, then C{listAllMatches} will be
+        passed as C{True}.
+           
+        If C{name} is omitted, same as calling C{L{copy}}.
+
+        Example::
+            # these are equivalent
+            userdata = Word(alphas).setResultsName("name") + Word(nums+"-").setResultsName("socsecno")
+            userdata = Word(alphas)("name") + Word(nums+"-")("socsecno")             
+        """
+        if name is not None:
+            return self.setResultsName(name)
+        else:
+            return self.copy()
+
+    def suppress( self ):
+        """
+        Suppresses the output of this C{ParserElement}; useful to keep punctuation from
+        cluttering up returned output.
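+
+        Example (illustrative sketch)::
+            # match the brackets, but omit them from the results
+            content = Literal("[").suppress() + Word(alphas) + Literal("]").suppress()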
+        """
+        return Suppress( self )
+
+    def leaveWhitespace( self ):
+        """
+        Disables the skipping of whitespace before matching the characters in the
+        C{ParserElement}'s defined pattern.  This is normally only used internally by
+        the pyparsing module, but may be needed in some whitespace-sensitive grammars.
+        """
+        self.skipWhitespace = False
+        return self
+
+    def setWhitespaceChars( self, chars ):
+        """
+        Overrides the default whitespace chars
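+
+        Example (illustrative sketch)::
+            # skip only spaces and tabs, so newlines become significant
+            word = Word(alphas).setWhitespaceChars(" \t")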
+        """
+        self.skipWhitespace = True
+        self.whiteChars = chars
+        self.copyDefaultWhiteChars = False
+        return self
+
+    def parseWithTabs( self ):
+        """
+        Overrides default behavior to expand C{<TAB>}s to spaces before parsing the input string.
+        Must be called before C{parseString} when the input grammar contains elements that
+        match C{<TAB>} characters.
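+
+        Example (illustrative sketch)::
+            # grammar contains literal tab separators, so keep tabs intact
+            tsv_row = delimitedList(Word(printables), delim="\t").parseWithTabs()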
+        """
+        self.keepTabs = True
+        return self
+
+    def ignore( self, other ):
+        """
+        Define expression to be ignored (e.g., comments) while doing pattern
+        matching; may be called repeatedly, to define multiple comment or other
+        ignorable patterns.
+        
+        Example::
+            patt = OneOrMore(Word(alphas))
+            patt.parseString('ablaj /* comment */ lskjd') # -> ['ablaj']
+            
+            patt.ignore(cStyleComment)
+            patt.parseString('ablaj /* comment */ lskjd') # -> ['ablaj', 'lskjd']
+        """
+        if isinstance(other, basestring):
+            other = Suppress(other)
+
+        if isinstance( other, Suppress ):
+            if other not in self.ignoreExprs:
+                self.ignoreExprs.append(other)
+        else:
+            self.ignoreExprs.append( Suppress( other.copy() ) )
+        return self
+
+    def setDebugActions( self, startAction, successAction, exceptionAction ):
+        """
+        Enable display of debugging messages while doing pattern matching.
+        """
+        self.debugActions = (startAction or _defaultStartDebugAction,
+                             successAction or _defaultSuccessDebugAction,
+                             exceptionAction or _defaultExceptionDebugAction)
+        self.debug = True
+        return self
+
+    def setDebug( self, flag=True ):
+        """
+        Enable display of debugging messages while doing pattern matching.
+        Set C{flag} to True to enable, False to disable.
+
+        Example::
+            wd = Word(alphas).setName("alphaword")
+            integer = Word(nums).setName("numword")
+            term = wd | integer
+            
+            # turn on debugging for wd
+            wd.setDebug()
+
+            OneOrMore(term).parseString("abc 123 xyz 890")
+        
+        prints::
+            Match alphaword at loc 0(1,1)
+            Matched alphaword -> ['abc']
+            Match alphaword at loc 3(1,4)
+            Exception raised:Expected alphaword (at char 4), (line:1, col:5)
+            Match alphaword at loc 7(1,8)
+            Matched alphaword -> ['xyz']
+            Match alphaword at loc 11(1,12)
+            Exception raised:Expected alphaword (at char 12), (line:1, col:13)
+            Match alphaword at loc 15(1,16)
+            Exception raised:Expected alphaword (at char 15), (line:1, col:16)
+
+        The output shown is that produced by the default debug actions - custom debug actions can be
+        specified using L{setDebugActions}. Prior to attempting
+        to match the C{wd} expression, the debugging message C{"Match <exprname> at loc <n>(<line>,<col>)"}
+        is shown. Then if the parse succeeds, a C{"Matched"} message is shown; if it fails, an C{"Exception raised"}
+        message is shown. Also note the use of L{setName} to assign a human-readable name to the expression,
+        which makes debugging and exception messages easier to understand - for instance, the default
+        name created for the C{Word} expression without calling C{setName} is C{"W:(ABCD...)"}.
+        """
+        if flag:
+            self.setDebugActions( _defaultStartDebugAction, _defaultSuccessDebugAction, _defaultExceptionDebugAction )
+        else:
+            self.debug = False
+        return self
+
+    def __str__( self ):
+        return self.name
+
+    def __repr__( self ):
+        return _ustr(self)
+
+    def streamline( self ):
+        self.streamlined = True
+        self.strRepr = None
+        return self
+
+    def checkRecursion( self, parseElementList ):
+        pass
+
+    def validate( self, validateTrace=[] ):
+        """
+        Check defined expressions for valid structure, check for infinite recursive definitions.
+        """
+        self.checkRecursion( [] )
+
+    def parseFile( self, file_or_filename, parseAll=False ):
+        """
+        Execute the parse expression on the given file or filename.
+        If a filename is specified (instead of a file object),
+        the entire file is opened, read, and closed before parsing.
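+
+        Example (illustrative sketch; "data.txt" is a hypothetical file)::
+            results = OneOrMore(Word(alphas)).parseFile("data.txt")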
+        """
+        try:
+            file_contents = file_or_filename.read()
+        except AttributeError:
+            with open(file_or_filename, "r") as f:
+                file_contents = f.read()
+        try:
+            return self.parseString(file_contents, parseAll)
+        except ParseBaseException as exc:
+            if ParserElement.verbose_stacktrace:
+                raise
+            else:
+                # catch and re-raise exception from here, clears out pyparsing internal stack trace
+                raise exc
+
+    def __eq__(self,other):
+        if isinstance(other, ParserElement):
+            return self is other or vars(self) == vars(other)
+        elif isinstance(other, basestring):
+            return self.matches(other)
+        else:
+            return super(ParserElement,self)==other
+
+    def __ne__(self,other):
+        return not (self == other)
+
+    def __hash__(self):
+        return hash(id(self))
+
+    def __req__(self,other):
+        return self == other
+
+    def __rne__(self,other):
+        return not (self == other)
+
+    def matches(self, testString, parseAll=True):
+        """
+        Method for quick testing of a parser against a test string. Good for simple 
+        inline microtests of sub expressions while building up a larger parser.
+           
+        Parameters:
+         - testString - to test against this expression for a match
+         - parseAll - (default=C{True}) - flag to pass to C{L{parseString}} when running tests
+            
+        Example::
+            expr = Word(nums)
+            assert expr.matches("100")
+        """
+        try:
+            self.parseString(_ustr(testString), parseAll=parseAll)
+            return True
+        except ParseBaseException:
+            return False
+                
+    def runTests(self, tests, parseAll=True, comment='#', fullDump=True, printResults=True, failureTests=False):
+        """
+        Execute the parse expression on a series of test strings, showing each
+        test, the parsed results or where the parse failed. Quick and easy way to
+        run a parse expression against a list of sample strings.
+           
+        Parameters:
+         - tests - a list of separate test strings, or a multiline string of test strings
+         - parseAll - (default=C{True}) - flag to pass to C{L{parseString}} when running tests           
+         - comment - (default=C{'#'}) - expression for indicating embedded comments in the test 
+              string; pass None to disable comment filtering
+         - fullDump - (default=C{True}) - dump results as list followed by results names in nested outline;
+              if False, only dump nested list
+         - printResults - (default=C{True}) prints test output to stdout
+         - failureTests - (default=C{False}) indicates if these tests are expected to fail parsing
+
+        Returns: a (success, results) tuple, where success indicates that all tests succeeded
+        (or failed if C{failureTests} is True), and the results contain a list of lines of each 
+        test's output
+        
+        Example::
+            number_expr = pyparsing_common.number.copy()
+
+            result = number_expr.runTests('''
+                # unsigned integer
+                100
+                # negative integer
+                -100
+                # float with scientific notation
+                6.02e23
+                # integer with scientific notation
+                1e-12
+                ''')
+            print("Success" if result[0] else "Failed!")
+
+            result = number_expr.runTests('''
+                # stray character
+                100Z
+                # missing leading digit before '.'
+                -.100
+                # too many '.'
+                3.14.159
+                ''', failureTests=True)
+            print("Success" if result[0] else "Failed!")
+        prints::
+            # unsigned integer
+            100
+            [100]
+
+            # negative integer
+            -100
+            [-100]
+
+            # float with scientific notation
+            6.02e23
+            [6.02e+23]
+
+            # integer with scientific notation
+            1e-12
+            [1e-12]
+
+            Success
+            
+            # stray character
+            100Z
+               ^
+            FAIL: Expected end of text (at char 3), (line:1, col:4)
+
+            # missing leading digit before '.'
+            -.100
+            ^
+            FAIL: Expected {real number with scientific notation | real number | signed integer} (at char 0), (line:1, col:1)
+
+            # too many '.'
+            3.14.159
+                ^
+            FAIL: Expected end of text (at char 4), (line:1, col:5)
+
+            Success
+
+        Each test string must be on a single line. If you want to test a string that spans multiple
+        lines, create a test like this::
+
+            expr.runTests(r"this is a test\\n of strings that spans \\n 3 lines")
+        
+        (Note that this is a raw string literal, you must include the leading 'r'.)
+        """
+        if isinstance(tests, basestring):
+            tests = list(map(str.strip, tests.rstrip().splitlines()))
+        if isinstance(comment, basestring):
+            comment = Literal(comment)
+        allResults = []
+        comments = []
+        success = True
+        for t in tests:
+            if (comment is not None and comment.matches(t, False)) or (comments and not t):
+                comments.append(t)
+                continue
+            if not t:
+                continue
+            out = ['\n'.join(comments), t]
+            comments = []
+            try:
+                t = t.replace(r'\n','\n')
+                result = self.parseString(t, parseAll=parseAll)
+                out.append(result.dump(full=fullDump))
+                success = success and not failureTests
+            except ParseBaseException as pe:
+                fatal = "(FATAL)" if isinstance(pe, ParseFatalException) else ""
+                if '\n' in t:
+                    out.append(line(pe.loc, t))
+                    out.append(' '*(col(pe.loc,t)-1) + '^' + fatal)
+                else:
+                    out.append(' '*pe.loc + '^' + fatal)
+                out.append("FAIL: " + str(pe))
+                success = success and failureTests
+                result = pe
+            except Exception as exc:
+                out.append("FAIL-EXCEPTION: " + str(exc))
+                success = success and failureTests
+                result = exc
+
+            if printResults:
+                if fullDump:
+                    out.append('')
+                print('\n'.join(out))
+
+            allResults.append((t, result))
+        
+        return success, allResults
+
+        
+class Token(ParserElement):
+    """
+    Abstract C{ParserElement} subclass, for defining atomic matching patterns.
+    """
+    def __init__( self ):
+        super(Token,self).__init__( savelist=False )
+
+
+class Empty(Token):
+    """
+    An empty token, will always match.
+    """
+    def __init__( self ):
+        super(Empty,self).__init__()
+        self.name = "Empty"
+        self.mayReturnEmpty = True
+        self.mayIndexError = False
+
+
+class NoMatch(Token):
+    """
+    A token that will never match.
+    """
+    def __init__( self ):
+        super(NoMatch,self).__init__()
+        self.name = "NoMatch"
+        self.mayReturnEmpty = True
+        self.mayIndexError = False
+        self.errmsg = "Unmatchable token"
+
+    def parseImpl( self, instring, loc, doActions=True ):
+        raise ParseException(instring, loc, self.errmsg, self)
+
+
+class Literal(Token):
+    """
+    Token to exactly match a specified string.
+    
+    Example::
+        Literal('blah').parseString('blah')  # -> ['blah']
+        Literal('blah').parseString('blahfooblah')  # -> ['blah']
+        Literal('blah').parseString('bla')  # -> Exception: Expected "blah"
+    
+    For case-insensitive matching, use L{CaselessLiteral}.
+    
+    For keyword matching (force word break before and after the matched string),
+    use L{Keyword} or L{CaselessKeyword}.
+    """
+    def __init__( self, matchString ):
+        super(Literal,self).__init__()
+        self.match = matchString
+        self.matchLen = len(matchString)
+        try:
+            self.firstMatchChar = matchString[0]
+        except IndexError:
+            warnings.warn("null string passed to Literal; use Empty() instead",
+                            SyntaxWarning, stacklevel=2)
+            self.__class__ = Empty
+        self.name = '"%s"' % _ustr(self.match)
+        self.errmsg = "Expected " + self.name
+        self.mayReturnEmpty = False
+        self.mayIndexError = False
+
+    # Performance tuning: this routine gets called a *lot*
+    # if this is a single character match string  and the first character matches,
+    # short-circuit as quickly as possible, and avoid calling startswith
+    #~ @profile
+    def parseImpl( self, instring, loc, doActions=True ):
+        if (instring[loc] == self.firstMatchChar and
+            (self.matchLen==1 or instring.startswith(self.match,loc)) ):
+            return loc+self.matchLen, self.match
+        raise ParseException(instring, loc, self.errmsg, self)
+_L = Literal
+ParserElement._literalStringClass = Literal
+
+class Keyword(Token):
+    """
+    Token to exactly match a specified string as a keyword; that is, it must not be
+    immediately preceded or followed by a keyword (identifier) character.  Compare with C{L{Literal}}:
+     - C{Literal("if")} will match the leading C{'if'} in C{'ifAndOnlyIf'}.
+     - C{Keyword("if")} will not; it will only match the leading C{'if'} in C{'if x=1'}, or C{'if(y==2)'}
+    Accepts two optional constructor arguments in addition to the keyword string:
+     - C{identChars} is a string of characters that would be valid identifier characters,
+          defaulting to all alphanumerics + "_" and "$"
+     - C{caseless} allows case-insensitive matching, default is C{False}.
+       
+    Example::
+        Keyword("start").parseString("start")  # -> ['start']
+        Keyword("start").parseString("starting")  # -> Exception
+
+    For case-insensitive matching, use L{CaselessKeyword}.
+    """
+    DEFAULT_KEYWORD_CHARS = alphanums+"_$"
+
+    def __init__( self, matchString, identChars=None, caseless=False ):
+        super(Keyword,self).__init__()
+        if identChars is None:
+            identChars = Keyword.DEFAULT_KEYWORD_CHARS
+        self.match = matchString
+        self.matchLen = len(matchString)
+        try:
+            self.firstMatchChar = matchString[0]
+        except IndexError:
+            warnings.warn("null string passed to Keyword; use Empty() instead",
+                            SyntaxWarning, stacklevel=2)
+        self.name = '"%s"' % self.match
+        self.errmsg = "Expected " + self.name
+        self.mayReturnEmpty = False
+        self.mayIndexError = False
+        self.caseless = caseless
+        if caseless:
+            self.caselessmatch = matchString.upper()
+            identChars = identChars.upper()
+        self.identChars = set(identChars)
+
+    def parseImpl( self, instring, loc, doActions=True ):
+        if self.caseless:
+            if ( (instring[ loc:loc+self.matchLen ].upper() == self.caselessmatch) and
+                 (loc >= len(instring)-self.matchLen or instring[loc+self.matchLen].upper() not in self.identChars) and
+                 (loc == 0 or instring[loc-1].upper() not in self.identChars) ):
+                return loc+self.matchLen, self.match
+        else:
+            if (instring[loc] == self.firstMatchChar and
+                (self.matchLen==1 or instring.startswith(self.match,loc)) and
+                (loc >= len(instring)-self.matchLen or instring[loc+self.matchLen] not in self.identChars) and
+                (loc == 0 or instring[loc-1] not in self.identChars) ):
+                return loc+self.matchLen, self.match
+        raise ParseException(instring, loc, self.errmsg, self)
+
+    def copy(self):
+        c = super(Keyword,self).copy()
+        c.identChars = Keyword.DEFAULT_KEYWORD_CHARS
+        return c
+
+    @staticmethod
+    def setDefaultKeywordChars( chars ):
+        """Overrides the default Keyword chars
+        """
+        Keyword.DEFAULT_KEYWORD_CHARS = chars
+
+class CaselessLiteral(Literal):
+    """
+    Token to match a specified string, ignoring case of letters.
+    Note: the matched results will always be in the case of the given
+    match string, NOT the case of the input text.
+
+    Example::
+        OneOrMore(CaselessLiteral("CMD")).parseString("cmd CMD Cmd10") # -> ['CMD', 'CMD', 'CMD']
+        
+    (Contrast with example for L{CaselessKeyword}.)
+    """
+    def __init__( self, matchString ):
+        super(CaselessLiteral,self).__init__( matchString.upper() )
+        # Preserve the defining literal.
+        self.returnString = matchString
+        self.name = "'%s'" % self.returnString
+        self.errmsg = "Expected " + self.name
+
+    def parseImpl( self, instring, loc, doActions=True ):
+        if instring[ loc:loc+self.matchLen ].upper() == self.match:
+            return loc+self.matchLen, self.returnString
+        raise ParseException(instring, loc, self.errmsg, self)
+
+class CaselessKeyword(Keyword):
+    """
+    Caseless version of L{Keyword}.
+
+    Example::
+        OneOrMore(CaselessKeyword("CMD")).parseString("cmd CMD Cmd10") # -> ['CMD', 'CMD']
+        
+    (Contrast with example for L{CaselessLiteral}.)
+    """
+    def __init__( self, matchString, identChars=None ):
+        super(CaselessKeyword,self).__init__( matchString, identChars, caseless=True )
+
+    def parseImpl( self, instring, loc, doActions=True ):
+        if ( (instring[ loc:loc+self.matchLen ].upper() == self.caselessmatch) and
+             (loc >= len(instring)-self.matchLen or instring[loc+self.matchLen].upper() not in self.identChars) ):
+            return loc+self.matchLen, self.match
+        raise ParseException(instring, loc, self.errmsg, self)
+
+class CloseMatch(Token):
+    """
+    A variation on L{Literal} which matches "close" matches, that is, 
+    strings with at most 'n' mismatching characters. C{CloseMatch} takes parameters:
+     - C{match_string} - string to be matched
+     - C{maxMismatches} - (C{default=1}) maximum number of mismatches allowed to count as a match
+    
+    The results from a successful parse will contain the matched text from the input string and the following named results:
+     - C{mismatches} - a list of the positions within the match_string where mismatches were found
+     - C{original} - the original match_string used to compare against the input string
+    
+    If C{mismatches} is an empty list, then the match was an exact match.
+    
+    Example::
+        patt = CloseMatch("ATCATCGAATGGA")
+        patt.parseString("ATCATCGAAXGGA") # -> (['ATCATCGAAXGGA'], {'mismatches': [[9]], 'original': ['ATCATCGAATGGA']})
+        patt.parseString("ATCAXCGAAXGGA") # -> Exception: Expected 'ATCATCGAATGGA' (with up to 1 mismatches) (at char 0), (line:1, col:1)
+
+        # exact match
+        patt.parseString("ATCATCGAATGGA") # -> (['ATCATCGAATGGA'], {'mismatches': [[]], 'original': ['ATCATCGAATGGA']})
+
+        # close match allowing up to 2 mismatches
+        patt = CloseMatch("ATCATCGAATGGA", maxMismatches=2)
+        patt.parseString("ATCAXCGAAXGGA") # -> (['ATCAXCGAAXGGA'], {'mismatches': [[4, 9]], 'original': ['ATCATCGAATGGA']})
+    """
+    def __init__(self, match_string, maxMismatches=1):
+        super(CloseMatch,self).__init__()
+        self.name = match_string
+        self.match_string = match_string
+        self.maxMismatches = maxMismatches
+        self.errmsg = "Expected %r (with up to %d mismatches)" % (self.match_string, self.maxMismatches)
+        self.mayIndexError = False
+        self.mayReturnEmpty = False
+
+    def parseImpl( self, instring, loc, doActions=True ):
+        start = loc
+        instrlen = len(instring)
+        maxloc = start + len(self.match_string)
+
+        if maxloc <= instrlen:
+            match_string = self.match_string
+            match_stringloc = 0
+            mismatches = []
+            maxMismatches = self.maxMismatches
+
+            for match_stringloc,s_m in enumerate(zip(instring[loc:maxloc], self.match_string)):
+                src,mat = s_m
+                if src != mat:
+                    mismatches.append(match_stringloc)
+                    if len(mismatches) > maxMismatches:
+                        break
+            else:
+                loc = match_stringloc + 1
+                results = ParseResults([instring[start:loc]])
+                results['original'] = self.match_string
+                results['mismatches'] = mismatches
+                return loc, results
+
+        raise ParseException(instring, loc, self.errmsg, self)
+
+
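An illustrative sketch of C{CloseMatch} in use (assuming the module is importable as C{pyparsing}; the values follow the docstring example above):

```python
from pyparsing import CloseMatch

# default maxMismatches=1: one differing character still counts as a match
patt = CloseMatch("ATCATCGAATGGA")
result = patt.parseString("ATCATCGAAXGGA")
print(result[0])                    # the matched input text: 'ATCATCGAAXGGA'
print(list(result['mismatches']))   # positions that differed: [9]
```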
+class Word(Token):
+    """
+    Token for matching words composed of allowed character sets.
+    Defined with string containing all allowed initial characters,
+    an optional string containing allowed body characters (if omitted,
+    defaults to the initial character set), and an optional minimum,
+    maximum, and/or exact length.  The default value for C{min} is 1 (a
+    minimum value < 1 is not valid); the default values for C{max} and C{exact}
+    are 0, meaning no maximum or exact length restriction. An optional
+    C{excludeChars} parameter can list characters that might be found in 
+    the input C{bodyChars} string; useful to define a word of all printables
+    except for one or two characters, for instance.
+    
+    L{srange} is useful for defining custom character set strings for defining 
+    C{Word} expressions, using range notation from regular expression character sets.
+    
+    A common mistake is to use C{Word} to match a specific literal string, as in 
+    C{Word("Address")}. Remember that C{Word} uses the string argument to define
+    I{sets} of matchable characters. This expression would match "Add", "AAA",
+    "dAred", or any other word made up of the characters 'A', 'd', 'r', 'e', and 's'.
+    To match an exact literal string, use L{Literal} or L{Keyword}.
+
+    pyparsing includes helper strings for building Words:
+     - L{alphas}
+     - L{nums}
+     - L{alphanums}
+     - L{hexnums}
+     - L{alphas8bit} (alphabetic characters in ASCII range 128-255 - accented, tilded, umlauted, etc.)
+     - L{punc8bit} (non-alphabetic characters in ASCII range 128-255 - currency, symbols, superscripts, diacriticals, etc.)
+     - L{printables} (any non-whitespace character)
+
+    Example::
+        # a word composed of digits
+        integer = Word(nums) # equivalent to Word("0123456789") or Word(srange("0-9"))
+        
+        # a word with a leading capital, and zero or more lowercase
+        capital_word = Word(alphas.upper(), alphas.lower())
+
+        # hostnames are alphanumeric, with leading alpha, and '-'
+        hostname = Word(alphas, alphanums+'-')
+        
+        # roman numeral (not a strict parser, accepts invalid mix of characters)
+        roman = Word("IVXLCDM")
+        
+        # any string of non-whitespace characters, except for ','
+        csv_value = Word(printables, excludeChars=",")
+    """
+    def __init__( self, initChars, bodyChars=None, min=1, max=0, exact=0, asKeyword=False, excludeChars=None ):
+        super(Word,self).__init__()
+        if excludeChars:
+            initChars = ''.join(c for c in initChars if c not in excludeChars)
+            if bodyChars:
+                bodyChars = ''.join(c for c in bodyChars if c not in excludeChars)
+        self.initCharsOrig = initChars
+        self.initChars = set(initChars)
+        if bodyChars :
+            self.bodyCharsOrig = bodyChars
+            self.bodyChars = set(bodyChars)
+        else:
+            self.bodyCharsOrig = initChars
+            self.bodyChars = set(initChars)
+
+        self.maxSpecified = max > 0
+
+        if min < 1:
+            raise ValueError("cannot specify a minimum length < 1; use Optional(Word()) if zero-length word is permitted")
+
+        self.minLen = min
+
+        if max > 0:
+            self.maxLen = max
+        else:
+            self.maxLen = _MAX_INT
+
+        if exact > 0:
+            self.maxLen = exact
+            self.minLen = exact
+
+        self.name = _ustr(self)
+        self.errmsg = "Expected " + self.name
+        self.mayIndexError = False
+        self.asKeyword = asKeyword
+
+        if ' ' not in self.initCharsOrig+self.bodyCharsOrig and (min==1 and max==0 and exact==0):
+            if self.bodyCharsOrig == self.initCharsOrig:
+                self.reString = "[%s]+" % _escapeRegexRangeChars(self.initCharsOrig)
+            elif len(self.initCharsOrig) == 1:
+                self.reString = "%s[%s]*" % \
+                                      (re.escape(self.initCharsOrig),
+                                      _escapeRegexRangeChars(self.bodyCharsOrig),)
+            else:
+                self.reString = "[%s][%s]*" % \
+                                      (_escapeRegexRangeChars(self.initCharsOrig),
+                                      _escapeRegexRangeChars(self.bodyCharsOrig),)
+            if self.asKeyword:
+                self.reString = r"\b"+self.reString+r"\b"
+            try:
+                self.re = re.compile( self.reString )
+            except Exception:
+                self.re = None
+
+    def parseImpl( self, instring, loc, doActions=True ):
+        if self.re:
+            result = self.re.match(instring,loc)
+            if not result:
+                raise ParseException(instring, loc, self.errmsg, self)
+
+            loc = result.end()
+            return loc, result.group()
+
+        if not(instring[ loc ] in self.initChars):
+            raise ParseException(instring, loc, self.errmsg, self)
+
+        start = loc
+        loc += 1
+        instrlen = len(instring)
+        bodychars = self.bodyChars
+        maxloc = start + self.maxLen
+        maxloc = min( maxloc, instrlen )
+        while loc < maxloc and instring[loc] in bodychars:
+            loc += 1
+
+        throwException = False
+        if loc - start < self.minLen:
+            throwException = True
+        if self.maxSpecified and loc < instrlen and instring[loc] in bodychars:
+            throwException = True
+        if self.asKeyword:
+            if (start>0 and instring[start-1] in bodychars) or (loc<instrlen and instring[loc] in bodychars):
+                throwException = True
+
+        if throwException:
+            raise ParseException(instring, loc, self.errmsg, self)
+
+        return loc, instring[start:loc]
+
+    def __str__( self ):
+        try:
+            return super(Word,self).__str__()
+        except Exception:
+            pass
+
+        if self.strRepr is None:
+
+            def charsAsStr(s):
+                if len(s)>4:
+                    return s[:4]+"..."
+                else:
+                    return s
+
+            if ( self.initCharsOrig != self.bodyCharsOrig ):
+                self.strRepr = "W:(%s,%s)" % ( charsAsStr(self.initCharsOrig), charsAsStr(self.bodyCharsOrig) )
+            else:
+                self.strRepr = "W:(%s)" % charsAsStr(self.initCharsOrig)
+
+        return self.strRepr
+
+
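A brief usage sketch for C{Word}, exercising the character-set semantics described in the docstring above (assuming the module is importable as C{pyparsing}):

```python
from pyparsing import Word, alphas, nums, printables

# a run of digits
integer = Word(nums)
print(integer.parseString("123 abc")[0])          # -> '123'

# excludeChars removes characters from the matchable set
csv_value = Word(printables, excludeChars=",")
print(csv_value.parseString("abc,def")[0])        # -> 'abc'

# distinct initial and body character sets
capital_word = Word(alphas.upper(), alphas.lower())
print(capital_word.parseString("Hello there")[0]) # -> 'Hello'
```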
+class Regex(Token):
+    r"""
+    Token for matching strings that match a given regular expression.
+    Defined with string specifying the regular expression in a form recognized by the inbuilt Python re module.
+    If the given regex contains named groups (defined using C{(?P<name>...)}), these will be preserved as 
+    named parse results.
+
+    Example::
+        realnum = Regex(r"[+-]?\d+\.\d*")
+        date = Regex(r'(?P<year>\d{4})-(?P<month>\d\d?)-(?P<day>\d\d?)')
+        # ref: http://stackoverflow.com/questions/267399/how-do-you-match-only-valid-roman-numerals-with-a-regular-expression
+        roman = Regex(r"M{0,4}(CM|CD|D?C{0,3})(XC|XL|L?X{0,3})(IX|IV|V?I{0,3})")
+    """
+    compiledREtype = type(re.compile("[A-Z]"))
+    def __init__( self, pattern, flags=0):
+        """The parameters C{pattern} and C{flags} are passed to the C{re.compile()} function as-is. See the Python C{re} module for an explanation of the acceptable patterns and flags."""
+        super(Regex,self).__init__()
+
+        if isinstance(pattern, basestring):
+            if not pattern:
+                warnings.warn("null string passed to Regex; use Empty() instead",
+                        SyntaxWarning, stacklevel=2)
+
+            self.pattern = pattern
+            self.flags = flags
+
+            try:
+                self.re = re.compile(self.pattern, self.flags)
+                self.reString = self.pattern
+            except sre_constants.error:
+                warnings.warn("invalid pattern (%s) passed to Regex" % pattern,
+                    SyntaxWarning, stacklevel=2)
+                raise
+
+        elif isinstance(pattern, Regex.compiledREtype):
+            self.re = pattern
+            self.pattern = \
+            self.reString = str(pattern)
+            self.flags = flags
+            
+        else:
+            raise ValueError("Regex may only be constructed with a string or a compiled RE object")
+
+        self.name = _ustr(self)
+        self.errmsg = "Expected " + self.name
+        self.mayIndexError = False
+        self.mayReturnEmpty = True
+
+    def parseImpl( self, instring, loc, doActions=True ):
+        result = self.re.match(instring,loc)
+        if not result:
+            raise ParseException(instring, loc, self.errmsg, self)
+
+        loc = result.end()
+        d = result.groupdict()
+        ret = ParseResults(result.group())
+        if d:
+            for k in d:
+                ret[k] = d[k]
+        return loc,ret
+
+    def __str__( self ):
+        try:
+            return super(Regex,self).__str__()
+        except Exception:
+            pass
+
+        if self.strRepr is None:
+            self.strRepr = "Re:(%s)" % repr(self.pattern)
+
+        return self.strRepr
+
+
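An illustrative sketch showing named-group capture with C{Regex} (assuming the module is importable as C{pyparsing}; group names are arbitrary):

```python
from pyparsing import Regex

# named groups become named parse results
date = Regex(r'(?P<year>\d{4})-(?P<month>\d\d?)-(?P<day>\d\d?)')
result = date.parseString("2016-05-04")
print(result['year'], result['month'], result['day'])   # -> 2016 05 04
```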
+class QuotedString(Token):
+    r"""
+    Token for matching strings that are delimited by quoting characters.
+    
+    Defined with the following parameters:
+        - quoteChar - string of one or more characters defining the quote delimiting string
+        - escChar - character to escape quotes, typically backslash (default=C{None})
+        - escQuote - special quote sequence to escape an embedded quote string (such as SQL's "" to escape an embedded ") (default=C{None})
+        - multiline - boolean indicating whether quotes can span multiple lines (default=C{False})
+        - unquoteResults - boolean indicating whether the matched text should be unquoted (default=C{True})
+        - endQuoteChar - string of one or more characters defining the end of the quote delimited string (default=C{None} => same as quoteChar)
+        - convertWhitespaceEscapes - convert escaped whitespace (C{'\t'}, C{'\n'}, etc.) to actual whitespace (default=C{True})
+
+    Example::
+        qs = QuotedString('"')
+        print(qs.searchString('lsjdf "This is the quote" sldjf'))
+        complex_qs = QuotedString('{{', endQuoteChar='}}')
+        print(complex_qs.searchString('lsjdf {{This is the "quote"}} sldjf'))
+        sql_qs = QuotedString('"', escQuote='""')
+        print(sql_qs.searchString('lsjdf "This is the quote with ""embedded"" quotes" sldjf'))
+    prints::
+        [['This is the quote']]
+        [['This is the "quote"']]
+        [['This is the quote with "embedded" quotes']]
+    """
+    def __init__( self, quoteChar, escChar=None, escQuote=None, multiline=False, unquoteResults=True, endQuoteChar=None, convertWhitespaceEscapes=True):
+        super(QuotedString,self).__init__()
+
+        # remove white space from quote chars - won't work anyway
+        quoteChar = quoteChar.strip()
+        if not quoteChar:
+            warnings.warn("quoteChar cannot be the empty string",SyntaxWarning,stacklevel=2)
+            raise SyntaxError()
+
+        if endQuoteChar is None:
+            endQuoteChar = quoteChar
+        else:
+            endQuoteChar = endQuoteChar.strip()
+            if not endQuoteChar:
+                warnings.warn("endQuoteChar cannot be the empty string",SyntaxWarning,stacklevel=2)
+                raise SyntaxError()
+
+        self.quoteChar = quoteChar
+        self.quoteCharLen = len(quoteChar)
+        self.firstQuoteChar = quoteChar[0]
+        self.endQuoteChar = endQuoteChar
+        self.endQuoteCharLen = len(endQuoteChar)
+        self.escChar = escChar
+        self.escQuote = escQuote
+        self.unquoteResults = unquoteResults
+        self.convertWhitespaceEscapes = convertWhitespaceEscapes
+
+        if multiline:
+            self.flags = re.MULTILINE | re.DOTALL
+            self.pattern = r'%s(?:[^%s%s]' % \
+                ( re.escape(self.quoteChar),
+                  _escapeRegexRangeChars(self.endQuoteChar[0]),
+                  (escChar is not None and _escapeRegexRangeChars(escChar) or '') )
+        else:
+            self.flags = 0
+            self.pattern = r'%s(?:[^%s\n\r%s]' % \
+                ( re.escape(self.quoteChar),
+                  _escapeRegexRangeChars(self.endQuoteChar[0]),
+                  (escChar is not None and _escapeRegexRangeChars(escChar) or '') )
+        if len(self.endQuoteChar) > 1:
+            self.pattern += (
+                '|(?:' + ')|(?:'.join("%s[^%s]" % (re.escape(self.endQuoteChar[:i]),
+                                               _escapeRegexRangeChars(self.endQuoteChar[i]))
+                                    for i in range(len(self.endQuoteChar)-1,0,-1)) + ')'
+                )
+        if escQuote:
+            self.pattern += (r'|(?:%s)' % re.escape(escQuote))
+        if escChar:
+            self.pattern += (r'|(?:%s.)' % re.escape(escChar))
+            self.escCharReplacePattern = re.escape(self.escChar)+"(.)"
+        self.pattern += (r')*%s' % re.escape(self.endQuoteChar))
+
+        try:
+            self.re = re.compile(self.pattern, self.flags)
+            self.reString = self.pattern
+        except sre_constants.error:
+            warnings.warn("invalid pattern (%s) passed to Regex" % self.pattern,
+                SyntaxWarning, stacklevel=2)
+            raise
+
+        self.name = _ustr(self)
+        self.errmsg = "Expected " + self.name
+        self.mayIndexError = False
+        self.mayReturnEmpty = True
+
+    def parseImpl( self, instring, loc, doActions=True ):
+        result = instring[loc] == self.firstQuoteChar and self.re.match(instring,loc) or None
+        if not result:
+            raise ParseException(instring, loc, self.errmsg, self)
+
+        loc = result.end()
+        ret = result.group()
+
+        if self.unquoteResults:
+
+            # strip off quotes
+            ret = ret[self.quoteCharLen:-self.endQuoteCharLen]
+
+            if isinstance(ret,basestring):
+                # replace escaped whitespace
+                if '\\' in ret and self.convertWhitespaceEscapes:
+                    ws_map = {
+                        r'\t' : '\t',
+                        r'\n' : '\n',
+                        r'\f' : '\f',
+                        r'\r' : '\r',
+                    }
+                    for wslit,wschar in ws_map.items():
+                        ret = ret.replace(wslit, wschar)
+
+                # replace escaped characters
+                if self.escChar:
+                    ret = re.sub(self.escCharReplacePattern, r"\g<1>", ret)
+
+                # replace escaped quotes
+                if self.escQuote:
+                    ret = ret.replace(self.escQuote, self.endQuoteChar)
+
+        return loc, ret
+
+    def __str__( self ):
+        try:
+            return super(QuotedString,self).__str__()
+        except Exception:
+            pass
+
+        if self.strRepr is None:
+            self.strRepr = "quoted string, starting with %s ending with %s" % (self.quoteChar, self.endQuoteChar)
+
+        return self.strRepr
+
+
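An illustrative sketch of C{QuotedString}, including the SQL-style doubled-quote escape from the docstring above (assuming the module is importable as C{pyparsing}):

```python
from pyparsing import QuotedString

qs = QuotedString('"')
print(qs.parseString('"hello world"')[0])       # -> 'hello world'

# SQL-style: "" inside the string stands for a literal "
sql_qs = QuotedString('"', escQuote='""')
print(sql_qs.parseString('"say ""hi"""')[0])    # -> 'say "hi"'
```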
+class CharsNotIn(Token):
+    """
+    Token for matching words composed of characters I{not} in a given set (will
+    include whitespace in matched characters if not listed in the provided exclusion set - see example).
+    Defined with string containing all disallowed characters, and an optional
+    minimum, maximum, and/or exact length.  The default value for C{min} is 1 (a
+    minimum value < 1 is not valid); the default values for C{max} and C{exact}
+    are 0, meaning no maximum or exact length restriction.
+
+    Example::
+        # define a comma-separated-value as anything that is not a ','
+        csv_value = CharsNotIn(',')
+        print(delimitedList(csv_value).parseString("dkls,lsdkjf,s12 34,@!#,213"))
+    prints::
+        ['dkls', 'lsdkjf', 's12 34', '@!#', '213']
+    """
+    def __init__( self, notChars, min=1, max=0, exact=0 ):
+        super(CharsNotIn,self).__init__()
+        self.skipWhitespace = False
+        self.notChars = notChars
+
+        if min < 1:
+            raise ValueError("cannot specify a minimum length < 1; use Optional(CharsNotIn()) if zero-length char group is permitted")
+
+        self.minLen = min
+
+        if max > 0:
+            self.maxLen = max
+        else:
+            self.maxLen = _MAX_INT
+
+        if exact > 0:
+            self.maxLen = exact
+            self.minLen = exact
+
+        self.name = _ustr(self)
+        self.errmsg = "Expected " + self.name
+        self.mayReturnEmpty = ( self.minLen == 0 )
+        self.mayIndexError = False
+
+    def parseImpl( self, instring, loc, doActions=True ):
+        if instring[loc] in self.notChars:
+            raise ParseException(instring, loc, self.errmsg, self)
+
+        start = loc
+        loc += 1
+        notchars = self.notChars
+        maxlen = min( start+self.maxLen, len(instring) )
+        while loc < maxlen and \
+              (instring[loc] not in notchars):
+            loc += 1
+
+        if loc - start < self.minLen:
+            raise ParseException(instring, loc, self.errmsg, self)
+
+        return loc, instring[start:loc]
+
+    def __str__( self ):
+        try:
+            return super(CharsNotIn, self).__str__()
+        except Exception:
+            pass
+
+        if self.strRepr is None:
+            if len(self.notChars) > 4:
+                self.strRepr = "!W:(%s...)" % self.notChars[:4]
+            else:
+                self.strRepr = "!W:(%s)" % self.notChars
+
+        return self.strRepr
+
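An illustrative sketch of C{CharsNotIn}, reproducing the comma-separated-value example from the docstring above (assuming the module is importable as C{pyparsing}):

```python
from pyparsing import CharsNotIn, delimitedList

# anything that is not a ',' - note that whitespace is kept, unlike Word
csv_value = CharsNotIn(',')
fields = delimitedList(csv_value).parseString("dkls,lsdkjf,s12 34,@!#,213")
print(fields.asList())   # -> ['dkls', 'lsdkjf', 's12 34', '@!#', '213']
```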
+class White(Token):
+    """
+    Special matching class for matching whitespace.  Normally, whitespace is ignored
+    by pyparsing grammars.  This class is included when some whitespace structures
+    are significant.  Define with a string containing the whitespace characters to be
+    matched; default is C{" \\t\\r\\n"}.  Also takes optional C{min}, C{max}, and C{exact} arguments,
+    as defined for the C{L{Word}} class.
+    """
+    whiteStrs = {
+        " " : "",
+        "\t": "",
+        "\n": "",
+        "\r": "",
+        "\f": "",
+        }
+    def __init__(self, ws=" \t\r\n", min=1, max=0, exact=0):
+        super(White,self).__init__()
+        self.matchWhite = ws
+        self.setWhitespaceChars( "".join(c for c in self.whiteChars if c not in self.matchWhite) )
+        #~ self.leaveWhitespace()
+        self.name = ("".join(White.whiteStrs[c] for c in self.matchWhite))
+        self.mayReturnEmpty = True
+        self.errmsg = "Expected " + self.name
+
+        self.minLen = min
+
+        if max > 0:
+            self.maxLen = max
+        else:
+            self.maxLen = _MAX_INT
+
+        if exact > 0:
+            self.maxLen = exact
+            self.minLen = exact
+
+    def parseImpl( self, instring, loc, doActions=True ):
+        if not(instring[ loc ] in self.matchWhite):
+            raise ParseException(instring, loc, self.errmsg, self)
+        start = loc
+        loc += 1
+        maxloc = start + self.maxLen
+        maxloc = min( maxloc, len(instring) )
+        while loc < maxloc and instring[loc] in self.matchWhite:
+            loc += 1
+
+        if loc - start < self.minLen:
+            raise ParseException(instring, loc, self.errmsg, self)
+
+        return loc, instring[start:loc]
+
+
+class _PositionToken(Token):
+    def __init__( self ):
+        super(_PositionToken,self).__init__()
+        self.name=self.__class__.__name__
+        self.mayReturnEmpty = True
+        self.mayIndexError = False
+
+class GoToColumn(_PositionToken):
+    """
+    Token to advance to a specific column of input text; useful for tabular report scraping.
+    """
+    def __init__( self, colno ):
+        super(GoToColumn,self).__init__()
+        self.col = colno
+
+    def preParse( self, instring, loc ):
+        if col(loc,instring) != self.col:
+            instrlen = len(instring)
+            if self.ignoreExprs:
+                loc = self._skipIgnorables( instring, loc )
+            while loc < instrlen and instring[loc].isspace() and col( loc, instring ) != self.col :
+                loc += 1
+        return loc
+
+    def parseImpl( self, instring, loc, doActions=True ):
+        thiscol = col( loc, instring )
+        if thiscol > self.col:
+            raise ParseException( instring, loc, "Text not in expected column", self )
+        newloc = loc + self.col - thiscol
+        ret = instring[ loc: newloc ]
+        return newloc, ret
+
+
+class LineStart(_PositionToken):
+    """
+    Matches if current position is at the beginning of a line within the parse string
+    
+    Example::
+    
+        test = '''\
+        AAA this line
+        AAA and this line
+          AAA but not this one
+        B AAA and definitely not this one
+        '''
+
+        for t in (LineStart() + 'AAA' + restOfLine).searchString(test):
+            print(t)
+    
+    Prints::
+        ['AAA', ' this line']
+        ['AAA', ' and this line']    
+
+    """
+    def __init__( self ):
+        super(LineStart,self).__init__()
+        self.errmsg = "Expected start of line"
+
+    def parseImpl( self, instring, loc, doActions=True ):
+        if col(loc, instring) == 1:
+            return loc, []
+        raise ParseException(instring, loc, self.errmsg, self)
+
+class LineEnd(_PositionToken):
+    """
+    Matches if current position is at the end of a line within the parse string
+    """
+    def __init__( self ):
+        super(LineEnd,self).__init__()
+        self.setWhitespaceChars( ParserElement.DEFAULT_WHITE_CHARS.replace("\n","") )
+        self.errmsg = "Expected end of line"
+
+    def parseImpl( self, instring, loc, doActions=True ):
+        if loc<len(instring):
+            if instring[loc] == "\n":
+                return loc+1, "\n"
+            else:
+                raise ParseException(instring, loc, self.errmsg, self)
+        elif loc == len(instring):
+            return loc+1, []
+        else:
+            raise ParseException(instring, loc, self.errmsg, self)
+
+class WordStart(_PositionToken):
+    """
+    Matches if the current position is at the beginning of a Word, and
+    is not preceded by any character in a given set of C{wordChars}
+    (default=C{printables}). To emulate the C{\b} behavior of regular expressions,
+    use C{WordStart(alphanums)}. C{WordStart} will also match at the beginning of
+    the string being parsed, or at the beginning of a line.
+    """
+    def __init__(self, wordChars = printables):
+        super(WordStart,self).__init__()
+        self.wordChars = set(wordChars)
+        self.errmsg = "Not at the start of a word"
+
+    def parseImpl(self, instring, loc, doActions=True ):
+        if loc != 0:
+            if (instring[loc-1] in self.wordChars or
+                instring[loc] not in self.wordChars):
+                raise ParseException(instring, loc, self.errmsg, self)
+        return loc, []
+
+class WordEnd(_PositionToken):
+    """
+    Matches if the current position is at the end of a Word, and
+    is not followed by any character in a given set of C{wordChars}
+    (default=C{printables}). To emulate the C{\b} behavior of regular expressions,
+    use C{WordEnd(alphanums)}. C{WordEnd} will also match at the end of
+    the string being parsed, or at the end of a line.
+    """
+    def __init__(self, wordChars = printables):
+        super(WordEnd,self).__init__()
+        self.wordChars = set(wordChars)
+        self.skipWhitespace = False
+        self.errmsg = "Not at the end of a word"
+
+    def parseImpl(self, instring, loc, doActions=True ):
+        instrlen = len(instring)
+        if instrlen>0 and loc<instrlen:
+            if (instring[loc] in self.wordChars or
+                instring[loc-1] not in self.wordChars):
+                raise ParseException(instring, loc, self.errmsg, self)
+        return loc, []
+
+
+class Or(ParseExpression):
+    """
+    Requires that at least one C{ParseExpression} is found.
+    If two expressions match, the expression that matches the longest string will be used.
+    May be constructed using the C{'^'} operator.
+    """
+    def __init__( self, exprs, savelist = False ):
+        super(Or,self).__init__(exprs, savelist)
+        if self.exprs:
+            self.mayReturnEmpty = any(e.mayReturnEmpty for e in self.exprs)
+        else:
+            self.mayReturnEmpty = True
+
+    def parseImpl( self, instring, loc, doActions=True ):
+        maxExcLoc = -1
+        maxException = None
+        matches = []
+        for e in self.exprs:
+            try:
+                loc2 = e.tryParse( instring, loc )
+            except ParseException as err:
+                err.__traceback__ = None
+                if err.loc > maxExcLoc:
+                    maxException = err
+                    maxExcLoc = err.loc
+            except IndexError:
+                if len(instring) > maxExcLoc:
+                    maxException = ParseException(instring,len(instring),e.errmsg,self)
+                    maxExcLoc = len(instring)
+            else:
+                # save match among all matches, to retry longest to shortest
+                matches.append((loc2, e))
+
+        if matches:
+            matches.sort(key=lambda x: -x[0])
+            for _,e in matches:
+                try:
+                    return e._parse( instring, loc, doActions )
+                except ParseException as err:
+                    err.__traceback__ = None
+                    if err.loc > maxExcLoc:
+                        maxException = err
+                        maxExcLoc = err.loc
+
+        if maxException is not None:
+            maxException.msg = self.errmsg
+            raise maxException
+        else:
+            raise ParseException(instring, loc, "no defined alternatives to match", self)
+
+
+    def __ixor__(self, other ):
+        if isinstance( other, basestring ):
+            other = ParserElement._literalStringClass( other )
+        return self.append( other ) #Or( [ self, other ] )
+
+    def __str__( self ):
+        if hasattr(self,"name"):
+            return self.name
+
+        if self.strRepr is None:
+            self.strRepr = "{" + " ^ ".join(_ustr(e) for e in self.exprs) + "}"
+
+        return self.strRepr
+
+    def checkRecursion( self, parseElementList ):
+        subRecCheckList = parseElementList[:] + [ self ]
+        for e in self.exprs:
+            e.checkRecursion( subRecCheckList )
+
+
+class MatchFirst(ParseExpression):
+    """
+    Requires that at least one C{ParseExpression} is found.
+    If two expressions match, the first one listed is the one that will match.
+    May be constructed using the C{'|'} operator.
+
+    Example::
+        # construct MatchFirst using '|' operator
+        
+        # watch the order of expressions to match
+        number = Word(nums) | Combine(Word(nums) + '.' + Word(nums))
+        print(number.searchString("123 3.1416 789")) #  Fail! -> [['123'], ['3'], ['1416'], ['789']]
+
+        # put more selective expression first
+        number = Combine(Word(nums) + '.' + Word(nums)) | Word(nums)
+        print(number.searchString("123 3.1416 789")) #  Better -> [['123'], ['3.1416'], ['789']]
+    """
+    def __init__( self, exprs, savelist = False ):
+        super(MatchFirst,self).__init__(exprs, savelist)
+        if self.exprs:
+            self.mayReturnEmpty = any(e.mayReturnEmpty for e in self.exprs)
+        else:
+            self.mayReturnEmpty = True
+
+    def parseImpl( self, instring, loc, doActions=True ):
+        maxExcLoc = -1
+        maxException = None
+        for e in self.exprs:
+            try:
+                ret = e._parse( instring, loc, doActions )
+                return ret
+            except ParseException as err:
+                if err.loc > maxExcLoc:
+                    maxException = err
+                    maxExcLoc = err.loc
+            except IndexError:
+                if len(instring) > maxExcLoc:
+                    maxException = ParseException(instring,len(instring),e.errmsg,self)
+                    maxExcLoc = len(instring)
+
+        # only got here if no expression matched, raise exception for match that made it the furthest
+        else:
+            if maxException is not None:
+                maxException.msg = self.errmsg
+                raise maxException
+            else:
+                raise ParseException(instring, loc, "no defined alternatives to match", self)
+
+    def __ior__(self, other ):
+        if isinstance( other, basestring ):
+            other = ParserElement._literalStringClass( other )
+        return self.append( other ) #MatchFirst( [ self, other ] )
+
+    def __str__( self ):
+        if hasattr(self,"name"):
+            return self.name
+
+        if self.strRepr is None:
+            self.strRepr = "{" + " | ".join(_ustr(e) for e in self.exprs) + "}"
+
+        return self.strRepr
+
+    def checkRecursion( self, parseElementList ):
+        subRecCheckList = parseElementList[:] + [ self ]
+        for e in self.exprs:
+            e.checkRecursion( subRecCheckList )
+
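An illustrative sketch of C{MatchFirst} built with the C{'|'} operator, following the ordering advice in the docstring above (assuming the module is importable as C{pyparsing}):

```python
from pyparsing import Combine, Word, nums

# the more selective expression goes first, so "3.1416" is not split
number = Combine(Word(nums) + '.' + Word(nums)) | Word(nums)
print([r[0] for r in number.searchString("123 3.1416 789")])
# -> ['123', '3.1416', '789']
```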
+
+class Each(ParseExpression):
+    """
+    Requires all given C{ParseExpression}s to be found, but in any order.
+    Expressions may be separated by whitespace.
+    May be constructed using the C{'&'} operator.
+
+    Example::
+        color = oneOf("RED ORANGE YELLOW GREEN BLUE PURPLE BLACK WHITE BROWN")
+        shape_type = oneOf("SQUARE CIRCLE TRIANGLE STAR HEXAGON OCTAGON")
+        integer = Word(nums)
+        shape_attr = "shape:" + shape_type("shape")
+        posn_attr = "posn:" + Group(integer("x") + ',' + integer("y"))("posn")
+        color_attr = "color:" + color("color")
+        size_attr = "size:" + integer("size")
+
+        # use Each (using operator '&') to accept attributes in any order 
+        # (shape and posn are required, color and size are optional)
+        shape_spec = shape_attr & posn_attr & Optional(color_attr) & Optional(size_attr)
+
+        shape_spec.runTests('''
+            shape: SQUARE color: BLACK posn: 100, 120
+            shape: CIRCLE size: 50 color: BLUE posn: 50,80
+            color:GREEN size:20 shape:TRIANGLE posn:20,40
+            '''
+            )
+    prints::
+        shape: SQUARE color: BLACK posn: 100, 120
+        ['shape:', 'SQUARE', 'color:', 'BLACK', 'posn:', ['100', ',', '120']]
+        - color: BLACK
+        - posn: ['100', ',', '120']
+          - x: 100
+          - y: 120
+        - shape: SQUARE
+
+
+        shape: CIRCLE size: 50 color: BLUE posn: 50,80
+        ['shape:', 'CIRCLE', 'size:', '50', 'color:', 'BLUE', 'posn:', ['50', ',', '80']]
+        - color: BLUE
+        - posn: ['50', ',', '80']
+          - x: 50
+          - y: 80
+        - shape: CIRCLE
+        - size: 50
+
+
+        color: GREEN size: 20 shape: TRIANGLE posn: 20,40
+        ['color:', 'GREEN', 'size:', '20', 'shape:', 'TRIANGLE', 'posn:', ['20', ',', '40']]
+        - color: GREEN
+        - posn: ['20', ',', '40']
+          - x: 20
+          - y: 40
+        - shape: TRIANGLE
+        - size: 20
+    """
+    def __init__( self, exprs, savelist = True ):
+        super(Each,self).__init__(exprs, savelist)
+        self.mayReturnEmpty = all(e.mayReturnEmpty for e in self.exprs)
+        self.skipWhitespace = True
+        self.initExprGroups = True
+
+    def parseImpl( self, instring, loc, doActions=True ):
+        if self.initExprGroups:
+            self.opt1map = dict((id(e.expr),e) for e in self.exprs if isinstance(e,Optional))
+            opt1 = [ e.expr for e in self.exprs if isinstance(e,Optional) ]
+            opt2 = [ e for e in self.exprs if e.mayReturnEmpty and not isinstance(e,Optional)]
+            self.optionals = opt1 + opt2
+            self.multioptionals = [ e.expr for e in self.exprs if isinstance(e,ZeroOrMore) ]
+            self.multirequired = [ e.expr for e in self.exprs if isinstance(e,OneOrMore) ]
+            self.required = [ e for e in self.exprs if not isinstance(e,(Optional,ZeroOrMore,OneOrMore)) ]
+            self.required += self.multirequired
+            self.initExprGroups = False
+        tmpLoc = loc
+        tmpReqd = self.required[:]
+        tmpOpt  = self.optionals[:]
+        matchOrder = []
+
+        keepMatching = True
+        while keepMatching:
+            tmpExprs = tmpReqd + tmpOpt + self.multioptionals + self.multirequired
+            failed = []
+            for e in tmpExprs:
+                try:
+                    tmpLoc = e.tryParse( instring, tmpLoc )
+                except ParseException:
+                    failed.append(e)
+                else:
+                    matchOrder.append(self.opt1map.get(id(e),e))
+                    if e in tmpReqd:
+                        tmpReqd.remove(e)
+                    elif e in tmpOpt:
+                        tmpOpt.remove(e)
+            if len(failed) == len(tmpExprs):
+                keepMatching = False
+
+        if tmpReqd:
+            missing = ", ".join(_ustr(e) for e in tmpReqd)
+            raise ParseException(instring,loc,"Missing one or more required elements (%s)" % missing )
+
+        # add any unmatched Optionals, in case they have default values defined
+        matchOrder += [e for e in self.exprs if isinstance(e,Optional) and e.expr in tmpOpt]
+
+        resultlist = []
+        for e in matchOrder:
+            loc,results = e._parse(instring,loc,doActions)
+            resultlist.append(results)
+
+        finalResults = sum(resultlist, ParseResults([]))
+        return loc, finalResults
+
+    def __str__( self ):
+        if hasattr(self,"name"):
+            return self.name
+
+        if self.strRepr is None:
+            self.strRepr = "{" + " & ".join(_ustr(e) for e in self.exprs) + "}"
+
+        return self.strRepr
+
+    def checkRecursion( self, parseElementList ):
+        subRecCheckList = parseElementList[:] + [ self ]
+        for e in self.exprs:
+            e.checkRecursion( subRecCheckList )
+
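As a runnable sketch of the `&` behavior documented above (assuming pyparsing is importable; the grammar names here are illustrative, not part of the library):

```python
from pyparsing import Word, alphas, nums, Optional

# hypothetical key-value grammar: Each (built with the '&' operator)
# matches all of its sub-expressions, in any input order
color = ("color:" + Word(alphas))("color")
size = ("size:" + Word(nums))("size")
spec = color & Optional(size)

# tokens come back in the order they were matched in the input
result = spec.parseString("size: 20 color: RED")
print(result.asList())  # ['size:', '20', 'color:', 'RED']
```

Required elements that are missing raise a ParseException; unmatched `Optional` elements are simply omitted from the results.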
+
+class ParseElementEnhance(ParserElement):
+    """
+    Abstract subclass of C{ParserElement}, for combining and post-processing parsed tokens.
+    """
+    def __init__( self, expr, savelist=False ):
+        super(ParseElementEnhance,self).__init__(savelist)
+        if isinstance( expr, basestring ):
+            if issubclass(ParserElement._literalStringClass, Token):
+                expr = ParserElement._literalStringClass(expr)
+            else:
+                expr = ParserElement._literalStringClass(Literal(expr))
+        self.expr = expr
+        self.strRepr = None
+        if expr is not None:
+            self.mayIndexError = expr.mayIndexError
+            self.mayReturnEmpty = expr.mayReturnEmpty
+            self.setWhitespaceChars( expr.whiteChars )
+            self.skipWhitespace = expr.skipWhitespace
+            self.saveAsList = expr.saveAsList
+            self.callPreparse = expr.callPreparse
+            self.ignoreExprs.extend(expr.ignoreExprs)
+
+    def parseImpl( self, instring, loc, doActions=True ):
+        if self.expr is not None:
+            return self.expr._parse( instring, loc, doActions, callPreParse=False )
+        else:
+            raise ParseException("",loc,self.errmsg,self)
+
+    def leaveWhitespace( self ):
+        self.skipWhitespace = False
+        if self.expr is not None:
+            self.expr = self.expr.copy()
+            self.expr.leaveWhitespace()
+        return self
+
+    def ignore( self, other ):
+        if isinstance( other, Suppress ):
+            if other not in self.ignoreExprs:
+                super( ParseElementEnhance, self).ignore( other )
+                if self.expr is not None:
+                    self.expr.ignore( self.ignoreExprs[-1] )
+        else:
+            super( ParseElementEnhance, self).ignore( other )
+            if self.expr is not None:
+                self.expr.ignore( self.ignoreExprs[-1] )
+        return self
+
+    def streamline( self ):
+        super(ParseElementEnhance,self).streamline()
+        if self.expr is not None:
+            self.expr.streamline()
+        return self
+
+    def checkRecursion( self, parseElementList ):
+        if self in parseElementList:
+            raise RecursiveGrammarException( parseElementList+[self] )
+        subRecCheckList = parseElementList[:] + [ self ]
+        if self.expr is not None:
+            self.expr.checkRecursion( subRecCheckList )
+
+    def validate( self, validateTrace=[] ):
+        tmp = validateTrace[:]+[self]
+        if self.expr is not None:
+            self.expr.validate(tmp)
+        self.checkRecursion( [] )
+
+    def __str__( self ):
+        try:
+            return super(ParseElementEnhance,self).__str__()
+        except Exception:
+            pass
+
+        if self.strRepr is None and self.expr is not None:
+            self.strRepr = "%s:(%s)" % ( self.__class__.__name__, _ustr(self.expr) )
+        return self.strRepr
+
+
+class FollowedBy(ParseElementEnhance):
+    """
+    Lookahead matching of the given parse expression.  C{FollowedBy}
+    does I{not} advance the parsing position within the input string, it only
+    verifies that the specified parse expression matches at the current
+    position.  C{FollowedBy} always returns a null token list.
+
+    Example::
+        # use FollowedBy to match a label only if it is followed by a ':'
+        data_word = Word(alphas)
+        label = data_word + FollowedBy(':')
+        attr_expr = Group(label + Suppress(':') + OneOrMore(data_word, stopOn=label).setParseAction(' '.join))
+        
+        OneOrMore(attr_expr).parseString("shape: SQUARE color: BLACK posn: upper left").pprint()
+    prints::
+        [['shape', 'SQUARE'], ['color', 'BLACK'], ['posn', 'upper left']]
+    """
+    def __init__( self, expr ):
+        super(FollowedBy,self).__init__(expr)
+        self.mayReturnEmpty = True
+
+    def parseImpl( self, instring, loc, doActions=True ):
+        self.expr.tryParse( instring, loc )
+        return loc, []
+
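A runnable version of the docstring example above (assuming pyparsing is importable); the lookahead consumes no input, so the ':' is still available for the following Suppress:

```python
from pyparsing import Word, alphas, FollowedBy, Suppress, Group, OneOrMore

# match a word as a label only when a ':' follows it
data_word = Word(alphas)
label = data_word + FollowedBy(':')
attr = Group(label + Suppress(':') + data_word)

result = OneOrMore(attr).parseString("shape: SQUARE color: BLACK")
print(result.asList())  # [['shape', 'SQUARE'], ['color', 'BLACK']]
```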
+
+class NotAny(ParseElementEnhance):
+    """
+    Lookahead to disallow matching with the given parse expression.  C{NotAny}
+    does I{not} advance the parsing position within the input string, it only
+    verifies that the specified parse expression does I{not} match at the current
+    position.  Also, C{NotAny} does I{not} skip over leading whitespace. C{NotAny}
+    always returns a null token list.  May be constructed using the '~' operator.
+
+    Example::
+        AND, OR, NOT = map(CaselessKeyword, "AND OR NOT".split())
+
+        # take care not to mistake keywords for identifiers
+        ident = ~(AND | OR | NOT) + Word(alphas)
+        boolean_term = Optional(NOT) + ident
+    """
+    def __init__( self, expr ):
+        super(NotAny,self).__init__(expr)
+        #~ self.leaveWhitespace()
+        self.skipWhitespace = False  # do NOT use self.leaveWhitespace(), don't want to propagate to exprs
+        self.mayReturnEmpty = True
+        self.errmsg = "Found unwanted token, "+_ustr(self.expr)
+
+    def parseImpl( self, instring, loc, doActions=True ):
+        if self.expr.canParseNext(instring, loc):
+            raise ParseException(instring, loc, self.errmsg, self)
+        return loc, []
+
+    def __str__( self ):
+        if hasattr(self,"name"):
+            return self.name
+
+        if self.strRepr is None:
+            self.strRepr = "~{" + _ustr(self.expr) + "}"
+
+        return self.strRepr
+
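A small runnable sketch of `~` / NotAny (assuming pyparsing is importable; the keyword set is illustrative):

```python
from pyparsing import Keyword, Word, alphas, ParseException

# reject reserved words as identifiers: the negative lookahead must
# fail for the overall identifier expression to succeed
keyword = Keyword("if") | Keyword("else")
ident = ~keyword + Word(alphas)

print(ident.parseString("name").asList())  # ['name']
try:
    ident.parseString("if")
except ParseException:
    print("reserved word rejected")
```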
+class _MultipleMatch(ParseElementEnhance):
+    def __init__( self, expr, stopOn=None):
+        super(_MultipleMatch, self).__init__(expr)
+        self.saveAsList = True
+        ender = stopOn
+        if isinstance(ender, basestring):
+            ender = ParserElement._literalStringClass(ender)
+        self.not_ender = ~ender if ender is not None else None
+
+    def parseImpl( self, instring, loc, doActions=True ):
+        self_expr_parse = self.expr._parse
+        self_skip_ignorables = self._skipIgnorables
+        check_ender = self.not_ender is not None
+        if check_ender:
+            try_not_ender = self.not_ender.tryParse
+        
+        # must be at least one (but first see if we are the stopOn sentinel;
+        # if so, fail)
+        if check_ender:
+            try_not_ender(instring, loc)
+        loc, tokens = self_expr_parse( instring, loc, doActions, callPreParse=False )
+        try:
+            hasIgnoreExprs = (not not self.ignoreExprs)
+            while 1:
+                if check_ender:
+                    try_not_ender(instring, loc)
+                if hasIgnoreExprs:
+                    preloc = self_skip_ignorables( instring, loc )
+                else:
+                    preloc = loc
+                loc, tmptokens = self_expr_parse( instring, preloc, doActions )
+                if tmptokens or tmptokens.haskeys():
+                    tokens += tmptokens
+        except (ParseException,IndexError):
+            pass
+
+        return loc, tokens
+        
+class OneOrMore(_MultipleMatch):
+    """
+    Repetition of one or more of the given expression.
+    
+    Parameters:
+     - expr - expression that must match one or more times
+     - stopOn - (default=C{None}) - expression for a terminating sentinel
+          (only required if the sentinel would ordinarily match the repetition 
+          expression)          
+
+    Example::
+        data_word = Word(alphas)
+        label = data_word + FollowedBy(':')
+        attr_expr = Group(label + Suppress(':') + OneOrMore(data_word).setParseAction(' '.join))
+
+        text = "shape: SQUARE posn: upper left color: BLACK"
+        OneOrMore(attr_expr).parseString(text).pprint()  # Fail! reads 'posn' as data instead of the next label -> [['shape', 'SQUARE posn']]
+
+        # use stopOn attribute for OneOrMore to avoid reading label string as part of the data
+        attr_expr = Group(label + Suppress(':') + OneOrMore(data_word, stopOn=label).setParseAction(' '.join))
+        OneOrMore(attr_expr).parseString(text).pprint() # Better -> [['shape', 'SQUARE'], ['posn', 'upper left'], ['color', 'BLACK']]
+        
+        # could also be written as
+        (attr_expr * (1,)).parseString(text).pprint()
+    """
+
+    def __str__( self ):
+        if hasattr(self,"name"):
+            return self.name
+
+        if self.strRepr is None:
+            self.strRepr = "{" + _ustr(self.expr) + "}..."
+
+        return self.strRepr
+
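The stopOn example from the docstring above, in runnable form (assuming pyparsing is importable):

```python
from pyparsing import Word, alphas, FollowedBy, Suppress, Group, OneOrMore

# stopOn keeps the repetition from swallowing the next label
data_word = Word(alphas)
label = data_word + FollowedBy(':')
attr_expr = Group(label + Suppress(':')
                  + OneOrMore(data_word, stopOn=label).setParseAction(' '.join))

text = "shape: SQUARE posn: upper left color: BLACK"
result = OneOrMore(attr_expr).parseString(text)
print(result.asList())  # [['shape', 'SQUARE'], ['posn', 'upper left'], ['color', 'BLACK']]
```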
+class ZeroOrMore(_MultipleMatch):
+    """
+    Optional repetition of zero or more of the given expression.
+    
+    Parameters:
+     - expr - expression that must match zero or more times
+     - stopOn - (default=C{None}) - expression for a terminating sentinel
+          (only required if the sentinel would ordinarily match the repetition 
+          expression)          
+
+    Example: similar to L{OneOrMore}
+    """
+    def __init__( self, expr, stopOn=None):
+        super(ZeroOrMore,self).__init__(expr, stopOn=stopOn)
+        self.mayReturnEmpty = True
+        
+    def parseImpl( self, instring, loc, doActions=True ):
+        try:
+            return super(ZeroOrMore, self).parseImpl(instring, loc, doActions)
+        except (ParseException,IndexError):
+            return loc, []
+
+    def __str__( self ):
+        if hasattr(self,"name"):
+            return self.name
+
+        if self.strRepr is None:
+            self.strRepr = "[" + _ustr(self.expr) + "]..."
+
+        return self.strRepr
+
+class _NullToken(object):
+    def __bool__(self):
+        return False
+    __nonzero__ = __bool__
+    def __str__(self):
+        return ""
+
+_optionalNotMatched = _NullToken()
+class Optional(ParseElementEnhance):
+    """
+    Optional matching of the given expression.
+
+    Parameters:
+     - expr - expression that may be matched zero times or once
+     - default (optional) - value to be returned if the optional expression is not found.
+
+    Example::
+        # US postal code can be a 5-digit zip, plus optional 4-digit qualifier
+        zip = Combine(Word(nums, exact=5) + Optional('-' + Word(nums, exact=4)))
+        zip.runTests('''
+            # traditional ZIP code
+            12345
+            
+            # ZIP+4 form
+            12101-0001
+            
+            # invalid ZIP
+            98765-
+            ''')
+    prints::
+        # traditional ZIP code
+        12345
+        ['12345']
+
+        # ZIP+4 form
+        12101-0001
+        ['12101-0001']
+
+        # invalid ZIP
+        98765-
+             ^
+        FAIL: Expected end of text (at char 5), (line:1, col:6)
+    """
+    def __init__( self, expr, default=_optionalNotMatched ):
+        super(Optional,self).__init__( expr, savelist=False )
+        self.saveAsList = self.expr.saveAsList
+        self.defaultValue = default
+        self.mayReturnEmpty = True
+
+    def parseImpl( self, instring, loc, doActions=True ):
+        try:
+            loc, tokens = self.expr._parse( instring, loc, doActions, callPreParse=False )
+        except (ParseException,IndexError):
+            if self.defaultValue is not _optionalNotMatched:
+                if self.expr.resultsName:
+                    tokens = ParseResults([ self.defaultValue ])
+                    tokens[self.expr.resultsName] = self.defaultValue
+                else:
+                    tokens = [ self.defaultValue ]
+            else:
+                tokens = []
+        return loc, tokens
+
+    def __str__( self ):
+        if hasattr(self,"name"):
+            return self.name
+
+        if self.strRepr is None:
+            self.strRepr = "[" + _ustr(self.expr) + "]"
+
+        return self.strRepr
+
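A runnable sketch of Optional with a default value (assuming pyparsing is importable; the field names are illustrative):

```python
from pyparsing import Word, alphas, nums, Optional

# an optional trailing unit; the default fills in when it is absent
quantity = Word(nums)("value") + Optional(Word(alphas), default="none")("unit")

print(quantity.parseString("42 ml").asList())  # ['42', 'ml']
print(quantity.parseString("42").asList())     # ['42', 'none']
```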
+class SkipTo(ParseElementEnhance):
+    """
+    Token for skipping over all undefined text until the matched expression is found.
+
+    Parameters:
+     - expr - target expression marking the end of the data to be skipped
+     - include - (default=C{False}) if True, the target expression is also parsed 
+          (the skipped text and target expression are returned as a 2-element list).
+     - ignore - (default=C{None}) used to define grammars (typically quoted strings and 
+          comments) that might contain false matches to the target expression
+     - failOn - (default=C{None}) define expressions that are not allowed to be
+          included in the skipped text; if found before the target expression is found,
+          the SkipTo is not a match
+
+    Example::
+        report = '''
+            Outstanding Issues Report - 1 Jan 2000
+
+               # | Severity | Description                               |  Days Open
+            -----+----------+-------------------------------------------+-----------
+             101 | Critical | Intermittent system crash                 |          6
+              94 | Cosmetic | Spelling error on Login ('log|n')         |         14
+              79 | Minor    | System slow when running too many reports |         47
+            '''
+        integer = Word(nums)
+        SEP = Suppress('|')
+        # use SkipTo to simply match everything up until the next SEP
+        # - ignore quoted strings, so that a '|' character inside a quoted string does not match
+        # - parse action will call token.strip() for each matched token, i.e., the description body
+        string_data = SkipTo(SEP, ignore=quotedString)
+        string_data.setParseAction(tokenMap(str.strip))
+        ticket_expr = (integer("issue_num") + SEP 
+                      + string_data("sev") + SEP 
+                      + string_data("desc") + SEP 
+                      + integer("days_open"))
+        
+        for tkt in ticket_expr.searchString(report):
+            print(tkt.dump())
+    prints::
+        ['101', 'Critical', 'Intermittent system crash', '6']
+        - days_open: 6
+        - desc: Intermittent system crash
+        - issue_num: 101
+        - sev: Critical
+        ['94', 'Cosmetic', "Spelling error on Login ('log|n')", '14']
+        - days_open: 14
+        - desc: Spelling error on Login ('log|n')
+        - issue_num: 94
+        - sev: Cosmetic
+        ['79', 'Minor', 'System slow when running too many reports', '47']
+        - days_open: 47
+        - desc: System slow when running too many reports
+        - issue_num: 79
+        - sev: Minor
+    """
+    def __init__( self, other, include=False, ignore=None, failOn=None ):
+        super( SkipTo, self ).__init__( other )
+        self.ignoreExpr = ignore
+        self.mayReturnEmpty = True
+        self.mayIndexError = False
+        self.includeMatch = include
+        self.asList = False
+        if isinstance(failOn, basestring):
+            self.failOn = ParserElement._literalStringClass(failOn)
+        else:
+            self.failOn = failOn
+        self.errmsg = "No match found for "+_ustr(self.expr)
+
+    def parseImpl( self, instring, loc, doActions=True ):
+        startloc = loc
+        instrlen = len(instring)
+        expr = self.expr
+        expr_parse = self.expr._parse
+        self_failOn_canParseNext = self.failOn.canParseNext if self.failOn is not None else None
+        self_ignoreExpr_tryParse = self.ignoreExpr.tryParse if self.ignoreExpr is not None else None
+        
+        tmploc = loc
+        while tmploc <= instrlen:
+            if self_failOn_canParseNext is not None:
+                # break if failOn expression matches
+                if self_failOn_canParseNext(instring, tmploc):
+                    break
+                    
+            if self_ignoreExpr_tryParse is not None:
+                # advance past ignore expressions
+                while 1:
+                    try:
+                        tmploc = self_ignoreExpr_tryParse(instring, tmploc)
+                    except ParseBaseException:
+                        break
+            
+            try:
+                expr_parse(instring, tmploc, doActions=False, callPreParse=False)
+            except (ParseException, IndexError):
+                # no match, advance loc in string
+                tmploc += 1
+            else:
+                # matched skipto expr, done
+                break
+
+        else:
+            # ran off the end of the input string without matching skipto expr, fail
+            raise ParseException(instring, loc, self.errmsg, self)
+
+        # build up return values
+        loc = tmploc
+        skiptext = instring[startloc:loc]
+        skipresult = ParseResults(skiptext)
+        
+        if self.includeMatch:
+            loc, mat = expr_parse(instring,loc,doActions,callPreParse=False)
+            skipresult += mat
+
+        return loc, skipresult
+
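A minimal runnable SkipTo sketch (assuming pyparsing is importable; the statement shape is illustrative). Note that the skipped text is returned verbatim, including any leading whitespace:

```python
from pyparsing import Word, alphas, SkipTo, Suppress

# skip arbitrary text up to the terminating ';'
stmt = Word(alphas)("name") + Suppress('=') + SkipTo(';')("body") + Suppress(';')

r = stmt.parseString("greeting = Hello, World !;")
print(r['name'])          # 'greeting'
print(r['body'].strip())  # 'Hello, World !'
```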
+class Forward(ParseElementEnhance):
+    """
+    Forward declaration of an expression to be defined later -
+    used for recursive grammars, such as algebraic infix notation.
+    When the expression is known, it is assigned to the C{Forward} variable using the '<<' operator.
+
+    Note: take care when assigning to C{Forward} not to overlook precedence of operators.
+    Specifically, '|' has a lower precedence than '<<', so that::
+        fwdExpr << a | b | c
+    will actually be evaluated as::
+        (fwdExpr << a) | b | c
+    thereby leaving b and c out as parseable alternatives.  It is recommended that you
+    explicitly group the values inserted into the C{Forward}::
+        fwdExpr << (a | b | c)
+    Converting to use the '<<=' operator instead will avoid this problem.
+
+    See L{ParseResults.pprint} for an example of a recursive parser created using
+    C{Forward}.
+    """
+    def __init__( self, other=None ):
+        super(Forward,self).__init__( other, savelist=False )
+
+    def __lshift__( self, other ):
+        if isinstance( other, basestring ):
+            other = ParserElement._literalStringClass(other)
+        self.expr = other
+        self.strRepr = None
+        self.mayIndexError = self.expr.mayIndexError
+        self.mayReturnEmpty = self.expr.mayReturnEmpty
+        self.setWhitespaceChars( self.expr.whiteChars )
+        self.skipWhitespace = self.expr.skipWhitespace
+        self.saveAsList = self.expr.saveAsList
+        self.ignoreExprs.extend(self.expr.ignoreExprs)
+        return self
+        
+    def __ilshift__(self, other):
+        return self << other
+    
+    def leaveWhitespace( self ):
+        self.skipWhitespace = False
+        return self
+
+    def streamline( self ):
+        if not self.streamlined:
+            self.streamlined = True
+            if self.expr is not None:
+                self.expr.streamline()
+        return self
+
+    def validate( self, validateTrace=[] ):
+        if self not in validateTrace:
+            tmp = validateTrace[:]+[self]
+            if self.expr is not None:
+                self.expr.validate(tmp)
+        self.checkRecursion([])
+
+    def __str__( self ):
+        if hasattr(self,"name"):
+            return self.name
+        return self.__class__.__name__ + ": ..."
+
+        # stubbed out for now - creates awful memory and perf issues
+        self._revertClass = self.__class__
+        self.__class__ = _ForwardNoRecurse
+        try:
+            if self.expr is not None:
+                retString = _ustr(self.expr)
+            else:
+                retString = "None"
+        finally:
+            self.__class__ = self._revertClass
+        return self.__class__.__name__ + ": " + retString
+
+    def copy(self):
+        if self.expr is not None:
+            return super(Forward,self).copy()
+        else:
+            ret = Forward()
+            ret <<= self
+            return ret
+
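A runnable recursive-grammar sketch using Forward and the recommended '<<=' operator (assuming pyparsing is importable):

```python
from pyparsing import Forward, Word, nums, Suppress, Group, ZeroOrMore

# sums with parenthesized sub-expressions; expr refers to itself
expr = Forward()
atom = Word(nums) | Group(Suppress('(') + expr + Suppress(')'))
expr <<= atom + ZeroOrMore('+' + atom)

print(expr.parseString("1 + ( 2 + 3 )").asList())  # ['1', '+', ['2', '+', '3']]
```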
+class _ForwardNoRecurse(Forward):
+    def __str__( self ):
+        return "..."
+
+class TokenConverter(ParseElementEnhance):
+    """
+    Abstract subclass of C{ParseElementEnhance}, for converting parsed results.
+    """
+    def __init__( self, expr, savelist=False ):
+        super(TokenConverter,self).__init__( expr )#, savelist )
+        self.saveAsList = False
+
+class Combine(TokenConverter):
+    """
+    Converter to concatenate all matching tokens to a single string.
+    By default, the matching patterns must also be contiguous in the input string;
+    this can be disabled by specifying C{'adjacent=False'} in the constructor.
+
+    Example::
+        real = Word(nums) + '.' + Word(nums)
+        print(real.parseString('3.1416')) # -> ['3', '.', '1416']
+        # will also erroneously match the following
+        print(real.parseString('3. 1416')) # -> ['3', '.', '1416']
+
+        real = Combine(Word(nums) + '.' + Word(nums))
+        print(real.parseString('3.1416')) # -> ['3.1416']
+        # no match when there are internal spaces
+        print(real.parseString('3. 1416')) # -> Exception: Expected W:(0123...)
+    """
+    def __init__( self, expr, joinString="", adjacent=True ):
+        super(Combine,self).__init__( expr )
+        # suppress whitespace-stripping in contained parse expressions, but re-enable it on the Combine itself
+        if adjacent:
+            self.leaveWhitespace()
+        self.adjacent = adjacent
+        self.skipWhitespace = True
+        self.joinString = joinString
+        self.callPreparse = True
+
+    def ignore( self, other ):
+        if self.adjacent:
+            ParserElement.ignore(self, other)
+        else:
+            super( Combine, self).ignore( other )
+        return self
+
+    def postParse( self, instring, loc, tokenlist ):
+        retToks = tokenlist.copy()
+        del retToks[:]
+        retToks += ParseResults([ "".join(tokenlist._asStringList(self.joinString)) ], modal=self.modalResults)
+
+        if self.resultsName and retToks.haskeys():
+            return [ retToks ]
+        else:
+            return retToks
+
+class Group(TokenConverter):
+    """
+    Converter to return the matched tokens as a list - useful for returning tokens of C{L{ZeroOrMore}} and C{L{OneOrMore}} expressions.
+
+    Example::
+        ident = Word(alphas)
+        num = Word(nums)
+        term = ident | num
+        func = ident + Optional(delimitedList(term))
+        print(func.parseString("fn a,b,100"))  # -> ['fn', 'a', 'b', '100']
+
+        func = ident + Group(Optional(delimitedList(term)))
+        print(func.parseString("fn a,b,100"))  # -> ['fn', ['a', 'b', '100']]
+    """
+    def __init__( self, expr ):
+        super(Group,self).__init__( expr )
+        self.saveAsList = True
+
+    def postParse( self, instring, loc, tokenlist ):
+        return [ tokenlist ]
+
+class Dict(TokenConverter):
+    """
+    Converter to return a repetitive expression as a list, but also as a dictionary.
+    Each element can also be referenced using the first token in the expression as its key.
+    Useful for tabular report scraping when the first column can be used as an item key.
+
+    Example::
+        data_word = Word(alphas)
+        label = data_word + FollowedBy(':')
+        attr_expr = Group(label + Suppress(':') + OneOrMore(data_word).setParseAction(' '.join))
+
+        text = "shape: SQUARE posn: upper left color: light blue texture: burlap"
+        attr_expr = (label + Suppress(':') + OneOrMore(data_word, stopOn=label).setParseAction(' '.join))
+        
+        # print attributes as plain groups
+        print(OneOrMore(attr_expr).parseString(text).dump())
+        
+        # instead of OneOrMore(expr), parse using Dict(OneOrMore(Group(expr))) - Dict will auto-assign names
+        result = Dict(OneOrMore(Group(attr_expr))).parseString(text)
+        print(result.dump())
+        
+        # access named fields as dict entries, or output as dict
+        print(result['shape'])        
+        print(result.asDict())
+    prints::
+        ['shape', 'SQUARE', 'posn', 'upper left', 'color', 'light blue', 'texture', 'burlap']
+
+        [['shape', 'SQUARE'], ['posn', 'upper left'], ['color', 'light blue'], ['texture', 'burlap']]
+        - color: light blue
+        - posn: upper left
+        - shape: SQUARE
+        - texture: burlap
+        SQUARE
+        {'color': 'light blue', 'posn': 'upper left', 'texture': 'burlap', 'shape': 'SQUARE'}
+    See more examples at L{ParseResults} of accessing fields by results name.
+    """
+    def __init__( self, expr ):
+        super(Dict,self).__init__( expr )
+        self.saveAsList = True
+
+    def postParse( self, instring, loc, tokenlist ):
+        for i,tok in enumerate(tokenlist):
+            if len(tok) == 0:
+                continue
+            ikey = tok[0]
+            if isinstance(ikey,int):
+                ikey = _ustr(tok[0]).strip()
+            if len(tok)==1:
+                tokenlist[ikey] = _ParseResultsWithOffset("",i)
+            elif len(tok)==2 and not isinstance(tok[1],ParseResults):
+                tokenlist[ikey] = _ParseResultsWithOffset(tok[1],i)
+            else:
+                dictvalue = tok.copy() #ParseResults(i)
+                del dictvalue[0]
+                if len(dictvalue)!= 1 or (isinstance(dictvalue,ParseResults) and dictvalue.haskeys()):
+                    tokenlist[ikey] = _ParseResultsWithOffset(dictvalue,i)
+                else:
+                    tokenlist[ikey] = _ParseResultsWithOffset(dictvalue[0],i)
+
+        if self.resultsName:
+            return [ tokenlist ]
+        else:
+            return tokenlist
+
+
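A compact runnable sketch of Dict's auto-keying (assuming pyparsing is importable; the fruit table is illustrative):

```python
from pyparsing import Word, alphas, nums, Group, OneOrMore, Dict

# Dict uses the first token of each group as that entry's key
entry = Group(Word(alphas) + Word(nums))
table = Dict(OneOrMore(entry))

result = table.parseString("apples 5 pears 7")
print(result['apples'])  # '5'
print(result.asDict())
```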
+class Suppress(TokenConverter):
+    """
+    Converter for ignoring the results of a parsed expression.
+
+    Example::
+        source = "a, b, c,d"
+        wd = Word(alphas)
+        wd_list1 = wd + ZeroOrMore(',' + wd)
+        print(wd_list1.parseString(source))
+
+        # often, delimiters that are useful during parsing are just in the
+        # way afterward - use Suppress to keep them out of the parsed output
+        wd_list2 = wd + ZeroOrMore(Suppress(',') + wd)
+        print(wd_list2.parseString(source))
+    prints::
+        ['a', ',', 'b', ',', 'c', ',', 'd']
+        ['a', 'b', 'c', 'd']
+    (See also L{delimitedList}.)
+    """
+    def postParse( self, instring, loc, tokenlist ):
+        return []
+
+    def suppress( self ):
+        return self
+
+
+class OnlyOnce(object):
+    """
+    Wrapper for parse actions, to ensure they are only called once.
+    """
+    def __init__(self, methodCall):
+        self.callable = _trim_arity(methodCall)
+        self.called = False
+    def __call__(self,s,l,t):
+        if not self.called:
+            results = self.callable(s,l,t)
+            self.called = True
+            return results
+        raise ParseException(s,l,"")
+    def reset(self):
+        self.called = False
+
+def traceParseAction(f):
+    """
+    Decorator for debugging parse actions. 
+    
+    When the parse action is called, this decorator will print C{">>entering I{method-name}(line:I{current_source_line}, I{parse_location}, I{matched_tokens})"}.
+    When the parse action completes, the decorator will print C{"<<"} followed by the returned value, or any exception that the parse action raised.
+
+    Example::
+        wd = Word(alphas)
+
+        @traceParseAction
+        def remove_duplicate_chars(tokens):
+            return ''.join(sorted(set(''.join(tokens))))
+
+        wds = OneOrMore(wd).setParseAction(remove_duplicate_chars)
+        print(wds.parseString("slkdjs sld sldd sdlf sdljf"))
+    prints::
+        >>entering remove_duplicate_chars(line: 'slkdjs sld sldd sdlf sdljf', 0, (['slkdjs', 'sld', 'sldd', 'sdlf', 'sdljf'], {}))
+        <<leaving remove_duplicate_chars (ret: 'dfjkls')
+        ['dfjkls']
+    """
+    f = _trim_arity(f)
+    def z(*paArgs):
+        thisFunc = f.__name__
+        s,l,t = paArgs[-3:]
+        if len(paArgs)>3:
+            thisFunc = paArgs[0].__class__.__name__ + '.' + thisFunc
+        sys.stderr.write( ">>entering %s(line: '%s', %d, %r)\n" % (thisFunc,line(l,s),l,t) )
+        try:
+            ret = f(*paArgs)
+        except Exception as exc:
+            sys.stderr.write( "<<leaving %s (exception: %s)\n" % (thisFunc,exc) )
+            raise
+        sys.stderr.write( "<<leaving %s (ret: %r)\n" % (thisFunc,ret) )
+        return ret
+    try:
+        z.__name__ = f.__name__
+    except AttributeError:
+        pass
+    return z
+
+#
+# global helpers
+#
+def delimitedList( expr, delim=",", combine=False ):
+    """
+    Helper to define a list of delimited expressions - the delimiter defaults to ','.
+    By default, the list elements and delimiters can have intervening whitespace, and
+    comments, but this can be overridden by passing C{combine=True} in the constructor.
+    If C{combine} is set to C{True}, the matching tokens are returned as a single token
+    string, with the delimiters included; otherwise, the matching tokens are returned
+    as a list of tokens, with the delimiters suppressed.
+
+    Example::
+        delimitedList(Word(alphas)).parseString("aa,bb,cc") # -> ['aa', 'bb', 'cc']
+        delimitedList(Word(hexnums), delim=':', combine=True).parseString("AA:BB:CC:DD:EE") # -> ['AA:BB:CC:DD:EE']
+    """
+    dlName = _ustr(expr)+" ["+_ustr(delim)+" "+_ustr(expr)+"]..."
+    if combine:
+        return Combine( expr + ZeroOrMore( delim + expr ) ).setName(dlName)
+    else:
+        return ( expr + ZeroOrMore( Suppress( delim ) + expr ) ).setName(dlName)
+
+def countedArray( expr, intExpr=None ):
+    """
+    Helper to define a counted list of expressions.
+    This helper defines a pattern of the form::
+        integer expr expr expr...
+    where the leading integer tells how many expr expressions follow.
+    The matched tokens are returned as a list of the expr tokens - the leading count token is suppressed.
+    
+    If C{intExpr} is specified, it should be a pyparsing expression that produces an integer value.
+
+    Example::
+        countedArray(Word(alphas)).parseString('2 ab cd ef')  # -> ['ab', 'cd']
+
+        # in this parser, the leading integer value is given in binary,
+        # '10' indicating that 2 values are in the array
+        binaryConstant = Word('01').setParseAction(lambda t: int(t[0], 2))
+        countedArray(Word(alphas), intExpr=binaryConstant).parseString('10 ab cd ef')  # -> ['ab', 'cd']
+    """
+    arrayExpr = Forward()
+    def countFieldParseAction(s,l,t):
+        n = t[0]
+        arrayExpr << (n and Group(And([expr]*n)) or Group(empty))
+        return []
+    if intExpr is None:
+        intExpr = Word(nums).setParseAction(lambda t:int(t[0]))
+    else:
+        intExpr = intExpr.copy()
+    intExpr.setName("arrayLen")
+    intExpr.addParseAction(countFieldParseAction, callDuringTry=True)
+    return ( intExpr + arrayExpr ).setName('(len) ' + _ustr(expr) + '...')
+
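A runnable countedArray sketch (assuming pyparsing is importable). The nesting of the returned list has varied between pyparsing releases, so no exact shape is asserted here:

```python
from pyparsing import countedArray, Word, alphas

# the leading count token drives how many expr terms are parsed,
# and is itself suppressed from the results
r = countedArray(Word(alphas)).parseString('2 ab cd ef')
print(r.asList())
```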
+def _flatten(L):
+    ret = []
+    for i in L:
+        if isinstance(i,list):
+            ret.extend(_flatten(i))
+        else:
+            ret.append(i)
+    return ret
+
+def matchPreviousLiteral(expr):
+    """
+    Helper to define an expression that is indirectly defined from
+    the tokens matched in a previous expression, that is, it looks
+    for a 'repeat' of a previous expression.  For example::
+        first = Word(nums)
+        second = matchPreviousLiteral(first)
+        matchExpr = first + ":" + second
+    will match C{"1:1"}, but not C{"1:2"}.  Because this matches a
+    previous literal, it will also match the leading C{"1:1"} in C{"1:10"}.
+    If this is not desired, use C{matchPreviousExpr}.
+    Do I{not} use with packrat parsing enabled.
+    """
+    rep = Forward()
+    def copyTokenToRepeater(s,l,t):
+        if t:
+            if len(t) == 1:
+                rep << t[0]
+            else:
+                # flatten t tokens
+                tflat = _flatten(t.asList())
+                rep << And(Literal(tt) for tt in tflat)
+        else:
+            rep << Empty()
+    expr.addParseAction(copyTokenToRepeater, callDuringTry=True)
+    rep.setName('(prev) ' + _ustr(expr))
+    return rep
+
+def matchPreviousExpr(expr):
+    """
+    Helper to define an expression that is indirectly defined from
+    the tokens matched in a previous expression, that is, it looks
+    for a 'repeat' of a previous expression.  For example::
+        first = Word(nums)
+        second = matchPreviousExpr(first)
+        matchExpr = first + ":" + second
+    will match C{"1:1"}, but not C{"1:2"}.  Because this matches by
+    expressions, it will I{not} match the leading C{"1:1"} in C{"1:10"};
+    the expressions are evaluated first, and then compared, so
+    C{"1"} is compared with C{"10"}.
+    Do I{not} use with packrat parsing enabled.
+    """
+    rep = Forward()
+    e2 = expr.copy()
+    rep <<= e2
+    def copyTokenToRepeater(s,l,t):
+        matchTokens = _flatten(t.asList())
+        def mustMatchTheseTokens(s,l,t):
+            theseTokens = _flatten(t.asList())
+            if  theseTokens != matchTokens:
+                raise ParseException("",0,"")
+        rep.setParseAction( mustMatchTheseTokens, callDuringTry=True )
+    expr.addParseAction(copyTokenToRepeater, callDuringTry=True)
+    rep.setName('(prev) ' + _ustr(expr))
+    return rep
+
+def _escapeRegexRangeChars(s):
+    #~  escape these chars: ^-]
+    for c in r"\^-]":
+        s = s.replace(c,_bslash+c)
+    s = s.replace("\n",r"\n")
+    s = s.replace("\t",r"\t")
+    return _ustr(s)
+
+def oneOf( strs, caseless=False, useRegex=True ):
+    """
+    Helper to quickly define a set of alternative Literals, which makes sure to do
+    longest-first testing when there is a conflict, regardless of the input order,
+    but returns a C{L{MatchFirst}} for best performance.
+
+    Parameters:
+     - strs - a string of space-delimited literals, or a collection of string literals
+     - caseless - (default=C{False}) - treat all literals as caseless
+     - useRegex - (default=C{True}) - as an optimization, will generate a Regex
+          object; otherwise, will generate a C{MatchFirst} object (if C{caseless=True}, or
+          if creating a C{Regex} raises an exception)
+
+    Example::
+        comp_oper = oneOf("< = > <= >= !=")
+        var = Word(alphas)
+        number = Word(nums)
+        term = var | number
+        comparison_expr = term + comp_oper + term
+        print(comparison_expr.searchString("B = 12  AA=23 B<=AA AA>12"))
+    prints::
+        [['B', '=', '12'], ['AA', '=', '23'], ['B', '<=', 'AA'], ['AA', '>', '12']]
+    """
+    if caseless:
+        isequal = ( lambda a,b: a.upper() == b.upper() )
+        masks = ( lambda a,b: b.upper().startswith(a.upper()) )
+        parseElementClass = CaselessLiteral
+    else:
+        isequal = ( lambda a,b: a == b )
+        masks = ( lambda a,b: b.startswith(a) )
+        parseElementClass = Literal
+
+    symbols = []
+    if isinstance(strs,basestring):
+        symbols = strs.split()
+    elif isinstance(strs, Iterable):
+        symbols = list(strs)
+    else:
+        warnings.warn("Invalid argument to oneOf, expected string or iterable",
+                SyntaxWarning, stacklevel=2)
+    if not symbols:
+        return NoMatch()
+
+    i = 0
+    while i < len(symbols)-1:
+        cur = symbols[i]
+        for j,other in enumerate(symbols[i+1:]):
+            if ( isequal(other, cur) ):
+                del symbols[i+j+1]
+                break
+            elif ( masks(cur, other) ):
+                del symbols[i+j+1]
+                symbols.insert(i,other)
+                cur = other
+                break
+        else:
+            i += 1
+
+    if not caseless and useRegex:
+        #~ print (strs,"->", "|".join( [ _escapeRegexChars(sym) for sym in symbols] ))
+        try:
+            if len(symbols)==len("".join(symbols)):
+                return Regex( "[%s]" % "".join(_escapeRegexRangeChars(sym) for sym in symbols) ).setName(' | '.join(symbols))
+            else:
+                return Regex( "|".join(re.escape(sym) for sym in symbols) ).setName(' | '.join(symbols))
+        except Exception:
+            warnings.warn("Exception creating Regex for oneOf, building MatchFirst",
+                    SyntaxWarning, stacklevel=2)
+
+
+    # last resort, just use MatchFirst
+    return MatchFirst(parseElementClass(sym) for sym in symbols).setName(' | '.join(symbols))
+
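# The while-loop above reorders the symbols so that longer overlapping
# literals are tried before their prefixes. A standalone sketch of that
# reordering (the function name here is illustrative, not part of pyparsing):

```python
def order_symbols(symbols, caseless=False):
    # move any symbol that starts with an earlier symbol ahead of it,
    # and drop duplicates, so longest-first matching is preserved
    key = (lambda s: s.upper()) if caseless else (lambda s: s)
    syms = list(symbols)
    i = 0
    while i < len(syms) - 1:
        cur = syms[i]
        for j, other in enumerate(syms[i + 1:]):
            if key(other) == key(cur):
                del syms[i + j + 1]          # duplicate - drop it
                break
            elif key(other).startswith(key(cur)):
                del syms[i + j + 1]          # longer symbol masks cur -
                syms.insert(i, other)        # promote it ahead of cur
                cur = other
                break
        else:
            i += 1
    return syms

print(order_symbols("< = > <= >= !=".split()))
# ['<=', '<', '=', '>=', '>', '!=']
```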
+def dictOf( key, value ):
+    """
+    Helper to easily and clearly define a dictionary by specifying the respective patterns
+    for the key and value.  Takes care of defining the C{L{Dict}}, C{L{ZeroOrMore}}, and C{L{Group}} tokens
+    in the proper order.  The key pattern can include delimiting markers or punctuation,
+    as long as they are suppressed, thereby leaving the significant key text.  The value
+    pattern can include named results, so that the C{Dict} results can include named token
+    fields.
+
+    Example::
+        text = "shape: SQUARE posn: upper left color: light blue texture: burlap"
+        data_word = Word(alphas)
+        label = data_word + FollowedBy(':')
+        attr_expr = (label + Suppress(':') + OneOrMore(data_word, stopOn=label).setParseAction(' '.join))
+        print(OneOrMore(attr_expr).parseString(text).dump())
+        
+        attr_label = label
+        attr_value = Suppress(':') + OneOrMore(data_word, stopOn=label).setParseAction(' '.join)
+
+        # similar to Dict, but simpler call format
+        result = dictOf(attr_label, attr_value).parseString(text)
+        print(result.dump())
+        print(result['shape'])
+        print(result.shape)  # object attribute access works too
+        print(result.asDict())
+    prints::
+        [['shape', 'SQUARE'], ['posn', 'upper left'], ['color', 'light blue'], ['texture', 'burlap']]
+        - color: light blue
+        - posn: upper left
+        - shape: SQUARE
+        - texture: burlap
+        SQUARE
+        SQUARE
+        {'color': 'light blue', 'shape': 'SQUARE', 'posn': 'upper left', 'texture': 'burlap'}
+    """
+    return Dict( ZeroOrMore( Group ( key + value ) ) )
+
+def originalTextFor(expr, asString=True):
+    """
+    Helper to return the original, untokenized text for a given expression.  Useful to
+    restore the parsed fields of an HTML start tag into the raw tag text itself, or to
+    revert separate tokens with intervening whitespace back to the original matching
+    input text. By default, returns a string containing the original parsed text.
+       
+    If the optional C{asString} argument is passed as C{False}, then the return value is a 
+    C{L{ParseResults}} containing any results names that were originally matched, and a 
+    single token containing the original matched text from the input string.  So if 
+    the expression passed to C{L{originalTextFor}} contains expressions with defined
+    results names, you must set C{asString} to C{False} if you want to preserve those
+    results name values.
+
+    Example::
+        src = "this is test <b> bold <i>text</i> </b> normal text "
+        for tag in ("b","i"):
+            opener,closer = makeHTMLTags(tag)
+            patt = originalTextFor(opener + SkipTo(closer) + closer)
+            print(patt.searchString(src)[0])
+    prints::
+        ['<b> bold <i>text</i> </b>']
+        ['<i>text</i>']
+    """
+    locMarker = Empty().setParseAction(lambda s,loc,t: loc)
+    endlocMarker = locMarker.copy()
+    endlocMarker.callPreparse = False
+    matchExpr = locMarker("_original_start") + expr + endlocMarker("_original_end")
+    if asString:
+        extractText = lambda s,l,t: s[t._original_start:t._original_end]
+    else:
+        def extractText(s,l,t):
+            t[:] = [s[t.pop('_original_start'):t.pop('_original_end')]]
+    matchExpr.setParseAction(extractText)
+    matchExpr.ignoreExprs = expr.ignoreExprs
+    return matchExpr
+
+def ungroup(expr): 
+    """
+    Helper to undo pyparsing's default grouping of And expressions, even
+    if all but one are non-empty.
+    """
+    return TokenConverter(expr).setParseAction(lambda t:t[0])
+
+def locatedExpr(expr):
+    """
+    Helper to decorate a returned token with its starting and ending locations in the input string.
+    This helper adds the following results names:
+     - locn_start = location where matched expression begins
+     - locn_end = location where matched expression ends
+     - value = the actual parsed results
+
+    Be careful if the input text contains C{<TAB>} characters; you may want to call
+    C{L{ParserElement.parseWithTabs}}
+
+    Example::
+        wd = Word(alphas)
+        for match in locatedExpr(wd).searchString("ljsdf123lksdjjf123lkkjj1222"):
+            print(match)
+    prints::
+        [[0, 'ljsdf', 5]]
+        [[8, 'lksdjjf', 15]]
+        [[18, 'lkkjj', 23]]
+    """
+    locator = Empty().setParseAction(lambda s,l,t: l)
+    return Group(locator("locn_start") + expr("value") + locator.copy().leaveWhitespace()("locn_end"))
+
+
+# convenience constants for positional expressions
+empty       = Empty().setName("empty")
+lineStart   = LineStart().setName("lineStart")
+lineEnd     = LineEnd().setName("lineEnd")
+stringStart = StringStart().setName("stringStart")
+stringEnd   = StringEnd().setName("stringEnd")
+
+_escapedPunc = Word( _bslash, r"\[]-*.$+^?()~ ", exact=2 ).setParseAction(lambda s,l,t:t[0][1])
+_escapedHexChar = Regex(r"\\0?[xX][0-9a-fA-F]+").setParseAction(lambda s,l,t:unichr(int(t[0].lstrip(r'\0x'),16)))
+_escapedOctChar = Regex(r"\\0[0-7]+").setParseAction(lambda s,l,t:unichr(int(t[0][1:],8)))
+_singleChar = _escapedPunc | _escapedHexChar | _escapedOctChar | CharsNotIn(r'\]', exact=1)
+_charRange = Group(_singleChar + Suppress("-") + _singleChar)
+_reBracketExpr = Literal("[") + Optional("^").setResultsName("negate") + Group( OneOrMore( _charRange | _singleChar ) ).setResultsName("body") + "]"
+
+def srange(s):
+    r"""
+    Helper to easily define string ranges for use in Word construction.  Borrows
+    syntax from regexp '[]' string range definitions::
+        srange("[0-9]")   -> "0123456789"
+        srange("[a-z]")   -> "abcdefghijklmnopqrstuvwxyz"
+        srange("[a-z$_]") -> "abcdefghijklmnopqrstuvwxyz$_"
+    The input string must be enclosed in []'s, and the returned string is the expanded
+    character set joined into a single string.
+    The values enclosed in the []'s may be:
+     - a single character
+     - an escaped character with a leading backslash (such as C{\-} or C{\]})
+     - an escaped hex character with a leading C{'\x'} (C{\x21}, which is a C{'!'} character) 
+         (C{\0x##} is also supported for backwards compatibility) 
+     - an escaped octal character with a leading C{'\0'} (C{\041}, which is a C{'!'} character)
+     - a range of any of the above, separated by a dash (C{'a-z'}, etc.)
+     - any combination of the above (C{'aeiouy'}, C{'a-zA-Z0-9_$'}, etc.)
+    """
+    _expanded = lambda p: p if not isinstance(p,ParseResults) else ''.join(unichr(c) for c in range(ord(p[0]),ord(p[1])+1))
+    try:
+        return "".join(_expanded(part) for part in _reBracketExpr.parseString(s).body)
+    except Exception:
+        return ""
+
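# A minimal standalone sketch of the range expansion srange performs for
# simple ASCII specs (it omits srange's backslash/hex/octal escape handling;
# the function name is illustrative, not part of pyparsing):

```python
import re

def expand(spec):
    # expand "[...]" specs: X-Y pairs become full character runs,
    # single characters pass through unchanged
    out = []
    for m in re.finditer(r'(.)-(.)|(.)', spec[1:-1]):
        if m.group(1):
            out.extend(chr(c) for c in range(ord(m.group(1)), ord(m.group(2)) + 1))
        else:
            out.append(m.group(3))
    return ''.join(out)

print(expand("[0-9]"))    # 0123456789
print(expand("[a-e_$]"))  # abcde_$
```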
+def matchOnlyAtCol(n):
+    """
+    Helper method for defining parse actions that require matching at a specific
+    column in the input text.
+    """
+    def verifyCol(strg,locn,toks):
+        if col(locn,strg) != n:
+            raise ParseException(strg,locn,"matched token not at column %d" % n)
+    return verifyCol
+
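# matchOnlyAtCol relies on the module's col() helper; a standalone sketch of
# the 1-based column computation it performs:

```python
def col(loc, strg):
    # 1-based column: distance from the most recent newline before loc
    # (a loc sitting on a newline is reported as column 1)
    return 1 if (loc < len(strg) and strg[loc] == '\n') else loc - strg.rfind("\n", 0, loc)

text = "abc\ndef"
print(col(0, text))  # 1  ('a' is column 1 of line 1)
print(col(4, text))  # 1  ('d' is column 1 of line 2)
print(col(6, text))  # 3  ('f' is column 3 of line 2)
```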
+def replaceWith(replStr):
+    """
+    Helper method for common parse actions that simply return a literal value.  Especially
+    useful when used with C{L{transformString}()}.
+
+    Example::
+        num = Word(nums).setParseAction(lambda toks: int(toks[0]))
+        na = oneOf("N/A NA").setParseAction(replaceWith(math.nan))
+        term = na | num
+        
+        OneOrMore(term).parseString("324 234 N/A 234") # -> [324, 234, nan, 234]
+    """
+    return lambda s,l,t: [replStr]
+
+def removeQuotes(s,l,t):
+    """
+    Helper parse action for removing quotation marks from parsed quoted strings.
+
+    Example::
+        # by default, quotation marks are included in parsed results
+        quotedString.parseString("'Now is the Winter of our Discontent'") # -> ["'Now is the Winter of our Discontent'"]
+
+        # use removeQuotes to strip quotation marks from parsed results
+        quotedString.setParseAction(removeQuotes)
+        quotedString.parseString("'Now is the Winter of our Discontent'") # -> ["Now is the Winter of our Discontent"]
+    """
+    return t[0][1:-1]
+
+def tokenMap(func, *args):
+    """
+    Helper to define a parse action by mapping a function to all elements of a ParseResults list. If any additional
+    args are passed, they are forwarded to the given function as additional arguments after
+    the token, as in C{hex_integer = Word(hexnums).setParseAction(tokenMap(int, 16))}, which will convert the
+    parsed data to an integer using base 16.
+
+    Example (compare the last example to that in L{ParserElement.transformString})::
+        hex_ints = OneOrMore(Word(hexnums)).setParseAction(tokenMap(int, 16))
+        hex_ints.runTests('''
+            00 11 22 aa FF 0a 0d 1a
+            ''')
+        
+        upperword = Word(alphas).setParseAction(tokenMap(str.upper))
+        OneOrMore(upperword).runTests('''
+            my kingdom for a horse
+            ''')
+
+        wd = Word(alphas).setParseAction(tokenMap(str.title))
+        OneOrMore(wd).setParseAction(' '.join).runTests('''
+            now is the winter of our discontent made glorious summer by this sun of york
+            ''')
+    prints::
+        00 11 22 aa FF 0a 0d 1a
+        [0, 17, 34, 170, 255, 10, 13, 26]
+
+        my kingdom for a horse
+        ['MY', 'KINGDOM', 'FOR', 'A', 'HORSE']
+
+        now is the winter of our discontent made glorious summer by this sun of york
+        ['Now Is The Winter Of Our Discontent Made Glorious Summer By This Sun Of York']
+    """
+    def pa(s,l,t):
+        return [func(tokn, *args) for tokn in t]
+
+    try:
+        func_name = getattr(func, '__name__', 
+                            getattr(func, '__class__').__name__)
+    except Exception:
+        func_name = str(func)
+    pa.__name__ = func_name
+
+    return pa
+
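# Outside a grammar, the parse action built by tokenMap can be called directly
# with a token list (the s and l arguments are unused by the mapping itself);
# this sketch reproduces the core of the helper without the name bookkeeping:

```python
def tokenMap(func, *args):
    # build a parse action that applies func (plus any extra args) to each token
    def pa(s, l, t):
        return [func(tok, *args) for tok in t]
    return pa

hex_to_int = tokenMap(int, 16)
print(hex_to_int("", 0, ["00", "ff", "0a"]))  # [0, 255, 10]

upper = tokenMap(str.upper)
print(upper("", 0, ["my", "kingdom"]))        # ['MY', 'KINGDOM']
```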
+upcaseTokens = tokenMap(lambda t: _ustr(t).upper())
+"""(Deprecated) Helper parse action to convert tokens to upper case. Deprecated in favor of L{pyparsing_common.upcaseTokens}"""
+
+downcaseTokens = tokenMap(lambda t: _ustr(t).lower())
+"""(Deprecated) Helper parse action to convert tokens to lower case. Deprecated in favor of L{pyparsing_common.downcaseTokens}"""
+    
+def _makeTags(tagStr, xml):
+    """Internal helper to construct opening and closing tag expressions, given a tag name"""
+    if isinstance(tagStr,basestring):
+        resname = tagStr
+        tagStr = Keyword(tagStr, caseless=not xml)
+    else:
+        resname = tagStr.name
+
+    tagAttrName = Word(alphas,alphanums+"_-:")
+    if (xml):
+        tagAttrValue = dblQuotedString.copy().setParseAction( removeQuotes )
+        openTag = Suppress("<") + tagStr("tag") + \
+                Dict(ZeroOrMore(Group( tagAttrName + Suppress("=") + tagAttrValue ))) + \
+                Optional("/",default=[False]).setResultsName("empty").setParseAction(lambda s,l,t:t[0]=='/') + Suppress(">")
+    else:
+        printablesLessRAbrack = "".join(c for c in printables if c not in ">")
+        tagAttrValue = quotedString.copy().setParseAction( removeQuotes ) | Word(printablesLessRAbrack)
+        openTag = Suppress("<") + tagStr("tag") + \
+                Dict(ZeroOrMore(Group( tagAttrName.setParseAction(downcaseTokens) + \
+                Optional( Suppress("=") + tagAttrValue ) ))) + \
+                Optional("/",default=[False]).setResultsName("empty").setParseAction(lambda s,l,t:t[0]=='/') + Suppress(">")
+    closeTag = Combine(_L("</") + tagStr + ">")
+
+    openTag = openTag.setResultsName("start"+"".join(resname.replace(":"," ").title().split())).setName("<%s>" % resname)
+    closeTag = closeTag.setResultsName("end"+"".join(resname.replace(":"," ").title().split())).setName("</%s>" % resname)
+    openTag.tag = resname
+    closeTag.tag = resname
+    return openTag, closeTag
+
+def makeHTMLTags(tagStr):
+    """
+    Helper to construct opening and closing tag expressions for HTML, given a tag name. Matches
+    tags in either upper or lower case, attributes with namespaces and with quoted or unquoted values.
+
+    Example::
+        text = '<td>More info at the <a href="http://pyparsing.wikispaces.com">pyparsing</a> wiki page</td>'
+        # makeHTMLTags returns pyparsing expressions for the opening and closing tags as a 2-tuple
+        a,a_end = makeHTMLTags("A")
+        link_expr = a + SkipTo(a_end)("link_text") + a_end
+        
+        for link in link_expr.searchString(text):
+            # attributes in the <A> tag (like "href" shown here) are also accessible as named results
+            print(link.link_text, '->', link.href)
+    prints::
+        pyparsing -> http://pyparsing.wikispaces.com
+    """
+    return _makeTags( tagStr, False )
+
+def makeXMLTags(tagStr):
+    """
+    Helper to construct opening and closing tag expressions for XML, given a tag name. Matches
+    tags only in the given upper/lower case.
+
+    Example: similar to L{makeHTMLTags}
+    """
+    return _makeTags( tagStr, True )
+
+def withAttribute(*args,**attrDict):
+    """
+    Helper to create a validating parse action to be used with start tags created
+    with C{L{makeXMLTags}} or C{L{makeHTMLTags}}. Use C{withAttribute} to qualify a starting tag
+    with a required attribute value, to avoid false matches on common tags such as
+    C{<TD>} or C{<DIV>
}. + + Call C{withAttribute} with a series of attribute names and values. Specify the list + of filter attributes names and values as: + - keyword arguments, as in C{(align="right")}, or + - as an explicit dict with C{**} operator, when an attribute name is also a Python + reserved word, as in C{**{"class":"Customer", "align":"right"}} + - a list of name-value tuples, as in ( ("ns1:class", "Customer"), ("ns2:align","right") ) + For attribute names with a namespace prefix, you must use the second form. Attribute + names are matched insensitive to upper/lower case. + + If just testing for C{class} (with or without a namespace), use C{L{withClass}}. + + To verify that the attribute exists, but without specifying a value, pass + C{withAttribute.ANY_VALUE} as the value. + + Example:: + html = ''' +
+ Some text +
1 4 0 1 0
+
1,3 2,3 1,1
+
this has no type
+
+ + ''' + div,div_end = makeHTMLTags("div") + + # only match div tag having a type attribute with value "grid" + div_grid = div().setParseAction(withAttribute(type="grid")) + grid_expr = div_grid + SkipTo(div | div_end)("body") + for grid_header in grid_expr.searchString(html): + print(grid_header.body) + + # construct a match with any div tag having a type attribute, regardless of the value + div_any_type = div().setParseAction(withAttribute(type=withAttribute.ANY_VALUE)) + div_expr = div_any_type + SkipTo(div | div_end)("body") + for div_header in div_expr.searchString(html): + print(div_header.body) + prints:: + 1 4 0 1 0 + + 1 4 0 1 0 + 1,3 2,3 1,1 + """ + if args: + attrs = args[:] + else: + attrs = attrDict.items() + attrs = [(k,v) for k,v in attrs] + def pa(s,l,tokens): + for attrName,attrValue in attrs: + if attrName not in tokens: + raise ParseException(s,l,"no matching attribute " + attrName) + if attrValue != withAttribute.ANY_VALUE and tokens[attrName] != attrValue: + raise ParseException(s,l,"attribute '%s' has value '%s', must be '%s'" % + (attrName, tokens[attrName], attrValue)) + return pa +withAttribute.ANY_VALUE = object() + +def withClass(classname, namespace=''): + """ + Simplified version of C{L{withAttribute}} when matching on a div class - made + difficult because C{class} is a reserved word in Python. + + Example:: + html = ''' +
+ Some text +
1 4 0 1 0
+
1,3 2,3 1,1
+
this <div> has no class
+
+ + ''' + div,div_end = makeHTMLTags("div") + div_grid = div().setParseAction(withClass("grid")) + + grid_expr = div_grid + SkipTo(div | div_end)("body") + for grid_header in grid_expr.searchString(html): + print(grid_header.body) + + div_any_type = div().setParseAction(withClass(withAttribute.ANY_VALUE)) + div_expr = div_any_type + SkipTo(div | div_end)("body") + for div_header in div_expr.searchString(html): + print(div_header.body) + prints:: + 1 4 0 1 0 + + 1 4 0 1 0 + 1,3 2,3 1,1 + """ + classattr = "%s:class" % namespace if namespace else "class" + return withAttribute(**{classattr : classname}) + +opAssoc = _Constants() +opAssoc.LEFT = object() +opAssoc.RIGHT = object() + +def infixNotation( baseExpr, opList, lpar=Suppress('('), rpar=Suppress(')') ): + """ + Helper method for constructing grammars of expressions made up of + operators working in a precedence hierarchy. Operators may be unary or + binary, left- or right-associative. Parse actions can also be attached + to operator expressions. The generated parser will also recognize the use + of parentheses to override operator precedences (see example below). + + Note: if you define a deep operator list, you may see performance issues + when using infixNotation. See L{ParserElement.enablePackrat} for a + mechanism to potentially improve your parser performance. 
+ + Parameters: + - baseExpr - expression representing the most basic element for the nested + - opList - list of tuples, one for each operator precedence level in the + expression grammar; each tuple is of the form + (opExpr, numTerms, rightLeftAssoc, parseAction), where: + - opExpr is the pyparsing expression for the operator; + may also be a string, which will be converted to a Literal; + if numTerms is 3, opExpr is a tuple of two expressions, for the + two operators separating the 3 terms + - numTerms is the number of terms for this operator (must + be 1, 2, or 3) + - rightLeftAssoc is the indicator whether the operator is + right or left associative, using the pyparsing-defined + constants C{opAssoc.RIGHT} and C{opAssoc.LEFT}. + - parseAction is the parse action to be associated with + expressions matching this operator expression (the + parse action tuple member may be omitted); if the parse action + is passed a tuple or list of functions, this is equivalent to + calling C{setParseAction(*fn)} (L{ParserElement.setParseAction}) + - lpar - expression for matching left-parentheses (default=C{Suppress('(')}) + - rpar - expression for matching right-parentheses (default=C{Suppress(')')}) + + Example:: + # simple example of four-function arithmetic with ints and variable names + integer = pyparsing_common.signed_integer + varname = pyparsing_common.identifier + + arith_expr = infixNotation(integer | varname, + [ + ('-', 1, opAssoc.RIGHT), + (oneOf('* /'), 2, opAssoc.LEFT), + (oneOf('+ -'), 2, opAssoc.LEFT), + ]) + + arith_expr.runTests(''' + 5+3*6 + (5+3)*6 + -2--11 + ''', fullDump=False) + prints:: + 5+3*6 + [[5, '+', [3, '*', 6]]] + + (5+3)*6 + [[[5, '+', 3], '*', 6]] + + -2--11 + [[['-', 2], '-', ['-', 11]]] + """ + ret = Forward() + lastExpr = baseExpr | ( lpar + ret + rpar ) + for i,operDef in enumerate(opList): + opExpr,arity,rightLeftAssoc,pa = (operDef + (None,))[:4] + termName = "%s term" % opExpr if arity < 3 else "%s%s term" % opExpr + if arity == 3: + 
if opExpr is None or len(opExpr) != 2: + raise ValueError("if numterms=3, opExpr must be a tuple or list of two expressions") + opExpr1, opExpr2 = opExpr + thisExpr = Forward().setName(termName) + if rightLeftAssoc == opAssoc.LEFT: + if arity == 1: + matchExpr = FollowedBy(lastExpr + opExpr) + Group( lastExpr + OneOrMore( opExpr ) ) + elif arity == 2: + if opExpr is not None: + matchExpr = FollowedBy(lastExpr + opExpr + lastExpr) + Group( lastExpr + OneOrMore( opExpr + lastExpr ) ) + else: + matchExpr = FollowedBy(lastExpr+lastExpr) + Group( lastExpr + OneOrMore(lastExpr) ) + elif arity == 3: + matchExpr = FollowedBy(lastExpr + opExpr1 + lastExpr + opExpr2 + lastExpr) + \ + Group( lastExpr + opExpr1 + lastExpr + opExpr2 + lastExpr ) + else: + raise ValueError("operator must be unary (1), binary (2), or ternary (3)") + elif rightLeftAssoc == opAssoc.RIGHT: + if arity == 1: + # try to avoid LR with this extra test + if not isinstance(opExpr, Optional): + opExpr = Optional(opExpr) + matchExpr = FollowedBy(opExpr.expr + thisExpr) + Group( opExpr + thisExpr ) + elif arity == 2: + if opExpr is not None: + matchExpr = FollowedBy(lastExpr + opExpr + thisExpr) + Group( lastExpr + OneOrMore( opExpr + thisExpr ) ) + else: + matchExpr = FollowedBy(lastExpr + thisExpr) + Group( lastExpr + OneOrMore( thisExpr ) ) + elif arity == 3: + matchExpr = FollowedBy(lastExpr + opExpr1 + thisExpr + opExpr2 + thisExpr) + \ + Group( lastExpr + opExpr1 + thisExpr + opExpr2 + thisExpr ) + else: + raise ValueError("operator must be unary (1), binary (2), or ternary (3)") + else: + raise ValueError("operator must indicate right or left associativity") + if pa: + if isinstance(pa, (tuple, list)): + matchExpr.setParseAction(*pa) + else: + matchExpr.setParseAction(pa) + thisExpr <<= ( matchExpr.setName(termName) | lastExpr ) + lastExpr = thisExpr + ret <<= lastExpr + return ret + +operatorPrecedence = infixNotation +"""(Deprecated) Former name of C{L{infixNotation}}, will be dropped in a future 
release.""" + +dblQuotedString = Combine(Regex(r'"(?:[^"\n\r\\]|(?:"")|(?:\\(?:[^x]|x[0-9a-fA-F]+)))*')+'"').setName("string enclosed in double quotes") +sglQuotedString = Combine(Regex(r"'(?:[^'\n\r\\]|(?:'')|(?:\\(?:[^x]|x[0-9a-fA-F]+)))*")+"'").setName("string enclosed in single quotes") +quotedString = Combine(Regex(r'"(?:[^"\n\r\\]|(?:"")|(?:\\(?:[^x]|x[0-9a-fA-F]+)))*')+'"'| + Regex(r"'(?:[^'\n\r\\]|(?:'')|(?:\\(?:[^x]|x[0-9a-fA-F]+)))*")+"'").setName("quotedString using single or double quotes") +unicodeString = Combine(_L('u') + quotedString.copy()).setName("unicode string literal") + +def nestedExpr(opener="(", closer=")", content=None, ignoreExpr=quotedString.copy()): + """ + Helper method for defining nested lists enclosed in opening and closing + delimiters ("(" and ")" are the default). + + Parameters: + - opener - opening character for a nested list (default=C{"("}); can also be a pyparsing expression + - closer - closing character for a nested list (default=C{")"}); can also be a pyparsing expression + - content - expression for items within the nested lists (default=C{None}) + - ignoreExpr - expression for ignoring opening and closing delimiters (default=C{quotedString}) + + If an expression is not provided for the content argument, the nested + expression will capture all whitespace-delimited content between delimiters + as a list of separate values. + + Use the C{ignoreExpr} argument to define expressions that may contain + opening or closing characters that should not be treated as opening + or closing characters for nesting, such as quotedString or a comment + expression. Specify multiple expressions using an C{L{Or}} or C{L{MatchFirst}}. + The default is L{quotedString}, but if no expressions are to be ignored, + then pass C{None} for this argument. 
+ + Example:: + data_type = oneOf("void int short long char float double") + decl_data_type = Combine(data_type + Optional(Word('*'))) + ident = Word(alphas+'_', alphanums+'_') + number = pyparsing_common.number + arg = Group(decl_data_type + ident) + LPAR,RPAR = map(Suppress, "()") + + code_body = nestedExpr('{', '}', ignoreExpr=(quotedString | cStyleComment)) + + c_function = (decl_data_type("type") + + ident("name") + + LPAR + Optional(delimitedList(arg), [])("args") + RPAR + + code_body("body")) + c_function.ignore(cStyleComment) + + source_code = ''' + int is_odd(int x) { + return (x%2); + } + + int dec_to_hex(char hchar) { + if (hchar >= '0' && hchar <= '9') { + return (ord(hchar)-ord('0')); + } else { + return (10+ord(hchar)-ord('A')); + } + } + ''' + for func in c_function.searchString(source_code): + print("%(name)s (%(type)s) args: %(args)s" % func) + + prints:: + is_odd (int) args: [['int', 'x']] + dec_to_hex (int) args: [['char', 'hchar']] + """ + if opener == closer: + raise ValueError("opening and closing strings cannot be the same") + if content is None: + if isinstance(opener,basestring) and isinstance(closer,basestring): + if len(opener) == 1 and len(closer)==1: + if ignoreExpr is not None: + content = (Combine(OneOrMore(~ignoreExpr + + CharsNotIn(opener+closer+ParserElement.DEFAULT_WHITE_CHARS,exact=1)) + ).setParseAction(lambda t:t[0].strip())) + else: + content = (empty.copy()+CharsNotIn(opener+closer+ParserElement.DEFAULT_WHITE_CHARS + ).setParseAction(lambda t:t[0].strip())) + else: + if ignoreExpr is not None: + content = (Combine(OneOrMore(~ignoreExpr + + ~Literal(opener) + ~Literal(closer) + + CharsNotIn(ParserElement.DEFAULT_WHITE_CHARS,exact=1)) + ).setParseAction(lambda t:t[0].strip())) + else: + content = (Combine(OneOrMore(~Literal(opener) + ~Literal(closer) + + CharsNotIn(ParserElement.DEFAULT_WHITE_CHARS,exact=1)) + ).setParseAction(lambda t:t[0].strip())) + else: + raise ValueError("opening and closing arguments must be strings if 
no content expression is given") + ret = Forward() + if ignoreExpr is not None: + ret <<= Group( Suppress(opener) + ZeroOrMore( ignoreExpr | ret | content ) + Suppress(closer) ) + else: + ret <<= Group( Suppress(opener) + ZeroOrMore( ret | content ) + Suppress(closer) ) + ret.setName('nested %s%s expression' % (opener,closer)) + return ret + +def indentedBlock(blockStatementExpr, indentStack, indent=True): + """ + Helper method for defining space-delimited indentation blocks, such as + those used to define block statements in Python source code. + + Parameters: + - blockStatementExpr - expression defining syntax of statement that + is repeated within the indented block + - indentStack - list created by caller to manage indentation stack + (multiple statementWithIndentedBlock expressions within a single grammar + should share a common indentStack) + - indent - boolean indicating whether block must be indented beyond the + the current level; set to False for block of left-most statements + (default=C{True}) + + A valid block must contain at least one C{blockStatement}. 
+ + Example:: + data = ''' + def A(z): + A1 + B = 100 + G = A2 + A2 + A3 + B + def BB(a,b,c): + BB1 + def BBA(): + bba1 + bba2 + bba3 + C + D + def spam(x,y): + def eggs(z): + pass + ''' + + + indentStack = [1] + stmt = Forward() + + identifier = Word(alphas, alphanums) + funcDecl = ("def" + identifier + Group( "(" + Optional( delimitedList(identifier) ) + ")" ) + ":") + func_body = indentedBlock(stmt, indentStack) + funcDef = Group( funcDecl + func_body ) + + rvalue = Forward() + funcCall = Group(identifier + "(" + Optional(delimitedList(rvalue)) + ")") + rvalue << (funcCall | identifier | Word(nums)) + assignment = Group(identifier + "=" + rvalue) + stmt << ( funcDef | assignment | identifier ) + + module_body = OneOrMore(stmt) + + parseTree = module_body.parseString(data) + parseTree.pprint() + prints:: + [['def', + 'A', + ['(', 'z', ')'], + ':', + [['A1'], [['B', '=', '100']], [['G', '=', 'A2']], ['A2'], ['A3']]], + 'B', + ['def', + 'BB', + ['(', 'a', 'b', 'c', ')'], + ':', + [['BB1'], [['def', 'BBA', ['(', ')'], ':', [['bba1'], ['bba2'], ['bba3']]]]]], + 'C', + 'D', + ['def', + 'spam', + ['(', 'x', 'y', ')'], + ':', + [[['def', 'eggs', ['(', 'z', ')'], ':', [['pass']]]]]]] + """ + def checkPeerIndent(s,l,t): + if l >= len(s): return + curCol = col(l,s) + if curCol != indentStack[-1]: + if curCol > indentStack[-1]: + raise ParseFatalException(s,l,"illegal nesting") + raise ParseException(s,l,"not a peer entry") + + def checkSubIndent(s,l,t): + curCol = col(l,s) + if curCol > indentStack[-1]: + indentStack.append( curCol ) + else: + raise ParseException(s,l,"not a subentry") + + def checkUnindent(s,l,t): + if l >= len(s): return + curCol = col(l,s) + if not(indentStack and curCol < indentStack[-1] and curCol <= indentStack[-2]): + raise ParseException(s,l,"not an unindent") + indentStack.pop() + + NL = OneOrMore(LineEnd().setWhitespaceChars("\t ").suppress()) + INDENT = (Empty() + Empty().setParseAction(checkSubIndent)).setName('INDENT') + PEER = 
Empty().setParseAction(checkPeerIndent).setName('') + UNDENT = Empty().setParseAction(checkUnindent).setName('UNINDENT') + if indent: + smExpr = Group( Optional(NL) + + #~ FollowedBy(blockStatementExpr) + + INDENT + (OneOrMore( PEER + Group(blockStatementExpr) + Optional(NL) )) + UNDENT) + else: + smExpr = Group( Optional(NL) + + (OneOrMore( PEER + Group(blockStatementExpr) + Optional(NL) )) ) + blockStatementExpr.ignore(_bslash + LineEnd()) + return smExpr.setName('indented block') + +alphas8bit = srange(r"[\0xc0-\0xd6\0xd8-\0xf6\0xf8-\0xff]") +punc8bit = srange(r"[\0xa1-\0xbf\0xd7\0xf7]") + +anyOpenTag,anyCloseTag = makeHTMLTags(Word(alphas,alphanums+"_:").setName('any tag')) +_htmlEntityMap = dict(zip("gt lt amp nbsp quot apos".split(),'><& "\'')) +commonHTMLEntity = Regex('&(?P<entity>' + '|'.join(_htmlEntityMap.keys()) +");").setName("common HTML entity") +def replaceHTMLEntity(t): + """Helper parser action to replace common HTML entities with their special characters""" + return _htmlEntityMap.get(t.entity) + +# it's easy to get these comment structures wrong - they're very common, so may as well make them available +cStyleComment = Combine(Regex(r"/\*(?:[^*]|\*(?!/))*") + '*/').setName("C style comment") +"Comment of the form C{/* ... */}" + +htmlComment = Regex(r"<!--[\s\S]*?-->").setName("HTML comment") +"Comment of the form C{<!-- ... -->}" + +restOfLine = Regex(r".*").leaveWhitespace().setName("rest of line") +dblSlashComment = Regex(r"//(?:\\\n|[^\n])*").setName("// comment") +"Comment of the form C{// ... (to end of line)}" + +cppStyleComment = Combine(Regex(r"/\*(?:[^*]|\*(?!/))*") + '*/'| dblSlashComment).setName("C++ style comment") +"Comment of either form C{L{cStyleComment}} or C{L{dblSlashComment}}" + +javaStyleComment = cppStyleComment +"Same as C{L{cppStyleComment}}" + +pythonStyleComment = Regex(r"#.*").setName("Python style comment") +"Comment of the form C{# ...
(to end of line)}" + +_commasepitem = Combine(OneOrMore(Word(printables, excludeChars=',') + + Optional( Word(" \t") + + ~Literal(",") + ~LineEnd() ) ) ).streamline().setName("commaItem") +commaSeparatedList = delimitedList( Optional( quotedString.copy() | _commasepitem, default="") ).setName("commaSeparatedList") +"""(Deprecated) Predefined expression of 1 or more printable words or quoted strings, separated by commas. + This expression is deprecated in favor of L{pyparsing_common.comma_separated_list}.""" + +# some other useful expressions - using lower-case class name since we are really using this as a namespace +class pyparsing_common: + """ + Here are some common low-level expressions that may be useful in jump-starting parser development: + - numeric forms (L{integers}, L{reals}, L{scientific notation}) + - common L{programming identifiers} + - network addresses (L{MAC}, L{IPv4}, L{IPv6}) + - ISO8601 L{dates} and L{datetime} + - L{UUID} + - L{comma-separated list} + Parse actions: + - C{L{convertToInteger}} + - C{L{convertToFloat}} + - C{L{convertToDate}} + - C{L{convertToDatetime}} + - C{L{stripHTMLTags}} + - C{L{upcaseTokens}} + - C{L{downcaseTokens}} + + Example:: + pyparsing_common.number.runTests(''' + # any int or real number, returned as the appropriate type + 100 + -100 + +100 + 3.14159 + 6.02e23 + 1e-12 + ''') + + pyparsing_common.fnumber.runTests(''' + # any int or real number, returned as float + 100 + -100 + +100 + 3.14159 + 6.02e23 + 1e-12 + ''') + + pyparsing_common.hex_integer.runTests(''' + # hex numbers + 100 + FF + ''') + + pyparsing_common.fraction.runTests(''' + # fractions + 1/2 + -3/4 + ''') + + pyparsing_common.mixed_integer.runTests(''' + # mixed fractions + 1 + 1/2 + -3/4 + 1-3/4 + ''') + + import uuid + pyparsing_common.uuid.setParseAction(tokenMap(uuid.UUID)) + pyparsing_common.uuid.runTests(''' + # uuid + 12345678-1234-5678-1234-567812345678 + ''') + prints:: + # any int or real number, returned as the appropriate type + 100 + 
[100] + + -100 + [-100] + + +100 + [100] + + 3.14159 + [3.14159] + + 6.02e23 + [6.02e+23] + + 1e-12 + [1e-12] + + # any int or real number, returned as float + 100 + [100.0] + + -100 + [-100.0] + + +100 + [100.0] + + 3.14159 + [3.14159] + + 6.02e23 + [6.02e+23] + + 1e-12 + [1e-12] + + # hex numbers + 100 + [256] + + FF + [255] + + # fractions + 1/2 + [0.5] + + -3/4 + [-0.75] + + # mixed fractions + 1 + [1] + + 1/2 + [0.5] + + -3/4 + [-0.75] + + 1-3/4 + [1.75] + + # uuid + 12345678-1234-5678-1234-567812345678 + [UUID('12345678-1234-5678-1234-567812345678')] + """ + + convertToInteger = tokenMap(int) + """ + Parse action for converting parsed integers to Python int + """ + + convertToFloat = tokenMap(float) + """ + Parse action for converting parsed numbers to Python float + """ + + integer = Word(nums).setName("integer").setParseAction(convertToInteger) + """expression that parses an unsigned integer, returns an int""" + + hex_integer = Word(hexnums).setName("hex integer").setParseAction(tokenMap(int,16)) + """expression that parses a hexadecimal integer, returns an int""" + + signed_integer = Regex(r'[+-]?\d+').setName("signed integer").setParseAction(convertToInteger) + """expression that parses an integer with optional leading sign, returns an int""" + + fraction = (signed_integer().setParseAction(convertToFloat) + '/' + signed_integer().setParseAction(convertToFloat)).setName("fraction") + """fractional expression of an integer divided by an integer, returns a float""" + fraction.addParseAction(lambda t: t[0]/t[-1]) + + mixed_integer = (fraction | signed_integer + Optional(Optional('-').suppress() + fraction)).setName("fraction or mixed integer-fraction") + """mixed integer of the form 'integer - fraction', with optional leading integer, returns float""" + mixed_integer.addParseAction(sum) + + real = Regex(r'[+-]?\d+\.\d*').setName("real number").setParseAction(convertToFloat) + """expression that parses a floating point number and returns a float""" + + 
sci_real = Regex(r'[+-]?\d+([eE][+-]?\d+|\.\d*([eE][+-]?\d+)?)').setName("real number with scientific notation").setParseAction(convertToFloat) + """expression that parses a floating point number with optional scientific notation and returns a float""" + + # streamlining this expression makes the docs nicer-looking + number = (sci_real | real | signed_integer).streamline() + """any numeric expression, returns the corresponding Python type""" + + fnumber = Regex(r'[+-]?\d+\.?\d*([eE][+-]?\d+)?').setName("fnumber").setParseAction(convertToFloat) + """any int or real number, returned as float""" + + identifier = Word(alphas+'_', alphanums+'_').setName("identifier") + """typical code identifier (leading alpha or '_', followed by 0 or more alphas, nums, or '_')""" + + ipv4_address = Regex(r'(25[0-5]|2[0-4][0-9]|1?[0-9]{1,2})(\.(25[0-5]|2[0-4][0-9]|1?[0-9]{1,2})){3}').setName("IPv4 address") + "IPv4 address (C{0.0.0.0 - 255.255.255.255})" + + _ipv6_part = Regex(r'[0-9a-fA-F]{1,4}').setName("hex_integer") + _full_ipv6_address = (_ipv6_part + (':' + _ipv6_part)*7).setName("full IPv6 address") + _short_ipv6_address = (Optional(_ipv6_part + (':' + _ipv6_part)*(0,6)) + "::" + Optional(_ipv6_part + (':' + _ipv6_part)*(0,6))).setName("short IPv6 address") + _short_ipv6_address.addCondition(lambda t: sum(1 for tt in t if pyparsing_common._ipv6_part.matches(tt)) < 8) + _mixed_ipv6_address = ("::ffff:" + ipv4_address).setName("mixed IPv6 address") + ipv6_address = Combine((_full_ipv6_address | _mixed_ipv6_address | _short_ipv6_address).setName("IPv6 address")).setName("IPv6 address") + "IPv6 address (long, short, or mixed form)" + + mac_address = Regex(r'[0-9a-fA-F]{2}([:.-])[0-9a-fA-F]{2}(?:\1[0-9a-fA-F]{2}){4}').setName("MAC address") + "MAC address xx:xx:xx:xx:xx (may also have '-' or '.' 
delimiters)" + + @staticmethod + def convertToDate(fmt="%Y-%m-%d"): + """ + Helper to create a parse action for converting parsed date string to Python datetime.date + + Params - + - fmt - format to be passed to datetime.strptime (default=C{"%Y-%m-%d"}) + + Example:: + date_expr = pyparsing_common.iso8601_date.copy() + date_expr.setParseAction(pyparsing_common.convertToDate()) + print(date_expr.parseString("1999-12-31")) + prints:: + [datetime.date(1999, 12, 31)] + """ + def cvt_fn(s,l,t): + try: + return datetime.strptime(t[0], fmt).date() + except ValueError as ve: + raise ParseException(s, l, str(ve)) + return cvt_fn + + @staticmethod + def convertToDatetime(fmt="%Y-%m-%dT%H:%M:%S.%f"): + """ + Helper to create a parse action for converting parsed datetime string to Python datetime.datetime + + Params - + - fmt - format to be passed to datetime.strptime (default=C{"%Y-%m-%dT%H:%M:%S.%f"}) + + Example:: + dt_expr = pyparsing_common.iso8601_datetime.copy() + dt_expr.setParseAction(pyparsing_common.convertToDatetime()) + print(dt_expr.parseString("1999-12-31T23:59:59.999")) + prints:: + [datetime.datetime(1999, 12, 31, 23, 59, 59, 999000)] + """ + def cvt_fn(s,l,t): + try: + return datetime.strptime(t[0], fmt) + except ValueError as ve: + raise ParseException(s, l, str(ve)) + return cvt_fn + + iso8601_date = Regex(r'(?P<year>\d{4})(?:-(?P<month>\d\d)(?:-(?P<day>\d\d))?)?').setName("ISO8601 date") + "ISO8601 date (C{yyyy-mm-dd})" + + iso8601_datetime = Regex(r'(?P<year>\d{4})-(?P<month>\d\d)-(?P<day>\d\d)[T ](?P<hour>\d\d):(?P<minute>\d\d)(:(?P<second>\d\d(\.\d*)?)?)?(?P<tz>Z|[+-]\d\d:?\d\d)?').setName("ISO8601 datetime") + "ISO8601 datetime (C{yyyy-mm-ddThh:mm:ss.s(Z|+-00:00)}) - trailing seconds, milliseconds, and timezone optional; accepts separating C{'T'} or C{' '}" + + uuid = Regex(r'[0-9a-fA-F]{8}(-[0-9a-fA-F]{4}){3}-[0-9a-fA-F]{12}').setName("UUID") + "UUID (C{xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx})" + + _html_stripper = anyOpenTag.suppress() | anyCloseTag.suppress() + @staticmethod + def stripHTMLTags(s, l, tokens): +
""" + Parse action to remove HTML tags from web page HTML source + + Example:: + # strip HTML links from normal text + text = '<td>More info at the <a href="http://pyparsing.wikispaces.com">pyparsing</a> wiki page</td>'
+ td,td_end = makeHTMLTags("TD") + table_text = td + SkipTo(td_end).setParseAction(pyparsing_common.stripHTMLTags)("body") + td_end + + print(table_text.parseString(text).body) # -> 'More info at the pyparsing wiki page' + """ + return pyparsing_common._html_stripper.transformString(tokens[0]) + + _commasepitem = Combine(OneOrMore(~Literal(",") + ~LineEnd() + Word(printables, excludeChars=',') + + Optional( White(" \t") ) ) ).streamline().setName("commaItem") + comma_separated_list = delimitedList( Optional( quotedString.copy() | _commasepitem, default="") ).setName("comma separated list") + """Predefined expression of 1 or more printable words or quoted strings, separated by commas.""" + + upcaseTokens = staticmethod(tokenMap(lambda t: _ustr(t).upper())) + """Parse action to convert tokens to upper case.""" + + downcaseTokens = staticmethod(tokenMap(lambda t: _ustr(t).lower())) + """Parse action to convert tokens to lower case.""" + + +if __name__ == "__main__": + + selectToken = CaselessLiteral("select") + fromToken = CaselessLiteral("from") + + ident = Word(alphas, alphanums + "_$") + + columnName = delimitedList(ident, ".", combine=True).setParseAction(upcaseTokens) + columnNameList = Group(delimitedList(columnName)).setName("columns") + columnSpec = ('*' | columnNameList) + + tableName = delimitedList(ident, ".", combine=True).setParseAction(upcaseTokens) + tableNameList = Group(delimitedList(tableName)).setName("tables") + + simpleSQL = selectToken("command") + columnSpec("columns") + fromToken + tableNameList("tables") + + # demo runTests method, including embedded comments in test string + simpleSQL.runTests(""" + # '*' as column list and dotted table name + select * from SYS.XYZZY + + # caseless match on "SELECT", and casts back to "select" + SELECT * from XYZZY, ABC + + # list of column names, and mixed case SELECT keyword + Select AA,BB,CC from Sys.dual + + # multiple tables + Select A, B, C from Sys.dual, Table2 + + # invalid SELECT
keyword - should fail + Xelect A, B, C from Sys.dual + + # incomplete command - should fail + Select + + # invalid column name - should fail + Select ^^^ frox Sys.dual + + """) + + pyparsing_common.number.runTests(""" + 100 + -100 + +100 + 3.14159 + 6.02e23 + 1e-12 + """) + + # any int or real number, returned as float + pyparsing_common.fnumber.runTests(""" + 100 + -100 + +100 + 3.14159 + 6.02e23 + 1e-12 + """) + + pyparsing_common.hex_integer.runTests(""" + 100 + FF + """) + + import uuid + pyparsing_common.uuid.setParseAction(tokenMap(uuid.UUID)) + pyparsing_common.uuid.runTests(""" + 12345678-1234-5678-1234-567812345678 + """) diff --git a/lib/python3.4/site-packages/six.py b/lib/python3.7/site-packages/setuptools/_vendor/six.py similarity index 96% rename from lib/python3.4/site-packages/six.py rename to lib/python3.7/site-packages/setuptools/_vendor/six.py index 6bf4fd3..190c023 100644 --- a/lib/python3.4/site-packages/six.py +++ b/lib/python3.7/site-packages/setuptools/_vendor/six.py @@ -1,4 +1,6 @@ -# Copyright (c) 2010-2017 Benjamin Peterson +"""Utilities for writing code that runs on Python 2 and 3""" + +# Copyright (c) 2010-2015 Benjamin Peterson # # Permission is hereby granted, free of charge, to any person obtaining a copy # of this software and associated documentation files (the "Software"), to deal @@ -18,8 +20,6 @@ # OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE # SOFTWARE. -"""Utilities for writing code that runs on Python 2 and 3""" - from __future__ import absolute_import import functools @@ -29,7 +29,7 @@ import sys import types __author__ = "Benjamin Peterson <benjamin@python.org>" -__version__ = "1.11.0" +__version__ = "1.10.0" # Useful for very coarse version differentiation.
@@ -241,7 +241,6 @@ _moved_attributes = [ MovedAttribute("map", "itertools", "builtins", "imap", "map"), MovedAttribute("getcwd", "os", "os", "getcwdu", "getcwd"), MovedAttribute("getcwdb", "os", "os", "getcwd", "getcwdb"), - MovedAttribute("getoutput", "commands", "subprocess"), MovedAttribute("range", "__builtin__", "builtins", "xrange", "range"), MovedAttribute("reload_module", "__builtin__", "importlib" if PY34 else "imp", "reload"), MovedAttribute("reduce", "__builtin__", "functools"), @@ -263,11 +262,10 @@ _moved_attributes = [ MovedModule("html_entities", "htmlentitydefs", "html.entities"), MovedModule("html_parser", "HTMLParser", "html.parser"), MovedModule("http_client", "httplib", "http.client"), - MovedModule("email_mime_base", "email.MIMEBase", "email.mime.base"), - MovedModule("email_mime_image", "email.MIMEImage", "email.mime.image"), MovedModule("email_mime_multipart", "email.MIMEMultipart", "email.mime.multipart"), MovedModule("email_mime_nonmultipart", "email.MIMENonMultipart", "email.mime.nonmultipart"), MovedModule("email_mime_text", "email.MIMEText", "email.mime.text"), + MovedModule("email_mime_base", "email.MIMEBase", "email.mime.base"), MovedModule("BaseHTTPServer", "BaseHTTPServer", "http.server"), MovedModule("CGIHTTPServer", "CGIHTTPServer", "http.server"), MovedModule("SimpleHTTPServer", "SimpleHTTPServer", "http.server"), @@ -339,12 +337,10 @@ _urllib_parse_moved_attributes = [ MovedAttribute("quote_plus", "urllib", "urllib.parse"), MovedAttribute("unquote", "urllib", "urllib.parse"), MovedAttribute("unquote_plus", "urllib", "urllib.parse"), - MovedAttribute("unquote_to_bytes", "urllib", "urllib.parse", "unquote", "unquote_to_bytes"), MovedAttribute("urlencode", "urllib", "urllib.parse"), MovedAttribute("splitquery", "urllib", "urllib.parse"), MovedAttribute("splittag", "urllib", "urllib.parse"), MovedAttribute("splituser", "urllib", "urllib.parse"), - MovedAttribute("splitvalue", "urllib", "urllib.parse"), 
MovedAttribute("uses_fragment", "urlparse", "urllib.parse"), MovedAttribute("uses_netloc", "urlparse", "urllib.parse"), MovedAttribute("uses_params", "urlparse", "urllib.parse"), @@ -420,8 +416,6 @@ _urllib_request_moved_attributes = [ MovedAttribute("URLopener", "urllib", "urllib.request"), MovedAttribute("FancyURLopener", "urllib", "urllib.request"), MovedAttribute("proxy_bypass", "urllib", "urllib.request"), - MovedAttribute("parse_http_list", "urllib2", "urllib.request"), - MovedAttribute("parse_keqv_list", "urllib2", "urllib.request"), ] for attr in _urllib_request_moved_attributes: setattr(Module_six_moves_urllib_request, attr.name, attr) @@ -685,15 +679,11 @@ if PY3: exec_ = getattr(moves.builtins, "exec") def reraise(tp, value, tb=None): - try: - if value is None: - value = tp() - if value.__traceback__ is not tb: - raise value.with_traceback(tb) - raise value - finally: - value = None - tb = None + if value is None: + value = tp() + if value.__traceback__ is not tb: + raise value.with_traceback(tb) + raise value else: def exec_(_code_, _globs_=None, _locs_=None): @@ -709,28 +699,19 @@ else: exec("""exec _code_ in _globs_, _locs_""") exec_("""def reraise(tp, value, tb=None): - try: - raise tp, value, tb - finally: - tb = None + raise tp, value, tb """) if sys.version_info[:2] == (3, 2): exec_("""def raise_from(value, from_value): - try: - if from_value is None: - raise value - raise value from from_value - finally: - value = None + if from_value is None: + raise value + raise value from from_value """) elif sys.version_info[:2] > (3, 2): exec_("""def raise_from(value, from_value): - try: - raise value from from_value - finally: - value = None + raise value from from_value """) else: def raise_from(value, from_value): @@ -821,14 +802,10 @@ def with_metaclass(meta, *bases): # This requires a bit of explanation: the basic idea is to make a dummy # metaclass for one level of class instantiation that replaces itself with # the actual metaclass. 
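The dummy-metaclass idea described in the comment above can be sketched outside the diff. This is a minimal reimplementation for illustration only (not six's exact code, which also handles `__prepare__` in newer versions):

```python
# Minimal sketch of the with_metaclass trick: return a throwaway class whose
# metaclass, on the single real subclass definition, discards itself and
# hands construction over to the intended metaclass with the intended bases.
def with_metaclass(meta, *bases):
    class metaclass(meta):
        def __new__(cls, name, this_bases, d):
            # Invoked when the temporary class is subclassed; substitute
            # the real metaclass and the originally requested bases.
            return meta(name, bases, d)
    return type.__new__(metaclass, 'temporary_class', (), {})

class Meta(type):
    pass

class Base(object):
    pass

# One level of instantiation goes through the dummy metaclass, after which
# C is an ordinary Meta-instance class deriving from Base.
class C(with_metaclass(Meta, Base)):
    pass

assert type(C) is Meta
assert C.__mro__[1] is Base
```

The point of the indirection is that the `class C(...)` statement itself triggers the swap, so the same source line works on both Python 2 and 3 despite their different metaclass syntaxes.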
- class metaclass(type): + class metaclass(meta): def __new__(cls, name, this_bases, d): return meta(name, bases, d) - - @classmethod - def __prepare__(cls, name, this_bases): - return meta.__prepare__(name, bases) return type.__new__(metaclass, 'temporary_class', (), {}) diff --git a/lib/python3.4/site-packages/setuptools/archive_util.py b/lib/python3.7/site-packages/setuptools/archive_util.py similarity index 98% rename from lib/python3.4/site-packages/setuptools/archive_util.py rename to lib/python3.7/site-packages/setuptools/archive_util.py index cc82b3d..8143604 100644 --- a/lib/python3.4/site-packages/setuptools/archive_util.py +++ b/lib/python3.7/site-packages/setuptools/archive_util.py @@ -8,7 +8,7 @@ import posixpath import contextlib from distutils.errors import DistutilsError -from pkg_resources import ensure_directory, ContextualZipFile +from pkg_resources import ensure_directory __all__ = [ "unpack_archive", "unpack_zipfile", "unpack_tarfile", "default_filter", @@ -98,7 +98,7 @@ def unpack_zipfile(filename, extract_dir, progress_filter=default_filter): if not zipfile.is_zipfile(filename): raise UnrecognizedFormat("%s is not a zip file" % (filename,)) - with ContextualZipFile(filename) as z: + with zipfile.ZipFile(filename) as z: for info in z.infolist(): name = info.filename diff --git a/lib/python3.4/site-packages/setuptools/build_meta.py b/lib/python3.7/site-packages/setuptools/build_meta.py similarity index 68% rename from lib/python3.4/site-packages/setuptools/build_meta.py rename to lib/python3.7/site-packages/setuptools/build_meta.py index 54f2987..0067a7a 100644 --- a/lib/python3.4/site-packages/setuptools/build_meta.py +++ b/lib/python3.7/site-packages/setuptools/build_meta.py @@ -61,14 +61,28 @@ class Distribution(setuptools.dist.Distribution): distutils.core.Distribution = orig +def _to_str(s): + """ + Convert a filename to a string (on Python 2, explicitly + a byte string, not Unicode) as distutils checks for the + exact type str. 
+ """ + if sys.version_info[0] == 2 and not isinstance(s, str): + # Assume it's Unicode, as that's what the PEP says + # should be provided. + return s.encode(sys.getfilesystemencoding()) + return s + + def _run_setup(setup_script='setup.py'): # Note that we can reuse our build directory between calls # Correctness comes first, then optimization later __file__ = setup_script + __name__ = '__main__' f = getattr(tokenize, 'open', open)(__file__) code = f.read().replace('\\r\\n', '\\n') f.close() - exec(compile(code, __file__, 'exec')) + exec(compile(code, __file__, 'exec'), locals()) def _fix_config(config_settings): @@ -77,9 +91,8 @@ def _fix_config(config_settings): return config_settings -def _get_build_requires(config_settings): +def _get_build_requires(config_settings, requirements): config_settings = _fix_config(config_settings) - requirements = ['setuptools', 'wheel'] sys.argv = sys.argv[:1] + ['egg_info'] + \ config_settings["--global-option"] @@ -92,24 +105,47 @@ def _get_build_requires(config_settings): return requirements +def _get_immediate_subdirectories(a_dir): + return [name for name in os.listdir(a_dir) + if os.path.isdir(os.path.join(a_dir, name))] + + def get_requires_for_build_wheel(config_settings=None): config_settings = _fix_config(config_settings) - return _get_build_requires(config_settings) + return _get_build_requires(config_settings, requirements=['setuptools', 'wheel']) def get_requires_for_build_sdist(config_settings=None): config_settings = _fix_config(config_settings) - return _get_build_requires(config_settings) + return _get_build_requires(config_settings, requirements=['setuptools']) def prepare_metadata_for_build_wheel(metadata_directory, config_settings=None): - sys.argv = sys.argv[:1] + ['dist_info', '--egg-base', metadata_directory] + sys.argv = sys.argv[:1] + ['dist_info', '--egg-base', _to_str(metadata_directory)] _run_setup() - dist_infos = [f for f in os.listdir(metadata_directory) - if f.endswith('.dist-info')] + 
dist_info_directory = metadata_directory + while True: + dist_infos = [f for f in os.listdir(dist_info_directory) + if f.endswith('.dist-info')] + + if len(dist_infos) == 0 and \ + len(_get_immediate_subdirectories(dist_info_directory)) == 1: + dist_info_directory = os.path.join( + dist_info_directory, os.listdir(dist_info_directory)[0]) + continue + + assert len(dist_infos) == 1 + break + + # PEP 517 requires that the .dist-info directory be placed in the + # metadata_directory. To comply, we MUST copy the directory to the root + if dist_info_directory != metadata_directory: + shutil.move( + os.path.join(dist_info_directory, dist_infos[0]), + metadata_directory) + shutil.rmtree(dist_info_directory, ignore_errors=True) - assert len(dist_infos) == 1 return dist_infos[0] @@ -135,11 +171,9 @@ def build_sdist(sdist_directory, config_settings=None): config_settings = _fix_config(config_settings) sdist_directory = os.path.abspath(sdist_directory) sys.argv = sys.argv[:1] + ['sdist'] + \ - config_settings["--global-option"] + config_settings["--global-option"] + \ + ["--dist-dir", sdist_directory] _run_setup() - if sdist_directory != 'dist': - shutil.rmtree(sdist_directory) - shutil.copytree('dist', sdist_directory) sdists = [f for f in os.listdir(sdist_directory) if f.endswith('.tar.gz')] diff --git a/lib/python3.4/site-packages/setuptools/cli-32.exe b/lib/python3.7/site-packages/setuptools/cli-32.exe similarity index 100% rename from lib/python3.4/site-packages/setuptools/cli-32.exe rename to lib/python3.7/site-packages/setuptools/cli-32.exe diff --git a/lib/python3.4/site-packages/setuptools/cli-64.exe b/lib/python3.7/site-packages/setuptools/cli-64.exe similarity index 100% rename from lib/python3.4/site-packages/setuptools/cli-64.exe rename to lib/python3.7/site-packages/setuptools/cli-64.exe diff --git a/lib/python3.4/site-packages/setuptools/cli.exe b/lib/python3.7/site-packages/setuptools/cli.exe similarity index 100% rename from 
lib/python3.4/site-packages/setuptools/cli.exe rename to lib/python3.7/site-packages/setuptools/cli.exe diff --git a/lib/python3.4/site-packages/setuptools/command/__init__.py b/lib/python3.7/site-packages/setuptools/command/__init__.py similarity index 95% rename from lib/python3.4/site-packages/setuptools/command/__init__.py rename to lib/python3.7/site-packages/setuptools/command/__init__.py index 4fe3bb5..fe619e2 100644 --- a/lib/python3.4/site-packages/setuptools/command/__init__.py +++ b/lib/python3.7/site-packages/setuptools/command/__init__.py @@ -2,7 +2,8 @@ __all__ = [ 'alias', 'bdist_egg', 'bdist_rpm', 'build_ext', 'build_py', 'develop', 'easy_install', 'egg_info', 'install', 'install_lib', 'rotate', 'saveopts', 'sdist', 'setopt', 'test', 'install_egg_info', 'install_scripts', - 'register', 'bdist_wininst', 'upload_docs', 'upload', 'build_clib', 'dist_info', + 'register', 'bdist_wininst', 'upload_docs', 'upload', 'build_clib', + 'dist_info', ] from distutils.command.bdist import bdist diff --git a/lib/python3.4/site-packages/setuptools/command/alias.py b/lib/python3.7/site-packages/setuptools/command/alias.py similarity index 100% rename from lib/python3.4/site-packages/setuptools/command/alias.py rename to lib/python3.7/site-packages/setuptools/command/alias.py diff --git a/lib/python3.4/site-packages/setuptools/command/bdist_egg.py b/lib/python3.7/site-packages/setuptools/command/bdist_egg.py similarity index 95% rename from lib/python3.4/site-packages/setuptools/command/bdist_egg.py rename to lib/python3.7/site-packages/setuptools/command/bdist_egg.py index 51755d5..9f8df91 100644 --- a/lib/python3.4/site-packages/setuptools/command/bdist_egg.py +++ b/lib/python3.7/site-packages/setuptools/command/bdist_egg.py @@ -8,6 +8,7 @@ from distutils import log from types import CodeType import sys import os +import re import textwrap import marshal @@ -38,6 +39,7 @@ def strip_module(filename): filename = filename[:-6] return filename + def sorted_walk(dir): 
"""Do os.walk in a reproducible way, independent of indeterministic filesystem readdir order @@ -47,6 +49,7 @@ def sorted_walk(dir): files.sort() yield base, dirs, files + def write_stub(resource, pyfile): _stub_template = textwrap.dedent(""" def __bootstrap__(): @@ -240,11 +243,28 @@ class bdist_egg(Command): log.info("Removing .py files from temporary directory") for base, dirs, files in walk_egg(self.bdist_dir): for name in files: + path = os.path.join(base, name) + if name.endswith('.py'): - path = os.path.join(base, name) log.debug("Deleting %s", path) os.unlink(path) + if base.endswith('__pycache__'): + path_old = path + + pattern = r'(?P<name>.+)\.(?P<magic>[^.]+)\.pyc' + m = re.match(pattern, name) + path_new = os.path.join( + base, os.pardir, m.group('name') + '.pyc') + log.info( + "Renaming file from [%s] to [%s]" + % (path_old, path_new)) + try: + os.remove(path_new) + except OSError: + pass + os.rename(path_old, path_new) + def zip_safe(self): safe = getattr(self.distribution, 'zip_safe', None) if safe is not None: @@ -391,10 +411,12 @@ def scan_module(egg_dir, base, name, stubs): return True # Extension module pkg = base[len(egg_dir) + 1:].replace(os.sep, '.') module = pkg + (pkg and '.' or '') + os.path.splitext(name)[0] - if sys.version_info < (3, 3): + if six.PY2: skip = 8 # skip magic & date - else: + elif sys.version_info < (3, 7): skip = 12 # skip magic & date & file size + else: + skip = 16 # skip magic & reserved?
& date & file size f = open(filename, 'rb') f.read(skip) code = marshal.load(f) diff --git a/lib/python3.4/site-packages/setuptools/command/bdist_rpm.py b/lib/python3.7/site-packages/setuptools/command/bdist_rpm.py similarity index 100% rename from lib/python3.4/site-packages/setuptools/command/bdist_rpm.py rename to lib/python3.7/site-packages/setuptools/command/bdist_rpm.py diff --git a/lib/python3.4/site-packages/setuptools/command/bdist_wininst.py b/lib/python3.7/site-packages/setuptools/command/bdist_wininst.py similarity index 100% rename from lib/python3.4/site-packages/setuptools/command/bdist_wininst.py rename to lib/python3.7/site-packages/setuptools/command/bdist_wininst.py diff --git a/lib/python3.4/site-packages/setuptools/command/build_clib.py b/lib/python3.7/site-packages/setuptools/command/build_clib.py similarity index 100% rename from lib/python3.4/site-packages/setuptools/command/build_clib.py rename to lib/python3.7/site-packages/setuptools/command/build_clib.py diff --git a/lib/python3.4/site-packages/setuptools/command/build_ext.py b/lib/python3.7/site-packages/setuptools/command/build_ext.py similarity index 97% rename from lib/python3.4/site-packages/setuptools/command/build_ext.py rename to lib/python3.7/site-packages/setuptools/command/build_ext.py index 36f53f0..60a8a32 100644 --- a/lib/python3.4/site-packages/setuptools/command/build_ext.py +++ b/lib/python3.7/site-packages/setuptools/command/build_ext.py @@ -15,6 +15,9 @@ from setuptools.extern import six try: # Attempt to use Cython for building extensions, if available from Cython.Distutils.build_ext import build_ext as _build_ext + # Additionally, assert that the compiler module will load + # also. Ref #1229. 
+ __import__('Cython.Compiler.Main') except ImportError: _build_ext = _du_build_ext @@ -109,7 +112,7 @@ class build_ext(_build_ext): and get_abi3_suffix() ) if use_abi3: - so_ext = _get_config_var_837('EXT_SUFFIX') + so_ext = get_config_var('EXT_SUFFIX') filename = filename[:-len(so_ext)] filename = filename + get_abi3_suffix() if isinstance(ext, Library): @@ -316,13 +319,3 @@ else: self.create_static_lib( objects, basename, output_dir, debug, target_lang ) - - -def _get_config_var_837(name): - """ - In https://github.com/pypa/setuptools/pull/837, we discovered - Python 3.3.0 exposes the extension suffix under the name 'SO'. - """ - if sys.version_info < (3, 3, 1): - name = 'SO' - return get_config_var(name) diff --git a/lib/python3.4/site-packages/setuptools/command/build_py.py b/lib/python3.7/site-packages/setuptools/command/build_py.py similarity index 100% rename from lib/python3.4/site-packages/setuptools/command/build_py.py rename to lib/python3.7/site-packages/setuptools/command/build_py.py diff --git a/lib/python3.4/site-packages/setuptools/command/develop.py b/lib/python3.7/site-packages/setuptools/command/develop.py similarity index 98% rename from lib/python3.4/site-packages/setuptools/command/develop.py rename to lib/python3.7/site-packages/setuptools/command/develop.py index 85b23c6..fdc9fc4 100644 --- a/lib/python3.4/site-packages/setuptools/command/develop.py +++ b/lib/python3.7/site-packages/setuptools/command/develop.py @@ -12,6 +12,8 @@ from setuptools.command.easy_install import easy_install from setuptools import namespaces import setuptools +__metaclass__ = type + class develop(namespaces.DevelopInstaller, easy_install): """Set up package for development""" @@ -95,7 +97,9 @@ class develop(namespaces.DevelopInstaller, easy_install): path_to_setup = egg_base.replace(os.sep, '/').rstrip('/') if path_to_setup != os.curdir: path_to_setup = '../' * (path_to_setup.count('/') + 1) - resolved = normalize_path(os.path.join(install_dir, egg_path, 
path_to_setup)) + resolved = normalize_path( + os.path.join(install_dir, egg_path, path_to_setup) + ) if resolved != normalize_path(os.curdir): raise DistutilsOptionError( "Can't get a consistent path to setup script from" @@ -190,7 +194,7 @@ class develop(namespaces.DevelopInstaller, easy_install): return easy_install.install_wrapper_scripts(self, dist) -class VersionlessRequirement(object): +class VersionlessRequirement: """ Adapt a pkg_resources.Distribution to simply return the project name as the 'requirement' so that scripts will work across diff --git a/lib/python3.4/site-packages/setuptools/command/dist_info.py b/lib/python3.7/site-packages/setuptools/command/dist_info.py similarity index 81% rename from lib/python3.4/site-packages/setuptools/command/dist_info.py rename to lib/python3.7/site-packages/setuptools/command/dist_info.py index c8dc659..c45258f 100644 --- a/lib/python3.4/site-packages/setuptools/command/dist_info.py +++ b/lib/python3.7/site-packages/setuptools/command/dist_info.py @@ -4,9 +4,9 @@ As defined in the wheel specification """ import os -import shutil from distutils.core import Command +from distutils import log class dist_info(Command): @@ -26,12 +26,11 @@ class dist_info(Command): def run(self): egg_info = self.get_finalized_command('egg_info') + egg_info.egg_base = self.egg_base + egg_info.finalize_options() egg_info.run() dist_info_dir = egg_info.egg_info[:-len('.egg-info')] + '.dist-info' + log.info("creating '{}'".format(os.path.abspath(dist_info_dir))) bdist_wheel = self.get_finalized_command('bdist_wheel') bdist_wheel.egg2dist(egg_info.egg_info, dist_info_dir) - - if self.egg_base: - shutil.move(dist_info_dir, os.path.join( - self.egg_base, dist_info_dir)) diff --git a/lib/python3.4/site-packages/setuptools/command/easy_install.py b/lib/python3.7/site-packages/setuptools/command/easy_install.py similarity index 94% rename from lib/python3.4/site-packages/setuptools/command/easy_install.py rename to 
lib/python3.7/site-packages/setuptools/command/easy_install.py index 8fba7b4..7115f0b 100644 --- a/lib/python3.4/site-packages/setuptools/command/easy_install.py +++ b/lib/python3.7/site-packages/setuptools/command/easy_install.py @@ -40,12 +40,13 @@ import subprocess import shlex import io +from sysconfig import get_config_vars, get_path + from setuptools.extern import six from setuptools.extern.six.moves import configparser, map from setuptools import Command from setuptools.sandbox import run_setup -from setuptools.py31compat import get_path, get_config_vars from setuptools.py27compat import rmtree_safe from setuptools.command import setopt from setuptools.archive_util import unpack_archive @@ -53,6 +54,7 @@ from setuptools.package_index import ( PackageIndex, parse_requirement_arg, URL_SCHEME, ) from setuptools.command import bdist_egg, egg_info +from setuptools.wheel import Wheel from pkg_resources import ( yield_lines, normalize_path, resource_string, ensure_directory, get_distribution, find_distributions, Environment, Requirement, @@ -61,6 +63,8 @@ from pkg_resources import ( ) import pkg_resources.py31compat +__metaclass__ = type + # Turn on PEP440Warnings warnings.filterwarnings("default", category=pkg_resources.PEP440Warning) @@ -92,7 +96,7 @@ def samefile(p1, p2): if six.PY2: - def _to_ascii(s): + def _to_bytes(s): return s def isascii(s): @@ -103,8 +107,8 @@ if six.PY2: return False else: - def _to_ascii(s): - return s.encode('ascii') + def _to_bytes(s): + return s.encode('utf8') def isascii(s): try: @@ -148,13 +152,15 @@ class easy_install(Command): ('local-snapshots-ok', 'l', "allow building eggs from local checkouts"), ('version', None, "print version information and exit"), + ('install-layout=', None, "installation layout to choose (known values: deb)"), + ('force-installation-into-system-dir', '0', "force installation into /usr"), ('no-find-links', None, "Don't load find-links defined in packages being installed") ] boolean_options = [ 'zip-ok', 
'multi-version', 'exclude-scripts', 'upgrade', 'always-copy', 'editable', - 'no-deps', 'local-snapshots-ok', 'version' + 'no-deps', 'local-snapshots-ok', 'version', 'force-installation-into-system-dir' ] if site.ENABLE_USER_SITE: @@ -202,6 +208,11 @@ class easy_install(Command): self.site_dirs = None self.installed_projects = {} self.sitepy_installed = False + # enable custom installation, known values: deb + self.install_layout = None + self.force_installation_into_system_dir = None + self.multiarch = None + # Always read easy_install options, even if we are subclassed, or have # an independent instance created. This ensures that defaults will # always come from the standard configuration file(s)' "easy_install" @@ -270,6 +281,15 @@ class easy_install(Command): self.expand_basedirs() self.expand_dirs() + if self.install_layout: + if not self.install_layout.lower() in ['deb']: + raise DistutilsOptionError("unknown value for --install-layout") + self.install_layout = self.install_layout.lower() + + import sysconfig + if sys.version_info[:2] >= (3, 3): + self.multiarch = sysconfig.get_config_var('MULTIARCH') + self._expand( 'install_dir', 'script_dir', 'build_directory', 'site_dirs', @@ -296,6 +316,15 @@ class easy_install(Command): if self.user and self.install_purelib: self.install_dir = self.install_purelib self.script_dir = self.install_scripts + + if self.prefix == '/usr' and not self.force_installation_into_system_dir: + raise DistutilsOptionError("""installation into /usr + +Trying to install into the system managed parts of the file system. Please +consider to install to another location, or use the option +--force-installation-into-system-dir to overwrite this warning. +""") + # default --record from the install command self.set_undefined_options('install', ('record', 'record')) # Should this be moved to the if statement below? 
It's not used @@ -318,7 +347,7 @@ class easy_install(Command): self.all_site_dirs.append(normalize_path(d)) if not self.editable: self.check_site_dir() - self.index_url = self.index_url or "https://pypi.python.org/simple" + self.index_url = self.index_url or "https://pypi.org/simple/" self.shadow_path = self.all_site_dirs[:] for path_item in self.install_dir, normalize_path(self.script_dir): if path_item not in self.shadow_path: @@ -410,7 +439,7 @@ class easy_install(Command): for spec in self.args: self.easy_install(spec, not self.no_deps) if self.record: - outputs = self.outputs + outputs = list(sorted(self.outputs)) if self.root: # strip any package prefix root_len = len(self.root) for counter in range(len(outputs)): @@ -628,7 +657,7 @@ class easy_install(Command): @contextlib.contextmanager def _tmpdir(self): - tmpdir = tempfile.mkdtemp(prefix=six.u("easy_install-")) + tmpdir = tempfile.mkdtemp(prefix=u"easy_install-") try: # cast to str as workaround for #709 and #710 and #712 yield str(tmpdir) @@ -801,7 +830,7 @@ class easy_install(Command): if is_script: body = self._load_template(dev_path) % locals() script_text = ScriptWriter.get_header(script_text) + body - self.write_script(script_name, _to_ascii(script_text), 'b') + self.write_script(script_name, _to_bytes(script_text), 'b') @staticmethod def _load_template(dev_path): @@ -827,14 +856,16 @@ class easy_install(Command): target = os.path.join(self.script_dir, script_name) self.add_output(target) + if self.dry_run: + return + mask = current_umask() - if not self.dry_run: - ensure_directory(target) - if os.path.exists(target): - os.unlink(target) - with open(target, "w" + mode) as f: - f.write(contents) - chmod(target, 0o777 - mask) + ensure_directory(target) + if os.path.exists(target): + os.unlink(target) + with open(target, "w" + mode) as f: + f.write(contents) + chmod(target, 0o777 - mask) def install_eggs(self, spec, dist_filename, tmpdir): # .egg dirs or files are already built, so just return them @@ 
-842,6 +873,8 @@ class easy_install(Command): return [self.install_egg(dist_filename, tmpdir)] elif dist_filename.lower().endswith('.exe'): return [self.install_exe(dist_filename, tmpdir)] + elif dist_filename.lower().endswith('.whl'): + return [self.install_wheel(dist_filename, tmpdir)] # Anything else, try to extract and build setup_base = tmpdir @@ -1038,6 +1071,35 @@ class easy_install(Command): f.write('\n'.join(locals()[name]) + '\n') f.close() + def install_wheel(self, wheel_path, tmpdir): + wheel = Wheel(wheel_path) + assert wheel.is_compatible() + destination = os.path.join(self.install_dir, wheel.egg_name()) + destination = os.path.abspath(destination) + if not self.dry_run: + ensure_directory(destination) + if os.path.isdir(destination) and not os.path.islink(destination): + dir_util.remove_tree(destination, dry_run=self.dry_run) + elif os.path.exists(destination): + self.execute( + os.unlink, + (destination,), + "Removing " + destination, + ) + try: + self.execute( + wheel.install_as_egg, + (destination,), + ("Installing %s to %s") % ( + os.path.basename(wheel_path), + os.path.dirname(destination) + ), + ) + finally: + update_dist_caches(destination, fix_zipimporter_caches=False) + self.add_output(destination) + return self.egg_distribution(destination) + __mv_warning = textwrap.dedent(""" Because this distribution was installed --multi-version, before you can import modules from this package in an application, you will need to @@ -1216,7 +1278,6 @@ class easy_install(Command): def byte_compile(self, to_compile): if sys.dont_write_bytecode: - self.warn('byte-compiling is disabled, skipping.') return from distutils.util import byte_compile @@ -1311,11 +1372,28 @@ class easy_install(Command): self.debug_print("os.makedirs('%s', 0o700)" % path) os.makedirs(path, 0o700) + if sys.version[:3] in ('2.3', '2.4', '2.5') or 'real_prefix' in sys.__dict__: + sitedir_name = 'site-packages' + else: + sitedir_name = 'dist-packages' + INSTALL_SCHEMES = dict( 
posix=dict( install_dir='$base/lib/python$py_version_short/site-packages', script_dir='$base/bin', ), + unix_local = dict( + install_dir = '$base/local/lib/python$py_version_short/%s' % sitedir_name, + script_dir = '$base/local/bin', + ), + posix_local = dict( + install_dir = '$base/local/lib/python$py_version_short/%s' % sitedir_name, + script_dir = '$base/local/bin', + ), + deb_system = dict( + install_dir = '$base/lib/python3/%s' % sitedir_name, + script_dir = '$base/bin', + ), ) DEFAULT_SCHEME = dict( @@ -1326,11 +1404,18 @@ class easy_install(Command): def _expand(self, *attrs): config_vars = self.get_finalized_command('install').config_vars - if self.prefix: + if self.prefix or self.install_layout: + if self.install_layout and self.install_layout in ['deb']: + scheme_name = "deb_system" + self.prefix = '/usr' + elif self.prefix or 'real_prefix' in sys.__dict__: + scheme_name = os.name + else: + scheme_name = "posix_local" # Set default install_dir/scripts from --prefix config_vars = config_vars.copy() config_vars['base'] = self.prefix - scheme = self.INSTALL_SCHEMES.get(os.name, self.DEFAULT_SCHEME) + scheme = self.INSTALL_SCHEMES.get(scheme_name,self.DEFAULT_SCHEME) for attr, val in scheme.items(): if getattr(self, attr, None) is None: setattr(self, attr, val) @@ -1370,11 +1455,17 @@ def get_site_dirs(): sitedirs.append(os.path.join(prefix, "Lib", "site-packages")) elif os.sep == '/': sitedirs.extend([ + os.path.join( + prefix, + "local/lib", + "python" + sys.version[:3], + "dist-packages", + ), os.path.join( prefix, "lib", "python" + sys.version[:3], - "site-packages", + "dist-packages", ), os.path.join(prefix, "lib", "site-python"), ]) @@ -1817,7 +1908,7 @@ def _update_zipimporter_cache(normalized_path, cache, updater=None): # get/del patterns instead. 
For more detailed information see the # following links: # https://github.com/pypa/setuptools/issues/202#issuecomment-202913420 - # https://bitbucket.org/pypy/pypy/src/dd07756a34a41f674c0cacfbc8ae1d4cc9ea2ae4/pypy/module/zipimport/interp_zipimport.py#cl-99 + # http://bit.ly/2h9itJX old_entry = cache[p] del cache[p] new_entry = updater and updater(p, old_entry) @@ -2016,7 +2107,7 @@ class WindowsCommandSpec(CommandSpec): split_args = dict(posix=False) -class ScriptWriter(object): +class ScriptWriter: """ Encapsulates behavior around writing entry point scripts for console and gui apps. @@ -2049,12 +2140,10 @@ class ScriptWriter(object): @classmethod def get_script_header(cls, script_text, executable=None, wininst=False): # for backward compatibility - warnings.warn("Use get_header", DeprecationWarning) + warnings.warn("Use get_header", DeprecationWarning, stacklevel=2) if wininst: executable = "python.exe" - cmd = cls.command_spec_class.best().from_param(executable) - cmd.install_options(script_text) - return cmd.as_header() + return cls.get_header(script_text, executable) @classmethod def get_args(cls, dist, header=None): diff --git a/lib/python3.4/site-packages/setuptools/command/egg_info.py b/lib/python3.7/site-packages/setuptools/command/egg_info.py similarity index 97% rename from lib/python3.4/site-packages/setuptools/command/egg_info.py rename to lib/python3.7/site-packages/setuptools/command/egg_info.py index a183d15..f3ad36b 100644 --- a/lib/python3.4/site-packages/setuptools/command/egg_info.py +++ b/lib/python3.7/site-packages/setuptools/command/egg_info.py @@ -30,7 +30,7 @@ from pkg_resources import ( import setuptools.unicode_utils as unicode_utils from setuptools.glob import glob -from pkg_resources.extern import packaging +from setuptools.extern import packaging def translate_pattern(glob): @@ -116,7 +116,33 @@ def translate_pattern(glob): return re.compile(pat, flags=re.MULTILINE|re.DOTALL) -class egg_info(Command): +class InfoCommon: + tag_build = 
None + tag_date = None + + @property + def name(self): + return safe_name(self.distribution.get_name()) + + def tagged_version(self): + version = self.distribution.get_version() + # egg_info may be called more than once for a distribution, + # in which case the version string already contains all tags. + if self.vtags and version.endswith(self.vtags): + return safe_version(version) + return safe_version(version + self.vtags) + + def tags(self): + version = '' + if self.tag_build: + version += self.tag_build + if self.tag_date: + version += time.strftime("-%Y%m%d") + return version + vtags = property(tags) + + +class egg_info(InfoCommon, Command): description = "create a distribution's .egg-info directory" user_options = [ @@ -133,14 +159,11 @@ class egg_info(Command): } def initialize_options(self): - self.egg_name = None - self.egg_version = None self.egg_base = None + self.egg_name = None self.egg_info = None - self.tag_build = None - self.tag_date = 0 + self.egg_version = None self.broken_egg_info = False - self.vtags = None #################################### # allow the 'tag_svn_revision' to be detected and @@ -160,9 +183,7 @@ class egg_info(Command): build tag. Install build keys in a deterministic order to avoid arbitrary reordering on subsequent builds. """ - # python 2.6 compatibility - odict = getattr(collections, 'OrderedDict', dict) - egg_info = odict() + egg_info = collections.OrderedDict() # follow the order these keys would have been added # when PYTHONHASHSEED=0 egg_info['tag_build'] = self.tags() @@ -170,10 +191,12 @@ class egg_info(Command): edit_config(filename, dict(egg_info=egg_info)) def finalize_options(self): - self.egg_name = safe_name(self.distribution.get_name()) - self.vtags = self.tags() + # Note: we need to capture the current value returned + # by `self.tagged_version()`, so we can later update + # `self.distribution.metadata.version` without + # repercussions. 
+ self.egg_name = self.name self.egg_version = self.tagged_version() - parsed_version = parse_version(self.egg_version) try: @@ -256,16 +279,9 @@ class egg_info(Command): if not self.dry_run: os.unlink(filename) - def tagged_version(self): - version = self.distribution.get_version() - # egg_info may be called more than once for a distribution, - # in which case the version string already contains all tags. - if self.vtags and version.endswith(self.vtags): - return safe_version(version) - return safe_version(version + self.vtags) - def run(self): self.mkpath(self.egg_info) + os.utime(self.egg_info, None) installer = self.distribution.fetch_build_egg for ep in iter_entry_points('egg_info.writers'): ep.require(installer=installer) @@ -279,14 +295,6 @@ class egg_info(Command): self.find_sources() - def tags(self): - version = '' - if self.tag_build: - version += self.tag_build - if self.tag_date: - version += time.strftime("-%Y%m%d") - return version - def find_sources(self): """Generate SOURCES.txt manifest file""" manifest_filename = os.path.join(self.egg_info, "SOURCES.txt") @@ -599,10 +607,7 @@ def write_pkg_info(cmd, basename, filename): metadata = cmd.distribution.metadata metadata.version, oldver = cmd.egg_version, metadata.version metadata.name, oldname = cmd.egg_name, metadata.name - metadata.long_description_content_type = getattr( - cmd.distribution, - 'long_description_content_type' - ) + try: # write unescaped data to PKG-INFO, so older pkg_resources # can still parse it @@ -626,7 +631,7 @@ def warn_depends_obsolete(cmd, basename, filename): def _write_requirements(stream, reqs): lines = yield_lines(reqs or ()) append_cr = lambda line: line + '\n' - lines = map(append_cr, lines) + lines = map(append_cr, sorted(lines)) stream.writelines(lines) @@ -642,7 +647,7 @@ def write_requirements(cmd, basename, filename): def write_setup_requirements(cmd, basename, filename): - data = StringIO() + data = io.StringIO() _write_requirements(data, 
cmd.distribution.setup_requires) cmd.write_or_delete_file("setup-requirements", filename, data.getvalue()) diff --git a/lib/python3.4/site-packages/setuptools/command/install.py b/lib/python3.7/site-packages/setuptools/command/install.py similarity index 100% rename from lib/python3.4/site-packages/setuptools/command/install.py rename to lib/python3.7/site-packages/setuptools/command/install.py diff --git a/lib/python3.4/site-packages/setuptools/command/install_egg_info.py b/lib/python3.7/site-packages/setuptools/command/install_egg_info.py similarity index 68% rename from lib/python3.4/site-packages/setuptools/command/install_egg_info.py rename to lib/python3.7/site-packages/setuptools/command/install_egg_info.py index edc4718..5f405bc 100644 --- a/lib/python3.4/site-packages/setuptools/command/install_egg_info.py +++ b/lib/python3.7/site-packages/setuptools/command/install_egg_info.py @@ -1,5 +1,5 @@ from distutils import log, dir_util -import os +import os, sys from setuptools import Command from setuptools import namespaces @@ -18,14 +18,31 @@ class install_egg_info(namespaces.Installer, Command): def initialize_options(self): self.install_dir = None + self.install_layout = None + self.prefix_option = None def finalize_options(self): self.set_undefined_options('install_lib', ('install_dir', 'install_dir')) + self.set_undefined_options('install',('install_layout','install_layout')) + if sys.hexversion > 0x2060000: + self.set_undefined_options('install',('prefix_option','prefix_option')) ei_cmd = self.get_finalized_command("egg_info") basename = pkg_resources.Distribution( None, None, ei_cmd.egg_name, ei_cmd.egg_version ).egg_name() + '.egg-info' + + if self.install_layout: + if not self.install_layout.lower() in ['deb']: + raise DistutilsOptionError("unknown value for --install-layout") + self.install_layout = self.install_layout.lower() + basename = basename.replace('-py%s' % pkg_resources.PY_MAJOR, '') + elif self.prefix_option or 'real_prefix' in 
sys.__dict__: + # don't modify for virtualenv + pass + else: + basename = basename.replace('-py%s' % pkg_resources.PY_MAJOR, '') + self.source = ei_cmd.egg_info self.target = os.path.join(self.install_dir, basename) self.outputs = [] @@ -55,6 +72,9 @@ class install_egg_info(namespaces.Installer, Command): for skip in '.svn/', 'CVS/': if src.startswith(skip) or '/' + skip in src: return None + if self.install_layout and self.install_layout in ['deb'] and src.startswith('SOURCES.txt'): + log.info("Skipping SOURCES.txt") + return None self.outputs.append(dst) log.debug("Copying %s to %s", src, dst) return dst diff --git a/lib/python3.4/site-packages/setuptools/command/install_lib.py b/lib/python3.7/site-packages/setuptools/command/install_lib.py similarity index 76% rename from lib/python3.4/site-packages/setuptools/command/install_lib.py rename to lib/python3.7/site-packages/setuptools/command/install_lib.py index 2b31c3e..578e002 100644 --- a/lib/python3.4/site-packages/setuptools/command/install_lib.py +++ b/lib/python3.7/site-packages/setuptools/command/install_lib.py @@ -1,4 +1,5 @@ import os +import sys import imp from itertools import product, starmap import distutils.command.install_lib as orig @@ -7,6 +8,18 @@ import distutils.command.install_lib as orig class install_lib(orig.install_lib): """Don't add compiled flags to filenames of non-Python files""" + def initialize_options(self): + orig.install_lib.initialize_options(self) + self.multiarch = None + self.install_layout = None + + def finalize_options(self): + orig.install_lib.finalize_options(self) + self.set_undefined_options('install',('install_layout','install_layout')) + if self.install_layout == 'deb' and sys.version_info[:2] >= (3, 3): + import sysconfig + self.multiarch = sysconfig.get_config_var('MULTIARCH') + def run(self): self.build() outfiles = self.install() @@ -91,6 +104,8 @@ class install_lib(orig.install_lib): exclude = self.get_exclusions() if not exclude: + import distutils.dir_util + 
distutils.dir_util._multiarch = self.multiarch return orig.install_lib.copy_tree(self, infile, outfile) # Exclude namespace package __init__.py* files from the output @@ -100,12 +115,24 @@ class install_lib(orig.install_lib): outfiles = [] + if self.multiarch: + import sysconfig + ext_suffix = sysconfig.get_config_var ('EXT_SUFFIX') + if ext_suffix.endswith(self.multiarch + ext_suffix[-3:]): + new_suffix = None + else: + new_suffix = "%s-%s%s" % (ext_suffix[:-3], self.multiarch, ext_suffix[-3:]) + def pf(src, dst): if dst in exclude: log.warn("Skipping installation of %s (namespace package)", dst) return False + if self.multiarch and new_suffix and dst.endswith(ext_suffix) and not dst.endswith(new_suffix): + dst = dst.replace(ext_suffix, new_suffix) + log.info("renaming extension to %s", os.path.basename(dst)) + log.info("copying %s -> %s", src, os.path.dirname(dst)) outfiles.append(dst) return dst diff --git a/lib/python3.4/site-packages/setuptools/command/install_scripts.py b/lib/python3.7/site-packages/setuptools/command/install_scripts.py similarity index 100% rename from lib/python3.4/site-packages/setuptools/command/install_scripts.py rename to lib/python3.7/site-packages/setuptools/command/install_scripts.py diff --git a/lib/python3.4/site-packages/setuptools/command/launcher manifest.xml b/lib/python3.7/site-packages/setuptools/command/launcher manifest.xml similarity index 100% rename from lib/python3.4/site-packages/setuptools/command/launcher manifest.xml rename to lib/python3.7/site-packages/setuptools/command/launcher manifest.xml diff --git a/lib/python3.4/site-packages/setuptools/command/py36compat.py b/lib/python3.7/site-packages/setuptools/command/py36compat.py similarity index 100% rename from lib/python3.4/site-packages/setuptools/command/py36compat.py rename to lib/python3.7/site-packages/setuptools/command/py36compat.py diff --git a/lib/python3.7/site-packages/setuptools/command/register.py 
b/lib/python3.7/site-packages/setuptools/command/register.py new file mode 100644 index 0000000..98bc015 --- /dev/null +++ b/lib/python3.7/site-packages/setuptools/command/register.py @@ -0,0 +1,18 @@ +from distutils import log +import distutils.command.register as orig + + +class register(orig.register): + __doc__ = orig.register.__doc__ + + def run(self): + try: + # Make sure that we are using valid current name/version info + self.run_command('egg_info') + orig.register.run(self) + finally: + self.announce( + "WARNING: Registering is deprecated, use twine to " + "upload instead (https://pypi.org/p/twine/)", + log.WARN + ) diff --git a/lib/python3.4/site-packages/setuptools/command/rotate.py b/lib/python3.7/site-packages/setuptools/command/rotate.py similarity index 100% rename from lib/python3.4/site-packages/setuptools/command/rotate.py rename to lib/python3.7/site-packages/setuptools/command/rotate.py diff --git a/lib/python3.4/site-packages/setuptools/command/saveopts.py b/lib/python3.7/site-packages/setuptools/command/saveopts.py similarity index 100% rename from lib/python3.4/site-packages/setuptools/command/saveopts.py rename to lib/python3.7/site-packages/setuptools/command/saveopts.py diff --git a/lib/python3.4/site-packages/setuptools/command/sdist.py b/lib/python3.7/site-packages/setuptools/command/sdist.py similarity index 96% rename from lib/python3.4/site-packages/setuptools/command/sdist.py rename to lib/python3.7/site-packages/setuptools/command/sdist.py index 508148e..bcfae4d 100644 --- a/lib/python3.4/site-packages/setuptools/command/sdist.py +++ b/lib/python3.7/site-packages/setuptools/command/sdist.py @@ -51,13 +51,6 @@ class sdist(sdist_add_defaults, orig.sdist): for cmd_name in self.get_sub_commands(): self.run_command(cmd_name) - # Call check_metadata only if no 'check' command - # (distutils <= 2.6) - import distutils.command - - if 'check' not in distutils.command.__all__: - self.check_metadata() - self.make_distribution() dist_files = 
getattr(self.distribution, 'dist_files', []) diff --git a/lib/python3.4/site-packages/setuptools/command/setopt.py b/lib/python3.7/site-packages/setuptools/command/setopt.py similarity index 100% rename from lib/python3.4/site-packages/setuptools/command/setopt.py rename to lib/python3.7/site-packages/setuptools/command/setopt.py diff --git a/lib/python3.4/site-packages/setuptools/command/test.py b/lib/python3.7/site-packages/setuptools/command/test.py similarity index 95% rename from lib/python3.4/site-packages/setuptools/command/test.py rename to lib/python3.7/site-packages/setuptools/command/test.py index 638d0c5..dde0118 100644 --- a/lib/python3.4/site-packages/setuptools/command/test.py +++ b/lib/python3.7/site-packages/setuptools/command/test.py @@ -3,6 +3,7 @@ import operator import sys import contextlib import itertools +import unittest from distutils.errors import DistutilsError, DistutilsOptionError from distutils import log from unittest import TestLoader @@ -14,10 +15,16 @@ from pkg_resources import (resource_listdir, resource_exists, normalize_path, working_set, _namespace_packages, evaluate_marker, add_activation_listener, require, EntryPoint) from setuptools import Command -from setuptools.py31compat import unittest_main + +__metaclass__ = type class ScanningLoader(TestLoader): + + def __init__(self): + TestLoader.__init__(self) + self._visited = set() + def loadTestsFromModule(self, module, pattern=None): """Return a suite of all tests cases contained in the given module @@ -25,6 +32,10 @@ class ScanningLoader(TestLoader): If the module has an ``additional_tests`` function, call it and add the return value to the tests. 
""" + if module in self._visited: + return None + self._visited.add(module) + tests = [] tests.append(TestLoader.loadTestsFromModule(self, module)) @@ -49,7 +60,7 @@ class ScanningLoader(TestLoader): # adapted from jaraco.classes.properties:NonDataProperty -class NonDataProperty(object): +class NonDataProperty: def __init__(self, fget): self.fget = fget @@ -101,6 +112,8 @@ class test(Command): return list(self._test_args()) def _test_args(self): + if not self.test_suite and sys.version_info >= (2, 7): + yield 'discover' if self.verbose: yield '--verbose' if self.test_suite: @@ -230,12 +243,11 @@ class test(Command): del_modules.append(name) list(map(sys.modules.__delitem__, del_modules)) - exit_kwarg = {} if sys.version_info < (2, 7) else {"exit": False} - test = unittest_main( + test = unittest.main( None, None, self._argv, testLoader=self._resolve_as_ep(self.test_loader), testRunner=self._resolve_as_ep(self.test_runner), - **exit_kwarg + exit=False, ) if not test.result.wasSuccessful(): msg = 'Test failed: %s' % test.result diff --git a/lib/python3.4/site-packages/setuptools/command/upload.py b/lib/python3.7/site-packages/setuptools/command/upload.py similarity index 78% rename from lib/python3.4/site-packages/setuptools/command/upload.py rename to lib/python3.7/site-packages/setuptools/command/upload.py index a44173a..72f24d8 100644 --- a/lib/python3.4/site-packages/setuptools/command/upload.py +++ b/lib/python3.7/site-packages/setuptools/command/upload.py @@ -1,4 +1,5 @@ import getpass +from distutils import log from distutils.command import upload as orig @@ -8,6 +9,16 @@ class upload(orig.upload): in a variety of different ways. 
""" + def run(self): + try: + orig.upload.run(self) + finally: + self.announce( + "WARNING: Uploading via this command is deprecated, use twine " + "to upload instead (https://pypi.org/p/twine/)", + log.WARN + ) + def finalize_options(self): orig.upload.finalize_options(self) self.username = ( diff --git a/lib/python3.4/site-packages/setuptools/command/upload_docs.py b/lib/python3.7/site-packages/setuptools/command/upload_docs.py similarity index 100% rename from lib/python3.4/site-packages/setuptools/command/upload_docs.py rename to lib/python3.7/site-packages/setuptools/command/upload_docs.py diff --git a/lib/python3.4/site-packages/setuptools/config.py b/lib/python3.7/site-packages/setuptools/config.py similarity index 85% rename from lib/python3.4/site-packages/setuptools/config.py rename to lib/python3.7/site-packages/setuptools/config.py index 9a62e2e..73a3bf7 100644 --- a/lib/python3.4/site-packages/setuptools/config.py +++ b/lib/python3.7/site-packages/setuptools/config.py @@ -4,10 +4,14 @@ import os import sys from collections import defaultdict from functools import partial +from importlib import import_module from distutils.errors import DistutilsOptionError, DistutilsFileError -from setuptools.py26compat import import_module -from setuptools.extern.six import string_types +from setuptools.extern.packaging.version import LegacyVersion, parse +from setuptools.extern.six import string_types, PY3 + + +__metaclass__ = type def read_configuration( @@ -101,18 +105,18 @@ def parse_configuration( If False exceptions are propagated as expected. 
:rtype: list """ - meta = ConfigMetadataHandler( - distribution.metadata, command_options, ignore_option_errors) - meta.parse() - options = ConfigOptionsHandler( distribution, command_options, ignore_option_errors) options.parse() - return [meta, options] + meta = ConfigMetadataHandler( + distribution.metadata, command_options, ignore_option_errors, distribution.package_dir) + meta.parse() + + return meta, options -class ConfigHandler(object): +class ConfigHandler: """Handles metadata supplied in configuration files.""" section_prefix = None @@ -280,7 +284,7 @@ class ConfigHandler(object): return f.read() @classmethod - def _parse_attr(cls, value): + def _parse_attr(cls, value, package_dir=None): """Represents value as a module attribute. Examples: @@ -300,7 +304,21 @@ class ConfigHandler(object): module_name = '.'.join(attrs_path) module_name = module_name or '__init__' - sys.path.insert(0, os.getcwd()) + parent_path = os.getcwd() + if package_dir: + if attrs_path[0] in package_dir: + # A custom path was specified for the module we want to import + custom_path = package_dir[attrs_path[0]] + parts = custom_path.rsplit('/', 1) + if len(parts) > 1: + parent_path = os.path.join(os.getcwd(), parts[0]) + module_name = parts[1] + else: + module_name = custom_path + elif '' in package_dir: + # A custom parent directory was specified for all root modules + parent_path = os.path.join(os.getcwd(), package_dir['']) + sys.path.insert(0, parent_path) try: module = import_module(module_name) value = getattr(module, attr_name) @@ -399,11 +417,18 @@ class ConfigMetadataHandler(ConfigHandler): """ + def __init__(self, target_obj, options, ignore_option_errors=False, + package_dir=None): + super(ConfigMetadataHandler, self).__init__(target_obj, options, + ignore_option_errors) + self.package_dir = package_dir + @property def parsers(self): """Metadata item name to parser function mapping.""" parse_list = self._parse_list parse_file = self._parse_file + parse_dict = self._parse_dict 
return { 'platforms': parse_list, @@ -416,6 +441,7 @@ class ConfigMetadataHandler(ConfigHandler): 'description': parse_file, 'long_description': parse_file, 'version': self._parse_version, + 'project_urls': parse_dict, } def _parse_version(self, value): @@ -425,7 +451,19 @@ class ConfigMetadataHandler(ConfigHandler): :rtype: str """ - version = self._parse_attr(value) + version = self._parse_file(value) + + if version != value: + version = version.strip() + # Be strict about versions loaded from file because it's easy to + # accidentally include newlines and other unintended content + if isinstance(parse(version), LegacyVersion): + raise DistutilsOptionError('Version loaded from %s does not comply with PEP 440: %s' % ( + value, version + )) + return version + + version = self._parse_attr(value, self.package_dir) if callable(version): version = version() @@ -477,16 +515,24 @@ class ConfigOptionsHandler(ConfigHandler): :param value: :rtype: list """ - find_directive = 'find:' + find_directives = ['find:', 'find_namespace:'] + trimmed_value = value.strip() - if not value.startswith(find_directive): + if not trimmed_value in find_directives: return self._parse_list(value) + findns = trimmed_value == find_directives[1] + if findns and not PY3: + raise DistutilsOptionError('find_namespace: directive is unsupported on Python < 3.3') + # Read function arguments from a dedicated section. 
find_kwargs = self.parse_section_packages__find( self.sections.get('packages.find', {})) - from setuptools import find_packages + if findns: + from setuptools import find_namespace_packages as find_packages + else: + from setuptools import find_packages return find_packages(**find_kwargs) @@ -552,3 +598,11 @@ class ConfigOptionsHandler(ConfigHandler): parse_list = partial(self._parse_list, separator=';') self['extras_require'] = self._parse_section_to_dict( section_options, parse_list) + + def parse_section_data_files(self, section_options): + """Parses `data_files` configuration file section. + + :param dict section_options: + """ + parsed = self._parse_section_to_dict(section_options, self._parse_list) + self['data_files'] = [(k, v) for k, v in parsed.items()] diff --git a/lib/python3.4/site-packages/setuptools/dep_util.py b/lib/python3.7/site-packages/setuptools/dep_util.py similarity index 100% rename from lib/python3.4/site-packages/setuptools/dep_util.py rename to lib/python3.7/site-packages/setuptools/dep_util.py diff --git a/lib/python3.4/site-packages/setuptools/depends.py b/lib/python3.7/site-packages/setuptools/depends.py similarity index 100% rename from lib/python3.4/site-packages/setuptools/depends.py rename to lib/python3.7/site-packages/setuptools/depends.py diff --git a/lib/python3.4/site-packages/setuptools/dist.py b/lib/python3.7/site-packages/setuptools/dist.py similarity index 89% rename from lib/python3.4/site-packages/setuptools/dist.py rename to lib/python3.7/site-packages/setuptools/dist.py index a2ca879..2360e20 100644 --- a/lib/python3.4/site-packages/setuptools/dist.py +++ b/lib/python3.7/site-packages/setuptools/dist.py @@ -1,3 +1,4 @@ +# -*- coding: utf-8 -*- __all__ = ['Distribution'] import re @@ -14,10 +15,11 @@ from distutils.errors import ( DistutilsOptionError, DistutilsPlatformError, DistutilsSetupError, ) from distutils.util import rfc822_escape +from distutils.version import StrictVersion from setuptools.extern import six 
+from setuptools.extern import packaging from setuptools.extern.six.moves import map, filter, filterfalse -from pkg_resources.extern import packaging from setuptools.depends import Require from setuptools import windows_support @@ -26,8 +28,8 @@ from setuptools.config import parse_configuration import pkg_resources from .py36compat import Distribution_parse_config_files -__import__('pkg_resources.extern.packaging.specifiers') -__import__('pkg_resources.extern.packaging.version') +__import__('setuptools.extern.packaging.specifiers') +__import__('setuptools.extern.packaging.version') def _get_unpatched(cls): @@ -35,35 +37,56 @@ def _get_unpatched(cls): return get_unpatched(cls) +def get_metadata_version(dist_md): + if dist_md.long_description_content_type or dist_md.provides_extras: + return StrictVersion('2.1') + elif (dist_md.maintainer is not None or + dist_md.maintainer_email is not None or + getattr(dist_md, 'python_requires', None) is not None): + return StrictVersion('1.2') + elif (dist_md.provides or dist_md.requires or dist_md.obsoletes or + dist_md.classifiers or dist_md.download_url): + return StrictVersion('1.1') + + return StrictVersion('1.0') + + # Based on Python 3.5 version def write_pkg_file(self, file): """Write the PKG-INFO format data to a file object. 
""" - version = '1.0' - if (self.provides or self.requires or self.obsoletes or - self.classifiers or self.download_url): - version = '1.1' - # Setuptools specific for PEP 345 - if hasattr(self, 'python_requires'): - version = '1.2' + version = get_metadata_version(self) file.write('Metadata-Version: %s\n' % version) file.write('Name: %s\n' % self.get_name()) file.write('Version: %s\n' % self.get_version()) file.write('Summary: %s\n' % self.get_description()) file.write('Home-page: %s\n' % self.get_url()) - file.write('Author: %s\n' % self.get_contact()) - file.write('Author-email: %s\n' % self.get_contact_email()) + + if version < StrictVersion('1.2'): + file.write('Author: %s\n' % self.get_contact()) + file.write('Author-email: %s\n' % self.get_contact_email()) + else: + optional_fields = ( + ('Author', 'author'), + ('Author-email', 'author_email'), + ('Maintainer', 'maintainer'), + ('Maintainer-email', 'maintainer_email'), + ) + + for field, attr in optional_fields: + attr_val = getattr(self, attr) + if six.PY2: + attr_val = self._encode_field(attr_val) + + if attr_val is not None: + file.write('%s: %s\n' % (field, attr_val)) + file.write('License: %s\n' % self.get_license()) if self.download_url: file.write('Download-URL: %s\n' % self.download_url) - - long_desc_content_type = getattr( - self, - 'long_description_content_type', - None - ) or 'UNKNOWN' - file.write('Description-Content-Type: %s\n' % long_desc_content_type) + for project_url in self.project_urls.items(): + file.write('Project-URL: %s, %s\n' % project_url) long_desc = rfc822_escape(self.get_long_description()) file.write('Description: %s\n' % long_desc) @@ -72,7 +95,12 @@ def write_pkg_file(self, file): if keywords: file.write('Keywords: %s\n' % keywords) - self._write_list(file, 'Platform', self.get_platforms()) + if version >= StrictVersion('1.2'): + for platform in self.get_platforms(): + file.write('Platform: %s\n' % platform) + else: + self._write_list(file, 'Platform', self.get_platforms()) 
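The `get_metadata_version` logic introduced above picks the lowest PKG-INFO `Metadata-Version` that can represent the fields in use. A dict-based sketch of the same precedence (field names taken from the diff; the real code checks attributes on `DistributionMetadata` and returns `StrictVersion` objects):

```python
def metadata_version(md):
    """Sketch of get_metadata_version; md is a plain dict stand-in."""
    if md.get('long_description_content_type') or md.get('provides_extras'):
        return '2.1'  # PEP 566 fields present
    if (md.get('maintainer') is not None
            or md.get('maintainer_email') is not None
            or md.get('python_requires') is not None):
        return '1.2'  # PEP 345 fields present
    if any(md.get(k) for k in
           ('provides', 'requires', 'obsoletes',
            'classifiers', 'download_url')):
        return '1.1'  # PEP 314 fields present
    return '1.0'
```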
+ self._write_list(file, 'Classifier', self.get_classifiers()) # PEP 314 @@ -84,14 +112,15 @@ def write_pkg_file(self, file): if hasattr(self, 'python_requires'): file.write('Requires-Python: %s\n' % self.python_requires) - -# from Python 3.4 -def write_pkg_info(self, base_dir): - """Write the PKG-INFO file into the release tree. - """ - with open(os.path.join(base_dir, 'PKG-INFO'), 'w', - encoding='UTF-8') as pkg_info: - self.write_pkg_file(pkg_info) + # PEP 566 + if self.long_description_content_type: + file.write( + 'Description-Content-Type: %s\n' % + self.long_description_content_type + ) + if self.provides_extras: + for extra in sorted(self.provides_extras): + file.write('Provides-Extra: %s\n' % extra) sequence = tuple, list @@ -166,6 +195,8 @@ def check_requirements(dist, attr, value): """Verify that install_requires is a valid requirements list""" try: list(pkg_resources.parse_requirements(value)) + if isinstance(value, (dict, set)): + raise TypeError("Unordered types are not allowed") except (TypeError, ValueError) as error: tmpl = ( "{attr!r} must be a string or list of strings " @@ -297,6 +328,12 @@ class Distribution(Distribution_parse_config_files, _Distribution): distribution for the included and excluded features. 
""" + _DISTUTILS_UNSUPPORTED_METADATA = { + 'long_description_content_type': None, + 'project_urls': dict, + 'provides_extras': set, + } + _patched_dist = None def patch_missing_pkg_info(self, attrs): @@ -316,26 +353,36 @@ class Distribution(Distribution_parse_config_files, _Distribution): have_package_data = hasattr(self, "package_data") if not have_package_data: self.package_data = {} - _attrs_dict = attrs or {} - if 'features' in _attrs_dict or 'require_features' in _attrs_dict: + attrs = attrs or {} + if 'features' in attrs or 'require_features' in attrs: Feature.warn_deprecated() self.require_features = [] self.features = {} self.dist_files = [] - self.src_root = attrs and attrs.pop("src_root", None) + # Filter-out setuptools' specific options. + self.src_root = attrs.pop("src_root", None) self.patch_missing_pkg_info(attrs) - self.long_description_content_type = _attrs_dict.get( - 'long_description_content_type' - ) - # Make sure we have any eggs needed to interpret 'attrs' - if attrs is not None: - self.dependency_links = attrs.pop('dependency_links', []) - assert_string_list(self, 'dependency_links', self.dependency_links) - if attrs and 'setup_requires' in attrs: - self.fetch_build_eggs(attrs['setup_requires']) + self.dependency_links = attrs.pop('dependency_links', []) + self.setup_requires = attrs.pop('setup_requires', []) for ep in pkg_resources.iter_entry_points('distutils.setup_keywords'): vars(self).setdefault(ep.name, None) - _Distribution.__init__(self, attrs) + _Distribution.__init__(self, { + k: v for k, v in attrs.items() + if k not in self._DISTUTILS_UNSUPPORTED_METADATA + }) + + # Fill-in missing metadata fields not supported by distutils. + # Note some fields may have been set by other tools (e.g. 
pbr) + # above; they are taken preferrentially to setup() arguments + for option, default in self._DISTUTILS_UNSUPPORTED_METADATA.items(): + for source in self.metadata.__dict__, attrs: + if option in source: + value = source[option] + break + else: + value = default() if default else None + setattr(self.metadata, option, value) + if isinstance(self.metadata.version, numbers.Number): # Some people apparently take "version number" too literally :) self.metadata.version = str(self.metadata.version) @@ -368,6 +415,16 @@ class Distribution(Distribution_parse_config_files, _Distribution): """ if getattr(self, 'python_requires', None): self.metadata.python_requires = self.python_requires + + if getattr(self, 'extras_require', None): + for extra in self.extras_require.keys(): + # Since this gets called multiple times at points where the + # keys have become 'converted' extras, ensure that we are only + # truly adding extras we haven't seen before here. + extra = extra.split(':')[0] + if extra: + self.metadata.provides_extras.add(extra) + self._convert_extras_requirements() self._move_install_requirements_markers() @@ -427,14 +484,15 @@ class Distribution(Distribution_parse_config_files, _Distribution): req.marker = None return req - def parse_config_files(self, filenames=None): + def parse_config_files(self, filenames=None, ignore_option_errors=False): """Parses configuration files from various levels and loads configuration. 
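The `_finalize_requires` change above derives `Provides-Extra` names from `extras_require` keys, splitting off any environment-marker suffix after `:`. A standalone sketch of that key handling (names and behavior taken from the diff):

```python
def extras_from_keys(extras_require):
    """Collect extra names the way _finalize_requires does.

    Keys in extras_require may carry an environment-marker suffix
    after ':', e.g. 'docs:sys_platform == "win32"'; only the bare
    extra name belongs in Provides-Extra.
    """
    provides_extras = set()
    for key in extras_require:
        extra = key.split(':')[0]
        if extra:  # a bare ':marker' key names no extra at all
            provides_extras.add(extra)
    return provides_extras
```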
""" _Distribution.parse_config_files(self, filenames=filenames) - parse_configuration(self, self.command_options) + parse_configuration(self, self.command_options, + ignore_option_errors=ignore_option_errors) self._finalize_requires() def parse_command_line(self): @@ -497,19 +555,20 @@ class Distribution(Distribution_parse_config_files, _Distribution): """Fetch an egg needed for building""" from setuptools.command.easy_install import easy_install dist = self.__class__({'script_args': ['easy_install']}) - dist.parse_config_files() opts = dist.get_option_dict('easy_install') - keep = ( - 'find_links', 'site_dirs', 'index_url', 'optimize', - 'site_dirs', 'allow_hosts' - ) - for key in list(opts): - if key not in keep: - del opts[key] # don't use any other settings + opts.clear() + opts.update( + (k, v) + for k, v in self.get_option_dict('easy_install').items() + if k in ( + # don't use any other settings + 'find_links', 'site_dirs', 'index_url', + 'optimize', 'site_dirs', 'allow_hosts', + )) if self.dependency_links: links = self.dependency_links[:] if 'find_links' in opts: - links = opts['find_links'][1].split() + links + links = opts['find_links'][1] + links opts['find_links'] = ('setup', links) install_dir = self.get_egg_cache_dir() cmd = easy_install( diff --git a/lib/python3.4/site-packages/setuptools/extension.py b/lib/python3.7/site-packages/setuptools/extension.py similarity index 100% rename from lib/python3.4/site-packages/setuptools/extension.py rename to lib/python3.7/site-packages/setuptools/extension.py diff --git a/lib/python3.7/site-packages/setuptools/extern/__init__.py b/lib/python3.7/site-packages/setuptools/extern/__init__.py new file mode 100644 index 0000000..cb2fa32 --- /dev/null +++ b/lib/python3.7/site-packages/setuptools/extern/__init__.py @@ -0,0 +1,73 @@ +import sys + + +class VendorImporter: + """ + A PEP 302 meta path importer for finding optionally-vendored + or otherwise naturally-installed packages from root_name. 
+ """ + + def __init__(self, root_name, vendored_names=(), vendor_pkg=None): + self.root_name = root_name + self.vendored_names = set(vendored_names) + self.vendor_pkg = vendor_pkg or root_name.replace('extern', '_vendor') + + @property + def search_path(self): + """ + Search first the vendor package then as a natural package. + """ + yield self.vendor_pkg + '.' + yield '' + + def find_module(self, fullname, path=None): + """ + Return self when fullname starts with root_name and the + target module is one vendored through this importer. + """ + root, base, target = fullname.partition(self.root_name + '.') + if root: + return + if not any(map(target.startswith, self.vendored_names)): + return + return self + + def load_module(self, fullname): + """ + Iterate over the search path to locate and load fullname. + """ + root, base, target = fullname.partition(self.root_name + '.') + for prefix in self.search_path: + try: + extant = prefix + target + __import__(extant) + mod = sys.modules[extant] + sys.modules[fullname] = mod + # mysterious hack: + # Remove the reference to the extant package/module + # on later Python versions to cause relative imports + # in the vendor package to resolve the same modules + # as those going through this importer. + if sys.version_info >= (3, ): + del sys.modules[extant] + return mod + except ImportError: + pass + else: + raise ImportError( + "The '{target}' package is required; " + "normally this is bundled with this package so if you get " + "this warning, consult the packager of your " + "distribution.".format(**locals()) + ) + + def install(self): + """ + Install this importer into sys.meta_path if not already present. 
+ """ + if self not in sys.meta_path: + sys.meta_path.append(self) + + +names = 'six', 'packaging', 'pyparsing', +VendorImporter(__name__, names, 'setuptools._vendor').install() diff --git a/lib/python3.7/site-packages/setuptools/glibc.py b/lib/python3.7/site-packages/setuptools/glibc.py new file mode 100644 index 0000000..a134591 --- /dev/null +++ b/lib/python3.7/site-packages/setuptools/glibc.py @@ -0,0 +1,86 @@ +# This file originally from pip: +# https://github.com/pypa/pip/blob/8f4f15a5a95d7d5b511ceaee9ed261176c181970/src/pip/_internal/utils/glibc.py +from __future__ import absolute_import + +import ctypes +import re +import warnings + + +def glibc_version_string(): + "Returns glibc version string, or None if not using glibc." + + # ctypes.CDLL(None) internally calls dlopen(NULL), and as the dlopen + # manpage says, "If filename is NULL, then the returned handle is for the + # main program". This way we can let the linker do the work to figure out + # which libc our process is actually using. + process_namespace = ctypes.CDLL(None) + try: + gnu_get_libc_version = process_namespace.gnu_get_libc_version + except AttributeError: + # Symbol doesn't exist -> therefore, we are not linked to + # glibc. + return None + + # Call gnu_get_libc_version, which returns a string like "2.5" + gnu_get_libc_version.restype = ctypes.c_char_p + version_str = gnu_get_libc_version() + # py2 / py3 compatibility: + if not isinstance(version_str, str): + version_str = version_str.decode("ascii") + + return version_str + + +# Separated out from have_compatible_glibc for easier unit testing +def check_glibc_version(version_str, required_major, minimum_minor): + # Parse string and check against requested version. + # + # We use a regexp instead of str.split because we want to discard any + # random junk that might come after the minor version -- this might happen + # in patched/forked versions of glibc (e.g. Linaro's version of glibc + # uses version strings like "2.20-2014.11"). 
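`VendorImporter.find_module` above hinges on `str.partition`: when `fullname` starts with `root_name + '.'`, the first element of the partition is empty, so a non-empty `root` means the module is not ours. A sketch of that decision, detached from the importer machinery:

```python
def claims(fullname, root_name, vendored_names):
    """Reproduce VendorImporter.find_module's test (sketch only)."""
    root, base, target = fullname.partition(root_name + '.')
    if root:
        # fullname does not start with root_name + '.', so either it is
        # an unrelated module or root_name itself with no submodule.
        return False
    if not any(target.startswith(name) for name in vendored_names):
        return False
    return True
```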
See gh-3588. + m = re.match(r"(?P<major>[0-9]+)\.(?P<minor>[0-9]+)", version_str) + if not m: + warnings.warn("Expected glibc version with 2 components major.minor," + " got: %s" % version_str, RuntimeWarning) + return False + return (int(m.group("major")) == required_major and + int(m.group("minor")) >= minimum_minor) + + +def have_compatible_glibc(required_major, minimum_minor): + version_str = glibc_version_string() + if version_str is None: + return False + return check_glibc_version(version_str, required_major, minimum_minor) + + +# platform.libc_ver regularly returns completely nonsensical glibc +# versions. E.g. on my computer, platform says: +# +# ~$ python2.7 -c 'import platform; print(platform.libc_ver())' +# ('glibc', '2.7') +# ~$ python3.5 -c 'import platform; print(platform.libc_ver())' +# ('glibc', '2.9') +# +# But the truth is: +# +# ~$ ldd --version +# ldd (Debian GLIBC 2.22-11) 2.22 +# +# This is unfortunate, because it means that the linehaul data on libc +# versions that was generated by pip 8.1.2 and earlier is useless and +# misleading. Solution: instead of using platform, use our code that actually +# works. +def libc_ver(): + """Try to determine the glibc version + + Returns a tuple of strings (lib, version) which default to empty strings + in case the lookup fails. + """ + glibc_version = glibc_version_string() + if glibc_version is None: + return ("", "") + else: + return ("glibc", glibc_version) diff --git a/lib/python3.4/site-packages/setuptools/glob.py b/lib/python3.7/site-packages/setuptools/glob.py similarity index 93% rename from lib/python3.4/site-packages/setuptools/glob.py rename to lib/python3.7/site-packages/setuptools/glob.py index 6c781de..9d7cbc5 100644 --- a/lib/python3.4/site-packages/setuptools/glob.py +++ b/lib/python3.7/site-packages/setuptools/glob.py @@ -3,14 +3,12 @@ Filename globbing utility. Mostly a copy of `glob` from Python 3.5. Changes include: * `yield from` and PEP3102 `*` removed. - * `bytes` changed to `six.binary_type`.
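`check_glibc_version` can be exercised without glibc at hand; this standalone sketch repeats the upstream logic with the named-group regex from pip's source (the `<major>`/`<minor>` group names are stripped by some renderings of the diff):

```python
import re
import warnings


def check_glibc_version(version_str, required_major, minimum_minor):
    # Discard any vendor suffix after the minor version (e.g. Linaro's
    # "2.20-2014.11") by matching only the leading major.minor pair.
    m = re.match(r"(?P<major>[0-9]+)\.(?P<minor>[0-9]+)", version_str)
    if not m:
        warnings.warn("Expected glibc version with 2 components "
                      "major.minor, got: %s" % version_str, RuntimeWarning)
        return False
    return (int(m.group("major")) == required_major and
            int(m.group("minor")) >= minimum_minor)
```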
* Hidden files are not ignored. """ import os import re import fnmatch -from setuptools.extern.six import binary_type __all__ = ["glob", "iglob", "escape"] @@ -92,7 +90,7 @@ def _iglob(pathname, recursive): def glob1(dirname, pattern): if not dirname: - if isinstance(pattern, binary_type): + if isinstance(pattern, bytes): dirname = os.curdir.encode('ASCII') else: dirname = os.curdir @@ -129,8 +127,8 @@ def glob2(dirname, pattern): # Recursively yields relative pathnames inside a literal directory. def _rlistdir(dirname): if not dirname: - if isinstance(dirname, binary_type): - dirname = binary_type(os.curdir, 'ASCII') + if isinstance(dirname, bytes): + dirname = os.curdir.encode('ASCII') else: dirname = os.curdir try: @@ -149,7 +147,7 @@ magic_check_bytes = re.compile(b'([*?[])') def has_magic(s): - if isinstance(s, binary_type): + if isinstance(s, bytes): match = magic_check_bytes.search(s) else: match = magic_check.search(s) @@ -157,7 +155,7 @@ def has_magic(s): def _isrecursive(pattern): - if isinstance(pattern, binary_type): + if isinstance(pattern, bytes): return pattern == b'**' else: return pattern == '**' @@ -169,7 +167,7 @@ def escape(pathname): # Escaping is done by wrapping any of "*?[" between square brackets. # Metacharacters do not work in the drive part and shouldn't be escaped. 
drive, pathname = os.path.splitdrive(pathname) - if isinstance(pathname, binary_type): + if isinstance(pathname, bytes): pathname = magic_check_bytes.sub(br'[\1]', pathname) else: pathname = magic_check.sub(r'[\1]', pathname) diff --git a/lib/python3.4/site-packages/setuptools/gui-32.exe b/lib/python3.7/site-packages/setuptools/gui-32.exe similarity index 100% rename from lib/python3.4/site-packages/setuptools/gui-32.exe rename to lib/python3.7/site-packages/setuptools/gui-32.exe diff --git a/lib/python3.4/site-packages/setuptools/gui-64.exe b/lib/python3.7/site-packages/setuptools/gui-64.exe similarity index 100% rename from lib/python3.4/site-packages/setuptools/gui-64.exe rename to lib/python3.7/site-packages/setuptools/gui-64.exe diff --git a/lib/python3.4/site-packages/setuptools/gui.exe b/lib/python3.7/site-packages/setuptools/gui.exe similarity index 100% rename from lib/python3.4/site-packages/setuptools/gui.exe rename to lib/python3.7/site-packages/setuptools/gui.exe diff --git a/lib/python3.4/site-packages/setuptools/launch.py b/lib/python3.7/site-packages/setuptools/launch.py similarity index 100% rename from lib/python3.4/site-packages/setuptools/launch.py rename to lib/python3.7/site-packages/setuptools/launch.py diff --git a/lib/python3.4/site-packages/setuptools/lib2to3_ex.py b/lib/python3.7/site-packages/setuptools/lib2to3_ex.py similarity index 100% rename from lib/python3.4/site-packages/setuptools/lib2to3_ex.py rename to lib/python3.7/site-packages/setuptools/lib2to3_ex.py diff --git a/lib/python3.4/site-packages/setuptools/monkey.py b/lib/python3.7/site-packages/setuptools/monkey.py similarity index 89% rename from lib/python3.4/site-packages/setuptools/monkey.py rename to lib/python3.7/site-packages/setuptools/monkey.py index 6d3711e..05a738b 100644 --- a/lib/python3.4/site-packages/setuptools/monkey.py +++ b/lib/python3.7/site-packages/setuptools/monkey.py @@ -7,9 +7,9 @@ import distutils.filelist import platform import types import functools 
+from importlib import import_module import inspect -from .py26compat import import_module from setuptools.extern import six import setuptools @@ -75,8 +75,6 @@ def patch_all(): needs_warehouse = ( sys.version_info < (2, 7, 13) or - (3, 0) < sys.version_info < (3, 3, 7) - or (3, 4) < sys.version_info < (3, 4, 6) or (3, 5) < sys.version_info <= (3, 5, 3) @@ -87,7 +85,6 @@ def patch_all(): distutils.config.PyPIRCCommand.DEFAULT_REPOSITORY = warehouse _patch_distribution_metadata_write_pkg_file() - _patch_distribution_metadata_write_pkg_info() # Install Distribution throughout the distutils for module in distutils.dist, distutils.core, distutils.cmd: @@ -111,21 +108,6 @@ def _patch_distribution_metadata_write_pkg_file(): ) -def _patch_distribution_metadata_write_pkg_info(): - """ - Workaround issue #197 - Python 3 prior to 3.2.2 uses an environment-local - encoding to save the pkg_info. Monkey-patch its write_pkg_info method to - correct this undesirable behavior. - """ - environment_local = (3,) <= sys.version_info[:3] < (3, 2, 2) - if not environment_local: - return - - distutils.dist.DistributionMetadata.write_pkg_info = ( - setuptools.dist.write_pkg_info - ) - - def patch_func(replacement, target_mod, func_name): """ Patch func_name in target_mod with replacement diff --git a/lib/python3.4/site-packages/setuptools/msvc.py b/lib/python3.7/site-packages/setuptools/msvc.py similarity index 99% rename from lib/python3.4/site-packages/setuptools/msvc.py rename to lib/python3.7/site-packages/setuptools/msvc.py index 8e3b638..b9c472f 100644 --- a/lib/python3.4/site-packages/setuptools/msvc.py +++ b/lib/python3.7/site-packages/setuptools/msvc.py @@ -22,7 +22,7 @@ import sys import platform import itertools import distutils.errors -from pkg_resources.extern.packaging.version import LegacyVersion +from setuptools.extern.packaging.version import LegacyVersion from setuptools.extern.six.moves import filterfalse @@ -48,7 +48,7 @@ else: _msvc9_suppress_errors = ( # 
msvc9compiler isn't available on some platforms ImportError, - + # msvc9compiler raises DistutilsPlatformError in some # environments. See #1118. distutils.errors.DistutilsPlatformError, @@ -232,8 +232,7 @@ def _augment_exception(exc, version, arch=''): elif version >= 14.0: # For VC++ 14.0 Redirect user to Visual C++ Build Tools message += (' Get it with "Microsoft Visual C++ Build Tools": ' - r'http://landinghub.visualstudio.com/' - 'visual-cpp-build-tools') + r'https://visualstudio.microsoft.com/downloads/') exc.args = (message, ) diff --git a/lib/python3.4/site-packages/setuptools/namespaces.py b/lib/python3.7/site-packages/setuptools/namespaces.py similarity index 100% rename from lib/python3.4/site-packages/setuptools/namespaces.py rename to lib/python3.7/site-packages/setuptools/namespaces.py diff --git a/lib/python3.4/site-packages/setuptools/package_index.py b/lib/python3.7/site-packages/setuptools/package_index.py similarity index 93% rename from lib/python3.4/site-packages/setuptools/package_index.py rename to lib/python3.7/site-packages/setuptools/package_index.py index a6363b1..1608b91 100644 --- a/lib/python3.4/site-packages/setuptools/package_index.py +++ b/lib/python3.7/site-packages/setuptools/package_index.py @@ -7,13 +7,9 @@ import socket import base64 import hashlib import itertools +import warnings from functools import wraps -try: - from urllib.parse import splituser -except ImportError: - from urllib2 import splituser - from setuptools.extern import six from setuptools.extern.six.moves import urllib, http_client, configparser, map @@ -21,21 +17,23 @@ import setuptools from pkg_resources import ( CHECKOUT_DIST, Distribution, BINARY_DIST, normalize_path, SOURCE_DIST, Environment, find_distributions, safe_name, safe_version, - to_filename, Requirement, DEVELOP_DIST, + to_filename, Requirement, DEVELOP_DIST, EGG_DIST, ) from setuptools import ssl_support from distutils import log from distutils.errors import DistutilsError from fnmatch import 
translate -from setuptools.py26compat import strip_fragment from setuptools.py27compat import get_all_headers +from setuptools.py33compat import unescape +from setuptools.wheel import Wheel + +__metaclass__ = type EGG_FRAGMENT = re.compile(r'^egg=([-A-Za-z0-9_.+!]+)$') -HREF = re.compile("""href\\s*=\\s*['"]?([^'"> ]+)""", re.I) -# this is here to fix emacs' cruddy broken syntax highlighting +HREF = re.compile(r"""href\s*=\s*['"]?([^'"> ]+)""", re.I) PYPI_MD5 = re.compile( - '([^<]+)\n\\s+\\(md5\\)' + r'([^<]+)\n\s+\(md5\)' ) URL_SCHEME = re.compile('([-+.a-z0-9]{2,}):', re.I).match EXTENSIONS = ".tar.gz .tar.bz2 .tar .zip .tgz".split() @@ -116,6 +114,17 @@ def distros_for_location(location, basename, metadata=None): if basename.endswith('.egg') and '-' in basename: # only one, unambiguous interpretation return [Distribution.from_location(location, basename, metadata)] + if basename.endswith('.whl') and '-' in basename: + wheel = Wheel(basename) + if not wheel.is_compatible(): + return [] + return [Distribution( + location=location, + project_name=wheel.project_name, + version=wheel.version, + # Increase priority over eggs. 
+ precedence=EGG_DIST + 1, + )] if basename.endswith('.exe'): win_base, py_ver, platform = parse_bdist_wininst(basename) if win_base is not None: @@ -141,7 +150,7 @@ def distros_for_filename(filename, metadata=None): def interpret_distro_name( location, basename, metadata, py_version=None, precedence=SOURCE_DIST, platform=None - ): +): """Generate alternative interpretations of a source distro name Note: if `location` is a filesystem filename, you should call @@ -228,7 +237,7 @@ def find_external_links(url, page): yield urllib.parse.urljoin(url, htmldecode(match.group(1))) -class ContentChecker(object): +class ContentChecker: """ A null content checker that defines the interface for checking content """ @@ -290,9 +299,9 @@ class PackageIndex(Environment): """A distribution index that scans web pages for download URLs""" def __init__( - self, index_url="https://pypi.python.org/simple", hosts=('*',), + self, index_url="https://pypi.org/simple/", hosts=('*',), ca_bundle=None, verify_ssl=True, *args, **kw - ): + ): Environment.__init__(self, *args, **kw) self.index_url = index_url + "/" [:not index_url.endswith('/')] self.scanned_urls = {} @@ -346,7 +355,8 @@ class PackageIndex(Environment): base = f.url # handle redirects page = f.read() - if not isinstance(page, str): # We are in Python 3 and got bytes. We want str. + if not isinstance(page, str): + # In Python 3 and got bytes but want str. 
if isinstance(f, urllib.error.HTTPError): # Errors have no charset, assume latin1: charset = 'latin-1' @@ -381,8 +391,9 @@ class PackageIndex(Environment): is_file = s and s.group(1).lower() == 'file' if is_file or self.allows(urllib.parse.urlparse(url)[1]): return True - msg = ("\nNote: Bypassing %s (disallowed host; see " - "http://bit.ly/1dg9ijs for details).\n") + msg = ( + "\nNote: Bypassing %s (disallowed host; see " + "http://bit.ly/2hrImnY for details).\n") if fatal: raise DistutilsError(msg % url) else: @@ -500,15 +511,16 @@ class PackageIndex(Environment): """ checker is a ContentChecker """ - checker.report(self.debug, + checker.report( + self.debug, "Validating %%s checksum for %s" % filename) if not checker.is_valid(): tfp.close() os.unlink(filename) raise DistutilsError( "%s validation failed for %s; " - "possible download problem?" % ( - checker.hash.name, os.path.basename(filename)) + "possible download problem?" + % (checker.hash.name, os.path.basename(filename)) ) def add_find_links(self, urls): @@ -536,7 +548,8 @@ class PackageIndex(Environment): if self[requirement.key]: # we've seen at least one distro meth, msg = self.info, "Couldn't retrieve index page for %r" else: # no distros seen for this name, might be misspelled - meth, msg = (self.warn, + meth, msg = ( + self.warn, "Couldn't find index page for %r (maybe misspelled?)") meth(msg, requirement.unsafe_name) self.scan_all() @@ -577,8 +590,7 @@ class PackageIndex(Environment): def fetch_distribution( self, requirement, tmpdir, force_scan=False, source=False, - develop_ok=False, local_index=None - ): + develop_ok=False, local_index=None): """Obtain a distribution suitable for fulfilling `requirement` `requirement` must be a ``pkg_resources.Requirement`` instance. 
@@ -609,12 +621,19 @@ class PackageIndex(Environment): if dist.precedence == DEVELOP_DIST and not develop_ok: if dist not in skipped: - self.warn("Skipping development or system egg: %s", dist) + self.warn( + "Skipping development or system egg: %s", dist, + ) skipped[dist] = 1 continue - if dist in req and (dist.precedence <= SOURCE_DIST or not source): - dist.download_location = self.download(dist.location, tmpdir) + test = ( + dist in req + and (dist.precedence <= SOURCE_DIST or not source) + ) + if test: + loc = self.download(dist.location, tmpdir) + dist.download_location = loc if os.path.exists(dist.download_location): return dist @@ -704,10 +723,10 @@ class PackageIndex(Environment): def _download_to(self, url, filename): self.info("Downloading %s", url) # Download the file - fp, info = None, None + fp = None try: checker = HashChecker.from_url(url) - fp = self.open_url(strip_fragment(url)) + fp = self.open_url(url) if isinstance(fp, urllib.error.HTTPError): raise DistutilsError( "Can't download %s: %s %s" % (url, fp.code, fp.msg) @@ -830,13 +849,14 @@ class PackageIndex(Environment): raise DistutilsError("Unexpected HTML page found at " + url) def _download_svn(self, url, filename): + warnings.warn("SVN download support is deprecated", UserWarning) url = url.split('#', 1)[0] # remove any fragment for svn's sake creds = '' if url.lower().startswith('svn:') and '@' in url: scheme, netloc, path, p, q, f = urllib.parse.urlparse(url) if not netloc and path.startswith('//') and '/' in path[2:]: netloc, path = path[2:].split('/', 1) - auth, host = splituser(netloc) + auth, host = urllib.parse.splituser(netloc) if auth: if ':' in auth: user, pw = auth.split(':', 1) @@ -915,27 +935,20 @@ class PackageIndex(Environment): entity_sub = re.compile(r'&(#(\d+|x[\da-fA-F]+)|[\w.:-]+);?').sub -def uchr(c): - if not isinstance(c, int): - return c - if c > 255: - return six.unichr(c) - return chr(c) - - def decode_entity(match): - what = match.group(1) - if 
what.startswith('#x'): - what = int(what[2:], 16) - elif what.startswith('#'): - what = int(what[1:]) - else: - what = six.moves.html_entities.name2codepoint.get(what, match.group(0)) - return uchr(what) + what = match.group(0) + return unescape(what) def htmldecode(text): - """Decode HTML entities in the given text.""" + """ + Decode HTML entities in the given text. + + >>> htmldecode( + ... 'https://../package_name-0.1.2.tar.gz' + ... '?tokena=A&tokenb=B">package_name-0.1.2.tar.gz') + 'https://../package_name-0.1.2.tar.gz?tokena=A&tokenb=B">package_name-0.1.2.tar.gz' + """ return entity_sub(decode_entity, text) @@ -969,15 +982,14 @@ def _encode_auth(auth): auth_s = urllib.parse.unquote(auth) # convert to bytes auth_bytes = auth_s.encode() - # use the legacy interface for Python 2.3 support - encoded_bytes = base64.encodestring(auth_bytes) + encoded_bytes = base64.b64encode(auth_bytes) # convert back to a string encoded = encoded_bytes.decode() # strip the trailing carriage return return encoded.replace('\n', '') -class Credential(object): +class Credential: """ A username/password pair. Use like a namedtuple. 
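The hand-rolled entity decoding removed above (`uchr`/`decode_entity` branching on `#x`, `#`, and named entities) is replaced by delegating to the stdlib: `py33compat.unescape` wraps `html.unescape` on Python 3. The net effect, sketched directly against the stdlib:

```python
from html import unescape


def htmldecode(text):
    """Decode HTML entities, as package_index now does via py33compat."""
    return unescape(text)
```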
""" @@ -1043,7 +1055,7 @@ def open_with_auth(url, opener=urllib.request.urlopen): raise http_client.InvalidURL("nonnumeric port: ''") if scheme in ('http', 'https'): - auth, host = splituser(netloc) + auth, host = urllib.parse.splituser(netloc) else: auth = None @@ -1103,7 +1115,8 @@ def local_open(url): f += '/' files.append('{name}'.format(name=f)) else: - tmpl = ("{url}" + tmpl = ( + "{url}" "{files}") body = tmpl.format(url=url, files='\n'.join(files)) status, message = 200, "OK" diff --git a/lib/python3.7/site-packages/setuptools/pep425tags.py b/lib/python3.7/site-packages/setuptools/pep425tags.py new file mode 100644 index 0000000..8bf4277 --- /dev/null +++ b/lib/python3.7/site-packages/setuptools/pep425tags.py @@ -0,0 +1,319 @@ +# This file originally from pip: +# https://github.com/pypa/pip/blob/8f4f15a5a95d7d5b511ceaee9ed261176c181970/src/pip/_internal/pep425tags.py +"""Generate and work with PEP 425 Compatibility Tags.""" +from __future__ import absolute_import + +import distutils.util +from distutils import log +import platform +import re +import sys +import sysconfig +import warnings +from collections import OrderedDict + +from .extern import six + +from . 
import glibc + +_osx_arch_pat = re.compile(r'(.+)_(\d+)_(\d+)_(.+)') + + +def get_config_var(var): + try: + return sysconfig.get_config_var(var) + except IOError as e: # Issue #1074 + warnings.warn("{}".format(e), RuntimeWarning) + return None + + +def get_abbr_impl(): + """Return abbreviated implementation name.""" + if hasattr(sys, 'pypy_version_info'): + pyimpl = 'pp' + elif sys.platform.startswith('java'): + pyimpl = 'jy' + elif sys.platform == 'cli': + pyimpl = 'ip' + else: + pyimpl = 'cp' + return pyimpl + + +def get_impl_ver(): + """Return implementation version.""" + impl_ver = get_config_var("py_version_nodot") + if not impl_ver or get_abbr_impl() == 'pp': + impl_ver = ''.join(map(str, get_impl_version_info())) + return impl_ver + + +def get_impl_version_info(): + """Return sys.version_info-like tuple for use in decrementing the minor + version.""" + if get_abbr_impl() == 'pp': + # as per https://github.com/pypa/pip/issues/2882 + return (sys.version_info[0], sys.pypy_version_info.major, + sys.pypy_version_info.minor) + else: + return sys.version_info[0], sys.version_info[1] + + +def get_impl_tag(): + """ + Returns the Tag for this specific implementation. 
+ """ + return "{}{}".format(get_abbr_impl(), get_impl_ver()) + + +def get_flag(var, fallback, expected=True, warn=True): + """Use a fallback method for determining SOABI flags if the needed config + var is unset or unavailable.""" + val = get_config_var(var) + if val is None: + if warn: + log.debug("Config variable '%s' is unset, Python ABI tag may " + "be incorrect", var) + return fallback() + return val == expected + + +def get_abi_tag(): + """Return the ABI tag based on SOABI (if available) or emulate SOABI + (CPython 2, PyPy).""" + soabi = get_config_var('SOABI') + impl = get_abbr_impl() + if not soabi and impl in {'cp', 'pp'} and hasattr(sys, 'maxunicode'): + d = '' + m = '' + u = '' + if get_flag('Py_DEBUG', + lambda: hasattr(sys, 'gettotalrefcount'), + warn=(impl == 'cp')): + d = 'd' + if get_flag('WITH_PYMALLOC', + lambda: impl == 'cp', + warn=(impl == 'cp')): + m = 'm' + if get_flag('Py_UNICODE_SIZE', + lambda: sys.maxunicode == 0x10ffff, + expected=4, + warn=(impl == 'cp' and + six.PY2)) \ + and six.PY2: + u = 'u' + abi = '%s%s%s%s%s' % (impl, get_impl_ver(), d, m, u) + elif soabi and soabi.startswith('cpython-'): + abi = 'cp' + soabi.split('-')[1] + elif soabi: + abi = soabi.replace('.', '_').replace('-', '_') + else: + abi = None + return abi + + +def _is_running_32bit(): + return sys.maxsize == 2147483647 + + +def get_platform(): + """Return our platform name 'win32', 'linux_x86_64'""" + if sys.platform == 'darwin': + # distutils.util.get_platform() returns the release based on the value + # of MACOSX_DEPLOYMENT_TARGET on which Python was built, which may + # be significantly older than the user's current machine. 
+ release, _, machine = platform.mac_ver() + split_ver = release.split('.') + + if machine == "x86_64" and _is_running_32bit(): + machine = "i386" + elif machine == "ppc64" and _is_running_32bit(): + machine = "ppc" + + return 'macosx_{}_{}_{}'.format(split_ver[0], split_ver[1], machine) + + # XXX remove distutils dependency + result = distutils.util.get_platform().replace('.', '_').replace('-', '_') + if result == "linux_x86_64" and _is_running_32bit(): + # 32 bit Python program (running on a 64 bit Linux): pip should only + # install and run 32 bit compiled extensions in that case. + result = "linux_i686" + + return result + + +def is_manylinux1_compatible(): + # Only Linux, and only x86-64 / i686 + if get_platform() not in {"linux_x86_64", "linux_i686"}: + return False + + # Check for presence of _manylinux module + try: + import _manylinux + return bool(_manylinux.manylinux1_compatible) + except (ImportError, AttributeError): + # Fall through to heuristic check below + pass + + # Check glibc version. CentOS 5 uses glibc 2.5. + return glibc.have_compatible_glibc(2, 5) + + +def get_darwin_arches(major, minor, machine): + """Return a list of supported arches (including group arches) for + the given major, minor and machine architecture of a macOS machine. + """ + arches = [] + + def _supports_arch(major, minor, arch): + # Looking at the application support for macOS versions in the chart + # provided by https://en.wikipedia.org/wiki/OS_X#Versions it appears + # our timeline looks roughly like: + # + # 10.0 - Introduces ppc support. + # 10.4 - Introduces ppc64, i386, and x86_64 support, however the ppc64 + # and x86_64 support is CLI only, and cannot be used for GUI + # applications. + # 10.5 - Extends ppc64 and x86_64 support to cover GUI applications. 
+ # 10.6 - Drops support for ppc64 + # 10.7 - Drops support for ppc + # + # Given that we do not know if we're installing a CLI or a GUI + # application, we must be conservative and assume it might be a GUI + # application and behave as if ppc64 and x86_64 support did not occur + # until 10.5. + # + # Note: The above information is taken from the "Application support" + # column in the chart not the "Processor support" since I believe + # that we care about what instruction sets an application can use + # not which processors the OS supports. + if arch == 'ppc': + return (major, minor) <= (10, 5) + if arch == 'ppc64': + return (major, minor) == (10, 5) + if arch == 'i386': + return (major, minor) >= (10, 4) + if arch == 'x86_64': + return (major, minor) >= (10, 5) + if arch in groups: + for garch in groups[arch]: + if _supports_arch(major, minor, garch): + return True + return False + + groups = OrderedDict([ + ("fat", ("i386", "ppc")), + ("intel", ("x86_64", "i386")), + ("fat64", ("x86_64", "ppc64")), + ("fat32", ("x86_64", "i386", "ppc")), + ]) + + if _supports_arch(major, minor, machine): + arches.append(machine) + + for garch in groups: + if machine in groups[garch] and _supports_arch(major, minor, garch): + arches.append(garch) + + arches.append('universal') + + return arches + + +def get_supported(versions=None, noarch=False, platform=None, + impl=None, abi=None): + """Return a list of supported tags for each version specified in + `versions`. + + :param versions: a list of string versions, of the form ["33", "32"], + or None. The first version will be assumed to support our ABI. + :param platform: specify the exact platform you want valid + tags for, or None. If None, use the local system platform. + :param impl: specify the exact implementation you want valid + tags for, or None. If None, use the local interpreter impl. + :param abi: specify the exact abi you want valid + tags for, or None. If None, use the local interpreter abi. 
+ """ + supported = [] + + # Versions must be given with respect to the preference + if versions is None: + versions = [] + version_info = get_impl_version_info() + major = version_info[:-1] + # Support all previous minor Python versions. + for minor in range(version_info[-1], -1, -1): + versions.append(''.join(map(str, major + (minor,)))) + + impl = impl or get_abbr_impl() + + abis = [] + + abi = abi or get_abi_tag() + if abi: + abis[0:0] = [abi] + + abi3s = set() + import imp + for suffix in imp.get_suffixes(): + if suffix[0].startswith('.abi'): + abi3s.add(suffix[0].split('.', 2)[1]) + + abis.extend(sorted(list(abi3s))) + + abis.append('none') + + if not noarch: + arch = platform or get_platform() + if arch.startswith('macosx'): + # support macosx-10.6-intel on macosx-10.9-x86_64 + match = _osx_arch_pat.match(arch) + if match: + name, major, minor, actual_arch = match.groups() + tpl = '{}_{}_%i_%s'.format(name, major) + arches = [] + for m in reversed(range(int(minor) + 1)): + for a in get_darwin_arches(int(major), m, actual_arch): + arches.append(tpl % (m, a)) + else: + # arch pattern didn't match (?!) 
+ arches = [arch] + elif platform is None and is_manylinux1_compatible(): + arches = [arch.replace('linux', 'manylinux1'), arch] + else: + arches = [arch] + + # Current version, current API (built specifically for our Python): + for abi in abis: + for arch in arches: + supported.append(('%s%s' % (impl, versions[0]), abi, arch)) + + # abi3 modules compatible with older version of Python + for version in versions[1:]: + # abi3 was introduced in Python 3.2 + if version in {'31', '30'}: + break + for abi in abi3s: # empty set if not Python 3 + for arch in arches: + supported.append(("%s%s" % (impl, version), abi, arch)) + + # Has binaries, does not use the Python API: + for arch in arches: + supported.append(('py%s' % (versions[0][0]), 'none', arch)) + + # No abi / arch, but requires our implementation: + supported.append(('%s%s' % (impl, versions[0]), 'none', 'any')) + # Tagged specifically as being cross-version compatible + # (with just the major version specified) + supported.append(('%s%s' % (impl, versions[0][0]), 'none', 'any')) + + # No abi / arch, generic Python + for i, version in enumerate(versions): + supported.append(('py%s' % (version,), 'none', 'any')) + if i == 0: + supported.append(('py%s' % (version[0]), 'none', 'any')) + + return supported + + +implementation_tag = get_impl_tag() diff --git a/lib/python3.4/site-packages/setuptools/py27compat.py b/lib/python3.7/site-packages/setuptools/py27compat.py similarity index 100% rename from lib/python3.4/site-packages/setuptools/py27compat.py rename to lib/python3.7/site-packages/setuptools/py27compat.py diff --git a/lib/python3.7/site-packages/setuptools/py31compat.py b/lib/python3.7/site-packages/setuptools/py31compat.py new file mode 100644 index 0000000..1a0705e --- /dev/null +++ b/lib/python3.7/site-packages/setuptools/py31compat.py @@ -0,0 +1,32 @@ +__all__ = [] + +__metaclass__ = type + + +try: + # Python >=3.2 + from tempfile import TemporaryDirectory +except ImportError: + import shutil + import 
tempfile + + class TemporaryDirectory: + """ + Very simple temporary directory context manager. + Will try to delete afterward, but will also ignore OS and similar + errors on deletion. + """ + + def __init__(self): + self.name = None # Handle mkdtemp raising an exception + self.name = tempfile.mkdtemp() + + def __enter__(self): + return self.name + + def __exit__(self, exctype, excvalue, exctrace): + try: + shutil.rmtree(self.name, True) + except OSError: # removal errors are not the only possible + pass + self.name = None diff --git a/lib/python3.4/site-packages/setuptools/py33compat.py b/lib/python3.7/site-packages/setuptools/py33compat.py similarity index 80% rename from lib/python3.4/site-packages/setuptools/py33compat.py rename to lib/python3.7/site-packages/setuptools/py33compat.py index af64d5d..87cf539 100644 --- a/lib/python3.4/site-packages/setuptools/py33compat.py +++ b/lib/python3.7/site-packages/setuptools/py33compat.py @@ -2,13 +2,20 @@ import dis import array import collections -from setuptools.extern import six +try: + import html +except ImportError: + html = None +from setuptools.extern import six +from setuptools.extern.six.moves import html_parser + +__metaclass__ = type OpArg = collections.namedtuple('OpArg', 'opcode arg') -class Bytecode_compat(object): +class Bytecode_compat: def __init__(self, code): self.code = code @@ -43,3 +50,6 @@ class Bytecode_compat(object): Bytecode = getattr(dis, 'Bytecode', Bytecode_compat) + + +unescape = getattr(html, 'unescape', html_parser.HTMLParser().unescape) diff --git a/lib/python3.4/site-packages/setuptools/py36compat.py b/lib/python3.7/site-packages/setuptools/py36compat.py similarity index 100% rename from lib/python3.4/site-packages/setuptools/py36compat.py rename to lib/python3.7/site-packages/setuptools/py36compat.py diff --git a/lib/python3.4/site-packages/setuptools/sandbox.py b/lib/python3.7/site-packages/setuptools/sandbox.py similarity index 98% rename from 
lib/python3.4/site-packages/setuptools/sandbox.py rename to lib/python3.7/site-packages/setuptools/sandbox.py index 1d981f4..685f3f7 100644 --- a/lib/python3.4/site-packages/setuptools/sandbox.py +++ b/lib/python3.7/site-packages/setuptools/sandbox.py @@ -39,10 +39,6 @@ def _execfile(filename, globals, locals=None): mode = 'rb' with open(filename, mode) as stream: script = stream.read() - # compile() function in Python 2.6 and 3.1 requires LF line endings. - if sys.version_info[:2] < (2, 7) or sys.version_info[:2] >= (3, 0) and sys.version_info[:2] < (3, 2): - script = script.replace(b'\r\n', b'\n') - script = script.replace(b'\r', b'\n') if locals is None: locals = globals code = compile(script, filename, 'exec') diff --git a/lib/python3.4/site-packages/setuptools/script (dev).tmpl b/lib/python3.7/site-packages/setuptools/script (dev).tmpl similarity index 66% rename from lib/python3.4/site-packages/setuptools/script (dev).tmpl rename to lib/python3.7/site-packages/setuptools/script (dev).tmpl index d58b1bb..39a24b0 100644 --- a/lib/python3.4/site-packages/setuptools/script (dev).tmpl +++ b/lib/python3.7/site-packages/setuptools/script (dev).tmpl @@ -2,4 +2,5 @@ __requires__ = %(spec)r __import__('pkg_resources').require(%(spec)r) __file__ = %(dev_path)r -exec(compile(open(__file__).read(), __file__, 'exec')) +with open(__file__) as f: + exec(compile(f.read(), __file__, 'exec')) diff --git a/lib/python3.4/site-packages/setuptools/script.tmpl b/lib/python3.7/site-packages/setuptools/script.tmpl similarity index 100% rename from lib/python3.4/site-packages/setuptools/script.tmpl rename to lib/python3.7/site-packages/setuptools/script.tmpl diff --git a/lib/python3.4/site-packages/setuptools/site-patch.py b/lib/python3.7/site-packages/setuptools/site-patch.py similarity index 97% rename from lib/python3.4/site-packages/setuptools/site-patch.py rename to lib/python3.7/site-packages/setuptools/site-patch.py index 0d2d2ff..40b00de 100644 --- 
a/lib/python3.4/site-packages/setuptools/site-patch.py +++ b/lib/python3.7/site-packages/setuptools/site-patch.py @@ -23,7 +23,7 @@ def __boot(): break else: try: - import imp # Avoid import loop in Python >= 3.3 + import imp # Avoid import loop in Python 3 stream, path, descr = imp.find_module('site', [item]) except ImportError: continue diff --git a/lib/python3.4/site-packages/setuptools/ssl_support.py b/lib/python3.7/site-packages/setuptools/ssl_support.py similarity index 95% rename from lib/python3.4/site-packages/setuptools/ssl_support.py rename to lib/python3.7/site-packages/setuptools/ssl_support.py index 72b18ef..6362f1f 100644 --- a/lib/python3.4/site-packages/setuptools/ssl_support.py +++ b/lib/python3.7/site-packages/setuptools/ssl_support.py @@ -186,9 +186,14 @@ class VerifyingHTTPSConn(HTTPSConnection): else: actual_host = self.host - self.sock = ssl.wrap_socket( - sock, cert_reqs=ssl.CERT_REQUIRED, ca_certs=self.ca_bundle - ) + if hasattr(ssl, 'create_default_context'): + ctx = ssl.create_default_context(cafile=self.ca_bundle) + self.sock = ctx.wrap_socket(sock, server_hostname=actual_host) + else: + # This is for python < 2.7.9 and < 3.4? 
+ self.sock = ssl.wrap_socket( + sock, cert_reqs=ssl.CERT_REQUIRED, ca_certs=self.ca_bundle + ) try: match_hostname(self.sock.getpeercert(), actual_host) except CertificateError: diff --git a/lib/python3.4/site-packages/setuptools/unicode_utils.py b/lib/python3.7/site-packages/setuptools/unicode_utils.py similarity index 100% rename from lib/python3.4/site-packages/setuptools/unicode_utils.py rename to lib/python3.7/site-packages/setuptools/unicode_utils.py diff --git a/lib/python3.4/site-packages/setuptools/version.py b/lib/python3.7/site-packages/setuptools/version.py similarity index 100% rename from lib/python3.4/site-packages/setuptools/version.py rename to lib/python3.7/site-packages/setuptools/version.py diff --git a/lib/python3.7/site-packages/setuptools/wheel.py b/lib/python3.7/site-packages/setuptools/wheel.py new file mode 100644 index 0000000..95a794a --- /dev/null +++ b/lib/python3.7/site-packages/setuptools/wheel.py @@ -0,0 +1,210 @@ +"""Wheels support.""" + +from distutils.util import get_platform +import email +import itertools +import os +import posixpath +import re +import zipfile + +from pkg_resources import Distribution, PathMetadata, parse_version +from setuptools.extern.packaging.utils import canonicalize_name +from setuptools.extern.six import PY3 +from setuptools import Distribution as SetuptoolsDistribution +from setuptools import pep425tags +from setuptools.command.egg_info import write_requirements + + +__metaclass__ = type + + +WHEEL_NAME = re.compile( + r"""^(?P<project_name>.+?)-(?P<version>\d.*?) + ((-(?P<build>\d.*?))?-(?P<py_version>.+?)-(?P<abi>.+?)-(?P<platform>.+?) 
+ )\.whl$""", + re.VERBOSE).match + +NAMESPACE_PACKAGE_INIT = '''\ +try: + __import__('pkg_resources').declare_namespace(__name__) +except ImportError: + __path__ = __import__('pkgutil').extend_path(__path__, __name__) +''' + + +def unpack(src_dir, dst_dir): + '''Move everything under `src_dir` to `dst_dir`, and delete the former.''' + for dirpath, dirnames, filenames in os.walk(src_dir): + subdir = os.path.relpath(dirpath, src_dir) + for f in filenames: + src = os.path.join(dirpath, f) + dst = os.path.join(dst_dir, subdir, f) + os.renames(src, dst) + for n, d in reversed(list(enumerate(dirnames))): + src = os.path.join(dirpath, d) + dst = os.path.join(dst_dir, subdir, d) + if not os.path.exists(dst): + # Directory does not exist in destination, + # rename it and prune it from os.walk list. + os.renames(src, dst) + del dirnames[n] + # Cleanup. + for dirpath, dirnames, filenames in os.walk(src_dir, topdown=True): + assert not filenames + os.rmdir(dirpath) + + +class Wheel: + + def __init__(self, filename): + match = WHEEL_NAME(os.path.basename(filename)) + if match is None: + raise ValueError('invalid wheel name: %r' % filename) + self.filename = filename + for k, v in match.groupdict().items(): + setattr(self, k, v) + + def tags(self): + '''List tags (py_version, abi, platform) supported by this wheel.''' + return itertools.product( + self.py_version.split('.'), + self.abi.split('.'), + self.platform.split('.'), + ) + + def is_compatible(self): + '''Is the wheel is compatible with the current platform?''' + supported_tags = pep425tags.get_supported() + return next((True for t in self.tags() if t in supported_tags), False) + + def egg_name(self): + return Distribution( + project_name=self.project_name, version=self.version, + platform=(None if self.platform == 'any' else get_platform()), + ).egg_name() + '.egg' + + def get_dist_info(self, zf): + # find the correct name of the .dist-info dir in the wheel file + for member in zf.namelist(): + dirname = 
posixpath.dirname(member) + if (dirname.endswith('.dist-info') and + canonicalize_name(dirname).startswith( + canonicalize_name(self.project_name))): + return dirname + raise ValueError("unsupported wheel format. .dist-info not found") + + def install_as_egg(self, destination_eggdir): + '''Install wheel as an egg directory.''' + with zipfile.ZipFile(self.filename) as zf: + self._install_as_egg(destination_eggdir, zf) + + def _install_as_egg(self, destination_eggdir, zf): + dist_basename = '%s-%s' % (self.project_name, self.version) + dist_info = self.get_dist_info(zf) + dist_data = '%s.data' % dist_basename + egg_info = os.path.join(destination_eggdir, 'EGG-INFO') + + self._convert_metadata(zf, destination_eggdir, dist_info, egg_info) + self._move_data_entries(destination_eggdir, dist_data) + self._fix_namespace_packages(egg_info, destination_eggdir) + + @staticmethod + def _convert_metadata(zf, destination_eggdir, dist_info, egg_info): + def get_metadata(name): + with zf.open(posixpath.join(dist_info, name)) as fp: + value = fp.read().decode('utf-8') if PY3 else fp.read() + return email.parser.Parser().parsestr(value) + + wheel_metadata = get_metadata('WHEEL') + # Check wheel format version is supported. + wheel_version = parse_version(wheel_metadata.get('Wheel-Version')) + wheel_v1 = ( + parse_version('1.0') <= wheel_version < parse_version('2.0dev0') + ) + if not wheel_v1: + raise ValueError( + 'unsupported wheel format version: %s' % wheel_version) + # Extract to target directory. + os.mkdir(destination_eggdir) + zf.extractall(destination_eggdir) + # Convert metadata. 
+ dist_info = os.path.join(destination_eggdir, dist_info) + dist = Distribution.from_location( + destination_eggdir, dist_info, + metadata=PathMetadata(destination_eggdir, dist_info), + ) + + # Note: Evaluate and strip markers now, + # as it's difficult to convert back from the syntax: + # foobar; "linux" in sys_platform and extra == 'test' + def raw_req(req): + req.marker = None + return str(req) + install_requires = list(sorted(map(raw_req, dist.requires()))) + extras_require = { + extra: sorted( + req + for req in map(raw_req, dist.requires((extra,))) + if req not in install_requires + ) + for extra in dist.extras + } + os.rename(dist_info, egg_info) + os.rename( + os.path.join(egg_info, 'METADATA'), + os.path.join(egg_info, 'PKG-INFO'), + ) + setup_dist = SetuptoolsDistribution( + attrs=dict( + install_requires=install_requires, + extras_require=extras_require, + ), + ) + write_requirements( + setup_dist.get_command_obj('egg_info'), + None, + os.path.join(egg_info, 'requires.txt'), + ) + + @staticmethod + def _move_data_entries(destination_eggdir, dist_data): + """Move data entries to their correct location.""" + dist_data = os.path.join(destination_eggdir, dist_data) + dist_data_scripts = os.path.join(dist_data, 'scripts') + if os.path.exists(dist_data_scripts): + egg_info_scripts = os.path.join( + destination_eggdir, 'EGG-INFO', 'scripts') + os.mkdir(egg_info_scripts) + for entry in os.listdir(dist_data_scripts): + # Remove bytecode, as it's not properly handled + # during easy_install scripts install phase. 
+ if entry.endswith('.pyc'): + os.unlink(os.path.join(dist_data_scripts, entry)) + else: + os.rename( + os.path.join(dist_data_scripts, entry), + os.path.join(egg_info_scripts, entry), + ) + os.rmdir(dist_data_scripts) + for subdir in filter(os.path.exists, ( + os.path.join(dist_data, d) + for d in ('data', 'headers', 'purelib', 'platlib') + )): + unpack(subdir, destination_eggdir) + if os.path.exists(dist_data): + os.rmdir(dist_data) + + @staticmethod + def _fix_namespace_packages(egg_info, destination_eggdir): + namespace_packages = os.path.join( + egg_info, 'namespace_packages.txt') + if os.path.exists(namespace_packages): + with open(namespace_packages) as fp: + namespace_packages = fp.read().split() + for mod in namespace_packages: + mod_dir = os.path.join(destination_eggdir, *mod.split('.')) + mod_init = os.path.join(mod_dir, '__init__.py') + if os.path.exists(mod_dir) and not os.path.exists(mod_init): + with open(mod_init, 'w') as fp: + fp.write(NAMESPACE_PACKAGE_INIT) diff --git a/lib/python3.4/site-packages/setuptools/windows_support.py b/lib/python3.7/site-packages/setuptools/windows_support.py similarity index 100% rename from lib/python3.4/site-packages/setuptools/windows_support.py rename to lib/python3.7/site-packages/setuptools/windows_support.py diff --git a/lib/python3.4/site-packages/pip-9.0.1.dist-info/INSTALLER b/lib/python3.7/site-packages/six-1.12.0.dist-info/INSTALLER similarity index 100% rename from lib/python3.4/site-packages/pip-9.0.1.dist-info/INSTALLER rename to lib/python3.7/site-packages/six-1.12.0.dist-info/INSTALLER diff --git a/lib/python3.7/site-packages/six-1.12.0.dist-info/LICENSE b/lib/python3.7/site-packages/six-1.12.0.dist-info/LICENSE new file mode 100644 index 0000000..365d107 --- /dev/null +++ b/lib/python3.7/site-packages/six-1.12.0.dist-info/LICENSE @@ -0,0 +1,18 @@ +Copyright (c) 2010-2018 Benjamin Peterson + +Permission is hereby granted, free of charge, to any person obtaining a copy of +this software and associated 
documentation files (the "Software"), to deal in +the Software without restriction, including without limitation the rights to +use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of +the Software, and to permit persons to whom the Software is furnished to do so, +subject to the following conditions: + +The above copyright notice and this permission notice shall be included in all +copies or substantial portions of the Software. + +THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR +IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS +FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR +COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER +IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN +CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. diff --git a/lib/python3.4/site-packages/six-1.11.0.dist-info/METADATA b/lib/python3.7/site-packages/six-1.12.0.dist-info/METADATA similarity index 65% rename from lib/python3.4/site-packages/six-1.11.0.dist-info/METADATA rename to lib/python3.7/site-packages/six-1.12.0.dist-info/METADATA index 04e93dc..df8db11 100644 --- a/lib/python3.4/site-packages/six-1.11.0.dist-info/METADATA +++ b/lib/python3.7/site-packages/six-1.12.0.dist-info/METADATA @@ -1,27 +1,36 @@ -Metadata-Version: 2.0 +Metadata-Version: 2.1 Name: six -Version: 1.11.0 +Version: 1.12.0 Summary: Python 2 and 3 compatibility utilities -Home-page: http://pypi.python.org/pypi/six/ +Home-page: https://github.com/benjaminp/six Author: Benjamin Peterson Author-email: benjamin@python.org License: MIT Platform: UNKNOWN +Classifier: Development Status :: 5 - Production/Stable Classifier: Programming Language :: Python :: 2 Classifier: Programming Language :: Python :: 3 Classifier: Intended Audience :: Developers Classifier: License :: OSI Approved :: MIT License Classifier: Topic :: Software 
Development :: Libraries Classifier: Topic :: Utilities +Requires-Python: >=2.6, !=3.0.*, !=3.1.* -.. image:: http://img.shields.io/pypi/v/six.svg - :target: https://pypi.python.org/pypi/six +.. image:: https://img.shields.io/pypi/v/six.svg + :target: https://pypi.org/project/six/ + :alt: six on PyPI .. image:: https://travis-ci.org/benjaminp/six.svg?branch=master - :target: https://travis-ci.org/benjaminp/six + :target: https://travis-ci.org/benjaminp/six + :alt: six on TravisCI -.. image:: http://img.shields.io/badge/license-MIT-green.svg +.. image:: https://readthedocs.org/projects/six/badge/?version=latest + :target: https://six.readthedocs.io/ + :alt: six's documentation on Read the Docs + +.. image:: https://img.shields.io/badge/license-MIT-green.svg :target: https://github.com/benjaminp/six/blob/master/LICENSE + :alt: MIT License badge Six is a Python 2 and 3 compatibility library. It provides utility functions for smoothing over the differences between the Python versions with the goal of @@ -32,7 +41,7 @@ Six supports every Python version since 2.6. It is contained in only one Python file, so it can be easily copied into your project. (The copyright and license notice must be retained.) -Online documentation is at http://six.rtfd.org. +Online documentation is at https://six.readthedocs.io/. Bugs can be reported to https://github.com/benjaminp/six. The code can also be found there. 
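The six METADATA above describes the library as smoothing over Python 2/3 differences. As a minimal, stdlib-only sketch of that pattern (illustrative, not six's actual implementation), the `PY2`/`PY3`/`text_type` constants mirror the definitions that appear in `six.py` later in this diff, and `ensure_text` is named after the helper six 1.12.0 introduced:

```python
import sys

# Coarse version flags, as six exposes them (six.PY2 / six.PY3).
PY2 = sys.version_info[0] == 2
PY3 = sys.version_info[0] == 3

# Type aliases matching six.py's definitions for each major version.
if PY3:
    string_types = (str,)
    text_type = str
    binary_type = bytes
else:
    string_types = (basestring,)  # noqa: F821  (Python 2 only)
    text_type = unicode           # noqa: F821  (Python 2 only)
    binary_type = str


def ensure_text(value, encoding='utf-8'):
    """Decode bytes to text; pass text through (analogous to six.ensure_text)."""
    if isinstance(value, binary_type):
        return value.decode(encoding)
    if isinstance(value, text_type):
        return value
    raise TypeError('not expecting type %r' % type(value))


assert ensure_text(b'hello') == 'hello'
assert ensure_text('hello') == 'hello'
```

Shipping these branches in one importable module is what lets a single codebase run unmodified on both interpreters.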
diff --git a/lib/python3.7/site-packages/six-1.12.0.dist-info/RECORD b/lib/python3.7/site-packages/six-1.12.0.dist-info/RECORD new file mode 100644 index 0000000..c2634d8 --- /dev/null +++ b/lib/python3.7/site-packages/six-1.12.0.dist-info/RECORD @@ -0,0 +1,8 @@ +__pycache__/six.cpython-37.pyc,, +six-1.12.0.dist-info/INSTALLER,sha256=zuuue4knoyJ-UwPPXg8fezS7VCrXJQrAP7zeNuwvFQg,4 +six-1.12.0.dist-info/LICENSE,sha256=5zL1TaWPPpzwxI6LUSlIk2_Pc2G9WK-mOpo8OSv3lK0,1066 +six-1.12.0.dist-info/METADATA,sha256=CRdYkKPKCFJr7-qA8PDpBklGXfXJ3xu4mu5tkLBDL04,1940 +six-1.12.0.dist-info/RECORD,, +six-1.12.0.dist-info/WHEEL,sha256=_wJFdOYk7i3xxT8ElOkUJvOdOvfNGbR9g-bf6UQT6sU,110 +six-1.12.0.dist-info/top_level.txt,sha256=_iVH_iYEtEXnD8nYGQYpYFUvkUW9sEO1GYbkeKSAais,4 +six.py,sha256=h9jch2pS86y4R36pKRS3LOYUCVFNIJMRwjZ4fJDtJ44,32452 diff --git a/lib/python3.4/site-packages/six-1.11.0.dist-info/WHEEL b/lib/python3.7/site-packages/six-1.12.0.dist-info/WHEEL similarity index 70% rename from lib/python3.4/site-packages/six-1.11.0.dist-info/WHEEL rename to lib/python3.7/site-packages/six-1.12.0.dist-info/WHEEL index 8b6dd1b..c4bde30 100644 --- a/lib/python3.4/site-packages/six-1.11.0.dist-info/WHEEL +++ b/lib/python3.7/site-packages/six-1.12.0.dist-info/WHEEL @@ -1,5 +1,5 @@ Wheel-Version: 1.0 -Generator: bdist_wheel (0.29.0) +Generator: bdist_wheel (0.32.3) Root-Is-Purelib: true Tag: py2-none-any Tag: py3-none-any diff --git a/lib/python3.4/site-packages/six-1.11.0.dist-info/top_level.txt b/lib/python3.7/site-packages/six-1.12.0.dist-info/top_level.txt similarity index 100% rename from lib/python3.4/site-packages/six-1.11.0.dist-info/top_level.txt rename to lib/python3.7/site-packages/six-1.12.0.dist-info/top_level.txt diff --git a/lib/python3.7/site-packages/six.py b/lib/python3.7/site-packages/six.py new file mode 100644 index 0000000..89b2188 --- /dev/null +++ b/lib/python3.7/site-packages/six.py @@ -0,0 +1,952 @@ +# Copyright (c) 2010-2018 Benjamin Peterson +# +# Permission is hereby 
granted, free of charge, to any person obtaining a copy +# of this software and associated documentation files (the "Software"), to deal +# in the Software without restriction, including without limitation the rights +# to use, copy, modify, merge, publish, distribute, sublicense, and/or sell +# copies of the Software, and to permit persons to whom the Software is +# furnished to do so, subject to the following conditions: +# +# The above copyright notice and this permission notice shall be included in all +# copies or substantial portions of the Software. +# +# THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR +# IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, +# FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE +# AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER +# LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, +# OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE +# SOFTWARE. + +"""Utilities for writing code that runs on Python 2 and 3""" + +from __future__ import absolute_import + +import functools +import itertools +import operator +import sys +import types + +__author__ = "Benjamin Peterson <benjamin@python.org>" +__version__ = "1.12.0" + + +# Useful for very coarse version differentiation. +PY2 = sys.version_info[0] == 2 +PY3 = sys.version_info[0] == 3 +PY34 = sys.version_info[0:2] >= (3, 4) + +if PY3: + string_types = str, + integer_types = int, + class_types = type, + text_type = str + binary_type = bytes + + MAXSIZE = sys.maxsize +else: + string_types = basestring, + integer_types = (int, long) + class_types = (type, types.ClassType) + text_type = unicode + binary_type = str + + if sys.platform.startswith("java"): + # Jython always uses 32 bits. + MAXSIZE = int((1 << 31) - 1) + else: + # It's possible to have sizeof(long) != sizeof(Py_ssize_t). 
+ class X(object): + + def __len__(self): + return 1 << 31 + try: + len(X()) + except OverflowError: + # 32-bit + MAXSIZE = int((1 << 31) - 1) + else: + # 64-bit + MAXSIZE = int((1 << 63) - 1) + del X + + +def _add_doc(func, doc): + """Add documentation to a function.""" + func.__doc__ = doc + + +def _import_module(name): + """Import module, returning the module after the last dot.""" + __import__(name) + return sys.modules[name] + + +class _LazyDescr(object): + + def __init__(self, name): + self.name = name + + def __get__(self, obj, tp): + result = self._resolve() + setattr(obj, self.name, result) # Invokes __set__. + try: + # This is a bit ugly, but it avoids running this again by + # removing this descriptor. + delattr(obj.__class__, self.name) + except AttributeError: + pass + return result + + +class MovedModule(_LazyDescr): + + def __init__(self, name, old, new=None): + super(MovedModule, self).__init__(name) + if PY3: + if new is None: + new = name + self.mod = new + else: + self.mod = old + + def _resolve(self): + return _import_module(self.mod) + + def __getattr__(self, attr): + _module = self._resolve() + value = getattr(_module, attr) + setattr(self, attr, value) + return value + + +class _LazyModule(types.ModuleType): + + def __init__(self, name): + super(_LazyModule, self).__init__(name) + self.__doc__ = self.__class__.__doc__ + + def __dir__(self): + attrs = ["__doc__", "__name__"] + attrs += [attr.name for attr in self._moved_attributes] + return attrs + + # Subclasses should override this + _moved_attributes = [] + + +class MovedAttribute(_LazyDescr): + + def __init__(self, name, old_mod, new_mod, old_attr=None, new_attr=None): + super(MovedAttribute, self).__init__(name) + if PY3: + if new_mod is None: + new_mod = name + self.mod = new_mod + if new_attr is None: + if old_attr is None: + new_attr = name + else: + new_attr = old_attr + self.attr = new_attr + else: + self.mod = old_mod + if old_attr is None: + old_attr = name + self.attr = old_attr + 
+ def _resolve(self): + module = _import_module(self.mod) + return getattr(module, self.attr) + + +class _SixMetaPathImporter(object): + + """ + A meta path importer to import six.moves and its submodules. + + This class implements a PEP302 finder and loader. It should be compatible + with Python 2.5 and all existing versions of Python3 + """ + + def __init__(self, six_module_name): + self.name = six_module_name + self.known_modules = {} + + def _add_module(self, mod, *fullnames): + for fullname in fullnames: + self.known_modules[self.name + "." + fullname] = mod + + def _get_module(self, fullname): + return self.known_modules[self.name + "." + fullname] + + def find_module(self, fullname, path=None): + if fullname in self.known_modules: + return self + return None + + def __get_module(self, fullname): + try: + return self.known_modules[fullname] + except KeyError: + raise ImportError("This loader does not know module " + fullname) + + def load_module(self, fullname): + try: + # in case of a reload + return sys.modules[fullname] + except KeyError: + pass + mod = self.__get_module(fullname) + if isinstance(mod, MovedModule): + mod = mod._resolve() + else: + mod.__loader__ = self + sys.modules[fullname] = mod + return mod + + def is_package(self, fullname): + """ + Return true, if the named module is a package. 
+ + We need this method to get correct spec objects with + Python 3.4 (see PEP451) + """ + return hasattr(self.__get_module(fullname), "__path__") + + def get_code(self, fullname): + """Return None + + Required, if is_package is implemented""" + self.__get_module(fullname) # eventually raises ImportError + return None + get_source = get_code # same as get_code + +_importer = _SixMetaPathImporter(__name__) + + +class _MovedItems(_LazyModule): + + """Lazy loading of moved objects""" + __path__ = [] # mark as package + + +_moved_attributes = [ + MovedAttribute("cStringIO", "cStringIO", "io", "StringIO"), + MovedAttribute("filter", "itertools", "builtins", "ifilter", "filter"), + MovedAttribute("filterfalse", "itertools", "itertools", "ifilterfalse", "filterfalse"), + MovedAttribute("input", "__builtin__", "builtins", "raw_input", "input"), + MovedAttribute("intern", "__builtin__", "sys"), + MovedAttribute("map", "itertools", "builtins", "imap", "map"), + MovedAttribute("getcwd", "os", "os", "getcwdu", "getcwd"), + MovedAttribute("getcwdb", "os", "os", "getcwd", "getcwdb"), + MovedAttribute("getoutput", "commands", "subprocess"), + MovedAttribute("range", "__builtin__", "builtins", "xrange", "range"), + MovedAttribute("reload_module", "__builtin__", "importlib" if PY34 else "imp", "reload"), + MovedAttribute("reduce", "__builtin__", "functools"), + MovedAttribute("shlex_quote", "pipes", "shlex", "quote"), + MovedAttribute("StringIO", "StringIO", "io"), + MovedAttribute("UserDict", "UserDict", "collections"), + MovedAttribute("UserList", "UserList", "collections"), + MovedAttribute("UserString", "UserString", "collections"), + MovedAttribute("xrange", "__builtin__", "builtins", "xrange", "range"), + MovedAttribute("zip", "itertools", "builtins", "izip", "zip"), + MovedAttribute("zip_longest", "itertools", "itertools", "izip_longest", "zip_longest"), + MovedModule("builtins", "__builtin__"), + MovedModule("configparser", "ConfigParser"), + MovedModule("copyreg", 
"copy_reg"), + MovedModule("dbm_gnu", "gdbm", "dbm.gnu"), + MovedModule("_dummy_thread", "dummy_thread", "_dummy_thread"), + MovedModule("http_cookiejar", "cookielib", "http.cookiejar"), + MovedModule("http_cookies", "Cookie", "http.cookies"), + MovedModule("html_entities", "htmlentitydefs", "html.entities"), + MovedModule("html_parser", "HTMLParser", "html.parser"), + MovedModule("http_client", "httplib", "http.client"), + MovedModule("email_mime_base", "email.MIMEBase", "email.mime.base"), + MovedModule("email_mime_image", "email.MIMEImage", "email.mime.image"), + MovedModule("email_mime_multipart", "email.MIMEMultipart", "email.mime.multipart"), + MovedModule("email_mime_nonmultipart", "email.MIMENonMultipart", "email.mime.nonmultipart"), + MovedModule("email_mime_text", "email.MIMEText", "email.mime.text"), + MovedModule("BaseHTTPServer", "BaseHTTPServer", "http.server"), + MovedModule("CGIHTTPServer", "CGIHTTPServer", "http.server"), + MovedModule("SimpleHTTPServer", "SimpleHTTPServer", "http.server"), + MovedModule("cPickle", "cPickle", "pickle"), + MovedModule("queue", "Queue"), + MovedModule("reprlib", "repr"), + MovedModule("socketserver", "SocketServer"), + MovedModule("_thread", "thread", "_thread"), + MovedModule("tkinter", "Tkinter"), + MovedModule("tkinter_dialog", "Dialog", "tkinter.dialog"), + MovedModule("tkinter_filedialog", "FileDialog", "tkinter.filedialog"), + MovedModule("tkinter_scrolledtext", "ScrolledText", "tkinter.scrolledtext"), + MovedModule("tkinter_simpledialog", "SimpleDialog", "tkinter.simpledialog"), + MovedModule("tkinter_tix", "Tix", "tkinter.tix"), + MovedModule("tkinter_ttk", "ttk", "tkinter.ttk"), + MovedModule("tkinter_constants", "Tkconstants", "tkinter.constants"), + MovedModule("tkinter_dnd", "Tkdnd", "tkinter.dnd"), + MovedModule("tkinter_colorchooser", "tkColorChooser", + "tkinter.colorchooser"), + MovedModule("tkinter_commondialog", "tkCommonDialog", + "tkinter.commondialog"), + MovedModule("tkinter_tkfiledialog", 
"tkFileDialog", "tkinter.filedialog"), + MovedModule("tkinter_font", "tkFont", "tkinter.font"), + MovedModule("tkinter_messagebox", "tkMessageBox", "tkinter.messagebox"), + MovedModule("tkinter_tksimpledialog", "tkSimpleDialog", + "tkinter.simpledialog"), + MovedModule("urllib_parse", __name__ + ".moves.urllib_parse", "urllib.parse"), + MovedModule("urllib_error", __name__ + ".moves.urllib_error", "urllib.error"), + MovedModule("urllib", __name__ + ".moves.urllib", __name__ + ".moves.urllib"), + MovedModule("urllib_robotparser", "robotparser", "urllib.robotparser"), + MovedModule("xmlrpc_client", "xmlrpclib", "xmlrpc.client"), + MovedModule("xmlrpc_server", "SimpleXMLRPCServer", "xmlrpc.server"), +] +# Add windows specific modules. +if sys.platform == "win32": + _moved_attributes += [ + MovedModule("winreg", "_winreg"), + ] + +for attr in _moved_attributes: + setattr(_MovedItems, attr.name, attr) + if isinstance(attr, MovedModule): + _importer._add_module(attr, "moves." + attr.name) +del attr + +_MovedItems._moved_attributes = _moved_attributes + +moves = _MovedItems(__name__ + ".moves") +_importer._add_module(moves, "moves") + + +class Module_six_moves_urllib_parse(_LazyModule): + + """Lazy loading of moved objects in six.moves.urllib_parse""" + + +_urllib_parse_moved_attributes = [ + MovedAttribute("ParseResult", "urlparse", "urllib.parse"), + MovedAttribute("SplitResult", "urlparse", "urllib.parse"), + MovedAttribute("parse_qs", "urlparse", "urllib.parse"), + MovedAttribute("parse_qsl", "urlparse", "urllib.parse"), + MovedAttribute("urldefrag", "urlparse", "urllib.parse"), + MovedAttribute("urljoin", "urlparse", "urllib.parse"), + MovedAttribute("urlparse", "urlparse", "urllib.parse"), + MovedAttribute("urlsplit", "urlparse", "urllib.parse"), + MovedAttribute("urlunparse", "urlparse", "urllib.parse"), + MovedAttribute("urlunsplit", "urlparse", "urllib.parse"), + MovedAttribute("quote", "urllib", "urllib.parse"), + MovedAttribute("quote_plus", "urllib", 
"urllib.parse"), + MovedAttribute("unquote", "urllib", "urllib.parse"), + MovedAttribute("unquote_plus", "urllib", "urllib.parse"), + MovedAttribute("unquote_to_bytes", "urllib", "urllib.parse", "unquote", "unquote_to_bytes"), + MovedAttribute("urlencode", "urllib", "urllib.parse"), + MovedAttribute("splitquery", "urllib", "urllib.parse"), + MovedAttribute("splittag", "urllib", "urllib.parse"), + MovedAttribute("splituser", "urllib", "urllib.parse"), + MovedAttribute("splitvalue", "urllib", "urllib.parse"), + MovedAttribute("uses_fragment", "urlparse", "urllib.parse"), + MovedAttribute("uses_netloc", "urlparse", "urllib.parse"), + MovedAttribute("uses_params", "urlparse", "urllib.parse"), + MovedAttribute("uses_query", "urlparse", "urllib.parse"), + MovedAttribute("uses_relative", "urlparse", "urllib.parse"), +] +for attr in _urllib_parse_moved_attributes: + setattr(Module_six_moves_urllib_parse, attr.name, attr) +del attr + +Module_six_moves_urllib_parse._moved_attributes = _urllib_parse_moved_attributes + +_importer._add_module(Module_six_moves_urllib_parse(__name__ + ".moves.urllib_parse"), + "moves.urllib_parse", "moves.urllib.parse") + + +class Module_six_moves_urllib_error(_LazyModule): + + """Lazy loading of moved objects in six.moves.urllib_error""" + + +_urllib_error_moved_attributes = [ + MovedAttribute("URLError", "urllib2", "urllib.error"), + MovedAttribute("HTTPError", "urllib2", "urllib.error"), + MovedAttribute("ContentTooShortError", "urllib", "urllib.error"), +] +for attr in _urllib_error_moved_attributes: + setattr(Module_six_moves_urllib_error, attr.name, attr) +del attr + +Module_six_moves_urllib_error._moved_attributes = _urllib_error_moved_attributes + +_importer._add_module(Module_six_moves_urllib_error(__name__ + ".moves.urllib.error"), + "moves.urllib_error", "moves.urllib.error") + + +class Module_six_moves_urllib_request(_LazyModule): + + """Lazy loading of moved objects in six.moves.urllib_request""" + + +_urllib_request_moved_attributes 
= [ + MovedAttribute("urlopen", "urllib2", "urllib.request"), + MovedAttribute("install_opener", "urllib2", "urllib.request"), + MovedAttribute("build_opener", "urllib2", "urllib.request"), + MovedAttribute("pathname2url", "urllib", "urllib.request"), + MovedAttribute("url2pathname", "urllib", "urllib.request"), + MovedAttribute("getproxies", "urllib", "urllib.request"), + MovedAttribute("Request", "urllib2", "urllib.request"), + MovedAttribute("OpenerDirector", "urllib2", "urllib.request"), + MovedAttribute("HTTPDefaultErrorHandler", "urllib2", "urllib.request"), + MovedAttribute("HTTPRedirectHandler", "urllib2", "urllib.request"), + MovedAttribute("HTTPCookieProcessor", "urllib2", "urllib.request"), + MovedAttribute("ProxyHandler", "urllib2", "urllib.request"), + MovedAttribute("BaseHandler", "urllib2", "urllib.request"), + MovedAttribute("HTTPPasswordMgr", "urllib2", "urllib.request"), + MovedAttribute("HTTPPasswordMgrWithDefaultRealm", "urllib2", "urllib.request"), + MovedAttribute("AbstractBasicAuthHandler", "urllib2", "urllib.request"), + MovedAttribute("HTTPBasicAuthHandler", "urllib2", "urllib.request"), + MovedAttribute("ProxyBasicAuthHandler", "urllib2", "urllib.request"), + MovedAttribute("AbstractDigestAuthHandler", "urllib2", "urllib.request"), + MovedAttribute("HTTPDigestAuthHandler", "urllib2", "urllib.request"), + MovedAttribute("ProxyDigestAuthHandler", "urllib2", "urllib.request"), + MovedAttribute("HTTPHandler", "urllib2", "urllib.request"), + MovedAttribute("HTTPSHandler", "urllib2", "urllib.request"), + MovedAttribute("FileHandler", "urllib2", "urllib.request"), + MovedAttribute("FTPHandler", "urllib2", "urllib.request"), + MovedAttribute("CacheFTPHandler", "urllib2", "urllib.request"), + MovedAttribute("UnknownHandler", "urllib2", "urllib.request"), + MovedAttribute("HTTPErrorProcessor", "urllib2", "urllib.request"), + MovedAttribute("urlretrieve", "urllib", "urllib.request"), + MovedAttribute("urlcleanup", "urllib", "urllib.request"), + 
MovedAttribute("URLopener", "urllib", "urllib.request"), + MovedAttribute("FancyURLopener", "urllib", "urllib.request"), + MovedAttribute("proxy_bypass", "urllib", "urllib.request"), + MovedAttribute("parse_http_list", "urllib2", "urllib.request"), + MovedAttribute("parse_keqv_list", "urllib2", "urllib.request"), +] +for attr in _urllib_request_moved_attributes: + setattr(Module_six_moves_urllib_request, attr.name, attr) +del attr + +Module_six_moves_urllib_request._moved_attributes = _urllib_request_moved_attributes + +_importer._add_module(Module_six_moves_urllib_request(__name__ + ".moves.urllib.request"), + "moves.urllib_request", "moves.urllib.request") + + +class Module_six_moves_urllib_response(_LazyModule): + + """Lazy loading of moved objects in six.moves.urllib_response""" + + +_urllib_response_moved_attributes = [ + MovedAttribute("addbase", "urllib", "urllib.response"), + MovedAttribute("addclosehook", "urllib", "urllib.response"), + MovedAttribute("addinfo", "urllib", "urllib.response"), + MovedAttribute("addinfourl", "urllib", "urllib.response"), +] +for attr in _urllib_response_moved_attributes: + setattr(Module_six_moves_urllib_response, attr.name, attr) +del attr + +Module_six_moves_urllib_response._moved_attributes = _urllib_response_moved_attributes + +_importer._add_module(Module_six_moves_urllib_response(__name__ + ".moves.urllib.response"), + "moves.urllib_response", "moves.urllib.response") + + +class Module_six_moves_urllib_robotparser(_LazyModule): + + """Lazy loading of moved objects in six.moves.urllib_robotparser""" + + +_urllib_robotparser_moved_attributes = [ + MovedAttribute("RobotFileParser", "robotparser", "urllib.robotparser"), +] +for attr in _urllib_robotparser_moved_attributes: + setattr(Module_six_moves_urllib_robotparser, attr.name, attr) +del attr + +Module_six_moves_urllib_robotparser._moved_attributes = _urllib_robotparser_moved_attributes + +_importer._add_module(Module_six_moves_urllib_robotparser(__name__ + 
".moves.urllib.robotparser"), + "moves.urllib_robotparser", "moves.urllib.robotparser") + + +class Module_six_moves_urllib(types.ModuleType): + + """Create a six.moves.urllib namespace that resembles the Python 3 namespace""" + __path__ = [] # mark as package + parse = _importer._get_module("moves.urllib_parse") + error = _importer._get_module("moves.urllib_error") + request = _importer._get_module("moves.urllib_request") + response = _importer._get_module("moves.urllib_response") + robotparser = _importer._get_module("moves.urllib_robotparser") + + def __dir__(self): + return ['parse', 'error', 'request', 'response', 'robotparser'] + +_importer._add_module(Module_six_moves_urllib(__name__ + ".moves.urllib"), + "moves.urllib") + + +def add_move(move): + """Add an item to six.moves.""" + setattr(_MovedItems, move.name, move) + + +def remove_move(name): + """Remove item from six.moves.""" + try: + delattr(_MovedItems, name) + except AttributeError: + try: + del moves.__dict__[name] + except KeyError: + raise AttributeError("no such move, %r" % (name,)) + + +if PY3: + _meth_func = "__func__" + _meth_self = "__self__" + + _func_closure = "__closure__" + _func_code = "__code__" + _func_defaults = "__defaults__" + _func_globals = "__globals__" +else: + _meth_func = "im_func" + _meth_self = "im_self" + + _func_closure = "func_closure" + _func_code = "func_code" + _func_defaults = "func_defaults" + _func_globals = "func_globals" + + +try: + advance_iterator = next +except NameError: + def advance_iterator(it): + return it.next() +next = advance_iterator + + +try: + callable = callable +except NameError: + def callable(obj): + return any("__call__" in klass.__dict__ for klass in type(obj).__mro__) + + +if PY3: + def get_unbound_function(unbound): + return unbound + + create_bound_method = types.MethodType + + def create_unbound_method(func, cls): + return func + + Iterator = object +else: + def get_unbound_function(unbound): + return unbound.im_func + + def 
create_bound_method(func, obj): + return types.MethodType(func, obj, obj.__class__) + + def create_unbound_method(func, cls): + return types.MethodType(func, None, cls) + + class Iterator(object): + + def next(self): + return type(self).__next__(self) + + callable = callable +_add_doc(get_unbound_function, + """Get the function out of a possibly unbound function""") + + +get_method_function = operator.attrgetter(_meth_func) +get_method_self = operator.attrgetter(_meth_self) +get_function_closure = operator.attrgetter(_func_closure) +get_function_code = operator.attrgetter(_func_code) +get_function_defaults = operator.attrgetter(_func_defaults) +get_function_globals = operator.attrgetter(_func_globals) + + +if PY3: + def iterkeys(d, **kw): + return iter(d.keys(**kw)) + + def itervalues(d, **kw): + return iter(d.values(**kw)) + + def iteritems(d, **kw): + return iter(d.items(**kw)) + + def iterlists(d, **kw): + return iter(d.lists(**kw)) + + viewkeys = operator.methodcaller("keys") + + viewvalues = operator.methodcaller("values") + + viewitems = operator.methodcaller("items") +else: + def iterkeys(d, **kw): + return d.iterkeys(**kw) + + def itervalues(d, **kw): + return d.itervalues(**kw) + + def iteritems(d, **kw): + return d.iteritems(**kw) + + def iterlists(d, **kw): + return d.iterlists(**kw) + + viewkeys = operator.methodcaller("viewkeys") + + viewvalues = operator.methodcaller("viewvalues") + + viewitems = operator.methodcaller("viewitems") + +_add_doc(iterkeys, "Return an iterator over the keys of a dictionary.") +_add_doc(itervalues, "Return an iterator over the values of a dictionary.") +_add_doc(iteritems, + "Return an iterator over the (key, value) pairs of a dictionary.") +_add_doc(iterlists, + "Return an iterator over the (key, [values]) pairs of a dictionary.") + + +if PY3: + def b(s): + return s.encode("latin-1") + + def u(s): + return s + unichr = chr + import struct + int2byte = struct.Struct(">B").pack + del struct + byte2int = 
operator.itemgetter(0) + indexbytes = operator.getitem + iterbytes = iter + import io + StringIO = io.StringIO + BytesIO = io.BytesIO + _assertCountEqual = "assertCountEqual" + if sys.version_info[1] <= 1: + _assertRaisesRegex = "assertRaisesRegexp" + _assertRegex = "assertRegexpMatches" + else: + _assertRaisesRegex = "assertRaisesRegex" + _assertRegex = "assertRegex" +else: + def b(s): + return s + # Workaround for standalone backslash + + def u(s): + return unicode(s.replace(r'\\', r'\\\\'), "unicode_escape") + unichr = unichr + int2byte = chr + + def byte2int(bs): + return ord(bs[0]) + + def indexbytes(buf, i): + return ord(buf[i]) + iterbytes = functools.partial(itertools.imap, ord) + import StringIO + StringIO = BytesIO = StringIO.StringIO + _assertCountEqual = "assertItemsEqual" + _assertRaisesRegex = "assertRaisesRegexp" + _assertRegex = "assertRegexpMatches" +_add_doc(b, """Byte literal""") +_add_doc(u, """Text literal""") + + +def assertCountEqual(self, *args, **kwargs): + return getattr(self, _assertCountEqual)(*args, **kwargs) + + +def assertRaisesRegex(self, *args, **kwargs): + return getattr(self, _assertRaisesRegex)(*args, **kwargs) + + +def assertRegex(self, *args, **kwargs): + return getattr(self, _assertRegex)(*args, **kwargs) + + +if PY3: + exec_ = getattr(moves.builtins, "exec") + + def reraise(tp, value, tb=None): + try: + if value is None: + value = tp() + if value.__traceback__ is not tb: + raise value.with_traceback(tb) + raise value + finally: + value = None + tb = None + +else: + def exec_(_code_, _globs_=None, _locs_=None): + """Execute code in a namespace.""" + if _globs_ is None: + frame = sys._getframe(1) + _globs_ = frame.f_globals + if _locs_ is None: + _locs_ = frame.f_locals + del frame + elif _locs_ is None: + _locs_ = _globs_ + exec("""exec _code_ in _globs_, _locs_""") + + exec_("""def reraise(tp, value, tb=None): + try: + raise tp, value, tb + finally: + tb = None +""") + + +if sys.version_info[:2] == (3, 2): + exec_("""def 
raise_from(value, from_value): + try: + if from_value is None: + raise value + raise value from from_value + finally: + value = None +""") +elif sys.version_info[:2] > (3, 2): + exec_("""def raise_from(value, from_value): + try: + raise value from from_value + finally: + value = None +""") +else: + def raise_from(value, from_value): + raise value + + +print_ = getattr(moves.builtins, "print", None) +if print_ is None: + def print_(*args, **kwargs): + """The new-style print function for Python 2.4 and 2.5.""" + fp = kwargs.pop("file", sys.stdout) + if fp is None: + return + + def write(data): + if not isinstance(data, basestring): + data = str(data) + # If the file has an encoding, encode unicode with it. + if (isinstance(fp, file) and + isinstance(data, unicode) and + fp.encoding is not None): + errors = getattr(fp, "errors", None) + if errors is None: + errors = "strict" + data = data.encode(fp.encoding, errors) + fp.write(data) + want_unicode = False + sep = kwargs.pop("sep", None) + if sep is not None: + if isinstance(sep, unicode): + want_unicode = True + elif not isinstance(sep, str): + raise TypeError("sep must be None or a string") + end = kwargs.pop("end", None) + if end is not None: + if isinstance(end, unicode): + want_unicode = True + elif not isinstance(end, str): + raise TypeError("end must be None or a string") + if kwargs: + raise TypeError("invalid keyword arguments to print()") + if not want_unicode: + for arg in args: + if isinstance(arg, unicode): + want_unicode = True + break + if want_unicode: + newline = unicode("\n") + space = unicode(" ") + else: + newline = "\n" + space = " " + if sep is None: + sep = space + if end is None: + end = newline + for i, arg in enumerate(args): + if i: + write(sep) + write(arg) + write(end) +if sys.version_info[:2] < (3, 3): + _print = print_ + + def print_(*args, **kwargs): + fp = kwargs.get("file", sys.stdout) + flush = kwargs.pop("flush", False) + _print(*args, **kwargs) + if flush and fp is not None: + 
fp.flush() + +_add_doc(reraise, """Reraise an exception.""") + +if sys.version_info[0:2] < (3, 4): + def wraps(wrapped, assigned=functools.WRAPPER_ASSIGNMENTS, + updated=functools.WRAPPER_UPDATES): + def wrapper(f): + f = functools.wraps(wrapped, assigned, updated)(f) + f.__wrapped__ = wrapped + return f + return wrapper +else: + wraps = functools.wraps + + +def with_metaclass(meta, *bases): + """Create a base class with a metaclass.""" + # This requires a bit of explanation: the basic idea is to make a dummy + # metaclass for one level of class instantiation that replaces itself with + # the actual metaclass. + class metaclass(type): + + def __new__(cls, name, this_bases, d): + return meta(name, bases, d) + + @classmethod + def __prepare__(cls, name, this_bases): + return meta.__prepare__(name, bases) + return type.__new__(metaclass, 'temporary_class', (), {}) + + +def add_metaclass(metaclass): + """Class decorator for creating a class with a metaclass.""" + def wrapper(cls): + orig_vars = cls.__dict__.copy() + slots = orig_vars.get('__slots__') + if slots is not None: + if isinstance(slots, str): + slots = [slots] + for slots_var in slots: + orig_vars.pop(slots_var) + orig_vars.pop('__dict__', None) + orig_vars.pop('__weakref__', None) + if hasattr(cls, '__qualname__'): + orig_vars['__qualname__'] = cls.__qualname__ + return metaclass(cls.__name__, cls.__bases__, orig_vars) + return wrapper + + +def ensure_binary(s, encoding='utf-8', errors='strict'): + """Coerce **s** to six.binary_type. + + For Python 2: + - `unicode` -> encoded to `str` + - `str` -> `str` + + For Python 3: + - `str` -> encoded to `bytes` + - `bytes` -> `bytes` + """ + if isinstance(s, text_type): + return s.encode(encoding, errors) + elif isinstance(s, binary_type): + return s + else: + raise TypeError("not expecting type '%s'" % type(s)) + + +def ensure_str(s, encoding='utf-8', errors='strict'): + """Coerce *s* to `str`. 
+ + For Python 2: + - `unicode` -> encoded to `str` + - `str` -> `str` + + For Python 3: + - `str` -> `str` + - `bytes` -> decoded to `str` + """ + if not isinstance(s, (text_type, binary_type)): + raise TypeError("not expecting type '%s'" % type(s)) + if PY2 and isinstance(s, text_type): + s = s.encode(encoding, errors) + elif PY3 and isinstance(s, binary_type): + s = s.decode(encoding, errors) + return s + + +def ensure_text(s, encoding='utf-8', errors='strict'): + """Coerce *s* to six.text_type. + + For Python 2: + - `unicode` -> `unicode` + - `str` -> `unicode` + + For Python 3: + - `str` -> `str` + - `bytes` -> decoded to `str` + """ + if isinstance(s, binary_type): + return s.decode(encoding, errors) + elif isinstance(s, text_type): + return s + else: + raise TypeError("not expecting type '%s'" % type(s)) + + + +def python_2_unicode_compatible(klass): + """ + A decorator that defines __unicode__ and __str__ methods under Python 2. + Under Python 3 it does nothing. + + To support Python 2 and 3 with a single code base, define a __str__ method + returning text and apply this decorator to the class. + """ + if PY2: + if '__str__' not in klass.__dict__: + raise ValueError("@python_2_unicode_compatible cannot be applied " + "to %s because it doesn't define __str__()." % + klass.__name__) + klass.__unicode__ = klass.__str__ + klass.__str__ = lambda self: self.__unicode__().encode('utf-8') + return klass + + +# Complete the moves implementation. +# This code is at the end of this module to speed up module loading. +# Turn this module into a package. +__path__ = [] # required for PEP 302 and PEP 451 +__package__ = __name__ # see PEP 366 @ReservedAssignment +if globals().get("__spec__") is not None: + __spec__.submodule_search_locations = [] # PEP 451 @UndefinedVariable +# Remove other six meta path importers, since they cause problems. This can +# happen if six is removed from sys.modules and then reloaded. (Setuptools does +# this for some reason.) 
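(Note, outside the diff: the `_SixMetaPathImporter` registered on `sys.meta_path` above uses the legacy PEP 302 `find_module`/`load_module` pair; on current Python the equivalent hooks are `find_spec`/`exec_module`. A hedged sketch of the same idea with the modern protocol — the class and module names are made up for illustration:)

```python
import importlib.util
import sys

class TinyImporter:
    """Meta path finder/loader serving one synthetic module.

    Uses the modern find_spec/exec_module protocol (PEP 451); six's
    importer predates it and implements find_module/load_module instead.
    """

    def __init__(self, fullname):
        self.fullname = fullname

    def find_spec(self, fullname, path=None, target=None):
        if fullname == self.fullname:
            return importlib.util.spec_from_loader(fullname, self)
        return None  # not ours; let other finders try

    def create_module(self, spec):
        return None  # use the default module object

    def exec_module(self, module):
        module.answer = 42  # populate the synthetic module

sys.meta_path.append(TinyImporter("demo_virtual_mod"))

import demo_virtual_mod
print(demo_virtual_mod.answer)  # prints 42
```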
+if sys.meta_path: + for i, importer in enumerate(sys.meta_path): + # Here's some real nastiness: Another "instance" of the six module might + # be floating around. Therefore, we can't use isinstance() to check for + # the six meta path importer, since the other six instance will have + # inserted an importer with different class. + if (type(importer).__name__ == "_SixMetaPathImporter" and + importer.name == __name__): + del sys.meta_path[i] + break + del i, importer +# Finally, add the importer to the meta path import hook. +sys.meta_path.append(_importer) diff --git a/lib/python3.4/site-packages/sqlalchemy/__init__.py b/lib/python3.7/site-packages/sqlalchemy/__init__.py similarity index 100% rename from lib/python3.4/site-packages/sqlalchemy/__init__.py rename to lib/python3.7/site-packages/sqlalchemy/__init__.py diff --git a/lib/python3.4/site-packages/sqlalchemy/connectors/__init__.py b/lib/python3.7/site-packages/sqlalchemy/connectors/__init__.py similarity index 100% rename from lib/python3.4/site-packages/sqlalchemy/connectors/__init__.py rename to lib/python3.7/site-packages/sqlalchemy/connectors/__init__.py diff --git a/lib/python3.4/site-packages/sqlalchemy/connectors/mxodbc.py b/lib/python3.7/site-packages/sqlalchemy/connectors/mxodbc.py similarity index 100% rename from lib/python3.4/site-packages/sqlalchemy/connectors/mxodbc.py rename to lib/python3.7/site-packages/sqlalchemy/connectors/mxodbc.py diff --git a/lib/python3.4/site-packages/sqlalchemy/connectors/pyodbc.py b/lib/python3.7/site-packages/sqlalchemy/connectors/pyodbc.py similarity index 100% rename from lib/python3.4/site-packages/sqlalchemy/connectors/pyodbc.py rename to lib/python3.7/site-packages/sqlalchemy/connectors/pyodbc.py diff --git a/lib/python3.4/site-packages/sqlalchemy/connectors/zxJDBC.py b/lib/python3.7/site-packages/sqlalchemy/connectors/zxJDBC.py similarity index 100% rename from lib/python3.4/site-packages/sqlalchemy/connectors/zxJDBC.py rename to 
lib/python3.7/site-packages/sqlalchemy/connectors/zxJDBC.py diff --git a/lib/python3.7/site-packages/sqlalchemy/cprocessors.cpython-35m-x86_64-linux-gnu.so b/lib/python3.7/site-packages/sqlalchemy/cprocessors.cpython-35m-x86_64-linux-gnu.so new file mode 100755 index 0000000..d245100 Binary files /dev/null and b/lib/python3.7/site-packages/sqlalchemy/cprocessors.cpython-35m-x86_64-linux-gnu.so differ diff --git a/lib/python3.7/site-packages/sqlalchemy/cprocessors.cpython-36m-x86_64-linux-gnu.so b/lib/python3.7/site-packages/sqlalchemy/cprocessors.cpython-36m-x86_64-linux-gnu.so new file mode 100755 index 0000000..80daff0 Binary files /dev/null and b/lib/python3.7/site-packages/sqlalchemy/cprocessors.cpython-36m-x86_64-linux-gnu.so differ diff --git a/lib/python3.7/site-packages/sqlalchemy/cprocessors.cpython-37m-x86_64-linux-gnu.so b/lib/python3.7/site-packages/sqlalchemy/cprocessors.cpython-37m-x86_64-linux-gnu.so new file mode 100755 index 0000000..08ed2f1 Binary files /dev/null and b/lib/python3.7/site-packages/sqlalchemy/cprocessors.cpython-37m-x86_64-linux-gnu.so differ diff --git a/lib/python3.7/site-packages/sqlalchemy/cresultproxy.cpython-35m-x86_64-linux-gnu.so b/lib/python3.7/site-packages/sqlalchemy/cresultproxy.cpython-35m-x86_64-linux-gnu.so new file mode 100755 index 0000000..2302fb2 Binary files /dev/null and b/lib/python3.7/site-packages/sqlalchemy/cresultproxy.cpython-35m-x86_64-linux-gnu.so differ diff --git a/lib/python3.7/site-packages/sqlalchemy/cresultproxy.cpython-36m-x86_64-linux-gnu.so b/lib/python3.7/site-packages/sqlalchemy/cresultproxy.cpython-36m-x86_64-linux-gnu.so new file mode 100755 index 0000000..c1105e6 Binary files /dev/null and b/lib/python3.7/site-packages/sqlalchemy/cresultproxy.cpython-36m-x86_64-linux-gnu.so differ diff --git a/lib/python3.7/site-packages/sqlalchemy/cresultproxy.cpython-37m-x86_64-linux-gnu.so b/lib/python3.7/site-packages/sqlalchemy/cresultproxy.cpython-37m-x86_64-linux-gnu.so new file mode 100755 index 
0000000..90b9c93 Binary files /dev/null and b/lib/python3.7/site-packages/sqlalchemy/cresultproxy.cpython-37m-x86_64-linux-gnu.so differ diff --git a/lib/python3.7/site-packages/sqlalchemy/cutils.cpython-35m-x86_64-linux-gnu.so b/lib/python3.7/site-packages/sqlalchemy/cutils.cpython-35m-x86_64-linux-gnu.so new file mode 100755 index 0000000..1edf4b0 Binary files /dev/null and b/lib/python3.7/site-packages/sqlalchemy/cutils.cpython-35m-x86_64-linux-gnu.so differ diff --git a/lib/python3.7/site-packages/sqlalchemy/cutils.cpython-36m-x86_64-linux-gnu.so b/lib/python3.7/site-packages/sqlalchemy/cutils.cpython-36m-x86_64-linux-gnu.so new file mode 100755 index 0000000..6a8b36c Binary files /dev/null and b/lib/python3.7/site-packages/sqlalchemy/cutils.cpython-36m-x86_64-linux-gnu.so differ diff --git a/lib/python3.7/site-packages/sqlalchemy/cutils.cpython-37m-x86_64-linux-gnu.so b/lib/python3.7/site-packages/sqlalchemy/cutils.cpython-37m-x86_64-linux-gnu.so new file mode 100755 index 0000000..3e22d8c Binary files /dev/null and b/lib/python3.7/site-packages/sqlalchemy/cutils.cpython-37m-x86_64-linux-gnu.so differ diff --git a/lib/python3.4/site-packages/sqlalchemy/databases/__init__.py b/lib/python3.7/site-packages/sqlalchemy/databases/__init__.py similarity index 100% rename from lib/python3.4/site-packages/sqlalchemy/databases/__init__.py rename to lib/python3.7/site-packages/sqlalchemy/databases/__init__.py diff --git a/lib/python3.4/site-packages/sqlalchemy/dialects/__init__.py b/lib/python3.7/site-packages/sqlalchemy/dialects/__init__.py similarity index 100% rename from lib/python3.4/site-packages/sqlalchemy/dialects/__init__.py rename to lib/python3.7/site-packages/sqlalchemy/dialects/__init__.py diff --git a/lib/python3.4/site-packages/sqlalchemy/dialects/firebird/__init__.py b/lib/python3.7/site-packages/sqlalchemy/dialects/firebird/__init__.py similarity index 100% rename from lib/python3.4/site-packages/sqlalchemy/dialects/firebird/__init__.py rename to 
lib/python3.7/site-packages/sqlalchemy/dialects/firebird/__init__.py diff --git a/lib/python3.4/site-packages/sqlalchemy/dialects/firebird/base.py b/lib/python3.7/site-packages/sqlalchemy/dialects/firebird/base.py similarity index 100% rename from lib/python3.4/site-packages/sqlalchemy/dialects/firebird/base.py rename to lib/python3.7/site-packages/sqlalchemy/dialects/firebird/base.py diff --git a/lib/python3.4/site-packages/sqlalchemy/dialects/firebird/fdb.py b/lib/python3.7/site-packages/sqlalchemy/dialects/firebird/fdb.py similarity index 100% rename from lib/python3.4/site-packages/sqlalchemy/dialects/firebird/fdb.py rename to lib/python3.7/site-packages/sqlalchemy/dialects/firebird/fdb.py diff --git a/lib/python3.4/site-packages/sqlalchemy/dialects/firebird/kinterbasdb.py b/lib/python3.7/site-packages/sqlalchemy/dialects/firebird/kinterbasdb.py similarity index 100% rename from lib/python3.4/site-packages/sqlalchemy/dialects/firebird/kinterbasdb.py rename to lib/python3.7/site-packages/sqlalchemy/dialects/firebird/kinterbasdb.py diff --git a/lib/python3.4/site-packages/sqlalchemy/dialects/mssql/__init__.py b/lib/python3.7/site-packages/sqlalchemy/dialects/mssql/__init__.py similarity index 100% rename from lib/python3.4/site-packages/sqlalchemy/dialects/mssql/__init__.py rename to lib/python3.7/site-packages/sqlalchemy/dialects/mssql/__init__.py diff --git a/lib/python3.4/site-packages/sqlalchemy/dialects/mssql/adodbapi.py b/lib/python3.7/site-packages/sqlalchemy/dialects/mssql/adodbapi.py similarity index 100% rename from lib/python3.4/site-packages/sqlalchemy/dialects/mssql/adodbapi.py rename to lib/python3.7/site-packages/sqlalchemy/dialects/mssql/adodbapi.py diff --git a/lib/python3.4/site-packages/sqlalchemy/dialects/mssql/base.py b/lib/python3.7/site-packages/sqlalchemy/dialects/mssql/base.py similarity index 100% rename from lib/python3.4/site-packages/sqlalchemy/dialects/mssql/base.py rename to 
lib/python3.7/site-packages/sqlalchemy/dialects/mssql/base.py diff --git a/lib/python3.4/site-packages/sqlalchemy/dialects/mssql/information_schema.py b/lib/python3.7/site-packages/sqlalchemy/dialects/mssql/information_schema.py similarity index 100% rename from lib/python3.4/site-packages/sqlalchemy/dialects/mssql/information_schema.py rename to lib/python3.7/site-packages/sqlalchemy/dialects/mssql/information_schema.py diff --git a/lib/python3.4/site-packages/sqlalchemy/dialects/mssql/mxodbc.py b/lib/python3.7/site-packages/sqlalchemy/dialects/mssql/mxodbc.py similarity index 100% rename from lib/python3.4/site-packages/sqlalchemy/dialects/mssql/mxodbc.py rename to lib/python3.7/site-packages/sqlalchemy/dialects/mssql/mxodbc.py diff --git a/lib/python3.4/site-packages/sqlalchemy/dialects/mssql/pymssql.py b/lib/python3.7/site-packages/sqlalchemy/dialects/mssql/pymssql.py similarity index 100% rename from lib/python3.4/site-packages/sqlalchemy/dialects/mssql/pymssql.py rename to lib/python3.7/site-packages/sqlalchemy/dialects/mssql/pymssql.py diff --git a/lib/python3.4/site-packages/sqlalchemy/dialects/mssql/pyodbc.py b/lib/python3.7/site-packages/sqlalchemy/dialects/mssql/pyodbc.py similarity index 100% rename from lib/python3.4/site-packages/sqlalchemy/dialects/mssql/pyodbc.py rename to lib/python3.7/site-packages/sqlalchemy/dialects/mssql/pyodbc.py diff --git a/lib/python3.4/site-packages/sqlalchemy/dialects/mssql/zxjdbc.py b/lib/python3.7/site-packages/sqlalchemy/dialects/mssql/zxjdbc.py similarity index 100% rename from lib/python3.4/site-packages/sqlalchemy/dialects/mssql/zxjdbc.py rename to lib/python3.7/site-packages/sqlalchemy/dialects/mssql/zxjdbc.py diff --git a/lib/python3.4/site-packages/sqlalchemy/dialects/mysql/__init__.py b/lib/python3.7/site-packages/sqlalchemy/dialects/mysql/__init__.py similarity index 100% rename from lib/python3.4/site-packages/sqlalchemy/dialects/mysql/__init__.py rename to 
lib/python3.7/site-packages/sqlalchemy/dialects/mysql/__init__.py diff --git a/lib/python3.4/site-packages/sqlalchemy/dialects/mysql/base.py b/lib/python3.7/site-packages/sqlalchemy/dialects/mysql/base.py similarity index 100% rename from lib/python3.4/site-packages/sqlalchemy/dialects/mysql/base.py rename to lib/python3.7/site-packages/sqlalchemy/dialects/mysql/base.py diff --git a/lib/python3.4/site-packages/sqlalchemy/dialects/mysql/cymysql.py b/lib/python3.7/site-packages/sqlalchemy/dialects/mysql/cymysql.py similarity index 100% rename from lib/python3.4/site-packages/sqlalchemy/dialects/mysql/cymysql.py rename to lib/python3.7/site-packages/sqlalchemy/dialects/mysql/cymysql.py diff --git a/lib/python3.4/site-packages/sqlalchemy/dialects/mysql/gaerdbms.py b/lib/python3.7/site-packages/sqlalchemy/dialects/mysql/gaerdbms.py similarity index 100% rename from lib/python3.4/site-packages/sqlalchemy/dialects/mysql/gaerdbms.py rename to lib/python3.7/site-packages/sqlalchemy/dialects/mysql/gaerdbms.py diff --git a/lib/python3.4/site-packages/sqlalchemy/dialects/mysql/mysqlconnector.py b/lib/python3.7/site-packages/sqlalchemy/dialects/mysql/mysqlconnector.py similarity index 100% rename from lib/python3.4/site-packages/sqlalchemy/dialects/mysql/mysqlconnector.py rename to lib/python3.7/site-packages/sqlalchemy/dialects/mysql/mysqlconnector.py diff --git a/lib/python3.4/site-packages/sqlalchemy/dialects/mysql/mysqldb.py b/lib/python3.7/site-packages/sqlalchemy/dialects/mysql/mysqldb.py similarity index 100% rename from lib/python3.4/site-packages/sqlalchemy/dialects/mysql/mysqldb.py rename to lib/python3.7/site-packages/sqlalchemy/dialects/mysql/mysqldb.py diff --git a/lib/python3.4/site-packages/sqlalchemy/dialects/mysql/oursql.py b/lib/python3.7/site-packages/sqlalchemy/dialects/mysql/oursql.py similarity index 100% rename from lib/python3.4/site-packages/sqlalchemy/dialects/mysql/oursql.py rename to lib/python3.7/site-packages/sqlalchemy/dialects/mysql/oursql.py 
diff --git a/lib/python3.4/site-packages/sqlalchemy/dialects/mysql/pymysql.py b/lib/python3.7/site-packages/sqlalchemy/dialects/mysql/pymysql.py similarity index 100% rename from lib/python3.4/site-packages/sqlalchemy/dialects/mysql/pymysql.py rename to lib/python3.7/site-packages/sqlalchemy/dialects/mysql/pymysql.py diff --git a/lib/python3.4/site-packages/sqlalchemy/dialects/mysql/pyodbc.py b/lib/python3.7/site-packages/sqlalchemy/dialects/mysql/pyodbc.py similarity index 100% rename from lib/python3.4/site-packages/sqlalchemy/dialects/mysql/pyodbc.py rename to lib/python3.7/site-packages/sqlalchemy/dialects/mysql/pyodbc.py diff --git a/lib/python3.4/site-packages/sqlalchemy/dialects/mysql/zxjdbc.py b/lib/python3.7/site-packages/sqlalchemy/dialects/mysql/zxjdbc.py similarity index 100% rename from lib/python3.4/site-packages/sqlalchemy/dialects/mysql/zxjdbc.py rename to lib/python3.7/site-packages/sqlalchemy/dialects/mysql/zxjdbc.py diff --git a/lib/python3.4/site-packages/sqlalchemy/dialects/oracle/__init__.py b/lib/python3.7/site-packages/sqlalchemy/dialects/oracle/__init__.py similarity index 100% rename from lib/python3.4/site-packages/sqlalchemy/dialects/oracle/__init__.py rename to lib/python3.7/site-packages/sqlalchemy/dialects/oracle/__init__.py diff --git a/lib/python3.4/site-packages/sqlalchemy/dialects/oracle/base.py b/lib/python3.7/site-packages/sqlalchemy/dialects/oracle/base.py similarity index 100% rename from lib/python3.4/site-packages/sqlalchemy/dialects/oracle/base.py rename to lib/python3.7/site-packages/sqlalchemy/dialects/oracle/base.py diff --git a/lib/python3.4/site-packages/sqlalchemy/dialects/oracle/cx_oracle.py b/lib/python3.7/site-packages/sqlalchemy/dialects/oracle/cx_oracle.py similarity index 100% rename from lib/python3.4/site-packages/sqlalchemy/dialects/oracle/cx_oracle.py rename to lib/python3.7/site-packages/sqlalchemy/dialects/oracle/cx_oracle.py diff --git a/lib/python3.4/site-packages/sqlalchemy/dialects/oracle/zxjdbc.py 
b/lib/python3.7/site-packages/sqlalchemy/dialects/oracle/zxjdbc.py similarity index 100% rename from lib/python3.4/site-packages/sqlalchemy/dialects/oracle/zxjdbc.py rename to lib/python3.7/site-packages/sqlalchemy/dialects/oracle/zxjdbc.py diff --git a/lib/python3.4/site-packages/sqlalchemy/dialects/postgres.py b/lib/python3.7/site-packages/sqlalchemy/dialects/postgres.py similarity index 100% rename from lib/python3.4/site-packages/sqlalchemy/dialects/postgres.py rename to lib/python3.7/site-packages/sqlalchemy/dialects/postgres.py diff --git a/lib/python3.4/site-packages/sqlalchemy/dialects/postgresql/__init__.py b/lib/python3.7/site-packages/sqlalchemy/dialects/postgresql/__init__.py similarity index 100% rename from lib/python3.4/site-packages/sqlalchemy/dialects/postgresql/__init__.py rename to lib/python3.7/site-packages/sqlalchemy/dialects/postgresql/__init__.py diff --git a/lib/python3.4/site-packages/sqlalchemy/dialects/postgresql/base.py b/lib/python3.7/site-packages/sqlalchemy/dialects/postgresql/base.py similarity index 100% rename from lib/python3.4/site-packages/sqlalchemy/dialects/postgresql/base.py rename to lib/python3.7/site-packages/sqlalchemy/dialects/postgresql/base.py diff --git a/lib/python3.4/site-packages/sqlalchemy/dialects/postgresql/constraints.py b/lib/python3.7/site-packages/sqlalchemy/dialects/postgresql/constraints.py similarity index 100% rename from lib/python3.4/site-packages/sqlalchemy/dialects/postgresql/constraints.py rename to lib/python3.7/site-packages/sqlalchemy/dialects/postgresql/constraints.py diff --git a/lib/python3.4/site-packages/sqlalchemy/dialects/postgresql/hstore.py b/lib/python3.7/site-packages/sqlalchemy/dialects/postgresql/hstore.py similarity index 100% rename from lib/python3.4/site-packages/sqlalchemy/dialects/postgresql/hstore.py rename to lib/python3.7/site-packages/sqlalchemy/dialects/postgresql/hstore.py diff --git a/lib/python3.4/site-packages/sqlalchemy/dialects/postgresql/json.py 
b/lib/python3.7/site-packages/sqlalchemy/dialects/postgresql/json.py similarity index 100% rename from lib/python3.4/site-packages/sqlalchemy/dialects/postgresql/json.py rename to lib/python3.7/site-packages/sqlalchemy/dialects/postgresql/json.py diff --git a/lib/python3.4/site-packages/sqlalchemy/dialects/postgresql/pg8000.py b/lib/python3.7/site-packages/sqlalchemy/dialects/postgresql/pg8000.py similarity index 100% rename from lib/python3.4/site-packages/sqlalchemy/dialects/postgresql/pg8000.py rename to lib/python3.7/site-packages/sqlalchemy/dialects/postgresql/pg8000.py diff --git a/lib/python3.4/site-packages/sqlalchemy/dialects/postgresql/psycopg2.py b/lib/python3.7/site-packages/sqlalchemy/dialects/postgresql/psycopg2.py similarity index 100% rename from lib/python3.4/site-packages/sqlalchemy/dialects/postgresql/psycopg2.py rename to lib/python3.7/site-packages/sqlalchemy/dialects/postgresql/psycopg2.py diff --git a/lib/python3.4/site-packages/sqlalchemy/dialects/postgresql/psycopg2cffi.py b/lib/python3.7/site-packages/sqlalchemy/dialects/postgresql/psycopg2cffi.py similarity index 100% rename from lib/python3.4/site-packages/sqlalchemy/dialects/postgresql/psycopg2cffi.py rename to lib/python3.7/site-packages/sqlalchemy/dialects/postgresql/psycopg2cffi.py diff --git a/lib/python3.4/site-packages/sqlalchemy/dialects/postgresql/pypostgresql.py b/lib/python3.7/site-packages/sqlalchemy/dialects/postgresql/pypostgresql.py similarity index 100% rename from lib/python3.4/site-packages/sqlalchemy/dialects/postgresql/pypostgresql.py rename to lib/python3.7/site-packages/sqlalchemy/dialects/postgresql/pypostgresql.py diff --git a/lib/python3.4/site-packages/sqlalchemy/dialects/postgresql/ranges.py b/lib/python3.7/site-packages/sqlalchemy/dialects/postgresql/ranges.py similarity index 100% rename from lib/python3.4/site-packages/sqlalchemy/dialects/postgresql/ranges.py rename to lib/python3.7/site-packages/sqlalchemy/dialects/postgresql/ranges.py diff --git 
a/lib/python3.4/site-packages/sqlalchemy/dialects/postgresql/zxjdbc.py b/lib/python3.7/site-packages/sqlalchemy/dialects/postgresql/zxjdbc.py similarity index 100% rename from lib/python3.4/site-packages/sqlalchemy/dialects/postgresql/zxjdbc.py rename to lib/python3.7/site-packages/sqlalchemy/dialects/postgresql/zxjdbc.py diff --git a/lib/python3.4/site-packages/sqlalchemy/dialects/sqlite/__init__.py b/lib/python3.7/site-packages/sqlalchemy/dialects/sqlite/__init__.py similarity index 100% rename from lib/python3.4/site-packages/sqlalchemy/dialects/sqlite/__init__.py rename to lib/python3.7/site-packages/sqlalchemy/dialects/sqlite/__init__.py diff --git a/lib/python3.4/site-packages/sqlalchemy/dialects/sqlite/base.py b/lib/python3.7/site-packages/sqlalchemy/dialects/sqlite/base.py similarity index 100% rename from lib/python3.4/site-packages/sqlalchemy/dialects/sqlite/base.py rename to lib/python3.7/site-packages/sqlalchemy/dialects/sqlite/base.py diff --git a/lib/python3.4/site-packages/sqlalchemy/dialects/sqlite/pysqlcipher.py b/lib/python3.7/site-packages/sqlalchemy/dialects/sqlite/pysqlcipher.py similarity index 100% rename from lib/python3.4/site-packages/sqlalchemy/dialects/sqlite/pysqlcipher.py rename to lib/python3.7/site-packages/sqlalchemy/dialects/sqlite/pysqlcipher.py diff --git a/lib/python3.4/site-packages/sqlalchemy/dialects/sqlite/pysqlite.py b/lib/python3.7/site-packages/sqlalchemy/dialects/sqlite/pysqlite.py similarity index 100% rename from lib/python3.4/site-packages/sqlalchemy/dialects/sqlite/pysqlite.py rename to lib/python3.7/site-packages/sqlalchemy/dialects/sqlite/pysqlite.py diff --git a/lib/python3.4/site-packages/sqlalchemy/dialects/sybase/__init__.py b/lib/python3.7/site-packages/sqlalchemy/dialects/sybase/__init__.py similarity index 100% rename from lib/python3.4/site-packages/sqlalchemy/dialects/sybase/__init__.py rename to lib/python3.7/site-packages/sqlalchemy/dialects/sybase/__init__.py diff --git 
a/lib/python3.4/site-packages/sqlalchemy/dialects/sybase/base.py b/lib/python3.7/site-packages/sqlalchemy/dialects/sybase/base.py similarity index 100% rename from lib/python3.4/site-packages/sqlalchemy/dialects/sybase/base.py rename to lib/python3.7/site-packages/sqlalchemy/dialects/sybase/base.py diff --git a/lib/python3.4/site-packages/sqlalchemy/dialects/sybase/mxodbc.py b/lib/python3.7/site-packages/sqlalchemy/dialects/sybase/mxodbc.py similarity index 100% rename from lib/python3.4/site-packages/sqlalchemy/dialects/sybase/mxodbc.py rename to lib/python3.7/site-packages/sqlalchemy/dialects/sybase/mxodbc.py diff --git a/lib/python3.4/site-packages/sqlalchemy/dialects/sybase/pyodbc.py b/lib/python3.7/site-packages/sqlalchemy/dialects/sybase/pyodbc.py similarity index 100% rename from lib/python3.4/site-packages/sqlalchemy/dialects/sybase/pyodbc.py rename to lib/python3.7/site-packages/sqlalchemy/dialects/sybase/pyodbc.py diff --git a/lib/python3.4/site-packages/sqlalchemy/dialects/sybase/pysybase.py b/lib/python3.7/site-packages/sqlalchemy/dialects/sybase/pysybase.py similarity index 100% rename from lib/python3.4/site-packages/sqlalchemy/dialects/sybase/pysybase.py rename to lib/python3.7/site-packages/sqlalchemy/dialects/sybase/pysybase.py diff --git a/lib/python3.4/site-packages/sqlalchemy/engine/__init__.py b/lib/python3.7/site-packages/sqlalchemy/engine/__init__.py similarity index 100% rename from lib/python3.4/site-packages/sqlalchemy/engine/__init__.py rename to lib/python3.7/site-packages/sqlalchemy/engine/__init__.py diff --git a/lib/python3.4/site-packages/sqlalchemy/engine/base.py b/lib/python3.7/site-packages/sqlalchemy/engine/base.py similarity index 100% rename from lib/python3.4/site-packages/sqlalchemy/engine/base.py rename to lib/python3.7/site-packages/sqlalchemy/engine/base.py diff --git a/lib/python3.4/site-packages/sqlalchemy/engine/default.py b/lib/python3.7/site-packages/sqlalchemy/engine/default.py similarity index 100% rename from 
lib/python3.4/site-packages/sqlalchemy/engine/default.py rename to lib/python3.7/site-packages/sqlalchemy/engine/default.py diff --git a/lib/python3.4/site-packages/sqlalchemy/engine/interfaces.py b/lib/python3.7/site-packages/sqlalchemy/engine/interfaces.py similarity index 100% rename from lib/python3.4/site-packages/sqlalchemy/engine/interfaces.py rename to lib/python3.7/site-packages/sqlalchemy/engine/interfaces.py diff --git a/lib/python3.4/site-packages/sqlalchemy/engine/reflection.py b/lib/python3.7/site-packages/sqlalchemy/engine/reflection.py similarity index 100% rename from lib/python3.4/site-packages/sqlalchemy/engine/reflection.py rename to lib/python3.7/site-packages/sqlalchemy/engine/reflection.py diff --git a/lib/python3.4/site-packages/sqlalchemy/engine/result.py b/lib/python3.7/site-packages/sqlalchemy/engine/result.py similarity index 100% rename from lib/python3.4/site-packages/sqlalchemy/engine/result.py rename to lib/python3.7/site-packages/sqlalchemy/engine/result.py diff --git a/lib/python3.4/site-packages/sqlalchemy/engine/strategies.py b/lib/python3.7/site-packages/sqlalchemy/engine/strategies.py similarity index 100% rename from lib/python3.4/site-packages/sqlalchemy/engine/strategies.py rename to lib/python3.7/site-packages/sqlalchemy/engine/strategies.py diff --git a/lib/python3.4/site-packages/sqlalchemy/engine/threadlocal.py b/lib/python3.7/site-packages/sqlalchemy/engine/threadlocal.py similarity index 100% rename from lib/python3.4/site-packages/sqlalchemy/engine/threadlocal.py rename to lib/python3.7/site-packages/sqlalchemy/engine/threadlocal.py diff --git a/lib/python3.4/site-packages/sqlalchemy/engine/url.py b/lib/python3.7/site-packages/sqlalchemy/engine/url.py similarity index 100% rename from lib/python3.4/site-packages/sqlalchemy/engine/url.py rename to lib/python3.7/site-packages/sqlalchemy/engine/url.py diff --git a/lib/python3.4/site-packages/sqlalchemy/engine/util.py 
b/lib/python3.7/site-packages/sqlalchemy/engine/util.py similarity index 100% rename from lib/python3.4/site-packages/sqlalchemy/engine/util.py rename to lib/python3.7/site-packages/sqlalchemy/engine/util.py diff --git a/lib/python3.4/site-packages/sqlalchemy/event/__init__.py b/lib/python3.7/site-packages/sqlalchemy/event/__init__.py similarity index 100% rename from lib/python3.4/site-packages/sqlalchemy/event/__init__.py rename to lib/python3.7/site-packages/sqlalchemy/event/__init__.py diff --git a/lib/python3.4/site-packages/sqlalchemy/event/api.py b/lib/python3.7/site-packages/sqlalchemy/event/api.py similarity index 100% rename from lib/python3.4/site-packages/sqlalchemy/event/api.py rename to lib/python3.7/site-packages/sqlalchemy/event/api.py diff --git a/lib/python3.4/site-packages/sqlalchemy/event/attr.py b/lib/python3.7/site-packages/sqlalchemy/event/attr.py similarity index 100% rename from lib/python3.4/site-packages/sqlalchemy/event/attr.py rename to lib/python3.7/site-packages/sqlalchemy/event/attr.py diff --git a/lib/python3.4/site-packages/sqlalchemy/event/base.py b/lib/python3.7/site-packages/sqlalchemy/event/base.py similarity index 100% rename from lib/python3.4/site-packages/sqlalchemy/event/base.py rename to lib/python3.7/site-packages/sqlalchemy/event/base.py diff --git a/lib/python3.4/site-packages/sqlalchemy/event/legacy.py b/lib/python3.7/site-packages/sqlalchemy/event/legacy.py similarity index 100% rename from lib/python3.4/site-packages/sqlalchemy/event/legacy.py rename to lib/python3.7/site-packages/sqlalchemy/event/legacy.py diff --git a/lib/python3.4/site-packages/sqlalchemy/event/registry.py b/lib/python3.7/site-packages/sqlalchemy/event/registry.py similarity index 100% rename from lib/python3.4/site-packages/sqlalchemy/event/registry.py rename to lib/python3.7/site-packages/sqlalchemy/event/registry.py diff --git a/lib/python3.4/site-packages/sqlalchemy/events.py b/lib/python3.7/site-packages/sqlalchemy/events.py similarity index 
100% rename from lib/python3.4/site-packages/sqlalchemy/events.py rename to lib/python3.7/site-packages/sqlalchemy/events.py diff --git a/lib/python3.4/site-packages/sqlalchemy/exc.py b/lib/python3.7/site-packages/sqlalchemy/exc.py similarity index 100% rename from lib/python3.4/site-packages/sqlalchemy/exc.py rename to lib/python3.7/site-packages/sqlalchemy/exc.py diff --git a/lib/python3.4/site-packages/sqlalchemy/ext/__init__.py b/lib/python3.7/site-packages/sqlalchemy/ext/__init__.py similarity index 100% rename from lib/python3.4/site-packages/sqlalchemy/ext/__init__.py rename to lib/python3.7/site-packages/sqlalchemy/ext/__init__.py diff --git a/lib/python3.4/site-packages/sqlalchemy/ext/associationproxy.py b/lib/python3.7/site-packages/sqlalchemy/ext/associationproxy.py similarity index 100% rename from lib/python3.4/site-packages/sqlalchemy/ext/associationproxy.py rename to lib/python3.7/site-packages/sqlalchemy/ext/associationproxy.py diff --git a/lib/python3.4/site-packages/sqlalchemy/ext/automap.py b/lib/python3.7/site-packages/sqlalchemy/ext/automap.py similarity index 100% rename from lib/python3.4/site-packages/sqlalchemy/ext/automap.py rename to lib/python3.7/site-packages/sqlalchemy/ext/automap.py diff --git a/lib/python3.4/site-packages/sqlalchemy/ext/baked.py b/lib/python3.7/site-packages/sqlalchemy/ext/baked.py similarity index 100% rename from lib/python3.4/site-packages/sqlalchemy/ext/baked.py rename to lib/python3.7/site-packages/sqlalchemy/ext/baked.py diff --git a/lib/python3.4/site-packages/sqlalchemy/ext/compiler.py b/lib/python3.7/site-packages/sqlalchemy/ext/compiler.py similarity index 100% rename from lib/python3.4/site-packages/sqlalchemy/ext/compiler.py rename to lib/python3.7/site-packages/sqlalchemy/ext/compiler.py diff --git a/lib/python3.4/site-packages/sqlalchemy/ext/declarative/__init__.py b/lib/python3.7/site-packages/sqlalchemy/ext/declarative/__init__.py similarity index 100% rename from 
lib/python3.4/site-packages/sqlalchemy/ext/declarative/__init__.py rename to lib/python3.7/site-packages/sqlalchemy/ext/declarative/__init__.py diff --git a/lib/python3.4/site-packages/sqlalchemy/ext/declarative/api.py b/lib/python3.7/site-packages/sqlalchemy/ext/declarative/api.py similarity index 100% rename from lib/python3.4/site-packages/sqlalchemy/ext/declarative/api.py rename to lib/python3.7/site-packages/sqlalchemy/ext/declarative/api.py diff --git a/lib/python3.4/site-packages/sqlalchemy/ext/declarative/base.py b/lib/python3.7/site-packages/sqlalchemy/ext/declarative/base.py similarity index 100% rename from lib/python3.4/site-packages/sqlalchemy/ext/declarative/base.py rename to lib/python3.7/site-packages/sqlalchemy/ext/declarative/base.py diff --git a/lib/python3.4/site-packages/sqlalchemy/ext/declarative/clsregistry.py b/lib/python3.7/site-packages/sqlalchemy/ext/declarative/clsregistry.py similarity index 100% rename from lib/python3.4/site-packages/sqlalchemy/ext/declarative/clsregistry.py rename to lib/python3.7/site-packages/sqlalchemy/ext/declarative/clsregistry.py diff --git a/lib/python3.4/site-packages/sqlalchemy/ext/horizontal_shard.py b/lib/python3.7/site-packages/sqlalchemy/ext/horizontal_shard.py similarity index 100% rename from lib/python3.4/site-packages/sqlalchemy/ext/horizontal_shard.py rename to lib/python3.7/site-packages/sqlalchemy/ext/horizontal_shard.py diff --git a/lib/python3.4/site-packages/sqlalchemy/ext/hybrid.py b/lib/python3.7/site-packages/sqlalchemy/ext/hybrid.py similarity index 100% rename from lib/python3.4/site-packages/sqlalchemy/ext/hybrid.py rename to lib/python3.7/site-packages/sqlalchemy/ext/hybrid.py diff --git a/lib/python3.4/site-packages/sqlalchemy/ext/instrumentation.py b/lib/python3.7/site-packages/sqlalchemy/ext/instrumentation.py similarity index 100% rename from lib/python3.4/site-packages/sqlalchemy/ext/instrumentation.py rename to lib/python3.7/site-packages/sqlalchemy/ext/instrumentation.py diff 
--git a/lib/python3.4/site-packages/sqlalchemy/ext/mutable.py b/lib/python3.7/site-packages/sqlalchemy/ext/mutable.py similarity index 100% rename from lib/python3.4/site-packages/sqlalchemy/ext/mutable.py rename to lib/python3.7/site-packages/sqlalchemy/ext/mutable.py diff --git a/lib/python3.4/site-packages/sqlalchemy/ext/orderinglist.py b/lib/python3.7/site-packages/sqlalchemy/ext/orderinglist.py similarity index 100% rename from lib/python3.4/site-packages/sqlalchemy/ext/orderinglist.py rename to lib/python3.7/site-packages/sqlalchemy/ext/orderinglist.py diff --git a/lib/python3.4/site-packages/sqlalchemy/ext/serializer.py b/lib/python3.7/site-packages/sqlalchemy/ext/serializer.py similarity index 100% rename from lib/python3.4/site-packages/sqlalchemy/ext/serializer.py rename to lib/python3.7/site-packages/sqlalchemy/ext/serializer.py diff --git a/lib/python3.4/site-packages/sqlalchemy/inspection.py b/lib/python3.7/site-packages/sqlalchemy/inspection.py similarity index 100% rename from lib/python3.4/site-packages/sqlalchemy/inspection.py rename to lib/python3.7/site-packages/sqlalchemy/inspection.py diff --git a/lib/python3.4/site-packages/sqlalchemy/interfaces.py b/lib/python3.7/site-packages/sqlalchemy/interfaces.py similarity index 100% rename from lib/python3.4/site-packages/sqlalchemy/interfaces.py rename to lib/python3.7/site-packages/sqlalchemy/interfaces.py diff --git a/lib/python3.4/site-packages/sqlalchemy/log.py b/lib/python3.7/site-packages/sqlalchemy/log.py similarity index 100% rename from lib/python3.4/site-packages/sqlalchemy/log.py rename to lib/python3.7/site-packages/sqlalchemy/log.py diff --git a/lib/python3.4/site-packages/sqlalchemy/orm/__init__.py b/lib/python3.7/site-packages/sqlalchemy/orm/__init__.py similarity index 100% rename from lib/python3.4/site-packages/sqlalchemy/orm/__init__.py rename to lib/python3.7/site-packages/sqlalchemy/orm/__init__.py diff --git a/lib/python3.4/site-packages/sqlalchemy/orm/attributes.py 
b/lib/python3.7/site-packages/sqlalchemy/orm/attributes.py similarity index 100% rename from lib/python3.4/site-packages/sqlalchemy/orm/attributes.py rename to lib/python3.7/site-packages/sqlalchemy/orm/attributes.py diff --git a/lib/python3.4/site-packages/sqlalchemy/orm/base.py b/lib/python3.7/site-packages/sqlalchemy/orm/base.py similarity index 100% rename from lib/python3.4/site-packages/sqlalchemy/orm/base.py rename to lib/python3.7/site-packages/sqlalchemy/orm/base.py diff --git a/lib/python3.4/site-packages/sqlalchemy/orm/collections.py b/lib/python3.7/site-packages/sqlalchemy/orm/collections.py similarity index 100% rename from lib/python3.4/site-packages/sqlalchemy/orm/collections.py rename to lib/python3.7/site-packages/sqlalchemy/orm/collections.py diff --git a/lib/python3.4/site-packages/sqlalchemy/orm/dependency.py b/lib/python3.7/site-packages/sqlalchemy/orm/dependency.py similarity index 100% rename from lib/python3.4/site-packages/sqlalchemy/orm/dependency.py rename to lib/python3.7/site-packages/sqlalchemy/orm/dependency.py diff --git a/lib/python3.4/site-packages/sqlalchemy/orm/deprecated_interfaces.py b/lib/python3.7/site-packages/sqlalchemy/orm/deprecated_interfaces.py similarity index 100% rename from lib/python3.4/site-packages/sqlalchemy/orm/deprecated_interfaces.py rename to lib/python3.7/site-packages/sqlalchemy/orm/deprecated_interfaces.py diff --git a/lib/python3.4/site-packages/sqlalchemy/orm/descriptor_props.py b/lib/python3.7/site-packages/sqlalchemy/orm/descriptor_props.py similarity index 100% rename from lib/python3.4/site-packages/sqlalchemy/orm/descriptor_props.py rename to lib/python3.7/site-packages/sqlalchemy/orm/descriptor_props.py diff --git a/lib/python3.4/site-packages/sqlalchemy/orm/dynamic.py b/lib/python3.7/site-packages/sqlalchemy/orm/dynamic.py similarity index 100% rename from lib/python3.4/site-packages/sqlalchemy/orm/dynamic.py rename to lib/python3.7/site-packages/sqlalchemy/orm/dynamic.py diff --git 
a/lib/python3.4/site-packages/sqlalchemy/orm/evaluator.py b/lib/python3.7/site-packages/sqlalchemy/orm/evaluator.py similarity index 100% rename from lib/python3.4/site-packages/sqlalchemy/orm/evaluator.py rename to lib/python3.7/site-packages/sqlalchemy/orm/evaluator.py diff --git a/lib/python3.4/site-packages/sqlalchemy/orm/events.py b/lib/python3.7/site-packages/sqlalchemy/orm/events.py similarity index 100% rename from lib/python3.4/site-packages/sqlalchemy/orm/events.py rename to lib/python3.7/site-packages/sqlalchemy/orm/events.py diff --git a/lib/python3.4/site-packages/sqlalchemy/orm/exc.py b/lib/python3.7/site-packages/sqlalchemy/orm/exc.py similarity index 100% rename from lib/python3.4/site-packages/sqlalchemy/orm/exc.py rename to lib/python3.7/site-packages/sqlalchemy/orm/exc.py diff --git a/lib/python3.4/site-packages/sqlalchemy/orm/identity.py b/lib/python3.7/site-packages/sqlalchemy/orm/identity.py similarity index 100% rename from lib/python3.4/site-packages/sqlalchemy/orm/identity.py rename to lib/python3.7/site-packages/sqlalchemy/orm/identity.py diff --git a/lib/python3.4/site-packages/sqlalchemy/orm/instrumentation.py b/lib/python3.7/site-packages/sqlalchemy/orm/instrumentation.py similarity index 100% rename from lib/python3.4/site-packages/sqlalchemy/orm/instrumentation.py rename to lib/python3.7/site-packages/sqlalchemy/orm/instrumentation.py diff --git a/lib/python3.4/site-packages/sqlalchemy/orm/interfaces.py b/lib/python3.7/site-packages/sqlalchemy/orm/interfaces.py similarity index 100% rename from lib/python3.4/site-packages/sqlalchemy/orm/interfaces.py rename to lib/python3.7/site-packages/sqlalchemy/orm/interfaces.py diff --git a/lib/python3.4/site-packages/sqlalchemy/orm/loading.py b/lib/python3.7/site-packages/sqlalchemy/orm/loading.py similarity index 100% rename from lib/python3.4/site-packages/sqlalchemy/orm/loading.py rename to lib/python3.7/site-packages/sqlalchemy/orm/loading.py diff --git 
a/lib/python3.4/site-packages/sqlalchemy/orm/mapper.py b/lib/python3.7/site-packages/sqlalchemy/orm/mapper.py similarity index 100% rename from lib/python3.4/site-packages/sqlalchemy/orm/mapper.py rename to lib/python3.7/site-packages/sqlalchemy/orm/mapper.py diff --git a/lib/python3.4/site-packages/sqlalchemy/orm/path_registry.py b/lib/python3.7/site-packages/sqlalchemy/orm/path_registry.py similarity index 100% rename from lib/python3.4/site-packages/sqlalchemy/orm/path_registry.py rename to lib/python3.7/site-packages/sqlalchemy/orm/path_registry.py diff --git a/lib/python3.4/site-packages/sqlalchemy/orm/persistence.py b/lib/python3.7/site-packages/sqlalchemy/orm/persistence.py similarity index 100% rename from lib/python3.4/site-packages/sqlalchemy/orm/persistence.py rename to lib/python3.7/site-packages/sqlalchemy/orm/persistence.py diff --git a/lib/python3.4/site-packages/sqlalchemy/orm/properties.py b/lib/python3.7/site-packages/sqlalchemy/orm/properties.py similarity index 100% rename from lib/python3.4/site-packages/sqlalchemy/orm/properties.py rename to lib/python3.7/site-packages/sqlalchemy/orm/properties.py diff --git a/lib/python3.4/site-packages/sqlalchemy/orm/query.py b/lib/python3.7/site-packages/sqlalchemy/orm/query.py similarity index 100% rename from lib/python3.4/site-packages/sqlalchemy/orm/query.py rename to lib/python3.7/site-packages/sqlalchemy/orm/query.py diff --git a/lib/python3.4/site-packages/sqlalchemy/orm/relationships.py b/lib/python3.7/site-packages/sqlalchemy/orm/relationships.py similarity index 100% rename from lib/python3.4/site-packages/sqlalchemy/orm/relationships.py rename to lib/python3.7/site-packages/sqlalchemy/orm/relationships.py diff --git a/lib/python3.4/site-packages/sqlalchemy/orm/scoping.py b/lib/python3.7/site-packages/sqlalchemy/orm/scoping.py similarity index 100% rename from lib/python3.4/site-packages/sqlalchemy/orm/scoping.py rename to lib/python3.7/site-packages/sqlalchemy/orm/scoping.py diff --git 
a/lib/python3.4/site-packages/sqlalchemy/orm/session.py b/lib/python3.7/site-packages/sqlalchemy/orm/session.py similarity index 100% rename from lib/python3.4/site-packages/sqlalchemy/orm/session.py rename to lib/python3.7/site-packages/sqlalchemy/orm/session.py diff --git a/lib/python3.4/site-packages/sqlalchemy/orm/state.py b/lib/python3.7/site-packages/sqlalchemy/orm/state.py similarity index 100% rename from lib/python3.4/site-packages/sqlalchemy/orm/state.py rename to lib/python3.7/site-packages/sqlalchemy/orm/state.py diff --git a/lib/python3.4/site-packages/sqlalchemy/orm/strategies.py b/lib/python3.7/site-packages/sqlalchemy/orm/strategies.py similarity index 100% rename from lib/python3.4/site-packages/sqlalchemy/orm/strategies.py rename to lib/python3.7/site-packages/sqlalchemy/orm/strategies.py diff --git a/lib/python3.4/site-packages/sqlalchemy/orm/strategy_options.py b/lib/python3.7/site-packages/sqlalchemy/orm/strategy_options.py similarity index 100% rename from lib/python3.4/site-packages/sqlalchemy/orm/strategy_options.py rename to lib/python3.7/site-packages/sqlalchemy/orm/strategy_options.py diff --git a/lib/python3.4/site-packages/sqlalchemy/orm/sync.py b/lib/python3.7/site-packages/sqlalchemy/orm/sync.py similarity index 100% rename from lib/python3.4/site-packages/sqlalchemy/orm/sync.py rename to lib/python3.7/site-packages/sqlalchemy/orm/sync.py diff --git a/lib/python3.4/site-packages/sqlalchemy/orm/unitofwork.py b/lib/python3.7/site-packages/sqlalchemy/orm/unitofwork.py similarity index 100% rename from lib/python3.4/site-packages/sqlalchemy/orm/unitofwork.py rename to lib/python3.7/site-packages/sqlalchemy/orm/unitofwork.py diff --git a/lib/python3.4/site-packages/sqlalchemy/orm/util.py b/lib/python3.7/site-packages/sqlalchemy/orm/util.py similarity index 100% rename from lib/python3.4/site-packages/sqlalchemy/orm/util.py rename to lib/python3.7/site-packages/sqlalchemy/orm/util.py diff --git 
diff --git a/lib/python3.4/site-packages/sqlalchemy/pool.py b/lib/python3.7/site-packages/sqlalchemy/pool.py
similarity index 100%
rename from lib/python3.4/site-packages/sqlalchemy/pool.py
rename to lib/python3.7/site-packages/sqlalchemy/pool.py
diff --git a/lib/python3.4/site-packages/sqlalchemy/processors.py b/lib/python3.7/site-packages/sqlalchemy/processors.py
similarity index 100%
rename from lib/python3.4/site-packages/sqlalchemy/processors.py
rename to lib/python3.7/site-packages/sqlalchemy/processors.py
diff --git a/lib/python3.4/site-packages/sqlalchemy/schema.py b/lib/python3.7/site-packages/sqlalchemy/schema.py
similarity index 100%
rename from lib/python3.4/site-packages/sqlalchemy/schema.py
rename to lib/python3.7/site-packages/sqlalchemy/schema.py
diff --git a/lib/python3.4/site-packages/sqlalchemy/sql/__init__.py b/lib/python3.7/site-packages/sqlalchemy/sql/__init__.py
similarity index 100%
rename from lib/python3.4/site-packages/sqlalchemy/sql/__init__.py
rename to lib/python3.7/site-packages/sqlalchemy/sql/__init__.py
diff --git a/lib/python3.4/site-packages/sqlalchemy/sql/annotation.py b/lib/python3.7/site-packages/sqlalchemy/sql/annotation.py
similarity index 100%
rename from lib/python3.4/site-packages/sqlalchemy/sql/annotation.py
rename to lib/python3.7/site-packages/sqlalchemy/sql/annotation.py
diff --git a/lib/python3.4/site-packages/sqlalchemy/sql/base.py b/lib/python3.7/site-packages/sqlalchemy/sql/base.py
similarity index 100%
rename from lib/python3.4/site-packages/sqlalchemy/sql/base.py
rename to lib/python3.7/site-packages/sqlalchemy/sql/base.py
diff --git a/lib/python3.4/site-packages/sqlalchemy/sql/compiler.py b/lib/python3.7/site-packages/sqlalchemy/sql/compiler.py
similarity index 100%
rename from lib/python3.4/site-packages/sqlalchemy/sql/compiler.py
rename to lib/python3.7/site-packages/sqlalchemy/sql/compiler.py
diff --git a/lib/python3.4/site-packages/sqlalchemy/sql/crud.py b/lib/python3.7/site-packages/sqlalchemy/sql/crud.py
similarity index 100%
rename from lib/python3.4/site-packages/sqlalchemy/sql/crud.py
rename to lib/python3.7/site-packages/sqlalchemy/sql/crud.py
diff --git a/lib/python3.4/site-packages/sqlalchemy/sql/ddl.py b/lib/python3.7/site-packages/sqlalchemy/sql/ddl.py
similarity index 100%
rename from lib/python3.4/site-packages/sqlalchemy/sql/ddl.py
rename to lib/python3.7/site-packages/sqlalchemy/sql/ddl.py
diff --git a/lib/python3.4/site-packages/sqlalchemy/sql/default_comparator.py b/lib/python3.7/site-packages/sqlalchemy/sql/default_comparator.py
similarity index 100%
rename from lib/python3.4/site-packages/sqlalchemy/sql/default_comparator.py
rename to lib/python3.7/site-packages/sqlalchemy/sql/default_comparator.py
diff --git a/lib/python3.4/site-packages/sqlalchemy/sql/dml.py b/lib/python3.7/site-packages/sqlalchemy/sql/dml.py
similarity index 100%
rename from lib/python3.4/site-packages/sqlalchemy/sql/dml.py
rename to lib/python3.7/site-packages/sqlalchemy/sql/dml.py
diff --git a/lib/python3.4/site-packages/sqlalchemy/sql/elements.py b/lib/python3.7/site-packages/sqlalchemy/sql/elements.py
similarity index 100%
rename from lib/python3.4/site-packages/sqlalchemy/sql/elements.py
rename to lib/python3.7/site-packages/sqlalchemy/sql/elements.py
diff --git a/lib/python3.4/site-packages/sqlalchemy/sql/expression.py b/lib/python3.7/site-packages/sqlalchemy/sql/expression.py
similarity index 100%
rename from lib/python3.4/site-packages/sqlalchemy/sql/expression.py
rename to lib/python3.7/site-packages/sqlalchemy/sql/expression.py
diff --git a/lib/python3.4/site-packages/sqlalchemy/sql/functions.py b/lib/python3.7/site-packages/sqlalchemy/sql/functions.py
similarity index 100%
rename from lib/python3.4/site-packages/sqlalchemy/sql/functions.py
rename to lib/python3.7/site-packages/sqlalchemy/sql/functions.py
diff --git a/lib/python3.4/site-packages/sqlalchemy/sql/naming.py b/lib/python3.7/site-packages/sqlalchemy/sql/naming.py
similarity index 100%
rename from lib/python3.4/site-packages/sqlalchemy/sql/naming.py
rename to lib/python3.7/site-packages/sqlalchemy/sql/naming.py
diff --git a/lib/python3.4/site-packages/sqlalchemy/sql/operators.py b/lib/python3.7/site-packages/sqlalchemy/sql/operators.py
similarity index 100%
rename from lib/python3.4/site-packages/sqlalchemy/sql/operators.py
rename to lib/python3.7/site-packages/sqlalchemy/sql/operators.py
diff --git a/lib/python3.4/site-packages/sqlalchemy/sql/schema.py b/lib/python3.7/site-packages/sqlalchemy/sql/schema.py
similarity index 100%
rename from lib/python3.4/site-packages/sqlalchemy/sql/schema.py
rename to lib/python3.7/site-packages/sqlalchemy/sql/schema.py
diff --git a/lib/python3.4/site-packages/sqlalchemy/sql/selectable.py b/lib/python3.7/site-packages/sqlalchemy/sql/selectable.py
similarity index 100%
rename from lib/python3.4/site-packages/sqlalchemy/sql/selectable.py
rename to lib/python3.7/site-packages/sqlalchemy/sql/selectable.py
diff --git a/lib/python3.4/site-packages/sqlalchemy/sql/sqltypes.py b/lib/python3.7/site-packages/sqlalchemy/sql/sqltypes.py
similarity index 100%
rename from lib/python3.4/site-packages/sqlalchemy/sql/sqltypes.py
rename to lib/python3.7/site-packages/sqlalchemy/sql/sqltypes.py
diff --git a/lib/python3.4/site-packages/sqlalchemy/sql/type_api.py b/lib/python3.7/site-packages/sqlalchemy/sql/type_api.py
similarity index 100%
rename from lib/python3.4/site-packages/sqlalchemy/sql/type_api.py
rename to lib/python3.7/site-packages/sqlalchemy/sql/type_api.py
diff --git a/lib/python3.4/site-packages/sqlalchemy/sql/util.py b/lib/python3.7/site-packages/sqlalchemy/sql/util.py
similarity index 100%
rename from lib/python3.4/site-packages/sqlalchemy/sql/util.py
rename to lib/python3.7/site-packages/sqlalchemy/sql/util.py
diff --git a/lib/python3.4/site-packages/sqlalchemy/sql/visitors.py b/lib/python3.7/site-packages/sqlalchemy/sql/visitors.py
similarity index 100%
rename from lib/python3.4/site-packages/sqlalchemy/sql/visitors.py
rename to lib/python3.7/site-packages/sqlalchemy/sql/visitors.py
diff --git a/lib/python3.4/site-packages/sqlalchemy/testing/__init__.py b/lib/python3.7/site-packages/sqlalchemy/testing/__init__.py
similarity index 100%
rename from lib/python3.4/site-packages/sqlalchemy/testing/__init__.py
rename to lib/python3.7/site-packages/sqlalchemy/testing/__init__.py
diff --git a/lib/python3.4/site-packages/sqlalchemy/testing/assertions.py b/lib/python3.7/site-packages/sqlalchemy/testing/assertions.py
similarity index 100%
rename from lib/python3.4/site-packages/sqlalchemy/testing/assertions.py
rename to lib/python3.7/site-packages/sqlalchemy/testing/assertions.py
diff --git a/lib/python3.4/site-packages/sqlalchemy/testing/assertsql.py b/lib/python3.7/site-packages/sqlalchemy/testing/assertsql.py
similarity index 100%
rename from lib/python3.4/site-packages/sqlalchemy/testing/assertsql.py
rename to lib/python3.7/site-packages/sqlalchemy/testing/assertsql.py
diff --git a/lib/python3.4/site-packages/sqlalchemy/testing/config.py b/lib/python3.7/site-packages/sqlalchemy/testing/config.py
similarity index 100%
rename from lib/python3.4/site-packages/sqlalchemy/testing/config.py
rename to lib/python3.7/site-packages/sqlalchemy/testing/config.py
diff --git a/lib/python3.4/site-packages/sqlalchemy/testing/distutils_run.py b/lib/python3.7/site-packages/sqlalchemy/testing/distutils_run.py
similarity index 100%
rename from lib/python3.4/site-packages/sqlalchemy/testing/distutils_run.py
rename to lib/python3.7/site-packages/sqlalchemy/testing/distutils_run.py
diff --git a/lib/python3.4/site-packages/sqlalchemy/testing/engines.py b/lib/python3.7/site-packages/sqlalchemy/testing/engines.py
similarity index 100%
rename from lib/python3.4/site-packages/sqlalchemy/testing/engines.py
rename to lib/python3.7/site-packages/sqlalchemy/testing/engines.py
diff --git a/lib/python3.4/site-packages/sqlalchemy/testing/entities.py b/lib/python3.7/site-packages/sqlalchemy/testing/entities.py
similarity index 100%
rename from lib/python3.4/site-packages/sqlalchemy/testing/entities.py
rename to lib/python3.7/site-packages/sqlalchemy/testing/entities.py
diff --git a/lib/python3.4/site-packages/sqlalchemy/testing/exclusions.py b/lib/python3.7/site-packages/sqlalchemy/testing/exclusions.py
similarity index 100%
rename from lib/python3.4/site-packages/sqlalchemy/testing/exclusions.py
rename to lib/python3.7/site-packages/sqlalchemy/testing/exclusions.py
diff --git a/lib/python3.4/site-packages/sqlalchemy/testing/fixtures.py b/lib/python3.7/site-packages/sqlalchemy/testing/fixtures.py
similarity index 100%
rename from lib/python3.4/site-packages/sqlalchemy/testing/fixtures.py
rename to lib/python3.7/site-packages/sqlalchemy/testing/fixtures.py
diff --git a/lib/python3.4/site-packages/sqlalchemy/testing/mock.py b/lib/python3.7/site-packages/sqlalchemy/testing/mock.py
similarity index 100%
rename from lib/python3.4/site-packages/sqlalchemy/testing/mock.py
rename to lib/python3.7/site-packages/sqlalchemy/testing/mock.py
diff --git a/lib/python3.4/site-packages/sqlalchemy/testing/pickleable.py b/lib/python3.7/site-packages/sqlalchemy/testing/pickleable.py
similarity index 100%
rename from lib/python3.4/site-packages/sqlalchemy/testing/pickleable.py
rename to lib/python3.7/site-packages/sqlalchemy/testing/pickleable.py
diff --git a/lib/python3.7/site-packages/sqlalchemy/testing/plugin/__init__.py b/lib/python3.7/site-packages/sqlalchemy/testing/plugin/__init__.py
new file mode 100644
index 0000000..e69de29
diff --git a/lib/python3.4/site-packages/sqlalchemy/testing/plugin/bootstrap.py b/lib/python3.7/site-packages/sqlalchemy/testing/plugin/bootstrap.py
similarity index 100%
rename from lib/python3.4/site-packages/sqlalchemy/testing/plugin/bootstrap.py
rename to lib/python3.7/site-packages/sqlalchemy/testing/plugin/bootstrap.py
diff --git a/lib/python3.4/site-packages/sqlalchemy/testing/plugin/noseplugin.py b/lib/python3.7/site-packages/sqlalchemy/testing/plugin/noseplugin.py
similarity index 100%
rename from lib/python3.4/site-packages/sqlalchemy/testing/plugin/noseplugin.py
rename to lib/python3.7/site-packages/sqlalchemy/testing/plugin/noseplugin.py
diff --git a/lib/python3.4/site-packages/sqlalchemy/testing/plugin/plugin_base.py b/lib/python3.7/site-packages/sqlalchemy/testing/plugin/plugin_base.py
similarity index 100%
rename from lib/python3.4/site-packages/sqlalchemy/testing/plugin/plugin_base.py
rename to lib/python3.7/site-packages/sqlalchemy/testing/plugin/plugin_base.py
diff --git a/lib/python3.4/site-packages/sqlalchemy/testing/plugin/pytestplugin.py b/lib/python3.7/site-packages/sqlalchemy/testing/plugin/pytestplugin.py
similarity index 100%
rename from lib/python3.4/site-packages/sqlalchemy/testing/plugin/pytestplugin.py
rename to lib/python3.7/site-packages/sqlalchemy/testing/plugin/pytestplugin.py
diff --git a/lib/python3.4/site-packages/sqlalchemy/testing/profiling.py b/lib/python3.7/site-packages/sqlalchemy/testing/profiling.py
similarity index 100%
rename from lib/python3.4/site-packages/sqlalchemy/testing/profiling.py
rename to lib/python3.7/site-packages/sqlalchemy/testing/profiling.py
diff --git a/lib/python3.4/site-packages/sqlalchemy/testing/provision.py b/lib/python3.7/site-packages/sqlalchemy/testing/provision.py
similarity index 100%
rename from lib/python3.4/site-packages/sqlalchemy/testing/provision.py
rename to lib/python3.7/site-packages/sqlalchemy/testing/provision.py
diff --git a/lib/python3.4/site-packages/sqlalchemy/testing/replay_fixture.py b/lib/python3.7/site-packages/sqlalchemy/testing/replay_fixture.py
similarity index 100%
rename from lib/python3.4/site-packages/sqlalchemy/testing/replay_fixture.py
rename to lib/python3.7/site-packages/sqlalchemy/testing/replay_fixture.py
diff --git a/lib/python3.4/site-packages/sqlalchemy/testing/requirements.py b/lib/python3.7/site-packages/sqlalchemy/testing/requirements.py
similarity index 100%
rename from lib/python3.4/site-packages/sqlalchemy/testing/requirements.py
rename to lib/python3.7/site-packages/sqlalchemy/testing/requirements.py
diff --git a/lib/python3.4/site-packages/sqlalchemy/testing/runner.py b/lib/python3.7/site-packages/sqlalchemy/testing/runner.py
similarity index 100%
rename from lib/python3.4/site-packages/sqlalchemy/testing/runner.py
rename to lib/python3.7/site-packages/sqlalchemy/testing/runner.py
diff --git a/lib/python3.4/site-packages/sqlalchemy/testing/schema.py b/lib/python3.7/site-packages/sqlalchemy/testing/schema.py
similarity index 100%
rename from lib/python3.4/site-packages/sqlalchemy/testing/schema.py
rename to lib/python3.7/site-packages/sqlalchemy/testing/schema.py
diff --git a/lib/python3.4/site-packages/sqlalchemy/testing/suite/__init__.py b/lib/python3.7/site-packages/sqlalchemy/testing/suite/__init__.py
similarity index 100%
rename from lib/python3.4/site-packages/sqlalchemy/testing/suite/__init__.py
rename to lib/python3.7/site-packages/sqlalchemy/testing/suite/__init__.py
diff --git a/lib/python3.4/site-packages/sqlalchemy/testing/suite/test_ddl.py b/lib/python3.7/site-packages/sqlalchemy/testing/suite/test_ddl.py
similarity index 100%
rename from lib/python3.4/site-packages/sqlalchemy/testing/suite/test_ddl.py
rename to lib/python3.7/site-packages/sqlalchemy/testing/suite/test_ddl.py
diff --git a/lib/python3.4/site-packages/sqlalchemy/testing/suite/test_dialect.py b/lib/python3.7/site-packages/sqlalchemy/testing/suite/test_dialect.py
similarity index 100%
rename from lib/python3.4/site-packages/sqlalchemy/testing/suite/test_dialect.py
rename to lib/python3.7/site-packages/sqlalchemy/testing/suite/test_dialect.py
diff --git a/lib/python3.4/site-packages/sqlalchemy/testing/suite/test_insert.py b/lib/python3.7/site-packages/sqlalchemy/testing/suite/test_insert.py
similarity index 100%
rename from lib/python3.4/site-packages/sqlalchemy/testing/suite/test_insert.py
rename to lib/python3.7/site-packages/sqlalchemy/testing/suite/test_insert.py
diff --git a/lib/python3.4/site-packages/sqlalchemy/testing/suite/test_reflection.py b/lib/python3.7/site-packages/sqlalchemy/testing/suite/test_reflection.py
similarity index 100%
rename from lib/python3.4/site-packages/sqlalchemy/testing/suite/test_reflection.py
rename to lib/python3.7/site-packages/sqlalchemy/testing/suite/test_reflection.py
diff --git a/lib/python3.4/site-packages/sqlalchemy/testing/suite/test_results.py b/lib/python3.7/site-packages/sqlalchemy/testing/suite/test_results.py
similarity index 100%
rename from lib/python3.4/site-packages/sqlalchemy/testing/suite/test_results.py
rename to lib/python3.7/site-packages/sqlalchemy/testing/suite/test_results.py
diff --git a/lib/python3.4/site-packages/sqlalchemy/testing/suite/test_select.py b/lib/python3.7/site-packages/sqlalchemy/testing/suite/test_select.py
similarity index 100%
rename from lib/python3.4/site-packages/sqlalchemy/testing/suite/test_select.py
rename to lib/python3.7/site-packages/sqlalchemy/testing/suite/test_select.py
diff --git a/lib/python3.4/site-packages/sqlalchemy/testing/suite/test_sequence.py b/lib/python3.7/site-packages/sqlalchemy/testing/suite/test_sequence.py
similarity index 100%
rename from lib/python3.4/site-packages/sqlalchemy/testing/suite/test_sequence.py
rename to lib/python3.7/site-packages/sqlalchemy/testing/suite/test_sequence.py
diff --git a/lib/python3.4/site-packages/sqlalchemy/testing/suite/test_types.py b/lib/python3.7/site-packages/sqlalchemy/testing/suite/test_types.py
similarity index 100%
rename from lib/python3.4/site-packages/sqlalchemy/testing/suite/test_types.py
rename to lib/python3.7/site-packages/sqlalchemy/testing/suite/test_types.py
diff --git a/lib/python3.4/site-packages/sqlalchemy/testing/suite/test_update_delete.py b/lib/python3.7/site-packages/sqlalchemy/testing/suite/test_update_delete.py
similarity index 100%
rename from lib/python3.4/site-packages/sqlalchemy/testing/suite/test_update_delete.py
rename to lib/python3.7/site-packages/sqlalchemy/testing/suite/test_update_delete.py
diff --git a/lib/python3.4/site-packages/sqlalchemy/testing/util.py b/lib/python3.7/site-packages/sqlalchemy/testing/util.py
similarity index 100%
rename from lib/python3.4/site-packages/sqlalchemy/testing/util.py
rename to lib/python3.7/site-packages/sqlalchemy/testing/util.py
diff --git a/lib/python3.4/site-packages/sqlalchemy/testing/warnings.py b/lib/python3.7/site-packages/sqlalchemy/testing/warnings.py
similarity index 100%
rename from lib/python3.4/site-packages/sqlalchemy/testing/warnings.py
rename to lib/python3.7/site-packages/sqlalchemy/testing/warnings.py
diff --git a/lib/python3.4/site-packages/sqlalchemy/types.py b/lib/python3.7/site-packages/sqlalchemy/types.py
similarity index 100%
rename from lib/python3.4/site-packages/sqlalchemy/types.py
rename to lib/python3.7/site-packages/sqlalchemy/types.py
diff --git a/lib/python3.4/site-packages/sqlalchemy/util/__init__.py b/lib/python3.7/site-packages/sqlalchemy/util/__init__.py
similarity index 100%
rename from lib/python3.4/site-packages/sqlalchemy/util/__init__.py
rename to lib/python3.7/site-packages/sqlalchemy/util/__init__.py
diff --git a/lib/python3.4/site-packages/sqlalchemy/util/_collections.py b/lib/python3.7/site-packages/sqlalchemy/util/_collections.py
similarity index 100%
rename from lib/python3.4/site-packages/sqlalchemy/util/_collections.py
rename to lib/python3.7/site-packages/sqlalchemy/util/_collections.py
diff --git a/lib/python3.4/site-packages/sqlalchemy/util/compat.py b/lib/python3.7/site-packages/sqlalchemy/util/compat.py
similarity index 100%
rename from lib/python3.4/site-packages/sqlalchemy/util/compat.py
rename to lib/python3.7/site-packages/sqlalchemy/util/compat.py
diff --git a/lib/python3.4/site-packages/sqlalchemy/util/deprecations.py b/lib/python3.7/site-packages/sqlalchemy/util/deprecations.py
similarity index 100%
rename from lib/python3.4/site-packages/sqlalchemy/util/deprecations.py
rename to lib/python3.7/site-packages/sqlalchemy/util/deprecations.py
diff --git a/lib/python3.4/site-packages/sqlalchemy/util/langhelpers.py b/lib/python3.7/site-packages/sqlalchemy/util/langhelpers.py
similarity index 100%
rename from lib/python3.4/site-packages/sqlalchemy/util/langhelpers.py
rename to lib/python3.7/site-packages/sqlalchemy/util/langhelpers.py
diff --git a/lib/python3.4/site-packages/sqlalchemy/util/queue.py b/lib/python3.7/site-packages/sqlalchemy/util/queue.py
similarity index 100%
rename from lib/python3.4/site-packages/sqlalchemy/util/queue.py
rename to lib/python3.7/site-packages/sqlalchemy/util/queue.py
diff --git a/lib/python3.4/site-packages/sqlalchemy/util/topological.py b/lib/python3.7/site-packages/sqlalchemy/util/topological.py
similarity index 100%
rename from lib/python3.4/site-packages/sqlalchemy/util/topological.py
rename to lib/python3.7/site-packages/sqlalchemy/util/topological.py