split platform

commit a2291f58b8

278 changed files with 114,251 additions and 0 deletions
.gitignore (vendored, new file, 8 lines)

@@ -0,0 +1,8 @@
*.swp
*.pyc
*.pyo
*.pyd
__pycache__
pip_cache
bin
.DS_Store
lib/python3.4/site-packages/SQLAlchemy-0.9.7.egg-info/PKG-INFO (new file, 155 lines)

@@ -0,0 +1,155 @@
Metadata-Version: 1.1
Name: SQLAlchemy
Version: 0.9.7
Summary: Database Abstraction Library
Home-page: http://www.sqlalchemy.org
Author: Mike Bayer
Author-email: mike_mp@zzzcomputing.com
License: MIT License
Description: SQLAlchemy
==========

The Python SQL Toolkit and Object Relational Mapper

Introduction
-------------

SQLAlchemy is the Python SQL toolkit and Object Relational Mapper
that gives application developers the full power and
flexibility of SQL. SQLAlchemy provides a full suite
of well known enterprise-level persistence patterns,
designed for efficient and high-performing database
access, adapted into a simple and Pythonic domain
language.

Major SQLAlchemy features include:

* An industrial strength ORM, built
  from the core on the identity map, unit of work,
  and data mapper patterns. These patterns
  allow transparent persistence of objects
  using a declarative configuration system.
  Domain models
  can be constructed and manipulated naturally,
  and changes are synchronized with the
  current transaction automatically.
* A relationally-oriented query system, exposing
  the full range of SQL's capabilities
  explicitly, including joins, subqueries,
  correlation, and most everything else,
  in terms of the object model.
  Writing queries with the ORM uses the same
  techniques of relational composition you use
  when writing SQL. While you can drop into
  literal SQL at any time, it's virtually never
  needed.
* A comprehensive and flexible system
  of eager loading for related collections and objects.
  Collections are cached within a session,
  and can be loaded on individual access, all
  at once using joins, or by query per collection
  across the full result set.
* A Core SQL construction system and DBAPI
  interaction layer. The SQLAlchemy Core is
  separate from the ORM and is a full database
  abstraction layer in its own right, and includes
  an extensible Python-based SQL expression
  language, schema metadata, connection pooling,
  type coercion, and custom types.
* All primary and foreign key constraints are
  assumed to be composite and natural. Surrogate
  integer primary keys are of course still the
  norm, but SQLAlchemy never assumes or hardcodes
  to this model.
* Database introspection and generation. Database
  schemas can be "reflected" in one step into
  Python structures representing database metadata;
  those same structures can then generate
  CREATE statements right back out - all within
  the Core, independent of the ORM.
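The one-step reflection described in the last bullet can be sketched against a throwaway in-memory SQLite database (the `users` table here is illustrative, not part of this commit):

```python
from sqlalchemy import MetaData, Table, Column, Integer, String, create_engine
from sqlalchemy.schema import CreateTable

engine = create_engine("sqlite://")  # in-memory, discarded at exit

# Create a table the explicit way first.
source = MetaData()
Table("users", source,
      Column("id", Integer, primary_key=True),
      Column("name", String(50)))
source.create_all(engine)

# Reflect the live schema into fresh Python metadata in one step...
reflected = MetaData()
reflected.reflect(bind=engine)
users = reflected.tables["users"]

# ...and generate a CREATE statement right back out, Core-only.
print(CreateTable(users).compile(engine))
```

`MetaData.reflect()` pulls every table the database reports; individual tables can also be reflected on demand via `Table(..., autoload_with=engine)` in later SQLAlchemy versions.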

SQLAlchemy's philosophy:

* SQL databases behave less and less like object
  collections the more size and performance start to
  matter; object collections behave less and less like
  tables and rows the more abstraction starts to matter.
  SQLAlchemy aims to accommodate both of these
  principles.
* An ORM doesn't need to hide the "R". A relational
  database provides rich, set-based functionality
  that should be fully exposed. SQLAlchemy's
  ORM provides an open-ended set of patterns
  that allow a developer to construct a custom
  mediation layer between a domain model and
  a relational schema, turning the so-called
  "object relational impedance" issue into
  a distant memory.
* The developer, in all cases, makes all decisions
  regarding the design, structure, and naming conventions
  of both the object model as well as the relational
  schema. SQLAlchemy only provides the means
  to automate the execution of these decisions.
* With SQLAlchemy, there's no such thing as
  "the ORM generated a bad query" - you
  retain full control over the structure of
  queries, including how joins are organized,
  how subqueries and correlation is used, what
  columns are requested. Everything SQLAlchemy
  does is ultimately the result of a developer-initiated
  decision.
* Don't use an ORM if the problem doesn't need one.
  SQLAlchemy consists of a Core and separate ORM
  component. The Core offers a full SQL expression
  language that allows Pythonic construction
  of SQL constructs that render directly to SQL
  strings for a target database, returning
  result sets that are essentially enhanced DBAPI
  cursors.
* Transactions should be the norm. With SQLAlchemy's
  ORM, nothing goes to permanent storage until
  commit() is called. SQLAlchemy encourages applications
  to create a consistent means of delineating
  the start and end of a series of operations.
* Never render a literal value in a SQL statement.
  Bound parameters are used to the greatest degree
  possible, allowing query optimizers to cache
  query plans effectively and making SQL injection
  attacks a non-issue.
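The last two points can be seen directly in the Core expression language without any database at all; a minimal sketch (the `users` table is illustrative):

```python
from sqlalchemy import MetaData, Table, Column, Integer, String

users = Table("users", MetaData(),
              Column("id", Integer, primary_key=True),
              Column("name", String(50)))

# Comparison operators build SQL expression objects, not booleans;
# the Python value becomes a bound parameter, never an inline literal.
criterion = users.c.name == "ed"
print(criterion)  # users.name = :name_1

# The criterion composes into a full statement with the same bound
# parameter; the string "ed" never appears in the rendered SQL.
stmt = users.select().where(criterion)
print(stmt)
```

Rendering to a string here uses a generic dialect; against a real engine the parameter marker follows the target DBAPI's paramstyle.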

Documentation
-------------

Latest documentation is at:

http://www.sqlalchemy.org/docs/

Installation / Requirements
---------------------------

Full documentation for installation is at
`Installation <http://www.sqlalchemy.org/docs/intro.html#installation>`_.

Getting Help / Development / Bug reporting
------------------------------------------

Please refer to the `SQLAlchemy Community Guide <http://www.sqlalchemy.org/support.html>`_.

License
-------

SQLAlchemy is distributed under the `MIT license
<http://www.opensource.org/licenses/mit-license.php>`_.

Platform: UNKNOWN
Classifier: Development Status :: 5 - Production/Stable
Classifier: Intended Audience :: Developers
Classifier: License :: OSI Approved :: MIT License
Classifier: Programming Language :: Python
Classifier: Programming Language :: Python :: 3
Classifier: Programming Language :: Python :: Implementation :: CPython
Classifier: Programming Language :: Python :: Implementation :: Jython
Classifier: Programming Language :: Python :: Implementation :: PyPy
Classifier: Topic :: Database :: Front-Ends
Classifier: Operating System :: OS Independent
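The declarative configuration and commit-delineated transactions described in the PKG-INFO above can be sketched together against an in-memory SQLite database (the `User` model is illustrative; the import path matches the 0.9-era API):

```python
from sqlalchemy import Column, Integer, String, create_engine
from sqlalchemy.ext.declarative import declarative_base
from sqlalchemy.orm import sessionmaker

Base = declarative_base()

class User(Base):
    __tablename__ = "users"
    id = Column(Integer, primary_key=True)
    name = Column(String(50))

engine = create_engine("sqlite://")
Base.metadata.create_all(engine)
Session = sessionmaker(bind=engine)

session = Session()
session.add(User(name="ed"))  # pending only; not yet in the database
session.commit()              # the unit of work flushes and commits here

ed = session.query(User).filter_by(name="ed").first()
print(ed.name)
```

In SQLAlchemy 2.x the same pattern is spelled with `sqlalchemy.orm.declarative_base`, but the legacy import above still functions.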
@@ -0,0 +1,682 @@
AUTHORS
CHANGES
LICENSE
MANIFEST.in
README.dialects.rst
README.rst
README.unittests.rst
setup.cfg
setup.py
sqla_nose.py
doc/contents.html
doc/copyright.html
doc/faq.html
doc/genindex.html
doc/glossary.html
doc/index.html
doc/intro.html
doc/search.html
doc/searchindex.js
doc/_images/sqla_arch_small.png
doc/_images/sqla_engine_arch.png
doc/_modules/index.html
doc/_modules/examples/adjacency_list/adjacency_list.html
doc/_modules/examples/association/basic_association.html
doc/_modules/examples/association/dict_of_sets_with_default.html
doc/_modules/examples/association/proxied_association.html
doc/_modules/examples/custom_attributes/custom_management.html
doc/_modules/examples/custom_attributes/listen_for_events.html
doc/_modules/examples/dogpile_caching/advanced.html
doc/_modules/examples/dogpile_caching/caching_query.html
doc/_modules/examples/dogpile_caching/environment.html
doc/_modules/examples/dogpile_caching/fixture_data.html
doc/_modules/examples/dogpile_caching/helloworld.html
doc/_modules/examples/dogpile_caching/local_session_caching.html
doc/_modules/examples/dogpile_caching/model.html
doc/_modules/examples/dogpile_caching/relationship_caching.html
doc/_modules/examples/dynamic_dict/dynamic_dict.html
doc/_modules/examples/elementtree/adjacency_list.html
doc/_modules/examples/elementtree/optimized_al.html
doc/_modules/examples/elementtree/pickle.html
doc/_modules/examples/generic_associations/discriminator_on_association.html
doc/_modules/examples/generic_associations/generic_fk.html
doc/_modules/examples/generic_associations/table_per_association.html
doc/_modules/examples/generic_associations/table_per_related.html
doc/_modules/examples/graphs/directed_graph.html
doc/_modules/examples/inheritance/concrete.html
doc/_modules/examples/inheritance/joined.html
doc/_modules/examples/inheritance/single.html
doc/_modules/examples/join_conditions/cast.html
doc/_modules/examples/join_conditions/threeway.html
doc/_modules/examples/large_collection/large_collection.html
doc/_modules/examples/materialized_paths/materialized_paths.html
doc/_modules/examples/nested_sets/nested_sets.html
doc/_modules/examples/postgis/postgis.html
doc/_modules/examples/sharding/attribute_shard.html
doc/_modules/examples/versioned_history/history_meta.html
doc/_modules/examples/versioned_history/test_versioning.html
doc/_modules/examples/versioned_rows/versioned_map.html
doc/_modules/examples/versioned_rows/versioned_rows.html
doc/_modules/examples/vertical/dictlike-polymorphic.html
doc/_modules/examples/vertical/dictlike.html
doc/_static/basic.css
doc/_static/changelog.css
doc/_static/comment-bright.png
doc/_static/comment-close.png
doc/_static/comment.png
doc/_static/default.css
doc/_static/detectmobile.js
doc/_static/docs.css
doc/_static/doctools.js
doc/_static/down-pressed.png
doc/_static/down.png
doc/_static/file.png
doc/_static/init.js
doc/_static/jquery.js
doc/_static/minus.png
doc/_static/plus.png
doc/_static/pygments.css
doc/_static/searchtools.js
doc/_static/sidebar.js
doc/_static/sphinx_paramlinks.css
doc/_static/underscore.js
doc/_static/up-pressed.png
doc/_static/up.png
doc/_static/websupport.js
doc/build/Makefile
doc/build/conf.py
doc/build/contents.rst
doc/build/copyright.rst
doc/build/faq.rst
doc/build/glossary.rst
doc/build/index.rst
doc/build/intro.rst
doc/build/requirements.txt
doc/build/sqla_arch_small.png
doc/build/testdocs.py
doc/build/builder/__init__.py
doc/build/builder/autodoc_mods.py
doc/build/builder/dialect_info.py
doc/build/builder/mako.py
doc/build/builder/sqlformatter.py
doc/build/builder/util.py
doc/build/builder/viewsource.py
doc/build/changelog/changelog_01.rst
doc/build/changelog/changelog_02.rst
doc/build/changelog/changelog_03.rst
doc/build/changelog/changelog_04.rst
doc/build/changelog/changelog_05.rst
doc/build/changelog/changelog_06.rst
doc/build/changelog/changelog_07.rst
doc/build/changelog/changelog_08.rst
doc/build/changelog/changelog_09.rst
doc/build/changelog/index.rst
doc/build/changelog/migration_04.rst
doc/build/changelog/migration_05.rst
doc/build/changelog/migration_06.rst
doc/build/changelog/migration_07.rst
doc/build/changelog/migration_08.rst
doc/build/changelog/migration_09.rst
doc/build/core/compiler.rst
doc/build/core/connections.rst
doc/build/core/constraints.rst
doc/build/core/ddl.rst
doc/build/core/defaults.rst
doc/build/core/dml.rst
doc/build/core/engines.rst
doc/build/core/event.rst
doc/build/core/events.rst
doc/build/core/exceptions.rst
doc/build/core/expression_api.rst
doc/build/core/functions.rst
doc/build/core/index.rst
doc/build/core/inspection.rst
doc/build/core/interfaces.rst
doc/build/core/internals.rst
doc/build/core/metadata.rst
doc/build/core/pooling.rst
doc/build/core/reflection.rst
doc/build/core/schema.rst
doc/build/core/selectable.rst
doc/build/core/serializer.rst
doc/build/core/sqla_engine_arch.png
doc/build/core/sqlelement.rst
doc/build/core/tutorial.rst
doc/build/core/types.rst
doc/build/dialects/drizzle.rst
doc/build/dialects/firebird.rst
doc/build/dialects/index.rst
doc/build/dialects/mssql.rst
doc/build/dialects/mysql.rst
doc/build/dialects/oracle.rst
doc/build/dialects/postgresql.rst
doc/build/dialects/sqlite.rst
doc/build/dialects/sybase.rst
doc/build/orm/collections.rst
doc/build/orm/deprecated.rst
doc/build/orm/events.rst
doc/build/orm/examples.rst
doc/build/orm/exceptions.rst
doc/build/orm/index.rst
doc/build/orm/inheritance.rst
doc/build/orm/internals.rst
doc/build/orm/loading.rst
doc/build/orm/mapper_config.rst
doc/build/orm/query.rst
doc/build/orm/relationships.rst
doc/build/orm/session.rst
doc/build/orm/tutorial.rst
doc/build/orm/extensions/associationproxy.rst
doc/build/orm/extensions/automap.rst
doc/build/orm/extensions/declarative.rst
doc/build/orm/extensions/horizontal_shard.rst
doc/build/orm/extensions/hybrid.rst
doc/build/orm/extensions/index.rst
doc/build/orm/extensions/instrumentation.rst
doc/build/orm/extensions/mutable.rst
doc/build/orm/extensions/orderinglist.rst
doc/build/static/detectmobile.js
doc/build/static/docs.css
doc/build/static/init.js
doc/build/templates/genindex.mako
doc/build/templates/layout.mako
doc/build/templates/page.mako
doc/build/templates/search.mako
doc/build/templates/static_base.mako
doc/build/texinputs/Makefile
doc/build/texinputs/sphinx.sty
doc/changelog/changelog_01.html
doc/changelog/changelog_02.html
doc/changelog/changelog_03.html
doc/changelog/changelog_04.html
doc/changelog/changelog_05.html
doc/changelog/changelog_06.html
doc/changelog/changelog_07.html
doc/changelog/changelog_08.html
doc/changelog/changelog_09.html
doc/changelog/index.html
doc/changelog/migration_04.html
doc/changelog/migration_05.html
doc/changelog/migration_06.html
doc/changelog/migration_07.html
doc/changelog/migration_08.html
doc/changelog/migration_09.html
doc/core/compiler.html
doc/core/connections.html
doc/core/constraints.html
doc/core/ddl.html
doc/core/defaults.html
doc/core/dml.html
doc/core/engines.html
doc/core/event.html
doc/core/events.html
doc/core/exceptions.html
doc/core/expression_api.html
doc/core/functions.html
doc/core/index.html
doc/core/inspection.html
doc/core/interfaces.html
doc/core/internals.html
doc/core/metadata.html
doc/core/pooling.html
doc/core/reflection.html
doc/core/schema.html
doc/core/selectable.html
doc/core/serializer.html
doc/core/sqlelement.html
doc/core/tutorial.html
doc/core/types.html
doc/dialects/drizzle.html
doc/dialects/firebird.html
doc/dialects/index.html
doc/dialects/mssql.html
doc/dialects/mysql.html
doc/dialects/oracle.html
doc/dialects/postgresql.html
doc/dialects/sqlite.html
doc/dialects/sybase.html
doc/orm/collections.html
doc/orm/deprecated.html
doc/orm/events.html
doc/orm/examples.html
doc/orm/exceptions.html
doc/orm/index.html
doc/orm/inheritance.html
doc/orm/internals.html
doc/orm/loading.html
doc/orm/mapper_config.html
doc/orm/query.html
doc/orm/relationships.html
doc/orm/session.html
doc/orm/tutorial.html
doc/orm/extensions/associationproxy.html
doc/orm/extensions/automap.html
doc/orm/extensions/declarative.html
doc/orm/extensions/horizontal_shard.html
doc/orm/extensions/hybrid.html
doc/orm/extensions/index.html
doc/orm/extensions/instrumentation.html
doc/orm/extensions/mutable.html
doc/orm/extensions/orderinglist.html
examples/__init__.py
examples/adjacency_list/__init__.py
examples/adjacency_list/adjacency_list.py
examples/association/__init__.py
examples/association/basic_association.py
examples/association/dict_of_sets_with_default.py
examples/association/proxied_association.py
examples/custom_attributes/__init__.py
examples/custom_attributes/custom_management.py
examples/custom_attributes/listen_for_events.py
examples/dogpile_caching/__init__.py
examples/dogpile_caching/advanced.py
examples/dogpile_caching/caching_query.py
examples/dogpile_caching/environment.py
examples/dogpile_caching/fixture_data.py
examples/dogpile_caching/helloworld.py
examples/dogpile_caching/local_session_caching.py
examples/dogpile_caching/model.py
examples/dogpile_caching/relationship_caching.py
examples/dynamic_dict/__init__.py
examples/dynamic_dict/dynamic_dict.py
examples/elementtree/__init__.py
examples/elementtree/adjacency_list.py
examples/elementtree/optimized_al.py
examples/elementtree/pickle.py
examples/elementtree/test.xml
examples/elementtree/test2.xml
examples/elementtree/test3.xml
examples/generic_associations/__init__.py
examples/generic_associations/discriminator_on_association.py
examples/generic_associations/generic_fk.py
examples/generic_associations/table_per_association.py
examples/generic_associations/table_per_related.py
examples/graphs/__init__.py
examples/graphs/directed_graph.py
examples/inheritance/__init__.py
examples/inheritance/concrete.py
examples/inheritance/joined.py
examples/inheritance/single.py
examples/join_conditions/__init__.py
examples/join_conditions/cast.py
examples/join_conditions/threeway.py
examples/large_collection/__init__.py
examples/large_collection/large_collection.py
examples/materialized_paths/__init__.py
examples/materialized_paths/materialized_paths.py
examples/nested_sets/__init__.py
examples/nested_sets/nested_sets.py
examples/postgis/__init__.py
examples/postgis/postgis.py
examples/sharding/__init__.py
examples/sharding/attribute_shard.py
examples/versioned_history/__init__.py
examples/versioned_history/history_meta.py
examples/versioned_history/test_versioning.py
examples/versioned_rows/__init__.py
examples/versioned_rows/versioned_map.py
examples/versioned_rows/versioned_rows.py
examples/vertical/__init__.py
examples/vertical/dictlike-polymorphic.py
examples/vertical/dictlike.py
lib/SQLAlchemy.egg-info/PKG-INFO
lib/SQLAlchemy.egg-info/SOURCES.txt
lib/SQLAlchemy.egg-info/dependency_links.txt
lib/SQLAlchemy.egg-info/top_level.txt
lib/sqlalchemy/__init__.py
lib/sqlalchemy/events.py
lib/sqlalchemy/exc.py
lib/sqlalchemy/inspection.py
lib/sqlalchemy/interfaces.py
lib/sqlalchemy/log.py
lib/sqlalchemy/pool.py
lib/sqlalchemy/processors.py
lib/sqlalchemy/schema.py
lib/sqlalchemy/types.py
lib/sqlalchemy/cextension/processors.c
lib/sqlalchemy/cextension/resultproxy.c
lib/sqlalchemy/cextension/utils.c
lib/sqlalchemy/connectors/__init__.py
lib/sqlalchemy/connectors/mxodbc.py
lib/sqlalchemy/connectors/mysqldb.py
lib/sqlalchemy/connectors/pyodbc.py
lib/sqlalchemy/connectors/zxJDBC.py
lib/sqlalchemy/databases/__init__.py
lib/sqlalchemy/dialects/__init__.py
lib/sqlalchemy/dialects/postgres.py
lib/sqlalchemy/dialects/type_migration_guidelines.txt
lib/sqlalchemy/dialects/drizzle/__init__.py
lib/sqlalchemy/dialects/drizzle/base.py
lib/sqlalchemy/dialects/drizzle/mysqldb.py
lib/sqlalchemy/dialects/firebird/__init__.py
lib/sqlalchemy/dialects/firebird/base.py
lib/sqlalchemy/dialects/firebird/fdb.py
lib/sqlalchemy/dialects/firebird/kinterbasdb.py
lib/sqlalchemy/dialects/mssql/__init__.py
lib/sqlalchemy/dialects/mssql/adodbapi.py
lib/sqlalchemy/dialects/mssql/base.py
lib/sqlalchemy/dialects/mssql/information_schema.py
lib/sqlalchemy/dialects/mssql/mxodbc.py
lib/sqlalchemy/dialects/mssql/pymssql.py
lib/sqlalchemy/dialects/mssql/pyodbc.py
lib/sqlalchemy/dialects/mssql/zxjdbc.py
lib/sqlalchemy/dialects/mysql/__init__.py
lib/sqlalchemy/dialects/mysql/base.py
lib/sqlalchemy/dialects/mysql/cymysql.py
lib/sqlalchemy/dialects/mysql/gaerdbms.py
lib/sqlalchemy/dialects/mysql/mysqlconnector.py
lib/sqlalchemy/dialects/mysql/mysqldb.py
lib/sqlalchemy/dialects/mysql/oursql.py
lib/sqlalchemy/dialects/mysql/pymysql.py
lib/sqlalchemy/dialects/mysql/pyodbc.py
lib/sqlalchemy/dialects/mysql/zxjdbc.py
lib/sqlalchemy/dialects/oracle/__init__.py
lib/sqlalchemy/dialects/oracle/base.py
lib/sqlalchemy/dialects/oracle/cx_oracle.py
lib/sqlalchemy/dialects/oracle/zxjdbc.py
lib/sqlalchemy/dialects/postgresql/__init__.py
lib/sqlalchemy/dialects/postgresql/base.py
lib/sqlalchemy/dialects/postgresql/constraints.py
lib/sqlalchemy/dialects/postgresql/hstore.py
lib/sqlalchemy/dialects/postgresql/json.py
lib/sqlalchemy/dialects/postgresql/pg8000.py
lib/sqlalchemy/dialects/postgresql/psycopg2.py
lib/sqlalchemy/dialects/postgresql/pypostgresql.py
lib/sqlalchemy/dialects/postgresql/ranges.py
lib/sqlalchemy/dialects/postgresql/zxjdbc.py
lib/sqlalchemy/dialects/sqlite/__init__.py
lib/sqlalchemy/dialects/sqlite/base.py
lib/sqlalchemy/dialects/sqlite/pysqlite.py
lib/sqlalchemy/dialects/sybase/__init__.py
lib/sqlalchemy/dialects/sybase/base.py
lib/sqlalchemy/dialects/sybase/mxodbc.py
lib/sqlalchemy/dialects/sybase/pyodbc.py
lib/sqlalchemy/dialects/sybase/pysybase.py
lib/sqlalchemy/engine/__init__.py
lib/sqlalchemy/engine/base.py
lib/sqlalchemy/engine/default.py
lib/sqlalchemy/engine/interfaces.py
lib/sqlalchemy/engine/reflection.py
lib/sqlalchemy/engine/result.py
lib/sqlalchemy/engine/strategies.py
lib/sqlalchemy/engine/threadlocal.py
lib/sqlalchemy/engine/url.py
lib/sqlalchemy/engine/util.py
lib/sqlalchemy/event/__init__.py
lib/sqlalchemy/event/api.py
lib/sqlalchemy/event/attr.py
lib/sqlalchemy/event/base.py
lib/sqlalchemy/event/legacy.py
lib/sqlalchemy/event/registry.py
lib/sqlalchemy/ext/__init__.py
lib/sqlalchemy/ext/associationproxy.py
lib/sqlalchemy/ext/automap.py
lib/sqlalchemy/ext/compiler.py
lib/sqlalchemy/ext/horizontal_shard.py
lib/sqlalchemy/ext/hybrid.py
lib/sqlalchemy/ext/instrumentation.py
lib/sqlalchemy/ext/mutable.py
lib/sqlalchemy/ext/orderinglist.py
lib/sqlalchemy/ext/serializer.py
lib/sqlalchemy/ext/declarative/__init__.py
lib/sqlalchemy/ext/declarative/api.py
lib/sqlalchemy/ext/declarative/base.py
lib/sqlalchemy/ext/declarative/clsregistry.py
lib/sqlalchemy/orm/__init__.py
lib/sqlalchemy/orm/attributes.py
lib/sqlalchemy/orm/base.py
lib/sqlalchemy/orm/collections.py
lib/sqlalchemy/orm/dependency.py
lib/sqlalchemy/orm/deprecated_interfaces.py
lib/sqlalchemy/orm/descriptor_props.py
lib/sqlalchemy/orm/dynamic.py
lib/sqlalchemy/orm/evaluator.py
lib/sqlalchemy/orm/events.py
lib/sqlalchemy/orm/exc.py
lib/sqlalchemy/orm/identity.py
lib/sqlalchemy/orm/instrumentation.py
lib/sqlalchemy/orm/interfaces.py
lib/sqlalchemy/orm/loading.py
lib/sqlalchemy/orm/mapper.py
lib/sqlalchemy/orm/path_registry.py
lib/sqlalchemy/orm/persistence.py
lib/sqlalchemy/orm/properties.py
lib/sqlalchemy/orm/query.py
lib/sqlalchemy/orm/relationships.py
lib/sqlalchemy/orm/scoping.py
lib/sqlalchemy/orm/session.py
lib/sqlalchemy/orm/state.py
lib/sqlalchemy/orm/strategies.py
lib/sqlalchemy/orm/strategy_options.py
lib/sqlalchemy/orm/sync.py
lib/sqlalchemy/orm/unitofwork.py
lib/sqlalchemy/orm/util.py
lib/sqlalchemy/sql/__init__.py
lib/sqlalchemy/sql/annotation.py
lib/sqlalchemy/sql/base.py
lib/sqlalchemy/sql/compiler.py
lib/sqlalchemy/sql/ddl.py
lib/sqlalchemy/sql/default_comparator.py
lib/sqlalchemy/sql/dml.py
lib/sqlalchemy/sql/elements.py
lib/sqlalchemy/sql/expression.py
lib/sqlalchemy/sql/functions.py
lib/sqlalchemy/sql/naming.py
lib/sqlalchemy/sql/operators.py
lib/sqlalchemy/sql/schema.py
lib/sqlalchemy/sql/selectable.py
lib/sqlalchemy/sql/sqltypes.py
lib/sqlalchemy/sql/type_api.py
lib/sqlalchemy/sql/util.py
lib/sqlalchemy/sql/visitors.py
lib/sqlalchemy/testing/__init__.py
lib/sqlalchemy/testing/assertions.py
lib/sqlalchemy/testing/assertsql.py
lib/sqlalchemy/testing/config.py
lib/sqlalchemy/testing/distutils_run.py
lib/sqlalchemy/testing/engines.py
lib/sqlalchemy/testing/entities.py
lib/sqlalchemy/testing/exclusions.py
lib/sqlalchemy/testing/fixtures.py
lib/sqlalchemy/testing/mock.py
lib/sqlalchemy/testing/pickleable.py
lib/sqlalchemy/testing/profiling.py
lib/sqlalchemy/testing/requirements.py
lib/sqlalchemy/testing/runner.py
lib/sqlalchemy/testing/schema.py
lib/sqlalchemy/testing/util.py
lib/sqlalchemy/testing/warnings.py
lib/sqlalchemy/testing/plugin/__init__.py
lib/sqlalchemy/testing/plugin/noseplugin.py
lib/sqlalchemy/testing/plugin/plugin_base.py
lib/sqlalchemy/testing/plugin/pytestplugin.py
lib/sqlalchemy/testing/suite/__init__.py
lib/sqlalchemy/testing/suite/test_ddl.py
lib/sqlalchemy/testing/suite/test_insert.py
lib/sqlalchemy/testing/suite/test_reflection.py
lib/sqlalchemy/testing/suite/test_results.py
lib/sqlalchemy/testing/suite/test_select.py
lib/sqlalchemy/testing/suite/test_sequence.py
lib/sqlalchemy/testing/suite/test_types.py
lib/sqlalchemy/testing/suite/test_update_delete.py
lib/sqlalchemy/util/__init__.py
lib/sqlalchemy/util/_collections.py
lib/sqlalchemy/util/compat.py
lib/sqlalchemy/util/deprecations.py
lib/sqlalchemy/util/langhelpers.py
lib/sqlalchemy/util/queue.py
lib/sqlalchemy/util/topological.py
test/__init__.py
test/binary_data_one.dat
test/binary_data_two.dat
test/conftest.py
test/requirements.py
test/aaa_profiling/__init__.py
test/aaa_profiling/test_compiler.py
test/aaa_profiling/test_memusage.py
test/aaa_profiling/test_orm.py
test/aaa_profiling/test_pool.py
test/aaa_profiling/test_resultset.py
test/aaa_profiling/test_zoomark.py
test/aaa_profiling/test_zoomark_orm.py
test/base/__init__.py
test/base/test_dependency.py
test/base/test_events.py
test/base/test_except.py
test/base/test_inspect.py
test/base/test_utils.py
test/dialect/__init__.py
test/dialect/test_firebird.py
test/dialect/test_mxodbc.py
test/dialect/test_oracle.py
test/dialect/test_pyodbc.py
test/dialect/test_sqlite.py
test/dialect/test_suite.py
test/dialect/test_sybase.py
test/dialect/mssql/__init__.py
test/dialect/mssql/test_compiler.py
test/dialect/mssql/test_engine.py
test/dialect/mssql/test_query.py
test/dialect/mssql/test_reflection.py
test/dialect/mssql/test_types.py
test/dialect/mysql/__init__.py
test/dialect/mysql/test_compiler.py
test/dialect/mysql/test_dialect.py
test/dialect/mysql/test_query.py
test/dialect/mysql/test_reflection.py
test/dialect/mysql/test_types.py
test/dialect/postgresql/__init__.py
test/dialect/postgresql/test_compiler.py
test/dialect/postgresql/test_dialect.py
test/dialect/postgresql/test_query.py
test/dialect/postgresql/test_reflection.py
test/dialect/postgresql/test_types.py
test/engine/__init__.py
test/engine/test_bind.py
test/engine/test_ddlevents.py
test/engine/test_execute.py
test/engine/test_logging.py
test/engine/test_parseconnect.py
test/engine/test_pool.py
test/engine/test_processors.py
test/engine/test_reconnect.py
test/engine/test_reflection.py
test/engine/test_transaction.py
test/ext/__init__.py
test/ext/test_associationproxy.py
test/ext/test_automap.py
test/ext/test_compiler.py
test/ext/test_extendedattr.py
test/ext/test_horizontal_shard.py
test/ext/test_hybrid.py
test/ext/test_mutable.py
test/ext/test_orderinglist.py
test/ext/test_serializer.py
test/ext/declarative/__init__.py
test/ext/declarative/test_basic.py
test/ext/declarative/test_clsregistry.py
test/ext/declarative/test_inheritance.py
test/ext/declarative/test_mixin.py
test/ext/declarative/test_reflection.py
test/orm/__init__.py
test/orm/_fixtures.py
test/orm/test_association.py
test/orm/test_assorted_eager.py
test/orm/test_attributes.py
test/orm/test_backref_mutations.py
|
||||||
|
test/orm/test_bind.py
|
||||||
|
test/orm/test_bundle.py
|
||||||
|
test/orm/test_cascade.py
|
||||||
|
test/orm/test_collection.py
|
||||||
|
test/orm/test_compile.py
|
||||||
|
test/orm/test_composites.py
|
||||||
|
test/orm/test_cycles.py
|
||||||
|
test/orm/test_default_strategies.py
|
||||||
|
test/orm/test_defaults.py
|
||||||
|
test/orm/test_deferred.py
|
||||||
|
test/orm/test_deprecations.py
|
||||||
|
test/orm/test_descriptor.py
|
||||||
|
test/orm/test_dynamic.py
|
||||||
|
test/orm/test_eager_relations.py
|
||||||
|
test/orm/test_evaluator.py
|
||||||
|
test/orm/test_events.py
|
||||||
|
test/orm/test_expire.py
|
||||||
|
test/orm/test_froms.py
|
||||||
|
test/orm/test_generative.py
|
||||||
|
test/orm/test_hasparent.py
|
||||||
|
test/orm/test_immediate_load.py
|
||||||
|
test/orm/test_inspect.py
|
||||||
|
test/orm/test_instrumentation.py
|
||||||
|
test/orm/test_joins.py
|
||||||
|
test/orm/test_lazy_relations.py
|
||||||
|
test/orm/test_load_on_fks.py
|
||||||
|
test/orm/test_loading.py
|
||||||
|
test/orm/test_lockmode.py
|
||||||
|
test/orm/test_manytomany.py
|
||||||
|
test/orm/test_mapper.py
|
||||||
|
test/orm/test_merge.py
|
||||||
|
test/orm/test_naturalpks.py
|
||||||
|
test/orm/test_of_type.py
|
||||||
|
test/orm/test_onetoone.py
|
||||||
|
test/orm/test_options.py
|
||||||
|
test/orm/test_pickled.py
|
||||||
|
test/orm/test_query.py
|
||||||
|
test/orm/test_rel_fn.py
|
||||||
|
test/orm/test_relationships.py
|
||||||
|
test/orm/test_scoping.py
|
||||||
|
test/orm/test_selectable.py
|
||||||
|
test/orm/test_session.py
|
||||||
|
test/orm/test_subquery_relations.py
|
||||||
|
test/orm/test_sync.py
|
||||||
|
test/orm/test_transaction.py
|
||||||
|
test/orm/test_unitofwork.py
|
||||||
|
test/orm/test_unitofworkv2.py
|
||||||
|
test/orm/test_update_delete.py
|
||||||
|
test/orm/test_utils.py
|
||||||
|
test/orm/test_validators.py
|
||||||
|
test/orm/test_versioning.py
|
||||||
|
test/orm/inheritance/__init__.py
|
||||||
|
test/orm/inheritance/_poly_fixtures.py
|
||||||
|
test/orm/inheritance/test_abc_inheritance.py
|
||||||
|
test/orm/inheritance/test_abc_polymorphic.py
|
||||||
|
test/orm/inheritance/test_assorted_poly.py
|
||||||
|
test/orm/inheritance/test_basic.py
|
||||||
|
test/orm/inheritance/test_concrete.py
|
||||||
|
test/orm/inheritance/test_magazine.py
|
||||||
|
test/orm/inheritance/test_manytomany.py
|
||||||
|
test/orm/inheritance/test_poly_linked_list.py
|
||||||
|
test/orm/inheritance/test_poly_persistence.py
|
||||||
|
test/orm/inheritance/test_polymorphic_rel.py
|
||||||
|
test/orm/inheritance/test_productspec.py
|
||||||
|
test/orm/inheritance/test_relationship.py
|
||||||
|
test/orm/inheritance/test_selects.py
|
||||||
|
test/orm/inheritance/test_single.py
|
||||||
|
test/orm/inheritance/test_with_poly.py
|
||||||
|
test/perf/invalidate_stresstest.py
|
||||||
|
test/perf/orm2010.py
|
||||||
|
test/sql/__init__.py
|
||||||
|
test/sql/test_case_statement.py
|
||||||
|
test/sql/test_compiler.py
|
||||||
|
test/sql/test_constraints.py
|
||||||
|
test/sql/test_cte.py
|
||||||
|
test/sql/test_ddlemit.py
|
||||||
|
test/sql/test_defaults.py
|
||||||
|
test/sql/test_delete.py
|
||||||
|
test/sql/test_functions.py
|
||||||
|
test/sql/test_generative.py
|
||||||
|
test/sql/test_insert.py
|
||||||
|
test/sql/test_inspect.py
|
||||||
|
test/sql/test_join_rewriting.py
|
||||||
|
test/sql/test_labels.py
|
||||||
|
test/sql/test_metadata.py
|
||||||
|
test/sql/test_operators.py
|
||||||
|
test/sql/test_query.py
|
||||||
|
test/sql/test_quote.py
|
||||||
|
test/sql/test_returning.py
|
||||||
|
test/sql/test_rowcount.py
|
||||||
|
test/sql/test_selectable.py
|
||||||
|
test/sql/test_text.py
|
||||||
|
test/sql/test_type_expressions.py
|
||||||
|
test/sql/test_types.py
|
||||||
|
test/sql/test_unicode.py
|
||||||
|
test/sql/test_update.py
|
|
@ -0,0 +1 @@
@ -0,0 +1,366 @@
../sqlalchemy/__init__.py
../sqlalchemy/events.py
../sqlalchemy/exc.py
../sqlalchemy/inspection.py
../sqlalchemy/interfaces.py
../sqlalchemy/log.py
../sqlalchemy/pool.py
../sqlalchemy/processors.py
../sqlalchemy/schema.py
../sqlalchemy/types.py
../sqlalchemy/connectors/__init__.py
../sqlalchemy/connectors/mxodbc.py
../sqlalchemy/connectors/mysqldb.py
../sqlalchemy/connectors/pyodbc.py
../sqlalchemy/connectors/zxJDBC.py
../sqlalchemy/databases/__init__.py
../sqlalchemy/dialects/__init__.py
../sqlalchemy/dialects/postgres.py
../sqlalchemy/dialects/drizzle/__init__.py
../sqlalchemy/dialects/drizzle/base.py
../sqlalchemy/dialects/drizzle/mysqldb.py
../sqlalchemy/dialects/firebird/__init__.py
../sqlalchemy/dialects/firebird/base.py
../sqlalchemy/dialects/firebird/fdb.py
../sqlalchemy/dialects/firebird/kinterbasdb.py
../sqlalchemy/dialects/mssql/__init__.py
../sqlalchemy/dialects/mssql/adodbapi.py
../sqlalchemy/dialects/mssql/base.py
../sqlalchemy/dialects/mssql/information_schema.py
../sqlalchemy/dialects/mssql/mxodbc.py
../sqlalchemy/dialects/mssql/pymssql.py
../sqlalchemy/dialects/mssql/pyodbc.py
../sqlalchemy/dialects/mssql/zxjdbc.py
../sqlalchemy/dialects/mysql/__init__.py
../sqlalchemy/dialects/mysql/base.py
../sqlalchemy/dialects/mysql/cymysql.py
../sqlalchemy/dialects/mysql/gaerdbms.py
../sqlalchemy/dialects/mysql/mysqlconnector.py
../sqlalchemy/dialects/mysql/mysqldb.py
../sqlalchemy/dialects/mysql/oursql.py
../sqlalchemy/dialects/mysql/pymysql.py
../sqlalchemy/dialects/mysql/pyodbc.py
../sqlalchemy/dialects/mysql/zxjdbc.py
../sqlalchemy/dialects/oracle/__init__.py
../sqlalchemy/dialects/oracle/base.py
../sqlalchemy/dialects/oracle/cx_oracle.py
../sqlalchemy/dialects/oracle/zxjdbc.py
../sqlalchemy/dialects/postgresql/__init__.py
../sqlalchemy/dialects/postgresql/base.py
../sqlalchemy/dialects/postgresql/constraints.py
../sqlalchemy/dialects/postgresql/hstore.py
../sqlalchemy/dialects/postgresql/json.py
../sqlalchemy/dialects/postgresql/pg8000.py
../sqlalchemy/dialects/postgresql/psycopg2.py
../sqlalchemy/dialects/postgresql/pypostgresql.py
../sqlalchemy/dialects/postgresql/ranges.py
../sqlalchemy/dialects/postgresql/zxjdbc.py
../sqlalchemy/dialects/sqlite/__init__.py
../sqlalchemy/dialects/sqlite/base.py
../sqlalchemy/dialects/sqlite/pysqlite.py
../sqlalchemy/dialects/sybase/__init__.py
../sqlalchemy/dialects/sybase/base.py
../sqlalchemy/dialects/sybase/mxodbc.py
../sqlalchemy/dialects/sybase/pyodbc.py
../sqlalchemy/dialects/sybase/pysybase.py
../sqlalchemy/engine/__init__.py
../sqlalchemy/engine/base.py
../sqlalchemy/engine/default.py
../sqlalchemy/engine/interfaces.py
../sqlalchemy/engine/reflection.py
../sqlalchemy/engine/result.py
../sqlalchemy/engine/strategies.py
../sqlalchemy/engine/threadlocal.py
../sqlalchemy/engine/url.py
../sqlalchemy/engine/util.py
../sqlalchemy/event/__init__.py
../sqlalchemy/event/api.py
../sqlalchemy/event/attr.py
../sqlalchemy/event/base.py
../sqlalchemy/event/legacy.py
../sqlalchemy/event/registry.py
../sqlalchemy/ext/__init__.py
../sqlalchemy/ext/associationproxy.py
../sqlalchemy/ext/automap.py
../sqlalchemy/ext/compiler.py
../sqlalchemy/ext/horizontal_shard.py
../sqlalchemy/ext/hybrid.py
../sqlalchemy/ext/instrumentation.py
../sqlalchemy/ext/mutable.py
../sqlalchemy/ext/orderinglist.py
../sqlalchemy/ext/serializer.py
../sqlalchemy/ext/declarative/__init__.py
../sqlalchemy/ext/declarative/api.py
../sqlalchemy/ext/declarative/base.py
../sqlalchemy/ext/declarative/clsregistry.py
../sqlalchemy/orm/__init__.py
../sqlalchemy/orm/attributes.py
../sqlalchemy/orm/base.py
../sqlalchemy/orm/collections.py
../sqlalchemy/orm/dependency.py
../sqlalchemy/orm/deprecated_interfaces.py
../sqlalchemy/orm/descriptor_props.py
../sqlalchemy/orm/dynamic.py
../sqlalchemy/orm/evaluator.py
../sqlalchemy/orm/events.py
../sqlalchemy/orm/exc.py
../sqlalchemy/orm/identity.py
../sqlalchemy/orm/instrumentation.py
../sqlalchemy/orm/interfaces.py
../sqlalchemy/orm/loading.py
../sqlalchemy/orm/mapper.py
../sqlalchemy/orm/path_registry.py
../sqlalchemy/orm/persistence.py
../sqlalchemy/orm/properties.py
../sqlalchemy/orm/query.py
../sqlalchemy/orm/relationships.py
../sqlalchemy/orm/scoping.py
../sqlalchemy/orm/session.py
../sqlalchemy/orm/state.py
../sqlalchemy/orm/strategies.py
../sqlalchemy/orm/strategy_options.py
../sqlalchemy/orm/sync.py
../sqlalchemy/orm/unitofwork.py
../sqlalchemy/orm/util.py
../sqlalchemy/sql/__init__.py
../sqlalchemy/sql/annotation.py
../sqlalchemy/sql/base.py
../sqlalchemy/sql/compiler.py
../sqlalchemy/sql/ddl.py
../sqlalchemy/sql/default_comparator.py
../sqlalchemy/sql/dml.py
../sqlalchemy/sql/elements.py
../sqlalchemy/sql/expression.py
../sqlalchemy/sql/functions.py
../sqlalchemy/sql/naming.py
../sqlalchemy/sql/operators.py
../sqlalchemy/sql/schema.py
../sqlalchemy/sql/selectable.py
../sqlalchemy/sql/sqltypes.py
../sqlalchemy/sql/type_api.py
../sqlalchemy/sql/util.py
../sqlalchemy/sql/visitors.py
../sqlalchemy/testing/__init__.py
../sqlalchemy/testing/assertions.py
../sqlalchemy/testing/assertsql.py
../sqlalchemy/testing/config.py
../sqlalchemy/testing/distutils_run.py
../sqlalchemy/testing/engines.py
../sqlalchemy/testing/entities.py
../sqlalchemy/testing/exclusions.py
../sqlalchemy/testing/fixtures.py
../sqlalchemy/testing/mock.py
../sqlalchemy/testing/pickleable.py
../sqlalchemy/testing/profiling.py
../sqlalchemy/testing/requirements.py
../sqlalchemy/testing/runner.py
../sqlalchemy/testing/schema.py
../sqlalchemy/testing/util.py
../sqlalchemy/testing/warnings.py
../sqlalchemy/testing/plugin/__init__.py
../sqlalchemy/testing/plugin/noseplugin.py
../sqlalchemy/testing/plugin/plugin_base.py
../sqlalchemy/testing/plugin/pytestplugin.py
../sqlalchemy/testing/suite/__init__.py
../sqlalchemy/testing/suite/test_ddl.py
../sqlalchemy/testing/suite/test_insert.py
../sqlalchemy/testing/suite/test_reflection.py
../sqlalchemy/testing/suite/test_results.py
../sqlalchemy/testing/suite/test_select.py
../sqlalchemy/testing/suite/test_sequence.py
../sqlalchemy/testing/suite/test_types.py
../sqlalchemy/testing/suite/test_update_delete.py
../sqlalchemy/util/__init__.py
../sqlalchemy/util/_collections.py
../sqlalchemy/util/compat.py
../sqlalchemy/util/deprecations.py
../sqlalchemy/util/langhelpers.py
../sqlalchemy/util/queue.py
../sqlalchemy/util/topological.py
../sqlalchemy/__pycache__/__init__.cpython-34.pyc
../sqlalchemy/__pycache__/events.cpython-34.pyc
../sqlalchemy/__pycache__/exc.cpython-34.pyc
../sqlalchemy/__pycache__/inspection.cpython-34.pyc
../sqlalchemy/__pycache__/interfaces.cpython-34.pyc
../sqlalchemy/__pycache__/log.cpython-34.pyc
../sqlalchemy/__pycache__/pool.cpython-34.pyc
../sqlalchemy/__pycache__/processors.cpython-34.pyc
../sqlalchemy/__pycache__/schema.cpython-34.pyc
../sqlalchemy/__pycache__/types.cpython-34.pyc
../sqlalchemy/connectors/__pycache__/__init__.cpython-34.pyc
../sqlalchemy/connectors/__pycache__/mxodbc.cpython-34.pyc
../sqlalchemy/connectors/__pycache__/mysqldb.cpython-34.pyc
../sqlalchemy/connectors/__pycache__/pyodbc.cpython-34.pyc
../sqlalchemy/connectors/__pycache__/zxJDBC.cpython-34.pyc
../sqlalchemy/databases/__pycache__/__init__.cpython-34.pyc
../sqlalchemy/dialects/__pycache__/__init__.cpython-34.pyc
../sqlalchemy/dialects/__pycache__/postgres.cpython-34.pyc
../sqlalchemy/dialects/drizzle/__pycache__/__init__.cpython-34.pyc
../sqlalchemy/dialects/drizzle/__pycache__/base.cpython-34.pyc
../sqlalchemy/dialects/drizzle/__pycache__/mysqldb.cpython-34.pyc
../sqlalchemy/dialects/firebird/__pycache__/__init__.cpython-34.pyc
../sqlalchemy/dialects/firebird/__pycache__/base.cpython-34.pyc
../sqlalchemy/dialects/firebird/__pycache__/fdb.cpython-34.pyc
../sqlalchemy/dialects/firebird/__pycache__/kinterbasdb.cpython-34.pyc
../sqlalchemy/dialects/mssql/__pycache__/__init__.cpython-34.pyc
../sqlalchemy/dialects/mssql/__pycache__/adodbapi.cpython-34.pyc
../sqlalchemy/dialects/mssql/__pycache__/base.cpython-34.pyc
../sqlalchemy/dialects/mssql/__pycache__/information_schema.cpython-34.pyc
../sqlalchemy/dialects/mssql/__pycache__/mxodbc.cpython-34.pyc
../sqlalchemy/dialects/mssql/__pycache__/pymssql.cpython-34.pyc
../sqlalchemy/dialects/mssql/__pycache__/pyodbc.cpython-34.pyc
../sqlalchemy/dialects/mssql/__pycache__/zxjdbc.cpython-34.pyc
../sqlalchemy/dialects/mysql/__pycache__/__init__.cpython-34.pyc
../sqlalchemy/dialects/mysql/__pycache__/base.cpython-34.pyc
../sqlalchemy/dialects/mysql/__pycache__/cymysql.cpython-34.pyc
../sqlalchemy/dialects/mysql/__pycache__/gaerdbms.cpython-34.pyc
../sqlalchemy/dialects/mysql/__pycache__/mysqlconnector.cpython-34.pyc
../sqlalchemy/dialects/mysql/__pycache__/mysqldb.cpython-34.pyc
../sqlalchemy/dialects/mysql/__pycache__/oursql.cpython-34.pyc
../sqlalchemy/dialects/mysql/__pycache__/pymysql.cpython-34.pyc
../sqlalchemy/dialects/mysql/__pycache__/pyodbc.cpython-34.pyc
../sqlalchemy/dialects/mysql/__pycache__/zxjdbc.cpython-34.pyc
../sqlalchemy/dialects/oracle/__pycache__/__init__.cpython-34.pyc
../sqlalchemy/dialects/oracle/__pycache__/base.cpython-34.pyc
../sqlalchemy/dialects/oracle/__pycache__/cx_oracle.cpython-34.pyc
../sqlalchemy/dialects/oracle/__pycache__/zxjdbc.cpython-34.pyc
../sqlalchemy/dialects/postgresql/__pycache__/__init__.cpython-34.pyc
../sqlalchemy/dialects/postgresql/__pycache__/base.cpython-34.pyc
../sqlalchemy/dialects/postgresql/__pycache__/constraints.cpython-34.pyc
../sqlalchemy/dialects/postgresql/__pycache__/hstore.cpython-34.pyc
../sqlalchemy/dialects/postgresql/__pycache__/json.cpython-34.pyc
../sqlalchemy/dialects/postgresql/__pycache__/pg8000.cpython-34.pyc
../sqlalchemy/dialects/postgresql/__pycache__/psycopg2.cpython-34.pyc
../sqlalchemy/dialects/postgresql/__pycache__/pypostgresql.cpython-34.pyc
../sqlalchemy/dialects/postgresql/__pycache__/ranges.cpython-34.pyc
../sqlalchemy/dialects/postgresql/__pycache__/zxjdbc.cpython-34.pyc
../sqlalchemy/dialects/sqlite/__pycache__/__init__.cpython-34.pyc
../sqlalchemy/dialects/sqlite/__pycache__/base.cpython-34.pyc
../sqlalchemy/dialects/sqlite/__pycache__/pysqlite.cpython-34.pyc
../sqlalchemy/dialects/sybase/__pycache__/__init__.cpython-34.pyc
../sqlalchemy/dialects/sybase/__pycache__/base.cpython-34.pyc
../sqlalchemy/dialects/sybase/__pycache__/mxodbc.cpython-34.pyc
../sqlalchemy/dialects/sybase/__pycache__/pyodbc.cpython-34.pyc
../sqlalchemy/dialects/sybase/__pycache__/pysybase.cpython-34.pyc
../sqlalchemy/engine/__pycache__/__init__.cpython-34.pyc
../sqlalchemy/engine/__pycache__/base.cpython-34.pyc
../sqlalchemy/engine/__pycache__/default.cpython-34.pyc
../sqlalchemy/engine/__pycache__/interfaces.cpython-34.pyc
../sqlalchemy/engine/__pycache__/reflection.cpython-34.pyc
../sqlalchemy/engine/__pycache__/result.cpython-34.pyc
../sqlalchemy/engine/__pycache__/strategies.cpython-34.pyc
../sqlalchemy/engine/__pycache__/threadlocal.cpython-34.pyc
../sqlalchemy/engine/__pycache__/url.cpython-34.pyc
../sqlalchemy/engine/__pycache__/util.cpython-34.pyc
../sqlalchemy/event/__pycache__/__init__.cpython-34.pyc
../sqlalchemy/event/__pycache__/api.cpython-34.pyc
../sqlalchemy/event/__pycache__/attr.cpython-34.pyc
../sqlalchemy/event/__pycache__/base.cpython-34.pyc
../sqlalchemy/event/__pycache__/legacy.cpython-34.pyc
../sqlalchemy/event/__pycache__/registry.cpython-34.pyc
../sqlalchemy/ext/__pycache__/__init__.cpython-34.pyc
../sqlalchemy/ext/__pycache__/associationproxy.cpython-34.pyc
../sqlalchemy/ext/__pycache__/automap.cpython-34.pyc
../sqlalchemy/ext/__pycache__/compiler.cpython-34.pyc
../sqlalchemy/ext/__pycache__/horizontal_shard.cpython-34.pyc
../sqlalchemy/ext/__pycache__/hybrid.cpython-34.pyc
../sqlalchemy/ext/__pycache__/instrumentation.cpython-34.pyc
../sqlalchemy/ext/__pycache__/mutable.cpython-34.pyc
../sqlalchemy/ext/__pycache__/orderinglist.cpython-34.pyc
../sqlalchemy/ext/__pycache__/serializer.cpython-34.pyc
../sqlalchemy/ext/declarative/__pycache__/__init__.cpython-34.pyc
../sqlalchemy/ext/declarative/__pycache__/api.cpython-34.pyc
../sqlalchemy/ext/declarative/__pycache__/base.cpython-34.pyc
../sqlalchemy/ext/declarative/__pycache__/clsregistry.cpython-34.pyc
../sqlalchemy/orm/__pycache__/__init__.cpython-34.pyc
../sqlalchemy/orm/__pycache__/attributes.cpython-34.pyc
../sqlalchemy/orm/__pycache__/base.cpython-34.pyc
../sqlalchemy/orm/__pycache__/collections.cpython-34.pyc
../sqlalchemy/orm/__pycache__/dependency.cpython-34.pyc
../sqlalchemy/orm/__pycache__/deprecated_interfaces.cpython-34.pyc
../sqlalchemy/orm/__pycache__/descriptor_props.cpython-34.pyc
../sqlalchemy/orm/__pycache__/dynamic.cpython-34.pyc
../sqlalchemy/orm/__pycache__/evaluator.cpython-34.pyc
../sqlalchemy/orm/__pycache__/events.cpython-34.pyc
../sqlalchemy/orm/__pycache__/exc.cpython-34.pyc
../sqlalchemy/orm/__pycache__/identity.cpython-34.pyc
../sqlalchemy/orm/__pycache__/instrumentation.cpython-34.pyc
../sqlalchemy/orm/__pycache__/interfaces.cpython-34.pyc
../sqlalchemy/orm/__pycache__/loading.cpython-34.pyc
../sqlalchemy/orm/__pycache__/mapper.cpython-34.pyc
../sqlalchemy/orm/__pycache__/path_registry.cpython-34.pyc
../sqlalchemy/orm/__pycache__/persistence.cpython-34.pyc
../sqlalchemy/orm/__pycache__/properties.cpython-34.pyc
../sqlalchemy/orm/__pycache__/query.cpython-34.pyc
../sqlalchemy/orm/__pycache__/relationships.cpython-34.pyc
../sqlalchemy/orm/__pycache__/scoping.cpython-34.pyc
../sqlalchemy/orm/__pycache__/session.cpython-34.pyc
../sqlalchemy/orm/__pycache__/state.cpython-34.pyc
../sqlalchemy/orm/__pycache__/strategies.cpython-34.pyc
../sqlalchemy/orm/__pycache__/strategy_options.cpython-34.pyc
../sqlalchemy/orm/__pycache__/sync.cpython-34.pyc
../sqlalchemy/orm/__pycache__/unitofwork.cpython-34.pyc
../sqlalchemy/orm/__pycache__/util.cpython-34.pyc
../sqlalchemy/sql/__pycache__/__init__.cpython-34.pyc
../sqlalchemy/sql/__pycache__/annotation.cpython-34.pyc
../sqlalchemy/sql/__pycache__/base.cpython-34.pyc
../sqlalchemy/sql/__pycache__/compiler.cpython-34.pyc
../sqlalchemy/sql/__pycache__/ddl.cpython-34.pyc
../sqlalchemy/sql/__pycache__/default_comparator.cpython-34.pyc
../sqlalchemy/sql/__pycache__/dml.cpython-34.pyc
../sqlalchemy/sql/__pycache__/elements.cpython-34.pyc
../sqlalchemy/sql/__pycache__/expression.cpython-34.pyc
../sqlalchemy/sql/__pycache__/functions.cpython-34.pyc
../sqlalchemy/sql/__pycache__/naming.cpython-34.pyc
../sqlalchemy/sql/__pycache__/operators.cpython-34.pyc
../sqlalchemy/sql/__pycache__/schema.cpython-34.pyc
../sqlalchemy/sql/__pycache__/selectable.cpython-34.pyc
../sqlalchemy/sql/__pycache__/sqltypes.cpython-34.pyc
../sqlalchemy/sql/__pycache__/type_api.cpython-34.pyc
../sqlalchemy/sql/__pycache__/util.cpython-34.pyc
../sqlalchemy/sql/__pycache__/visitors.cpython-34.pyc
../sqlalchemy/testing/__pycache__/__init__.cpython-34.pyc
../sqlalchemy/testing/__pycache__/assertions.cpython-34.pyc
../sqlalchemy/testing/__pycache__/assertsql.cpython-34.pyc
../sqlalchemy/testing/__pycache__/config.cpython-34.pyc
../sqlalchemy/testing/__pycache__/distutils_run.cpython-34.pyc
../sqlalchemy/testing/__pycache__/engines.cpython-34.pyc
../sqlalchemy/testing/__pycache__/entities.cpython-34.pyc
../sqlalchemy/testing/__pycache__/exclusions.cpython-34.pyc
../sqlalchemy/testing/__pycache__/fixtures.cpython-34.pyc
../sqlalchemy/testing/__pycache__/mock.cpython-34.pyc
../sqlalchemy/testing/__pycache__/pickleable.cpython-34.pyc
../sqlalchemy/testing/__pycache__/profiling.cpython-34.pyc
../sqlalchemy/testing/__pycache__/requirements.cpython-34.pyc
../sqlalchemy/testing/__pycache__/runner.cpython-34.pyc
../sqlalchemy/testing/__pycache__/schema.cpython-34.pyc
../sqlalchemy/testing/__pycache__/util.cpython-34.pyc
../sqlalchemy/testing/__pycache__/warnings.cpython-34.pyc
../sqlalchemy/testing/plugin/__pycache__/__init__.cpython-34.pyc
../sqlalchemy/testing/plugin/__pycache__/noseplugin.cpython-34.pyc
../sqlalchemy/testing/plugin/__pycache__/plugin_base.cpython-34.pyc
../sqlalchemy/testing/plugin/__pycache__/pytestplugin.cpython-34.pyc
../sqlalchemy/testing/suite/__pycache__/__init__.cpython-34.pyc
../sqlalchemy/testing/suite/__pycache__/test_ddl.cpython-34.pyc
../sqlalchemy/testing/suite/__pycache__/test_insert.cpython-34.pyc
../sqlalchemy/testing/suite/__pycache__/test_reflection.cpython-34.pyc
../sqlalchemy/testing/suite/__pycache__/test_results.cpython-34.pyc
../sqlalchemy/testing/suite/__pycache__/test_select.cpython-34.pyc
../sqlalchemy/testing/suite/__pycache__/test_sequence.cpython-34.pyc
../sqlalchemy/testing/suite/__pycache__/test_types.cpython-34.pyc
../sqlalchemy/testing/suite/__pycache__/test_update_delete.cpython-34.pyc
../sqlalchemy/util/__pycache__/__init__.cpython-34.pyc
../sqlalchemy/util/__pycache__/_collections.cpython-34.pyc
../sqlalchemy/util/__pycache__/compat.cpython-34.pyc
../sqlalchemy/util/__pycache__/deprecations.cpython-34.pyc
../sqlalchemy/util/__pycache__/langhelpers.cpython-34.pyc
../sqlalchemy/util/__pycache__/queue.cpython-34.pyc
../sqlalchemy/util/__pycache__/topological.cpython-34.pyc
../sqlalchemy/cprocessors.cpython-34m.so
../sqlalchemy/cresultproxy.cpython-34m.so
../sqlalchemy/cutils.cpython-34m.so
./
dependency_links.txt
PKG-INFO
SOURCES.txt
top_level.txt
@ -0,0 +1 @@
sqlalchemy
|
16
lib/python3.4/site-packages/_markerlib/__init__.py
Normal file
@ -0,0 +1,16 @@
try:
    import ast
    from _markerlib.markers import default_environment, compile, interpret
except ImportError:
    if 'ast' in globals():
        raise
    def default_environment():
        return {}
    def compile(marker):
        def marker_fn(environment=None, override=None):
            # 'empty markers are True' heuristic won't install extra deps.
            return not marker.strip()
        marker_fn.__doc__ = marker
        return marker_fn
    def interpret(marker, environment=None, override=None):
        return compile(marker)()
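The `ImportError` branch above degrades gracefully on interpreters without the `ast` module: only an empty marker is treated as satisfied, so extras guarded by a real marker are simply never installed. A minimal standalone sketch of that heuristic (the name `fallback_compile` is illustrative, not part of `_markerlib`):

```python
def fallback_compile(marker):
    # Mirrors the fallback branch: without ast-based evaluation,
    # an empty marker is True and any non-empty marker is False,
    # erring on the side of not installing conditional extras.
    def marker_fn(environment=None, override=None):
        return not marker.strip()
    marker_fn.__doc__ = marker
    return marker_fn

print(fallback_compile("")())                         # True
print(fallback_compile("sys.platform == 'win32'")())  # False
```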
119
lib/python3.4/site-packages/_markerlib/markers.py
Normal file
@ -0,0 +1,119 @@
|
||||||
|
# -*- coding: utf-8 -*-
|
||||||
|
"""Interpret PEP 345 environment markers.
|
||||||
|
|
||||||
|
EXPR [in|==|!=|not in] EXPR [or|and] ...
|
||||||
|
|
||||||
|
where EXPR belongs to any of those:
|
||||||
|
|
||||||
|
python_version = '%s.%s' % (sys.version_info[0], sys.version_info[1])
|
||||||
|
python_full_version = sys.version.split()[0]
|
||||||
|
os.name = os.name
|
||||||
|
sys.platform = sys.platform
|
||||||
|
platform.version = platform.version()
|
||||||
|
platform.machine = platform.machine()
|
||||||
|
platform.python_implementation = platform.python_implementation()
|
||||||
|
a free string, like '2.6', or 'win32'
|
||||||
|
"""
|
||||||
|
|
||||||
|
__all__ = ['default_environment', 'compile', 'interpret']
|
||||||
|
|
||||||
|
import ast
|
||||||
|
import os
|
||||||
|
import platform
|
||||||
|
import sys
|
||||||
|
import weakref
|
||||||
|
|
||||||
|
_builtin_compile = compile
|
||||||
|
|
||||||
|
try:
|
||||||
|
from platform import python_implementation
|
||||||
|
except ImportError:
|
||||||
|
if os.name == "java":
|
||||||
|
# Jython 2.5 has ast module, but not platform.python_implementation() function.
|
||||||
|
def python_implementation():
|
||||||
|
return "Jython"
|
||||||
|
else:
|
||||||
|
raise
|
||||||
|
|
||||||
|
|
||||||
|
# restricted set of variables
|
||||||
|
_VARS = {'sys.platform': sys.platform,
|
||||||
|
'python_version': '%s.%s' % sys.version_info[:2],
|
||||||
|
# FIXME parsing sys.platform is not reliable, but there is no other
|
||||||
|
# way to get e.g. 2.7.2+, and the PEP is defined with sys.version
|
||||||
|
'python_full_version': sys.version.split(' ', 1)[0],
|
||||||
|
'os.name': os.name,
|
||||||
|
'platform.version': platform.version(),
|
||||||
|
'platform.machine': platform.machine(),
|
||||||
|
'platform.python_implementation': python_implementation(),
|
||||||
|
'extra': None # wheel extension
|
||||||
|
}
|
||||||
|
|
||||||
|
for var in list(_VARS.keys()):
|
||||||
|
if '.' in var:
|
||||||
|
    _VARS[var.replace('.', '_')] = _VARS[var]

def default_environment():
    """Return copy of default PEP 345 globals dictionary."""
    return dict(_VARS)

class ASTWhitelist(ast.NodeTransformer):
    def __init__(self, statement):
        self.statement = statement # for error messages

    ALLOWED = (ast.Compare, ast.BoolOp, ast.Attribute, ast.Name, ast.Load, ast.Str)
    # Bool operations
    ALLOWED += (ast.And, ast.Or)
    # Comparison operations
    ALLOWED += (ast.Eq, ast.Gt, ast.GtE, ast.In, ast.Is, ast.IsNot, ast.Lt, ast.LtE, ast.NotEq, ast.NotIn)

    def visit(self, node):
        """Ensure statement only contains allowed nodes."""
        if not isinstance(node, self.ALLOWED):
            raise SyntaxError('Not allowed in environment markers.\n%s\n%s' %
                              (self.statement,
                               (' ' * node.col_offset) + '^'))
        return ast.NodeTransformer.visit(self, node)

    def visit_Attribute(self, node):
        """Flatten one level of attribute access."""
        new_node = ast.Name("%s.%s" % (node.value.id, node.attr), node.ctx)
        return ast.copy_location(new_node, node)

def parse_marker(marker):
    tree = ast.parse(marker, mode='eval')
    new_tree = ASTWhitelist(marker).generic_visit(tree)
    return new_tree

def compile_marker(parsed_marker):
    return _builtin_compile(parsed_marker, '<environment marker>', 'eval',
                            dont_inherit=True)

_cache = weakref.WeakValueDictionary()

def compile(marker):
    """Return compiled marker as a function accepting an environment dict."""
    try:
        return _cache[marker]
    except KeyError:
        pass
    if not marker.strip():
        def marker_fn(environment=None, override=None):
            """"""
            return True
    else:
        compiled_marker = compile_marker(parse_marker(marker))
        def marker_fn(environment=None, override=None):
            """override updates environment"""
            if override is None:
                override = {}
            if environment is None:
                environment = default_environment()
            environment.update(override)
            return eval(compiled_marker, environment)
    marker_fn.__doc__ = marker
    _cache[marker] = marker_fn
    return _cache[marker]

def interpret(marker, environment=None):
    return compile(marker)(environment)
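The whitelist-then-eval technique above can be sketched independently of the vendored module. This is a minimal stand-alone illustration (not the `_markerlib` code itself; it uses `ast.Constant`, so Python 3.8+, and the helper name `safe_eval` is ours):

```python
import ast

# Only the node types needed for boolean/comparison expressions over
# string variables are allowed; anything else (Call, Import, etc.) is rejected.
ALLOWED = (ast.Expression, ast.Compare, ast.BoolOp, ast.Name, ast.Load,
           ast.Constant, ast.And, ast.Or, ast.Eq, ast.NotEq, ast.In, ast.NotIn)

def safe_eval(expr, env):
    """Parse expr in 'eval' mode, reject disallowed nodes, then evaluate it."""
    tree = ast.parse(expr, mode='eval')
    for node in ast.walk(tree):
        # operator nodes inside Compare/BoolOp are AST nodes too and get checked
        if not isinstance(node, ALLOWED):
            raise SyntaxError('not allowed in marker: %r' % node)
    code = compile(tree, '<marker>', 'eval')
    # empty __builtins__ so names resolve only against the marker environment
    return eval(code, {'__builtins__': {}}, dict(env))
```

Evaluating `safe_eval("python_version == '3.4'", {"python_version": "3.4"})` succeeds, while an expression containing a function call raises `SyntaxError` before it is ever compiled.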
5
lib/python3.4/site-packages/easy_install.py
Normal file
@@ -0,0 +1,5 @@
"""Run the EasyInstall command"""

if __name__ == '__main__':
    from setuptools.command.easy_install import main
    main()
28
lib/python3.4/site-packages/ed25519-1.4.egg-info/PKG-INFO
Normal file
@@ -0,0 +1,28 @@
Metadata-Version: 1.1
Name: ed25519
Version: 1.4
Summary: Ed25519 public-key signatures
Home-page: https://github.com/warner/python-ed25519
Author: Brian Warner
Author-email: warner-python-ed25519@lothar.com
License: MIT
Description: Python bindings to the Ed25519 public-key signature system.

        This offers a comfortable python interface to a C implementation of the
        Ed25519 public-key signature system (http://ed25519.cr.yp.to/), using the
        portable 'ref' code from the 'SUPERCOP' benchmarking suite.

        This system provides high (128-bit) security, short (32-byte) keys, short
        (64-byte) signatures, and fast (2-6ms) operation. Please see the README for
        more details.
Platform: UNKNOWN
Classifier: Development Status :: 5 - Production/Stable
Classifier: Intended Audience :: Developers
Classifier: License :: OSI Approved :: MIT License
Classifier: Programming Language :: Python
Classifier: Programming Language :: Python :: 2.6
Classifier: Programming Language :: Python :: 2.7
Classifier: Programming Language :: Python :: 3.3
Classifier: Programming Language :: Python :: 3.4
Classifier: Topic :: Security :: Cryptography
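PKG-INFO files like the one above use RFC 822 style `Key: Value` headers, so the stdlib email parser can read them. A small sketch (the `pkg_info` text here is a trimmed stand-in, not the full file):

```python
from email.parser import Parser

# Hypothetical minimal PKG-INFO body in the same header format as above.
pkg_info = """\
Metadata-Version: 1.1
Name: ed25519
Version: 1.4
Summary: Ed25519 public-key signatures
"""

# Parser().parsestr returns a Message; headers are accessed like a mapping.
meta = Parser().parsestr(pkg_info)
name, version = meta["Name"], meta["Version"]
```

Repeated headers such as `Classifier:` can be collected with `meta.get_all("Classifier")`.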
39
lib/python3.4/site-packages/ed25519-1.4.egg-info/SOURCES.txt
Normal file
@@ -0,0 +1,39 @@
LICENSE
MANIFEST.in
Makefile
NEWS
README.md
kat-ed25519.txt
kat.py
setup.cfg
test_ed25519_kat.py
versioneer.py
bin/edsig
ed25519.egg-info/PKG-INFO
ed25519.egg-info/SOURCES.txt
ed25519.egg-info/dependency_links.txt
ed25519.egg-info/top_level.txt
src/ed25519/__init__.py
src/ed25519/_version.py
src/ed25519/keys.py
src/ed25519/test_ed25519.py
src/ed25519-glue/ed25519module.c
src/ed25519-supercop-ref/Makefile
src/ed25519-supercop-ref/api.h
src/ed25519-supercop-ref/crypto_int32.h
src/ed25519-supercop-ref/crypto_sign.h
src/ed25519-supercop-ref/crypto_uint32.h
src/ed25519-supercop-ref/crypto_verify_32.h
src/ed25519-supercop-ref/ed25519.c
src/ed25519-supercop-ref/fe25519.c
src/ed25519-supercop-ref/fe25519.h
src/ed25519-supercop-ref/ge25519.c
src/ed25519-supercop-ref/ge25519.h
src/ed25519-supercop-ref/ge25519_base.data
src/ed25519-supercop-ref/sc25519.c
src/ed25519-supercop-ref/sc25519.h
src/ed25519-supercop-ref/sha512-blocks.c
src/ed25519-supercop-ref/sha512-hash.c
src/ed25519-supercop-ref/sha512.h
src/ed25519-supercop-ref/test.c
src/ed25519-supercop-ref/verify.c
@@ -0,0 +1 @@

@@ -0,0 +1,15 @@
../ed25519/__init__.py
../ed25519/_version.py
../ed25519/keys.py
../ed25519/test_ed25519.py
../ed25519/__pycache__/__init__.cpython-34.pyc
../ed25519/__pycache__/_version.cpython-34.pyc
../ed25519/__pycache__/keys.cpython-34.pyc
../ed25519/__pycache__/test_ed25519.cpython-34.pyc
../ed25519/_ed25519.cpython-34m.so
./
PKG-INFO
dependency_links.txt
top_level.txt
SOURCES.txt
../../../../bin/edsig
@@ -0,0 +1 @@
ed25519
11
lib/python3.4/site-packages/ed25519/__init__.py
Normal file
@@ -0,0 +1,11 @@
from .keys import (BadSignatureError, BadPrefixError,
                   create_keypair, SigningKey, VerifyingKey,
                   remove_prefix, to_ascii, from_ascii)

(BadSignatureError, BadPrefixError,
 create_keypair, SigningKey, VerifyingKey,
 remove_prefix, to_ascii, from_ascii) # hush pyflakes

from ._version import get_versions
__version__ = str(get_versions()['version'])
del get_versions
BIN
lib/python3.4/site-packages/ed25519/_ed25519.cpython-34m.so
Executable file
Binary file not shown.
BIN
lib/python3.4/site-packages/ed25519/_ed25519.cpython-35m-x86_64-linux-gnu.so
Executable file
Binary file not shown.
21
lib/python3.4/site-packages/ed25519/_version.py
Normal file
@@ -0,0 +1,21 @@

# This file was generated by 'versioneer.py' (0.15) from
# revision-control system data, or from the parent directory name of an
# unpacked source archive. Distribution tarballs contain a pre-generated copy
# of this file.

import json
import sys

version_json = '''
{
 "dirty": false,
 "error": null,
 "full-revisionid": "a8732e8b6ba4e04e83c7ef05f86c565a2b2fc278",
 "version": "1.4"
}
'''  # END VERSION_JSON


def get_versions():
    return json.loads(version_json)
191
lib/python3.4/site-packages/ed25519/keys.py
Normal file
@@ -0,0 +1,191 @@
import os
import base64
from . import _ed25519
BadSignatureError = _ed25519.BadSignatureError

def create_keypair(entropy=os.urandom):
    SEEDLEN = int(_ed25519.SECRETKEYBYTES/2)
    assert SEEDLEN == 32
    seed = entropy(SEEDLEN)
    sk = SigningKey(seed)
    vk = sk.get_verifying_key()
    return sk, vk

class BadPrefixError(Exception):
    pass

def remove_prefix(s_bytes, prefix):
    assert(type(s_bytes) == type(prefix))
    if s_bytes[:len(prefix)] != prefix:
        raise BadPrefixError("did not see expected '%s' prefix" % (prefix,))
    return s_bytes[len(prefix):]

def to_ascii(s_bytes, prefix="", encoding="base64"):
    """Return a version-prefixed ASCII representation of the given binary
    string. 'encoding' indicates how to do the encoding, and can be one of:
    * base64
    * base32
    * base16 (or hex)

    This function handles bytes, not bits, so it does not append any trailing
    '=' (unlike standard base64.b64encode). It also lowercases the base32
    output.

    'prefix' will be prepended to the encoded form, and is useful for
    distinguishing the purpose and version of the binary string. E.g. you
    could prepend 'pub0-' to a VerifyingKey string to allow the receiving
    code to raise a useful error if someone pasted in a signature string by
    mistake.
    """
    assert isinstance(s_bytes, bytes)
    if not isinstance(prefix, bytes):
        prefix = prefix.encode('ascii')
    if encoding == "base64":
        s_ascii = base64.b64encode(s_bytes).decode('ascii').rstrip("=")
    elif encoding == "base32":
        s_ascii = base64.b32encode(s_bytes).decode('ascii').rstrip("=").lower()
    elif encoding in ("base16", "hex"):
        s_ascii = base64.b16encode(s_bytes).decode('ascii').lower()
    else:
        raise NotImplementedError
    return prefix+s_ascii.encode('ascii')

def from_ascii(s_ascii, prefix="", encoding="base64"):
    """This is the opposite of to_ascii. It will throw BadPrefixError if
    the prefix is not found.
    """
    if isinstance(s_ascii, bytes):
        s_ascii = s_ascii.decode('ascii')
    if isinstance(prefix, bytes):
        prefix = prefix.decode('ascii')
    s_ascii = remove_prefix(s_ascii.strip(), prefix)
    if encoding == "base64":
        s_ascii += "=" * ((4 - len(s_ascii) % 4) % 4)
        s_bytes = base64.b64decode(s_ascii.encode('ascii'))
    elif encoding == "base32":
        s_ascii += "=" * ((8 - len(s_ascii) % 8) % 8)
        s_bytes = base64.b32decode(s_ascii.upper().encode('ascii'))
    elif encoding in ("base16", "hex"):
        s_bytes = base64.b16decode(s_ascii.upper().encode('ascii'))
    else:
        raise NotImplementedError
    return s_bytes

class SigningKey(object):
    # this can only be used to reconstruct a key created by create_keypair().
    def __init__(self, sk_s, prefix="", encoding=None):
        assert isinstance(sk_s, bytes)
        if not isinstance(prefix, bytes):
            prefix = prefix.encode('ascii')
        sk_s = remove_prefix(sk_s, prefix)
        if encoding is not None:
            sk_s = from_ascii(sk_s, encoding=encoding)
        if len(sk_s) == 32:
            # create from seed
            vk_s, sk_s = _ed25519.publickey(sk_s)
        else:
            if len(sk_s) != 32+32:
                raise ValueError("SigningKey takes 32-byte seed or 64-byte string")
        self.sk_s = sk_s # seed+pubkey
        self.vk_s = sk_s[32:] # just pubkey

    def to_bytes(self, prefix=""):
        if not isinstance(prefix, bytes):
            prefix = prefix.encode('ascii')
        return prefix+self.sk_s

    def to_ascii(self, prefix="", encoding=None):
        assert encoding
        if not isinstance(prefix, bytes):
            prefix = prefix.encode('ascii')
        return to_ascii(self.to_seed(), prefix, encoding)

    def to_seed(self, prefix=""):
        if not isinstance(prefix, bytes):
            prefix = prefix.encode('ascii')
        return prefix+self.sk_s[:32]

    def __eq__(self, them):
        if not isinstance(them, object): return False
        return (them.__class__ == self.__class__
                and them.sk_s == self.sk_s)

    def get_verifying_key(self):
        return VerifyingKey(self.vk_s)

    def sign(self, msg, prefix="", encoding=None):
        assert isinstance(msg, bytes)
        if not isinstance(prefix, bytes):
            prefix = prefix.encode('ascii')
        sig_and_msg = _ed25519.sign(msg, self.sk_s)
        # the response is R+S+msg
        sig_R = sig_and_msg[0:32]
        sig_S = sig_and_msg[32:64]
        msg_out = sig_and_msg[64:]
        sig_out = sig_R + sig_S
        assert msg_out == msg
        if encoding:
            return to_ascii(sig_out, prefix, encoding)
        return prefix+sig_out

class VerifyingKey(object):
    def __init__(self, vk_s, prefix="", encoding=None):
        if not isinstance(prefix, bytes):
            prefix = prefix.encode('ascii')
        if not isinstance(vk_s, bytes):
            vk_s = vk_s.encode('ascii')
        assert isinstance(vk_s, bytes)
        vk_s = remove_prefix(vk_s, prefix)
        if encoding is not None:
            vk_s = from_ascii(vk_s, encoding=encoding)

        assert len(vk_s) == 32
        self.vk_s = vk_s

    def to_bytes(self, prefix=""):
        if not isinstance(prefix, bytes):
            prefix = prefix.encode('ascii')
        return prefix+self.vk_s

    def to_ascii(self, prefix="", encoding=None):
        assert encoding
        if not isinstance(prefix, bytes):
            prefix = prefix.encode('ascii')
        return to_ascii(self.vk_s, prefix, encoding)

    def __eq__(self, them):
        if not isinstance(them, object): return False
        return (them.__class__ == self.__class__
                and them.vk_s == self.vk_s)

    def verify(self, sig, msg, prefix="", encoding=None):
        if not isinstance(sig, bytes):
            sig = sig.encode('ascii')
        if not isinstance(prefix, bytes):
            prefix = prefix.encode('ascii')
        assert isinstance(sig, bytes)
        assert isinstance(msg, bytes)
        if encoding:
            sig = from_ascii(sig, prefix, encoding)
        else:
            sig = remove_prefix(sig, prefix)
        assert len(sig) == 64
        sig_R = sig[:32]
        sig_S = sig[32:]
        sig_and_msg = sig_R + sig_S + msg
        # this might raise BadSignatureError
        msg2 = _ed25519.open(sig_and_msg, self.vk_s)
        assert msg2 == msg

def selftest():
    message = b"crypto libraries should always test themselves at powerup"
    sk = SigningKey(b"priv0-VIsfn5OFGa09Un2MR6Hm7BQ5++xhcQskU2OGXG8jSJl4cWLZrRrVcSN2gVYMGtZT+3354J5jfmqAcuRSD9KIyg",
                    prefix="priv0-", encoding="base64")
    vk = VerifyingKey(b"pub0-eHFi2a0a1XEjdoFWDBrWU/t9+eCeY35qgHLkUg/SiMo",
                      prefix="pub0-", encoding="base64")
    assert sk.get_verifying_key() == vk
    sig = sk.sign(message, prefix="sig0-", encoding="base64")
    assert sig == b"sig0-E/QrwtSF52x8+q0l4ahA7eJbRKc777ClKNg217Q0z4fiYMCdmAOI+rTLVkiFhX6k3D+wQQfKdJYMxaTUFfv1DQ", sig
    vk.verify(sig, message, prefix="sig0-", encoding="base64")

selftest()
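The `to_ascii`/`from_ascii` convention in keys.py (an ASCII prefix plus unpadded base64) needs nothing from the C extension, so its round-trip behavior can be sketched with the stdlib alone. This is an illustrative reimplementation, not the vendored code; the names `encode`/`decode` and the `pub0-` prefix are ours:

```python
import base64

def encode(data, prefix=b"pub0-"):
    # base64 with trailing '=' padding stripped, matching the keys.py style
    return prefix + base64.b64encode(data).rstrip(b"=")

def decode(s, prefix=b"pub0-"):
    if not s.startswith(prefix):
        raise ValueError("bad prefix")
    body = s[len(prefix):]
    # restore the stripped padding: base64 input length must be a multiple of 4
    body += b"=" * ((4 - len(body) % 4) % 4)
    return base64.b64decode(body)
```

The `(4 - len(body) % 4) % 4` expression is the same padding arithmetic `from_ascii` uses; the outer `% 4` keeps already-aligned input from gaining four spurious `=` characters.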
274
lib/python3.4/site-packages/ed25519/test_ed25519.py
Normal file
@@ -0,0 +1,274 @@
from __future__ import print_function
import sys
import unittest
import time
from binascii import hexlify, unhexlify
import ed25519
from ed25519 import _ed25519 as raw

if sys.version_info[0] == 3:
    def int2byte(i):
        return bytes((i,))
else:
    int2byte = chr

def flip_bit(s, bit=0, in_byte=-1):
    as_bytes = [ord(b) if isinstance(b, str) else b for b in s]
    as_bytes[in_byte] = as_bytes[in_byte] ^ (0x01<<bit)
    return b"".join([int2byte(b) for b in as_bytes])

# the pure-python demonstration code (on my 2010 MacBookPro) takes 5s to
# generate a public key, 9s to sign, 14s to verify

# the SUPERCOP-ref version we use takes 2ms for keygen, 2ms to sign, and 7ms
# to verify

class Basic(unittest.TestCase):
    timer = None
    def log(self, msg):
        return
        now = time.time()
        if self.timer is None:
            self.timer = now
        else:
            elapsed = now - self.timer
            self.timer = now
            print(" (%f elapsed)" % elapsed)
        print(msg)

    def test_version(self):
        # just make sure it can be retrieved
        ver = ed25519.__version__
        self.failUnless(isinstance(ver, type("")))

    def test_constants(self):
        # the secret key we get from raw.keypair() are 64 bytes long, and
        # are mostly the output of a sha512 call. The first 32 bytes are the
        # private exponent (random, with a few bits stomped).
        self.failUnlessEqual(raw.SECRETKEYBYTES, 64)
        # the public key is the encoded public point
        self.failUnlessEqual(raw.PUBLICKEYBYTES, 32)
        self.failUnlessEqual(raw.SIGNATUREKEYBYTES, 64)

    def test_raw(self):
        sk_s = b"\x00" * 32 # usually urandom(32)
        vk_s, skvk_s = raw.publickey(sk_s)
        self.failUnlessEqual(len(vk_s), 32)
        exp_vks = unhexlify(b"3b6a27bcceb6a42d62a3a8d02a6f0d73"
                            b"653215771de243a63ac048a18b59da29")
        self.failUnlessEqual(vk_s, exp_vks)
        self.failUnlessEqual(skvk_s[:32], sk_s)
        self.failUnlessEqual(skvk_s[32:], vk_s)
        msg = b"hello world"
        msg_and_sig = raw.sign(msg, skvk_s)
        sig = msg_and_sig[:-len(msg)]
        self.failUnlessEqual(len(sig), 64)
        exp_sig = unhexlify(b"b0b47780f096ae60bfff8d8e7b19c36b"
                            b"321ae6e69cca972f2ff987ef30f20d29"
                            b"774b53bae404485c4391ddf1b3f37aaa"
                            b"8a9747f984eb0884e8aa533386e73305")
        self.failUnlessEqual(sig, exp_sig)
        ret = raw.open(sig+msg, vk_s) # don't raise exception
        self.failUnlessEqual(ret, msg)
        self.failUnlessRaises(raw.BadSignatureError,
                              raw.open,
                              sig+msg+b".. NOT!", vk_s)
        self.failUnlessRaises(raw.BadSignatureError,
                              raw.open,
                              sig+flip_bit(msg), vk_s)
        self.failUnlessRaises(raw.BadSignatureError,
                              raw.open,
                              sig+msg, flip_bit(vk_s))
        self.failUnlessRaises(raw.BadSignatureError,
                              raw.open,
                              sig+msg, flip_bit(vk_s, in_byte=2))
        self.failUnlessRaises(raw.BadSignatureError,
                              raw.open,
                              flip_bit(sig)+msg, vk_s)
        self.failUnlessRaises(raw.BadSignatureError,
                              raw.open,
                              flip_bit(sig, in_byte=33)+msg, vk_s)

    def test_keypair(self):
        sk, vk = ed25519.create_keypair()
        self.failUnless(isinstance(sk, ed25519.SigningKey), sk)
        self.failUnless(isinstance(vk, ed25519.VerifyingKey), vk)
        sk2, vk2 = ed25519.create_keypair()
        self.failIfEqual(hexlify(sk.to_bytes()), hexlify(sk2.to_bytes()))

        # you can control the entropy source
        def not_so_random(length):
            return b"4"*length
        sk1, vk1 = ed25519.create_keypair(entropy=not_so_random)
        self.failUnlessEqual(sk1.to_ascii(encoding="base64"),
                             b"NDQ0NDQ0NDQ0NDQ0NDQ0NDQ0NDQ0NDQ0NDQ0NDQ0NDQ")
        self.failUnlessEqual(vk1.to_ascii(encoding="base64"),
                             b"6yzxO/euOl9hQWih+wknLTl3HsS4UjcngV5GbK+O4WM")
        sk2, vk2 = ed25519.create_keypair(entropy=not_so_random)
        self.failUnlessEqual(sk1.to_ascii(encoding="base64"),
                             sk2.to_ascii(encoding="base64"))
        self.failUnlessEqual(vk1.to_ascii(encoding="base64"),
                             vk2.to_ascii(encoding="base64"))


    def test_publickey(self):
        seed = unhexlify(b"4ba96b0b5303328c7405220598a587c4"
                         b"acb06ed9a9601d149f85400195f1ec3d")
        sk = ed25519.SigningKey(seed)
        self.failUnlessEqual(hexlify(sk.to_bytes()),
                             (b"4ba96b0b5303328c7405220598a587c4"
                              b"acb06ed9a9601d149f85400195f1ec3d"
                              b"a66d161e090652b054740748f059f92a"
                              b"5b731f1c27b05571f6d942e4f8b7b264"))
        self.failUnlessEqual(hexlify(sk.to_seed()),
                             (b"4ba96b0b5303328c7405220598a587c4"
                              b"acb06ed9a9601d149f85400195f1ec3d"))
        self.failUnlessRaises(ValueError,
                              ed25519.SigningKey, b"wrong length")
        sk2 = ed25519.SigningKey(seed)
        self.failUnlessEqual(sk, sk2)

    def test_OOP(self):
        sk_s = unhexlify(b"4ba96b0b5303328c7405220598a587c4"
                         b"acb06ed9a9601d149f85400195f1ec3d"
                         b"a66d161e090652b054740748f059f92a"
                         b"5b731f1c27b05571f6d942e4f8b7b264")
        sk = ed25519.SigningKey(sk_s)
        self.failUnlessEqual(len(sk.to_bytes()), 64)
        self.failUnlessEqual(sk.to_bytes(), sk_s)

        sk2_seed = unhexlify(b"4ba96b0b5303328c7405220598a587c4"
                             b"acb06ed9a9601d149f85400195f1ec3d")
        sk2 = ed25519.SigningKey(sk2_seed)
        self.failUnlessEqual(sk2.to_bytes(), sk.to_bytes())

        vk = sk.get_verifying_key()
        self.failUnlessEqual(len(vk.to_bytes()), 32)
        exp_vks = unhexlify(b"a66d161e090652b054740748f059f92a"
                            b"5b731f1c27b05571f6d942e4f8b7b264")
        self.failUnlessEqual(vk.to_bytes(), exp_vks)
        self.failUnlessEqual(ed25519.VerifyingKey(vk.to_bytes()), vk)
        msg = b"hello world"
        sig = sk.sign(msg)
        self.failUnlessEqual(len(sig), 64)
        exp_sig = unhexlify(b"6eaffe94f2972b35158b6aaa9b69c1da"
                            b"97f0896aca29c41b1dd7b32e6c9e2ff6"
                            b"76fc8d8b034709cdcc37d8aeb86bebfb"
                            b"173ace3c319e211ea1d7e8d8884c1808")
        self.failUnlessEqual(sig, exp_sig)
        self.failUnlessEqual(vk.verify(sig, msg), None) # also, don't throw
        self.failUnlessRaises(ed25519.BadSignatureError,
                              vk.verify, sig, msg+b".. NOT!")

    def test_object_identity(self):
        sk1_s = unhexlify(b"ef32972ae3f1252a5aa1395347ea008c"
                          b"bd2fed0773a4ea45e2d2d06c8cf8fbd4"
                          b"c024601a9c5b854fb100ff3116cf4f22"
                          b"a311565f027391cb49d3bbe11c44399d")
        sk2_s = unhexlify(b"3d550c158900b4c2922b6656d2f80572"
                          b"89de4ee65043745179685ae7d29b944d"
                          b"672b8a2cb23f9e75e1d46ce249cd9c04"
                          b"68f816f1c734a102822b60e18b41eacd")
        sk1a = ed25519.SigningKey(sk1_s)
        sk1b = ed25519.SigningKey(sk1_s)
        vk1a = sk1a.get_verifying_key()
        vk1b = sk1b.get_verifying_key()
        sk2 = ed25519.SigningKey(sk2_s)
        vk2 = sk2.get_verifying_key()
        self.failUnlessEqual(sk1a, sk1b)
        self.failIfEqual(sk1a, sk2)
        self.failUnlessEqual(vk1a, vk1b)
        self.failIfEqual(vk1a, vk2)

        self.failIfEqual(sk2, b"not a SigningKey")
        self.failIfEqual(vk2, b"not a VerifyingKey")

    def test_prefix(self):
        sk1,vk1 = ed25519.create_keypair()
        PREFIX = b"private0-"
        p = sk1.to_bytes(PREFIX)
        # that gives us a binary string with a prefix
        self.failUnless(p[:len(PREFIX)] == PREFIX, repr(p))
        sk2 = ed25519.SigningKey(p, prefix=PREFIX)
        self.failUnlessEqual(sk1, sk2)
        self.failUnlessEqual(repr(sk1.to_bytes()), repr(sk2.to_bytes()))
        self.failUnlessRaises(ed25519.BadPrefixError,
                              ed25519.SigningKey, p, prefix=b"WRONG-")
        # SigningKey.to_seed() can do a prefix too
        p = sk1.to_seed(PREFIX)
        self.failUnless(p[:len(PREFIX)] == PREFIX, repr(p))
        sk3 = ed25519.SigningKey(p, prefix=PREFIX)
        self.failUnlessEqual(sk1, sk3)
        self.failUnlessEqual(repr(sk1.to_bytes()), repr(sk3.to_bytes()))
        self.failUnlessRaises(ed25519.BadPrefixError,
                              ed25519.SigningKey, p, prefix=b"WRONG-")

        # verifying keys can do this too
        PREFIX = b"public0-"
        p = vk1.to_bytes(PREFIX)
        self.failUnless(p.startswith(PREFIX), repr(p))
        vk2 = ed25519.VerifyingKey(p, prefix=PREFIX)
        self.failUnlessEqual(vk1, vk2)
        self.failUnlessEqual(repr(vk1.to_bytes()), repr(vk2.to_bytes()))
        self.failUnlessRaises(ed25519.BadPrefixError,
                              ed25519.VerifyingKey, p, prefix=b"WRONG-")

        # and signatures
        PREFIX = b"sig0-"
        p = sk1.sign(b"msg", PREFIX)
        self.failUnless(p.startswith(PREFIX), repr(p))
        vk1.verify(p, b"msg", PREFIX)
        self.failUnlessRaises(ed25519.BadPrefixError,
                              vk1.verify, p, b"msg", prefix=b"WRONG-")

    def test_ascii(self):
        b2a = ed25519.to_ascii
        a2b = ed25519.from_ascii
        for prefix in ("", "prefix-"):
            for length in range(0, 100):
                b1 = b"a"*length
                for base in ("base64", "base32", "base16", "hex"):
                    a = b2a(b1, prefix, base)
                    b2 = a2b(a, prefix, base)
                    self.failUnlessEqual(b1, b2)

    def test_encoding(self):
        sk_s = b"\x88" * 32 # usually urandom(32)
        sk1 = ed25519.SigningKey(sk_s)
        vk1 = sk1.get_verifying_key()

        def check1(encoding, expected):
            PREFIX = "private0-"
            p = sk1.to_ascii(PREFIX, encoding)
            self.failUnlessEqual(p, expected)
            sk2 = ed25519.SigningKey(p, prefix=PREFIX, encoding=encoding)
            self.failUnlessEqual(repr(sk1.to_bytes()), repr(sk2.to_bytes()))
            self.failUnlessEqual(sk1, sk2)
        check1("base64", b"private0-iIiIiIiIiIiIiIiIiIiIiIiIiIiIiIiIiIiIiIiIiIg")
        check1("base32", b"private0-rceirceirceirceirceirceirceirceirceirceirceirceircea")
        check1("hex", b"private0-8888888888888888888888888888888888888888888888888888888888888888")

        def check2(encoding, expected):
            PREFIX="public0-"
            p = vk1.to_ascii(PREFIX, encoding)
            self.failUnlessEqual(p, expected)
            vk2 = ed25519.VerifyingKey(p, prefix=PREFIX, encoding=encoding)
            self.failUnlessEqual(repr(vk1.to_bytes()), repr(vk2.to_bytes()))
            self.failUnlessEqual(vk1, vk2)
        check2("base64", b"public0-skkdlQKuKGMKK6yy4MdFEP/N0yjDNP8+E5PnWy0x59w")
        check2("base32", b"public0-wjer3ficvyuggcrlvszobr2fcd743uziym2p6pqtsptvwljr47oa")
        check2("hex", b"public0-b2491d9502ae28630a2bacb2e0c74510ffcdd328c334ff3e1393e75b2d31e7dc")

        def check3(encoding, expected):
            msg = b"msg"
            PREFIX="sig0-"
            sig = sk1.sign(msg, PREFIX, encoding)
            self.failUnlessEqual(sig, expected)
            vk1.verify(sig, msg, PREFIX, encoding)
        check3("base64", b"sig0-MNfdUir6tMlaYQ+/p8KANJ5d+bk8g2al76v5MeJCo6RiywxURda3sU580CyiW2FBG/Q7kDRswgYqxbkQw3o5CQ")
        check3("base32", b"sig0-gdl52urk7k2mswtbb672pquagspf36nzhsbwnjppvp4tdyscuosgfsymkrc5nn5rjz6nalfclnqucg7uhoidi3gcayvmloiqyn5dsci")
        check3("hex", b"sig0-30d7dd522afab4c95a610fbfa7c280349e5df9b93c8366a5efabf931e242a3a462cb0c5445d6b7b14e7cd02ca25b61411bf43b90346cc2062ac5b910c37a3909")


if __name__ == '__main__':
    unittest.main()
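The `flip_bit` helper above drives every negative test: corrupt one bit of the message, key, or signature and verification must fail. A Python 3 only sketch of the same behavior (using `bytearray` instead of the 2/3-compatible list dance) makes its contract easy to check in isolation:

```python
def flip_bit(s, bit=0, in_byte=-1):
    # same logic as the test helper above, restricted to bytes input
    as_bytes = bytearray(s)
    as_bytes[in_byte] ^= (0x01 << bit)  # XOR toggles exactly one bit
    return bytes(as_bytes)
```

Because XOR is its own inverse, flipping the same bit twice returns the original bytes, which is why a single call is guaranteed to produce a value different from its input.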
3124
lib/python3.4/site-packages/pkg_resources/__init__.py
Normal file
File diff suppressed because it is too large
@@ -0,0 +1,31 @@
# Copyright 2014 Donald Stufft
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
from __future__ import absolute_import, division, print_function

__all__ = [
    "__title__", "__summary__", "__uri__", "__version__", "__author__",
    "__email__", "__license__", "__copyright__",
]

__title__ = "packaging"
__summary__ = "Core utilities for Python packages"
__uri__ = "https://github.com/pypa/packaging"

__version__ = "15.3"

__author__ = "Donald Stufft"
__email__ = "donald@stufft.io"

__license__ = "Apache License, Version 2.0"
__copyright__ = "Copyright 2014 %s" % __author__
@@ -0,0 +1,24 @@
# Copyright 2014 Donald Stufft
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
from __future__ import absolute_import, division, print_function

from .__about__ import (
    __author__, __copyright__, __email__, __license__, __summary__, __title__,
    __uri__, __version__
)

__all__ = [
    "__title__", "__summary__", "__uri__", "__version__", "__author__",
    "__email__", "__license__", "__copyright__",
]
@@ -0,0 +1,40 @@
# Copyright 2014 Donald Stufft
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
from __future__ import absolute_import, division, print_function

import sys


PY2 = sys.version_info[0] == 2
PY3 = sys.version_info[0] == 3

# flake8: noqa

if PY3:
    string_types = str,
else:
    string_types = basestring,


def with_metaclass(meta, *bases):
    """
    Create a base class with a metaclass.
    """
    # This requires a bit of explanation: the basic idea is to make a dummy
    # metaclass for one level of class instantiation that replaces itself with
    # the actual metaclass.
    class metaclass(meta):
        def __new__(cls, name, this_bases, d):
            return meta(name, bases, d)
    return type.__new__(metaclass, 'temporary_class', (), {})
@@ -0,0 +1,78 @@
# Copyright 2014 Donald Stufft
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
from __future__ import absolute_import, division, print_function


class Infinity(object):

    def __repr__(self):
        return "Infinity"

    def __hash__(self):
        return hash(repr(self))

    def __lt__(self, other):
        return False

    def __le__(self, other):
        return False

    def __eq__(self, other):
        return isinstance(other, self.__class__)

    def __ne__(self, other):
        return not isinstance(other, self.__class__)

    def __gt__(self, other):
        return True

    def __ge__(self, other):
        return True

    def __neg__(self):
        return NegativeInfinity

Infinity = Infinity()


class NegativeInfinity(object):

    def __repr__(self):
        return "-Infinity"

    def __hash__(self):
        return hash(repr(self))

    def __lt__(self, other):
        return True

    def __le__(self, other):
        return True

    def __eq__(self, other):
        return isinstance(other, self.__class__)

    def __ne__(self, other):
        return not isinstance(other, self.__class__)

    def __gt__(self, other):
        return False

    def __ge__(self, other):
        return False

    def __neg__(self):
        return Infinity

NegativeInfinity = NegativeInfinity()
@@ -0,0 +1,784 @@
# Copyright 2014 Donald Stufft
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
from __future__ import absolute_import, division, print_function

import abc
import functools
import itertools
import re

from ._compat import string_types, with_metaclass
from .version import Version, LegacyVersion, parse


class InvalidSpecifier(ValueError):
    """
    An invalid specifier was found; users should refer to PEP 440.
    """


class BaseSpecifier(with_metaclass(abc.ABCMeta, object)):

    @abc.abstractmethod
    def __str__(self):
        """
        Returns the str representation of this Specifier-like object. This
        should be representative of the Specifier itself.
        """

    @abc.abstractmethod
    def __hash__(self):
        """
        Returns a hash value for this Specifier-like object.
        """

    @abc.abstractmethod
    def __eq__(self, other):
        """
        Returns a boolean representing whether or not the two Specifier-like
        objects are equal.
        """

    @abc.abstractmethod
    def __ne__(self, other):
        """
        Returns a boolean representing whether or not the two Specifier-like
        objects are not equal.
        """

    @abc.abstractproperty
    def prereleases(self):
        """
        Returns whether or not pre-releases as a whole are allowed by this
        specifier.
        """

    @prereleases.setter
    def prereleases(self, value):
        """
        Sets whether or not pre-releases as a whole are allowed by this
        specifier.
        """

    @abc.abstractmethod
    def contains(self, item, prereleases=None):
        """
        Determines if the given item is contained within this specifier.
        """

    @abc.abstractmethod
    def filter(self, iterable, prereleases=None):
        """
        Takes an iterable of items and filters them so that only items which
        are contained within this specifier are allowed in it.
        """

class _IndividualSpecifier(BaseSpecifier):

    _operators = {}

    def __init__(self, spec="", prereleases=None):
        match = self._regex.search(spec)
        if not match:
            raise InvalidSpecifier("Invalid specifier: '{0}'".format(spec))

        self._spec = (
            match.group("operator").strip(),
            match.group("version").strip(),
        )

        # Store whether or not this Specifier should accept prereleases
        self._prereleases = prereleases

    def __repr__(self):
        pre = (
            ", prereleases={0!r}".format(self.prereleases)
            if self._prereleases is not None
            else ""
        )

        return "<{0}({1!r}{2})>".format(
            self.__class__.__name__,
            str(self),
            pre,
        )

    def __str__(self):
        return "{0}{1}".format(*self._spec)

    def __hash__(self):
        return hash(self._spec)

    def __eq__(self, other):
        if isinstance(other, string_types):
            try:
                other = self.__class__(other)
            except InvalidSpecifier:
                return NotImplemented
        elif not isinstance(other, self.__class__):
            return NotImplemented

        return self._spec == other._spec

    def __ne__(self, other):
        if isinstance(other, string_types):
            try:
                other = self.__class__(other)
            except InvalidSpecifier:
                return NotImplemented
        elif not isinstance(other, self.__class__):
            return NotImplemented

        return self._spec != other._spec

    def _get_operator(self, op):
        return getattr(self, "_compare_{0}".format(self._operators[op]))

    def _coerce_version(self, version):
        if not isinstance(version, (LegacyVersion, Version)):
            version = parse(version)
        return version

    @property
    def operator(self):
        return self._spec[0]

    @property
    def version(self):
        return self._spec[1]

    @property
    def prereleases(self):
        return self._prereleases

    @prereleases.setter
    def prereleases(self, value):
        self._prereleases = value

    def __contains__(self, item):
        return self.contains(item)

    def contains(self, item, prereleases=None):
        # Determine if prereleases are to be allowed or not.
        if prereleases is None:
            prereleases = self.prereleases

        # Normalize item to a Version or LegacyVersion; this allows us to have
        # a shortcut for ``"2.0" in Specifier(">=2")``.
        item = self._coerce_version(item)

        # Determine if we should be supporting prereleases in this specifier
        # or not. If we do not support prereleases then we can short circuit
        # the logic if this version is a prerelease.
        if item.is_prerelease and not prereleases:
            return False

        # Actually do the comparison to determine if this item is contained
        # within this Specifier or not.
        return self._get_operator(self.operator)(item, self.version)

    def filter(self, iterable, prereleases=None):
        yielded = False
        found_prereleases = []

        kw = {"prereleases": prereleases if prereleases is not None else True}

        # Attempt to iterate over all the values in the iterable and if any of
        # them match, yield them.
        for version in iterable:
            parsed_version = self._coerce_version(version)

            if self.contains(parsed_version, **kw):
                # If our version is a prerelease, and we were not set to allow
                # prereleases, then we'll store it for later in case nothing
                # else matches this specifier.
                if (parsed_version.is_prerelease
                        and not (prereleases or self.prereleases)):
                    found_prereleases.append(version)
                # Either this is not a prerelease, or we should have been
                # accepting prereleases from the beginning.
                else:
                    yielded = True
                    yield version

        # Now that we've iterated over everything, determine if we've yielded
        # any values, and if we have not and we have any prereleases stored up
        # then we will go ahead and yield the prereleases.
        if not yielded and found_prereleases:
            for version in found_prereleases:
                yield version

class LegacySpecifier(_IndividualSpecifier):

    _regex = re.compile(
        r"""
        ^
        \s*
        (?P<operator>(==|!=|<=|>=|<|>))
        \s*
        (?P<version>
            [^\s]* # We just match everything, except for whitespace since this
                   # is a "legacy" specifier and the version string can be just
                   # about anything.
        )
        \s*
        $
        """,
        re.VERBOSE | re.IGNORECASE,
    )

    _operators = {
        "==": "equal",
        "!=": "not_equal",
        "<=": "less_than_equal",
        ">=": "greater_than_equal",
        "<": "less_than",
        ">": "greater_than",
    }

    def _coerce_version(self, version):
        if not isinstance(version, LegacyVersion):
            version = LegacyVersion(str(version))
        return version

    def _compare_equal(self, prospective, spec):
        return prospective == self._coerce_version(spec)

    def _compare_not_equal(self, prospective, spec):
        return prospective != self._coerce_version(spec)

    def _compare_less_than_equal(self, prospective, spec):
        return prospective <= self._coerce_version(spec)

    def _compare_greater_than_equal(self, prospective, spec):
        return prospective >= self._coerce_version(spec)

    def _compare_less_than(self, prospective, spec):
        return prospective < self._coerce_version(spec)

    def _compare_greater_than(self, prospective, spec):
        return prospective > self._coerce_version(spec)


def _require_version_compare(fn):
    @functools.wraps(fn)
    def wrapped(self, prospective, spec):
        if not isinstance(prospective, Version):
            return False
        return fn(self, prospective, spec)
    return wrapped

class Specifier(_IndividualSpecifier):

    _regex = re.compile(
        r"""
        ^
        \s*
        (?P<operator>(~=|==|!=|<=|>=|<|>|===))
        (?P<version>
            (?:
                # The identity operators allow for an escape hatch that will
                # do an exact string match of the version you wish to install.
                # This will not be parsed by PEP 440 and we cannot determine
                # any semantic meaning from it. This operator is discouraged
                # but included entirely as an escape hatch.
                (?<====)  # Only match for the identity operator
                \s*
                [^\s]*    # We just match everything, except for whitespace
                          # since we are only testing for strict identity.
            )
            |
            (?:
                # The (non)equality operators allow for wild card and local
                # versions to be specified so we have to define these two
                # operators separately to enable that.
                (?<===|!=)            # Only match for equals and not equals

                \s*
                v?
                (?:[0-9]+!)?          # epoch
                [0-9]+(?:\.[0-9]+)*   # release
                (?:                   # pre release
                    [-_\.]?
                    (a|b|c|rc|alpha|beta|pre|preview)
                    [-_\.]?
                    [0-9]*
                )?
                (?:                   # post release
                    (?:-[0-9]+)|(?:[-_\.]?(post|rev|r)[-_\.]?[0-9]*)
                )?

                # You cannot use a wild card and a dev or local version
                # together so group them with a | and make them optional.
                (?:
                    (?:[-_\.]?dev[-_\.]?[0-9]*)?          # dev release
                    (?:\+[a-z0-9]+(?:[-_\.][a-z0-9]+)*)?  # local
                    |
                    \.\*  # Wild card syntax of .*
                )?
            )
            |
            (?:
                # The compatible operator requires at least two digits in the
                # release segment.
                (?<=~=)               # Only match for the compatible operator

                \s*
                v?
                (?:[0-9]+!)?          # epoch
                [0-9]+(?:\.[0-9]+)+   # release  (We have a + instead of a *)
                (?:                   # pre release
                    [-_\.]?
                    (a|b|c|rc|alpha|beta|pre|preview)
                    [-_\.]?
                    [0-9]*
                )?
                (?:                   # post release
                    (?:-[0-9]+)|(?:[-_\.]?(post|rev|r)[-_\.]?[0-9]*)
                )?
                (?:[-_\.]?dev[-_\.]?[0-9]*)?  # dev release
            )
            |
            (?:
                # All other operators only allow a sub set of what the
                # (non)equality operators do. Specifically they do not allow
                # local versions to be specified nor do they allow the prefix
                # matching wild cards.
                (?<!==|!=|~=)         # We have special cases for these
                                      # operators so we want to make sure they
                                      # don't match here.

                \s*
                v?
                (?:[0-9]+!)?          # epoch
                [0-9]+(?:\.[0-9]+)*   # release
                (?:                   # pre release
                    [-_\.]?
                    (a|b|c|rc|alpha|beta|pre|preview)
                    [-_\.]?
                    [0-9]*
                )?
                (?:                   # post release
                    (?:-[0-9]+)|(?:[-_\.]?(post|rev|r)[-_\.]?[0-9]*)
                )?
                (?:[-_\.]?dev[-_\.]?[0-9]*)?  # dev release
            )
        )
        \s*
        $
        """,
        re.VERBOSE | re.IGNORECASE,
    )

    _operators = {
        "~=": "compatible",
        "==": "equal",
        "!=": "not_equal",
        "<=": "less_than_equal",
        ">=": "greater_than_equal",
        "<": "less_than",
        ">": "greater_than",
        "===": "arbitrary",
    }

    @_require_version_compare
    def _compare_compatible(self, prospective, spec):
        # Compatible releases have an equivalent combination of >= and ==. That
        # is that ~=2.2 is equivalent to >=2.2,==2.*. This allows us to
        # implement this in terms of the other specifiers instead of
        # implementing it ourselves. The only thing we need to do is construct
        # the other specifiers.

        # We want everything but the last item in the version, but we want to
        # ignore post and dev releases and we want to treat the pre-release as
        # its own separate segment.
        prefix = ".".join(
            list(
                itertools.takewhile(
                    lambda x: (not x.startswith("post")
                               and not x.startswith("dev")),
                    _version_split(spec),
                )
            )[:-1]
        )

        # Add the prefix notation to the end of our string
        prefix += ".*"

        return (self._get_operator(">=")(prospective, spec)
                and self._get_operator("==")(prospective, prefix))

    @_require_version_compare
    def _compare_equal(self, prospective, spec):
        # We need special logic to handle prefix matching
        if spec.endswith(".*"):
            # Split the spec out by dots, and pretend that there is an implicit
            # dot in between a release segment and a pre-release segment.
            spec = _version_split(spec[:-2])  # Remove the trailing .*

            # Split the prospective version out by dots, and pretend that there
            # is an implicit dot in between a release segment and a pre-release
            # segment.
            prospective = _version_split(str(prospective))

            # Shorten the prospective version to be the same length as the spec
            # so that we can determine if the specifier is a prefix of the
            # prospective version or not.
            prospective = prospective[:len(spec)]

            # Pad out our two sides with zeros so that they both equal the same
            # length.
            spec, prospective = _pad_version(spec, prospective)
        else:
            # Convert our spec string into a Version
            spec = Version(spec)

            # If the specifier does not have a local segment, then we want to
            # act as if the prospective version also does not have a local
            # segment.
            if not spec.local:
                prospective = Version(prospective.public)

        return prospective == spec

    @_require_version_compare
    def _compare_not_equal(self, prospective, spec):
        return not self._compare_equal(prospective, spec)

    @_require_version_compare
    def _compare_less_than_equal(self, prospective, spec):
        return prospective <= Version(spec)

    @_require_version_compare
    def _compare_greater_than_equal(self, prospective, spec):
        return prospective >= Version(spec)

    @_require_version_compare
    def _compare_less_than(self, prospective, spec):
        # Convert our spec to a Version instance, since we'll want to work with
        # it as a version.
        spec = Version(spec)

        # Check to see if the prospective version is less than the spec
        # version. If it's not, we can short circuit and just return False now
        # instead of doing extra unneeded work.
        if not prospective < spec:
            return False

        # This special case is here so that, unless the specifier itself
        # includes a pre-release version, we do not accept pre-release
        # versions for the version mentioned in the specifier (e.g. <3.1 should
        # not match 3.1.dev0, but should match 3.0.dev0).
        if not spec.is_prerelease and prospective.is_prerelease:
            if Version(prospective.base_version) == Version(spec.base_version):
                return False

        # If we've gotten to here, it means that the prospective version is
        # both less than the spec version *and* it's not a pre-release of the
        # same version in the spec.
        return True

    @_require_version_compare
    def _compare_greater_than(self, prospective, spec):
        # Convert our spec to a Version instance, since we'll want to work with
        # it as a version.
        spec = Version(spec)

        # Check to see if the prospective version is greater than the spec
        # version. If it's not, we can short circuit and just return False now
        # instead of doing extra unneeded work.
        if not prospective > spec:
            return False

        # This special case is here so that, unless the specifier itself
        # includes a post-release version, we do not accept
        # post-release versions for the version mentioned in the specifier
        # (e.g. >3.1 should not match 3.0.post0, but should match 3.2.post0).
        if not spec.is_postrelease and prospective.is_postrelease:
            if Version(prospective.base_version) == Version(spec.base_version):
                return False

        # Ensure that we do not allow a local version of the version mentioned
        # in the specifier, which is technically greater than, to match.
        if prospective.local is not None:
            if Version(prospective.base_version) == Version(spec.base_version):
                return False

        # If we've gotten to here, it means that the prospective version is
        # both greater than the spec version *and* it's not a post-release of
        # the same version in the spec.
        return True

    def _compare_arbitrary(self, prospective, spec):
        return str(prospective).lower() == str(spec).lower()

    @property
    def prereleases(self):
        # If there is an explicit prereleases set for this, then we'll just
        # blindly use that.
        if self._prereleases is not None:
            return self._prereleases

        # Look at all of our specifiers and determine if they are inclusive
        # operators, and if they are, whether they include an explicit
        # prerelease.
        operator, version = self._spec
        if operator in ["==", ">=", "<=", "~=", "==="]:
            # The == specifier can include a trailing .*; if it does we
            # want to remove it before parsing.
            if operator == "==" and version.endswith(".*"):
                version = version[:-2]

            # Parse the version, and if it is a pre-release then this
            # specifier allows pre-releases.
            if parse(version).is_prerelease:
                return True

        return False

    @prereleases.setter
    def prereleases(self, value):
        self._prereleases = value

_prefix_regex = re.compile(r"^([0-9]+)((?:a|b|c|rc)[0-9]+)$")


def _version_split(version):
    result = []
    for item in version.split("."):
        match = _prefix_regex.search(item)
        if match:
            result.extend(match.groups())
        else:
            result.append(item)
    return result


def _pad_version(left, right):
    left_split, right_split = [], []

    # Get the release segment of our versions
    left_split.append(list(itertools.takewhile(lambda x: x.isdigit(), left)))
    right_split.append(list(itertools.takewhile(lambda x: x.isdigit(), right)))

    # Get the rest of our versions
    left_split.append(left[len(left_split[0]):])
    right_split.append(right[len(right_split[0]):])

    # Insert our padding
    left_split.insert(
        1,
        ["0"] * max(0, len(right_split[0]) - len(left_split[0])),
    )
    right_split.insert(
        1,
        ["0"] * max(0, len(left_split[0]) - len(right_split[0])),
    )

    return (
        list(itertools.chain(*left_split)),
        list(itertools.chain(*right_split)),
    )

class SpecifierSet(BaseSpecifier):

    def __init__(self, specifiers="", prereleases=None):
        # Split on , to break each individual specifier into its own item, and
        # strip each item to remove leading/trailing whitespace.
        specifiers = [s.strip() for s in specifiers.split(",") if s.strip()]

        # Parse each individual specifier, attempting first to make it a
        # Specifier and falling back to a LegacySpecifier.
        parsed = set()
        for specifier in specifiers:
            try:
                parsed.add(Specifier(specifier))
            except InvalidSpecifier:
                parsed.add(LegacySpecifier(specifier))

        # Turn our parsed specifiers into a frozen set and save them for later.
        self._specs = frozenset(parsed)

        # Store our prereleases value so we can use it later to determine if
        # we accept prereleases or not.
        self._prereleases = prereleases

    def __repr__(self):
        pre = (
            ", prereleases={0!r}".format(self.prereleases)
            if self._prereleases is not None
            else ""
        )

        return "<SpecifierSet({0!r}{1})>".format(str(self), pre)

    def __str__(self):
        return ",".join(sorted(str(s) for s in self._specs))

    def __hash__(self):
        return hash(self._specs)

    def __and__(self, other):
        if isinstance(other, string_types):
            other = SpecifierSet(other)
        elif not isinstance(other, SpecifierSet):
            return NotImplemented

        specifier = SpecifierSet()
        specifier._specs = frozenset(self._specs | other._specs)

        if self._prereleases is None and other._prereleases is not None:
            specifier._prereleases = other._prereleases
        elif self._prereleases is not None and other._prereleases is None:
            specifier._prereleases = self._prereleases
        elif self._prereleases == other._prereleases:
            specifier._prereleases = self._prereleases
        else:
            raise ValueError(
                "Cannot combine SpecifierSets with True and False prerelease "
                "overrides."
            )

        return specifier

    def __eq__(self, other):
        if isinstance(other, string_types):
            other = SpecifierSet(other)
        elif isinstance(other, _IndividualSpecifier):
            other = SpecifierSet(str(other))
        elif not isinstance(other, SpecifierSet):
            return NotImplemented

        return self._specs == other._specs

    def __ne__(self, other):
        if isinstance(other, string_types):
            other = SpecifierSet(other)
        elif isinstance(other, _IndividualSpecifier):
            other = SpecifierSet(str(other))
        elif not isinstance(other, SpecifierSet):
            return NotImplemented

        return self._specs != other._specs

    def __len__(self):
        return len(self._specs)

    def __iter__(self):
        return iter(self._specs)

    @property
    def prereleases(self):
        # If we have been given an explicit prerelease modifier, then we'll
        # pass that through here.
        if self._prereleases is not None:
            return self._prereleases

        # If we don't have any specifiers, and we don't have a forced value,
        # then we'll just return None since we don't know if this should have
        # pre-releases or not.
        if not self._specs:
            return None

        # Otherwise we'll see if any of the given specifiers accept
        # prereleases. If any of them do we'll return True, otherwise False.
        return any(s.prereleases for s in self._specs)

    @prereleases.setter
    def prereleases(self, value):
        self._prereleases = value

    def __contains__(self, item):
        return self.contains(item)

    def contains(self, item, prereleases=None):
        # Ensure that our item is a Version or LegacyVersion instance.
        if not isinstance(item, (LegacyVersion, Version)):
            item = parse(item)

        # Determine if we're forcing a prerelease or not. If we're not forcing
        # one for this particular call, then we'll use whatever the
        # SpecifierSet thinks for whether or not we should support prereleases.
        if prereleases is None:
            prereleases = self.prereleases

        # We can determine if we're going to allow pre-releases by looking to
        # see if any of the underlying items supports them. If none of them do
        # and this item is a pre-release then we do not allow it and we can
        # short circuit that here.
        # Note: This means that 1.0.dev1 would not be contained in something
        #       like >=1.0.devabc, however it would be in
        #       >=1.0.devabc,>0.0.dev0.
        if not prereleases and item.is_prerelease:
            return False

        # We simply dispatch to the underlying specs here to make sure that the
        # given version is contained within all of them.
        # Note: This use of all() here means that an empty set of specifiers
        #       will always return True; this is an explicit design decision.
        return all(
            s.contains(item, prereleases=prereleases)
            for s in self._specs
        )

    def filter(self, iterable, prereleases=None):
        # Determine if we're forcing a prerelease or not. If we're not forcing
        # one for this particular filter call, then we'll use whatever the
        # SpecifierSet thinks for whether or not we should support prereleases.
        if prereleases is None:
            prereleases = self.prereleases

        # If we have any specifiers, then we want to wrap our iterable in the
        # filter method for each one; this will act as a logical AND amongst
        # each specifier.
        if self._specs:
            for spec in self._specs:
                iterable = spec.filter(iterable, prereleases=bool(prereleases))
            return iterable
        # If we do not have any specifiers, then we need to have a rough filter
        # which will filter out any pre-releases, unless there are no final
        # releases, and which will filter out LegacyVersion in general.
        else:
            filtered = []
            found_prereleases = []

            for item in iterable:
                # Ensure that we have some kind of Version class for this item.
                if not isinstance(item, (LegacyVersion, Version)):
                    parsed_version = parse(item)
                else:
                    parsed_version = item

                # Filter out any item which is parsed as a LegacyVersion.
                if isinstance(parsed_version, LegacyVersion):
                    continue

                # Store any item which is a pre-release for later unless we've
                # already found a final version or we are accepting prereleases.
                if parsed_version.is_prerelease and not prereleases:
                    if not filtered:
                        found_prereleases.append(item)
                else:
                    filtered.append(item)

            # If we've found no items except for pre-releases, then we'll go
            # ahead and use the pre-releases.
            if not filtered and found_prereleases and prereleases is None:
                return found_prereleases

            return filtered
|
|
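The fallback behaviour of ``filter`` above — pre-releases are dropped, but used anyway when no final release survives — can be sketched in isolation. ``pick_candidates`` and ``looks_pre`` below are hypothetical names for illustration only, not part of the real SpecifierSet API:

```python
def pick_candidates(versions, is_prerelease):
    """Sketch of the fallback rule: drop pre-releases unless
    only pre-releases are available (hypothetical helper)."""
    finals = [v for v in versions if not is_prerelease(v)]
    # Fall back to the pre-releases only when no final release survived.
    return finals if finals else list(versions)


# A deliberately naive pre-release test, purely for illustration:
def looks_pre(v):
    return any(tag in v for tag in ("a", "b", "rc", "dev"))


print(pick_candidates(["1.0", "1.1a1"], looks_pre))       # ['1.0']
print(pick_candidates(["1.1a1", "2.0.dev0"], looks_pre))  # ['1.1a1', '2.0.dev0']
```

The real implementation threads each item through ``parse`` first so that legacy (non-PEP 440) versions can also be excluded.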
@@ -0,0 +1,403 @@
# Copyright 2014 Donald Stufft
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
from __future__ import absolute_import, division, print_function

import collections
import itertools
import re

from ._structures import Infinity


__all__ = [
    "parse", "Version", "LegacyVersion", "InvalidVersion", "VERSION_PATTERN"
]


_Version = collections.namedtuple(
    "_Version",
    ["epoch", "release", "dev", "pre", "post", "local"],
)


def parse(version):
    """
    Parse the given version string and return either a :class:`Version` object
    or a :class:`LegacyVersion` object depending on if the given version is
    a valid PEP 440 version or a legacy version.
    """
    try:
        return Version(version)
    except InvalidVersion:
        return LegacyVersion(version)


class InvalidVersion(ValueError):
    """
    An invalid version was found, users should refer to PEP 440.
    """


class _BaseVersion(object):

    def __hash__(self):
        return hash(self._key)

    def __lt__(self, other):
        return self._compare(other, lambda s, o: s < o)

    def __le__(self, other):
        return self._compare(other, lambda s, o: s <= o)

    def __eq__(self, other):
        return self._compare(other, lambda s, o: s == o)

    def __ge__(self, other):
        return self._compare(other, lambda s, o: s >= o)

    def __gt__(self, other):
        return self._compare(other, lambda s, o: s > o)

    def __ne__(self, other):
        return self._compare(other, lambda s, o: s != o)

    def _compare(self, other, method):
        if not isinstance(other, _BaseVersion):
            return NotImplemented

        return method(self._key, other._key)


class LegacyVersion(_BaseVersion):

    def __init__(self, version):
        self._version = str(version)
        self._key = _legacy_cmpkey(self._version)

    def __str__(self):
        return self._version

    def __repr__(self):
        return "<LegacyVersion({0})>".format(repr(str(self)))

    @property
    def public(self):
        return self._version

    @property
    def base_version(self):
        return self._version

    @property
    def local(self):
        return None

    @property
    def is_prerelease(self):
        return False

    @property
    def is_postrelease(self):
        return False


_legacy_version_component_re = re.compile(
    r"(\d+ | [a-z]+ | \.| -)", re.VERBOSE,
)

_legacy_version_replacement_map = {
    "pre": "c", "preview": "c", "-": "final-", "rc": "c", "dev": "@",
}


def _parse_version_parts(s):
    for part in _legacy_version_component_re.split(s):
        part = _legacy_version_replacement_map.get(part, part)

        if not part or part == ".":
            continue

        if part[:1] in "0123456789":
            # pad for numeric comparison
            yield part.zfill(8)
        else:
            yield "*" + part

    # ensure that alpha/beta/candidate are before final
    yield "*final"


def _legacy_cmpkey(version):
    # We hardcode an epoch of -1 here. A PEP 440 version can only have an epoch
    # greater than or equal to 0. This will effectively put the LegacyVersion,
    # which uses the de facto standard originally implemented by setuptools,
    # before all PEP 440 versions.
    epoch = -1

    # This scheme is taken from pkg_resources.parse_version in setuptools prior
    # to its adoption of the packaging library.
    parts = []
    for part in _parse_version_parts(version.lower()):
        if part.startswith("*"):
            # remove "-" before a prerelease tag
            if part < "*final":
                while parts and parts[-1] == "*final-":
                    parts.pop()

            # remove trailing zeros from each series of numeric parts
            while parts and parts[-1] == "00000000":
                parts.pop()

        parts.append(part)
    parts = tuple(parts)

    return epoch, parts


# Deliberately not anchored to the start and end of the string, to make it
# easier for 3rd party code to reuse
VERSION_PATTERN = r"""
    v?
    (?:
        (?:(?P<epoch>[0-9]+)!)?                           # epoch
        (?P<release>[0-9]+(?:\.[0-9]+)*)                  # release segment
        (?P<pre>                                          # pre-release
            [-_\.]?
            (?P<pre_l>(a|b|c|rc|alpha|beta|pre|preview))
            [-_\.]?
            (?P<pre_n>[0-9]+)?
        )?
        (?P<post>                                         # post release
            (?:-(?P<post_n1>[0-9]+))
            |
            (?:
                [-_\.]?
                (?P<post_l>post|rev|r)
                [-_\.]?
                (?P<post_n2>[0-9]+)?
            )
        )?
        (?P<dev>                                          # dev release
            [-_\.]?
            (?P<dev_l>dev)
            [-_\.]?
            (?P<dev_n>[0-9]+)?
        )?
    )
    (?:\+(?P<local>[a-z0-9]+(?:[-_\.][a-z0-9]+)*))?       # local version
"""


class Version(_BaseVersion):

    _regex = re.compile(
        r"^\s*" + VERSION_PATTERN + r"\s*$",
        re.VERBOSE | re.IGNORECASE,
    )

    def __init__(self, version):
        # Validate the version and parse it into pieces
        match = self._regex.search(version)
        if not match:
            raise InvalidVersion("Invalid version: '{0}'".format(version))

        # Store the parsed out pieces of the version
        self._version = _Version(
            epoch=int(match.group("epoch")) if match.group("epoch") else 0,
            release=tuple(int(i) for i in match.group("release").split(".")),
            pre=_parse_letter_version(
                match.group("pre_l"),
                match.group("pre_n"),
            ),
            post=_parse_letter_version(
                match.group("post_l"),
                match.group("post_n1") or match.group("post_n2"),
            ),
            dev=_parse_letter_version(
                match.group("dev_l"),
                match.group("dev_n"),
            ),
            local=_parse_local_version(match.group("local")),
        )

        # Generate a key which will be used for sorting
        self._key = _cmpkey(
            self._version.epoch,
            self._version.release,
            self._version.pre,
            self._version.post,
            self._version.dev,
            self._version.local,
        )

    def __repr__(self):
        return "<Version({0})>".format(repr(str(self)))

    def __str__(self):
        parts = []

        # Epoch
        if self._version.epoch != 0:
            parts.append("{0}!".format(self._version.epoch))

        # Release segment
        parts.append(".".join(str(x) for x in self._version.release))

        # Pre-release
        if self._version.pre is not None:
            parts.append("".join(str(x) for x in self._version.pre))

        # Post-release
        if self._version.post is not None:
            parts.append(".post{0}".format(self._version.post[1]))

        # Development release
        if self._version.dev is not None:
            parts.append(".dev{0}".format(self._version.dev[1]))

        # Local version segment
        if self._version.local is not None:
            parts.append(
                "+{0}".format(".".join(str(x) for x in self._version.local))
            )

        return "".join(parts)

    @property
    def public(self):
        return str(self).split("+", 1)[0]

    @property
    def base_version(self):
        parts = []

        # Epoch
        if self._version.epoch != 0:
            parts.append("{0}!".format(self._version.epoch))

        # Release segment
        parts.append(".".join(str(x) for x in self._version.release))

        return "".join(parts)

    @property
    def local(self):
        version_string = str(self)
        if "+" in version_string:
            return version_string.split("+", 1)[1]

    @property
    def is_prerelease(self):
        return bool(self._version.dev or self._version.pre)

    @property
    def is_postrelease(self):
        return bool(self._version.post)


def _parse_letter_version(letter, number):
    if letter:
        # We consider there to be an implicit 0 in a pre-release if there is
        # not a numeral associated with it.
        if number is None:
            number = 0

        # We normalize any letters to their lower case form
        letter = letter.lower()

        # We consider some words to be alternate spellings of other words and
        # in those cases we want to normalize the spellings to our preferred
        # spelling.
        if letter == "alpha":
            letter = "a"
        elif letter == "beta":
            letter = "b"
        elif letter in ["c", "pre", "preview"]:
            letter = "rc"
        elif letter in ["rev", "r"]:
            letter = "post"

        return letter, int(number)
    if not letter and number:
        # We assume if we are given a number, but we are not given a letter
        # then this is using the implicit post release syntax (e.g. 1.0-1)
        letter = "post"

        return letter, int(number)


_local_version_seperators = re.compile(r"[\._-]")


def _parse_local_version(local):
    """
    Takes a string like abc.1.twelve and turns it into ("abc", 1, "twelve").
    """
    if local is not None:
        return tuple(
            part.lower() if not part.isdigit() else int(part)
            for part in _local_version_seperators.split(local)
        )


def _cmpkey(epoch, release, pre, post, dev, local):
    # When we compare a release version, we want to compare it with all of the
    # trailing zeros removed. So we'll reverse the list, drop all the now
    # leading zeros until we come to something non-zero, then re-reverse the
    # rest back into the correct order and make it a tuple to use as our
    # sorting key.
    release = tuple(
        reversed(list(
            itertools.dropwhile(
                lambda x: x == 0,
                reversed(release),
            )
        ))
    )

    # We need to "trick" the sorting algorithm to put 1.0.dev0 before 1.0a0.
    # We'll do this by abusing the pre segment, but we _only_ want to do this
    # if there is not a pre or a post segment. If we have one of those then
    # the normal sorting rules will handle this case correctly.
    if pre is None and post is None and dev is not None:
        pre = -Infinity
    # Versions without a pre-release (except as noted above) should sort after
    # those with one.
    elif pre is None:
        pre = Infinity

    # Versions without a post segment should sort before those with one.
    if post is None:
        post = -Infinity

    # Versions without a development segment should sort after those with one.
    if dev is None:
        dev = Infinity

    if local is None:
        # Versions without a local segment should sort before those with one.
        local = -Infinity
    else:
        # Versions with a local segment need that segment parsed to implement
        # the sorting rules in PEP 440.
        # - Alphanumeric segments sort before numeric segments
        # - Alphanumeric segments sort lexicographically
        # - Numeric segments sort numerically
        # - Shorter versions sort before longer versions when the prefixes
        #   match exactly
        local = tuple(
            (i, "") if isinstance(i, int) else (-Infinity, i)
            for i in local
        )

    return epoch, release, pre, post, dev, local
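The zero-padding trick in ``_parse_version_parts`` above — numeric parts are padded so plain string comparison matches numeric comparison, and alphabetic parts are prefixed with ``*`` so they sort before the sentinel ``*final`` — can be seen in a standalone sketch. ``legacy_parts`` is a hypothetical copy made for illustration, not the module's API:

```python
import re

# Standalone copies of the tokenizer's regex and replacement map.
_component_re = re.compile(r"(\d+ | [a-z]+ | \.| -)", re.VERBOSE)
_replace = {"pre": "c", "preview": "c", "-": "final-", "rc": "c", "dev": "@"}


def legacy_parts(s):
    """Tokenize a legacy version string into string-comparable parts."""
    out = []
    for part in _component_re.split(s):
        part = _replace.get(part, part)
        if not part or part == ".":
            continue
        if part[:1] in "0123456789":
            # zero-pad so lexicographic order equals numeric order
            out.append(part.zfill(8))
        else:
            # "*" sorts before digits, so tags come before numeric parts
            out.append("*" + part)
    out.append("*final")
    return out


print(legacy_parts("1.4.1"))  # ['00000001', '00000004', '00000001', '*final']
print(legacy_parts("1.4a1"))  # ['00000001', '00000004', '*a', '00000001', '*final']
```

Because ``"*a" < "*final"`` lexicographically, ``1.4a1`` compares less than ``1.4``, which is exactly the pre-release-before-final ordering the legacy scheme relies on.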
@@ -0,0 +1,238 @@
===============================
Installing and Using Setuptools
===============================

.. contents:: **Table of Contents**


`Change History <https://pythonhosted.org/setuptools/history.html>`_.

-------------------------
Installation Instructions
-------------------------

The recommended way to bootstrap setuptools on any system is to download
`ez_setup.py`_ and run it using the target Python environment. Different
operating systems have different recommended techniques to accomplish this
basic routine, so below are some examples to get you started.

Setuptools requires Python 2.6 or later. To install setuptools
on Python 2.4 or Python 2.5, use the `bootstrap script for Setuptools 1.x
<https://bitbucket.org/pypa/setuptools/raw/bootstrap-py24/ez_setup.py>`_.

The link provided to ez_setup.py is a bookmark to the bootstrap script for the
latest known stable release.

.. _ez_setup.py: https://bootstrap.pypa.io/ez_setup.py

Windows (PowerShell 3 or later)
===============================

For best results, uninstall previous versions FIRST (see `Uninstalling`_).

Using Windows 8 (which includes PowerShell 3) or earlier versions of Windows
with PowerShell 3 installed, it's possible to install with one simple
PowerShell command. Start up PowerShell and paste this command::

    > (Invoke-WebRequest https://bootstrap.pypa.io/ez_setup.py).Content | python -

You must start PowerShell with administrative privileges, or you may choose
a user-local installation instead::

    > (Invoke-WebRequest https://bootstrap.pypa.io/ez_setup.py).Content | python - --user

If you have Python 3.3 or later, you can use the ``py`` command to install to
different Python versions. For example, to install to Python 3.3 if you have
Python 2.7 installed::

    > (Invoke-WebRequest https://bootstrap.pypa.io/ez_setup.py).Content | py -3 -

The recommended way to install setuptools on Windows is to download
`ez_setup.py`_ and run it. The script will download the appropriate
distribution file and install it for you.

Once installation is complete, you will find an ``easy_install`` program in
your Python ``Scripts`` subdirectory. For simple invocation and best results,
add this directory to your ``PATH`` environment variable, if it is not already
present. If you did a user-local install, the ``Scripts`` subdirectory is
``$env:APPDATA\Python\Scripts``.


Windows (simplified)
====================

For Windows without PowerShell 3 or for installation without a command-line,
download `ez_setup.py`_ using your preferred web browser or other technique
and "run" that file.


Unix (wget)
===========

Most Linux distributions come with wget.

Download `ez_setup.py`_ and run it using the target Python version. The script
will download the appropriate version and install it for you::

    > wget https://bootstrap.pypa.io/ez_setup.py -O - | python

Note that you may need to invoke the command with superuser privileges to
install to the system Python::

    > wget https://bootstrap.pypa.io/ez_setup.py -O - | sudo python

Alternatively, Setuptools may be installed to a user-local path::

    > wget https://bootstrap.pypa.io/ez_setup.py -O - | python - --user

Note that on some older systems (noted on Debian 6 and CentOS 5 installations),
`wget` may refuse to download `ez_setup.py`, complaining that the certificate
common name `*.c.ssl.fastly.net` does not match the host name
`bootstrap.pypa.io`. In addition, the `ez_setup.py` script may then encounter
similar problems using `wget` internally to download `setuptools-x.y.zip`,
complaining that the certificate common name of `www.python.org` does not
match the host name `pypi.python.org`. Those are known issues, related to a
bug in older versions of `wget`
(see `Issue 59 <https://bitbucket.org/pypa/pypi/issue/59#comment-5881915>`_).
If you happen to encounter them, install Setuptools as follows::

    > wget --no-check-certificate https://bootstrap.pypa.io/ez_setup.py
    > python ez_setup.py --insecure


Unix including Mac OS X (curl)
==============================

If your system has curl installed, follow the ``wget`` instructions but
replace ``wget`` with ``curl`` and ``-O`` with ``-o``. For example::

    > curl https://bootstrap.pypa.io/ez_setup.py -o - | python


Advanced Installation
=====================

For more advanced installation options, such as installing to custom
locations or prefixes, download and extract the source
tarball from `Setuptools on PyPI <https://pypi.python.org/pypi/setuptools>`_
and run setup.py with any supported distutils and Setuptools options.
For example::

    setuptools-x.x$ python setup.py install --prefix=/opt/setuptools

Use ``--help`` to get a full options list, but we recommend consulting
the `EasyInstall manual`_ for detailed instructions, especially `the section
on custom installation locations`_.

.. _EasyInstall manual: https://pythonhosted.org/setuptools/EasyInstall
.. _the section on custom installation locations: https://pythonhosted.org/setuptools/EasyInstall#custom-installation-locations


Downloads
=========

All setuptools downloads can be found at `the project's home page in the Python
Package Index`_. Scroll to the very bottom of the page to find the links.

.. _the project's home page in the Python Package Index: https://pypi.python.org/pypi/setuptools

In addition to the PyPI downloads, the development version of ``setuptools``
is available from the `Bitbucket repo`_, and in-development versions of the
`0.6 branch`_ are available as well.

.. _Bitbucket repo: https://bitbucket.org/pypa/setuptools/get/default.tar.gz#egg=setuptools-dev
.. _0.6 branch: http://svn.python.org/projects/sandbox/branches/setuptools-0.6/#egg=setuptools-dev06

Uninstalling
============

On Windows, if Setuptools was installed using an ``.exe`` or ``.msi``
installer, simply use the uninstall feature of "Add/Remove Programs" in the
Control Panel.

Otherwise, to uninstall Setuptools or Distribute, regardless of the Python
version, delete all ``setuptools*`` and ``distribute*`` files and
directories from your system's ``site-packages`` directory
(and any other ``sys.path`` directories) FIRST.

If you are upgrading or otherwise plan to re-install Setuptools or Distribute,
nothing further needs to be done. If you want to completely remove Setuptools,
you may also want to remove the 'easy_install' and 'easy_install-x.x' scripts
and associated executables installed to the Python scripts directory.

--------------------------------
Using Setuptools and EasyInstall
--------------------------------

Here are some of the available manuals, tutorials, and other resources for
learning about Setuptools, Python Eggs, and EasyInstall:

* `The EasyInstall user's guide and reference manual`_
* `The setuptools Developer's Guide`_
* `The pkg_resources API reference`_
* `The Internal Structure of Python Eggs`_

Questions, comments, and bug reports should be directed to the `distutils-sig
mailing list`_. If you have written (or know of) any tutorials, documentation,
plug-ins, or other resources for setuptools users, please let us know about
them there, so this reference list can be updated. If you have working,
*tested* patches to correct problems or add features, you may submit them to
the `setuptools bug tracker`_.

.. _setuptools bug tracker: https://bitbucket.org/pypa/setuptools/issues
.. _The Internal Structure of Python Eggs: https://pythonhosted.org/setuptools/formats.html
.. _The setuptools Developer's Guide: https://pythonhosted.org/setuptools/setuptools.html
.. _The pkg_resources API reference: https://pythonhosted.org/setuptools/pkg_resources.html
.. _The EasyInstall user's guide and reference manual: https://pythonhosted.org/setuptools/easy_install.html
.. _distutils-sig mailing list: http://mail.python.org/pipermail/distutils-sig/


-------
Credits
-------

* The original design for the ``.egg`` format and the ``pkg_resources`` API was
  co-created by Phillip Eby and Bob Ippolito. Bob also implemented the first
  version of ``pkg_resources``, and supplied the OS X operating system version
  compatibility algorithm.

* Ian Bicking implemented many early "creature comfort" features of
  easy_install, including support for downloading via Sourceforge and
  Subversion repositories. Ian's comments on the Web-SIG about WSGI
  application deployment also inspired the concept of "entry points" in eggs,
  and he has given talks at PyCon and elsewhere to inform and educate the
  community about eggs and setuptools.

* Jim Fulton contributed time and effort to build automated tests of various
  aspects of ``easy_install``, and supplied the doctests for the command-line
  ``.exe`` wrappers on Windows.

* Phillip J. Eby is the seminal author of setuptools, and
  first proposed the idea of an importable binary distribution format for
  Python application plug-ins.

* Significant parts of the implementation of setuptools were funded by the Open
  Source Applications Foundation, to provide a plug-in infrastructure for the
  Chandler PIM application. In addition, many OSAF staffers (such as Mike
  "Code Bear" Taylor) contributed their time and stress as guinea pigs for the
  use of eggs and setuptools, even before eggs were "cool". (Thanks, guys!)

* Tarek Ziadé is the principal author of the Distribute fork, which
  re-invigorated the community on the project, encouraged renewed innovation,
  and addressed many defects.

* Since the merge with Distribute, Jason R. Coombs is the
  maintainer of setuptools. The project is maintained in coordination with
  the Python Packaging Authority (PyPA) and the larger Python community.

.. _files:


---------------
Code of Conduct
---------------

Everyone interacting in the setuptools project's codebases, issue trackers,
chat rooms, and mailing lists is expected to follow the
`PyPA Code of Conduct`_.

.. _PyPA Code of Conduct: https://www.pypa.io/en/latest/code-of-conduct/
268
lib/python3.4/site-packages/setuptools-18.5.dist-info/METADATA
Normal file
@ -0,0 +1,268 @@
|
||||||
|
Metadata-Version: 2.0
|
||||||
|
Name: setuptools
|
||||||
|
Version: 18.5
|
||||||
|
Summary: Easily download, build, install, upgrade, and uninstall Python packages
|
||||||
|
Home-page: https://bitbucket.org/pypa/setuptools
|
||||||
|
Author: Python Packaging Authority
|
||||||
|
Author-email: distutils-sig@python.org
|
||||||
|
License: PSF or ZPL
|
||||||
|
Keywords: CPAN PyPI distutils eggs package management
|
||||||
|
Platform: UNKNOWN
|
||||||
|
Classifier: Development Status :: 5 - Production/Stable
|
||||||
|
Classifier: Intended Audience :: Developers
|
||||||
|
Classifier: License :: OSI Approved :: Python Software Foundation License
|
||||||
|
Classifier: License :: OSI Approved :: Zope Public License
|
||||||
|
Classifier: Operating System :: OS Independent
|
||||||
|
Classifier: Programming Language :: Python :: 2.6
|
||||||
|
Classifier: Programming Language :: Python :: 2.7
|
||||||
|
Classifier: Programming Language :: Python :: 3
|
||||||
|
Classifier: Programming Language :: Python :: 3.3
|
||||||
|
Classifier: Programming Language :: Python :: 3.4
|
||||||
|
Classifier: Programming Language :: Python :: 3.5
|
||||||
|
Classifier: Topic :: Software Development :: Libraries :: Python Modules
|
||||||
|
Classifier: Topic :: System :: Archiving :: Packaging
|
||||||
|
Classifier: Topic :: System :: Systems Administration
|
||||||
|
Classifier: Topic :: Utilities
|
||||||
|
Provides-Extra: certs
|
||||||
|
Requires-Dist: certifi (==2015.04.28); extra == 'certs'
|
||||||
|
Provides-Extra: ssl
|
||||||
|
Requires-Dist: wincertstore (==0.2); sys_platform=='win32' and extra == 'ssl'
|
||||||
|
|
||||||
|
===============================
|
||||||
|
Installing and Using Setuptools
|
||||||
|
===============================
|
||||||
|
|
||||||
|
.. contents:: **Table of Contents**
|
||||||
|
|
||||||
|
|
||||||
|
`Change History <https://pythonhosted.org/setuptools/history.html>`_.
|
||||||
|
|
||||||
|
-------------------------
|
||||||
|
Installation Instructions
|
||||||
|
-------------------------
|
||||||
|
|
||||||
|
The recommended way to bootstrap setuptools on any system is to download
|
||||||
|
`ez_setup.py`_ and run it using the target Python environment. Different
|
||||||
|
operating systems have different recommended techniques to accomplish this
|
||||||
|
basic routine, so below are some examples to get you started.
|
||||||
|
|
||||||
|
Setuptools requires Python 2.6 or later. To install setuptools
|
||||||
|
on Python 2.4 or Python 2.5, use the `bootstrap script for Setuptools 1.x
|
||||||
|
<https://bitbucket.org/pypa/setuptools/raw/bootstrap-py24/ez_setup.py>`_.
|
||||||
|
|
||||||
|
The link provided to ez_setup.py is a bookmark to bootstrap script for the
|
||||||
|
latest known stable release.
|
||||||
|
|
||||||
|
.. _ez_setup.py: https://bootstrap.pypa.io/ez_setup.py
|
||||||
|
|
||||||
|
Windows (Powershell 3 or later)
|
||||||
|
===============================
|
||||||
|
|
||||||
|
For best results, uninstall previous versions FIRST (see `Uninstalling`_).
|
||||||
|
|
||||||
|
Using Windows 8 (which includes PowerShell 3) or earlier versions of Windows
|
||||||
|
with PowerShell 3 installed, it's possible to install with one simple
|
||||||
|
Powershell command. Start up Powershell and paste this command::
|
||||||
|
|
||||||
|
> (Invoke-WebRequest https://bootstrap.pypa.io/ez_setup.py).Content | python -
|
||||||
|
|
||||||
|
You must start the Powershell with Administrative privileges or you may choose
|
||||||
|
to install a user-local installation::
|
||||||
|
|
||||||
|
> (Invoke-WebRequest https://bootstrap.pypa.io/ez_setup.py).Content | python - --user
|
||||||
|
|
||||||
|
If you have Python 3.3 or later, you can use the ``py`` command to install to
|
||||||
|
different Python versions. For example, to install to Python 3.3 if you have
|
||||||
|
Python 2.7 installed::
|
||||||
|
|
||||||
|
> (Invoke-WebRequest https://bootstrap.pypa.io/ez_setup.py).Content | py -3 -
|
||||||
|
|
||||||
|
The recommended way to install setuptools on Windows is to download
|
||||||
|
`ez_setup.py`_ and run it. The script will download the appropriate
|
||||||
|
distribution file and install it for you.
|
||||||
|
|
||||||
|
Once installation is complete, you will find an ``easy_install`` program in
|
||||||
|
your Python ``Scripts`` subdirectory. For simple invocation and best results,
|
||||||
|
add this directory to your ``PATH`` environment variable, if it is not already
|
||||||
|
present. If you did a user-local install, the ``Scripts`` subdirectory is
|
||||||
|
``$env:APPDATA\Python\Scripts``.
|
||||||
|
|
||||||
|
|
||||||
|
Windows (simplified)
====================

For Windows without PowerShell 3 or for installation without a command-line,
download `ez_setup.py`_ using your preferred web browser or other technique
and "run" that file.

Unix (wget)
===========

Most Linux distributions come with wget.

Download `ez_setup.py`_ and run it using the target Python version. The script
will download the appropriate version and install it for you::

    > wget https://bootstrap.pypa.io/ez_setup.py -O - | python

Note that you may need to invoke the command with superuser privileges to
install to the system Python::

    > wget https://bootstrap.pypa.io/ez_setup.py -O - | sudo python

Alternatively, Setuptools may be installed to a user-local path::

    > wget https://bootstrap.pypa.io/ez_setup.py -O - | python - --user

Note that on some older systems (noted on Debian 6 and CentOS 5 installations),
`wget` may refuse to download `ez_setup.py`, complaining that the certificate
common name `*.c.ssl.fastly.net` does not match the host name
`bootstrap.pypa.io`. In addition, the `ez_setup.py` script may then encounter
similar problems using `wget` internally to download `setuptools-x.y.zip`,
complaining that the certificate common name of `www.python.org` does not
match the host name `pypi.python.org`. Those are known issues, related to a
bug in the older versions of `wget`
(see `Issue 59 <https://bitbucket.org/pypa/pypi/issue/59#comment-5881915>`_).
If you happen to encounter them, install Setuptools as follows::

    > wget --no-check-certificate https://bootstrap.pypa.io/ez_setup.py
    > python ez_setup.py --insecure

Unix including Mac OS X (curl)
==============================

If your system has curl installed, follow the ``wget`` instructions but
replace ``wget`` with ``curl`` and ``-O`` with ``-o``. For example::

    > curl https://bootstrap.pypa.io/ez_setup.py -o - | python

Advanced Installation
=====================

For more advanced installation options, such as installing to custom
locations or prefixes, download and extract the source
tarball from `Setuptools on PyPI <https://pypi.python.org/pypi/setuptools>`_
and run setup.py with any supported distutils and Setuptools options.
For example::

    setuptools-x.x$ python setup.py install --prefix=/opt/setuptools

Use ``--help`` to get a full options list, but we recommend consulting
the `EasyInstall manual`_ for detailed instructions, especially `the section
on custom installation locations`_.

.. _EasyInstall manual: https://pythonhosted.org/setuptools/EasyInstall
.. _the section on custom installation locations: https://pythonhosted.org/setuptools/EasyInstall#custom-installation-locations

Downloads
=========

All setuptools downloads can be found at `the project's home page in the Python
Package Index`_. Scroll to the very bottom of the page to find the links.

.. _the project's home page in the Python Package Index: https://pypi.python.org/pypi/setuptools

In addition to the PyPI downloads, the development version of ``setuptools``
is available from the `Bitbucket repo`_, and in-development versions of the
`0.6 branch`_ are available as well.

.. _Bitbucket repo: https://bitbucket.org/pypa/setuptools/get/default.tar.gz#egg=setuptools-dev
.. _0.6 branch: http://svn.python.org/projects/sandbox/branches/setuptools-0.6/#egg=setuptools-dev06

Uninstalling
============

On Windows, if Setuptools was installed using an ``.exe`` or ``.msi``
installer, simply use the uninstall feature of "Add/Remove Programs" in the
Control Panel.

Otherwise, to uninstall Setuptools or Distribute, regardless of the Python
version, delete all ``setuptools*`` and ``distribute*`` files and
directories from your system's ``site-packages`` directory
(and any other ``sys.path`` directories) FIRST.

If you are upgrading or otherwise plan to re-install Setuptools or Distribute,
nothing further needs to be done. If you want to completely remove Setuptools,
you may also want to remove the 'easy_install' and 'easy_install-x.x' scripts
and associated executables installed to the Python scripts directory.

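A small sketch can help locate the files the steps above refer to. This is an
illustration only (the helper name ``leftover_files`` is ours, not part of
setuptools), and it prints matches rather than deleting anything:

```python
import glob
import os
import site
import sysconfig


def leftover_files():
    """List setuptools*/distribute* entries in site-packages (read-only)."""
    try:
        dirs = site.getsitepackages()
    except AttributeError:
        # Some virtualenvs lack getsitepackages(); fall back to sysconfig.
        dirs = [sysconfig.get_paths()["purelib"]]
    matches = []
    for d in dirs:
        for pattern in ("setuptools*", "distribute*"):
            matches.extend(glob.glob(os.path.join(d, pattern)))
    return matches


for path in leftover_files():
    print(path)
```

Review the printed paths before removing anything by hand.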
--------------------------------
Using Setuptools and EasyInstall
--------------------------------

Here are some of the available manuals, tutorials, and other resources for
learning about Setuptools, Python Eggs, and EasyInstall:

* `The EasyInstall user's guide and reference manual`_
* `The setuptools Developer's Guide`_
* `The pkg_resources API reference`_
* `The Internal Structure of Python Eggs`_

Questions, comments, and bug reports should be directed to the `distutils-sig
mailing list`_. If you have written (or know of) any tutorials, documentation,
plug-ins, or other resources for setuptools users, please let us know about
them there, so this reference list can be updated. If you have working,
*tested* patches to correct problems or add features, you may submit them to
the `setuptools bug tracker`_.

.. _setuptools bug tracker: https://bitbucket.org/pypa/setuptools/issues
.. _The Internal Structure of Python Eggs: https://pythonhosted.org/setuptools/formats.html
.. _The setuptools Developer's Guide: https://pythonhosted.org/setuptools/setuptools.html
.. _The pkg_resources API reference: https://pythonhosted.org/setuptools/pkg_resources.html
.. _The EasyInstall user's guide and reference manual: https://pythonhosted.org/setuptools/easy_install.html
.. _distutils-sig mailing list: http://mail.python.org/pipermail/distutils-sig/

-------
Credits
-------

* The original design for the ``.egg`` format and the ``pkg_resources`` API was
  co-created by Phillip Eby and Bob Ippolito. Bob also implemented the first
  version of ``pkg_resources``, and supplied the OS X operating system version
  compatibility algorithm.

* Ian Bicking implemented many early "creature comfort" features of
  easy_install, including support for downloading via Sourceforge and
  Subversion repositories. Ian's comments on the Web-SIG about WSGI
  application deployment also inspired the concept of "entry points" in eggs,
  and he has given talks at PyCon and elsewhere to inform and educate the
  community about eggs and setuptools.

* Jim Fulton contributed time and effort to build automated tests of various
  aspects of ``easy_install``, and supplied the doctests for the command-line
  ``.exe`` wrappers on Windows.

* Phillip J. Eby is the seminal author of setuptools, and
  first proposed the idea of an importable binary distribution format for
  Python application plug-ins.

* Significant parts of the implementation of setuptools were funded by the Open
  Source Applications Foundation, to provide a plug-in infrastructure for the
  Chandler PIM application. In addition, many OSAF staffers (such as Mike
  "Code Bear" Taylor) contributed their time and stress as guinea pigs for the
  use of eggs and setuptools, even before eggs were "cool". (Thanks, guys!)

* Tarek Ziadé is the principal author of the Distribute fork, which
  re-invigorated the community on the project, encouraged renewed innovation,
  and addressed many defects.

* Since the merge with Distribute, Jason R. Coombs is the
  maintainer of setuptools. The project is maintained in coordination with
  the Python Packaging Authority (PyPA) and the larger Python community.

.. _files:

---------------
Code of Conduct
---------------

Everyone interacting in the setuptools project's codebases, issue trackers,
chat rooms, and mailing lists is expected to follow the
`PyPA Code of Conduct`_.

.. _PyPA Code of Conduct: https://www.pypa.io/en/latest/code-of-conduct/

124
lib/python3.4/site-packages/setuptools-18.5.dist-info/RECORD
Normal file

@ -0,0 +1,124 @@
easy_install.py,sha256=MDC9vt5AxDsXX5qcKlBz2TnW6Tpuv_AobnfhCJ9X3PM,126
_markerlib/__init__.py,sha256=GSmhZqvAitLJHhSgtqqusfq2nJ_ClP3oy3Lm0uZLIsU,552
_markerlib/markers.py,sha256=YuFp0-osufFIoqnzG3L0Z2fDCx4Vln3VUDeXJ2DA_1I,3979
pkg_resources/__init__.py,sha256=HXzgG0K1DI6hcI-gkMYoq6hPsoERSezP32FMP0Hxt00,106833
pkg_resources/_vendor/__init__.py,sha256=47DEQpj8HBSa-_TImW-5JCeuQeRkm5NMpJWZG3hSuFU,0
pkg_resources/_vendor/packaging/__about__.py,sha256=YzdrW-1lWmyCBDyrcNkZbJo4tiDWXpoiqPjfyCYMzIE,1073
pkg_resources/_vendor/packaging/__init__.py,sha256=2V8n-eEpSgBuXlV8hlMmhU7ZklpsrrusWMZNp2gC4Hs,906
pkg_resources/_vendor/packaging/_compat.py,sha256=wofog8iYo_zudt_10i6JiXKHDs5GhCuXC09hCuSJiv4,1253
pkg_resources/_vendor/packaging/_structures.py,sha256=93YvgrEE2HgFp8AdXy0pwCRVnZeutRHO_-puJ7T0cPw,1809
pkg_resources/_vendor/packaging/specifiers.py,sha256=UV9T01_kKloA8PSeMI3HTYBSJ_4KLs00yLvrlciZ3yU,28079
pkg_resources/_vendor/packaging/version.py,sha256=dEGrWZJZ6sef1xMxSfDCego2hS3Q86by0hUIFVk-AGc,11949
setuptools/__init__.py,sha256=3k29_xXPwjFtkdBbuqaQ-VA3_Mdqyq4ADDIwfsc9ISo,5424
setuptools/archive_util.py,sha256=N30WE5ZQjkytzhAodAXw4FkK-9J5AP1ChrClHnZthOA,6609
setuptools/cli-32.exe,sha256=dfEuovMNnA2HLa3jRfMPVi5tk4R7alCbpTvuxtCyw0Y,65536
setuptools/cli-64.exe,sha256=KLABu5pyrnokJCv6skjXZ6GsXeyYHGcqOUT3oHI3Xpo,74752
setuptools/cli-arm-32.exe,sha256=0pFNIi2SmY2gdY91Y4LRhj1wuBsnv5cG1fus3iBJv40,69120
setuptools/cli.exe,sha256=dfEuovMNnA2HLa3jRfMPVi5tk4R7alCbpTvuxtCyw0Y,65536
setuptools/compat.py,sha256=-Hl58PuLOEHUDM3-qsgmk50qNiyLLF3RgBgJ-eGGZG0,2094
setuptools/depends.py,sha256=gMRnrqQSr_Yp_wf09O88vKSQah1YjjEi5PsDNezM2Hs,6370
setuptools/dist.py,sha256=alEPOcofbGQSfkVQk6h0yEGNyKiQyCvNQa5YmnUm6wU,35320
setuptools/extension.py,sha256=nQ9GFTKxRIwmE1W7t1ZSBmuPAUJK_gVRYOCwxA1L38U,1649
setuptools/gui-32.exe,sha256=XBr0bHMA6Hpz2s9s9Bzjl-PwXfa9nH4ie0rFn4V2kWA,65536
setuptools/gui-64.exe,sha256=aYKMhX1IJLn4ULHgWX0sE0yREUt6B3TEHf_jOw6yNyE,75264
setuptools/gui-arm-32.exe,sha256=R5gRWLkY7wvO_CVGxoi7LZVTv0h-DKsKScy6fkbp4XI,69120
setuptools/gui.exe,sha256=XBr0bHMA6Hpz2s9s9Bzjl-PwXfa9nH4ie0rFn4V2kWA,65536
setuptools/lib2to3_ex.py,sha256=6jPF9sJuHiz0cyg4cwIBLl2VMAxcl3GYSZwWAOuJplU,1998
setuptools/msvc9_support.py,sha256=fo2vjb-dna1SEuHezQCTuelCo6XFBv5cqaI56ABJ1vw,2187
setuptools/package_index.py,sha256=aavWGle1RtmelMjeTOxjFoGpEdGdGY4hfRnLHUkbD0c,38760
setuptools/py26compat.py,sha256=ggKS8_aZWWHHS792vF3uXp5vmUkGNk3vjLreLTHr_-Q,431
setuptools/py27compat.py,sha256=CGj-jZcFgHUkrEdLvArkxHj96tAaMbG2-yJtUVU7QVI,306
setuptools/py31compat.py,sha256=O3X_wdWrvXTifeSFbRaCMuc23cDhMHJn7QlITb5cQ8E,1637
setuptools/sandbox.py,sha256=Gg5UfpsC5xsg_1x68pWRXG4t3nRWNpTk7m13lrnMTKw,13925
setuptools/script (dev).tmpl,sha256=f7MR17dTkzaqkCMSVseyOCMVrPVSMdmTQsaB8cZzfuI,201
setuptools/script.tmpl,sha256=WGTt5piezO27c-Dbx6l5Q4T3Ff20A5z7872hv3aAhYY,138
setuptools/site-patch.py,sha256=K-0-cAx36mX_PG-qPZwosG9ZLCliRjquKQ4nHiJvvzg,2389
setuptools/ssl_support.py,sha256=FASqXlRCmXAi6LUWLUIo0u14MpJqHBgkOc5KPHSRrtI,8044
setuptools/unicode_utils.py,sha256=gvhAHRj1LELCz-1MP3rfXGi__O1CAm5aksO9Njd2lpU,981
setuptools/utils.py,sha256=08Z7mt-9mvrx-XvmS5EyKoRn2lxNTlgFsUwBU3Eq9JQ,293
setuptools/version.py,sha256=puctVnBYQ-XKHe7xBHqCJwzHsPjWBWn-FQt0kRDJjEg,21
setuptools/windows_support.py,sha256=5GrfqSP2-dLGJoZTq2g6dCKkyQxxa2n5IQiXlJCoYEE,714
setuptools/command/__init__.py,sha256=gQMXoLa0TtUtmUZY0ptSouWWA5kcTArWyDQ6QwkjoVQ,554
setuptools/command/alias.py,sha256=1sLQxZcNh6dDQpDmm4G7UGGTol83nY1NTPmNBbm2siI,2381
setuptools/command/bdist_egg.py,sha256=3eblnHDm1t8Hwh8K5z1QaWOVkxUvxQc40KV_YZVHNFs,17184
setuptools/command/bdist_rpm.py,sha256=B7l0TnzCGb-0nLlm6rS00jWLkojASwVmdhW2w5Qz_Ak,1508
setuptools/command/bdist_wininst.py,sha256=_6dz3lpB1tY200LxKPLM7qgwTCceOMgaWFF-jW2-pm0,637
setuptools/command/build_ext.py,sha256=pkQ8xp3YPVGGLkGv-SvfxC_GqFpboph1AFEoMFOgQMo,11964
setuptools/command/build_py.py,sha256=uTgiBroMgyV-Lq4Kt42PLWQknM9G8c8_6TiDv4H5_Sw,7915
setuptools/command/develop.py,sha256=uyRwABU1JnhQhZO9rS8-nenkzLwKKJt2P7WPnsXrHd4,6610
setuptools/command/easy_install.py,sha256=_gRt2BDjiJpHuDPJzOFOTThWjspKy7NYIV_Br_PmyB0,87190
setuptools/command/egg_info.py,sha256=7AEcwMZQ5zl48_Cu_srTxCUqlJBokW10jRlFHZof2fs,16852
setuptools/command/install.py,sha256=QwaFiZRU3ytIHoPh8uJ9EqV3Fu9C4ca4B7UGAo95tws,4685
setuptools/command/install_egg_info.py,sha256=KXNB8O34-rK-cZZZr2fnT8kqNktDmTfUA88X2Iln66c,4001
setuptools/command/install_lib.py,sha256=ntpy-9xiFHfDmXmX_Lfp7nMchw7FpgyP66H7reixI_Y,3771
setuptools/command/install_scripts.py,sha256=vX2JC6v7l090N7CrTfihWBklNbPvfNKAY2LRtukM9XE,2231
setuptools/command/launcher manifest.xml,sha256=xlLbjWrB01tKC0-hlVkOKkiSPbzMml2eOPtJ_ucCnbE,628
setuptools/command/register.py,sha256=bHlMm1qmBbSdahTOT8w6UhA-EgeQIz7p6cD-qOauaiI,270
setuptools/command/rotate.py,sha256=Qm7SOa32L9XG5b_C7_SSYvKM5rqFXroeQ6w8GXIsY2o,2038
setuptools/command/saveopts.py,sha256=za7QCBcQimKKriWcoCcbhxPjUz30gSB74zuTL47xpP4,658
setuptools/command/sdist.py,sha256=rMT2qS0u4GYJtL4IXiYG-ElEa111wqzQVHpv9uE1L5w,7079
setuptools/command/setopt.py,sha256=Z3_kav60D2XHZjM0cRhGo7wbBYo7nr4U_U-wMMbpmu8,5080
setuptools/command/test.py,sha256=yJEniqTzav6R6vimRG3tb7l233rGDSAmFafXIHe9UzU,6562
setuptools/command/upload_docs.py,sha256=di-XRGtxW5TSFYR6nK9XZj3I5JIU4V00SOFRhptdOGc,6782
setuptools-18.5.dist-info/DESCRIPTION.rst,sha256=MDsJej8DPV2OKpAKpu74g-2xksRd-uGTeZn4W7D1dnI,9940
setuptools-18.5.dist-info/METADATA,sha256=AnJr1ZA0xypJknGm_uX312zkByzCrA6_ZyjZDI0d408,11256
setuptools-18.5.dist-info/RECORD,,
setuptools-18.5.dist-info/WHEEL,sha256=GrqQvamwgBV4nLoJe0vhYRSWzWsx7xjlt74FT0SWYfE,110
setuptools-18.5.dist-info/dependency_links.txt,sha256=g1tkmtmOY1n1KRGVLZKBtbJf0CCZ2Jil8uGvMfQRJNE,226
setuptools-18.5.dist-info/entry_points.txt,sha256=xrrbAWSD2o_blM5eb2oXvmCTvfdcjUMunUT4T8C-AAs,2793
setuptools-18.5.dist-info/metadata.json,sha256=kuSZ-oLefqfph4-5BoCXPJKfOx98Dz9L4EgOJ08ZZl0,4678
setuptools-18.5.dist-info/top_level.txt,sha256=7780fzudMJkykiTcIrAQ8m8Lll6kot3EEePye3VJgEE,49
setuptools-18.5.dist-info/zip-safe,sha256=AbpHGcgLb-kRsJGnwFEktk7uzpZOCcBY74-YBdrKVGs,1
/srv/openmedialibrary/platform/Linux_x86_64/home/.local/bin/easy_install,sha256=7h7lc5DCAhnE08UwEm49wGymGDGdOcrkRdncTKYmXIQ,233
/srv/openmedialibrary/platform/Linux_x86_64/home/.local/bin/easy_install-3.4,sha256=7h7lc5DCAhnE08UwEm49wGymGDGdOcrkRdncTKYmXIQ,233
setuptools/__pycache__/package_index.cpython-34.pyc,,
setuptools/__pycache__/msvc9_support.cpython-34.pyc,,
setuptools/command/__pycache__/install.cpython-34.pyc,,
pkg_resources/_vendor/packaging/__pycache__/specifiers.cpython-34.pyc,,
setuptools/__pycache__/depends.cpython-34.pyc,,
setuptools/__pycache__/py26compat.cpython-34.pyc,,
setuptools/__pycache__/ssl_support.cpython-34.pyc,,
pkg_resources/_vendor/packaging/__pycache__/__init__.cpython-34.pyc,,
setuptools/__pycache__/archive_util.cpython-34.pyc,,
setuptools/command/__pycache__/bdist_egg.cpython-34.pyc,,
__pycache__/easy_install.cpython-34.pyc,,
setuptools/__pycache__/compat.cpython-34.pyc,,
pkg_resources/_vendor/packaging/__pycache__/_structures.cpython-34.pyc,,
setuptools/__pycache__/__init__.cpython-34.pyc,,
setuptools/command/__pycache__/egg_info.cpython-34.pyc,,
pkg_resources/_vendor/__pycache__/__init__.cpython-34.pyc,,
setuptools/__pycache__/windows_support.cpython-34.pyc,,
setuptools/command/__pycache__/build_ext.cpython-34.pyc,,
setuptools/command/__pycache__/install_scripts.cpython-34.pyc,,
setuptools/__pycache__/unicode_utils.cpython-34.pyc,,
setuptools/command/__pycache__/register.cpython-34.pyc,,
setuptools/command/__pycache__/install_egg_info.cpython-34.pyc,,
setuptools/command/__pycache__/build_py.cpython-34.pyc,,
setuptools/command/__pycache__/setopt.cpython-34.pyc,,
setuptools/__pycache__/utils.cpython-34.pyc,,
pkg_resources/_vendor/packaging/__pycache__/version.cpython-34.pyc,,
setuptools/command/__pycache__/saveopts.cpython-34.pyc,,
setuptools/__pycache__/py27compat.cpython-34.pyc,,
pkg_resources/__pycache__/__init__.cpython-34.pyc,,
setuptools/command/__pycache__/rotate.cpython-34.pyc,,
setuptools/__pycache__/extension.cpython-34.pyc,,
setuptools/command/__pycache__/bdist_wininst.cpython-34.pyc,,
setuptools/command/__pycache__/test.cpython-34.pyc,,
setuptools/command/__pycache__/upload_docs.cpython-34.pyc,,
_markerlib/__pycache__/__init__.cpython-34.pyc,,
setuptools/command/__pycache__/sdist.cpython-34.pyc,,
_markerlib/__pycache__/markers.cpython-34.pyc,,
setuptools/__pycache__/site-patch.cpython-34.pyc,,
pkg_resources/_vendor/packaging/__pycache__/_compat.cpython-34.pyc,,
setuptools/__pycache__/py31compat.cpython-34.pyc,,
setuptools/__pycache__/sandbox.cpython-34.pyc,,
pkg_resources/_vendor/packaging/__pycache__/__about__.cpython-34.pyc,,
setuptools/__pycache__/dist.cpython-34.pyc,,
setuptools/__pycache__/lib2to3_ex.cpython-34.pyc,,
setuptools/command/__pycache__/bdist_rpm.cpython-34.pyc,,
setuptools/__pycache__/version.cpython-34.pyc,,
setuptools/command/__pycache__/install_lib.cpython-34.pyc,,
setuptools/command/__pycache__/develop.cpython-34.pyc,,
setuptools/command/__pycache__/__init__.cpython-34.pyc,,
setuptools/command/__pycache__/easy_install.cpython-34.pyc,,
setuptools/command/__pycache__/alias.cpython-34.pyc,,

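Each row of the RECORD file above has the form ``path,sha256=<digest>,<size>``,
where the digest is the urlsafe-base64 SHA-256 of the file with the trailing
``=`` padding stripped. A sketch of producing such a row for an arbitrary file
(the helper name is ours):

```python
import base64
import hashlib


def record_entry(path):
    """Format one RECORD row: path,sha256=<urlsafe b64, no '=' padding>,size."""
    with open(path, "rb") as f:
        data = f.read()
    digest = base64.urlsafe_b64encode(hashlib.sha256(data).digest())
    return "{0},sha256={1},{2}".format(
        path, digest.rstrip(b"=").decode("ascii"), len(data))
```

Installers such as pip compare these digests against the files on disk to
detect anything modified after installation.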
@ -0,0 +1,6 @@
Wheel-Version: 1.0
Generator: bdist_wheel (0.26.0)
Root-Is-Purelib: true
Tag: py2-none-any
Tag: py3-none-any

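The WHEEL file above uses the same email-header syntax as PKG-INFO, so the
stdlib email parser reads it directly; the repeated ``Tag`` key comes back as a
list:

```python
import email

# The WHEEL payload, reproduced inline for the demonstration.
wheel_text = """Wheel-Version: 1.0
Generator: bdist_wheel (0.26.0)
Root-Is-Purelib: true
Tag: py2-none-any
Tag: py3-none-any
"""

msg = email.message_from_string(wheel_text)
print(msg["Wheel-Version"])  # 1.0
print(msg.get_all("Tag"))    # ['py2-none-any', 'py3-none-any']
```

The two tags are how one wheel declares compatibility with both Python 2 and
Python 3 interpreters.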
@ -0,0 +1,2 @@
https://pypi.python.org/packages/source/c/certifi/certifi-2015.04.28.tar.gz#md5=12c7c3a063b2ff97a0f8291d8de41e8c
https://pypi.python.org/packages/source/w/wincertstore/wincertstore-0.2.zip#md5=ae728f2f007185648d0c7a8679b361e2
@ -0,0 +1,61 @@
[console_scripts]
easy_install = setuptools.command.easy_install:main
easy_install-3.5 = setuptools.command.easy_install:main

[distutils.commands]
alias = setuptools.command.alias:alias
bdist_egg = setuptools.command.bdist_egg:bdist_egg
bdist_rpm = setuptools.command.bdist_rpm:bdist_rpm
bdist_wininst = setuptools.command.bdist_wininst:bdist_wininst
build_ext = setuptools.command.build_ext:build_ext
build_py = setuptools.command.build_py:build_py
develop = setuptools.command.develop:develop
easy_install = setuptools.command.easy_install:easy_install
egg_info = setuptools.command.egg_info:egg_info
install = setuptools.command.install:install
install_egg_info = setuptools.command.install_egg_info:install_egg_info
install_lib = setuptools.command.install_lib:install_lib
install_scripts = setuptools.command.install_scripts:install_scripts
register = setuptools.command.register:register
rotate = setuptools.command.rotate:rotate
saveopts = setuptools.command.saveopts:saveopts
sdist = setuptools.command.sdist:sdist
setopt = setuptools.command.setopt:setopt
test = setuptools.command.test:test
upload_docs = setuptools.command.upload_docs:upload_docs

[distutils.setup_keywords]
convert_2to3_doctests = setuptools.dist:assert_string_list
dependency_links = setuptools.dist:assert_string_list
eager_resources = setuptools.dist:assert_string_list
entry_points = setuptools.dist:check_entry_points
exclude_package_data = setuptools.dist:check_package_data
extras_require = setuptools.dist:check_extras
include_package_data = setuptools.dist:assert_bool
install_requires = setuptools.dist:check_requirements
namespace_packages = setuptools.dist:check_nsp
package_data = setuptools.dist:check_package_data
packages = setuptools.dist:check_packages
setup_requires = setuptools.dist:check_requirements
test_loader = setuptools.dist:check_importable
test_runner = setuptools.dist:check_importable
test_suite = setuptools.dist:check_test_suite
tests_require = setuptools.dist:check_requirements
use_2to3 = setuptools.dist:assert_bool
use_2to3_exclude_fixers = setuptools.dist:assert_string_list
use_2to3_fixers = setuptools.dist:assert_string_list
zip_safe = setuptools.dist:assert_bool

[egg_info.writers]
PKG-INFO = setuptools.command.egg_info:write_pkg_info
dependency_links.txt = setuptools.command.egg_info:overwrite_arg
depends.txt = setuptools.command.egg_info:warn_depends_obsolete
eager_resources.txt = setuptools.command.egg_info:overwrite_arg
entry_points.txt = setuptools.command.egg_info:write_entries
namespace_packages.txt = setuptools.command.egg_info:overwrite_arg
requires.txt = setuptools.command.egg_info:write_requirements
top_level.txt = setuptools.command.egg_info:write_toplevel_names

[setuptools.installation]
eggsecutable = setuptools.command.easy_install:bootstrap

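Each line in the sections above is an entry point of the form
``name = module:attribute``; ``pkg_resources`` (bundled with setuptools) can
parse one directly:

```python
from pkg_resources import EntryPoint

# Parse one line from the [console_scripts] section above.
ep = EntryPoint.parse("easy_install = setuptools.command.easy_install:main")
print(ep.name)         # easy_install
print(ep.module_name)  # setuptools.command.easy_install
print(ep.attrs)        # ('main',)
```

At install time, each ``console_scripts`` entry becomes an executable wrapper
that imports ``module_name`` and calls the named attribute.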
@ -0,0 +1 @@
{"generator": "bdist_wheel (0.26.0)", "summary": "Easily download, build, install, upgrade, and uninstall Python packages", "classifiers": ["Development Status :: 5 - Production/Stable", "Intended Audience :: Developers", "License :: OSI Approved :: Python Software Foundation License", "License :: OSI Approved :: Zope Public License", "Operating System :: OS Independent", "Programming Language :: Python :: 2.6", "Programming Language :: Python :: 2.7", "Programming Language :: Python :: 3", "Programming Language :: Python :: 3.3", "Programming Language :: Python :: 3.4", "Programming Language :: Python :: 3.5", "Topic :: Software Development :: Libraries :: Python Modules", "Topic :: System :: Archiving :: Packaging", "Topic :: System :: Systems Administration", "Topic :: Utilities"], "extensions": {"python.details": {"project_urls": {"Home": "https://bitbucket.org/pypa/setuptools"}, "contacts": [{"email": "distutils-sig@python.org", "name": "Python Packaging Authority", "role": "author"}], "document_names": {"description": "DESCRIPTION.rst"}}, "python.exports": {"console_scripts": {"easy_install": "setuptools.command.easy_install:main", "easy_install-3.5": "setuptools.command.easy_install:main"}, "distutils.commands": {"alias": "setuptools.command.alias:alias", "bdist_egg": "setuptools.command.bdist_egg:bdist_egg", "bdist_rpm": "setuptools.command.bdist_rpm:bdist_rpm", "bdist_wininst": "setuptools.command.bdist_wininst:bdist_wininst", "build_ext": "setuptools.command.build_ext:build_ext", "build_py": "setuptools.command.build_py:build_py", "develop": "setuptools.command.develop:develop", "easy_install": "setuptools.command.easy_install:easy_install", "egg_info": "setuptools.command.egg_info:egg_info", "install": "setuptools.command.install:install", "install_egg_info": "setuptools.command.install_egg_info:install_egg_info", "install_lib": "setuptools.command.install_lib:install_lib", "install_scripts": "setuptools.command.install_scripts:install_scripts", 
"register": "setuptools.command.register:register", "rotate": "setuptools.command.rotate:rotate", "saveopts": "setuptools.command.saveopts:saveopts", "sdist": "setuptools.command.sdist:sdist", "setopt": "setuptools.command.setopt:setopt", "test": "setuptools.command.test:test", "upload_docs": "setuptools.command.upload_docs:upload_docs"}, "distutils.setup_keywords": {"convert_2to3_doctests": "setuptools.dist:assert_string_list", "dependency_links": "setuptools.dist:assert_string_list", "eager_resources": "setuptools.dist:assert_string_list", "entry_points": "setuptools.dist:check_entry_points", "exclude_package_data": "setuptools.dist:check_package_data", "extras_require": "setuptools.dist:check_extras", "include_package_data": "setuptools.dist:assert_bool", "install_requires": "setuptools.dist:check_requirements", "namespace_packages": "setuptools.dist:check_nsp", "package_data": "setuptools.dist:check_package_data", "packages": "setuptools.dist:check_packages", "setup_requires": "setuptools.dist:check_requirements", "test_loader": "setuptools.dist:check_importable", "test_runner": "setuptools.dist:check_importable", "test_suite": "setuptools.dist:check_test_suite", "tests_require": "setuptools.dist:check_requirements", "use_2to3": "setuptools.dist:assert_bool", "use_2to3_exclude_fixers": "setuptools.dist:assert_string_list", "use_2to3_fixers": "setuptools.dist:assert_string_list", "zip_safe": "setuptools.dist:assert_bool"}, "egg_info.writers": {"PKG-INFO": "setuptools.command.egg_info:write_pkg_info", "dependency_links.txt": "setuptools.command.egg_info:overwrite_arg", "depends.txt": "setuptools.command.egg_info:warn_depends_obsolete", "eager_resources.txt": "setuptools.command.egg_info:overwrite_arg", "entry_points.txt": "setuptools.command.egg_info:write_entries", "namespace_packages.txt": "setuptools.command.egg_info:overwrite_arg", "requires.txt": "setuptools.command.egg_info:write_requirements", "top_level.txt": 
"setuptools.command.egg_info:write_toplevel_names"}, "setuptools.installation": {"eggsecutable": "setuptools.command.easy_install:bootstrap"}}, "python.commands": {"wrap_console": {"easy_install": "setuptools.command.easy_install:main", "easy_install-3.5": "setuptools.command.easy_install:main"}}}, "keywords": ["CPAN", "PyPI", "distutils", "eggs", "package", "management"], "license": "PSF or ZPL", "metadata_version": "2.0", "name": "setuptools", "extras": ["certs", "ssl"], "run_requires": [{"requires": ["certifi (==2015.04.28)"], "extra": "certs"}, {"requires": ["wincertstore (==0.2)"], "extra": "ssl", "environment": "sys_platform=='win32'"}], "version": "18.5", "test_requires": [{"requires": ["pytest", "setuptools[ssl]"]}]}
@ -0,0 +1,4 @@
_markerlib
easy_install
pkg_resources
setuptools
@ -0,0 +1 @@

168
lib/python3.4/site-packages/setuptools/__init__.py
Normal file

@ -0,0 +1,168 @@
"""Extensions to the 'distutils' for large or complex distributions"""

import os
import functools
import distutils.core
import distutils.filelist
from distutils.core import Command as _Command
from distutils.util import convert_path
from fnmatch import fnmatchcase

import setuptools.version
from setuptools.extension import Extension
from setuptools.dist import Distribution, Feature, _get_unpatched
from setuptools.depends import Require
from setuptools.compat import filterfalse

__all__ = [
    'setup', 'Distribution', 'Feature', 'Command', 'Extension', 'Require',
    'find_packages'
]

__version__ = setuptools.version.__version__

bootstrap_install_from = None

# If we run 2to3 on .py files, should we also convert docstrings?
# Default: yes; assume that we can detect doctests reliably
run_2to3_on_doctests = True
# Standard package names for fixer packages
lib2to3_fixer_packages = ['lib2to3.fixes']


class PackageFinder(object):
    @classmethod
    def find(cls, where='.', exclude=(), include=('*',)):
        """Return a list of all Python packages found within directory 'where'

        'where' should be supplied as a "cross-platform" (i.e. URL-style)
        path; it will be converted to the appropriate local path syntax.
        'exclude' is a sequence of package names to exclude; '*' can be used
        as a wildcard in the names, such that 'foo.*' will exclude all
        subpackages of 'foo' (but not 'foo' itself).

        'include' is a sequence of package names to include. If it's
        specified, only the named packages will be included. If it's not
        specified, all found packages will be included. 'include' can contain
        shell style wildcard patterns just like 'exclude'.

        The list of included packages is built up first and then any
        explicitly excluded packages are removed from it.
        """
        out = cls._find_packages_iter(convert_path(where))
        out = cls.require_parents(out)
        includes = cls._build_filter(*include)
        excludes = cls._build_filter('ez_setup', '*__pycache__', *exclude)
        out = filter(includes, out)
        out = filterfalse(excludes, out)
        return list(out)

    @staticmethod
    def require_parents(packages):
        """
        Exclude any apparent package that doesn't include its
        parent.

        For example, exclude 'foo.bar' if 'foo' is not present.
        """
        found = []
        for pkg in packages:
            base, sep, child = pkg.rpartition('.')
            if base and base not in found:
                continue
            found.append(pkg)
            yield pkg

    @staticmethod
    def _candidate_dirs(base_path):
        """
        Return all dirs in base_path that might be packages.
        """
        has_dot = lambda name: '.' in name
        for root, dirs, files in os.walk(base_path, followlinks=True):
            # Exclude directories that contain a period, as they cannot be
            # packages. Mutate the list to avoid traversal.
            dirs[:] = filterfalse(has_dot, dirs)
            for dir in dirs:
                yield os.path.relpath(os.path.join(root, dir), base_path)

    @classmethod
    def _find_packages_iter(cls, base_path):
        candidates = cls._candidate_dirs(base_path)
        return (
            path.replace(os.path.sep, '.')
            for path in candidates
            if cls._looks_like_package(os.path.join(base_path, path))
        )

    @staticmethod
    def _looks_like_package(path):
        return os.path.isfile(os.path.join(path, '__init__.py'))

    @staticmethod
    def _build_filter(*patterns):
        """
        Given a list of patterns, return a callable that will be true only if
        the input matches one of the patterns.
        """
        return lambda name: any(fnmatchcase(name, pat=pat) for pat in patterns)


class PEP420PackageFinder(PackageFinder):
    @staticmethod
    def _looks_like_package(path):
        return True


find_packages = PackageFinder.find

setup = distutils.core.setup

_Command = _get_unpatched(_Command)


class Command(_Command):
    __doc__ = _Command.__doc__

    command_consumes_arguments = False

    def __init__(self, dist, **kw):
        """
        Construct the command for dist, updating
        vars(self) with any keyword parameters.
|
||||||
|
"""
|
||||||
|
_Command.__init__(self, dist)
|
||||||
|
vars(self).update(kw)
|
||||||
|
|
||||||
|
def reinitialize_command(self, command, reinit_subcommands=0, **kw):
|
||||||
|
cmd = _Command.reinitialize_command(self, command, reinit_subcommands)
|
||||||
|
vars(cmd).update(kw)
|
||||||
|
return cmd
|
||||||
|
|
||||||
|
# we can't patch distutils.cmd, alas
|
||||||
|
distutils.core.Command = Command
|
||||||
|
|
||||||
|
|
||||||
|
def _find_all_simple(path):
|
||||||
|
"""
|
||||||
|
Find all files under 'path'
|
||||||
|
"""
|
||||||
|
results = (
|
||||||
|
os.path.join(base, file)
|
||||||
|
for base, dirs, files in os.walk(path, followlinks=True)
|
||||||
|
for file in files
|
||||||
|
)
|
||||||
|
return filter(os.path.isfile, results)
|
||||||
|
|
||||||
|
|
||||||
|
def findall(dir=os.curdir):
|
||||||
|
"""
|
||||||
|
Find all files under 'dir' and return the list of full filenames.
|
||||||
|
Unless dir is '.', return full filenames with dir prepended.
|
||||||
|
"""
|
||||||
|
files = _find_all_simple(dir)
|
||||||
|
if dir == os.curdir:
|
||||||
|
make_rel = functools.partial(os.path.relpath, start=dir)
|
||||||
|
files = map(make_rel, files)
|
||||||
|
return list(files)
|
||||||
|
|
||||||
|
|
||||||
|
# fix findall bug in distutils (http://bugs.python.org/issue12885)
|
||||||
|
distutils.filelist.findall = findall
|
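The `PackageFinder` logic above is exposed as `setuptools.find_packages`. A minimal sketch of its behavior on a throwaway tree (the directory and file names here are hypothetical, chosen only to show that a directory without `__init__.py` is not reported):

```python
import os
import tempfile

from setuptools import find_packages

# Build a throwaway layout: a package, a subpackage, and a plain directory.
root = tempfile.mkdtemp()
for d in ('pkg', os.path.join('pkg', 'sub'), 'scripts'):
    os.makedirs(os.path.join(root, d))
for init in (os.path.join('pkg', '__init__.py'),
             os.path.join('pkg', 'sub', '__init__.py')):
    open(os.path.join(root, init), 'w').close()

# 'scripts' has no __init__.py, so _looks_like_package rejects it.
print(sorted(find_packages(root)))  # ['pkg', 'pkg.sub']
```

Note that `require_parents` would also drop `pkg.sub` if `pkg` itself were excluded from the results.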
170
lib/python3.4/site-packages/setuptools/archive_util.py
Normal file
@ -0,0 +1,170 @@
"""Utilities for extracting common archive formats"""


__all__ = [
    "unpack_archive", "unpack_zipfile", "unpack_tarfile", "default_filter",
    "UnrecognizedFormat", "extraction_drivers", "unpack_directory",
]

import zipfile
import tarfile
import os
import shutil
import posixpath
import contextlib
from pkg_resources import ensure_directory, ContextualZipFile
from distutils.errors import DistutilsError


class UnrecognizedFormat(DistutilsError):
    """Couldn't recognize the archive type"""


def default_filter(src, dst):
    """The default progress/filter callback; returns the destination unchanged"""
    return dst


def unpack_archive(filename, extract_dir, progress_filter=default_filter,
                   drivers=None):
    """Unpack `filename` to `extract_dir`, or raise ``UnrecognizedFormat``

    `progress_filter` is a function taking two arguments: a source path
    internal to the archive ('/'-separated), and a filesystem path where it
    will be extracted. The callback must return the desired extract path
    (which may be the same as the one passed in), or else ``None`` to skip
    that file or directory. The callback can thus be used to report on the
    progress of the extraction, as well as to filter the items extracted or
    alter their extraction paths.

    `drivers`, if supplied, must be a non-empty sequence of functions with the
    same signature as this function (minus the `drivers` argument), that raise
    ``UnrecognizedFormat`` if they do not support extracting the designated
    archive type. The `drivers` are tried in sequence until one is found that
    does not raise an error, or until all are exhausted (in which case
    ``UnrecognizedFormat`` is raised). If you do not supply a sequence of
    drivers, the module's ``extraction_drivers`` constant will be used, which
    means that ``unpack_zipfile`` and ``unpack_tarfile`` will be tried, in that
    order.
    """
    for driver in drivers or extraction_drivers:
        try:
            driver(filename, extract_dir, progress_filter)
        except UnrecognizedFormat:
            continue
        else:
            return
    else:
        raise UnrecognizedFormat(
            "Not a recognized archive type: %s" % filename
        )


def unpack_directory(filename, extract_dir, progress_filter=default_filter):
    """"Unpack" a directory, using the same interface as for archives

    Raises ``UnrecognizedFormat`` if `filename` is not a directory
    """
    if not os.path.isdir(filename):
        raise UnrecognizedFormat("%s is not a directory" % filename)

    paths = {
        filename: ('', extract_dir),
    }
    for base, dirs, files in os.walk(filename):
        src, dst = paths[base]
        for d in dirs:
            paths[os.path.join(base, d)] = src + d + '/', os.path.join(dst, d)
        for f in files:
            target = os.path.join(dst, f)
            target = progress_filter(src + f, target)
            if not target:
                # skip files rejected by the filter
                continue
            ensure_directory(target)
            f = os.path.join(base, f)
            shutil.copyfile(f, target)
            shutil.copystat(f, target)


def unpack_zipfile(filename, extract_dir, progress_filter=default_filter):
    """Unpack zip `filename` to `extract_dir`

    Raises ``UnrecognizedFormat`` if `filename` is not a zipfile (as determined
    by ``zipfile.is_zipfile()``). See ``unpack_archive()`` for an explanation
    of the `progress_filter` argument.
    """

    if not zipfile.is_zipfile(filename):
        raise UnrecognizedFormat("%s is not a zip file" % (filename,))

    with ContextualZipFile(filename) as z:
        for info in z.infolist():
            name = info.filename

            # don't extract absolute paths or ones with .. in them
            if name.startswith('/') or '..' in name.split('/'):
                continue

            target = os.path.join(extract_dir, *name.split('/'))
            target = progress_filter(name, target)
            if not target:
                continue
            if name.endswith('/'):
                # directory
                ensure_directory(target)
            else:
                # file
                ensure_directory(target)
                data = z.read(info.filename)
                with open(target, 'wb') as f:
                    f.write(data)
            unix_attributes = info.external_attr >> 16
            if unix_attributes:
                os.chmod(target, unix_attributes)


def unpack_tarfile(filename, extract_dir, progress_filter=default_filter):
    """Unpack tar/tar.gz/tar.bz2 `filename` to `extract_dir`

    Raises ``UnrecognizedFormat`` if `filename` is not a tarfile (as determined
    by ``tarfile.open()``). See ``unpack_archive()`` for an explanation
    of the `progress_filter` argument.
    """
    try:
        tarobj = tarfile.open(filename)
    except tarfile.TarError:
        raise UnrecognizedFormat(
            "%s is not a compressed or uncompressed tar file" % (filename,)
        )
    with contextlib.closing(tarobj):
        # don't do any chowning!
        tarobj.chown = lambda *args: None
        for member in tarobj:
            name = member.name
            # don't extract absolute paths or ones with .. in them
            if not name.startswith('/') and '..' not in name.split('/'):
                prelim_dst = os.path.join(extract_dir, *name.split('/'))

                # resolve any links and extract the link targets as normal
                # files
                while member is not None and (member.islnk() or member.issym()):
                    linkpath = member.linkname
                    if member.issym():
                        base = posixpath.dirname(member.name)
                        linkpath = posixpath.join(base, linkpath)
                    linkpath = posixpath.normpath(linkpath)
                    member = tarobj._getmember(linkpath)

                if member is not None and (member.isfile() or member.isdir()):
                    final_dst = progress_filter(name, prelim_dst)
                    if final_dst:
                        if final_dst.endswith(os.sep):
                            final_dst = final_dst[:-1]
                        try:
                            # XXX Ugh
                            tarobj._extract_member(member, final_dst)
                        except tarfile.ExtractError:
                            # chown/chmod/mkfifo/mknode/makedev failed
                            pass
    return True


extraction_drivers = unpack_directory, unpack_zipfile, unpack_tarfile
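The safe-extraction loop in `unpack_zipfile` can be sketched with only the standard library. This is a simplified standalone copy (regular files only, hypothetical archive contents), not the module above, but it shows both the `..`/absolute-path guard and the "return `None` to skip" filter contract:

```python
import os
import tempfile
import zipfile

def unpack_zipfile(filename, extract_dir, progress_filter):
    # Simplified copy of the extraction loop above (regular files only).
    with zipfile.ZipFile(filename) as z:
        for info in z.infolist():
            name = info.filename
            # refuse absolute paths and parent-directory escapes
            if name.startswith('/') or '..' in name.split('/'):
                continue
            target = os.path.join(extract_dir, *name.split('/'))
            # returning None from the filter skips the member
            target = progress_filter(name, target)
            if not target or name.endswith('/'):
                continue
            os.makedirs(os.path.dirname(target), exist_ok=True)
            with open(target, 'wb') as f:
                f.write(z.read(info.filename))

work = tempfile.mkdtemp()
archive = os.path.join(work, 'demo.zip')
with zipfile.ZipFile(archive, 'w') as z:
    z.writestr('demo/hello.txt', 'hi')
    z.writestr('demo/skip.txt', 'not wanted')
    z.writestr('../evil.txt', 'blocked')  # rejected by the '..' guard

dest = os.path.join(work, 'out')
unpack_zipfile(archive, dest,
               lambda src, dst: None if src.endswith('skip.txt') else dst)
print(sorted(os.listdir(os.path.join(dest, 'demo'))))  # ['hello.txt']
```

The real module additionally restores Unix permission bits from `info.external_attr` and handles explicit directory entries.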
BIN
lib/python3.4/site-packages/setuptools/cli-32.exe
Normal file
Binary file not shown.
BIN
lib/python3.4/site-packages/setuptools/cli-64.exe
Normal file
Binary file not shown.
BIN
lib/python3.4/site-packages/setuptools/cli-arm-32.exe
Normal file
Binary file not shown.
BIN
lib/python3.4/site-packages/setuptools/cli.exe
Normal file
Binary file not shown.
18
lib/python3.4/site-packages/setuptools/command/__init__.py
Normal file
@ -0,0 +1,18 @@
__all__ = [
    'alias', 'bdist_egg', 'bdist_rpm', 'build_ext', 'build_py', 'develop',
    'easy_install', 'egg_info', 'install', 'install_lib', 'rotate', 'saveopts',
    'sdist', 'setopt', 'test', 'install_egg_info', 'install_scripts',
    'register', 'bdist_wininst', 'upload_docs',
]

from distutils.command.bdist import bdist
import sys

from setuptools.command import install_scripts


if 'egg' not in bdist.format_commands:
    bdist.format_command['egg'] = ('bdist_egg', "Python .egg file")
    bdist.format_commands.append('egg')

del bdist, sys
78
lib/python3.4/site-packages/setuptools/command/alias.py
Normal file
@ -0,0 +1,78 @@
from distutils.errors import DistutilsOptionError

from setuptools.command.setopt import edit_config, option_base, config_file


def shquote(arg):
    """Quote an argument for later parsing by shlex.split()"""
    for c in '"', "'", "\\", "#":
        if c in arg:
            return repr(arg)
    if arg.split() != [arg]:
        return repr(arg)
    return arg


class alias(option_base):
    """Define a shortcut that invokes one or more commands"""

    description = "define a shortcut to invoke one or more commands"
    command_consumes_arguments = True

    user_options = [
        ('remove', 'r', 'remove (unset) the alias'),
    ] + option_base.user_options

    boolean_options = option_base.boolean_options + ['remove']

    def initialize_options(self):
        option_base.initialize_options(self)
        self.args = None
        self.remove = None

    def finalize_options(self):
        option_base.finalize_options(self)
        if self.remove and len(self.args) != 1:
            raise DistutilsOptionError(
                "Must specify exactly one argument (the alias name) when "
                "using --remove"
            )

    def run(self):
        aliases = self.distribution.get_option_dict('aliases')

        if not self.args:
            print("Command Aliases")
            print("---------------")
            for alias in aliases:
                print("setup.py alias", format_alias(alias, aliases))
            return

        elif len(self.args) == 1:
            alias, = self.args
            if self.remove:
                command = None
            elif alias in aliases:
                print("setup.py alias", format_alias(alias, aliases))
                return
            else:
                print("No alias definition found for %r" % alias)
                return
        else:
            alias = self.args[0]
            command = ' '.join(map(shquote, self.args[1:]))

        edit_config(self.filename, {'aliases': {alias: command}}, self.dry_run)


def format_alias(name, aliases):
    source, command = aliases[name]
    if source == config_file('global'):
        source = '--global-config '
    elif source == config_file('user'):
        source = '--user-config '
    elif source == config_file('local'):
        source = ''
    else:
        source = '--filename=%r' % source
    return source + name + ' ' + command
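A quick round-trip check of `shquote`, using a standalone copy of the helper (the command arguments are hypothetical): any argument containing whitespace or shell metacharacters comes back as a `repr` literal that `shlex.split` turns back into the original token.

```python
import shlex

def shquote(arg):
    # Standalone copy of the helper above.
    for c in '"', "'", "\\", "#":
        if c in arg:
            return repr(arg)
    if arg.split() != [arg]:
        return repr(arg)
    return arg

cmd = ' '.join(map(shquote, ['sdist', '--formats', 'gztar zip']))
print(cmd)               # sdist --formats 'gztar zip'
print(shlex.split(cmd))  # ['sdist', '--formats', 'gztar zip']
```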
470
lib/python3.4/site-packages/setuptools/command/bdist_egg.py
Normal file
@ -0,0 +1,470 @@
"""setuptools.command.bdist_egg

Build .egg distributions"""

from distutils.errors import DistutilsSetupError
from distutils.dir_util import remove_tree, mkpath
from distutils import log
from types import CodeType
import sys
import os
import marshal
import textwrap

from pkg_resources import get_build_platform, Distribution, ensure_directory
from pkg_resources import EntryPoint
from setuptools.compat import basestring
from setuptools.extension import Library
from setuptools import Command

try:
    # Python 2.7 or >=3.2
    from sysconfig import get_path, get_python_version

    def _get_purelib():
        return get_path("purelib")
except ImportError:
    from distutils.sysconfig import get_python_lib, get_python_version

    def _get_purelib():
        return get_python_lib(False)


def strip_module(filename):
    if '.' in filename:
        filename = os.path.splitext(filename)[0]
    if filename.endswith('module'):
        filename = filename[:-6]
    return filename


def write_stub(resource, pyfile):
    _stub_template = textwrap.dedent("""
        def __bootstrap__():
            global __bootstrap__, __loader__, __file__
            import sys, pkg_resources, imp
            __file__ = pkg_resources.resource_filename(__name__, %r)
            __loader__ = None; del __bootstrap__, __loader__
            imp.load_dynamic(__name__,__file__)
        __bootstrap__()
        """).lstrip()
    with open(pyfile, 'w') as f:
        f.write(_stub_template % resource)

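The `strip_module` helper above normalizes a compiled-extension filename to the bare module name that its stub loader should carry. A standalone copy with hypothetical filenames:

```python
import os

def strip_module(filename):
    # Standalone copy of the helper above: drop the extension, then a
    # trailing 'module' suffix (the old C-extension naming convention).
    if '.' in filename:
        filename = os.path.splitext(filename)[0]
    if filename.endswith('module'):
        filename = filename[:-6]
    return filename

print(strip_module('spammodule.so'))  # spam
print(strip_module('fastmath.pyd'))   # fastmath
```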
class bdist_egg(Command):
    description = "create an \"egg\" distribution"

    user_options = [
        ('bdist-dir=', 'b',
         "temporary directory for creating the distribution"),
        ('plat-name=', 'p', "platform name to embed in generated filenames "
                            "(default: %s)" % get_build_platform()),
        ('exclude-source-files', None,
         "remove all .py files from the generated egg"),
        ('keep-temp', 'k',
         "keep the pseudo-installation tree around after " +
         "creating the distribution archive"),
        ('dist-dir=', 'd',
         "directory to put final built distributions in"),
        ('skip-build', None,
         "skip rebuilding everything (for testing/debugging)"),
    ]

    boolean_options = [
        'keep-temp', 'skip-build', 'exclude-source-files'
    ]

    def initialize_options(self):
        self.bdist_dir = None
        self.plat_name = None
        self.keep_temp = 0
        self.dist_dir = None
        self.skip_build = 0
        self.egg_output = None
        self.exclude_source_files = None

    def finalize_options(self):
        ei_cmd = self.ei_cmd = self.get_finalized_command("egg_info")
        self.egg_info = ei_cmd.egg_info

        if self.bdist_dir is None:
            bdist_base = self.get_finalized_command('bdist').bdist_base
            self.bdist_dir = os.path.join(bdist_base, 'egg')

        if self.plat_name is None:
            self.plat_name = get_build_platform()

        self.set_undefined_options('bdist', ('dist_dir', 'dist_dir'))

        if self.egg_output is None:

            # Compute filename of the output egg
            basename = Distribution(
                None, None, ei_cmd.egg_name, ei_cmd.egg_version,
                get_python_version(),
                self.distribution.has_ext_modules() and self.plat_name
            ).egg_name()

            self.egg_output = os.path.join(self.dist_dir, basename + '.egg')

    def do_install_data(self):
        # Hack for packages that install data to install's --install-lib
        self.get_finalized_command('install').install_lib = self.bdist_dir

        site_packages = os.path.normcase(os.path.realpath(_get_purelib()))
        old, self.distribution.data_files = self.distribution.data_files, []

        for item in old:
            if isinstance(item, tuple) and len(item) == 2:
                if os.path.isabs(item[0]):
                    realpath = os.path.realpath(item[0])
                    normalized = os.path.normcase(realpath)
                    if normalized == site_packages or normalized.startswith(
                        site_packages + os.sep
                    ):
                        item = realpath[len(site_packages) + 1:], item[1]
                        # XXX else: raise ???
            self.distribution.data_files.append(item)

        try:
            log.info("installing package data to %s" % self.bdist_dir)
            self.call_command('install_data', force=0, root=None)
        finally:
            self.distribution.data_files = old

    def get_outputs(self):
        return [self.egg_output]

    def call_command(self, cmdname, **kw):
        """Invoke reinitialized command `cmdname` with keyword args"""
        for dirname in INSTALL_DIRECTORY_ATTRS:
            kw.setdefault(dirname, self.bdist_dir)
        kw.setdefault('skip_build', self.skip_build)
        kw.setdefault('dry_run', self.dry_run)
        cmd = self.reinitialize_command(cmdname, **kw)
        self.run_command(cmdname)
        return cmd

    def run(self):
        # Generate metadata first
        self.run_command("egg_info")
        # We run install_lib before install_data, because some data hacks
        # pull their data path from the install_lib command.
        log.info("installing library code to %s" % self.bdist_dir)
        instcmd = self.get_finalized_command('install')
        old_root = instcmd.root
        instcmd.root = None
        if self.distribution.has_c_libraries() and not self.skip_build:
            self.run_command('build_clib')
        cmd = self.call_command('install_lib', warn_dir=0)
        instcmd.root = old_root

        all_outputs, ext_outputs = self.get_ext_outputs()
        self.stubs = []
        to_compile = []
        for (p, ext_name) in enumerate(ext_outputs):
            filename, ext = os.path.splitext(ext_name)
            pyfile = os.path.join(self.bdist_dir, strip_module(filename) +
                                  '.py')
            self.stubs.append(pyfile)
            log.info("creating stub loader for %s" % ext_name)
            if not self.dry_run:
                write_stub(os.path.basename(ext_name), pyfile)
            to_compile.append(pyfile)
            ext_outputs[p] = ext_name.replace(os.sep, '/')

        if to_compile:
            cmd.byte_compile(to_compile)
        if self.distribution.data_files:
            self.do_install_data()

        # Make the EGG-INFO directory
        archive_root = self.bdist_dir
        egg_info = os.path.join(archive_root, 'EGG-INFO')
        self.mkpath(egg_info)
        if self.distribution.scripts:
            script_dir = os.path.join(egg_info, 'scripts')
            log.info("installing scripts to %s" % script_dir)
            self.call_command('install_scripts', install_dir=script_dir,
                              no_ep=1)

        self.copy_metadata_to(egg_info)
        native_libs = os.path.join(egg_info, "native_libs.txt")
        if all_outputs:
            log.info("writing %s" % native_libs)
            if not self.dry_run:
                ensure_directory(native_libs)
                libs_file = open(native_libs, 'wt')
                libs_file.write('\n'.join(all_outputs))
                libs_file.write('\n')
                libs_file.close()
        elif os.path.isfile(native_libs):
            log.info("removing %s" % native_libs)
            if not self.dry_run:
                os.unlink(native_libs)

        write_safety_flag(
            os.path.join(archive_root, 'EGG-INFO'), self.zip_safe()
        )

        if os.path.exists(os.path.join(self.egg_info, 'depends.txt')):
            log.warn(
                "WARNING: 'depends.txt' will not be used by setuptools 0.6!\n"
                "Use the install_requires/extras_require setup() args instead."
            )

        if self.exclude_source_files:
            self.zap_pyfiles()

        # Make the archive
        make_zipfile(self.egg_output, archive_root, verbose=self.verbose,
                     dry_run=self.dry_run, mode=self.gen_header())
        if not self.keep_temp:
            remove_tree(self.bdist_dir, dry_run=self.dry_run)

        # Add to 'Distribution.dist_files' so that the "upload" command works
        getattr(self.distribution, 'dist_files', []).append(
            ('bdist_egg', get_python_version(), self.egg_output))
    def zap_pyfiles(self):
        log.info("Removing .py files from temporary directory")
        for base, dirs, files in walk_egg(self.bdist_dir):
            for name in files:
                if name.endswith('.py'):
                    path = os.path.join(base, name)
                    log.debug("Deleting %s", path)
                    os.unlink(path)

    def zip_safe(self):
        safe = getattr(self.distribution, 'zip_safe', None)
        if safe is not None:
            return safe
        log.warn("zip_safe flag not set; analyzing archive contents...")
        return analyze_egg(self.bdist_dir, self.stubs)

    def gen_header(self):
        epm = EntryPoint.parse_map(self.distribution.entry_points or '')
        ep = epm.get('setuptools.installation', {}).get('eggsecutable')
        if ep is None:
            return 'w'  # not an eggsecutable, do it the usual way.

        if not ep.attrs or ep.extras:
            raise DistutilsSetupError(
                "eggsecutable entry point (%r) cannot have 'extras' "
                "or refer to a module" % (ep,)
            )

        pyver = sys.version[:3]
        pkg = ep.module_name
        full = '.'.join(ep.attrs)
        base = ep.attrs[0]
        basename = os.path.basename(self.egg_output)

        header = (
            "#!/bin/sh\n"
            'if [ `basename $0` = "%(basename)s" ]\n'
            'then exec python%(pyver)s -c "'
            "import sys, os; sys.path.insert(0, os.path.abspath('$0')); "
            "from %(pkg)s import %(base)s; sys.exit(%(full)s())"
            '" "$@"\n'
            'else\n'
            '  echo $0 is not the correct name for this egg file.\n'
            '  echo Please rename it back to %(basename)s and try again.\n'
            '  exec false\n'
            'fi\n'
        ) % locals()

        if not self.dry_run:
            mkpath(os.path.dirname(self.egg_output), dry_run=self.dry_run)
            f = open(self.egg_output, 'w')
            f.write(header)
            f.close()
        return 'a'

    def copy_metadata_to(self, target_dir):
        "Copy metadata (egg info) to the target_dir"
        # normalize the path (so that a forward-slash in egg_info will
        # match using startswith below)
        norm_egg_info = os.path.normpath(self.egg_info)
        prefix = os.path.join(norm_egg_info, '')
        for path in self.ei_cmd.filelist.files:
            if path.startswith(prefix):
                target = os.path.join(target_dir, path[len(prefix):])
                ensure_directory(target)
                self.copy_file(path, target)

    def get_ext_outputs(self):
        """Get a list of relative paths to C extensions in the output distro"""

        all_outputs = []
        ext_outputs = []

        paths = {self.bdist_dir: ''}
        for base, dirs, files in os.walk(self.bdist_dir):
            for filename in files:
                if os.path.splitext(filename)[1].lower() in NATIVE_EXTENSIONS:
                    all_outputs.append(paths[base] + filename)
            for filename in dirs:
                paths[os.path.join(base, filename)] = (paths[base] +
                                                       filename + '/')

        if self.distribution.has_ext_modules():
            build_cmd = self.get_finalized_command('build_ext')
            for ext in build_cmd.extensions:
                if isinstance(ext, Library):
                    continue
                fullname = build_cmd.get_ext_fullname(ext.name)
                filename = build_cmd.get_ext_filename(fullname)
                if not os.path.basename(filename).startswith('dl-'):
                    if os.path.exists(os.path.join(self.bdist_dir, filename)):
                        ext_outputs.append(filename)

        return all_outputs, ext_outputs


NATIVE_EXTENSIONS = dict.fromkeys('.dll .so .dylib .pyd'.split())


def walk_egg(egg_dir):
    """Walk an unpacked egg's contents, skipping the metadata directory"""
    walker = os.walk(egg_dir)
    base, dirs, files = next(walker)
    if 'EGG-INFO' in dirs:
        dirs.remove('EGG-INFO')
    yield base, dirs, files
    for bdf in walker:
        yield bdf


def analyze_egg(egg_dir, stubs):
    # check for existing flag in EGG-INFO
    for flag, fn in safety_flags.items():
        if os.path.exists(os.path.join(egg_dir, 'EGG-INFO', fn)):
            return flag
    if not can_scan():
        return False
    safe = True
    for base, dirs, files in walk_egg(egg_dir):
        for name in files:
            if name.endswith('.py') or name.endswith('.pyw'):
                continue
            elif name.endswith('.pyc') or name.endswith('.pyo'):
                # always scan, even if we already know we're not safe
                safe = scan_module(egg_dir, base, name, stubs) and safe
    return safe


def write_safety_flag(egg_dir, safe):
    # Write or remove zip safety flag file(s)
    for flag, fn in safety_flags.items():
        fn = os.path.join(egg_dir, fn)
        if os.path.exists(fn):
            if safe is None or bool(safe) != flag:
                os.unlink(fn)
        elif safe is not None and bool(safe) == flag:
            f = open(fn, 'wt')
            f.write('\n')
            f.close()


safety_flags = {
    True: 'zip-safe',
    False: 'not-zip-safe',
}


def scan_module(egg_dir, base, name, stubs):
    """Check whether module possibly uses unsafe-for-zipfile stuff"""

    filename = os.path.join(base, name)
    if filename[:-1] in stubs:
        return True  # Extension module
    pkg = base[len(egg_dir) + 1:].replace(os.sep, '.')
    module = pkg + (pkg and '.' or '') + os.path.splitext(name)[0]
    if sys.version_info < (3, 3):
        skip = 8  # skip magic & date
    else:
        skip = 12  # skip magic & date & file size
    f = open(filename, 'rb')
    f.read(skip)
    code = marshal.load(f)
    f.close()
    safe = True
    symbols = dict.fromkeys(iter_symbols(code))
    for bad in ['__file__', '__path__']:
        if bad in symbols:
            log.warn("%s: module references %s", module, bad)
            safe = False
    if 'inspect' in symbols:
        for bad in [
            'getsource', 'getabsfile', 'getsourcefile', 'getfile',
            'getsourcelines', 'findsource', 'getcomments', 'getframeinfo',
            'getinnerframes', 'getouterframes', 'stack', 'trace'
        ]:
            if bad in symbols:
                log.warn("%s: module MAY be using inspect.%s", module, bad)
                safe = False
    return safe


def iter_symbols(code):
    """Yield names and strings used by `code` and its nested code objects"""
    for name in code.co_names:
        yield name
    for const in code.co_consts:
        if isinstance(const, basestring):
            yield const
        elif isinstance(const, CodeType):
            for name in iter_symbols(const):
                yield name
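`iter_symbols` is the engine behind the zip-safety scan: it walks a code object and its nested code objects, yielding every referenced name and string constant, so `scan_module` can spot markers like `__file__`. It can be tried against a freshly compiled code object; this standalone copy substitutes `str` for the Python 2 `basestring` alias used above:

```python
from types import CodeType

def iter_symbols(code):
    # Standalone copy of the generator above (str replaces basestring).
    for name in code.co_names:
        yield name
    for const in code.co_consts:
        if isinstance(const, str):
            yield const
        elif isinstance(const, CodeType):
            # recurse into nested functions, classes, comprehensions
            for name in iter_symbols(const):
                yield name

code = compile("def f():\n    return __file__\n", '<demo>', 'exec')
print('__file__' in set(iter_symbols(code)))  # True
```

The recursion is what lets the scanner flag a `__file__` reference buried inside a nested function, as here.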
||||||
|
|
||||||
|
|
||||||
|
def can_scan():
|
||||||
|
if not sys.platform.startswith('java') and sys.platform != 'cli':
|
||||||
|
# CPython, PyPy, etc.
|
||||||
|
return True
|
||||||
|
log.warn("Unable to analyze compiled code on this platform.")
|
||||||
|
log.warn("Please ask the author to include a 'zip_safe'"
|
||||||
|
" setting (either True or False) in the package's setup.py")
|
||||||
|
|
||||||
|
# Attribute names of options for commands that might need to be convinced to
|
||||||
|
# install to the egg build directory
|
||||||
|
|
||||||
|
INSTALL_DIRECTORY_ATTRS = [
|
||||||
|
'install_lib', 'install_dir', 'install_data', 'install_base'
|
||||||
|
]
|
||||||
|
|
||||||
|
|
||||||
|
def make_zipfile(zip_filename, base_dir, verbose=0, dry_run=0, compress=True,
|
||||||
|
mode='w'):
|
||||||
|
"""Create a zip file from all the files under 'base_dir'. The output
|
||||||
|
zip file will be named 'base_dir' + ".zip". Uses either the "zipfile"
|
||||||
|
Python module (if available) or the InfoZIP "zip" utility (if installed
|
||||||
|
and found on the default search path). If neither tool is available,
|
||||||
|
raises DistutilsExecError. Returns the name of the output zip file.
|
||||||
|
"""
|
||||||
|
import zipfile
|
||||||
|
|
||||||
|
mkpath(os.path.dirname(zip_filename), dry_run=dry_run)
|
||||||
|
log.info("creating '%s' and adding '%s' to it", zip_filename, base_dir)
|
||||||
|
|
||||||
|
def visit(z, dirname, names):
|
||||||
|
for name in names:
|
||||||
|
path = os.path.normpath(os.path.join(dirname, name))
|
||||||
|
if os.path.isfile(path):
|
||||||
|
p = path[len(base_dir) + 1:]
|
||||||
|
if not dry_run:
|
||||||
|
z.write(path, p)
|
||||||
|
log.debug("adding '%s'" % p)
|
||||||
|
|
||||||
|
compression = zipfile.ZIP_DEFLATED if compress else zipfile.ZIP_STORED
|
||||||
|
if not dry_run:
|
||||||
|
z = zipfile.ZipFile(zip_filename, mode, compression=compression)
|
||||||
|
for dirname, dirs, files in os.walk(base_dir):
|
||||||
|
visit(z, dirname, files)
|
||||||
|
z.close()
|
||||||
|
else:
|
||||||
|
for dirname, dirs, files in os.walk(base_dir):
|
||||||
|
visit(None, dirname, files)
|
||||||
|
return zip_filename
|
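The make_zipfile helper above walks base_dir and writes each file with an archive path relative to base_dir. A minimal standalone sketch of that walk-and-write loop, using only the stdlib (the function name make_zip is hypothetical, not part of setuptools):

```python
import os
import zipfile


def make_zip(zip_filename, base_dir, compress=True):
    # Archive every regular file under base_dir, storing paths relative
    # to base_dir, mirroring make_zipfile's visit() loop above.
    compression = zipfile.ZIP_DEFLATED if compress else zipfile.ZIP_STORED
    with zipfile.ZipFile(zip_filename, 'w', compression=compression) as z:
        for dirname, dirs, files in os.walk(base_dir):
            for name in files:
                path = os.path.normpath(os.path.join(dirname, name))
                if os.path.isfile(path):
                    # strip the leading "base_dir/" so entries are relative
                    z.write(path, path[len(base_dir) + 1:])
    return zip_filename
```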
43
lib/python3.4/site-packages/setuptools/command/bdist_rpm.py
Normal file
@@ -0,0 +1,43 @@
import distutils.command.bdist_rpm as orig


class bdist_rpm(orig.bdist_rpm):
    """
    Override the default bdist_rpm behavior to do the following:

    1. Run egg_info to ensure the name and version are properly calculated.
    2. Always run 'install' using --single-version-externally-managed to
       disable eggs in RPM distributions.
    3. Replace dash with underscore in the version numbers for better RPM
       compatibility.
    """

    def run(self):
        # ensure distro name is up-to-date
        self.run_command('egg_info')

        orig.bdist_rpm.run(self)

    def _make_spec_file(self):
        version = self.distribution.get_version()
        rpmversion = version.replace('-', '_')
        spec = orig.bdist_rpm._make_spec_file(self)
        line23 = '%define version ' + version
        line24 = '%define version ' + rpmversion
        spec = [
            line.replace(
                "Source0: %{name}-%{version}.tar",
                "Source0: %{name}-%{unmangled_version}.tar"
            ).replace(
                "setup.py install ",
                "setup.py install --single-version-externally-managed "
            ).replace(
                "%setup",
                "%setup -n %{name}-%{unmangled_version}"
            ).replace(line23, line24)
            for line in spec
        ]
        insert_loc = spec.index(line24) + 1
        unmangled_version = "%define unmangled_version " + version
        spec.insert(insert_loc, unmangled_version)
        return spec
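The version mangling in _make_spec_file above exists because RPM does not allow dashes in the Version tag. A self-contained sketch of just that rewrite step (mangle_spec is a hypothetical name; the real method also rewrites Source0 and %setup lines):

```python
def mangle_spec(spec_lines, version):
    # Replace dashes in the version (invalid in an RPM Version tag) and
    # record the original version right after the mangled %define line.
    rpmversion = version.replace('-', '_')
    line_old = '%define version ' + version
    line_new = '%define version ' + rpmversion
    spec = [line.replace(line_old, line_new) for line in spec_lines]
    spec.insert(spec.index(line_new) + 1,
                '%define unmangled_version ' + version)
    return spec
```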
@@ -0,0 +1,21 @@
import distutils.command.bdist_wininst as orig


class bdist_wininst(orig.bdist_wininst):
    def reinitialize_command(self, command, reinit_subcommands=0):
        """
        Supplement reinitialize_command to work around
        http://bugs.python.org/issue20819
        """
        cmd = self.distribution.reinitialize_command(
            command, reinit_subcommands)
        if command in ('install', 'install_lib'):
            cmd.install_lib = None
        return cmd

    def run(self):
        self._is_running = True
        try:
            orig.bdist_wininst.run(self)
        finally:
            self._is_running = False
296
lib/python3.4/site-packages/setuptools/command/build_ext.py
Normal file
@@ -0,0 +1,296 @@
from distutils.command.build_ext import build_ext as _du_build_ext
from distutils.file_util import copy_file
from distutils.ccompiler import new_compiler
from distutils.sysconfig import customize_compiler
from distutils.errors import DistutilsError
from distutils import log
import os
import sys
import itertools

from setuptools.extension import Library

try:
    # Attempt to use Cython for building extensions, if available
    from Cython.Distutils.build_ext import build_ext as _build_ext
except ImportError:
    _build_ext = _du_build_ext

try:
    # Python 2.7 or >=3.2
    from sysconfig import _CONFIG_VARS
except ImportError:
    from distutils.sysconfig import get_config_var

    get_config_var("LDSHARED")  # make sure _config_vars is initialized
    del get_config_var
    from distutils.sysconfig import _config_vars as _CONFIG_VARS

have_rtld = False
use_stubs = False
libtype = 'shared'

if sys.platform == "darwin":
    use_stubs = True
elif os.name != 'nt':
    try:
        import dl
        use_stubs = have_rtld = hasattr(dl, 'RTLD_NOW')
    except ImportError:
        pass


if_dl = lambda s: s if have_rtld else ''


class build_ext(_build_ext):
    def run(self):
        """Build extensions in build directory, then copy if --inplace"""
        old_inplace, self.inplace = self.inplace, 0
        _build_ext.run(self)
        self.inplace = old_inplace
        if old_inplace:
            self.copy_extensions_to_source()

    def copy_extensions_to_source(self):
        build_py = self.get_finalized_command('build_py')
        for ext in self.extensions:
            fullname = self.get_ext_fullname(ext.name)
            filename = self.get_ext_filename(fullname)
            modpath = fullname.split('.')
            package = '.'.join(modpath[:-1])
            package_dir = build_py.get_package_dir(package)
            dest_filename = os.path.join(package_dir,
                                         os.path.basename(filename))
            src_filename = os.path.join(self.build_lib, filename)

            # Always copy, even if source is older than destination, to ensure
            # that the right extensions for the current Python/platform are
            # used.
            copy_file(
                src_filename, dest_filename, verbose=self.verbose,
                dry_run=self.dry_run
            )
            if ext._needs_stub:
                self.write_stub(package_dir or os.curdir, ext, True)

    def get_ext_filename(self, fullname):
        filename = _build_ext.get_ext_filename(self, fullname)
        if fullname in self.ext_map:
            ext = self.ext_map[fullname]
            if isinstance(ext, Library):
                fn, ext = os.path.splitext(filename)
                return self.shlib_compiler.library_filename(fn, libtype)
            elif use_stubs and ext._links_to_dynamic:
                d, fn = os.path.split(filename)
                return os.path.join(d, 'dl-' + fn)
        return filename

    def initialize_options(self):
        _build_ext.initialize_options(self)
        self.shlib_compiler = None
        self.shlibs = []
        self.ext_map = {}

    def finalize_options(self):
        _build_ext.finalize_options(self)
        self.extensions = self.extensions or []
        self.check_extensions_list(self.extensions)
        self.shlibs = [ext for ext in self.extensions
                       if isinstance(ext, Library)]
        if self.shlibs:
            self.setup_shlib_compiler()
        for ext in self.extensions:
            ext._full_name = self.get_ext_fullname(ext.name)
        for ext in self.extensions:
            fullname = ext._full_name
            self.ext_map[fullname] = ext

            # distutils 3.1 will also ask for module names
            # XXX what to do with conflicts?
            self.ext_map[fullname.split('.')[-1]] = ext

            ltd = self.shlibs and self.links_to_dynamic(ext) or False
            ns = ltd and use_stubs and not isinstance(ext, Library)
            ext._links_to_dynamic = ltd
            ext._needs_stub = ns
            filename = ext._file_name = self.get_ext_filename(fullname)
            libdir = os.path.dirname(os.path.join(self.build_lib, filename))
            if ltd and libdir not in ext.library_dirs:
                ext.library_dirs.append(libdir)
            if ltd and use_stubs and os.curdir not in ext.runtime_library_dirs:
                ext.runtime_library_dirs.append(os.curdir)

    def setup_shlib_compiler(self):
        compiler = self.shlib_compiler = new_compiler(
            compiler=self.compiler, dry_run=self.dry_run, force=self.force
        )
        if sys.platform == "darwin":
            tmp = _CONFIG_VARS.copy()
            try:
                # XXX Help! I don't have any idea whether these are right...
                _CONFIG_VARS['LDSHARED'] = (
                    "gcc -Wl,-x -dynamiclib -undefined dynamic_lookup")
                _CONFIG_VARS['CCSHARED'] = " -dynamiclib"
                _CONFIG_VARS['SO'] = ".dylib"
                customize_compiler(compiler)
            finally:
                _CONFIG_VARS.clear()
                _CONFIG_VARS.update(tmp)
        else:
            customize_compiler(compiler)

        if self.include_dirs is not None:
            compiler.set_include_dirs(self.include_dirs)
        if self.define is not None:
            # 'define' option is a list of (name,value) tuples
            for (name, value) in self.define:
                compiler.define_macro(name, value)
        if self.undef is not None:
            for macro in self.undef:
                compiler.undefine_macro(macro)
        if self.libraries is not None:
            compiler.set_libraries(self.libraries)
        if self.library_dirs is not None:
            compiler.set_library_dirs(self.library_dirs)
        if self.rpath is not None:
            compiler.set_runtime_library_dirs(self.rpath)
        if self.link_objects is not None:
            compiler.set_link_objects(self.link_objects)

        # hack so distutils' build_extension() builds a library instead
        compiler.link_shared_object = link_shared_object.__get__(compiler)

    def get_export_symbols(self, ext):
        if isinstance(ext, Library):
            return ext.export_symbols
        return _build_ext.get_export_symbols(self, ext)

    def build_extension(self, ext):
        ext._convert_pyx_sources_to_lang()
        _compiler = self.compiler
        try:
            if isinstance(ext, Library):
                self.compiler = self.shlib_compiler
            _build_ext.build_extension(self, ext)
            if ext._needs_stub:
                cmd = self.get_finalized_command('build_py').build_lib
                self.write_stub(cmd, ext)
        finally:
            self.compiler = _compiler

    def links_to_dynamic(self, ext):
        """Return true if 'ext' links to a dynamic lib in the same package"""
        # XXX this should check to ensure the lib is actually being built
        # XXX as dynamic, and not just using a locally-found version or a
        # XXX static-compiled version
        libnames = dict.fromkeys([lib._full_name for lib in self.shlibs])
        pkg = '.'.join(ext._full_name.split('.')[:-1] + [''])
        return any(pkg + libname in libnames for libname in ext.libraries)

    def get_outputs(self):
        return _build_ext.get_outputs(self) + self.__get_stubs_outputs()

    def __get_stubs_outputs(self):
        # assemble the base name for each extension that needs a stub
        ns_ext_bases = (
            os.path.join(self.build_lib, *ext._full_name.split('.'))
            for ext in self.extensions
            if ext._needs_stub
        )
        # pair each base with the extension
        pairs = itertools.product(ns_ext_bases, self.__get_output_extensions())
        return list(base + fnext for base, fnext in pairs)

    def __get_output_extensions(self):
        yield '.py'
        yield '.pyc'
        if self.get_finalized_command('build_py').optimize:
            yield '.pyo'

    def write_stub(self, output_dir, ext, compile=False):
        log.info("writing stub loader for %s to %s", ext._full_name,
                 output_dir)
        stub_file = (os.path.join(output_dir, *ext._full_name.split('.')) +
                     '.py')
        if compile and os.path.exists(stub_file):
            raise DistutilsError(stub_file + " already exists! Please delete.")
        if not self.dry_run:
            f = open(stub_file, 'w')
            f.write(
                '\n'.join([
                    "def __bootstrap__():",
                    "   global __bootstrap__, __file__, __loader__",
                    "   import sys, os, pkg_resources, imp" + if_dl(", dl"),
                    "   __file__ = pkg_resources.resource_filename"
                    "(__name__,%r)"
                    % os.path.basename(ext._file_name),
                    "   del __bootstrap__",
                    "   if '__loader__' in globals():",
                    "       del __loader__",
                    if_dl("   old_flags = sys.getdlopenflags()"),
                    "   old_dir = os.getcwd()",
                    "   try:",
                    "     os.chdir(os.path.dirname(__file__))",
                    if_dl("     sys.setdlopenflags(dl.RTLD_NOW)"),
                    "     imp.load_dynamic(__name__,__file__)",
                    "   finally:",
                    if_dl("     sys.setdlopenflags(old_flags)"),
                    "     os.chdir(old_dir)",
                    "__bootstrap__()",
                    ""  # terminal \n
                ])
            )
            f.close()
        if compile:
            from distutils.util import byte_compile

            byte_compile([stub_file], optimize=0,
                         force=True, dry_run=self.dry_run)
            optimize = self.get_finalized_command('install_lib').optimize
            if optimize > 0:
                byte_compile([stub_file], optimize=optimize,
                             force=True, dry_run=self.dry_run)
            if os.path.exists(stub_file) and not self.dry_run:
                os.unlink(stub_file)


if use_stubs or os.name == 'nt':
    # Build shared libraries
    #
    def link_shared_object(
            self, objects, output_libname, output_dir=None, libraries=None,
            library_dirs=None, runtime_library_dirs=None, export_symbols=None,
            debug=0, extra_preargs=None, extra_postargs=None, build_temp=None,
            target_lang=None):
        self.link(
            self.SHARED_LIBRARY, objects, output_libname,
            output_dir, libraries, library_dirs, runtime_library_dirs,
            export_symbols, debug, extra_preargs, extra_postargs,
            build_temp, target_lang
        )
else:
    # Build static libraries everywhere else
    libtype = 'static'

    def link_shared_object(
            self, objects, output_libname, output_dir=None, libraries=None,
            library_dirs=None, runtime_library_dirs=None, export_symbols=None,
            debug=0, extra_preargs=None, extra_postargs=None, build_temp=None,
            target_lang=None):
        # XXX we need to either disallow these attrs on Library instances,
        # or warn/abort here if set, or something...
        # libraries=None, library_dirs=None, runtime_library_dirs=None,
        # export_symbols=None, extra_preargs=None, extra_postargs=None,
        # build_temp=None

        assert output_dir is None  # distutils build_ext doesn't pass this
        output_dir, filename = os.path.split(output_libname)
        basename, ext = os.path.splitext(filename)
        if self.library_filename("x").startswith('lib'):
            # strip 'lib' prefix; this is kludgy if some platform uses
            # a different prefix
            basename = basename[3:]

        self.create_static_lib(
            objects, basename, output_dir, debug, target_lang
        )
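The static-library fallback above splits the requested output name and strips a leading "lib" prefix (when the compiler's naming convention uses one) before handing the base name to create_static_lib. A standalone sketch of that path manipulation, with a hypothetical function name and the prefix passed in explicitly instead of probing the compiler:

```python
import os


def static_lib_basename(output_libname, lib_prefix='lib'):
    # Derive the base name create_static_lib expects: split off the
    # directory, drop the extension, and strip a platform "lib" prefix
    # if present (mirroring the fallback link_shared_object above).
    output_dir, filename = os.path.split(output_libname)
    basename, ext = os.path.splitext(filename)
    if basename.startswith(lib_prefix):
        basename = basename[len(lib_prefix):]
    return output_dir, basename
```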
215
lib/python3.4/site-packages/setuptools/command/build_py.py
Normal file
@@ -0,0 +1,215 @@
from glob import glob
from distutils.util import convert_path
import distutils.command.build_py as orig
import os
import sys
import fnmatch
import textwrap

try:
    from setuptools.lib2to3_ex import Mixin2to3
except ImportError:
    class Mixin2to3:
        def run_2to3(self, files, doctests=True):
            "do nothing"


class build_py(orig.build_py, Mixin2to3):
    """Enhanced 'build_py' command that includes data files with packages

    The data files are specified via a 'package_data' argument to 'setup()'.
    See 'setuptools.dist.Distribution' for more details.

    Also, this version of the 'build_py' command allows you to specify both
    'py_modules' and 'packages' in the same setup operation.
    """

    def finalize_options(self):
        orig.build_py.finalize_options(self)
        self.package_data = self.distribution.package_data
        self.exclude_package_data = (self.distribution.exclude_package_data or
                                     {})
        if 'data_files' in self.__dict__:
            del self.__dict__['data_files']
        self.__updated_files = []
        self.__doctests_2to3 = []

    def run(self):
        """Build modules, packages, and copy data files to build directory"""
        if not self.py_modules and not self.packages:
            return

        if self.py_modules:
            self.build_modules()

        if self.packages:
            self.build_packages()
            self.build_package_data()

        self.run_2to3(self.__updated_files, False)
        self.run_2to3(self.__updated_files, True)
        self.run_2to3(self.__doctests_2to3, True)

        # Only compile actual .py files, using our base class' idea of what our
        # output files are.
        self.byte_compile(orig.build_py.get_outputs(self, include_bytecode=0))

    def __getattr__(self, attr):
        if attr == 'data_files':  # lazily compute data files
            self.data_files = files = self._get_data_files()
            return files
        return orig.build_py.__getattr__(self, attr)

    def build_module(self, module, module_file, package):
        outfile, copied = orig.build_py.build_module(self, module, module_file,
                                                     package)
        if copied:
            self.__updated_files.append(outfile)
        return outfile, copied

    def _get_data_files(self):
        """Generate list of '(package,src_dir,build_dir,filenames)' tuples"""
        self.analyze_manifest()
        data = []
        for package in self.packages or ():
            # Locate package source directory
            src_dir = self.get_package_dir(package)

            # Compute package build directory
            build_dir = os.path.join(*([self.build_lib] + package.split('.')))

            # Length of path to strip from found files
            plen = len(src_dir) + 1

            # Strip directory from globbed filenames
            filenames = [
                file[plen:] for file in self.find_data_files(package, src_dir)
            ]
            data.append((package, src_dir, build_dir, filenames))
        return data

    def find_data_files(self, package, src_dir):
        """Return filenames for package's data files in 'src_dir'"""
        globs = (self.package_data.get('', [])
                 + self.package_data.get(package, []))
        files = self.manifest_files.get(package, [])[:]
        for pattern in globs:
            # Each pattern has to be converted to a platform-specific path
            files.extend(glob(os.path.join(src_dir, convert_path(pattern))))
        return self.exclude_data_files(package, src_dir, files)

    def build_package_data(self):
        """Copy data files into build directory"""
        for package, src_dir, build_dir, filenames in self.data_files:
            for filename in filenames:
                target = os.path.join(build_dir, filename)
                self.mkpath(os.path.dirname(target))
                srcfile = os.path.join(src_dir, filename)
                outf, copied = self.copy_file(srcfile, target)
                srcfile = os.path.abspath(srcfile)
                if (copied and
                        srcfile in self.distribution.convert_2to3_doctests):
                    self.__doctests_2to3.append(outf)

    def analyze_manifest(self):
        self.manifest_files = mf = {}
        if not self.distribution.include_package_data:
            return
        src_dirs = {}
        for package in self.packages or ():
            # Locate package source directory
            src_dirs[assert_relative(self.get_package_dir(package))] = package

        self.run_command('egg_info')
        ei_cmd = self.get_finalized_command('egg_info')
        for path in ei_cmd.filelist.files:
            d, f = os.path.split(assert_relative(path))
            prev = None
            oldf = f
            while d and d != prev and d not in src_dirs:
                prev = d
                d, df = os.path.split(d)
                f = os.path.join(df, f)
            if d in src_dirs:
                if path.endswith('.py') and f == oldf:
                    continue  # it's a module, not data
                mf.setdefault(src_dirs[d], []).append(path)

    def get_data_files(self):
        pass  # Lazily compute data files in _get_data_files() function.

    def check_package(self, package, package_dir):
        """Check namespace packages' __init__ for declare_namespace"""
        try:
            return self.packages_checked[package]
        except KeyError:
            pass

        init_py = orig.build_py.check_package(self, package, package_dir)
        self.packages_checked[package] = init_py

        if not init_py or not self.distribution.namespace_packages:
            return init_py

        for pkg in self.distribution.namespace_packages:
            if pkg == package or pkg.startswith(package + '.'):
                break
        else:
            return init_py

        f = open(init_py, 'rbU')
        if 'declare_namespace'.encode() not in f.read():
            from distutils.errors import DistutilsError

            raise DistutilsError(
                "Namespace package problem: %s is a namespace package, but "
                "its\n__init__.py does not call declare_namespace()! Please "
                'fix it.\n(See the setuptools manual under '
                '"Namespace Packages" for details.)\n"' % (package,)
            )
        f.close()
        return init_py

    def initialize_options(self):
        self.packages_checked = {}
        orig.build_py.initialize_options(self)

    def get_package_dir(self, package):
        res = orig.build_py.get_package_dir(self, package)
        if self.distribution.src_root is not None:
            return os.path.join(self.distribution.src_root, res)
        return res

    def exclude_data_files(self, package, src_dir, files):
        """Filter filenames for package's data files in 'src_dir'"""
        globs = (self.exclude_package_data.get('', [])
                 + self.exclude_package_data.get(package, []))
        bad = []
        for pattern in globs:
            bad.extend(
                fnmatch.filter(
                    files, os.path.join(src_dir, convert_path(pattern))
                )
            )
        bad = dict.fromkeys(bad)
        seen = {}
        return [
            f for f in files if f not in bad
            and f not in seen and seen.setdefault(f, 1)  # ditch dupes
        ]


def assert_relative(path):
    if not os.path.isabs(path):
        return path
    from distutils.errors import DistutilsSetupError

    msg = textwrap.dedent("""
        Error: setup script specifies an absolute path:

            %s

        setup() arguments must *always* be /-separated paths relative to the
        setup.py directory, *never* absolute paths.
        """).lstrip() % path
    raise DistutilsSetupError(msg)
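build_py.exclude_data_files above combines fnmatch-based exclusion globs with an order-preserving de-duplication (the seen.setdefault idiom). A standalone sketch of that filter, written as a plain function with a hypothetical name and an explicit seen set instead of the setdefault trick (assumes '/'-separated paths, as on POSIX):

```python
import fnmatch
import os


def exclude_data_files(files, src_dir, patterns):
    # Drop any file matching an exclusion glob rooted at src_dir, then
    # ditch duplicates while preserving first-seen order, mirroring
    # build_py.exclude_data_files above.
    bad = set()
    for pattern in patterns:
        bad.update(fnmatch.filter(files, os.path.join(src_dir, pattern)))
    seen = set()
    keep = []
    for f in files:
        if f not in bad and f not in seen:
            seen.add(f)
            keep.append(f)
    return keep
```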
169
lib/python3.4/site-packages/setuptools/command/develop.py
Normal file
@@ -0,0 +1,169 @@
from distutils.util import convert_path
|
||||||
|
from distutils import log
|
||||||
|
from distutils.errors import DistutilsError, DistutilsOptionError
|
||||||
|
import os
|
||||||
|
import glob
|
||||||
|
|
||||||
|
from pkg_resources import Distribution, PathMetadata, normalize_path
|
||||||
|
from setuptools.command.easy_install import easy_install
|
||||||
|
from setuptools.compat import PY3
|
||||||
|
import setuptools
|
||||||
|
|
||||||
|
|
||||||
|
class develop(easy_install):
|
||||||
|
"""Set up package for development"""
|
||||||
|
|
||||||
|
description = "install package in 'development mode'"
|
||||||
|
|
||||||
|
user_options = easy_install.user_options + [
|
||||||
|
("uninstall", "u", "Uninstall this source package"),
|
||||||
|
("egg-path=", None, "Set the path to be used in the .egg-link file"),
|
||||||
|
]
|
||||||
|
|
||||||
|
boolean_options = easy_install.boolean_options + ['uninstall']
|
||||||
|
|
||||||
|
command_consumes_arguments = False # override base
|
||||||
|
|
||||||
|
def run(self):
|
||||||
|
if self.uninstall:
|
||||||
|
self.multi_version = True
|
||||||
|
self.uninstall_link()
|
||||||
|
else:
|
||||||
|
self.install_for_development()
|
||||||
|
self.warn_deprecated_options()
|
||||||
|
|
||||||
|
def initialize_options(self):
|
||||||
|
self.uninstall = None
|
||||||
|
self.egg_path = None
|
||||||
|
easy_install.initialize_options(self)
|
||||||
|
self.setup_path = None
|
||||||
|
self.always_copy_from = '.' # always copy eggs installed in curdir
|
||||||
|
|
||||||
|
def finalize_options(self):
|
||||||
|
ei = self.get_finalized_command("egg_info")
|
||||||
|
if ei.broken_egg_info:
|
||||||
|
template = "Please rename %r to %r before using 'develop'"
|
||||||
|
args = ei.egg_info, ei.broken_egg_info
|
||||||
|
raise DistutilsError(template % args)
|
||||||
|
self.args = [ei.egg_name]
|
||||||
|
|
||||||
|
easy_install.finalize_options(self)
|
||||||
|
self.expand_basedirs()
|
||||||
|
self.expand_dirs()
|
||||||
|
# pick up setup-dir .egg files only: no .egg-info
|
||||||
|
self.package_index.scan(glob.glob('*.egg'))
|
||||||
|
|
||||||
|
self.egg_link = os.path.join(self.install_dir, ei.egg_name +
|
||||||
|
'.egg-link')
|
||||||
|
self.egg_base = ei.egg_base
|
||||||
|
if self.egg_path is None:
|
||||||
|
self.egg_path = os.path.abspath(ei.egg_base)
|
||||||
|
|
||||||
|
target = normalize_path(self.egg_base)
|
||||||
|
egg_path = normalize_path(os.path.join(self.install_dir,
|
||||||
|
self.egg_path))
|
||||||
|
if egg_path != target:
|
||||||
|
raise DistutilsOptionError(
|
||||||
|
"--egg-path must be a relative path from the install"
|
||||||
|
" directory to " + target
|
||||||
|
)
|
||||||
|
|
||||||
|
# Make a distribution for the package's source
|
||||||
|
self.dist = Distribution(
|
||||||
|
target,
|
||||||
|
PathMetadata(target, os.path.abspath(ei.egg_info)),
|
||||||
|
project_name=ei.egg_name
|
||||||
|
)
|
||||||
|
|
||||||
|
p = self.egg_base.replace(os.sep, '/')
|
||||||
|
if p != os.curdir:
|
||||||
|
p = '../' * (p.count('/') + 1)
|
||||||
|
self.setup_path = p
|
||||||
|
p = normalize_path(os.path.join(self.install_dir, self.egg_path, p))
|
||||||
|
if p != normalize_path(os.curdir):
|
||||||
|
raise DistutilsOptionError(
|
||||||
|
"Can't get a consistent path to setup script from"
|
||||||
|
" installation directory", p, normalize_path(os.curdir))
|
||||||
|
|
||||||
|
def install_for_development(self):
|
||||||
|
if PY3 and getattr(self.distribution, 'use_2to3', False):
|
||||||
|
# If we run 2to3 we can not do this inplace:
|
||||||
|
|
||||||
|
# Ensure metadata is up-to-date
|
||||||
|
self.reinitialize_command('build_py', inplace=0)
|
||||||
|
self.run_command('build_py')
|
||||||
|
bpy_cmd = self.get_finalized_command("build_py")
|
||||||
|
build_path = normalize_path(bpy_cmd.build_lib)
|
||||||
|
|
||||||
|
# Build extensions
|
||||||
|
self.reinitialize_command('egg_info', egg_base=build_path)
|
||||||
|
self.run_command('egg_info')
|
||||||
|
|
            self.reinitialize_command('build_ext', inplace=0)
            self.run_command('build_ext')

            # Fixup egg-link and easy-install.pth
            ei_cmd = self.get_finalized_command("egg_info")
            self.egg_path = build_path
            self.dist.location = build_path
            # XXX
            self.dist._provider = PathMetadata(build_path, ei_cmd.egg_info)
        else:
            # Without 2to3 inplace works fine:
            self.run_command('egg_info')

            # Build extensions in-place
            self.reinitialize_command('build_ext', inplace=1)
            self.run_command('build_ext')

        self.install_site_py()  # ensure that target dir is site-safe
        if setuptools.bootstrap_install_from:
            self.easy_install(setuptools.bootstrap_install_from)
            setuptools.bootstrap_install_from = None

        # create an .egg-link in the installation dir, pointing to our egg
        log.info("Creating %s (link to %s)", self.egg_link, self.egg_base)
        if not self.dry_run:
            f = open(self.egg_link, "w")
            f.write(self.egg_path + "\n" + self.setup_path)
            f.close()
        # postprocess the installed distro, fixing up .pth, installing scripts,
        # and handling requirements
        self.process_distribution(None, self.dist, not self.no_deps)

    def uninstall_link(self):
        if os.path.exists(self.egg_link):
            log.info("Removing %s (link to %s)", self.egg_link, self.egg_base)
            egg_link_file = open(self.egg_link)
            contents = [line.rstrip() for line in egg_link_file]
            egg_link_file.close()
            if contents not in ([self.egg_path],
                                [self.egg_path, self.setup_path]):
                log.warn("Link points to %s: uninstall aborted", contents)
                return
            if not self.dry_run:
                os.unlink(self.egg_link)
        if not self.dry_run:
            self.update_pth(self.dist)  # remove any .pth link to us
        if self.distribution.scripts:
            # XXX should also check for entry point scripts!
            log.warn("Note: you must uninstall or replace scripts manually!")

    def install_egg_scripts(self, dist):
        if dist is not self.dist:
            # Installing a dependency, so fall back to normal behavior
            return easy_install.install_egg_scripts(self, dist)

        # create wrapper scripts in the script dir, pointing to dist.scripts

        # new-style...
        self.install_wrapper_scripts(dist)

        # ...and old-style
        for script_name in self.distribution.scripts or []:
            script_path = os.path.abspath(convert_path(script_name))
            script_name = os.path.basename(script_path)
            f = open(script_path, 'rU')
            script_text = f.read()
            f.close()
            self.install_script(dist, script_name, script_text, script_path)
2301
lib/python3.4/site-packages/setuptools/command/easy_install.py
Normal file
File diff suppressed because it is too large
480
lib/python3.4/site-packages/setuptools/command/egg_info.py
Normal file
@@ -0,0 +1,480 @@
"""setuptools.command.egg_info

Create a distribution's .egg-info directory and contents"""

from distutils.filelist import FileList as _FileList
from distutils.util import convert_path
from distutils import log
import distutils.errors
import distutils.filelist
import os
import re
import sys

try:
    from setuptools_svn import svn_utils
except ImportError:
    pass

from setuptools import Command
from setuptools.command.sdist import sdist
from setuptools.compat import basestring, PY3, StringIO
from setuptools.command.sdist import walk_revctrl
from pkg_resources import (
    parse_requirements, safe_name, parse_version,
    safe_version, yield_lines, EntryPoint, iter_entry_points, to_filename)
import setuptools.unicode_utils as unicode_utils

from pkg_resources import packaging

class egg_info(Command):
    description = "create a distribution's .egg-info directory"

    user_options = [
        ('egg-base=', 'e', "directory containing .egg-info directories"
                           " (default: top of the source tree)"),
        ('tag-svn-revision', 'r',
         "Add subversion revision ID to version number"),
        ('tag-date', 'd', "Add date stamp (e.g. 20050528) to version number"),
        ('tag-build=', 'b', "Specify explicit tag to add to version number"),
        ('no-svn-revision', 'R',
         "Don't add subversion revision ID [default]"),
        ('no-date', 'D', "Don't include date stamp [default]"),
    ]

    boolean_options = ['tag-date', 'tag-svn-revision']
    negative_opt = {'no-svn-revision': 'tag-svn-revision',
                    'no-date': 'tag-date'}

    def initialize_options(self):
        self.egg_name = None
        self.egg_version = None
        self.egg_base = None
        self.egg_info = None
        self.tag_build = None
        self.tag_svn_revision = 0
        self.tag_date = 0
        self.broken_egg_info = False
        self.vtags = None

    def save_version_info(self, filename):
        from setuptools.command.setopt import edit_config

        values = dict(
            egg_info=dict(
                tag_svn_revision=0,
                tag_date=0,
                tag_build=self.tags(),
            )
        )
        edit_config(filename, values)

    def finalize_options(self):
        self.egg_name = safe_name(self.distribution.get_name())
        self.vtags = self.tags()
        self.egg_version = self.tagged_version()

        parsed_version = parse_version(self.egg_version)

        try:
            is_version = isinstance(parsed_version, packaging.version.Version)
            spec = (
                "%s==%s" if is_version else "%s===%s"
            )
            list(
                parse_requirements(spec % (self.egg_name, self.egg_version))
            )
        except ValueError:
            raise distutils.errors.DistutilsOptionError(
                "Invalid distribution name or version syntax: %s-%s" %
                (self.egg_name, self.egg_version)
            )

        if self.egg_base is None:
            dirs = self.distribution.package_dir
            self.egg_base = (dirs or {}).get('', os.curdir)

        self.ensure_dirname('egg_base')
        self.egg_info = to_filename(self.egg_name) + '.egg-info'
        if self.egg_base != os.curdir:
            self.egg_info = os.path.join(self.egg_base, self.egg_info)
        if '-' in self.egg_name:
            self.check_broken_egg_info()

        # Set package version for the benefit of dumber commands
        # (e.g. sdist, bdist_wininst, etc.)
        #
        self.distribution.metadata.version = self.egg_version

        # If we bootstrapped around the lack of a PKG-INFO, as might be the
        # case in a fresh checkout, make sure that any special tags get added
        # to the version info
        #
        pd = self.distribution._patched_dist
        if pd is not None and pd.key == self.egg_name.lower():
            pd._version = self.egg_version
            pd._parsed_version = parse_version(self.egg_version)
            self.distribution._patched_dist = None

    def write_or_delete_file(self, what, filename, data, force=False):
        """Write `data` to `filename` or delete if empty

        If `data` is non-empty, this routine is the same as ``write_file()``.
        If `data` is empty but not ``None``, this is the same as calling
        ``delete_file(filename)``.  If `data` is ``None``, then this is a no-op
        unless `filename` exists, in which case a warning is issued about the
        orphaned file (if `force` is false), or deleted (if `force` is true).
        """
        if data:
            self.write_file(what, filename, data)
        elif os.path.exists(filename):
            if data is None and not force:
                log.warn(
                    "%s not set in setup(), but %s exists", what, filename
                )
                return
            else:
                self.delete_file(filename)

    def write_file(self, what, filename, data):
        """Write `data` to `filename` (if not a dry run) after announcing it

        `what` is used in a log message to identify what is being written
        to the file.
        """
        log.info("writing %s to %s", what, filename)
        if PY3:
            data = data.encode("utf-8")
        if not self.dry_run:
            f = open(filename, 'wb')
            f.write(data)
            f.close()

    def delete_file(self, filename):
        """Delete `filename` (if not a dry run) after announcing it"""
        log.info("deleting %s", filename)
        if not self.dry_run:
            os.unlink(filename)

    def tagged_version(self):
        version = self.distribution.get_version()
        # egg_info may be called more than once for a distribution,
        # in which case the version string already contains all tags.
        if self.vtags and version.endswith(self.vtags):
            return safe_version(version)
        return safe_version(version + self.vtags)

    def run(self):
        self.mkpath(self.egg_info)
        installer = self.distribution.fetch_build_egg
        for ep in iter_entry_points('egg_info.writers'):
            ep.require(installer=installer)
            writer = ep.resolve()
            writer(self, ep.name, os.path.join(self.egg_info, ep.name))

        # Get rid of native_libs.txt if it was put there by older bdist_egg
        nl = os.path.join(self.egg_info, "native_libs.txt")
        if os.path.exists(nl):
            self.delete_file(nl)

        self.find_sources()

    def tags(self):
        version = ''
        if self.tag_build:
            version += self.tag_build
        if self.tag_svn_revision:
            rev = self.get_svn_revision()
            if rev:  # is 0 if it's not an svn working copy
                version += '-r%s' % rev
        if self.tag_date:
            import time

            version += time.strftime("-%Y%m%d")
        return version

    @staticmethod
    def get_svn_revision():
        if 'svn_utils' not in globals():
            return "0"
        return str(svn_utils.SvnInfo.load(os.curdir).get_revision())

    def find_sources(self):
        """Generate SOURCES.txt manifest file"""
        manifest_filename = os.path.join(self.egg_info, "SOURCES.txt")
        mm = manifest_maker(self.distribution)
        mm.manifest = manifest_filename
        mm.run()
        self.filelist = mm.filelist

    def check_broken_egg_info(self):
        bei = self.egg_name + '.egg-info'
        if self.egg_base != os.curdir:
            bei = os.path.join(self.egg_base, bei)
        if os.path.exists(bei):
            log.warn(
                "-" * 78 + '\n'
                "Note: Your current .egg-info directory has a '-' in its name;"
                '\nthis will not work correctly with "setup.py develop".\n\n'
                'Please rename %s to %s to correct this problem.\n' + '-' * 78,
                bei, self.egg_info
            )
            self.broken_egg_info = self.egg_info
            self.egg_info = bei  # make it work for now

class FileList(_FileList):
    """File list that accepts only existing, platform-independent paths"""

    def append(self, item):
        if item.endswith('\r'):  # Fix older sdists built on Windows
            item = item[:-1]
        path = convert_path(item)

        if self._safe_path(path):
            self.files.append(path)

    def extend(self, paths):
        self.files.extend(filter(self._safe_path, paths))

    def _repair(self):
        """
        Replace self.files with only safe paths

        Because some owners of FileList manipulate the underlying
        ``files`` attribute directly, this method must be called to
        repair those paths.
        """
        self.files = list(filter(self._safe_path, self.files))

    def _safe_path(self, path):
        enc_warn = "'%s' not %s encodable -- skipping"

        # To avoid accidental trans-coding errors, first decode to unicode
        u_path = unicode_utils.filesys_decode(path)
        if u_path is None:
            log.warn("'%s' in unexpected encoding -- skipping" % path)
            return False

        # Must ensure utf-8 encodability
        utf8_path = unicode_utils.try_encode(u_path, "utf-8")
        if utf8_path is None:
            log.warn(enc_warn, path, 'utf-8')
            return False

        try:
            # accept if either way checks out
            if os.path.exists(u_path) or os.path.exists(utf8_path):
                return True
        # this will catch any encode errors decoding u_path
        except UnicodeEncodeError:
            log.warn(enc_warn, path, sys.getfilesystemencoding())

class manifest_maker(sdist):
    template = "MANIFEST.in"

    def initialize_options(self):
        self.use_defaults = 1
        self.prune = 1
        self.manifest_only = 1
        self.force_manifest = 1

    def finalize_options(self):
        pass

    def run(self):
        self.filelist = FileList()
        if not os.path.exists(self.manifest):
            self.write_manifest()  # it must exist so it'll get in the list
        self.filelist.findall()
        self.add_defaults()
        if os.path.exists(self.template):
            self.read_template()
        self.prune_file_list()
        self.filelist.sort()
        self.filelist.remove_duplicates()
        self.write_manifest()

    def _manifest_normalize(self, path):
        path = unicode_utils.filesys_decode(path)
        return path.replace(os.sep, '/')

    def write_manifest(self):
        """
        Write the file list in 'self.filelist' to the manifest file
        named by 'self.manifest'.
        """
        self.filelist._repair()

        # _repair has ensured encodability; normalize each path for output
        files = [self._manifest_normalize(f) for f in self.filelist.files]
        msg = "writing manifest file '%s'" % self.manifest
        self.execute(write_file, (self.manifest, files), msg)

    def warn(self, msg):  # suppress missing-file warnings from sdist
        if not msg.startswith("standard file not found:"):
            sdist.warn(self, msg)

    def add_defaults(self):
        sdist.add_defaults(self)
        self.filelist.append(self.template)
        self.filelist.append(self.manifest)
        rcfiles = list(walk_revctrl())
        if rcfiles:
            self.filelist.extend(rcfiles)
        elif os.path.exists(self.manifest):
            self.read_manifest()
        ei_cmd = self.get_finalized_command('egg_info')
        self._add_egg_info(cmd=ei_cmd)
        self.filelist.include_pattern("*", prefix=ei_cmd.egg_info)

    def _add_egg_info(self, cmd):
        """
        Add paths for egg-info files for an external egg-base.

        The egg-info files are written to egg-base. If egg-base is
        outside the current working directory, this method
        searches the egg-base directory for files to include
        in the manifest. Uses distutils.filelist.findall (which is
        really the version monkeypatched in by setuptools/__init__.py)
        to perform the search.

        Since findall records relative paths, prefix the returned
        paths with cmd.egg_base, so add_default's include_pattern call
        (which is looking for the absolute cmd.egg_info) will match
        them.
        """
        if cmd.egg_base == os.curdir:
            # egg-info files were already added by something else
            return

        discovered = distutils.filelist.findall(cmd.egg_base)
        resolved = (os.path.join(cmd.egg_base, path) for path in discovered)
        self.filelist.allfiles.extend(resolved)

    def prune_file_list(self):
        build = self.get_finalized_command('build')
        base_dir = self.distribution.get_fullname()
        self.filelist.exclude_pattern(None, prefix=build.build_base)
        self.filelist.exclude_pattern(None, prefix=base_dir)
        sep = re.escape(os.sep)
        self.filelist.exclude_pattern(r'(^|' + sep + r')(RCS|CVS|\.svn)' + sep,
                                      is_regex=1)

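`_manifest_normalize` exists so that SOURCES.txt always uses forward slashes, whatever the build platform's `os.sep` is. A minimal sketch of that normalization, hard-coding the Windows separator for illustration (the real method substitutes `os.sep`, which is a no-op on POSIX):

```python
def manifest_normalize(path):
    # Convert a native path to the POSIX-style form used in a manifest.
    # Illustrative only: replaces backslashes explicitly, whereas the
    # original replaces os.sep so it works on any platform.
    return path.replace('\\', '/')
```

For example, `manifest_normalize('pkg\\sub\\mod.py')` yields `'pkg/sub/mod.py'`, while an already-POSIX path passes through unchanged.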
def write_file(filename, contents):
    """Create a file with the specified name and write 'contents' (a
    sequence of strings without line terminators) to it.
    """
    contents = "\n".join(contents)

    # assuming the contents has been vetted for utf-8 encoding
    contents = contents.encode("utf-8")

    with open(filename, "wb") as f:  # always write POSIX-style manifest
        f.write(contents)


def write_pkg_info(cmd, basename, filename):
    log.info("writing %s", filename)
    if not cmd.dry_run:
        metadata = cmd.distribution.metadata
        metadata.version, oldver = cmd.egg_version, metadata.version
        metadata.name, oldname = cmd.egg_name, metadata.name
        try:
            # write unescaped data to PKG-INFO, so older pkg_resources
            # can still parse it
            metadata.write_pkg_info(cmd.egg_info)
        finally:
            metadata.name, metadata.version = oldname, oldver

        safe = getattr(cmd.distribution, 'zip_safe', None)
        from setuptools.command import bdist_egg

        bdist_egg.write_safety_flag(cmd.egg_info, safe)


def warn_depends_obsolete(cmd, basename, filename):
    if os.path.exists(filename):
        log.warn(
            "WARNING: 'depends.txt' is not used by setuptools 0.6!\n"
            "Use the install_requires/extras_require setup() args instead."
        )


def _write_requirements(stream, reqs):
    lines = yield_lines(reqs or ())
    append_cr = lambda line: line + '\n'
    lines = map(append_cr, lines)
    stream.writelines(lines)


def write_requirements(cmd, basename, filename):
    dist = cmd.distribution
    data = StringIO()
    _write_requirements(data, dist.install_requires)
    extras_require = dist.extras_require or {}
    for extra in sorted(extras_require):
        data.write('\n[{extra}]\n'.format(**vars()))
        _write_requirements(data, extras_require[extra])
    cmd.write_or_delete_file("requirements", filename, data.getvalue())


def write_setup_requirements(cmd, basename, filename):
    data = StringIO()
    _write_requirements(data, cmd.distribution.setup_requires)
    cmd.write_or_delete_file("setup-requirements", filename, data.getvalue())


def write_toplevel_names(cmd, basename, filename):
    pkgs = dict.fromkeys(
        [
            k.split('.', 1)[0]
            for k in cmd.distribution.iter_distribution_names()
        ]
    )
    cmd.write_file("top-level names", filename, '\n'.join(sorted(pkgs)) + '\n')


def overwrite_arg(cmd, basename, filename):
    write_arg(cmd, basename, filename, True)


def write_arg(cmd, basename, filename, force=False):
    argname = os.path.splitext(basename)[0]
    value = getattr(cmd.distribution, argname, None)
    if value is not None:
        value = '\n'.join(value) + '\n'
    cmd.write_or_delete_file(argname, filename, value, force)


def write_entries(cmd, basename, filename):
    ep = cmd.distribution.entry_points

    if isinstance(ep, basestring) or ep is None:
        data = ep
    elif ep is not None:
        data = []
        for section, contents in sorted(ep.items()):
            if not isinstance(contents, basestring):
                contents = EntryPoint.parse_group(section, contents)
                contents = '\n'.join(sorted(map(str, contents.values())))
            data.append('[%s]\n%s\n\n' % (section, contents))
        data = ''.join(data)

    cmd.write_or_delete_file('entry points', filename, data, True)


def get_pkg_info_revision():
    # See if we can get a -r### off of PKG-INFO, in case this is an sdist of
    # a subversion revision
    #
    if os.path.exists('PKG-INFO'):
        f = open('PKG-INFO', 'rU')
        for line in f:
            match = re.match(r"Version:.*-r(\d+)\s*$", line)
            if match:
                f.close()  # close before the early return; don't leak the handle
                return int(match.group(1))
        f.close()
    return 0
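The version-tagging logic in `egg_info.tags()` is just string concatenation of optional suffixes. A simplified sketch of that behavior, with the svn lookup and `time.strftime` call replaced by plain parameters (the function name and parameters here are illustrative, not the command's actual API):

```python
def make_version_tag(tag_build=None, svn_revision=None, date_stamp=None):
    # Simplified sketch of egg_info.tags(): append each configured
    # suffix in order -- explicit build tag, then -r<rev>, then -<date>.
    version = ''
    if tag_build:
        version += tag_build
    if svn_revision:
        version += '-r%s' % svn_revision
    if date_stamp:
        version += '-%s' % date_stamp
    return version
```

So a distribution version `1.0` with `tag_build='dev'` and revision `123` becomes `1.0dev-r123` once `tagged_version()` appends the result.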
125
lib/python3.4/site-packages/setuptools/command/install.py
Normal file
@@ -0,0 +1,125 @@
from distutils.errors import DistutilsArgError
import inspect
import glob
import warnings
import platform
import distutils.command.install as orig

import setuptools

# Prior to numpy 1.9, NumPy relies on the '_install' name, so provide it for
# now. See https://bitbucket.org/pypa/setuptools/issue/199/
_install = orig.install


class install(orig.install):
    """Use easy_install to install the package, w/dependencies"""

    user_options = orig.install.user_options + [
        ('old-and-unmanageable', None, "Try not to use this!"),
        ('single-version-externally-managed', None,
         "used by system package builders to create 'flat' eggs"),
    ]
    boolean_options = orig.install.boolean_options + [
        'old-and-unmanageable', 'single-version-externally-managed',
    ]
    new_commands = [
        ('install_egg_info', lambda self: True),
        ('install_scripts', lambda self: True),
    ]
    _nc = dict(new_commands)

    def initialize_options(self):
        orig.install.initialize_options(self)
        self.old_and_unmanageable = None
        self.single_version_externally_managed = None

    def finalize_options(self):
        orig.install.finalize_options(self)
        if self.root:
            self.single_version_externally_managed = True
        elif self.single_version_externally_managed:
            if not self.root and not self.record:
                raise DistutilsArgError(
                    "You must specify --record or --root when building system"
                    " packages"
                )

    def handle_extra_path(self):
        if self.root or self.single_version_externally_managed:
            # explicit backward-compatibility mode, allow extra_path to work
            return orig.install.handle_extra_path(self)

        # Ignore extra_path when installing an egg (or being run by another
        # command without --root or --single-version-externally-managed)
        self.path_file = None
        self.extra_dirs = ''

    def run(self):
        # Explicit request for old-style install?  Just do it
        if self.old_and_unmanageable or self.single_version_externally_managed:
            return orig.install.run(self)

        if not self._called_from_setup(inspect.currentframe()):
            # Run in backward-compatibility mode to support bdist_* commands.
            orig.install.run(self)
        else:
            self.do_egg_install()

    @staticmethod
    def _called_from_setup(run_frame):
        """
        Attempt to detect whether run() was called from setup() or by another
        command.  If called by setup(), the parent caller will be the
        'run_command' method in 'distutils.dist', and *its* caller will be
        the 'run_commands' method.  If called any other way, the
        immediate caller *might* be 'run_command', but it won't have been
        called by 'run_commands'.  Return True in that case or if a call stack
        is unavailable.  Return False otherwise.
        """
        if run_frame is None:
            msg = "Call stack not available. bdist_* commands may fail."
            warnings.warn(msg)
            if platform.python_implementation() == 'IronPython':
                msg = "For best results, pass -X:Frames to enable call stack."
                warnings.warn(msg)
            return True
        res = inspect.getouterframes(run_frame)[2]
        caller, = res[:1]
        info = inspect.getframeinfo(caller)
        caller_module = caller.f_globals.get('__name__', '')
        return (
            caller_module == 'distutils.dist'
            and info.function == 'run_commands'
        )

    def do_egg_install(self):

        easy_install = self.distribution.get_command_class('easy_install')

        cmd = easy_install(
            self.distribution, args="x", root=self.root, record=self.record,
        )
        cmd.ensure_finalized()  # finalize before bdist_egg munges install cmd
        cmd.always_copy_from = '.'  # make sure local-dir eggs get installed

        # pick up setup-dir .egg files only: no .egg-info
        cmd.package_index.scan(glob.glob('*.egg'))

        self.run_command('bdist_egg')
        args = [self.distribution.get_command_obj('bdist_egg').egg_output]

        if setuptools.bootstrap_install_from:
            # Bootstrap self-installation of setuptools
            args.insert(0, setuptools.bootstrap_install_from)

        cmd.args = args
        cmd.run()
        setuptools.bootstrap_install_from = None


# XXX Python 3.1 doesn't see _nc if this is inside the class
install.sub_commands = (
    [cmd for cmd in orig.install.sub_commands if cmd[0] not in install._nc] +
    install.new_commands
)
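`_called_from_setup` works by walking up the call stack with `inspect.getouterframes` and checking who the caller is. A minimal, self-contained sketch of that frame-walking idea (the helper names here are hypothetical, not from this file):

```python
import inspect

def who_called_me():
    # Look one frame up the call stack and report the calling
    # function's name -- the same mechanism _called_from_setup uses
    # to check for distutils.dist's run_commands.
    caller = inspect.getouterframes(inspect.currentframe())[1]
    return caller.function

def outer():
    return who_called_me()
```

Calling `outer()` returns the string `'outer'`, because that is the frame directly above `who_called_me` on the stack. The real method goes two frames up and also compares the caller's `__name__` global, since a function name alone is not unique.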
@@ -0,0 +1,116 @@
from distutils import log, dir_util
import os

from setuptools import Command
from setuptools.archive_util import unpack_archive
import pkg_resources


class install_egg_info(Command):
    """Install an .egg-info directory for the package"""

    description = "Install an .egg-info directory for the package"

    user_options = [
        ('install-dir=', 'd', "directory to install to"),
    ]

    def initialize_options(self):
        self.install_dir = None

    def finalize_options(self):
        self.set_undefined_options('install_lib',
                                   ('install_dir', 'install_dir'))
        ei_cmd = self.get_finalized_command("egg_info")
        basename = pkg_resources.Distribution(
            None, None, ei_cmd.egg_name, ei_cmd.egg_version
        ).egg_name() + '.egg-info'
        self.source = ei_cmd.egg_info
        self.target = os.path.join(self.install_dir, basename)
        self.outputs = [self.target]

    def run(self):
        self.run_command('egg_info')
        if os.path.isdir(self.target) and not os.path.islink(self.target):
            dir_util.remove_tree(self.target, dry_run=self.dry_run)
        elif os.path.exists(self.target):
            self.execute(os.unlink, (self.target,), "Removing " + self.target)
        if not self.dry_run:
            pkg_resources.ensure_directory(self.target)
        self.execute(
            self.copytree, (), "Copying %s to %s" % (self.source, self.target)
        )
        self.install_namespaces()

    def get_outputs(self):
        return self.outputs

    def copytree(self):
        # Copy the .egg-info tree to site-packages
        def skimmer(src, dst):
            # filter out source-control directories; note that 'src' is always
            # a '/'-separated path, regardless of platform.  'dst' is a
            # platform-specific path.
            for skip in '.svn/', 'CVS/':
                if src.startswith(skip) or '/' + skip in src:
                    return None
            self.outputs.append(dst)
            log.debug("Copying %s to %s", src, dst)
            return dst

        unpack_archive(self.source, self.target, skimmer)

    def install_namespaces(self):
        nsp = self._get_all_ns_packages()
        if not nsp:
            return
        filename, ext = os.path.splitext(self.target)
        filename += '-nspkg.pth'
        self.outputs.append(filename)
        log.info("Installing %s", filename)
        lines = map(self._gen_nspkg_line, nsp)

        if self.dry_run:
            # always generate the lines, even in dry run
            list(lines)
            return

        with open(filename, 'wt') as f:
            f.writelines(lines)

    _nspkg_tmpl = (
        "import sys, types, os",
        "p = os.path.join(sys._getframe(1).f_locals['sitedir'], *%(pth)r)",
        "ie = os.path.exists(os.path.join(p, '__init__.py'))",
        "m = not ie and "
        "sys.modules.setdefault(%(pkg)r, types.ModuleType(%(pkg)r))",
        "mp = (m or []) and m.__dict__.setdefault('__path__', [])",
        "(p not in mp) and mp.append(p)",
    )
    "lines for the namespace installer"

    _nspkg_tmpl_multi = (
        'm and setattr(sys.modules[%(parent)r], %(child)r, m)',
    )
    "additional line(s) when a parent package is indicated"

    @classmethod
    def _gen_nspkg_line(cls, pkg):
        # ensure pkg is not a unicode string under Python 2.7
        pkg = str(pkg)
        pth = tuple(pkg.split('.'))
        tmpl_lines = cls._nspkg_tmpl
        parent, sep, child = pkg.rpartition('.')
        if parent:
            tmpl_lines += cls._nspkg_tmpl_multi
        return ';'.join(tmpl_lines) % locals() + '\n'

    def _get_all_ns_packages(self):
        """Return sorted list of all package namespaces"""
        nsp = set()
        for pkg in self.distribution.namespace_packages or []:
            pkg = pkg.split('.')
            while pkg:
                nsp.add('.'.join(pkg))
                pkg.pop()
        return sorted(nsp)
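`_get_all_ns_packages` expands each declared namespace package into itself plus every parent, so that `a.b.c` also registers `a.b` and `a`. A standalone sketch of that expansion (the free function form here is illustrative; the original is a method reading `self.distribution.namespace_packages`):

```python
def all_ns_packages(namespace_packages):
    # Expand each dotted name into the full chain of parent namespaces,
    # mirroring install_egg_info._get_all_ns_packages.
    nsp = set()
    for pkg in namespace_packages:
        parts = pkg.split('.')
        while parts:
            nsp.add('.'.join(parts))
            parts.pop()
    return sorted(nsp)
```

For example, `all_ns_packages(['a.b.c'])` returns `['a', 'a.b', 'a.b.c']`, and each entry then gets its own `-nspkg.pth` line from `_gen_nspkg_line`.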
118
lib/python3.4/site-packages/setuptools/command/install_lib.py
Normal file
@@ -0,0 +1,118 @@
||||||
|
import os
import imp
from itertools import product, starmap
import distutils.command.install_lib as orig


class install_lib(orig.install_lib):
    """Don't add compiled flags to filenames of non-Python files"""

    def run(self):
        self.build()
        outfiles = self.install()
        if outfiles is not None:
            # always compile, in case we have any extension stubs to deal with
            self.byte_compile(outfiles)

    def get_exclusions(self):
        """
        Return a collections.Sized collections.Container of paths to be
        excluded for single_version_externally_managed installations.
        """
        all_packages = (
            pkg
            for ns_pkg in self._get_SVEM_NSPs()
            for pkg in self._all_packages(ns_pkg)
        )

        excl_specs = product(all_packages, self._gen_exclusion_paths())
        return set(starmap(self._exclude_pkg_path, excl_specs))

    def _exclude_pkg_path(self, pkg, exclusion_path):
        """
        Given a package name and exclusion path within that package,
        compute the full exclusion path.
        """
        parts = pkg.split('.') + [exclusion_path]
        return os.path.join(self.install_dir, *parts)

    @staticmethod
    def _all_packages(pkg_name):
        """
        >>> list(install_lib._all_packages('foo.bar.baz'))
        ['foo.bar.baz', 'foo.bar', 'foo']
        """
        while pkg_name:
            yield pkg_name
            pkg_name, sep, child = pkg_name.rpartition('.')

    def _get_SVEM_NSPs(self):
        """
        Get namespace packages (list) but only for
        single_version_externally_managed installations and empty otherwise.
        """
        # TODO: is it necessary to short-circuit here? i.e. what's the cost
        # if get_finalized_command is called even when namespace_packages is
        # False?
        if not self.distribution.namespace_packages:
            return []

        install_cmd = self.get_finalized_command('install')
        svem = install_cmd.single_version_externally_managed

        return self.distribution.namespace_packages if svem else []

    @staticmethod
    def _gen_exclusion_paths():
        """
        Generate file paths to be excluded for namespace packages (bytecode
        cache files).
        """
        # always exclude the package module itself
        yield '__init__.py'

        yield '__init__.pyc'
        yield '__init__.pyo'

        if not hasattr(imp, 'get_tag'):
            return

        base = os.path.join('__pycache__', '__init__.' + imp.get_tag())
        yield base + '.pyc'
        yield base + '.pyo'

    def copy_tree(
            self, infile, outfile,
            preserve_mode=1, preserve_times=1, preserve_symlinks=0, level=1
    ):
        assert preserve_mode and preserve_times and not preserve_symlinks
        exclude = self.get_exclusions()

        if not exclude:
            return orig.install_lib.copy_tree(self, infile, outfile)

        # Exclude namespace package __init__.py* files from the output

        from setuptools.archive_util import unpack_directory
        from distutils import log

        outfiles = []

        def pf(src, dst):
            if dst in exclude:
                log.warn("Skipping installation of %s (namespace package)",
                         dst)
                return False

            log.info("copying %s -> %s", src, os.path.dirname(dst))
            outfiles.append(dst)
            return dst

        unpack_directory(infile, outfile, pf)
        return outfiles

    def get_outputs(self):
        outputs = orig.install_lib.get_outputs(self)
        exclude = self.get_exclusions()
        if exclude:
            return [f for f in outputs if f not in exclude]
        return outputs
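`get_exclusions` above is a cross product: every candidate namespace package is crossed with every bytecode exclusion path, and each pair is joined under the install directory. A minimal sketch of that combination (function names are illustrative):

```python
import os
from itertools import product, starmap


def exclusions(install_dir, packages, exclusion_paths):
    """Full paths to skip at install time: one per (package, exclusion) pair."""
    def full_path(pkg, excl):
        # 'foo.bar' + '__init__.py' -> <install_dir>/foo/bar/__init__.py
        return os.path.join(install_dir, *(pkg.split('.') + [excl]))
    return set(starmap(full_path, product(packages, exclusion_paths)))
```

`product` pairs every package with every exclusion path, and `starmap` unpacks each pair into the join, mirroring the `excl_specs` line in the command.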
60
lib/python3.4/site-packages/setuptools/command/install_scripts.py
Normal file

@@ -0,0 +1,60 @@
from distutils import log
import distutils.command.install_scripts as orig
import os

from pkg_resources import Distribution, PathMetadata, ensure_directory


class install_scripts(orig.install_scripts):
    """Do normal script install, plus any egg_info wrapper scripts"""

    def initialize_options(self):
        orig.install_scripts.initialize_options(self)
        self.no_ep = False

    def run(self):
        import setuptools.command.easy_install as ei

        self.run_command("egg_info")
        if self.distribution.scripts:
            orig.install_scripts.run(self)  # run first to set up self.outfiles
        else:
            self.outfiles = []
        if self.no_ep:
            # don't install entry point scripts into .egg file!
            return

        ei_cmd = self.get_finalized_command("egg_info")
        dist = Distribution(
            ei_cmd.egg_base, PathMetadata(ei_cmd.egg_base, ei_cmd.egg_info),
            ei_cmd.egg_name, ei_cmd.egg_version,
        )
        bs_cmd = self.get_finalized_command('build_scripts')
        exec_param = getattr(bs_cmd, 'executable', None)
        bw_cmd = self.get_finalized_command("bdist_wininst")
        is_wininst = getattr(bw_cmd, '_is_running', False)
        writer = ei.ScriptWriter
        if is_wininst:
            exec_param = "python.exe"
            writer = ei.WindowsScriptWriter
        # resolve the writer to the environment
        writer = writer.best()
        cmd = writer.command_spec_class.best().from_param(exec_param)
        for args in writer.get_args(dist, cmd.as_header()):
            self.write_script(*args)

    def write_script(self, script_name, contents, mode="t", *ignored):
        """Write an executable file to the scripts directory"""
        from setuptools.command.easy_install import chmod, current_umask

        log.info("Installing %s script to %s", script_name, self.install_dir)
        target = os.path.join(self.install_dir, script_name)
        self.outfiles.append(target)

        mask = current_umask()
        if not self.dry_run:
            ensure_directory(target)
            f = open(target, "w" + mode)
            f.write(contents)
            f.close()
            chmod(target, 0o777 - mask)
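`write_script` derives the script's permission bits from the process umask: start from `0o777` and subtract the masked bits. Because `0o777` is all-ones across the nine permission bits, `0o777 - mask` is the same as `0o777 & ~mask`. A quick check of that arithmetic:

```python
def script_mode(mask):
    """Mode a freshly written script receives, given the current umask."""
    # subtraction works here because every umask bit is also set in 0o777
    return 0o777 - mask


# the common umask 0o022 yields the familiar world-executable 0o755
```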
15
lib/python3.4/site-packages/setuptools/command/launcher manifest.xml
Normal file

@@ -0,0 +1,15 @@
<?xml version="1.0" encoding="UTF-8" standalone="yes"?>
<assembly xmlns="urn:schemas-microsoft-com:asm.v1" manifestVersion="1.0">
    <assemblyIdentity version="1.0.0.0"
                      processorArchitecture="X86"
                      name="%(name)s"
                      type="win32"/>
    <!-- Identify the application security requirements. -->
    <trustInfo xmlns="urn:schemas-microsoft-com:asm.v3">
        <security>
            <requestedPrivileges>
                <requestedExecutionLevel level="asInvoker" uiAccess="false"/>
            </requestedPrivileges>
        </security>
    </trustInfo>
</assembly>
10
lib/python3.4/site-packages/setuptools/command/register.py
Normal file

@@ -0,0 +1,10 @@
import distutils.command.register as orig


class register(orig.register):
    __doc__ = orig.register.__doc__

    def run(self):
        # Make sure that we are using valid current name/version info
        self.run_command('egg_info')
        orig.register.run(self)
61
lib/python3.4/site-packages/setuptools/command/rotate.py
Normal file

@@ -0,0 +1,61 @@
from distutils.util import convert_path
from distutils import log
from distutils.errors import DistutilsOptionError
import os

from setuptools import Command
from setuptools.compat import basestring


class rotate(Command):
    """Delete older distributions"""

    description = "delete older distributions, keeping N newest files"
    user_options = [
        ('match=', 'm', "patterns to match (required)"),
        ('dist-dir=', 'd', "directory where the distributions are"),
        ('keep=', 'k', "number of matching distributions to keep"),
    ]

    boolean_options = []

    def initialize_options(self):
        self.match = None
        self.dist_dir = None
        self.keep = None

    def finalize_options(self):
        if self.match is None:
            raise DistutilsOptionError(
                "Must specify one or more (comma-separated) match patterns "
                "(e.g. '.zip' or '.egg')"
            )
        if self.keep is None:
            raise DistutilsOptionError("Must specify number of files to keep")
        try:
            self.keep = int(self.keep)
        except ValueError:
            raise DistutilsOptionError("--keep must be an integer")
        if isinstance(self.match, basestring):
            self.match = [
                convert_path(p.strip()) for p in self.match.split(',')
            ]
        self.set_undefined_options('bdist', ('dist_dir', 'dist_dir'))

    def run(self):
        self.run_command("egg_info")
        from glob import glob

        for pattern in self.match:
            pattern = self.distribution.get_name() + '*' + pattern
            files = glob(os.path.join(self.dist_dir, pattern))
            files = [(os.path.getmtime(f), f) for f in files]
            files.sort()
            files.reverse()

            log.info("%d file(s) matching %s", len(files), pattern)
            files = files[self.keep:]
            for (t, f) in files:
                log.info("Deleting %s", f)
                if not self.dry_run:
                    os.unlink(f)
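The deletion logic above sorts matches newest-first by modification time and removes everything past the first `keep` entries. The selection step in isolation, as a pure function over `(mtime, name)` pairs (no filesystem involved):

```python
def to_delete(mtime_name_pairs, keep):
    """Return names of all but the `keep` newest (mtime, name) entries."""
    ordered = sorted(mtime_name_pairs, reverse=True)  # newest first
    return [name for _, name in ordered[keep:]]
```

Slicing with `[keep:]` handles the edge cases for free: `keep=0` deletes everything, and a `keep` larger than the list deletes nothing.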
22
lib/python3.4/site-packages/setuptools/command/saveopts.py
Normal file

@@ -0,0 +1,22 @@
from setuptools.command.setopt import edit_config, option_base


class saveopts(option_base):
    """Save command-line options to a file"""

    description = "save supplied options to setup.cfg or other config file"

    def run(self):
        dist = self.distribution
        settings = {}

        for cmd in dist.command_options:

            if cmd == 'saveopts':
                continue  # don't save our own options!

            for opt, (src, val) in dist.get_option_dict(cmd).items():
                if src == "command line":
                    settings.setdefault(cmd, {})[opt] = val

        edit_config(self.filename, settings, self.dry_run)
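`saveopts` walks every command's option dict and keeps only entries whose recorded source is the command line. The filtering step alone, over distutils-style `{option: (source, value)}` dicts (the shape `get_option_dict` returns):

```python
def command_line_settings(command_options):
    """Keep only options that came from the command line, grouped per command."""
    settings = {}
    for cmd, opts in command_options.items():
        if cmd == 'saveopts':
            continue  # never persist saveopts' own options
        for opt, (src, val) in opts.items():
            if src == "command line":
                settings.setdefault(cmd, {})[opt] = val
    return settings
```

The resulting nested dict is exactly the `settings` structure `edit_config` consumes.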
197
lib/python3.4/site-packages/setuptools/command/sdist.py
Normal file

@@ -0,0 +1,197 @@
from glob import glob
from distutils import log
import distutils.command.sdist as orig
import os
import sys

from setuptools.compat import PY3
from setuptools.utils import cs_path_exists

import pkg_resources

READMES = 'README', 'README.rst', 'README.txt'

_default_revctrl = list


def walk_revctrl(dirname=''):
    """Find all files under revision control"""
    for ep in pkg_resources.iter_entry_points('setuptools.file_finders'):
        for item in ep.load()(dirname):
            yield item


class sdist(orig.sdist):
    """Smart sdist that finds anything supported by revision control"""

    user_options = [
        ('formats=', None,
         "formats for source distribution (comma-separated list)"),
        ('keep-temp', 'k',
         "keep the distribution tree around after creating " +
         "archive file(s)"),
        ('dist-dir=', 'd',
         "directory to put the source distribution archive(s) in "
         "[default: dist]"),
    ]

    negative_opt = {}

    def run(self):
        self.run_command('egg_info')
        ei_cmd = self.get_finalized_command('egg_info')
        self.filelist = ei_cmd.filelist
        self.filelist.append(os.path.join(ei_cmd.egg_info, 'SOURCES.txt'))
        self.check_readme()

        # Run sub commands
        for cmd_name in self.get_sub_commands():
            self.run_command(cmd_name)

        # Call check_metadata only if no 'check' command
        # (distutils <= 2.6)
        import distutils.command

        if 'check' not in distutils.command.__all__:
            self.check_metadata()

        self.make_distribution()

        dist_files = getattr(self.distribution, 'dist_files', [])
        for file in self.archive_files:
            data = ('sdist', '', file)
            if data not in dist_files:
                dist_files.append(data)

    def __read_template_hack(self):
        # This grody hack closes the template file (MANIFEST.in) if an
        # exception occurs during read_template.
        # Doing so prevents an error when easy_install attempts to delete the
        # file.
        try:
            orig.sdist.read_template(self)
        except:
            _, _, tb = sys.exc_info()
            tb.tb_next.tb_frame.f_locals['template'].close()
            raise

    # Beginning with Python 2.7.2, 3.1.4, and 3.2.1, this leaky file handle
    # has been fixed, so only override the method if we're using an earlier
    # Python.
    has_leaky_handle = (
        sys.version_info < (2, 7, 2)
        or (3, 0) <= sys.version_info < (3, 1, 4)
        or (3, 2) <= sys.version_info < (3, 2, 1)
    )
    if has_leaky_handle:
        read_template = __read_template_hack

    def add_defaults(self):
        standards = [READMES,
                     self.distribution.script_name]
        for fn in standards:
            if isinstance(fn, tuple):
                alts = fn
                got_it = 0
                for fn in alts:
                    if cs_path_exists(fn):
                        got_it = 1
                        self.filelist.append(fn)
                        break

                if not got_it:
                    self.warn("standard file not found: should have one of " +
                              ', '.join(alts))
            else:
                if cs_path_exists(fn):
                    self.filelist.append(fn)
                else:
                    self.warn("standard file '%s' not found" % fn)

        optional = ['test/test*.py', 'setup.cfg']
        for pattern in optional:
            files = list(filter(cs_path_exists, glob(pattern)))
            if files:
                self.filelist.extend(files)

        # getting python files
        if self.distribution.has_pure_modules():
            build_py = self.get_finalized_command('build_py')
            self.filelist.extend(build_py.get_source_files())
            # This functionality is incompatible with include_package_data, and
            # will in fact create an infinite recursion if include_package_data
            # is True. Use of include_package_data will imply that
            # distutils-style automatic handling of package_data is disabled
            if not self.distribution.include_package_data:
                for _, src_dir, _, filenames in build_py.data_files:
                    self.filelist.extend([os.path.join(src_dir, filename)
                                          for filename in filenames])

        if self.distribution.has_ext_modules():
            build_ext = self.get_finalized_command('build_ext')
            self.filelist.extend(build_ext.get_source_files())

        if self.distribution.has_c_libraries():
            build_clib = self.get_finalized_command('build_clib')
            self.filelist.extend(build_clib.get_source_files())

        if self.distribution.has_scripts():
            build_scripts = self.get_finalized_command('build_scripts')
            self.filelist.extend(build_scripts.get_source_files())

    def check_readme(self):
        for f in READMES:
            if os.path.exists(f):
                return
        else:
            self.warn(
                "standard file not found: should have one of " +
                ', '.join(READMES)
            )

    def make_release_tree(self, base_dir, files):
        orig.sdist.make_release_tree(self, base_dir, files)

        # Save any egg_info command line options used to create this sdist
        dest = os.path.join(base_dir, 'setup.cfg')
        if hasattr(os, 'link') and os.path.exists(dest):
            # unlink and re-copy, since it might be hard-linked, and
            # we don't want to change the source version
            os.unlink(dest)
            self.copy_file('setup.cfg', dest)

        self.get_finalized_command('egg_info').save_version_info(dest)

    def _manifest_is_not_generated(self):
        # check for special comment used in 2.7.1 and higher
        if not os.path.isfile(self.manifest):
            return False

        fp = open(self.manifest, 'rbU')
        try:
            first_line = fp.readline()
        finally:
            fp.close()
        return (first_line !=
                '# file GENERATED by distutils, do NOT edit\n'.encode())

    def read_manifest(self):
        """Read the manifest file (named by 'self.manifest') and use it to
        fill in 'self.filelist', the list of files to include in the source
        distribution.
        """
        log.info("reading manifest file '%s'", self.manifest)
        manifest = open(self.manifest, 'rbU')
        for line in manifest:
            # The manifest must contain UTF-8. See #303.
            if PY3:
                try:
                    line = line.decode('UTF-8')
                except UnicodeDecodeError:
                    log.warn("%r not UTF-8 decodable -- skipping" % line)
                    continue
            # ignore comments and blank lines
            line = line.strip()
            if line.startswith('#') or not line:
                continue
            self.filelist.append(line)
        manifest.close()
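`read_manifest` above decodes each manifest line as UTF-8 and skips undecodable, comment, and blank lines. That per-line filter can be sketched on its own (bytes in, filenames out; the real command logs a warning where this sketch silently skips):

```python
def manifest_lines(raw_lines):
    """Yield usable manifest entries from an iterable of raw byte lines."""
    for raw in raw_lines:
        try:
            line = raw.decode('utf-8')
        except UnicodeDecodeError:
            continue  # read_manifest logs a warning here instead
        line = line.strip()
        if line.startswith('#') or not line:
            continue  # comments and blank lines carry no entries
        yield line
```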
150
lib/python3.4/site-packages/setuptools/command/setopt.py
Normal file

@@ -0,0 +1,150 @@
from distutils.util import convert_path
from distutils import log
from distutils.errors import DistutilsOptionError
import distutils
import os

from setuptools import Command


__all__ = ['config_file', 'edit_config', 'option_base', 'setopt']


def config_file(kind="local"):
    """Get the filename of the distutils, local, global, or per-user config

    `kind` must be one of "local", "global", or "user"
    """
    if kind == 'local':
        return 'setup.cfg'
    if kind == 'global':
        return os.path.join(
            os.path.dirname(distutils.__file__), 'distutils.cfg'
        )
    if kind == 'user':
        dot = os.name == 'posix' and '.' or ''
        return os.path.expanduser(convert_path("~/%spydistutils.cfg" % dot))
    raise ValueError(
        "config_file() type must be 'local', 'global', or 'user'", kind
    )


def edit_config(filename, settings, dry_run=False):
    """Edit a configuration file to include `settings`

    `settings` is a dictionary of dictionaries or ``None`` values, keyed by
    command/section name. A ``None`` value means to delete the entire section,
    while a dictionary lists settings to be changed or deleted in that section.
    A setting of ``None`` means to delete that setting.
    """
    from setuptools.compat import ConfigParser

    log.debug("Reading configuration from %s", filename)
    opts = ConfigParser.RawConfigParser()
    opts.read([filename])
    for section, options in settings.items():
        if options is None:
            log.info("Deleting section [%s] from %s", section, filename)
            opts.remove_section(section)
        else:
            if not opts.has_section(section):
                log.debug("Adding new section [%s] to %s", section, filename)
                opts.add_section(section)
            for option, value in options.items():
                if value is None:
                    log.debug(
                        "Deleting %s.%s from %s",
                        section, option, filename
                    )
                    opts.remove_option(section, option)
                    if not opts.options(section):
                        log.info("Deleting empty [%s] section from %s",
                                 section, filename)
                        opts.remove_section(section)
                else:
                    log.debug(
                        "Setting %s.%s to %r in %s",
                        section, option, value, filename
                    )
                    opts.set(section, option, value)

    log.info("Writing %s", filename)
    if not dry_run:
        with open(filename, 'w') as f:
            opts.write(f)


class option_base(Command):
    """Abstract base class for commands that mess with config files"""

    user_options = [
        ('global-config', 'g',
         "save options to the site-wide distutils.cfg file"),
        ('user-config', 'u',
         "save options to the current user's pydistutils.cfg file"),
        ('filename=', 'f',
         "configuration file to use (default=setup.cfg)"),
    ]

    boolean_options = [
        'global-config', 'user-config',
    ]

    def initialize_options(self):
        self.global_config = None
        self.user_config = None
        self.filename = None

    def finalize_options(self):
        filenames = []
        if self.global_config:
            filenames.append(config_file('global'))
        if self.user_config:
            filenames.append(config_file('user'))
        if self.filename is not None:
            filenames.append(self.filename)
        if not filenames:
            filenames.append(config_file('local'))
        if len(filenames) > 1:
            raise DistutilsOptionError(
                "Must specify only one configuration file option",
                filenames
            )
        self.filename, = filenames


class setopt(option_base):
    """Save command-line options to a file"""

    description = "set an option in setup.cfg or another config file"

    user_options = [
        ('command=', 'c', 'command to set an option for'),
        ('option=', 'o', 'option to set'),
        ('set-value=', 's', 'value of the option'),
        ('remove', 'r', 'remove (unset) the value'),
    ] + option_base.user_options

    boolean_options = option_base.boolean_options + ['remove']

    def initialize_options(self):
        option_base.initialize_options(self)
        self.command = None
        self.option = None
        self.set_value = None
        self.remove = None

    def finalize_options(self):
        option_base.finalize_options(self)
        if self.command is None or self.option is None:
            raise DistutilsOptionError("Must specify --command *and* --option")
        if self.set_value is None and not self.remove:
            raise DistutilsOptionError("Must specify --set-value or --remove")

    def run(self):
        edit_config(
            self.filename, {
                self.command: {self.option.replace('-', '_'): self.set_value}
            },
            self.dry_run
        )
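`edit_config` treats its `settings` argument as a nested delete/update spec: a `None` section removes the whole section, and a `None` value removes a single option (pruning the section if it empties). A minimal in-memory sketch of those semantics using the stdlib parser, with the logging and file I/O stripped out:

```python
import configparser


def apply_settings(opts, settings):
    """Apply edit_config-style {section: {option: value-or-None} or None} specs."""
    for section, options in settings.items():
        if options is None:
            opts.remove_section(section)  # None section -> drop it entirely
            continue
        if not opts.has_section(section):
            opts.add_section(section)
        for option, value in options.items():
            if value is None:
                opts.remove_option(section, option)
                if not opts.options(section):
                    opts.remove_section(section)  # prune emptied section
            else:
                opts.set(section, option, value)
    return opts
```

This mirrors the loop body of `edit_config`; the real function additionally reads the file first and writes it back unless `dry_run` is set.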
183
lib/python3.4/site-packages/setuptools/command/test.py
Normal file

@@ -0,0 +1,183 @@
from distutils.errors import DistutilsOptionError
from unittest import TestLoader
import sys

from pkg_resources import (resource_listdir, resource_exists, normalize_path,
                           working_set, _namespace_packages,
                           add_activation_listener, require, EntryPoint)
from setuptools import Command
from setuptools.compat import PY3
from setuptools.py31compat import unittest_main


class ScanningLoader(TestLoader):
    def loadTestsFromModule(self, module, pattern=None):
        """Return a suite of all tests cases contained in the given module

        If the module is a package, load tests from all the modules in it.
        If the module has an ``additional_tests`` function, call it and add
        the return value to the tests.
        """
        tests = []
        tests.append(TestLoader.loadTestsFromModule(self, module))

        if hasattr(module, "additional_tests"):
            tests.append(module.additional_tests())

        if hasattr(module, '__path__'):
            for file in resource_listdir(module.__name__, ''):
                if file.endswith('.py') and file != '__init__.py':
                    submodule = module.__name__ + '.' + file[:-3]
                else:
                    if resource_exists(module.__name__, file + '/__init__.py'):
                        submodule = module.__name__ + '.' + file
                    else:
                        continue
                tests.append(self.loadTestsFromName(submodule))

        if len(tests) != 1:
            return self.suiteClass(tests)
        else:
            return tests[0]  # don't create a nested suite for only one return


class test(Command):
    """Command to run unit tests after in-place build"""

    description = "run unit tests after in-place build"

    user_options = [
        ('test-module=', 'm', "Run 'test_suite' in specified module"),
        ('test-suite=', 's',
         "Test suite to run (e.g. 'some_module.test_suite')"),
        ('test-runner=', 'r', "Test runner to use"),
    ]

    def initialize_options(self):
        self.test_suite = None
        self.test_module = None
        self.test_loader = None
        self.test_runner = None

    def finalize_options(self):

        if self.test_suite and self.test_module:
            msg = "You may specify a module or a suite, but not both"
            raise DistutilsOptionError(msg)

        if self.test_suite is None:
            if self.test_module is None:
                self.test_suite = self.distribution.test_suite
            else:
                self.test_suite = self.test_module + ".test_suite"

        if self.test_loader is None:
            self.test_loader = getattr(self.distribution, 'test_loader', None)
        if self.test_loader is None:
            self.test_loader = "setuptools.command.test:ScanningLoader"
        if self.test_runner is None:
            self.test_runner = getattr(self.distribution, 'test_runner', None)

    @property
    def test_args(self):
        return list(self._test_args())

    def _test_args(self):
        if self.verbose:
            yield '--verbose'
        if self.test_suite:
            yield self.test_suite

    def with_project_on_sys_path(self, func):
        with_2to3 = PY3 and getattr(self.distribution, 'use_2to3', False)

        if with_2to3:
            # If we run 2to3 we can not do this inplace:

            # Ensure metadata is up-to-date
            self.reinitialize_command('build_py', inplace=0)
            self.run_command('build_py')
            bpy_cmd = self.get_finalized_command("build_py")
            build_path = normalize_path(bpy_cmd.build_lib)

            # Build extensions
            self.reinitialize_command('egg_info', egg_base=build_path)
            self.run_command('egg_info')

            self.reinitialize_command('build_ext', inplace=0)
            self.run_command('build_ext')
        else:
            # Without 2to3 inplace works fine:
            self.run_command('egg_info')

            # Build extensions in-place
            self.reinitialize_command('build_ext', inplace=1)
            self.run_command('build_ext')

        ei_cmd = self.get_finalized_command("egg_info")

        old_path = sys.path[:]
        old_modules = sys.modules.copy()

        try:
            sys.path.insert(0, normalize_path(ei_cmd.egg_base))
            working_set.__init__()
            add_activation_listener(lambda dist: dist.activate())
            require('%s==%s' % (ei_cmd.egg_name, ei_cmd.egg_version))
            func()
        finally:
            sys.path[:] = old_path
            sys.modules.clear()
            sys.modules.update(old_modules)
            working_set.__init__()

    def run(self):
        if self.distribution.install_requires:
            self.distribution.fetch_build_eggs(
                self.distribution.install_requires)
        if self.distribution.tests_require:
            self.distribution.fetch_build_eggs(self.distribution.tests_require)

        cmd = ' '.join(self._argv)
        if self.dry_run:
            self.announce('skipping "%s" (dry run)' % cmd)
        else:
            self.announce('running "%s"' % cmd)
            self.with_project_on_sys_path(self.run_tests)

    def run_tests(self):
        # Purge modules under test from sys.modules. The test loader will
        # re-import them from the build location. Required when 2to3 is used
        # with namespace packages.
        if PY3 and getattr(self.distribution, 'use_2to3', False):
            module = self.test_suite.split('.')[0]
            if module in _namespace_packages:
                del_modules = []
                if module in sys.modules:
                    del_modules.append(module)
                module += '.'
                for name in sys.modules:
                    if name.startswith(module):
                        del_modules.append(name)
                list(map(sys.modules.__delitem__, del_modules))

        unittest_main(
            None, None, self._argv,
            testLoader=self._resolve_as_ep(self.test_loader),
            testRunner=self._resolve_as_ep(self.test_runner),
        )

    @property
    def _argv(self):
        return ['unittest'] + self.test_args

    @staticmethod
    def _resolve_as_ep(val):
        """
        Load the indicated attribute value, called, as if it were
        specified as an entry point.
        """
        if val is None:
            return
        parsed = EntryPoint.parse("x=" + val)
        return parsed.resolve()()
190
lib/python3.4/site-packages/setuptools/command/upload_docs.py
Normal file

@@ -0,0 +1,190 @@
# -*- coding: utf-8 -*-
"""upload_docs

Implements a Distutils 'upload_docs' subcommand (upload documentation to
PyPI's pythonhosted.org).
"""

from base64 import standard_b64encode
from distutils import log
from distutils.errors import DistutilsOptionError
from distutils.command.upload import upload
import os
import socket
import zipfile
import tempfile
import sys
import shutil

from setuptools.compat import httplib, urlparse, unicode, iteritems, PY3
from pkg_resources import iter_entry_points


errors = 'surrogateescape' if PY3 else 'strict'


# This is not just a replacement for byte literals
# but works as a general purpose encoder
def b(s, encoding='utf-8'):
    if isinstance(s, unicode):
        return s.encode(encoding, errors)
    return s


class upload_docs(upload):
    description = 'Upload documentation to PyPI'

    user_options = [
        ('repository=', 'r',
         "url of repository [default: %s]" % upload.DEFAULT_REPOSITORY),
        ('show-response', None,
         'display full response text from server'),
        ('upload-dir=', None, 'directory to upload'),
    ]
    boolean_options = upload.boolean_options

    def has_sphinx(self):
        if self.upload_dir is None:
            for ep in iter_entry_points('distutils.commands', 'build_sphinx'):
                return True

    sub_commands = [('build_sphinx', has_sphinx)]

    def initialize_options(self):
        upload.initialize_options(self)
        self.upload_dir = None
        self.target_dir = None

    def finalize_options(self):
        upload.finalize_options(self)
        if self.upload_dir is None:
            if self.has_sphinx():
                build_sphinx = self.get_finalized_command('build_sphinx')
                self.target_dir = build_sphinx.builder_target_dir
            else:
                build = self.get_finalized_command('build')
                self.target_dir = os.path.join(build.build_base, 'docs')
        else:
            self.ensure_dirname('upload_dir')
            self.target_dir = self.upload_dir
        self.announce('Using upload directory %s' % self.target_dir)

    def create_zipfile(self, filename):
        zip_file = zipfile.ZipFile(filename, "w")
        try:
            self.mkpath(self.target_dir)  # just in case
            for root, dirs, files in os.walk(self.target_dir):
                if root == self.target_dir and not files:
                    raise DistutilsOptionError(
                        "no files found in upload directory '%s'"
                        % self.target_dir)
                for name in files:
                    full = os.path.join(root, name)
                    relative = root[len(self.target_dir):].lstrip(os.path.sep)
                    dest = os.path.join(relative, name)
                    zip_file.write(full, dest)
        finally:
            zip_file.close()

    def run(self):
        # Run sub commands
        for cmd_name in self.get_sub_commands():
|
||||||
|
self.run_command(cmd_name)
|
||||||
|
|
||||||
|
tmp_dir = tempfile.mkdtemp()
|
||||||
|
name = self.distribution.metadata.get_name()
|
||||||
|
zip_file = os.path.join(tmp_dir, "%s.zip" % name)
|
||||||
|
try:
|
||||||
|
self.create_zipfile(zip_file)
|
||||||
|
self.upload_file(zip_file)
|
||||||
|
finally:
|
||||||
|
shutil.rmtree(tmp_dir)
|
||||||
|
|
||||||
|
def upload_file(self, filename):
|
||||||
|
f = open(filename, 'rb')
|
||||||
|
content = f.read()
|
||||||
|
f.close()
|
||||||
|
meta = self.distribution.metadata
|
||||||
|
data = {
|
||||||
|
':action': 'doc_upload',
|
||||||
|
'name': meta.get_name(),
|
||||||
|
'content': (os.path.basename(filename), content),
|
||||||
|
}
|
||||||
|
# set up the authentication
|
||||||
|
credentials = b(self.username + ':' + self.password)
|
||||||
|
credentials = standard_b64encode(credentials)
|
||||||
|
if PY3:
|
||||||
|
credentials = credentials.decode('ascii')
|
||||||
|
auth = "Basic " + credentials
|
||||||
|
|
||||||
|
# Build up the MIME payload for the POST data
|
||||||
|
boundary = '--------------GHSKFJDLGDS7543FJKLFHRE75642756743254'
|
||||||
|
sep_boundary = b('\n--') + b(boundary)
|
||||||
|
end_boundary = sep_boundary + b('--')
|
||||||
|
body = []
|
||||||
|
for key, values in iteritems(data):
|
||||||
|
title = '\nContent-Disposition: form-data; name="%s"' % key
|
||||||
|
# handle multiple entries for the same name
|
||||||
|
if not isinstance(values, list):
|
||||||
|
values = [values]
|
||||||
|
for value in values:
|
||||||
|
if type(value) is tuple:
|
||||||
|
title += '; filename="%s"' % value[0]
|
||||||
|
value = value[1]
|
||||||
|
else:
|
||||||
|
value = b(value)
|
||||||
|
body.append(sep_boundary)
|
||||||
|
body.append(b(title))
|
||||||
|
body.append(b("\n\n"))
|
||||||
|
body.append(value)
|
||||||
|
if value and value[-1:] == b('\r'):
|
||||||
|
body.append(b('\n')) # write an extra newline (lurve Macs)
|
||||||
|
body.append(end_boundary)
|
||||||
|
body.append(b("\n"))
|
||||||
|
body = b('').join(body)
|
||||||
|
|
||||||
|
self.announce("Submitting documentation to %s" % (self.repository),
|
||||||
|
log.INFO)
|
||||||
|
|
||||||
|
# build the Request
|
||||||
|
# We can't use urllib2 since we need to send the Basic
|
||||||
|
# auth right with the first request
|
||||||
|
schema, netloc, url, params, query, fragments = \
|
||||||
|
urlparse(self.repository)
|
||||||
|
assert not params and not query and not fragments
|
||||||
|
if schema == 'http':
|
||||||
|
conn = httplib.HTTPConnection(netloc)
|
||||||
|
elif schema == 'https':
|
||||||
|
conn = httplib.HTTPSConnection(netloc)
|
||||||
|
else:
|
||||||
|
raise AssertionError("unsupported schema " + schema)
|
||||||
|
|
||||||
|
data = ''
|
||||||
|
try:
|
||||||
|
conn.connect()
|
||||||
|
conn.putrequest("POST", url)
|
||||||
|
content_type = 'multipart/form-data; boundary=%s' % boundary
|
||||||
|
conn.putheader('Content-type', content_type)
|
||||||
|
conn.putheader('Content-length', str(len(body)))
|
||||||
|
conn.putheader('Authorization', auth)
|
||||||
|
conn.endheaders()
|
||||||
|
conn.send(body)
|
||||||
|
except socket.error as e:
|
||||||
|
self.announce(str(e), log.ERROR)
|
||||||
|
return
|
||||||
|
|
||||||
|
r = conn.getresponse()
|
||||||
|
if r.status == 200:
|
||||||
|
self.announce('Server response (%s): %s' % (r.status, r.reason),
|
||||||
|
log.INFO)
|
||||||
|
elif r.status == 301:
|
||||||
|
location = r.getheader('Location')
|
||||||
|
if location is None:
|
||||||
|
location = 'https://pythonhosted.org/%s/' % meta.get_name()
|
||||||
|
self.announce('Upload successful. Visit %s' % location,
|
||||||
|
log.INFO)
|
||||||
|
else:
|
||||||
|
self.announce('Upload failed (%s): %s' % (r.status, r.reason),
|
||||||
|
log.ERROR)
|
||||||
|
if self.show_response:
|
||||||
|
print('-' * 75, r.read(), '-' * 75)
|
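The body-building loop in `upload_file` above can be sketched in isolation. This is a minimal, illustrative re-implementation of the same `\n--boundary` framing (the `encode_multipart` name and the sample field values are assumptions for the sketch, not setuptools API):

```python
# Minimal sketch of the multipart/form-data framing used by upload_file.
# encode_multipart and the sample payload are illustrative, not setuptools API.
def encode_multipart(data, boundary):
    sep = b'\n--' + boundary.encode('ascii')
    body = []
    for key, value in data.items():
        title = '\nContent-Disposition: form-data; name="%s"' % key
        if isinstance(value, tuple):
            # (filename, content) pairs become file fields
            title += '; filename="%s"' % value[0]
            value = value[1]
        if isinstance(value, str):
            value = value.encode('utf-8')
        body.extend([sep, title.encode('ascii'), b'\n\n', value])
    body.append(sep + b'--\n')            # closing boundary
    return b''.join(body)

payload = encode_multipart(
    {':action': 'doc_upload',
     'name': 'example-pkg',
     'content': ('docs.zip', b'PK...')},
    boundary='XXXX')
```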
66
lib/python3.4/site-packages/setuptools/compat.py
Normal file
@@ -0,0 +1,66 @@
import sys
import itertools

PY3 = sys.version_info >= (3,)
PY2 = not PY3

if PY2:
    basestring = basestring
    import __builtin__ as builtins
    import ConfigParser
    from StringIO import StringIO
    BytesIO = StringIO
    func_code = lambda o: o.func_code
    func_globals = lambda o: o.func_globals
    im_func = lambda o: o.im_func
    from htmlentitydefs import name2codepoint
    import httplib
    from BaseHTTPServer import HTTPServer
    from SimpleHTTPServer import SimpleHTTPRequestHandler
    from BaseHTTPServer import BaseHTTPRequestHandler
    iteritems = lambda o: o.iteritems()
    long_type = long
    maxsize = sys.maxint
    unichr = unichr
    unicode = unicode
    bytes = str
    from urllib import url2pathname, splittag, pathname2url
    import urllib2
    from urllib2 import urlopen, HTTPError, URLError, unquote, splituser
    from urlparse import urlparse, urlunparse, urljoin, urlsplit, urlunsplit
    filterfalse = itertools.ifilterfalse

    exec("""def reraise(tp, value, tb=None):
    raise tp, value, tb""")

if PY3:
    basestring = str
    import builtins
    import configparser as ConfigParser
    from io import StringIO, BytesIO
    func_code = lambda o: o.__code__
    func_globals = lambda o: o.__globals__
    im_func = lambda o: o.__func__
    from html.entities import name2codepoint
    import http.client as httplib
    from http.server import HTTPServer, SimpleHTTPRequestHandler
    from http.server import BaseHTTPRequestHandler
    iteritems = lambda o: o.items()
    long_type = int
    maxsize = sys.maxsize
    unichr = chr
    unicode = str
    bytes = bytes
    from urllib.error import HTTPError, URLError
    import urllib.request as urllib2
    from urllib.request import urlopen, url2pathname, pathname2url
    from urllib.parse import (
        urlparse, urlunparse, unquote, splituser, urljoin, urlsplit,
        urlunsplit, splittag,
    )
    filterfalse = itertools.filterfalse

    def reraise(tp, value, tb=None):
        if value.__traceback__ is not tb:
            raise value.with_traceback(tb)
        raise value
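The module above centralizes the Python 2/3 differences behind one namespace so the rest of setuptools imports a single set of names. A Python-3-only sketch of the same idiom (reduced to the few names `upload_docs` uses; this is a simplification, not the module itself):

```python
import sys

# Python-3-only sketch of the setuptools.compat shim idiom.
PY3 = sys.version_info >= (3,)
unicode = str                       # text type under the 2.x name
iteritems = lambda o: o.items()     # dict iteration under one name
errors = 'surrogateescape' if PY3 else 'strict'

def b(s, encoding='utf-8'):
    # General-purpose encoder: text in, bytes out; bytes pass through.
    if isinstance(s, unicode):
        return s.encode(encoding, errors)
    return s
```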
215
lib/python3.4/site-packages/setuptools/depends.py
Normal file
@@ -0,0 +1,215 @@
import sys
import imp
import marshal
from imp import PKG_DIRECTORY, PY_COMPILED, PY_SOURCE, PY_FROZEN
from distutils.version import StrictVersion
from setuptools import compat

__all__ = [
    'Require', 'find_module', 'get_module_constant', 'extract_constant'
]

class Require:
    """A prerequisite to building or installing a distribution"""

    def __init__(self, name, requested_version, module, homepage='',
                 attribute=None, format=None):

        if format is None and requested_version is not None:
            format = StrictVersion

        if format is not None:
            requested_version = format(requested_version)
            if attribute is None:
                attribute = '__version__'

        self.__dict__.update(locals())
        del self.self

    def full_name(self):
        """Return full package/distribution name, w/version"""
        if self.requested_version is not None:
            return '%s-%s' % (self.name, self.requested_version)
        return self.name

    def version_ok(self, version):
        """Is 'version' sufficiently up-to-date?"""
        return self.attribute is None or self.format is None or \
            str(version) != "unknown" and version >= self.requested_version

    def get_version(self, paths=None, default="unknown"):

        """Get version number of installed module, 'None', or 'default'

        Search 'paths' for module.  If not found, return 'None'.  If found,
        return the extracted version attribute, or 'default' if no version
        attribute was specified, or the value cannot be determined without
        importing the module.  The version is formatted according to the
        requirement's version format (if any), unless it is 'None' or the
        supplied 'default'.
        """

        if self.attribute is None:
            try:
                f, p, i = find_module(self.module, paths)
                if f:
                    f.close()
                return default
            except ImportError:
                return None

        v = get_module_constant(self.module, self.attribute, default, paths)

        if v is not None and v is not default and self.format is not None:
            return self.format(v)

        return v

    def is_present(self, paths=None):
        """Return true if dependency is present on 'paths'"""
        return self.get_version(paths) is not None

    def is_current(self, paths=None):
        """Return true if dependency is present and up-to-date on 'paths'"""
        version = self.get_version(paths)
        if version is None:
            return False
        return self.version_ok(version)


def _iter_code(code):

    """Yield '(op,arg)' pair for each operation in code object 'code'"""

    from array import array
    from dis import HAVE_ARGUMENT, EXTENDED_ARG

    bytes = array('b', code.co_code)
    eof = len(code.co_code)

    ptr = 0
    extended_arg = 0

    while ptr < eof:

        op = bytes[ptr]

        if op >= HAVE_ARGUMENT:

            arg = bytes[ptr + 1] + bytes[ptr + 2] * 256 + extended_arg
            ptr += 3

            if op == EXTENDED_ARG:
                extended_arg = arg * compat.long_type(65536)
                continue

        else:
            arg = None
            ptr += 1

        yield op, arg


def find_module(module, paths=None):
    """Just like 'imp.find_module()', but with package support"""

    parts = module.split('.')

    while parts:
        part = parts.pop(0)
        f, path, (suffix, mode, kind) = info = imp.find_module(part, paths)

        if kind == PKG_DIRECTORY:
            parts = parts or ['__init__']
            paths = [path]

        elif parts:
            raise ImportError("Can't find %r in %s" % (parts, module))

    return info


def get_module_constant(module, symbol, default=-1, paths=None):

    """Find 'module' by searching 'paths', and extract 'symbol'

    Return 'None' if 'module' does not exist on 'paths', or it does not define
    'symbol'.  If the module defines 'symbol' as a constant, return the
    constant.  Otherwise, return 'default'."""

    try:
        f, path, (suffix, mode, kind) = find_module(module, paths)
    except ImportError:
        # Module doesn't exist
        return None

    try:
        if kind == PY_COMPILED:
            f.read(8)  # skip magic & date
            code = marshal.load(f)
        elif kind == PY_FROZEN:
            code = imp.get_frozen_object(module)
        elif kind == PY_SOURCE:
            code = compile(f.read(), path, 'exec')
        else:
            # Not something we can parse; we'll have to import it.  :(
            if module not in sys.modules:
                imp.load_module(module, f, path, (suffix, mode, kind))
            return getattr(sys.modules[module], symbol, None)

    finally:
        if f:
            f.close()

    return extract_constant(code, symbol, default)


def extract_constant(code, symbol, default=-1):
    """Extract the constant value of 'symbol' from 'code'

    If the name 'symbol' is bound to a constant value by the Python code
    object 'code', return that value.  If 'symbol' is bound to an expression,
    return 'default'.  Otherwise, return 'None'.

    Return value is based on the first assignment to 'symbol'.  'symbol' must
    be a global, or at least a non-"fast" local in the code block.  That is,
    only 'STORE_NAME' and 'STORE_GLOBAL' opcodes are checked, and 'symbol'
    must be present in 'code.co_names'.
    """

    if symbol not in code.co_names:
        # name's not there, can't possibly be an assignment
        return None

    name_idx = list(code.co_names).index(symbol)

    STORE_NAME = 90
    STORE_GLOBAL = 97
    LOAD_CONST = 100

    const = default

    for op, arg in _iter_code(code):

        if op == LOAD_CONST:
            const = code.co_consts[arg]
        elif arg == name_idx and (op == STORE_NAME or op == STORE_GLOBAL):
            return const
        else:
            const = default


def _update_globals():
    """
    Patch the globals to remove the objects not available on some platforms.

    XXX it'd be better to test assertions about bytecode instead.
    """

    if not sys.platform.startswith('java') and sys.platform != 'cli':
        return
    incompatible = 'extract_constant', 'get_module_constant'
    for name in incompatible:
        del globals()[name]
        __all__.remove(name)


_update_globals()
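The bytecode scan in `_iter_code`/`extract_constant` above assumes the pre-3.6 three-byte instruction layout and the fixed opcode numbers 90/97/100, so it does not survive on modern wordcode interpreters. As an illustration of the same contract over source text instead of a code object (the `extract_constant_ast` name is an assumption for this sketch, not part of setuptools):

```python
import ast

def extract_constant_ast(source, symbol, default=-1):
    # Same return conventions as depends.extract_constant, but over the AST:
    # None if symbol is never assigned, the value if the first assignment is
    # a literal constant, 'default' if it is a non-constant expression.
    for node in ast.walk(ast.parse(source)):
        if isinstance(node, ast.Assign):
            for target in node.targets:
                if isinstance(target, ast.Name) and target.id == symbol:
                    if isinstance(node.value, ast.Constant):
                        return node.value.value
                    return default
    return None
```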
864
lib/python3.4/site-packages/setuptools/dist.py
Normal file
@@ -0,0 +1,864 @@
__all__ = ['Distribution']

import re
import os
import sys
import warnings
import numbers
import distutils.log
import distutils.core
import distutils.cmd
import distutils.dist
from distutils.core import Distribution as _Distribution
from distutils.errors import (DistutilsOptionError, DistutilsPlatformError,
                              DistutilsSetupError)

from setuptools.depends import Require
from setuptools.compat import basestring, PY2
from setuptools import windows_support
import pkg_resources

packaging = pkg_resources.packaging


def _get_unpatched(cls):
    """Protect against re-patching the distutils if reloaded

    Also ensures that no other distutils extension monkeypatched the distutils
    first.
    """
    while cls.__module__.startswith('setuptools'):
        cls, = cls.__bases__
    if not cls.__module__.startswith('distutils'):
        raise AssertionError(
            "distutils has already been patched by %r" % cls
        )
    return cls

_Distribution = _get_unpatched(_Distribution)

def _patch_distribution_metadata_write_pkg_info():
    """
    Workaround issue #197 - Python 3 prior to 3.2.2 uses an environment-local
    encoding to save the pkg_info. Monkey-patch its write_pkg_info method to
    correct this undesirable behavior.
    """
    environment_local = (3,) <= sys.version_info[:3] < (3, 2, 2)
    if not environment_local:
        return

    # from Python 3.4
    def write_pkg_info(self, base_dir):
        """Write the PKG-INFO file into the release tree.
        """
        with open(os.path.join(base_dir, 'PKG-INFO'), 'w',
                  encoding='UTF-8') as pkg_info:
            self.write_pkg_file(pkg_info)

    distutils.dist.DistributionMetadata.write_pkg_info = write_pkg_info
_patch_distribution_metadata_write_pkg_info()

sequence = tuple, list

def check_importable(dist, attr, value):
    try:
        ep = pkg_resources.EntryPoint.parse('x=' + value)
        assert not ep.extras
    except (TypeError, ValueError, AttributeError, AssertionError):
        raise DistutilsSetupError(
            "%r must be importable 'module:attrs' string (got %r)"
            % (attr, value)
        )


def assert_string_list(dist, attr, value):
    """Verify that value is a string list or None"""
    try:
        assert ''.join(value) != value
    except (TypeError, ValueError, AttributeError, AssertionError):
        raise DistutilsSetupError(
            "%r must be a list of strings (got %r)" % (attr, value)
        )

def check_nsp(dist, attr, value):
    """Verify that namespace packages are valid"""
    assert_string_list(dist, attr, value)
    for nsp in value:
        if not dist.has_contents_for(nsp):
            raise DistutilsSetupError(
                "Distribution contains no modules or packages for " +
                "namespace package %r" % nsp
            )
        if '.' in nsp:
            parent = '.'.join(nsp.split('.')[:-1])
            if parent not in value:
                distutils.log.warn(
                    "WARNING: %r is declared as a package namespace, but %r"
                    " is not: please correct this in setup.py", nsp, parent
                )

def check_extras(dist, attr, value):
    """Verify that extras_require mapping is valid"""
    try:
        for k, v in value.items():
            if ':' in k:
                k, m = k.split(':', 1)
                if pkg_resources.invalid_marker(m):
                    raise DistutilsSetupError("Invalid environment marker: " + m)
            list(pkg_resources.parse_requirements(v))
    except (TypeError, ValueError, AttributeError):
        raise DistutilsSetupError(
            "'extras_require' must be a dictionary whose values are "
            "strings or lists of strings containing valid project/version "
            "requirement specifiers."
        )

def assert_bool(dist, attr, value):
    """Verify that value is True, False, 0, or 1"""
    if bool(value) != value:
        tmpl = "{attr!r} must be a boolean value (got {value!r})"
        raise DistutilsSetupError(tmpl.format(attr=attr, value=value))


def check_requirements(dist, attr, value):
    """Verify that install_requires is a valid requirements list"""
    try:
        list(pkg_resources.parse_requirements(value))
    except (TypeError, ValueError) as error:
        tmpl = (
            "{attr!r} must be a string or list of strings "
            "containing valid project/version requirement specifiers; {error}"
        )
        raise DistutilsSetupError(tmpl.format(attr=attr, error=error))

def check_entry_points(dist, attr, value):
    """Verify that entry_points map is parseable"""
    try:
        pkg_resources.EntryPoint.parse_map(value)
    except ValueError as e:
        raise DistutilsSetupError(e)

def check_test_suite(dist, attr, value):
    if not isinstance(value, basestring):
        raise DistutilsSetupError("test_suite must be a string")

def check_package_data(dist, attr, value):
    """Verify that value is a dictionary of package names to glob lists"""
    if isinstance(value, dict):
        for k, v in value.items():
            if not isinstance(k, str):
                break
            try:
                iter(v)
            except TypeError:
                break
        else:
            return
    raise DistutilsSetupError(
        attr + " must be a dictionary mapping package names to lists of "
        "wildcard patterns"
    )

def check_packages(dist, attr, value):
    for pkgname in value:
        if not re.match(r'\w+(\.\w+)*', pkgname):
            distutils.log.warn(
                "WARNING: %r not a valid package name; please use only"
                ".-separated package names in setup.py", pkgname
            )


class Distribution(_Distribution):
    """Distribution with support for features, tests, and package data

    This is an enhanced version of 'distutils.dist.Distribution' that
    effectively adds the following new optional keyword arguments to 'setup()':

     'install_requires' -- a string or sequence of strings specifying project
        versions that the distribution requires when installed, in the format
        used by 'pkg_resources.require()'.  They will be installed
        automatically when the package is installed.  If you wish to use
        packages that are not available in PyPI, or want to give your users an
        alternate download location, you can add a 'find_links' option to the
        '[easy_install]' section of your project's 'setup.cfg' file, and then
        setuptools will scan the listed web pages for links that satisfy the
        requirements.

     'extras_require' -- a dictionary mapping names of optional "extras" to the
        additional requirement(s) that using those extras incurs. For example,
        this::

            extras_require = dict(reST = ["docutils>=0.3", "reSTedit"])

        indicates that the distribution can optionally provide an extra
        capability called "reST", but it can only be used if docutils and
        reSTedit are installed.  If the user installs your package using
        EasyInstall and requests one of your extras, the corresponding
        additional requirements will be installed if needed.

     'features' **deprecated** -- a dictionary mapping option names to
        'setuptools.Feature'
        objects.  Features are a portion of the distribution that can be
        included or excluded based on user options, inter-feature dependencies,
        and availability on the current system.  Excluded features are omitted
        from all setup commands, including source and binary distributions, so
        you can create multiple distributions from the same source tree.
        Feature names should be valid Python identifiers, except that they may
        contain the '-' (minus) sign.  Features can be included or excluded
        via the command line options '--with-X' and '--without-X', where 'X' is
        the name of the feature.  Whether a feature is included by default, and
        whether you are allowed to control this from the command line, is
        determined by the Feature object.  See the 'Feature' class for more
        information.

     'test_suite' -- the name of a test suite to run for the 'test' command.
        If the user runs 'python setup.py test', the package will be installed,
        and the named test suite will be run.  The format is the same as
        would be used on a 'unittest.py' command line.  That is, it is the
        dotted name of an object to import and call to generate a test suite.

     'package_data' -- a dictionary mapping package names to lists of filenames
        or globs to use to find data files contained in the named packages.
        If the dictionary has filenames or globs listed under '""' (the empty
        string), those names will be searched for in every package, in addition
        to any names for the specific package.  Data files found using these
        names/globs will be installed along with the package, in the same
        location as the package.  Note that globs are allowed to reference
        the contents of non-package subdirectories, as long as you use '/' as
        a path separator.  (Globs are automatically converted to
        platform-specific paths at runtime.)

    In addition to these new keywords, this class also has several new methods
    for manipulating the distribution's contents.  For example, the 'include()'
    and 'exclude()' methods can be thought of as in-place add and subtract
    commands that add or remove packages, modules, extensions, and so on from
    the distribution.  They are used by the feature subsystem to configure the
    distribution for the included and excluded features.
    """

    _patched_dist = None

    def patch_missing_pkg_info(self, attrs):
        # Fake up a replacement for the data that would normally come from
        # PKG-INFO, but which might not yet be built if this is a fresh
        # checkout.
        #
        if not attrs or 'name' not in attrs or 'version' not in attrs:
            return
        key = pkg_resources.safe_name(str(attrs['name'])).lower()
        dist = pkg_resources.working_set.by_key.get(key)
        if dist is not None and not dist.has_metadata('PKG-INFO'):
            dist._version = pkg_resources.safe_version(str(attrs['version']))
            self._patched_dist = dist

    def __init__(self, attrs=None):
        have_package_data = hasattr(self, "package_data")
        if not have_package_data:
            self.package_data = {}
        _attrs_dict = attrs or {}
        if 'features' in _attrs_dict or 'require_features' in _attrs_dict:
            Feature.warn_deprecated()
        self.require_features = []
        self.features = {}
        self.dist_files = []
        self.src_root = attrs and attrs.pop("src_root", None)
        self.patch_missing_pkg_info(attrs)
        # Make sure we have any eggs needed to interpret 'attrs'
        if attrs is not None:
            self.dependency_links = attrs.pop('dependency_links', [])
            assert_string_list(self, 'dependency_links', self.dependency_links)
        if attrs and 'setup_requires' in attrs:
            self.fetch_build_eggs(attrs['setup_requires'])
        for ep in pkg_resources.iter_entry_points('distutils.setup_keywords'):
            if not hasattr(self, ep.name):
                setattr(self, ep.name, None)
        _Distribution.__init__(self, attrs)
        if isinstance(self.metadata.version, numbers.Number):
            # Some people apparently take "version number" too literally :)
            self.metadata.version = str(self.metadata.version)

        if self.metadata.version is not None:
            try:
                ver = packaging.version.Version(self.metadata.version)
                normalized_version = str(ver)
                if self.metadata.version != normalized_version:
                    warnings.warn(
                        "Normalizing '%s' to '%s'" % (
                            self.metadata.version,
                            normalized_version,
                        )
                    )
                    self.metadata.version = normalized_version
            except (packaging.version.InvalidVersion, TypeError):
                warnings.warn(
                    "The version specified (%r) is an invalid version, this "
                    "may not work as expected with newer versions of "
                    "setuptools, pip, and PyPI. Please see PEP 440 for more "
                    "details." % self.metadata.version
                )

    def parse_command_line(self):
        """Process features after parsing command line options"""
        result = _Distribution.parse_command_line(self)
        if self.features:
            self._finalize_features()
        return result

    def _feature_attrname(self, name):
        """Convert feature name to corresponding option attribute name"""
        return 'with_' + name.replace('-', '_')

    def fetch_build_eggs(self, requires):
        """Resolve pre-setup requirements"""
        resolved_dists = pkg_resources.working_set.resolve(
            pkg_resources.parse_requirements(requires),
            installer=self.fetch_build_egg,
            replace_conflicting=True,
        )
        for dist in resolved_dists:
            pkg_resources.working_set.add(dist, replace=True)

    def finalize_options(self):
        _Distribution.finalize_options(self)
        if self.features:
            self._set_global_opts_from_features()

        for ep in pkg_resources.iter_entry_points('distutils.setup_keywords'):
            value = getattr(self, ep.name, None)
            if value is not None:
                ep.require(installer=self.fetch_build_egg)
                ep.load()(self, ep.name, value)
        if getattr(self, 'convert_2to3_doctests', None):
            # XXX may convert to set here when we can rely on set being builtin
            self.convert_2to3_doctests = [os.path.abspath(p)
                                          for p in self.convert_2to3_doctests]
        else:
            self.convert_2to3_doctests = []

    def get_egg_cache_dir(self):
        egg_cache_dir = os.path.join(os.curdir, '.eggs')
        if not os.path.exists(egg_cache_dir):
            os.mkdir(egg_cache_dir)
            windows_support.hide_file(egg_cache_dir)
            readme_txt_filename = os.path.join(egg_cache_dir, 'README.txt')
            with open(readme_txt_filename, 'w') as f:
                f.write('This directory contains eggs that were downloaded '
                        'by setuptools to build, test, and run plug-ins.\n\n')
                f.write('This directory caches those eggs to prevent '
                        'repeated downloads.\n\n')
                f.write('However, it is safe to delete this directory.\n\n')

        return egg_cache_dir

    def fetch_build_egg(self, req):
        """Fetch an egg needed for building"""

        try:
            cmd = self._egg_fetcher
            cmd.package_index.to_scan = []
        except AttributeError:
            from setuptools.command.easy_install import easy_install
            dist = self.__class__({'script_args': ['easy_install']})
            dist.parse_config_files()
            opts = dist.get_option_dict('easy_install')
            keep = (
                'find_links', 'site_dirs', 'index_url', 'optimize',
                'site_dirs', 'allow_hosts'
            )
            for key in list(opts):
                if key not in keep:
                    del opts[key]  # don't use any other settings
            if self.dependency_links:
                links = self.dependency_links[:]
                if 'find_links' in opts:
                    links = opts['find_links'][1].split() + links
                opts['find_links'] = ('setup', links)
            install_dir = self.get_egg_cache_dir()
            cmd = easy_install(
                dist, args=["x"], install_dir=install_dir, exclude_scripts=True,
                always_copy=False, build_directory=None, editable=False,
                upgrade=False, multi_version=True, no_report=True, user=False
            )
            cmd.ensure_finalized()
            self._egg_fetcher = cmd
        return cmd.easy_install(req)

    def _set_global_opts_from_features(self):
        """Add --with-X/--without-X options based on optional features"""
|
|
||||||
|
go = []
|
||||||
|
no = self.negative_opt.copy()
|
||||||
|
|
||||||
|
for name,feature in self.features.items():
|
||||||
|
self._set_feature(name,None)
|
||||||
|
feature.validate(self)
|
||||||
|
|
||||||
|
if feature.optional:
|
||||||
|
descr = feature.description
|
||||||
|
incdef = ' (default)'
|
||||||
|
excdef=''
|
||||||
|
if not feature.include_by_default():
|
||||||
|
excdef, incdef = incdef, excdef
|
||||||
|
|
||||||
|
go.append(('with-'+name, None, 'include '+descr+incdef))
|
||||||
|
go.append(('without-'+name, None, 'exclude '+descr+excdef))
|
||||||
|
no['without-'+name] = 'with-'+name
|
||||||
|
|
||||||
|
self.global_options = self.feature_options = go + self.global_options
|
||||||
|
self.negative_opt = self.feature_negopt = no
|
||||||
|
|
||||||
|
def _finalize_features(self):
|
||||||
|
"""Add/remove features and resolve dependencies between them"""
|
||||||
|
|
||||||
|
# First, flag all the enabled items (and thus their dependencies)
|
||||||
|
for name,feature in self.features.items():
|
||||||
|
enabled = self.feature_is_included(name)
|
||||||
|
if enabled or (enabled is None and feature.include_by_default()):
|
||||||
|
feature.include_in(self)
|
||||||
|
self._set_feature(name,1)
|
||||||
|
|
||||||
|
# Then disable the rest, so that off-by-default features don't
|
||||||
|
# get flagged as errors when they're required by an enabled feature
|
||||||
|
for name,feature in self.features.items():
|
||||||
|
if not self.feature_is_included(name):
|
||||||
|
feature.exclude_from(self)
|
||||||
|
self._set_feature(name,0)
|
||||||
|
|
||||||
|
def get_command_class(self, command):
|
||||||
|
"""Pluggable version of get_command_class()"""
|
||||||
|
if command in self.cmdclass:
|
||||||
|
return self.cmdclass[command]
|
||||||
|
|
||||||
|
for ep in pkg_resources.iter_entry_points('distutils.commands',command):
|
||||||
|
ep.require(installer=self.fetch_build_egg)
|
||||||
|
self.cmdclass[command] = cmdclass = ep.load()
|
||||||
|
return cmdclass
|
||||||
|
else:
|
||||||
|
return _Distribution.get_command_class(self, command)
|
||||||
|
|
||||||
|
def print_commands(self):
|
||||||
|
for ep in pkg_resources.iter_entry_points('distutils.commands'):
|
||||||
|
if ep.name not in self.cmdclass:
|
||||||
|
# don't require extras as the commands won't be invoked
|
||||||
|
cmdclass = ep.resolve()
|
||||||
|
self.cmdclass[ep.name] = cmdclass
|
||||||
|
return _Distribution.print_commands(self)
|
||||||
|
|
||||||
|
def _set_feature(self,name,status):
|
||||||
|
"""Set feature's inclusion status"""
|
||||||
|
setattr(self,self._feature_attrname(name),status)
|
||||||
|
|
||||||
|
def feature_is_included(self,name):
|
||||||
|
"""Return 1 if feature is included, 0 if excluded, 'None' if unknown"""
|
||||||
|
return getattr(self,self._feature_attrname(name))
|
||||||
|
|
||||||
|
def include_feature(self,name):
|
||||||
|
"""Request inclusion of feature named 'name'"""
|
||||||
|
|
||||||
|
if self.feature_is_included(name)==0:
|
||||||
|
descr = self.features[name].description
|
||||||
|
raise DistutilsOptionError(
|
||||||
|
descr + " is required, but was excluded or is not available"
|
||||||
|
)
|
||||||
|
self.features[name].include_in(self)
|
||||||
|
self._set_feature(name,1)
|
||||||
|
|
||||||
|
def include(self,**attrs):
|
||||||
|
"""Add items to distribution that are named in keyword arguments
|
||||||
|
|
||||||
|
For example, 'dist.exclude(py_modules=["x"])' would add 'x' to
|
||||||
|
the distribution's 'py_modules' attribute, if it was not already
|
||||||
|
there.
|
||||||
|
|
||||||
|
Currently, this method only supports inclusion for attributes that are
|
||||||
|
lists or tuples. If you need to add support for adding to other
|
||||||
|
attributes in this or a subclass, you can add an '_include_X' method,
|
||||||
|
where 'X' is the name of the attribute. The method will be called with
|
||||||
|
the value passed to 'include()'. So, 'dist.include(foo={"bar":"baz"})'
|
||||||
|
will try to call 'dist._include_foo({"bar":"baz"})', which can then
|
||||||
|
handle whatever special inclusion logic is needed.
|
||||||
|
"""
|
||||||
|
for k,v in attrs.items():
|
||||||
|
include = getattr(self, '_include_'+k, None)
|
||||||
|
if include:
|
||||||
|
include(v)
|
||||||
|
else:
|
||||||
|
self._include_misc(k,v)
|
||||||
|
|
||||||
|
def exclude_package(self,package):
|
||||||
|
"""Remove packages, modules, and extensions in named package"""
|
||||||
|
|
||||||
|
pfx = package+'.'
|
||||||
|
if self.packages:
|
||||||
|
self.packages = [
|
||||||
|
p for p in self.packages
|
||||||
|
if p != package and not p.startswith(pfx)
|
||||||
|
]
|
||||||
|
|
||||||
|
if self.py_modules:
|
||||||
|
self.py_modules = [
|
||||||
|
p for p in self.py_modules
|
||||||
|
if p != package and not p.startswith(pfx)
|
||||||
|
]
|
||||||
|
|
||||||
|
if self.ext_modules:
|
||||||
|
self.ext_modules = [
|
||||||
|
p for p in self.ext_modules
|
||||||
|
if p.name != package and not p.name.startswith(pfx)
|
||||||
|
]
|
||||||
|
|
||||||
|
def has_contents_for(self,package):
|
||||||
|
"""Return true if 'exclude_package(package)' would do something"""
|
||||||
|
|
||||||
|
pfx = package+'.'
|
||||||
|
|
||||||
|
for p in self.iter_distribution_names():
|
||||||
|
if p==package or p.startswith(pfx):
|
||||||
|
return True
|
||||||
|
|
||||||
|
def _exclude_misc(self,name,value):
|
||||||
|
"""Handle 'exclude()' for list/tuple attrs without a special handler"""
|
||||||
|
if not isinstance(value,sequence):
|
||||||
|
raise DistutilsSetupError(
|
||||||
|
"%s: setting must be a list or tuple (%r)" % (name, value)
|
||||||
|
)
|
||||||
|
try:
|
||||||
|
old = getattr(self,name)
|
||||||
|
except AttributeError:
|
||||||
|
raise DistutilsSetupError(
|
||||||
|
"%s: No such distribution setting" % name
|
||||||
|
)
|
||||||
|
if old is not None and not isinstance(old,sequence):
|
||||||
|
raise DistutilsSetupError(
|
||||||
|
name+": this setting cannot be changed via include/exclude"
|
||||||
|
)
|
||||||
|
elif old:
|
||||||
|
setattr(self,name,[item for item in old if item not in value])
|
||||||
|
|
||||||
|
def _include_misc(self,name,value):
|
||||||
|
"""Handle 'include()' for list/tuple attrs without a special handler"""
|
||||||
|
|
||||||
|
if not isinstance(value,sequence):
|
||||||
|
raise DistutilsSetupError(
|
||||||
|
"%s: setting must be a list (%r)" % (name, value)
|
||||||
|
)
|
||||||
|
try:
|
||||||
|
old = getattr(self,name)
|
||||||
|
except AttributeError:
|
||||||
|
raise DistutilsSetupError(
|
||||||
|
"%s: No such distribution setting" % name
|
||||||
|
)
|
||||||
|
if old is None:
|
||||||
|
setattr(self,name,value)
|
||||||
|
elif not isinstance(old,sequence):
|
||||||
|
raise DistutilsSetupError(
|
||||||
|
name+": this setting cannot be changed via include/exclude"
|
||||||
|
)
|
||||||
|
else:
|
||||||
|
setattr(self,name,old+[item for item in value if item not in old])
|
||||||
|
|
||||||
|
def exclude(self,**attrs):
|
||||||
|
"""Remove items from distribution that are named in keyword arguments
|
||||||
|
|
||||||
|
For example, 'dist.exclude(py_modules=["x"])' would remove 'x' from
|
||||||
|
the distribution's 'py_modules' attribute. Excluding packages uses
|
||||||
|
the 'exclude_package()' method, so all of the package's contained
|
||||||
|
packages, modules, and extensions are also excluded.
|
||||||
|
|
||||||
|
Currently, this method only supports exclusion from attributes that are
|
||||||
|
lists or tuples. If you need to add support for excluding from other
|
||||||
|
attributes in this or a subclass, you can add an '_exclude_X' method,
|
||||||
|
where 'X' is the name of the attribute. The method will be called with
|
||||||
|
the value passed to 'exclude()'. So, 'dist.exclude(foo={"bar":"baz"})'
|
||||||
|
will try to call 'dist._exclude_foo({"bar":"baz"})', which can then
|
||||||
|
handle whatever special exclusion logic is needed.
|
||||||
|
"""
|
||||||
|
for k,v in attrs.items():
|
||||||
|
exclude = getattr(self, '_exclude_'+k, None)
|
||||||
|
if exclude:
|
||||||
|
exclude(v)
|
||||||
|
else:
|
||||||
|
self._exclude_misc(k,v)
|
||||||
|
|
||||||
|
def _exclude_packages(self,packages):
|
||||||
|
if not isinstance(packages,sequence):
|
||||||
|
raise DistutilsSetupError(
|
||||||
|
"packages: setting must be a list or tuple (%r)" % (packages,)
|
||||||
|
)
|
||||||
|
list(map(self.exclude_package, packages))
|
||||||
|
|
||||||
|
def _parse_command_opts(self, parser, args):
|
||||||
|
# Remove --with-X/--without-X options when processing command args
|
||||||
|
self.global_options = self.__class__.global_options
|
||||||
|
self.negative_opt = self.__class__.negative_opt
|
||||||
|
|
||||||
|
# First, expand any aliases
|
||||||
|
command = args[0]
|
||||||
|
aliases = self.get_option_dict('aliases')
|
||||||
|
while command in aliases:
|
||||||
|
src,alias = aliases[command]
|
||||||
|
del aliases[command] # ensure each alias can expand only once!
|
||||||
|
import shlex
|
||||||
|
args[:1] = shlex.split(alias,True)
|
||||||
|
command = args[0]
|
||||||
|
|
||||||
|
nargs = _Distribution._parse_command_opts(self, parser, args)
|
||||||
|
|
||||||
|
# Handle commands that want to consume all remaining arguments
|
||||||
|
cmd_class = self.get_command_class(command)
|
||||||
|
if getattr(cmd_class,'command_consumes_arguments',None):
|
||||||
|
self.get_option_dict(command)['args'] = ("command line", nargs)
|
||||||
|
if nargs is not None:
|
||||||
|
return []
|
||||||
|
|
||||||
|
return nargs
|
||||||
|
|
||||||
|
def get_cmdline_options(self):
|
||||||
|
"""Return a '{cmd: {opt:val}}' map of all command-line options
|
||||||
|
|
||||||
|
Option names are all long, but do not include the leading '--', and
|
||||||
|
contain dashes rather than underscores. If the option doesn't take
|
||||||
|
an argument (e.g. '--quiet'), the 'val' is 'None'.
|
||||||
|
|
||||||
|
Note that options provided by config files are intentionally excluded.
|
||||||
|
"""
|
||||||
|
|
||||||
|
d = {}
|
||||||
|
|
||||||
|
for cmd,opts in self.command_options.items():
|
||||||
|
|
||||||
|
for opt,(src,val) in opts.items():
|
||||||
|
|
||||||
|
if src != "command line":
|
||||||
|
continue
|
||||||
|
|
||||||
|
opt = opt.replace('_','-')
|
||||||
|
|
||||||
|
if val==0:
|
||||||
|
cmdobj = self.get_command_obj(cmd)
|
||||||
|
neg_opt = self.negative_opt.copy()
|
||||||
|
neg_opt.update(getattr(cmdobj,'negative_opt',{}))
|
||||||
|
for neg,pos in neg_opt.items():
|
||||||
|
if pos==opt:
|
||||||
|
opt=neg
|
||||||
|
val=None
|
||||||
|
break
|
||||||
|
else:
|
||||||
|
raise AssertionError("Shouldn't be able to get here")
|
||||||
|
|
||||||
|
elif val==1:
|
||||||
|
val = None
|
||||||
|
|
||||||
|
d.setdefault(cmd,{})[opt] = val
|
||||||
|
|
||||||
|
return d
|
||||||
|
|
||||||
|
def iter_distribution_names(self):
|
||||||
|
"""Yield all packages, modules, and extension names in distribution"""
|
||||||
|
|
||||||
|
for pkg in self.packages or ():
|
||||||
|
yield pkg
|
||||||
|
|
||||||
|
for module in self.py_modules or ():
|
||||||
|
yield module
|
||||||
|
|
||||||
|
for ext in self.ext_modules or ():
|
||||||
|
if isinstance(ext,tuple):
|
||||||
|
name, buildinfo = ext
|
||||||
|
else:
|
||||||
|
name = ext.name
|
||||||
|
if name.endswith('module'):
|
||||||
|
name = name[:-6]
|
||||||
|
yield name
|
||||||
|
|
||||||
|
def handle_display_options(self, option_order):
|
||||||
|
"""If there were any non-global "display-only" options
|
||||||
|
(--help-commands or the metadata display options) on the command
|
||||||
|
line, display the requested info and return true; else return
|
||||||
|
false.
|
||||||
|
"""
|
||||||
|
import sys
|
||||||
|
|
||||||
|
if PY2 or self.help_commands:
|
||||||
|
return _Distribution.handle_display_options(self, option_order)
|
||||||
|
|
||||||
|
# Stdout may be StringIO (e.g. in tests)
|
||||||
|
import io
|
||||||
|
if not isinstance(sys.stdout, io.TextIOWrapper):
|
||||||
|
return _Distribution.handle_display_options(self, option_order)
|
||||||
|
|
||||||
|
# Don't wrap stdout if utf-8 is already the encoding. Provides
|
||||||
|
# workaround for #334.
|
||||||
|
if sys.stdout.encoding.lower() in ('utf-8', 'utf8'):
|
||||||
|
return _Distribution.handle_display_options(self, option_order)
|
||||||
|
|
||||||
|
# Print metadata in UTF-8 no matter the platform
|
||||||
|
encoding = sys.stdout.encoding
|
||||||
|
errors = sys.stdout.errors
|
||||||
|
newline = sys.platform != 'win32' and '\n' or None
|
||||||
|
line_buffering = sys.stdout.line_buffering
|
||||||
|
|
||||||
|
sys.stdout = io.TextIOWrapper(
|
||||||
|
sys.stdout.detach(), 'utf-8', errors, newline, line_buffering)
|
||||||
|
try:
|
||||||
|
return _Distribution.handle_display_options(self, option_order)
|
||||||
|
finally:
|
||||||
|
sys.stdout = io.TextIOWrapper(
|
||||||
|
sys.stdout.detach(), encoding, errors, newline, line_buffering)
|
||||||
|
|
||||||
|
|
||||||
|
# Install it throughout the distutils
|
||||||
|
for module in distutils.dist, distutils.core, distutils.cmd:
|
||||||
|
module.Distribution = Distribution
|
||||||
|
|
||||||
|
|
||||||
|
class Feature:
|
||||||
|
"""
|
||||||
|
**deprecated** -- The `Feature` facility was never completely implemented
|
||||||
|
or supported, `has reported issues
|
||||||
|
<https://bitbucket.org/pypa/setuptools/issue/58>`_ and will be removed in
|
||||||
|
a future version.
|
||||||
|
|
||||||
|
A subset of the distribution that can be excluded if unneeded/wanted
|
||||||
|
|
||||||
|
Features are created using these keyword arguments:
|
||||||
|
|
||||||
|
'description' -- a short, human readable description of the feature, to
|
||||||
|
be used in error messages, and option help messages.
|
||||||
|
|
||||||
|
'standard' -- if true, the feature is included by default if it is
|
||||||
|
available on the current system. Otherwise, the feature is only
|
||||||
|
included if requested via a command line '--with-X' option, or if
|
||||||
|
another included feature requires it. The default setting is 'False'.
|
||||||
|
|
||||||
|
'available' -- if true, the feature is available for installation on the
|
||||||
|
current system. The default setting is 'True'.
|
||||||
|
|
||||||
|
'optional' -- if true, the feature's inclusion can be controlled from the
|
||||||
|
command line, using the '--with-X' or '--without-X' options. If
|
||||||
|
false, the feature's inclusion status is determined automatically,
|
||||||
|
based on 'availabile', 'standard', and whether any other feature
|
||||||
|
requires it. The default setting is 'True'.
|
||||||
|
|
||||||
|
'require_features' -- a string or sequence of strings naming features
|
||||||
|
that should also be included if this feature is included. Defaults to
|
||||||
|
empty list. May also contain 'Require' objects that should be
|
||||||
|
added/removed from the distribution.
|
||||||
|
|
||||||
|
'remove' -- a string or list of strings naming packages to be removed
|
||||||
|
from the distribution if this feature is *not* included. If the
|
||||||
|
feature *is* included, this argument is ignored. This argument exists
|
||||||
|
to support removing features that "crosscut" a distribution, such as
|
||||||
|
defining a 'tests' feature that removes all the 'tests' subpackages
|
||||||
|
provided by other features. The default for this argument is an empty
|
||||||
|
list. (Note: the named package(s) or modules must exist in the base
|
||||||
|
distribution when the 'setup()' function is initially called.)
|
||||||
|
|
||||||
|
other keywords -- any other keyword arguments are saved, and passed to
|
||||||
|
the distribution's 'include()' and 'exclude()' methods when the
|
||||||
|
feature is included or excluded, respectively. So, for example, you
|
||||||
|
could pass 'packages=["a","b"]' to cause packages 'a' and 'b' to be
|
||||||
|
added or removed from the distribution as appropriate.
|
||||||
|
|
||||||
|
A feature must include at least one 'requires', 'remove', or other
|
||||||
|
keyword argument. Otherwise, it can't affect the distribution in any way.
|
||||||
|
Note also that you can subclass 'Feature' to create your own specialized
|
||||||
|
feature types that modify the distribution in other ways when included or
|
||||||
|
excluded. See the docstrings for the various methods here for more detail.
|
||||||
|
Aside from the methods, the only feature attributes that distributions look
|
||||||
|
at are 'description' and 'optional'.
|
||||||
|
"""
|
||||||
|
|
||||||
|
@staticmethod
|
||||||
|
def warn_deprecated():
|
||||||
|
warnings.warn(
|
||||||
|
"Features are deprecated and will be removed in a future "
|
||||||
|
"version. See http://bitbucket.org/pypa/setuptools/65.",
|
||||||
|
DeprecationWarning,
|
||||||
|
stacklevel=3,
|
||||||
|
)
|
||||||
|
|
||||||
|
def __init__(self, description, standard=False, available=True,
|
||||||
|
optional=True, require_features=(), remove=(), **extras):
|
||||||
|
self.warn_deprecated()
|
||||||
|
|
||||||
|
self.description = description
|
||||||
|
self.standard = standard
|
||||||
|
self.available = available
|
||||||
|
self.optional = optional
|
||||||
|
if isinstance(require_features,(str,Require)):
|
||||||
|
require_features = require_features,
|
||||||
|
|
||||||
|
self.require_features = [
|
||||||
|
r for r in require_features if isinstance(r,str)
|
||||||
|
]
|
||||||
|
er = [r for r in require_features if not isinstance(r,str)]
|
||||||
|
if er: extras['require_features'] = er
|
||||||
|
|
||||||
|
if isinstance(remove,str):
|
||||||
|
remove = remove,
|
||||||
|
self.remove = remove
|
||||||
|
self.extras = extras
|
||||||
|
|
||||||
|
if not remove and not require_features and not extras:
|
||||||
|
raise DistutilsSetupError(
|
||||||
|
"Feature %s: must define 'require_features', 'remove', or at least one"
|
||||||
|
" of 'packages', 'py_modules', etc."
|
||||||
|
)
|
||||||
|
|
||||||
|
def include_by_default(self):
|
||||||
|
"""Should this feature be included by default?"""
|
||||||
|
return self.available and self.standard
|
||||||
|
|
||||||
|
def include_in(self,dist):
|
||||||
|
|
||||||
|
"""Ensure feature and its requirements are included in distribution
|
||||||
|
|
||||||
|
You may override this in a subclass to perform additional operations on
|
||||||
|
the distribution. Note that this method may be called more than once
|
||||||
|
per feature, and so should be idempotent.
|
||||||
|
|
||||||
|
"""
|
||||||
|
|
||||||
|
if not self.available:
|
||||||
|
raise DistutilsPlatformError(
|
||||||
|
self.description+" is required,"
|
||||||
|
"but is not available on this platform"
|
||||||
|
)
|
||||||
|
|
||||||
|
dist.include(**self.extras)
|
||||||
|
|
||||||
|
for f in self.require_features:
|
||||||
|
dist.include_feature(f)
|
||||||
|
|
||||||
|
def exclude_from(self,dist):
|
||||||
|
|
||||||
|
"""Ensure feature is excluded from distribution
|
||||||
|
|
||||||
|
You may override this in a subclass to perform additional operations on
|
||||||
|
the distribution. This method will be called at most once per
|
||||||
|
feature, and only after all included features have been asked to
|
||||||
|
include themselves.
|
||||||
|
"""
|
||||||
|
|
||||||
|
dist.exclude(**self.extras)
|
||||||
|
|
||||||
|
if self.remove:
|
||||||
|
for item in self.remove:
|
||||||
|
dist.exclude_package(item)
|
||||||
|
|
||||||
|
def validate(self,dist):
|
||||||
|
|
||||||
|
"""Verify that feature makes sense in context of distribution
|
||||||
|
|
||||||
|
This method is called by the distribution just before it parses its
|
||||||
|
command line. It checks to ensure that the 'remove' attribute, if any,
|
||||||
|
contains only valid package/module names that are present in the base
|
||||||
|
distribution when 'setup()' is called. You may override it in a
|
||||||
|
subclass to perform any other required validation of the feature
|
||||||
|
against a target distribution.
|
||||||
|
"""
|
||||||
|
|
||||||
|
for item in self.remove:
|
||||||
|
if not dist.has_contents_for(item):
|
||||||
|
raise DistutilsSetupError(
|
||||||
|
"%s wants to be able to remove %s, but the distribution"
|
||||||
|
" doesn't contain any packages or modules under %s"
|
||||||
|
% (self.description, item, item)
|
||||||
|
)
|
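The `include()`/`exclude()` machinery above merges and filters list-valued distribution attributes. A minimal standalone sketch of that merge logic, outside the diff (the `Config` class and its method names are hypothetical, not the setuptools API):

```python
# Standalone sketch of the list-merge semantics of
# Distribution._include_misc / _exclude_misc.
class Config:
    def __init__(self, py_modules=None):
        self.py_modules = py_modules

    def include_list(self, name, value):
        # Append only items not already present, preserving order.
        old = getattr(self, name)
        if old is None:
            setattr(self, name, list(value))
        else:
            setattr(self, name, old + [v for v in value if v not in old])

    def exclude_list(self, name, value):
        # Drop every item that appears in `value`.
        old = getattr(self, name)
        if old:
            setattr(self, name, [v for v in old if v not in value])

cfg = Config(py_modules=["a", "b"])
cfg.include_list("py_modules", ["b", "c"])
assert cfg.py_modules == ["a", "b", "c"]   # "b" not duplicated
cfg.exclude_list("py_modules", ["a"])
assert cfg.py_modules == ["b", "c"]
```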
55
lib/python3.4/site-packages/setuptools/extension.py
Normal file
@@ -0,0 +1,55 @@
import sys
import re
import functools
import distutils.core
import distutils.errors
import distutils.extension

from .dist import _get_unpatched
from . import msvc9_support

_Extension = _get_unpatched(distutils.core.Extension)

msvc9_support.patch_for_specialized_compiler()

def _have_cython():
    """
    Return True if Cython can be imported.
    """
    cython_impl = 'Cython.Distutils.build_ext'
    try:
        # from (cython_impl) import build_ext
        __import__(cython_impl, fromlist=['build_ext']).build_ext
        return True
    except Exception:
        pass
    return False

# for compatibility
have_pyrex = _have_cython


class Extension(_Extension):
    """Extension that uses '.c' files in place of '.pyx' files"""

    def _convert_pyx_sources_to_lang(self):
        """
        Replace sources with .pyx extensions to sources with the target
        language extension. This mechanism allows language authors to supply
        pre-converted sources but to prefer the .pyx sources.
        """
        if _have_cython():
            # the build has Cython, so allow it to compile the .pyx files
            return
        lang = self.language or ''
        target_ext = '.cpp' if lang.lower() == 'c++' else '.c'
        sub = functools.partial(re.sub, '.pyx$', target_ext)
        self.sources = list(map(sub, self.sources))

class Library(Extension):
    """Just like a regular Extension, but built as a library instead"""

distutils.core.Extension = Extension
distutils.extension.Extension = Extension
if 'distutils.command.build_ext' in sys.modules:
    sys.modules['distutils.command.build_ext'].Extension = Extension
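The `.pyx` fallback in `_convert_pyx_sources_to_lang` can be sketched in isolation: when Cython is unavailable, each `.pyx` source name is rewritten to the target-language extension (the pattern `'.pyx$'` with its unescaped dot is kept as in the source; the file names here are made up for illustration):

```python
import functools
import re

lang = 'c++'
# C++ extensions get '.cpp', everything else gets '.c'
target_ext = '.cpp' if lang.lower() == 'c++' else '.c'
sub = functools.partial(re.sub, '.pyx$', target_ext)

sources = ['fast_path.pyx', 'helpers.c']
converted = list(map(sub, sources))
assert converted == ['fast_path.cpp', 'helpers.c']  # only .pyx names change
```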
BIN
lib/python3.4/site-packages/setuptools/gui-32.exe
Normal file
Binary file not shown.
BIN
lib/python3.4/site-packages/setuptools/gui-64.exe
Normal file
Binary file not shown.
BIN
lib/python3.4/site-packages/setuptools/gui-arm-32.exe
Normal file
Binary file not shown.
BIN
lib/python3.4/site-packages/setuptools/gui.exe
Normal file
Binary file not shown.
58
lib/python3.4/site-packages/setuptools/lib2to3_ex.py
Normal file
@@ -0,0 +1,58 @@
"""
Customized Mixin2to3 support:

 - adds support for converting doctests


This module raises an ImportError on Python 2.
"""

from distutils.util import Mixin2to3 as _Mixin2to3
from distutils import log
from lib2to3.refactor import RefactoringTool, get_fixers_from_package
import setuptools

class DistutilsRefactoringTool(RefactoringTool):
    def log_error(self, msg, *args, **kw):
        log.error(msg, *args)

    def log_message(self, msg, *args):
        log.info(msg, *args)

    def log_debug(self, msg, *args):
        log.debug(msg, *args)

class Mixin2to3(_Mixin2to3):
    def run_2to3(self, files, doctests=False):
        # See if the distribution option has been set, otherwise check the
        # setuptools default.
        if self.distribution.use_2to3 is not True:
            return
        if not files:
            return
        log.info("Fixing " + " ".join(files))
        self.__build_fixer_names()
        self.__exclude_fixers()
        if doctests:
            if setuptools.run_2to3_on_doctests:
                r = DistutilsRefactoringTool(self.fixer_names)
                r.refactor(files, write=True, doctests_only=True)
        else:
            _Mixin2to3.run_2to3(self, files)

    def __build_fixer_names(self):
        if self.fixer_names:
            return
        self.fixer_names = []
        for p in setuptools.lib2to3_fixer_packages:
            self.fixer_names.extend(get_fixers_from_package(p))
        if self.distribution.use_2to3_fixers is not None:
            for p in self.distribution.use_2to3_fixers:
                self.fixer_names.extend(get_fixers_from_package(p))

    def __exclude_fixers(self):
        excluded_fixers = getattr(self, 'exclude_fixers', [])
        if self.distribution.use_2to3_exclude_fixers is not None:
            excluded_fixers.extend(self.distribution.use_2to3_exclude_fixers)
        for fixer_name in excluded_fixers:
            if fixer_name in self.fixer_names:
                self.fixer_names.remove(fixer_name)
63
lib/python3.4/site-packages/setuptools/msvc9_support.py
Normal file
@@ -0,0 +1,63 @@
try:
    import distutils.msvc9compiler
except ImportError:
    pass

unpatched = dict()

def patch_for_specialized_compiler():
    """
    Patch functions in distutils.msvc9compiler to use the standalone compiler
    build for Python (Windows only). Fall back to original behavior when the
    standalone compiler is not available.
    """
    if 'distutils' not in globals():
        # The module isn't available to be patched
        return

    if unpatched:
        # Already patched
        return

    unpatched.update(vars(distutils.msvc9compiler))

    distutils.msvc9compiler.find_vcvarsall = find_vcvarsall
    distutils.msvc9compiler.query_vcvarsall = query_vcvarsall

def find_vcvarsall(version):
    Reg = distutils.msvc9compiler.Reg
    VC_BASE = r'Software\%sMicrosoft\DevDiv\VCForPython\%0.1f'
    key = VC_BASE % ('', version)
    try:
        # Per-user installs register the compiler path here
        productdir = Reg.get_value(key, "installdir")
    except KeyError:
        try:
            # All-user installs on a 64-bit system register here
            key = VC_BASE % ('Wow6432Node\\', version)
            productdir = Reg.get_value(key, "installdir")
        except KeyError:
            productdir = None

    if productdir:
        import os
        vcvarsall = os.path.join(productdir, "vcvarsall.bat")
        if os.path.isfile(vcvarsall):
            return vcvarsall

    return unpatched['find_vcvarsall'](version)

def query_vcvarsall(version, *args, **kwargs):
    try:
        return unpatched['query_vcvarsall'](version, *args, **kwargs)
    except distutils.errors.DistutilsPlatformError as exc:
        if exc and "vcvarsall.bat" in exc.args[0]:
            message = 'Microsoft Visual C++ %0.1f is required (%s).' % (version, exc.args[0])
            if int(version) == 9:
                # This redirection link is maintained by Microsoft.
                # Contact vspython@microsoft.com if it needs updating.
                raise distutils.errors.DistutilsPlatformError(
                    message + ' Get it from http://aka.ms/vcpython27'
                )
            raise distutils.errors.DistutilsPlatformError(message)
        raise
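The module above follows a save-then-patch pattern: stash the unpatched functions in a dict, install replacements, and delegate back to the originals whenever the special case does not apply. A generic, registry-free sketch of that pattern (the `lookup` function and its data are hypothetical):

```python
# Save-then-patch with fallback, as in patch_for_specialized_compiler.
unpatched = {}

def lookup(key):
    # Stand-in for the original function being patched.
    return {'a': 1}[key]

def patched_lookup(key):
    # Try the specialized path first; otherwise delegate to the
    # saved original via the `unpatched` dict.
    if key == 'b':
        return 2
    return unpatched['lookup'](key)

unpatched['lookup'] = lookup   # stash the original once
lookup = patched_lookup        # install the replacement

assert lookup('b') == 2        # handled by the patch
assert lookup('a') == 1        # falls back to the original
```

Keeping the originals in a module-level dict also makes the patch idempotent: a second call can detect `unpatched` is non-empty and return early, exactly as `patch_for_specialized_compiler()` does.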
1049
lib/python3.4/site-packages/setuptools/package_index.py
Normal file
File diff suppressed because it is too large
19
lib/python3.4/site-packages/setuptools/py26compat.py
Normal file
@@ -0,0 +1,19 @@
"""
Compatibility Support for Python 2.6 and earlier
"""

import sys

from setuptools.compat import splittag

def strip_fragment(url):
    """
    In `Python issue 8280 <http://bugs.python.org/issue8280>`_, Python 2.7 and
    later was patched to disregard the fragment when making URL requests.
    Do the same for Python 2.6 and earlier.
    """
    url, fragment = splittag(url)
    return url

if sys.version_info >= (2, 7):
    strip_fragment = lambda x: x
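On Python 3 the same fragment stripping that `strip_fragment()` provides via `splittag` can be done with `urllib.parse.urldefrag`; a small sketch (the example URLs are made up):

```python
from urllib.parse import urldefrag

def strip_fragment(url):
    # urldefrag returns (url_without_fragment, fragment)
    return urldefrag(url)[0]

assert strip_fragment('http://example.com/pkg#egg=foo') == 'http://example.com/pkg'
assert strip_fragment('http://example.com/pkg') == 'http://example.com/pkg'
```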
15
lib/python3.4/site-packages/setuptools/py27compat.py
Normal file
@@ -0,0 +1,15 @@
"""
Compatibility Support for Python 2.7 and earlier
"""

import sys

def get_all_headers(message, key):
    """
    Given an HTTPMessage, return all headers matching a given key.
    """
    return message.get_all(key)

if sys.version_info < (3,):
    def get_all_headers(message, key):
        return message.getheaders(key)
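On Python 3, `http.client.HTTPMessage` subclasses `email.message.Message`, so `get_all()` is the call the wrapper above relies on; a plain `Message` shows the shape (the header values here are made up):

```python
from email.message import Message

def get_all_headers(message, key):
    # Same body as the Python 3 branch above.
    return message.get_all(key)

msg = Message()
msg['Set-Cookie'] = 'a=1'   # repeated keys accumulate rather than overwrite
msg['Set-Cookie'] = 'b=2'

assert get_all_headers(msg, 'Set-Cookie') == ['a=1', 'b=2']
```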
52
lib/python3.4/site-packages/setuptools/py31compat.py
Normal file
|
@ -0,0 +1,52 @@
|
||||||
|
import sys
|
||||||
|
import unittest
|
||||||
|
|
||||||
|
__all__ = ['get_config_vars', 'get_path']
|
||||||
|
|
||||||
|
try:
|
||||||
|
# Python 2.7 or >=3.2
|
||||||
|
from sysconfig import get_config_vars, get_path
|
||||||
|
except ImportError:
|
||||||
|
from distutils.sysconfig import get_config_vars, get_python_lib
|
||||||
|
def get_path(name):
|
||||||
|
if name not in ('platlib', 'purelib'):
|
||||||
|
raise ValueError("Name must be purelib or platlib")
|
||||||
|
return get_python_lib(name=='platlib')
|
||||||
|
|
||||||
|
try:
|
||||||
|
# Python >=3.2
|
||||||
|
from tempfile import TemporaryDirectory
|
||||||
|
except ImportError:
|
||||||
|
import shutil
|
||||||
|
import tempfile
|
||||||
|
class TemporaryDirectory(object):
|
||||||
|
""""
|
||||||
|
Very simple temporary directory context manager.
|
||||||
|
Will try to delete afterward, but will also ignore OS and similar
|
||||||
|
errors on deletion.
|
||||||
|
"""
|
||||||
|
def __init__(self):
|
||||||
|
self.name = None # Handle mkdtemp raising an exception
|
||||||
|
self.name = tempfile.mkdtemp()
|
||||||
|
|
||||||
|
def __enter__(self):
|
||||||
|
return self.name
|
||||||
|
|
||||||
|
def __exit__(self, exctype, excvalue, exctrace):
|
||||||
|
try:
|
||||||
|
shutil.rmtree(self.name, True)
|
||||||
|
except OSError: #removal errors are not the only possible
|
||||||
|
pass
|
||||||
|
self.name = None
|
||||||
|
|
||||||
|
|
||||||
|
unittest_main = unittest.main
|
||||||
|
|
||||||
|
_PY31 = (3, 1) <= sys.version_info[:2] < (3, 2)
|
||||||
|
if _PY31:
|
||||||
|
# on Python 3.1, translate testRunner==None to TextTestRunner
|
||||||
|
# for compatibility with Python 2.6, 2.7, and 3.2+
|
||||||
|
def unittest_main(*args, **kwargs):
|
||||||
|
if 'testRunner' in kwargs and kwargs['testRunner'] is None:
|
||||||
|
kwargs['testRunner'] = unittest.TextTestRunner
|
||||||
|
return unittest.main(*args, **kwargs)
|
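How the `TemporaryDirectory` fallback above behaves in use; this restates the class so the sketch runs on any Python 3 without the compat module.

```python
import os
import shutil
import tempfile

class TemporaryDirectory(object):
    """Very simple temporary directory context manager (restated from above)."""
    def __init__(self):
        self.name = None  # handle mkdtemp raising an exception
        self.name = tempfile.mkdtemp()

    def __enter__(self):
        return self.name

    def __exit__(self, exctype, excvalue, exctrace):
        try:
            # ignore_errors=True swallows most cleanup failures
            shutil.rmtree(self.name, True)
        except OSError:  # removal errors are not the only possible
            pass
        self.name = None

with TemporaryDirectory() as tmp:
    path = tmp
    assert os.path.isdir(tmp)        # directory exists inside the block
assert not os.path.exists(path)      # removed on exit
```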
489
lib/python3.4/site-packages/setuptools/sandbox.py
Normal file
@@ -0,0 +1,489 @@
import os
import sys
import tempfile
import operator
import functools
import itertools
import re
import contextlib
import pickle

import pkg_resources

if sys.platform.startswith('java'):
    import org.python.modules.posix.PosixModule as _os
else:
    _os = sys.modules[os.name]
try:
    _file = file
except NameError:
    _file = None
_open = open
from distutils.errors import DistutilsError
from pkg_resources import working_set

from setuptools import compat
from setuptools.compat import builtins

__all__ = [
    "AbstractSandbox", "DirectorySandbox", "SandboxViolation", "run_setup",
]

def _execfile(filename, globals, locals=None):
    """
    Python 3 implementation of execfile.
    """
    mode = 'rb'
    with open(filename, mode) as stream:
        script = stream.read()
    # compile() function in Python 2.6 and 3.1 requires LF line endings.
    if sys.version_info[:2] < (2, 7) or sys.version_info[:2] >= (3, 0) and sys.version_info[:2] < (3, 2):
        script = script.replace(b'\r\n', b'\n')
        script = script.replace(b'\r', b'\n')
    if locals is None:
        locals = globals
    code = compile(script, filename, 'exec')
    exec(code, globals, locals)


@contextlib.contextmanager
def save_argv(repl=None):
    saved = sys.argv[:]
    if repl is not None:
        sys.argv[:] = repl
    try:
        yield saved
    finally:
        sys.argv[:] = saved


@contextlib.contextmanager
def save_path():
    saved = sys.path[:]
    try:
        yield saved
    finally:
        sys.path[:] = saved


@contextlib.contextmanager
def override_temp(replacement):
    """
    Monkey-patch tempfile.tempdir with replacement, ensuring it exists
    """
    if not os.path.isdir(replacement):
        os.makedirs(replacement)

    saved = tempfile.tempdir

    tempfile.tempdir = replacement

    try:
        yield
    finally:
        tempfile.tempdir = saved


@contextlib.contextmanager
def pushd(target):
    saved = os.getcwd()
    os.chdir(target)
    try:
        yield saved
    finally:
        os.chdir(saved)


class UnpickleableException(Exception):
    """
    An exception representing another Exception that could not be pickled.
    """
    @classmethod
    def dump(cls, type, exc):
        """
        Always return a dumped (pickled) type and exc. If exc can't be pickled,
        wrap it in UnpickleableException first.
        """
        try:
            return pickle.dumps(type), pickle.dumps(exc)
        except Exception:
            return cls.dump(cls, cls(repr(exc)))


class ExceptionSaver:
    """
    A Context Manager that will save an exception, serialized, and restore it
    later.
    """
    def __enter__(self):
        return self

    def __exit__(self, type, exc, tb):
        if not exc:
            return

        # dump the exception
        self._saved = UnpickleableException.dump(type, exc)
        self._tb = tb

        # suppress the exception
        return True

    def resume(self):
        "restore and re-raise any exception"

        if '_saved' not in vars(self):
            return

        type, exc = map(pickle.loads, self._saved)
        compat.reraise(type, exc, self._tb)


@contextlib.contextmanager
def save_modules():
    """
    Context in which imported modules are saved.

    Translates exceptions internal to the context into the equivalent exception
    outside the context.
    """
    saved = sys.modules.copy()
    with ExceptionSaver() as saved_exc:
        yield saved

    sys.modules.update(saved)
    # remove any modules imported since
    del_modules = (
        mod_name for mod_name in sys.modules
        if mod_name not in saved
        # exclude any encodings modules. See #285
        and not mod_name.startswith('encodings.')
    )
    _clear_modules(del_modules)

    saved_exc.resume()


def _clear_modules(module_names):
    for mod_name in list(module_names):
        del sys.modules[mod_name]


@contextlib.contextmanager
def save_pkg_resources_state():
    saved = pkg_resources.__getstate__()
    try:
        yield saved
    finally:
        pkg_resources.__setstate__(saved)


@contextlib.contextmanager
def setup_context(setup_dir):
    temp_dir = os.path.join(setup_dir, 'temp')
    with save_pkg_resources_state():
        with save_modules():
            hide_setuptools()
            with save_path():
                with save_argv():
                    with override_temp(temp_dir):
                        with pushd(setup_dir):
                            # ensure setuptools commands are available
                            __import__('setuptools')
                            yield


def _needs_hiding(mod_name):
    """
    >>> _needs_hiding('setuptools')
    True
    >>> _needs_hiding('pkg_resources')
    True
    >>> _needs_hiding('setuptools_plugin')
    False
    >>> _needs_hiding('setuptools.__init__')
    True
    >>> _needs_hiding('distutils')
    True
    """
    pattern = re.compile(r'(setuptools|pkg_resources|distutils)(\.|$)')
    return bool(pattern.match(mod_name))


def hide_setuptools():
    """
    Remove references to setuptools' modules from sys.modules to allow the
    invocation to import the most appropriate setuptools. This technique is
    necessary to avoid issues such as #315 where setuptools upgrading itself
    would fail to find a function declared in the metadata.
    """
    modules = filter(_needs_hiding, sys.modules)
    _clear_modules(modules)


def run_setup(setup_script, args):
    """Run a distutils setup script, sandboxed in its directory"""
    setup_dir = os.path.abspath(os.path.dirname(setup_script))
    with setup_context(setup_dir):
        try:
            sys.argv[:] = [setup_script]+list(args)
            sys.path.insert(0, setup_dir)
            # reset to include setup dir, w/clean callback list
            working_set.__init__()
            working_set.callbacks.append(lambda dist:dist.activate())
            def runner():
                ns = dict(__file__=setup_script, __name__='__main__')
                _execfile(setup_script, ns)
            DirectorySandbox(setup_dir).run(runner)
        except SystemExit as v:
            if v.args and v.args[0]:
                raise
            # Normal exit, just return


class AbstractSandbox:
    """Wrap 'os' module and 'open()' builtin for virtualizing setup scripts"""

    _active = False

    def __init__(self):
        self._attrs = [
            name for name in dir(_os)
            if not name.startswith('_') and hasattr(self,name)
        ]

    def _copy(self, source):
        for name in self._attrs:
            setattr(os, name, getattr(source,name))

    def run(self, func):
        """Run 'func' under os sandboxing"""
        try:
            self._copy(self)
            if _file:
                builtins.file = self._file
            builtins.open = self._open
            self._active = True
            return func()
        finally:
            self._active = False
            if _file:
                builtins.file = _file
            builtins.open = _open
            self._copy(_os)

    def _mk_dual_path_wrapper(name):
        original = getattr(_os,name)
        def wrap(self,src,dst,*args,**kw):
            if self._active:
                src,dst = self._remap_pair(name,src,dst,*args,**kw)
            return original(src,dst,*args,**kw)
        return wrap

    for name in ["rename", "link", "symlink"]:
        if hasattr(_os,name): locals()[name] = _mk_dual_path_wrapper(name)

    def _mk_single_path_wrapper(name, original=None):
        original = original or getattr(_os,name)
        def wrap(self,path,*args,**kw):
            if self._active:
                path = self._remap_input(name,path,*args,**kw)
            return original(path,*args,**kw)
        return wrap

    if _file:
        _file = _mk_single_path_wrapper('file', _file)
    _open = _mk_single_path_wrapper('open', _open)
    for name in [
        "stat", "listdir", "chdir", "open", "chmod", "chown", "mkdir",
        "remove", "unlink", "rmdir", "utime", "lchown", "chroot", "lstat",
        "startfile", "mkfifo", "mknod", "pathconf", "access"
    ]:
        if hasattr(_os,name): locals()[name] = _mk_single_path_wrapper(name)

    def _mk_single_with_return(name):
        original = getattr(_os,name)
        def wrap(self,path,*args,**kw):
            if self._active:
                path = self._remap_input(name,path,*args,**kw)
                return self._remap_output(name, original(path,*args,**kw))
            return original(path,*args,**kw)
        return wrap

    for name in ['readlink', 'tempnam']:
        if hasattr(_os,name): locals()[name] = _mk_single_with_return(name)

    def _mk_query(name):
        original = getattr(_os,name)
        def wrap(self,*args,**kw):
            retval = original(*args,**kw)
            if self._active:
                return self._remap_output(name, retval)
            return retval
        return wrap

    for name in ['getcwd', 'tmpnam']:
        if hasattr(_os,name): locals()[name] = _mk_query(name)

    def _validate_path(self,path):
        """Called to remap or validate any path, whether input or output"""
        return path

    def _remap_input(self,operation,path,*args,**kw):
        """Called for path inputs"""
        return self._validate_path(path)

    def _remap_output(self,operation,path):
        """Called for path outputs"""
        return self._validate_path(path)

    def _remap_pair(self,operation,src,dst,*args,**kw):
        """Called for path pairs like rename, link, and symlink operations"""
        return (
            self._remap_input(operation+'-from',src,*args,**kw),
            self._remap_input(operation+'-to',dst,*args,**kw)
        )


if hasattr(os, 'devnull'):
    _EXCEPTIONS = [os.devnull,]
else:
    _EXCEPTIONS = []

try:
    from win32com.client.gencache import GetGeneratePath
    _EXCEPTIONS.append(GetGeneratePath())
    del GetGeneratePath
except ImportError:
    # it appears pywin32 is not installed, so no need to exclude.
    pass

class DirectorySandbox(AbstractSandbox):
    """Restrict operations to a single subdirectory - pseudo-chroot"""

    write_ops = dict.fromkeys([
        "open", "chmod", "chown", "mkdir", "remove", "unlink", "rmdir",
        "utime", "lchown", "chroot", "mkfifo", "mknod", "tempnam",
    ])

    _exception_patterns = [
        # Allow lib2to3 to attempt to save a pickled grammar object (#121)
        r'.*lib2to3.*\.pickle$',
    ]
    "exempt writing to paths that match the pattern"

    def __init__(self, sandbox, exceptions=_EXCEPTIONS):
        self._sandbox = os.path.normcase(os.path.realpath(sandbox))
        self._prefix = os.path.join(self._sandbox,'')
        self._exceptions = [
            os.path.normcase(os.path.realpath(path))
            for path in exceptions
        ]
        AbstractSandbox.__init__(self)

    def _violation(self, operation, *args, **kw):
        raise SandboxViolation(operation, args, kw)

    if _file:
        def _file(self, path, mode='r', *args, **kw):
            if mode not in ('r', 'rt', 'rb', 'rU', 'U') and not self._ok(path):
                self._violation("file", path, mode, *args, **kw)
            return _file(path,mode,*args,**kw)

    def _open(self, path, mode='r', *args, **kw):
        if mode not in ('r', 'rt', 'rb', 'rU', 'U') and not self._ok(path):
            self._violation("open", path, mode, *args, **kw)
        return _open(path,mode,*args,**kw)

    def tmpnam(self):
        self._violation("tmpnam")

    def _ok(self, path):
        active = self._active
        try:
            self._active = False
            realpath = os.path.normcase(os.path.realpath(path))
            return (
                self._exempted(realpath)
                or realpath == self._sandbox
                or realpath.startswith(self._prefix)
            )
        finally:
            self._active = active

    def _exempted(self, filepath):
        start_matches = (
            filepath.startswith(exception)
            for exception in self._exceptions
        )
        pattern_matches = (
            re.match(pattern, filepath)
            for pattern in self._exception_patterns
        )
        candidates = itertools.chain(start_matches, pattern_matches)
        return any(candidates)

    def _remap_input(self, operation, path, *args, **kw):
        """Called for path inputs"""
        if operation in self.write_ops and not self._ok(path):
            self._violation(operation, os.path.realpath(path), *args, **kw)
        return path

    def _remap_pair(self, operation, src, dst, *args, **kw):
        """Called for path pairs like rename, link, and symlink operations"""
        if not self._ok(src) or not self._ok(dst):
            self._violation(operation, src, dst, *args, **kw)
        return (src,dst)

    def open(self, file, flags, mode=0o777, *args, **kw):
        """Called for low-level os.open()"""
        if flags & WRITE_FLAGS and not self._ok(file):
            self._violation("os.open", file, flags, mode, *args, **kw)
        return _os.open(file,flags,mode, *args, **kw)

WRITE_FLAGS = functools.reduce(
    operator.or_, [getattr(_os, a, 0) for a in
    "O_WRONLY O_RDWR O_APPEND O_CREAT O_TRUNC O_TEMPORARY".split()]
)

class SandboxViolation(DistutilsError):
    """A setup script attempted to modify the filesystem outside the sandbox"""

    def __str__(self):
        return """SandboxViolation: %s%r %s

The package setup script has attempted to modify files on your system
that are not within the EasyInstall build area, and has been aborted.

This package cannot be safely installed by EasyInstall, and may not
support alternate installation locations even if you run its setup
script by hand.  Please inform the package's author and the EasyInstall
maintainers to find out if a fix or workaround is available.""" % self.args

#
5
lib/python3.4/site-packages/setuptools/script (dev).tmpl
Normal file
@@ -0,0 +1,5 @@
# EASY-INSTALL-DEV-SCRIPT: %(spec)r,%(script_name)r
__requires__ = %(spec)r
__import__('pkg_resources').require(%(spec)r)
__file__ = %(dev_path)r
exec(compile(open(__file__).read(), __file__, 'exec'))
3
lib/python3.4/site-packages/setuptools/script.tmpl
Normal file
@@ -0,0 +1,3 @@
# EASY-INSTALL-SCRIPT: %(spec)r,%(script_name)r
__requires__ = %(spec)r
__import__('pkg_resources').run_script(%(spec)r, %(script_name)r)
76
lib/python3.4/site-packages/setuptools/site-patch.py
Normal file
@@ -0,0 +1,76 @@
def __boot():
    import sys
    import os
    PYTHONPATH = os.environ.get('PYTHONPATH')
    if PYTHONPATH is None or (sys.platform=='win32' and not PYTHONPATH):
        PYTHONPATH = []
    else:
        PYTHONPATH = PYTHONPATH.split(os.pathsep)

    pic = getattr(sys,'path_importer_cache',{})
    stdpath = sys.path[len(PYTHONPATH):]
    mydir = os.path.dirname(__file__)
    #print "searching",stdpath,sys.path

    for item in stdpath:
        if item==mydir or not item:
            continue    # skip if current dir. on Windows, or my own directory
        importer = pic.get(item)
        if importer is not None:
            loader = importer.find_module('site')
            if loader is not None:
                # This should actually reload the current module
                loader.load_module('site')
                break
        else:
            try:
                import imp # Avoid import loop in Python >= 3.3
                stream, path, descr = imp.find_module('site',[item])
            except ImportError:
                continue
            if stream is None:
                continue
            try:
                # This should actually reload the current module
                imp.load_module('site',stream,path,descr)
            finally:
                stream.close()
            break
    else:
        raise ImportError("Couldn't find the real 'site' module")

    #print "loaded", __file__

    known_paths = dict([(makepath(item)[1],1) for item in sys.path]) # 2.2 comp

    oldpos = getattr(sys,'__egginsert',0)   # save old insertion position
    sys.__egginsert = 0                     # and reset the current one

    for item in PYTHONPATH:
        addsitedir(item)

    sys.__egginsert += oldpos           # restore effective old position

    d, nd = makepath(stdpath[0])
    insert_at = None
    new_path = []

    for item in sys.path:
        p, np = makepath(item)

        if np==nd and insert_at is None:
            # We've hit the first 'system' path entry, so added entries go here
            insert_at = len(new_path)

        if np in known_paths or insert_at is None:
            new_path.append(item)
        else:
            # new path after the insert point, back-insert it
            new_path.insert(insert_at, item)
            insert_at += 1

    sys.path[:] = new_path

if __name__=='site':
    __boot()
    del __boot
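An illustrative sketch (with hypothetical path names) of the back-insert loop at the end of `__boot()` above: entries already known stay where they are, while entries added by site dirs are inserted just before the first 'system' path entry rather than appended.

```python
# 'sys1'/'sys2' stand in for known system sys.path entries; 'new1'/'new2'
# for entries added by addsitedir(); 'user1' for a PYTHONPATH entry.
known_paths = {'sys1', 'sys2'}

insert_at = None
new_path = []
for item in ['user1', 'sys1', 'new1', 'sys2', 'new2']:
    if item.startswith('sys') and insert_at is None:
        # first 'system' entry: subsequently added entries go here
        insert_at = len(new_path)
    if item in known_paths or insert_at is None:
        new_path.append(item)
    else:
        # new path after the insert point, back-insert it
        new_path.insert(insert_at, item)
        insert_at += 1

print(new_path)  # ['user1', 'new1', 'new2', 'sys1', 'sys2']
```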
241
lib/python3.4/site-packages/setuptools/ssl_support.py
Normal file
@@ -0,0 +1,241 @@
import os
import socket
import atexit
import re

import pkg_resources
from pkg_resources import ResolutionError, ExtractionError
from setuptools.compat import urllib2

try:
    import ssl
except ImportError:
    ssl = None

__all__ = [
    'VerifyingHTTPSHandler', 'find_ca_bundle', 'is_available', 'cert_paths',
    'opener_for'
]

cert_paths = """
/etc/pki/tls/certs/ca-bundle.crt
/etc/ssl/certs/ca-certificates.crt
/usr/share/ssl/certs/ca-bundle.crt
/usr/local/share/certs/ca-root.crt
/etc/ssl/cert.pem
/System/Library/OpenSSL/certs/cert.pem
""".strip().split()


HTTPSHandler = HTTPSConnection = object

for what, where in (
    ('HTTPSHandler', ['urllib2','urllib.request']),
    ('HTTPSConnection', ['httplib', 'http.client']),
):
    for module in where:
        try:
            exec("from %s import %s" % (module, what))
        except ImportError:
            pass

is_available = ssl is not None and object not in (HTTPSHandler, HTTPSConnection)


try:
    from ssl import CertificateError, match_hostname
except ImportError:
    try:
        from backports.ssl_match_hostname import CertificateError
        from backports.ssl_match_hostname import match_hostname
    except ImportError:
        CertificateError = None
        match_hostname = None

if not CertificateError:
    class CertificateError(ValueError):
        pass

if not match_hostname:
    def _dnsname_match(dn, hostname, max_wildcards=1):
        """Matching according to RFC 6125, section 6.4.3

        http://tools.ietf.org/html/rfc6125#section-6.4.3
        """
        pats = []
        if not dn:
            return False

        # Ported from python3-syntax:
        # leftmost, *remainder = dn.split(r'.')
        parts = dn.split(r'.')
        leftmost = parts[0]
        remainder = parts[1:]

        wildcards = leftmost.count('*')
        if wildcards > max_wildcards:
            # Issue #17980: avoid denials of service by refusing more
            # than one wildcard per fragment.  A survey of established
            # policy among SSL implementations showed it to be a
            # reasonable choice.
            raise CertificateError(
                "too many wildcards in certificate DNS name: " + repr(dn))

        # speed up common case w/o wildcards
        if not wildcards:
            return dn.lower() == hostname.lower()

        # RFC 6125, section 6.4.3, subitem 1.
        # The client SHOULD NOT attempt to match a presented identifier in which
        # the wildcard character comprises a label other than the left-most label.
        if leftmost == '*':
            # When '*' is a fragment by itself, it matches a non-empty dotless
            # fragment.
            pats.append('[^.]+')
        elif leftmost.startswith('xn--') or hostname.startswith('xn--'):
            # RFC 6125, section 6.4.3, subitem 3.
            # The client SHOULD NOT attempt to match a presented identifier
            # where the wildcard character is embedded within an A-label or
            # U-label of an internationalized domain name.
            pats.append(re.escape(leftmost))
        else:
            # Otherwise, '*' matches any dotless string, e.g. www*
            pats.append(re.escape(leftmost).replace(r'\*', '[^.]*'))

        # add the remaining fragments, ignore any wildcards
        for frag in remainder:
            pats.append(re.escape(frag))

        pat = re.compile(r'\A' + r'\.'.join(pats) + r'\Z', re.IGNORECASE)
        return pat.match(hostname)

    def match_hostname(cert, hostname):
        """Verify that *cert* (in decoded format as returned by
        SSLSocket.getpeercert()) matches the *hostname*.  RFC 2818 and RFC 6125
        rules are followed, but IP addresses are not accepted for *hostname*.

        CertificateError is raised on failure. On success, the function
        returns nothing.
        """
        if not cert:
            raise ValueError("empty or no certificate")
        dnsnames = []
        san = cert.get('subjectAltName', ())
        for key, value in san:
            if key == 'DNS':
                if _dnsname_match(value, hostname):
                    return
                dnsnames.append(value)
        if not dnsnames:
            # The subject is only checked when there is no dNSName entry
            # in subjectAltName
            for sub in cert.get('subject', ()):
                for key, value in sub:
                    # XXX according to RFC 2818, the most specific Common Name
                    # must be used.
                    if key == 'commonName':
                        if _dnsname_match(value, hostname):
                            return
                        dnsnames.append(value)
        if len(dnsnames) > 1:
            raise CertificateError("hostname %r "
                "doesn't match either of %s"
                % (hostname, ', '.join(map(repr, dnsnames))))
        elif len(dnsnames) == 1:
            raise CertificateError("hostname %r "
                "doesn't match %r"
                % (hostname, dnsnames[0]))
        else:
            raise CertificateError("no appropriate commonName or "
                "subjectAltName fields were found")


class VerifyingHTTPSHandler(HTTPSHandler):
    """Simple verifying handler: no auth, subclasses, timeouts, etc."""

    def __init__(self, ca_bundle):
        self.ca_bundle = ca_bundle
        HTTPSHandler.__init__(self)

    def https_open(self, req):
        return self.do_open(
            lambda host, **kw: VerifyingHTTPSConn(host, self.ca_bundle, **kw), req
        )


class VerifyingHTTPSConn(HTTPSConnection):
    """Simple verifying connection: no auth, subclasses, timeouts, etc."""
    def __init__(self, host, ca_bundle, **kw):
        HTTPSConnection.__init__(self, host, **kw)
        self.ca_bundle = ca_bundle

    def connect(self):
        sock = socket.create_connection(
            (self.host, self.port), getattr(self, 'source_address', None)
        )

        # Handle the socket if a (proxy) tunnel is present
        if hasattr(self, '_tunnel') and getattr(self, '_tunnel_host', None):
            self.sock = sock
            self._tunnel()
            # http://bugs.python.org/issue7776: Python>=3.4.1 and >=2.7.7
            # change self.host to mean the proxy server host when tunneling is
            # being used. Adapt, since we are interested in the destination
            # host for the match_hostname() comparison.
            actual_host = self._tunnel_host
        else:
            actual_host = self.host

        self.sock = ssl.wrap_socket(
            sock, cert_reqs=ssl.CERT_REQUIRED, ca_certs=self.ca_bundle
        )
        try:
            match_hostname(self.sock.getpeercert(), actual_host)
        except CertificateError:
            self.sock.shutdown(socket.SHUT_RDWR)
            self.sock.close()
            raise


def opener_for(ca_bundle=None):
    """Get a urlopen() replacement that uses ca_bundle for verification"""
    return urllib2.build_opener(
        VerifyingHTTPSHandler(ca_bundle or find_ca_bundle())
    ).open


_wincerts = None

def get_win_certfile():
    global _wincerts
    if _wincerts is not None:
        return _wincerts.name

    try:
        from wincertstore import CertFile
    except ImportError:
        return None

    class MyCertFile(CertFile):
        def __init__(self, stores=(), certs=()):
            CertFile.__init__(self)
            for store in stores:
                self.addstore(store)
            self.addcerts(certs)
            atexit.register(self.close)

    _wincerts = MyCertFile(stores=['CA', 'ROOT'])
    return _wincerts.name


def find_ca_bundle():
    """Return an existing CA bundle path, or None"""
    if os.name=='nt':
        return get_win_certfile()
    else:
        for cert_path in cert_paths:
            if os.path.isfile(cert_path):
                return cert_path
    try:
        return pkg_resources.resource_filename('certifi', 'cacert.pem')
    except (ImportError, ResolutionError, ExtractionError):
        return None
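A small runnable sketch of the wildcard branch in the `_dnsname_match()` fallback above: `'*'` in the left-most label of a certificate name is translated to `[^.]*`, so it matches any dotless string but never crosses a label boundary.

```python
import re

# 'www*' / example.com are illustrative values, not from the diff.
leftmost, remainder = 'www*', ['example', 'com']

# Same translation as the non-IDNA branch above.
pats = [re.escape(leftmost).replace(r'\*', '[^.]*')]
pats.extend(re.escape(frag) for frag in remainder)
pat = re.compile(r'\A' + r'\.'.join(pats) + r'\Z', re.IGNORECASE)

assert pat.match('www8.example.com')
assert not pat.match('www.sub.example.com')  # wildcard never spans a dot
```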
41
lib/python3.4/site-packages/setuptools/unicode_utils.py
Normal file
@@ -0,0 +1,41 @@
import unicodedata
import sys
from setuptools.compat import unicode as decoded_string


# HFS Plus uses decomposed UTF-8
def decompose(path):
    if isinstance(path, decoded_string):
        return unicodedata.normalize('NFD', path)
    try:
        path = path.decode('utf-8')
        path = unicodedata.normalize('NFD', path)
        path = path.encode('utf-8')
    except UnicodeError:
        pass # Not UTF-8
    return path


def filesys_decode(path):
    """
    Ensure that the given path is decoded,
    NONE when no expected encoding works
    """

    fs_enc = sys.getfilesystemencoding()
    if isinstance(path, decoded_string):
        return path

    for enc in (fs_enc, "utf-8"):
        try:
            return path.decode(enc)
        except UnicodeDecodeError:
            continue


def try_encode(string, enc):
    "turn unicode encoding into a functional routine"
    try:
        return string.encode(enc)
    except UnicodeEncodeError:
        return None
11
lib/python3.4/site-packages/setuptools/utils.py
Normal file
@@ -0,0 +1,11 @@
import os
import os.path


def cs_path_exists(fspath):
    if not os.path.exists(fspath):
        return False
    # make absolute so we always have a directory
    abspath = os.path.abspath(fspath)
    directory, filename = os.path.split(abspath)
    return filename in os.listdir(directory)
1
lib/python3.4/site-packages/setuptools/version.py
Normal file
@@ -0,0 +1 @@
__version__ = '18.5'
29
lib/python3.4/site-packages/setuptools/windows_support.py
Normal file
@@ -0,0 +1,29 @@
import platform
import ctypes


def windows_only(func):
    if platform.system() != 'Windows':
        return lambda *args, **kwargs: None
    return func


@windows_only
def hide_file(path):
    """
    Set the hidden attribute on a file or directory.

    From http://stackoverflow.com/questions/19622133/

    `path` must be text.
    """
    __import__('ctypes.wintypes')
    SetFileAttributes = ctypes.windll.kernel32.SetFileAttributesW
    SetFileAttributes.argtypes = ctypes.wintypes.LPWSTR, ctypes.wintypes.DWORD
    SetFileAttributes.restype = ctypes.wintypes.BOOL

    FILE_ATTRIBUTE_HIDDEN = 0x02

    ret = SetFileAttributes(path, FILE_ATTRIBUTE_HIDDEN)
    if not ret:
        raise ctypes.WinError()
@@ -0,0 +1,18 @@
Six is a Python 2 and 3 compatibility library. It provides utility functions
for smoothing over the differences between the Python versions with the goal of
writing Python code that is compatible on both Python versions. See the
documentation for more information on what is provided.

Six supports every Python version since 2.6. It is contained in only one Python
file, so it can be easily copied into your project. (The copyright and license
notice must be retained.)

Online documentation is at https://pythonhosted.org/six/.

Bugs can be reported to https://bitbucket.org/gutworth/six. The code can also
be found there.

For questions about six or porting in general, email the python-porting mailing
list: https://mail.python.org/mailman/listinfo/python-porting

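The version-dispatch idiom that six is built on can be sketched with the standard library alone. This is an illustrative example, not one of the packaged files: it mirrors how six binds names like `string_types` once, based on the running interpreter, so the rest of a codebase stays version-agnostic.

```python
import sys

# six's core trick: decide once which interpreter is running,
# then alias version-specific names under stable identifiers.
PY2 = sys.version_info[0] == 2
PY3 = sys.version_info[0] == 3

if PY3:
    string_types = (str,)
    text_type = str
else:  # Python 2 only branch
    string_types = (basestring,)  # noqa: F821
    text_type = unicode  # noqa: F821

# Code written against the aliases works unchanged on either version.
assert isinstance(u"caf\u00e9", string_types)
assert text_type("abc") == "abc"
```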
34
lib/python3.4/site-packages/six-1.10.0.dist-info/METADATA
Normal file
@@ -0,0 +1,34 @@
Metadata-Version: 2.0
Name: six
Version: 1.10.0
Summary: Python 2 and 3 compatibility utilities
Home-page: http://pypi.python.org/pypi/six/
Author: Benjamin Peterson
Author-email: benjamin@python.org
License: MIT
Platform: UNKNOWN
Classifier: Programming Language :: Python :: 2
Classifier: Programming Language :: Python :: 3
Classifier: Intended Audience :: Developers
Classifier: License :: OSI Approved :: MIT License
Classifier: Topic :: Software Development :: Libraries
Classifier: Topic :: Utilities

Six is a Python 2 and 3 compatibility library. It provides utility functions
for smoothing over the differences between the Python versions with the goal of
writing Python code that is compatible on both Python versions. See the
documentation for more information on what is provided.

Six supports every Python version since 2.6. It is contained in only one Python
file, so it can be easily copied into your project. (The copyright and license
notice must be retained.)

Online documentation is at https://pythonhosted.org/six/.

Bugs can be reported to https://bitbucket.org/gutworth/six. The code can also
be found there.

For questions about six or porting in general, email the python-porting mailing
list: https://mail.python.org/mailman/listinfo/python-porting

8
lib/python3.4/site-packages/six-1.10.0.dist-info/RECORD
Normal file
@@ -0,0 +1,8 @@
six.py,sha256=A6hdJZVjI3t_geebZ9BzUvwRrIXo0lfwzQlM2LcKyas,30098
six-1.10.0.dist-info/DESCRIPTION.rst,sha256=QWBtSTT2zzabwJv1NQbTfClSX13m-Qc6tqU4TRL1RLs,774
six-1.10.0.dist-info/METADATA,sha256=5HceJsUnHof2IRamlCKO2MwNjve1eSP4rLzVQDfwpCQ,1283
six-1.10.0.dist-info/RECORD,,
six-1.10.0.dist-info/WHEEL,sha256=GrqQvamwgBV4nLoJe0vhYRSWzWsx7xjlt74FT0SWYfE,110
six-1.10.0.dist-info/metadata.json,sha256=jtOeeTBubYDChl_5Ql5ZPlKoHgg6rdqRIjOz1e5Ek2U,658
six-1.10.0.dist-info/top_level.txt,sha256=_iVH_iYEtEXnD8nYGQYpYFUvkUW9sEO1GYbkeKSAais,4
__pycache__/six.cpython-34.pyc,,
6
lib/python3.4/site-packages/six-1.10.0.dist-info/WHEEL
Normal file
@@ -0,0 +1,6 @@
Wheel-Version: 1.0
Generator: bdist_wheel (0.26.0)
Root-Is-Purelib: true
Tag: py2-none-any
Tag: py3-none-any

@@ -0,0 +1 @@
{"generator": "bdist_wheel (0.26.0)", "summary": "Python 2 and 3 compatibility utilities", "classifiers": ["Programming Language :: Python :: 2", "Programming Language :: Python :: 3", "Intended Audience :: Developers", "License :: OSI Approved :: MIT License", "Topic :: Software Development :: Libraries", "Topic :: Utilities"], "extensions": {"python.details": {"project_urls": {"Home": "http://pypi.python.org/pypi/six/"}, "contacts": [{"email": "benjamin@python.org", "name": "Benjamin Peterson", "role": "author"}], "document_names": {"description": "DESCRIPTION.rst"}}}, "license": "MIT", "metadata_version": "2.0", "name": "six", "version": "1.10.0"}
@@ -0,0 +1 @@
six
868
lib/python3.4/site-packages/six.py
Normal file
@@ -0,0 +1,868 @@
"""Utilities for writing code that runs on Python 2 and 3"""

# Copyright (c) 2010-2015 Benjamin Peterson
#
# Permission is hereby granted, free of charge, to any person obtaining a copy
# of this software and associated documentation files (the "Software"), to deal
# in the Software without restriction, including without limitation the rights
# to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
# copies of the Software, and to permit persons to whom the Software is
# furnished to do so, subject to the following conditions:
#
# The above copyright notice and this permission notice shall be included in all
# copies or substantial portions of the Software.
#
# THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
# IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
# FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
# AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
# LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
# OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
# SOFTWARE.

from __future__ import absolute_import

import functools
import itertools
import operator
import sys
import types

__author__ = "Benjamin Peterson <benjamin@python.org>"
__version__ = "1.10.0"


# Useful for very coarse version differentiation.
PY2 = sys.version_info[0] == 2
PY3 = sys.version_info[0] == 3
PY34 = sys.version_info[0:2] >= (3, 4)

if PY3:
    string_types = str,
    integer_types = int,
    class_types = type,
    text_type = str
    binary_type = bytes

    MAXSIZE = sys.maxsize
else:
    string_types = basestring,
    integer_types = (int, long)
    class_types = (type, types.ClassType)
    text_type = unicode
    binary_type = str

    if sys.platform.startswith("java"):
        # Jython always uses 32 bits.
        MAXSIZE = int((1 << 31) - 1)
    else:
        # It's possible to have sizeof(long) != sizeof(Py_ssize_t).
        class X(object):

            def __len__(self):
                return 1 << 31
        try:
            len(X())
        except OverflowError:
            # 32-bit
            MAXSIZE = int((1 << 31) - 1)
        else:
            # 64-bit
            MAXSIZE = int((1 << 63) - 1)
        del X


def _add_doc(func, doc):
    """Add documentation to a function."""
    func.__doc__ = doc


def _import_module(name):
    """Import module, returning the module after the last dot."""
    __import__(name)
    return sys.modules[name]


class _LazyDescr(object):

    def __init__(self, name):
        self.name = name

    def __get__(self, obj, tp):
        result = self._resolve()
        setattr(obj, self.name, result)  # Invokes __set__.
        try:
            # This is a bit ugly, but it avoids running this again by
            # removing this descriptor.
            delattr(obj.__class__, self.name)
        except AttributeError:
            pass
        return result


class MovedModule(_LazyDescr):

    def __init__(self, name, old, new=None):
        super(MovedModule, self).__init__(name)
        if PY3:
            if new is None:
                new = name
            self.mod = new
        else:
            self.mod = old

    def _resolve(self):
        return _import_module(self.mod)

    def __getattr__(self, attr):
        _module = self._resolve()
        value = getattr(_module, attr)
        setattr(self, attr, value)
        return value


class _LazyModule(types.ModuleType):

    def __init__(self, name):
        super(_LazyModule, self).__init__(name)
        self.__doc__ = self.__class__.__doc__

    def __dir__(self):
        attrs = ["__doc__", "__name__"]
        attrs += [attr.name for attr in self._moved_attributes]
        return attrs

    # Subclasses should override this
    _moved_attributes = []


class MovedAttribute(_LazyDescr):

    def __init__(self, name, old_mod, new_mod, old_attr=None, new_attr=None):
        super(MovedAttribute, self).__init__(name)
        if PY3:
            if new_mod is None:
                new_mod = name
            self.mod = new_mod
            if new_attr is None:
                if old_attr is None:
                    new_attr = name
                else:
                    new_attr = old_attr
            self.attr = new_attr
        else:
            self.mod = old_mod
            if old_attr is None:
                old_attr = name
            self.attr = old_attr

    def _resolve(self):
        module = _import_module(self.mod)
        return getattr(module, self.attr)


class _SixMetaPathImporter(object):

    """
    A meta path importer to import six.moves and its submodules.

    This class implements a PEP302 finder and loader. It should be compatible
    with Python 2.5 and all existing versions of Python3
    """

    def __init__(self, six_module_name):
        self.name = six_module_name
        self.known_modules = {}

    def _add_module(self, mod, *fullnames):
        for fullname in fullnames:
            self.known_modules[self.name + "." + fullname] = mod

    def _get_module(self, fullname):
        return self.known_modules[self.name + "." + fullname]

    def find_module(self, fullname, path=None):
        if fullname in self.known_modules:
            return self
        return None

    def __get_module(self, fullname):
        try:
            return self.known_modules[fullname]
        except KeyError:
            raise ImportError("This loader does not know module " + fullname)

    def load_module(self, fullname):
        try:
            # in case of a reload
            return sys.modules[fullname]
        except KeyError:
            pass
        mod = self.__get_module(fullname)
        if isinstance(mod, MovedModule):
            mod = mod._resolve()
        else:
            mod.__loader__ = self
        sys.modules[fullname] = mod
        return mod

    def is_package(self, fullname):
        """
        Return true, if the named module is a package.

        We need this method to get correct spec objects with
        Python 3.4 (see PEP451)
        """
        return hasattr(self.__get_module(fullname), "__path__")

    def get_code(self, fullname):
        """Return None

        Required, if is_package is implemented"""
        self.__get_module(fullname)  # eventually raises ImportError
        return None
    get_source = get_code  # same as get_code

_importer = _SixMetaPathImporter(__name__)

class _MovedItems(_LazyModule):

    """Lazy loading of moved objects"""
    __path__ = []  # mark as package


_moved_attributes = [
    MovedAttribute("cStringIO", "cStringIO", "io", "StringIO"),
    MovedAttribute("filter", "itertools", "builtins", "ifilter", "filter"),
    MovedAttribute("filterfalse", "itertools", "itertools", "ifilterfalse", "filterfalse"),
    MovedAttribute("input", "__builtin__", "builtins", "raw_input", "input"),
    MovedAttribute("intern", "__builtin__", "sys"),
    MovedAttribute("map", "itertools", "builtins", "imap", "map"),
    MovedAttribute("getcwd", "os", "os", "getcwdu", "getcwd"),
    MovedAttribute("getcwdb", "os", "os", "getcwd", "getcwdb"),
    MovedAttribute("range", "__builtin__", "builtins", "xrange", "range"),
    MovedAttribute("reload_module", "__builtin__", "importlib" if PY34 else "imp", "reload"),
    MovedAttribute("reduce", "__builtin__", "functools"),
    MovedAttribute("shlex_quote", "pipes", "shlex", "quote"),
    MovedAttribute("StringIO", "StringIO", "io"),
    MovedAttribute("UserDict", "UserDict", "collections"),
    MovedAttribute("UserList", "UserList", "collections"),
    MovedAttribute("UserString", "UserString", "collections"),
    MovedAttribute("xrange", "__builtin__", "builtins", "xrange", "range"),
    MovedAttribute("zip", "itertools", "builtins", "izip", "zip"),
    MovedAttribute("zip_longest", "itertools", "itertools", "izip_longest", "zip_longest"),
    MovedModule("builtins", "__builtin__"),
    MovedModule("configparser", "ConfigParser"),
    MovedModule("copyreg", "copy_reg"),
    MovedModule("dbm_gnu", "gdbm", "dbm.gnu"),
    MovedModule("_dummy_thread", "dummy_thread", "_dummy_thread"),
    MovedModule("http_cookiejar", "cookielib", "http.cookiejar"),
    MovedModule("http_cookies", "Cookie", "http.cookies"),
    MovedModule("html_entities", "htmlentitydefs", "html.entities"),
    MovedModule("html_parser", "HTMLParser", "html.parser"),
    MovedModule("http_client", "httplib", "http.client"),
    MovedModule("email_mime_multipart", "email.MIMEMultipart", "email.mime.multipart"),
    MovedModule("email_mime_nonmultipart", "email.MIMENonMultipart", "email.mime.nonmultipart"),
    MovedModule("email_mime_text", "email.MIMEText", "email.mime.text"),
    MovedModule("email_mime_base", "email.MIMEBase", "email.mime.base"),
    MovedModule("BaseHTTPServer", "BaseHTTPServer", "http.server"),
    MovedModule("CGIHTTPServer", "CGIHTTPServer", "http.server"),
    MovedModule("SimpleHTTPServer", "SimpleHTTPServer", "http.server"),
    MovedModule("cPickle", "cPickle", "pickle"),
    MovedModule("queue", "Queue"),
    MovedModule("reprlib", "repr"),
    MovedModule("socketserver", "SocketServer"),
    MovedModule("_thread", "thread", "_thread"),
    MovedModule("tkinter", "Tkinter"),
    MovedModule("tkinter_dialog", "Dialog", "tkinter.dialog"),
    MovedModule("tkinter_filedialog", "FileDialog", "tkinter.filedialog"),
    MovedModule("tkinter_scrolledtext", "ScrolledText", "tkinter.scrolledtext"),
    MovedModule("tkinter_simpledialog", "SimpleDialog", "tkinter.simpledialog"),
    MovedModule("tkinter_tix", "Tix", "tkinter.tix"),
    MovedModule("tkinter_ttk", "ttk", "tkinter.ttk"),
    MovedModule("tkinter_constants", "Tkconstants", "tkinter.constants"),
    MovedModule("tkinter_dnd", "Tkdnd", "tkinter.dnd"),
    MovedModule("tkinter_colorchooser", "tkColorChooser",
                "tkinter.colorchooser"),
    MovedModule("tkinter_commondialog", "tkCommonDialog",
                "tkinter.commondialog"),
    MovedModule("tkinter_tkfiledialog", "tkFileDialog", "tkinter.filedialog"),
    MovedModule("tkinter_font", "tkFont", "tkinter.font"),
    MovedModule("tkinter_messagebox", "tkMessageBox", "tkinter.messagebox"),
    MovedModule("tkinter_tksimpledialog", "tkSimpleDialog",
                "tkinter.simpledialog"),
    MovedModule("urllib_parse", __name__ + ".moves.urllib_parse", "urllib.parse"),
    MovedModule("urllib_error", __name__ + ".moves.urllib_error", "urllib.error"),
    MovedModule("urllib", __name__ + ".moves.urllib", __name__ + ".moves.urllib"),
    MovedModule("urllib_robotparser", "robotparser", "urllib.robotparser"),
    MovedModule("xmlrpc_client", "xmlrpclib", "xmlrpc.client"),
    MovedModule("xmlrpc_server", "SimpleXMLRPCServer", "xmlrpc.server"),
]
# Add windows specific modules.
if sys.platform == "win32":
    _moved_attributes += [
        MovedModule("winreg", "_winreg"),
    ]

for attr in _moved_attributes:
    setattr(_MovedItems, attr.name, attr)
    if isinstance(attr, MovedModule):
        _importer._add_module(attr, "moves." + attr.name)
del attr

_MovedItems._moved_attributes = _moved_attributes

moves = _MovedItems(__name__ + ".moves")
_importer._add_module(moves, "moves")

class Module_six_moves_urllib_parse(_LazyModule):

    """Lazy loading of moved objects in six.moves.urllib_parse"""


_urllib_parse_moved_attributes = [
    MovedAttribute("ParseResult", "urlparse", "urllib.parse"),
    MovedAttribute("SplitResult", "urlparse", "urllib.parse"),
    MovedAttribute("parse_qs", "urlparse", "urllib.parse"),
    MovedAttribute("parse_qsl", "urlparse", "urllib.parse"),
    MovedAttribute("urldefrag", "urlparse", "urllib.parse"),
    MovedAttribute("urljoin", "urlparse", "urllib.parse"),
    MovedAttribute("urlparse", "urlparse", "urllib.parse"),
    MovedAttribute("urlsplit", "urlparse", "urllib.parse"),
    MovedAttribute("urlunparse", "urlparse", "urllib.parse"),
    MovedAttribute("urlunsplit", "urlparse", "urllib.parse"),
    MovedAttribute("quote", "urllib", "urllib.parse"),
    MovedAttribute("quote_plus", "urllib", "urllib.parse"),
    MovedAttribute("unquote", "urllib", "urllib.parse"),
    MovedAttribute("unquote_plus", "urllib", "urllib.parse"),
    MovedAttribute("urlencode", "urllib", "urllib.parse"),
    MovedAttribute("splitquery", "urllib", "urllib.parse"),
    MovedAttribute("splittag", "urllib", "urllib.parse"),
    MovedAttribute("splituser", "urllib", "urllib.parse"),
    MovedAttribute("uses_fragment", "urlparse", "urllib.parse"),
    MovedAttribute("uses_netloc", "urlparse", "urllib.parse"),
    MovedAttribute("uses_params", "urlparse", "urllib.parse"),
    MovedAttribute("uses_query", "urlparse", "urllib.parse"),
    MovedAttribute("uses_relative", "urlparse", "urllib.parse"),
]
for attr in _urllib_parse_moved_attributes:
    setattr(Module_six_moves_urllib_parse, attr.name, attr)
del attr

Module_six_moves_urllib_parse._moved_attributes = _urllib_parse_moved_attributes

_importer._add_module(Module_six_moves_urllib_parse(__name__ + ".moves.urllib_parse"),
                      "moves.urllib_parse", "moves.urllib.parse")


class Module_six_moves_urllib_error(_LazyModule):

    """Lazy loading of moved objects in six.moves.urllib_error"""


_urllib_error_moved_attributes = [
    MovedAttribute("URLError", "urllib2", "urllib.error"),
    MovedAttribute("HTTPError", "urllib2", "urllib.error"),
    MovedAttribute("ContentTooShortError", "urllib", "urllib.error"),
]
for attr in _urllib_error_moved_attributes:
    setattr(Module_six_moves_urllib_error, attr.name, attr)
del attr

Module_six_moves_urllib_error._moved_attributes = _urllib_error_moved_attributes

_importer._add_module(Module_six_moves_urllib_error(__name__ + ".moves.urllib.error"),
                      "moves.urllib_error", "moves.urllib.error")


class Module_six_moves_urllib_request(_LazyModule):

    """Lazy loading of moved objects in six.moves.urllib_request"""


_urllib_request_moved_attributes = [
    MovedAttribute("urlopen", "urllib2", "urllib.request"),
    MovedAttribute("install_opener", "urllib2", "urllib.request"),
    MovedAttribute("build_opener", "urllib2", "urllib.request"),
    MovedAttribute("pathname2url", "urllib", "urllib.request"),
    MovedAttribute("url2pathname", "urllib", "urllib.request"),
    MovedAttribute("getproxies", "urllib", "urllib.request"),
    MovedAttribute("Request", "urllib2", "urllib.request"),
    MovedAttribute("OpenerDirector", "urllib2", "urllib.request"),
    MovedAttribute("HTTPDefaultErrorHandler", "urllib2", "urllib.request"),
    MovedAttribute("HTTPRedirectHandler", "urllib2", "urllib.request"),
    MovedAttribute("HTTPCookieProcessor", "urllib2", "urllib.request"),
    MovedAttribute("ProxyHandler", "urllib2", "urllib.request"),
    MovedAttribute("BaseHandler", "urllib2", "urllib.request"),
    MovedAttribute("HTTPPasswordMgr", "urllib2", "urllib.request"),
    MovedAttribute("HTTPPasswordMgrWithDefaultRealm", "urllib2", "urllib.request"),
    MovedAttribute("AbstractBasicAuthHandler", "urllib2", "urllib.request"),
    MovedAttribute("HTTPBasicAuthHandler", "urllib2", "urllib.request"),
    MovedAttribute("ProxyBasicAuthHandler", "urllib2", "urllib.request"),
    MovedAttribute("AbstractDigestAuthHandler", "urllib2", "urllib.request"),
    MovedAttribute("HTTPDigestAuthHandler", "urllib2", "urllib.request"),
    MovedAttribute("ProxyDigestAuthHandler", "urllib2", "urllib.request"),
    MovedAttribute("HTTPHandler", "urllib2", "urllib.request"),
    MovedAttribute("HTTPSHandler", "urllib2", "urllib.request"),
    MovedAttribute("FileHandler", "urllib2", "urllib.request"),
    MovedAttribute("FTPHandler", "urllib2", "urllib.request"),
    MovedAttribute("CacheFTPHandler", "urllib2", "urllib.request"),
    MovedAttribute("UnknownHandler", "urllib2", "urllib.request"),
    MovedAttribute("HTTPErrorProcessor", "urllib2", "urllib.request"),
    MovedAttribute("urlretrieve", "urllib", "urllib.request"),
    MovedAttribute("urlcleanup", "urllib", "urllib.request"),
    MovedAttribute("URLopener", "urllib", "urllib.request"),
    MovedAttribute("FancyURLopener", "urllib", "urllib.request"),
    MovedAttribute("proxy_bypass", "urllib", "urllib.request"),
]
for attr in _urllib_request_moved_attributes:
    setattr(Module_six_moves_urllib_request, attr.name, attr)
del attr

Module_six_moves_urllib_request._moved_attributes = _urllib_request_moved_attributes

_importer._add_module(Module_six_moves_urllib_request(__name__ + ".moves.urllib.request"),
                      "moves.urllib_request", "moves.urllib.request")


class Module_six_moves_urllib_response(_LazyModule):

    """Lazy loading of moved objects in six.moves.urllib_response"""


_urllib_response_moved_attributes = [
    MovedAttribute("addbase", "urllib", "urllib.response"),
    MovedAttribute("addclosehook", "urllib", "urllib.response"),
    MovedAttribute("addinfo", "urllib", "urllib.response"),
    MovedAttribute("addinfourl", "urllib", "urllib.response"),
]
for attr in _urllib_response_moved_attributes:
    setattr(Module_six_moves_urllib_response, attr.name, attr)
del attr

Module_six_moves_urllib_response._moved_attributes = _urllib_response_moved_attributes

_importer._add_module(Module_six_moves_urllib_response(__name__ + ".moves.urllib.response"),
                      "moves.urllib_response", "moves.urllib.response")


class Module_six_moves_urllib_robotparser(_LazyModule):

    """Lazy loading of moved objects in six.moves.urllib_robotparser"""


_urllib_robotparser_moved_attributes = [
    MovedAttribute("RobotFileParser", "robotparser", "urllib.robotparser"),
]
for attr in _urllib_robotparser_moved_attributes:
    setattr(Module_six_moves_urllib_robotparser, attr.name, attr)
del attr

Module_six_moves_urllib_robotparser._moved_attributes = _urllib_robotparser_moved_attributes

_importer._add_module(Module_six_moves_urllib_robotparser(__name__ + ".moves.urllib.robotparser"),
                      "moves.urllib_robotparser", "moves.urllib.robotparser")

class Module_six_moves_urllib(types.ModuleType):

    """Create a six.moves.urllib namespace that resembles the Python 3 namespace"""
    __path__ = []  # mark as package
    parse = _importer._get_module("moves.urllib_parse")
    error = _importer._get_module("moves.urllib_error")
    request = _importer._get_module("moves.urllib_request")
    response = _importer._get_module("moves.urllib_response")
    robotparser = _importer._get_module("moves.urllib_robotparser")

    def __dir__(self):
        return ['parse', 'error', 'request', 'response', 'robotparser']

_importer._add_module(Module_six_moves_urllib(__name__ + ".moves.urllib"),
                      "moves.urllib")


def add_move(move):
    """Add an item to six.moves."""
    setattr(_MovedItems, move.name, move)


def remove_move(name):
    """Remove item from six.moves."""
    try:
        delattr(_MovedItems, name)
    except AttributeError:
        try:
            del moves.__dict__[name]
        except KeyError:
            raise AttributeError("no such move, %r" % (name,))

if PY3:
|
||||||
|
_meth_func = "__func__"
|
||||||
|
_meth_self = "__self__"
|
||||||
|
|
||||||
|
_func_closure = "__closure__"
|
||||||
|
_func_code = "__code__"
|
||||||
|
_func_defaults = "__defaults__"
|
||||||
|
_func_globals = "__globals__"
|
||||||
|
else:
|
||||||
|
_meth_func = "im_func"
|
||||||
|
_meth_self = "im_self"
|
||||||
|
|
||||||
|
_func_closure = "func_closure"
|
||||||
|
_func_code = "func_code"
|
||||||
|
_func_defaults = "func_defaults"
|
||||||
|
_func_globals = "func_globals"
|
||||||
|
|
||||||
|
|
||||||
|
try:
|
||||||
|
advance_iterator = next
|
||||||
|
except NameError:
|
||||||
|
def advance_iterator(it):
|
||||||
|
return it.next()
|
||||||
|
next = advance_iterator
|
||||||
|
|
||||||
|
|
||||||
|
try:
|
||||||
|
callable = callable
|
||||||
|
except NameError:
|
||||||
|
def callable(obj):
|
||||||
|
return any("__call__" in klass.__dict__ for klass in type(obj).__mro__)
|
||||||
|
|
||||||
|
|
||||||
|
if PY3:
|
||||||
|
def get_unbound_function(unbound):
|
||||||
|
return unbound
|
||||||
|
|
||||||
|
create_bound_method = types.MethodType
|
||||||
|
|
||||||
|
def create_unbound_method(func, cls):
|
||||||
|
return func
|
||||||
|
|
||||||
|
Iterator = object
|
||||||
|
else:
|
||||||
|
def get_unbound_function(unbound):
|
||||||
|
return unbound.im_func
|
||||||
|
|
||||||
|
def create_bound_method(func, obj):
|
||||||
|
return types.MethodType(func, obj, obj.__class__)
|
||||||
|
|
||||||
|
def create_unbound_method(func, cls):
|
||||||
|
return types.MethodType(func, None, cls)
|
||||||
|
|
||||||
|
class Iterator(object):
|
||||||
|
|
||||||
|
def next(self):
|
||||||
|
return type(self).__next__(self)
|
||||||
|
|
||||||
|
callable = callable
|
||||||
|
_add_doc(get_unbound_function,
|
||||||
|
"""Get the function out of a possibly unbound function""")
|
||||||
|
|
||||||
|
|
||||||
|
get_method_function = operator.attrgetter(_meth_func)
|
||||||
|
get_method_self = operator.attrgetter(_meth_self)
|
||||||
|
get_function_closure = operator.attrgetter(_func_closure)
|
||||||
|
get_function_code = operator.attrgetter(_func_code)
|
||||||
|
get_function_defaults = operator.attrgetter(_func_defaults)
|
||||||
|
get_function_globals = operator.attrgetter(_func_globals)
|
||||||
|
|
||||||
|
|
||||||
|
if PY3:
|
||||||
|
def iterkeys(d, **kw):
|
||||||
|
return iter(d.keys(**kw))
|
||||||
|
|
||||||
|
def itervalues(d, **kw):
|
||||||
|
return iter(d.values(**kw))
|
||||||
|
|
||||||
|
def iteritems(d, **kw):
|
||||||
|
return iter(d.items(**kw))
|
||||||
|
|
||||||
|
def iterlists(d, **kw):
|
||||||
|
return iter(d.lists(**kw))
|
||||||
|
|
||||||
|
viewkeys = operator.methodcaller("keys")
|
||||||
|
|
||||||
|
viewvalues = operator.methodcaller("values")
|
||||||
|
|
||||||
|
viewitems = operator.methodcaller("items")
|
||||||
|
else:
|
||||||
|
def iterkeys(d, **kw):
|
||||||
|
return d.iterkeys(**kw)
|
||||||
|
|
||||||
|
def itervalues(d, **kw):
|
||||||
|
return d.itervalues(**kw)
|
||||||
|
|
||||||
|
def iteritems(d, **kw):
|
||||||
|
return d.iteritems(**kw)
|
||||||
|
|
||||||
|
def iterlists(d, **kw):
|
||||||
|
return d.iterlists(**kw)
|
||||||
|
|
||||||
|
viewkeys = operator.methodcaller("viewkeys")
|
||||||
|
|
||||||
|
viewvalues = operator.methodcaller("viewvalues")
|
||||||
|
|
||||||
|
viewitems = operator.methodcaller("viewitems")
|
||||||
|
|
||||||
|
_add_doc(iterkeys, "Return an iterator over the keys of a dictionary.")
|
||||||
|
_add_doc(itervalues, "Return an iterator over the values of a dictionary.")
|
||||||
|
_add_doc(iteritems,
|
||||||
|
"Return an iterator over the (key, value) pairs of a dictionary.")
|
||||||
|
_add_doc(iterlists,
|
||||||
|
"Return an iterator over the (key, [values]) pairs of a dictionary.")
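The iteration shims above simply dispatch to the version-appropriate dict methods. A minimal sketch of the PY3 branch, using a plain dict (the functions below re-create the helpers rather than importing six):

```python
import operator

# Re-creation of the PY3 branch above: map the Py2-style names onto
# Python 3's lazy dict methods.
def iterkeys(d, **kw):
    return iter(d.keys(**kw))

def iteritems(d, **kw):
    return iter(d.items(**kw))

viewitems = operator.methodcaller("items")

d = {"a": 1, "b": 2}
print(sorted(iterkeys(d)))    # keys, one at a time
print(sorted(iteritems(d)))   # (key, value) pairs
print(len(viewitems(d)))      # a live dict view, not a list
```

On Python 3 the `view*` callables are just the plain `keys`/`values`/`items` methods, since those already return views.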


if PY3:
    def b(s):
        return s.encode("latin-1")

    def u(s):
        return s
    unichr = chr
    import struct
    int2byte = struct.Struct(">B").pack
    del struct
    byte2int = operator.itemgetter(0)
    indexbytes = operator.getitem
    iterbytes = iter
    import io
    StringIO = io.StringIO
    BytesIO = io.BytesIO
    _assertCountEqual = "assertCountEqual"
    if sys.version_info[1] <= 1:
        _assertRaisesRegex = "assertRaisesRegexp"
        _assertRegex = "assertRegexpMatches"
    else:
        _assertRaisesRegex = "assertRaisesRegex"
        _assertRegex = "assertRegex"
else:
    def b(s):
        return s
    # Workaround for standalone backslash

    def u(s):
        return unicode(s.replace(r'\\', r'\\\\'), "unicode_escape")
    unichr = unichr
    int2byte = chr

    def byte2int(bs):
        return ord(bs[0])

    def indexbytes(buf, i):
        return ord(buf[i])
    iterbytes = functools.partial(itertools.imap, ord)
    import StringIO
    StringIO = BytesIO = StringIO.StringIO
    _assertCountEqual = "assertItemsEqual"
    _assertRaisesRegex = "assertRaisesRegexp"
    _assertRegex = "assertRegexpMatches"
_add_doc(b, """Byte literal""")
_add_doc(u, """Text literal""")
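A quick sketch of the PY3 byte-string helpers defined above (re-created here rather than imported, so it runs standalone):

```python
import operator
import struct

# Re-creation of the PY3 branch above: byte-literal and single-byte helpers.
def b(s):
    return s.encode("latin-1")

int2byte = struct.Struct(">B").pack
byte2int = operator.itemgetter(0)

data = b("AB")
print(data)            # b'AB'
print(int2byte(65))    # b'A'
print(byte2int(data))  # 65 -- indexing bytes yields an int on Py3
```

The `struct.Struct(">B").pack` trick exists because `bytes([n])` was not available on early 3.x, and `chr()` returns text rather than bytes there.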


def assertCountEqual(self, *args, **kwargs):
    return getattr(self, _assertCountEqual)(*args, **kwargs)


def assertRaisesRegex(self, *args, **kwargs):
    return getattr(self, _assertRaisesRegex)(*args, **kwargs)


def assertRegex(self, *args, **kwargs):
    return getattr(self, _assertRegex)(*args, **kwargs)


if PY3:
    exec_ = getattr(moves.builtins, "exec")

    def reraise(tp, value, tb=None):
        if value is None:
            value = tp()
        if value.__traceback__ is not tb:
            raise value.with_traceback(tb)
        raise value

else:
    def exec_(_code_, _globs_=None, _locs_=None):
        """Execute code in a namespace."""
        if _globs_ is None:
            frame = sys._getframe(1)
            _globs_ = frame.f_globals
            if _locs_ is None:
                _locs_ = frame.f_locals
            del frame
        elif _locs_ is None:
            _locs_ = _globs_
        exec("""exec _code_ in _globs_, _locs_""")

    exec_("""def reraise(tp, value, tb=None):
    raise tp, value, tb
""")


if sys.version_info[:2] == (3, 2):
    exec_("""def raise_from(value, from_value):
    if from_value is None:
        raise value
    raise value from from_value
""")
elif sys.version_info[:2] > (3, 2):
    exec_("""def raise_from(value, from_value):
    raise value from from_value
""")
else:
    def raise_from(value, from_value):
        raise value
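The Py2 `reraise` has to be defined via `exec_` because `raise tp, value, tb` is a syntax error on Python 3. A sketch of how the Py3 variant above is used, re-raising a captured exception with its original traceback intact:

```python
import sys

# Re-creation of the PY3 reraise() above (a sketch, not six itself).
def reraise(tp, value, tb=None):
    if value is None:
        value = tp()
    if value.__traceback__ is not tb:
        raise value.with_traceback(tb)
    raise value

try:
    try:
        1 / 0
    except ZeroDivisionError:
        exc_info = sys.exc_info()          # (type, value, traceback)
        reraise(*exc_info)                 # same exception, same traceback
except ZeroDivisionError as e:
    caught = e

print(type(caught).__name__)
```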


print_ = getattr(moves.builtins, "print", None)
if print_ is None:
    def print_(*args, **kwargs):
        """The new-style print function for Python 2.4 and 2.5."""
        fp = kwargs.pop("file", sys.stdout)
        if fp is None:
            return

        def write(data):
            if not isinstance(data, basestring):
                data = str(data)
            # If the file has an encoding, encode unicode with it.
            if (isinstance(fp, file) and
                    isinstance(data, unicode) and
                    fp.encoding is not None):
                errors = getattr(fp, "errors", None)
                if errors is None:
                    errors = "strict"
                data = data.encode(fp.encoding, errors)
            fp.write(data)
        want_unicode = False
        sep = kwargs.pop("sep", None)
        if sep is not None:
            if isinstance(sep, unicode):
                want_unicode = True
            elif not isinstance(sep, str):
                raise TypeError("sep must be None or a string")
        end = kwargs.pop("end", None)
        if end is not None:
            if isinstance(end, unicode):
                want_unicode = True
            elif not isinstance(end, str):
                raise TypeError("end must be None or a string")
        if kwargs:
            raise TypeError("invalid keyword arguments to print()")
        if not want_unicode:
            for arg in args:
                if isinstance(arg, unicode):
                    want_unicode = True
                    break
        if want_unicode:
            newline = unicode("\n")
            space = unicode(" ")
        else:
            newline = "\n"
            space = " "
        if sep is None:
            sep = space
        if end is None:
            end = newline
        for i, arg in enumerate(args):
            if i:
                write(sep)
            write(arg)
        write(end)
if sys.version_info[:2] < (3, 3):
    _print = print_

    def print_(*args, **kwargs):
        fp = kwargs.get("file", sys.stdout)
        flush = kwargs.pop("flush", False)
        _print(*args, **kwargs)
        if flush and fp is not None:
            fp.flush()

_add_doc(reraise, """Reraise an exception.""")

if sys.version_info[0:2] < (3, 4):
    def wraps(wrapped, assigned=functools.WRAPPER_ASSIGNMENTS,
              updated=functools.WRAPPER_UPDATES):
        def wrapper(f):
            f = functools.wraps(wrapped, assigned, updated)(f)
            f.__wrapped__ = wrapped
            return f
        return wrapper
else:
    wraps = functools.wraps
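On Python 3.4+, `six.wraps` is just `functools.wraps`; the `__wrapped__` attribute that the backport above sets by hand is applied automatically. A sketch with a hypothetical `logged` decorator:

```python
import functools

# Hypothetical decorator for illustration; functools.wraps copies the
# wrapped function's metadata onto the wrapper and sets __wrapped__.
def logged(func):
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        return func(*args, **kwargs)
    return wrapper

@logged
def add(a, b):
    "Add two numbers."
    return a + b

print(add.__name__)           # add, not wrapper
print(add.__doc__)            # Add two numbers.
print(add.__wrapped__(2, 3))  # the undecorated function is reachable
```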


def with_metaclass(meta, *bases):
    """Create a base class with a metaclass."""
    # This requires a bit of explanation: the basic idea is to make a dummy
    # metaclass for one level of class instantiation that replaces itself with
    # the actual metaclass.
    class metaclass(meta):

        def __new__(cls, name, this_bases, d):
            return meta(name, bases, d)
    return type.__new__(metaclass, 'temporary_class', (), {})
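The dummy class trick above lets one `class` statement work under both the Py2 (`__metaclass__`) and Py3 (`metaclass=`) syntaxes. A sketch with a hypothetical registering metaclass (`Registry` is illustrative, not from the source):

```python
# Re-creation of the with_metaclass() helper defined above.
def with_metaclass(meta, *bases):
    class metaclass(meta):
        def __new__(cls, name, this_bases, d):
            return meta(name, bases, d)
    return type.__new__(metaclass, 'temporary_class', (), {})

class Registry(type):
    classes = []

    def __new__(mcls, name, bases, d):
        cls = super().__new__(mcls, name, bases, d)
        Registry.classes.append(name)
        return cls

class Base(with_metaclass(Registry)):
    pass

print(type(Base) is Registry)  # the real metaclass, not the dummy
print(Registry.classes)        # only Base was registered
```

The temporary class is created with a bare `type.__new__`, so the real metaclass's `__new__` fires exactly once, when `Base` itself is built.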


def add_metaclass(metaclass):
    """Class decorator for creating a class with a metaclass."""
    def wrapper(cls):
        orig_vars = cls.__dict__.copy()
        slots = orig_vars.get('__slots__')
        if slots is not None:
            if isinstance(slots, str):
                slots = [slots]
            for slots_var in slots:
                orig_vars.pop(slots_var)
        orig_vars.pop('__dict__', None)
        orig_vars.pop('__weakref__', None)
        return metaclass(cls.__name__, cls.__bases__, orig_vars)
    return wrapper
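Rather than inserting a dummy base class, `@add_metaclass` rebuilds the decorated class through the given metaclass. A sketch (the `Upper` metaclass is hypothetical, purely for illustration; the `__slots__` handling is omitted):

```python
# Simplified re-creation of the add_metaclass() decorator above.
def add_metaclass(metaclass):
    def wrapper(cls):
        orig_vars = cls.__dict__.copy()
        orig_vars.pop('__dict__', None)
        orig_vars.pop('__weakref__', None)
        return metaclass(cls.__name__, cls.__bases__, orig_vars)
    return wrapper

class Upper(type):
    def __new__(mcls, name, bases, d):
        # Hypothetical behavior for illustration: uppercase the class name.
        return super().__new__(mcls, name.upper(), bases, d)

@add_metaclass(Upper)
class widget(object):
    pass

print(widget.__name__)       # WIDGET
print(type(widget) is Upper)
```

The `__dict__`/`__weakref__` pops matter because those slots are re-created by the metaclass call; copying them over would raise a `TypeError`.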


def python_2_unicode_compatible(klass):
    """
    A decorator that defines __unicode__ and __str__ methods under Python 2.
    Under Python 3 it does nothing.

    To support Python 2 and 3 with a single code base, define a __str__ method
    returning text and apply this decorator to the class.
    """
    if PY2:
        if '__str__' not in klass.__dict__:
            raise ValueError("@python_2_unicode_compatible cannot be applied "
                             "to %s because it doesn't define __str__()." %
                             klass.__name__)
        klass.__unicode__ = klass.__str__
        klass.__str__ = lambda self: self.__unicode__().encode('utf-8')
    return klass
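A sketch of how the decorator above behaves, with the PY2 check inlined so it runs standalone (on Python 3 it is a no-op and `__str__` keeps returning text):

```python
import sys

PY2 = sys.version_info[0] == 2

# Simplified re-creation of the decorator above (error check omitted).
def python_2_unicode_compatible(klass):
    if PY2:
        klass.__unicode__ = klass.__str__
        klass.__str__ = lambda self: self.__unicode__().encode('utf-8')
    return klass

@python_2_unicode_compatible
class Greeting(object):
    def __str__(self):
        return u"héllo"   # always return text; the decorator handles bytes

print(str(Greeting()))
```

Under Python 2 the original text-returning method is moved to `__unicode__`, and `__str__` becomes a UTF-8-encoding shim, matching Py2's expectation that `str()` yields bytes.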


# Complete the moves implementation.
# This code is at the end of this module to speed up module loading.
# Turn this module into a package.
__path__ = []  # required for PEP 302 and PEP 451
__package__ = __name__  # see PEP 366 @ReservedAssignment
if globals().get("__spec__") is not None:
    __spec__.submodule_search_locations = []  # PEP 451 @UndefinedVariable
# Remove other six meta path importers, since they cause problems. This can
# happen if six is removed from sys.modules and then reloaded. (Setuptools does
# this for some reason.)
if sys.meta_path:
    for i, importer in enumerate(sys.meta_path):
        # Here's some real nastiness: Another "instance" of the six module
        # might be floating around. Therefore, we can't use isinstance() to
        # check for the six meta path importer, since the other six instance
        # will have inserted an importer with a different class.
        if (type(importer).__name__ == "_SixMetaPathImporter" and
                importer.name == __name__):
            del sys.meta_path[i]
            break
    del i, importer
# Finally, add the importer to the meta path import hook.
sys.meta_path.append(_importer)

135 lib/python3.4/site-packages/sqlalchemy/__init__.py Normal file
@ -0,0 +1,135 @@
# sqlalchemy/__init__.py
# Copyright (C) 2005-2014 the SQLAlchemy authors and contributors
# <see AUTHORS file>
#
# This module is part of SQLAlchemy and is released under
# the MIT License: http://www.opensource.org/licenses/mit-license.php


from .sql import (
    alias,
    and_,
    asc,
    between,
    bindparam,
    case,
    cast,
    collate,
    delete,
    desc,
    distinct,
    except_,
    except_all,
    exists,
    extract,
    false,
    func,
    insert,
    intersect,
    intersect_all,
    join,
    literal,
    literal_column,
    modifier,
    not_,
    null,
    or_,
    outerjoin,
    outparam,
    over,
    select,
    subquery,
    text,
    true,
    tuple_,
    type_coerce,
    union,
    union_all,
    update,
    )

from .types import (
    BIGINT,
    BINARY,
    BLOB,
    BOOLEAN,
    BigInteger,
    Binary,
    Boolean,
    CHAR,
    CLOB,
    DATE,
    DATETIME,
    DECIMAL,
    Date,
    DateTime,
    Enum,
    FLOAT,
    Float,
    INT,
    INTEGER,
    Integer,
    Interval,
    LargeBinary,
    NCHAR,
    NVARCHAR,
    NUMERIC,
    Numeric,
    PickleType,
    REAL,
    SMALLINT,
    SmallInteger,
    String,
    TEXT,
    TIME,
    TIMESTAMP,
    Text,
    Time,
    TypeDecorator,
    Unicode,
    UnicodeText,
    VARBINARY,
    VARCHAR,
    )


from .schema import (
    CheckConstraint,
    Column,
    ColumnDefault,
    Constraint,
    DefaultClause,
    FetchedValue,
    ForeignKey,
    ForeignKeyConstraint,
    Index,
    MetaData,
    PassiveDefault,
    PrimaryKeyConstraint,
    Sequence,
    Table,
    ThreadLocalMetaData,
    UniqueConstraint,
    DDL,
    )


from .inspection import inspect
from .engine import create_engine, engine_from_config

__version__ = '0.9.7'


def __go(lcls):
    global __all__

    from . import events
    from . import util as _sa_util

    import inspect as _inspect

    __all__ = sorted(name for name, obj in lcls.items()
                     if not (name.startswith('_') or _inspect.ismodule(obj)))

    _sa_util.dependencies.resolve_all("sqlalchemy")
__go(locals())

@ -0,0 +1,10 @@
# connectors/__init__.py
# Copyright (C) 2005-2014 the SQLAlchemy authors and contributors
# <see AUTHORS file>
#
# This module is part of SQLAlchemy and is released under
# the MIT License: http://www.opensource.org/licenses/mit-license.php


class Connector(object):
    pass

150 lib/python3.4/site-packages/sqlalchemy/connectors/mxodbc.py Normal file
@ -0,0 +1,150 @@
# connectors/mxodbc.py
# Copyright (C) 2005-2014 the SQLAlchemy authors and contributors
# <see AUTHORS file>
#
# This module is part of SQLAlchemy and is released under
# the MIT License: http://www.opensource.org/licenses/mit-license.php

"""
Provide a SQLAlchemy connector for the eGenix mxODBC commercial
Python adapter for ODBC. This is not a free product, but eGenix
provides SQLAlchemy with a license for use in continuous integration
testing.

This has been tested for use with mxODBC 3.1.2 on SQL Server 2005
and 2008, using the SQL Server Native driver. However, it is
possible for this to be used on other database platforms.

For more info on mxODBC, see http://www.egenix.com/

"""

import sys
import re
import warnings

from . import Connector


class MxODBCConnector(Connector):
    driver = 'mxodbc'

    supports_sane_multi_rowcount = False
    supports_unicode_statements = True
    supports_unicode_binds = True

    supports_native_decimal = True

    @classmethod
    def dbapi(cls):
        # this classmethod will normally be replaced by an instance
        # attribute of the same name, so this is normally only called once.
        cls._load_mx_exceptions()
        platform = sys.platform
        if platform == 'win32':
            from mx.ODBC import Windows as module
        # this can be the string "linux2", and possibly others
        elif 'linux' in platform:
            from mx.ODBC import unixODBC as module
        elif platform == 'darwin':
            from mx.ODBC import iODBC as module
        else:
            raise ImportError("Unrecognized platform for mxODBC import")
        return module

    @classmethod
    def _load_mx_exceptions(cls):
        """Import mxODBC exception classes into the module namespace,
        as if they had been imported normally. This is done here
        to avoid requiring all SQLAlchemy users to install mxODBC.
        """
        global InterfaceError, ProgrammingError
        from mx.ODBC import InterfaceError
        from mx.ODBC import ProgrammingError

    def on_connect(self):
        def connect(conn):
            conn.stringformat = self.dbapi.MIXED_STRINGFORMAT
            conn.datetimeformat = self.dbapi.PYDATETIME_DATETIMEFORMAT
            conn.decimalformat = self.dbapi.DECIMAL_DECIMALFORMAT
            conn.errorhandler = self._error_handler()
        return connect

    def _error_handler(self):
        """Return a handler that adjusts mxODBC's raised Warnings to
        emit Python standard warnings.
        """
        from mx.ODBC.Error import Warning as MxOdbcWarning

        def error_handler(connection, cursor, errorclass, errorvalue):
            if issubclass(errorclass, MxOdbcWarning):
                errorclass.__bases__ = (Warning,)
                warnings.warn(message=str(errorvalue),
                              category=errorclass,
                              stacklevel=2)
            else:
                raise errorclass(errorvalue)
        return error_handler

    def create_connect_args(self, url):
        """Return a tuple of *args, **kwargs for creating a connection.

        The mxODBC 3.x connection constructor looks like this:

            connect(dsn, user='', password='',
                    clear_auto_commit=1, errorhandler=None)

        This method translates the values in the provided URI
        into the args and kwargs needed to instantiate an mxODBC Connection.

        The arg 'errorhandler' is not used by SQLAlchemy and will
        not be populated.

        """
        opts = url.translate_connect_args(username='user')
        opts.update(url.query)
        args = opts.pop('host')
        opts.pop('port', None)
        opts.pop('database', None)
        return (args,), opts

    def is_disconnect(self, e, connection, cursor):
        # TODO: eGenix recommends checking connection.closed here.
        # Does that detect dropped connections?
        if isinstance(e, self.dbapi.ProgrammingError):
            return "connection already closed" in str(e)
        elif isinstance(e, self.dbapi.Error):
            return '[08S01]' in str(e)
        else:
            return False

    def _get_server_version_info(self, connection):
        # eGenix suggests using conn.dbms_version instead
        # of what we're doing here
        dbapi_con = connection.connection
        version = []
        r = re.compile(r'[.\-]')
        # 18 == pyodbc.SQL_DBMS_VER
        for n in r.split(dbapi_con.getinfo(18)[1]):
            try:
                version.append(int(n))
            except ValueError:
                version.append(n)
        return tuple(version)
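The version parsing in `_get_server_version_info()` splits the server's version string on dots and dashes, keeping numeric fields as ints and anything else as text, so the tuples compare sensibly. A standalone sketch, applied to representative (hypothetical) SQL Server version strings:

```python
import re

# Re-creation of the parsing loop above, without the DB-API connection.
def parse_version(vers):
    version = []
    for n in re.split(r'[.\-]', vers):
        try:
            version.append(int(n))
        except ValueError:
            version.append(n)
    return tuple(version)

print(parse_version("09.00.1399"))      # (9, 0, 1399)
print(parse_version("10.50.2500-rc"))   # (10, 50, 2500, 'rc')
```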

    def _get_direct(self, context):
        if context:
            native_odbc_execute = context.execution_options.\
                get('native_odbc_execute', 'auto')
            # default to direct=True in all cases; it is more generally
            # compatible, especially with SQL Server
            return False if native_odbc_execute is True else True
        else:
            return True

    def do_executemany(self, cursor, statement, parameters, context=None):
        cursor.executemany(
            statement, parameters, direct=self._get_direct(context))

    def do_execute(self, cursor, statement, parameters, context=None):
        cursor.execute(statement, parameters, direct=self._get_direct(context))

145 lib/python3.4/site-packages/sqlalchemy/connectors/mysqldb.py Normal file
@ -0,0 +1,145 @@
# connectors/mysqldb.py
# Copyright (C) 2005-2014 the SQLAlchemy authors and contributors
# <see AUTHORS file>
#
# This module is part of SQLAlchemy and is released under
# the MIT License: http://www.opensource.org/licenses/mit-license.php

"""Define behaviors common to MySQLdb dialects.

Currently includes MySQL and Drizzle.

"""

from . import Connector
from ..engine import base as engine_base, default
from ..sql import operators as sql_operators
from .. import exc, log, schema, sql, types as sqltypes, util, processors
import re


# the subclassing of Connector by all classes
# here is not strictly necessary


class MySQLDBExecutionContext(Connector):

    @property
    def rowcount(self):
        if hasattr(self, '_rowcount'):
            return self._rowcount
        else:
            return self.cursor.rowcount


class MySQLDBCompiler(Connector):
    def visit_mod_binary(self, binary, operator, **kw):
        return self.process(binary.left, **kw) + " %% " + \
            self.process(binary.right, **kw)

    def post_process_text(self, text):
        return text.replace('%', '%%')


class MySQLDBIdentifierPreparer(Connector):

    def _escape_identifier(self, value):
        value = value.replace(self.escape_quote, self.escape_to_quote)
        return value.replace("%", "%%")


class MySQLDBConnector(Connector):
    driver = 'mysqldb'
    supports_unicode_statements = False
    supports_sane_rowcount = True
    supports_sane_multi_rowcount = True

    supports_native_decimal = True

    default_paramstyle = 'format'

    @classmethod
    def dbapi(cls):
        # is overridden when pymysql is used
        return __import__('MySQLdb')

    def do_executemany(self, cursor, statement, parameters, context=None):
        rowcount = cursor.executemany(statement, parameters)
        if context is not None:
            context._rowcount = rowcount

    def create_connect_args(self, url):
        opts = url.translate_connect_args(database='db', username='user',
                                          password='passwd')
        opts.update(url.query)

        util.coerce_kw_type(opts, 'compress', bool)
        util.coerce_kw_type(opts, 'connect_timeout', int)
        util.coerce_kw_type(opts, 'read_timeout', int)
        util.coerce_kw_type(opts, 'client_flag', int)
        util.coerce_kw_type(opts, 'local_infile', int)
        # Note: using either of the below will cause all strings to be
        # returned as Unicode, both in raw SQL operations and with column
        # types like String and MSString.
        util.coerce_kw_type(opts, 'use_unicode', bool)
        util.coerce_kw_type(opts, 'charset', str)

        # Rich values 'cursorclass' and 'conv' are not supported via
        # query string.

        ssl = {}
        keys = ['ssl_ca', 'ssl_key', 'ssl_cert', 'ssl_capath', 'ssl_cipher']
        for key in keys:
            if key in opts:
                ssl[key[4:]] = opts[key]
                util.coerce_kw_type(ssl, key[4:], str)
                del opts[key]
        if ssl:
            opts['ssl'] = ssl

        # FOUND_ROWS must be set in CLIENT_FLAGS to enable
        # supports_sane_rowcount.
        client_flag = opts.get('client_flag', 0)
        if self.dbapi is not None:
            try:
                CLIENT_FLAGS = __import__(
                    self.dbapi.__name__ + '.constants.CLIENT'
                ).constants.CLIENT
                client_flag |= CLIENT_FLAGS.FOUND_ROWS
            except (AttributeError, ImportError):
                self.supports_sane_rowcount = False
        opts['client_flag'] = client_flag
        return [[], opts]

    def _get_server_version_info(self, connection):
        dbapi_con = connection.connection
        version = []
        r = re.compile(r'[.\-]')
        for n in r.split(dbapi_con.get_server_info()):
            try:
                version.append(int(n))
            except ValueError:
                version.append(n)
        return tuple(version)

    def _extract_error_code(self, exception):
        return exception.args[0]

    def _detect_charset(self, connection):
        """Sniff out the character set in use for connection results."""

        try:
            # note: the SQL here would be
            # "SHOW VARIABLES LIKE 'character_set%%'"
            cset_name = connection.connection.character_set_name
        except AttributeError:
            util.warn(
                "No 'character_set_name' can be detected with "
                "this MySQL-Python version; "
                "please upgrade to a recent version of MySQL-Python. "
                "Assuming latin1.")
            return 'latin1'
        else:
            return cset_name()

172 lib/python3.4/site-packages/sqlalchemy/connectors/pyodbc.py Normal file
@ -0,0 +1,172 @@
# connectors/pyodbc.py
# Copyright (C) 2005-2014 the SQLAlchemy authors and contributors
# <see AUTHORS file>
#
# This module is part of SQLAlchemy and is released under
# the MIT License: http://www.opensource.org/licenses/mit-license.php

from . import Connector
from .. import util


import sys
import re


class PyODBCConnector(Connector):
    driver = 'pyodbc'

    supports_sane_multi_rowcount = False

    if util.py2k:
        # PyODBC unicode is broken on UCS-4 builds
        supports_unicode = sys.maxunicode == 65535
        supports_unicode_statements = supports_unicode

    supports_native_decimal = True
    default_paramstyle = 'named'

    # for non-DSN connections, this should
    # hold the desired driver name
    pyodbc_driver_name = None

    # will be set to True after initialize()
    # if the freetds.so is detected
    freetds = False

    # will be set to the string version of
    # the FreeTDS driver if freetds is detected
    freetds_driver_version = None

    # will be set to True after initialize()
    # if the libessqlsrv.so is detected
    easysoft = False

    def __init__(self, supports_unicode_binds=None, **kw):
        super(PyODBCConnector, self).__init__(**kw)
        self._user_supports_unicode_binds = supports_unicode_binds

    @classmethod
    def dbapi(cls):
        return __import__('pyodbc')

    def create_connect_args(self, url):
        opts = url.translate_connect_args(username='user')
        opts.update(url.query)

        keys = opts
        query = url.query

        connect_args = {}
        for param in ('ansi', 'unicode_results', 'autocommit'):
            if param in keys:
                connect_args[param] = util.asbool(keys.pop(param))

        if 'odbc_connect' in keys:
            connectors = [util.unquote_plus(keys.pop('odbc_connect'))]
        else:
            dsn_connection = 'dsn' in keys or \
                ('host' in keys and 'database' not in keys)
            if dsn_connection:
                connectors = ['dsn=%s' % (keys.pop('host', '') or
                                          keys.pop('dsn', ''))]
            else:
                port = ''
                if 'port' in keys and 'port' not in query:
                    port = ',%d' % int(keys.pop('port'))

                connectors = ["DRIVER={%s}" %
                              keys.pop('driver', self.pyodbc_driver_name),
                              'Server=%s%s' % (keys.pop('host', ''), port),
                              'Database=%s' % keys.pop('database', '')]

            user = keys.pop("user", None)
            if user:
                connectors.append("UID=%s" % user)
                connectors.append("PWD=%s" % keys.pop('password', ''))
            else:
                connectors.append("Trusted_Connection=Yes")

            # if set to 'Yes', the ODBC layer will try to automagically
            # convert textual data from your database encoding to your
            # client encoding. This should obviously be set to 'No' if
            # you query a cp1253 encoded database from a latin1 client...
            if 'odbc_autotranslate' in keys:
                connectors.append("AutoTranslate=%s" %
                                  keys.pop("odbc_autotranslate"))

            connectors.extend(['%s=%s' % (k, v) for k, v in keys.items()])
        return [[";".join(connectors)], connect_args]
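A sketch of what the non-DSN branch of `create_connect_args()` produces, assembling an ODBC connection string from hypothetical URL parts (host, database, user, password, and port are invented for illustration):

```python
# Hypothetical URL parts, mimicking the popped `keys` dict above.
keys = {'host': 'db.example.com', 'database': 'sales',
        'user': 'scott', 'password': 'tiger', 'port': 1433}

port = ',%d' % int(keys.pop('port'))
connectors = ["DRIVER={%s}" % 'SQL Server',
              'Server=%s%s' % (keys.pop('host', ''), port),
              'Database=%s' % keys.pop('database', '')]
user = keys.pop('user', None)
if user:
    connectors.append("UID=%s" % user)
    connectors.append("PWD=%s" % keys.pop('password', ''))

conn_str = ";".join(connectors)
print(conn_str)
# DRIVER={SQL Server};Server=db.example.com,1433;Database=sales;UID=scott;PWD=tiger
```

Note the ODBC convention of appending the port to the server name after a comma rather than using a separate keyword.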

    def is_disconnect(self, e, connection, cursor):
        if isinstance(e, self.dbapi.ProgrammingError):
            return "The cursor's connection has been closed." in str(e) or \
                'Attempt to use a closed connection.' in str(e)
        elif isinstance(e, self.dbapi.Error):
            return '[08S01]' in str(e)
        else:
            return False

    def initialize(self, connection):
        # determine FreeTDS first. can't issue SQL easily
        # without getting unicode_statements/binds set up.

        pyodbc = self.dbapi

        dbapi_con = connection.connection

        _sql_driver_name = dbapi_con.getinfo(pyodbc.SQL_DRIVER_NAME)
        self.freetds = bool(re.match(r".*libtdsodbc.*\.so",
                                     _sql_driver_name))
        self.easysoft = bool(re.match(r".*libessqlsrv.*\.so",
                                      _sql_driver_name))

        if self.freetds:
            self.freetds_driver_version = dbapi_con.getinfo(
                pyodbc.SQL_DRIVER_VER)

        self.supports_unicode_statements = (
            not util.py2k or
            (not self.freetds and not self.easysoft)
        )

        if self._user_supports_unicode_binds is not None:
            self.supports_unicode_binds = self._user_supports_unicode_binds
        elif util.py2k:
            self.supports_unicode_binds = (
                not self.freetds or self.freetds_driver_version >= '0.91'
            ) and not self.easysoft
        else:
            self.supports_unicode_binds = True

        # run other initialization which asks for user name, etc.
        super(PyODBCConnector, self).initialize(connection)

    def _dbapi_version(self):
        if not self.dbapi:
            return ()
        return self._parse_dbapi_version(self.dbapi.version)

    def _parse_dbapi_version(self, vers):
        m = re.match(
            r'(?:py.*-)?([\d\.]+)(?:-(\w+))?',
            vers
        )
        if not m:
            return ()
        vers = tuple([int(x) for x in m.group(1).split(".")])
        if m.group(2):
            vers += (m.group(2),)
        return vers

    def _get_server_version_info(self, connection):
        dbapi_con = connection.connection
        version = []
        r = re.compile(r'[.\-]')
        for n in r.split(dbapi_con.getinfo(self.dbapi.SQL_DBMS_VER)):
            try:
                version.append(int(n))
            except ValueError:
                version.append(n)
        return tuple(version)

Some files were not shown because too many files have changed in this diff.