What’s New In Python 3.2 — Python v3.2.6 documentation

This article explains the new features in Python 3.2 as compared to 3.1. It focuses on a few highlights and gives a few examples. For full details, see the Misc/NEWS file.

PEP 384: Defining a Stable ABI

In the past, extension modules built for one Python version were often not usable with other Python versions. Particularly on Windows, every feature release of Python required rebuilding all extension modules that one wanted to use. This requirement was the result of the free access to Python interpreter internals that extension modules could use.

With Python 3.2, an alternative approach becomes available: extension modules which restrict themselves to a limited API (by defining Py_LIMITED_API) cannot use many of the internals, but are constrained to a set of API functions that are promised to be stable for several releases. As a consequence, extension modules built for 3.2 in that mode will also work with 3.3, 3.4, and so on. Extension modules that make use of details of memory structures can still be built, but will need to be recompiled for every feature release.

See also

PEP 384 - Defining a Stable ABI

PEP written by Martin von Löwis.

PEP 389: Argparse Command Line Parsing Module

A new module for command line parsing, argparse, was introduced to overcome the limitations of optparse which did not provide support for positional arguments (not just options), subcommands, required options and other common patterns of specifying and validating options.

This module has already had widespread success in the community as a third-party module. Being more fully featured than its predecessor, the argparse module is now the preferred module for command-line processing. The older module is still being kept available because of the substantial amount of legacy code that depends on it.

Here’s an annotated example parser showing features like limiting results to a set of choices, specifying a metavar in the help screen, validating that one or more positional arguments is present, and making a required option:

import argparse
parser = argparse.ArgumentParser(
            description = 'Manage servers',             # main description for help
            epilog = 'Tested on Solaris and Linux')     # displayed after help
parser.add_argument('action',                           # argument name
            choices = ['deploy', 'start', 'stop'],      # three allowed values
            help = 'action on each target')             # help msg
parser.add_argument('targets',
            metavar = 'HOSTNAME',                       # var name used in help msg
            nargs = '+',                                # require one or more targets
            help = 'url for target machines')           # help msg explanation
parser.add_argument('-u', '--user',                     # -u or --user option
            required = True,                            # make it a required argument
            help = 'login as user')

Example of calling the parser on a command string:

>>> cmd  = 'deploy sneezy.example.com sleepy.example.com -u skycaptain'
>>> result = parser.parse_args(cmd.split())
>>> result.action
'deploy'
>>> result.targets
['sneezy.example.com', 'sleepy.example.com']
>>> result.user
'skycaptain'

Example of the parser’s automatically generated help:

parser.parse_args('-h'.split())

usage: manage_cloud.py [-h] -u USER {deploy,start,stop} HOSTNAME [HOSTNAME ...]

Manage servers

positional arguments:
  {deploy,start,stop}   action on each target
  HOSTNAME              url for target machines

optional arguments:
  -h, --help            show this help message and exit
  -u USER, --user USER  login as user

Tested on Solaris and Linux

An especially nice argparse feature is the ability to define subparsers, each with their own argument patterns and help displays:

import argparse
parser = argparse.ArgumentParser(prog='HELM')
subparsers = parser.add_subparsers()

parser_l = subparsers.add_parser('launch', help='Launch Control')    # first subgroup
parser_l.add_argument('-m', '--missiles', action='store_true')
parser_l.add_argument('-t', '--torpedos', action='store_true')

parser_m = subparsers.add_parser('move', help='Move Vessel',         # second subgroup
                                 aliases=('steer', 'turn'))          # equivalent names
parser_m.add_argument('-c', '--course', type=int, required=True)
parser_m.add_argument('-s', '--speed', type=int, default=0)

$ ./helm.py --help                         # top level help (launch and move)
$ ./helm.py launch --help                  # help for launch options
$ ./helm.py launch --missiles              # set missiles=True and torpedos=False
$ ./helm.py steer --course 180 --speed 5   # set movement parameters
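The subparser dispatch can be sketched in runnable form. This is a minimal illustration, not the HELM example itself; the 'command' destination name is an addition so the chosen subcommand can be inspected:

```python
import argparse

parser = argparse.ArgumentParser(prog='HELM')
subparsers = parser.add_subparsers(dest='command')   # record which subcommand ran
parser_l = subparsers.add_parser('launch')
parser_l.add_argument('-m', '--missiles', action='store_true')

# Parsing a command line routes the remaining arguments to the subparser:
ns = parser.parse_args(['launch', '--missiles'])
print(ns.command, ns.missiles)   # launch True
```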

PEP 391: Dictionary Based Configuration for Logging

The logging module provided two kinds of configuration: one style with function calls for each option, and another style driven by an external file saved in a ConfigParser format. Those options did not provide the flexibility to create configurations from JSON or YAML files, nor did they support incremental configuration, which is needed for specifying logger options from a command line.

To support a more flexible style, the module now offers logging.config.dictConfig() for specifying logging configuration with plain Python dictionaries. The configuration options include formatters, handlers, filters, and loggers. Here’s a working example of a configuration dictionary:

{"version": 1,
 "formatters": {"brief": {"format": "%(levelname)-8s: %(name)-15s: %(message)s"},
                "full": {"format": "%(asctime)s %(name)-15s %(levelname)-8s %(message)s"}
                },
 "handlers": {"console": {
                   "class": "logging.StreamHandler",
                   "formatter": "brief",
                   "level": "INFO",
                   "stream": "ext://sys.stdout"},
              "console_priority": {
                   "class": "logging.StreamHandler",
                   "formatter": "full",
                   "level": "ERROR",
                   "stream": "ext://sys.stderr"}
              },
 "root": {"level": "DEBUG", "handlers": ["console", "console_priority"]}}

If that dictionary is stored in a file called conf.json, it can be loaded and called with code like this:

>>> import json, logging.config
>>> with open('conf.json') as f:
...     conf = json.load(f)
...
>>> logging.config.dictConfig(conf)
>>> logging.info("Transaction completed normally")
INFO    : root           : Transaction completed normally
>>> logging.critical("Abnormal termination")
2011-02-17 11:14:36,694 root CRITICAL Abnormal termination

See also

PEP 391 - Dictionary Based Configuration for Logging

PEP written by Vinay Sajip.

PEP 3148: The concurrent.futures module

Code for creating and managing concurrency is being collected in a new top-level namespace, concurrent. Its first member is a futures package which provides a uniform high-level interface for managing threads and processes.

The design for concurrent.futures was inspired by the java.util.concurrent package. In that model, a running call and its result are represented by a Future object that abstracts features common to threads, processes, and remote procedure calls. That object supports status checks (running or done), timeouts, cancellations, adding callbacks, and access to results or exceptions.

The primary offering of the new module is a pair of executor classes for launching and managing calls. The goal of the executors is to make it easier to use existing tools for making parallel calls. They save the effort needed to set up a pool of resources, launch the calls, create a results queue, add time-out handling, and limit the total number of threads, processes, or remote procedure calls.

Ideally, each application should share a single executor across multiple components so that process and thread limits can be centrally managed. This solves the design challenge that arises when each component has its own competing strategy for resource management.

Both classes share a common interface with three methods: submit() for scheduling a callable and returning a Future object; map() for scheduling many asynchronous calls at a time, and shutdown() for freeing resources. The class is a context manager and can be used in a with statement to assure that resources are automatically released when currently pending futures are done executing.
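That interface can be sketched briefly; the square helper below is invented purely for illustration:

```python
import concurrent.futures

def square(x):
    return x * x

# submit() schedules one call and returns a Future; map() schedules many.
with concurrent.futures.ThreadPoolExecutor(max_workers=2) as ex:
    fut = ex.submit(square, 7)
    result = fut.result()                   # blocks until the call completes
    squares = list(ex.map(square, range(4)))  # results come back in order

print(result, squares)   # 49 [0, 1, 4, 9]
```

Exiting the with block implicitly calls shutdown(), so no manual cleanup is needed.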

A simple example of ThreadPoolExecutor is a launch of four parallel threads for copying files:

import concurrent.futures, shutil
with concurrent.futures.ThreadPoolExecutor(max_workers=4) as e:
    e.submit(shutil.copy, 'src1.txt', 'dest1.txt')
    e.submit(shutil.copy, 'src2.txt', 'dest2.txt')
    e.submit(shutil.copy, 'src3.txt', 'dest3.txt')
    e.submit(shutil.copy, 'src4.txt', 'dest4.txt')

PEP 3147: PYC Repository Directories

Python’s scheme for caching bytecode in .pyc files did not work well in environments with multiple Python interpreters. If one interpreter encountered a cached file created by another interpreter, it would recompile the source and overwrite the cached file, thus losing the benefits of caching.

The issue of “pyc fights” has become more pronounced as it has become commonplace for Linux distributions to ship with multiple versions of Python. These conflicts also arise with CPython alternatives such as Unladen Swallow.

To solve this problem, Python’s import machinery has been extended to use distinct filenames for each interpreter. Instead of Python 3.2 and Python 3.3 and Unladen Swallow each competing for a file called “mymodule.pyc”, they will now look for “mymodule.cpython-32.pyc”, “mymodule.cpython-33.pyc”, and “mymodule.unladen10.pyc”. And to prevent all of these new files from cluttering source directories, the pyc files are now collected in a “__pycache__” directory stored under the package directory.

Aside from the filenames and target directories, the new scheme has a few aspects that are visible to the programmer:
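The source-to-cache mapping can be inspected programmatically. A minimal sketch follows; note that Python 3.2 itself exposed this as imp.cache_from_source(), while later releases house the same function in importlib.util:

```python
import importlib.util

# Map a source path to its per-interpreter bytecode cache path.
path = importlib.util.cache_from_source('mydir/mymodule.py')
print(path)   # e.g. mydir/__pycache__/mymodule.cpython-XY.pyc
```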

See also

PEP 3147 - PYC Repository Directories

PEP written by Barry Warsaw.

PEP 3149: ABI Version Tagged .so Files

The PYC repository directory allows multiple bytecode cache files to be co-located. This PEP implements a similar mechanism for shared object files by giving them a common directory and distinct names for each version.

The common directory is “pyshared” and the file names are made distinct by identifying the Python implementation (such as CPython, PyPy, Jython, etc.), the major and minor version numbers, and optional build flags (such as “d” for debug, “m” for pymalloc, “u” for wide-unicode). For an arbitrary package “foo”, you may see these files when the distribution package is installed:

/usr/share/pyshared/foo.cpython-32m.so
/usr/share/pyshared/foo.cpython-33md.so

In Python itself, the tags are accessible from functions in the sysconfig module:

>>> import sysconfig
>>> sysconfig.get_config_var('SOABI')       # find the version tag
'cpython-32mu'
>>> sysconfig.get_config_var('EXT_SUFFIX')  # find the full filename extension
'.cpython-32mu.so'

See also

PEP 3149 - ABI Version Tagged .so Files

PEP written by Barry Warsaw.

PEP 3333: Python Web Server Gateway Interface v1.0.1

This informational PEP clarifies how bytes/text issues are to be handled by the WSGI protocol. The challenge is that string handling in Python 3 is most conveniently handled with the str type even though the HTTP protocol is itself bytes oriented.

The PEP differentiates so-called native strings that are used for request/response headers and metadata versus byte strings which are used for the bodies of requests and responses.

The native strings are always of type str but are restricted to code points between U+0000 and U+00FF which are translatable to bytes using Latin-1 encoding. These strings are used for the keys and values in the environment dictionary and for response headers and statuses in the start_response() function. They must follow RFC 2616 with respect to encoding. That is, they must either be ISO-8859-1 characters or use RFC 2047 MIME encoding.

For developers porting WSGI applications from Python 2, here are the salient points:

For server implementers writing CGI-to-WSGI pathways or other CGI-style protocols, users must be able to access the environment using native strings even though the underlying platform may have a different convention. To bridge this gap, the wsgiref module has a new function, wsgiref.handlers.read_environ(), for transcoding CGI variables from os.environ into native strings and returning a new dictionary.
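A minimal sketch of read_environ() in action; the WSGI_DEMO variable is set here purely for illustration:

```python
import os
import wsgiref.handlers

# read_environ() copies os.environ into a new dict, transcoding each value
# into the native strings that WSGI expects.
os.environ['WSGI_DEMO'] = 'hello'
environ = wsgiref.handlers.read_environ()
print(type(environ['WSGI_DEMO']), environ['WSGI_DEMO'])
```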

See also

PEP 3333 - Python Web Server Gateway Interface v1.0.1

PEP written by Phillip Eby.

Other Language Changes

Some smaller changes made to the core Python language are:

>>> class LowerCasedDict(dict):
...     def __getitem__(self, key):
...         return dict.__getitem__(self, key.lower())
...
>>> lcd = LowerCasedDict(part='widgets', quantity=10)
>>> 'There are {QUANTITY} {Part} in stock'.format_map(lcd)
'There are 10 widgets in stock'

>>> class PlaceholderDict(dict):
...     def __missing__(self, key):
...         return '<{}>'.format(key)
...
>>> 'Hello {name}, welcome to {location}'.format_map(PlaceholderDict())
'Hello <name>, welcome to <location>'

(Suggested by Raymond Hettinger and implemented by Eric Smith inissue 6081.)

>>> class A:
...     @property
...     def f(self):
...         return 1 // 0
...
>>> a = A()
>>> hasattr(a, 'f')
Traceback (most recent call last):
  ...
ZeroDivisionError: integer division or modulo by zero
(Discovered by Yury Selivanov and fixed by Benjamin Peterson; issue 9666.)

>>> with memoryview(b'abcdefgh') as v:
...     print(v.tolist())
[97, 98, 99, 100, 101, 102, 103, 104]
(Added by Antoine Pitrou; issue 9757.)

(See issue 4617.)

(Required extensive work by Victor Stinner in issue 9425.)

New, Improved, and Deprecated Modules

Python’s standard library has undergone significant maintenance efforts and quality improvements.

The biggest news for Python 3.2 is that the email package and the mailbox and nntplib modules now work correctly with the bytes/text model in Python 3. For the first time, there is correct handling of messages with mixed encodings.

Throughout the standard library, there has been more careful attention to encodings and text versus bytes issues. In particular, interactions with the operating system are now better able to exchange non-ASCII data using the Windows MBCS encoding, locale-aware encodings, or UTF-8.

Another significant win is the addition of substantially better support for SSL connections and security certificates.

In addition, more classes now implement a context manager to support convenient and reliable resource clean-up using a with statement.

email

The usability of the email package in Python 3 has been mostly fixed by the extensive efforts of R. David Murray. The problem was that emails are typically read and stored in the form of bytes rather than str text, and they may contain multiple encodings within a single email. So, the email package had to be extended to parse and generate email messages in bytes format.

(Proposed and implemented by R. David Murray, issue 4661 and issue 10321.)
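As a brief sketch of the bytes-oriented API (the message contents below are invented for illustration), a message can now be parsed directly from raw bytes without decoding it to text first:

```python
import email

# Parse a message straight from bytes, as it would arrive off the wire.
raw = b"From: alice@example.com\r\nSubject: Greetings\r\n\r\nHello, world\r\n"
msg = email.message_from_bytes(raw)
print(msg['Subject'])               # Greetings
print(msg.get_payload().strip())    # Hello, world
```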

collections

threading

The threading module has a new Barrier synchronization class for making multiple threads wait until all of them have reached a common barrier point. Barriers are useful for making sure that a task with multiple preconditions does not run until all of the predecessor tasks are complete.

Barriers can work with an arbitrary number of threads. This is a generalization of a Rendezvous which is defined for only two threads.

Implemented as a two-phase cyclic barrier, Barrier objects are suitable for use in loops. The separate filling and draining phases assure that all threads get released (drained) before any one of them can loop back and re-enter the barrier. The barrier fully resets after each cycle.
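A minimal runnable sketch of that behavior (the worker function and arrivals list are invented for illustration): no thread passes the barrier until the last one arrives.

```python
from threading import Barrier, Thread

arrivals = []
barrier = Barrier(3)

def worker(n):
    arrivals.append(n)   # phase one: each thread checks in
    barrier.wait()       # nobody proceeds until all three have arrived
    # phase two runs only after the barrier releases every waiting thread

threads = [Thread(target=worker, args=(i,)) for i in range(3)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(sorted(arrivals))   # [0, 1, 2]
```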

Example of using barriers:

from threading import Barrier, Thread

def get_votes(site):
    ballots = conduct_election(site)
    all_polls_closed.wait()        # do not count until all polls are closed
    totals = summarize(ballots)
    publish(site, totals)

all_polls_closed = Barrier(len(sites))
for site in sites:
    Thread(target=get_votes, args=(site,)).start()

In this example, the barrier enforces a rule that votes cannot be counted at any polling site until all polls are closed. Notice how a solution with a barrier is similar to one with threading.Thread.join(), but the threads stay alive and continue to do work (summarizing ballots) after the barrier point is crossed.

If any of the predecessor tasks can hang or be delayed, a barrier can be created with an optional timeout parameter. Then if the timeout period elapses before all the predecessor tasks reach the barrier point, all waiting threads are released and a BrokenBarrierError exception is raised:

def get_votes(site):
    ballots = conduct_election(site)
    try:
        all_polls_closed.wait(timeout = midnight - time.now())
    except BrokenBarrierError:
        lockbox = seal_ballots(ballots)
        queue.put(lockbox)
    else:
        totals = summarize(ballots)
        publish(site, totals)

In this example, the barrier enforces a more robust rule. If some election sites do not finish before midnight, the barrier times out and the ballots are sealed and deposited in a queue for later handling.

See Barrier Synchronization Patterns for more examples of how barriers can be used in parallel computing. Also, there is a simple but thorough explanation of barriers in The Little Book of Semaphores, section 3.6.

(Contributed by Kristján Valur Jónsson with an API review by Jeffrey Yasskin inissue 8777.)

datetime and time

(Contributed by Alexander Belopolsky and Victor Stinner in issue 1289118,issue 5094, issue 6641, issue 2706, issue 1777412, issue 8013, and issue 10827.)

math

The math module has been updated with six new functions inspired by the C99 standard.

The isfinite() function provides a reliable and fast way to detect special values. It returns True for regular numbers and False for NaN or Infinity:

>>> [isfinite(x) for x in (123, 4.56, float('Nan'), float('Inf'))]
[True, True, False, False]

The expm1() function computes e**x-1 for small values of x without incurring the loss of precision that usually accompanies the subtraction of nearly equal quantities:

>>> expm1(0.013671875)   # more accurate way to compute e**x-1 for a small x
0.013765762467652909

The erf() function computes a probability integral or Gaussian error function. The complementary error function, erfc(), is 1 - erf(x):

>>> erf(1.0/sqrt(2.0))   # portion of normal distribution within 1 standard deviation
0.682689492137086
>>> erfc(1.0/sqrt(2.0))  # portion of normal distribution outside 1 standard deviation
0.31731050786291404
>>> erf(1.0/sqrt(2.0)) + erfc(1.0/sqrt(2.0))
1.0

The gamma() function is a continuous extension of the factorial function. See http://en.wikipedia.org/wiki/Gamma_function for details. Because the function is related to factorials, it grows large even for small values of x, so there is also a lgamma() function for computing the natural logarithm of the gamma function:

>>> gamma(7.0)           # six factorial
720.0
>>> lgamma(801.0)        # log(800 factorial)
4551.950730698041

(Contributed by Mark Dickinson.)

io

io.BytesIO has a new method, getbuffer(), which provides functionality similar to memoryview(). It creates an editable view of the data without making a copy. The buffer’s random access and support for slice notation are well-suited to in-place editing:

REC_LEN, LOC_START, LOC_LEN = 34, 7, 11

def change_location(buffer, record_number, location):
    start = record_number * REC_LEN + LOC_START
    buffer[start: start+LOC_LEN] = location

import io

>>> byte_stream = io.BytesIO(
...     b'G3805  storeroom   Main chassis    '
...     b'X7899  shipping    Reserve cog     '
...     b'L6988  receiving   Primary sprocket')
>>> buffer = byte_stream.getbuffer()
>>> change_location(buffer, 1, b'warehouse  ')
>>> change_location(buffer, 0, b'showroom   ')
>>> print(byte_stream.getvalue())
b'G3805  showroom    Main chassis    X7899  warehouse   Reserve cog     L6988  receiving   Primary sprocket'

(Contributed by Antoine Pitrou in issue 5506.)

reprlib

When writing a __repr__() method for a custom container, it is easy to forget to handle the case where a member refers back to the container itself. Python’s builtin objects such as list and set handle self-reference by displaying "..." in the recursive part of the representation string.

To help write such __repr__() methods, the reprlib module has a new decorator, recursive_repr(), for detecting recursive calls to__repr__() and substituting a placeholder string instead:

>>> class MyList(list):
...     @recursive_repr()
...     def __repr__(self):
...         return '<' + '|'.join(map(repr, self)) + '>'

>>> m = MyList('abc')
>>> m.append(m)
>>> m.append('x')
>>> print(m)
<'a'|'b'|'c'|...|'x'>

(Contributed by Raymond Hettinger in issue 9826 and issue 9840.)

logging

In addition to dictionary-based configuration described above, the logging package has many other improvements.

The logging documentation has been augmented by a basic tutorial, an advanced tutorial, and a cookbook of logging recipes. These documents are the fastest way to learn about logging.

The logging.basicConfig() set-up function gained a style argument to support three different types of string formatting. It defaults to “%” for traditional %-formatting, can be set to “{” for the new str.format() style, or can be set to “$” for the shell-style formatting provided by string.Template. The following three configurations are equivalent:

from logging import basicConfig
basicConfig(style='%', format="%(name)s -> %(levelname)s: %(message)s")
basicConfig(style='{', format="{name} -> {levelname}: {message}")
basicConfig(style='$', format="$name -> $levelname: $message")

If no configuration is set up before a logging event occurs, there is now a default configuration using a StreamHandler directed to sys.stderr for events of WARNING level or higher. Formerly, an event occurring before a configuration was set up would either raise an exception or silently drop the event depending on the value of logging.raiseExceptions. The new default handler is stored in logging.lastResort.

The use of filters has been simplified. Instead of creating a Filter object, the predicate can be any Python callable that returns True or False.
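A small sketch of a callable filter; the ListHandler class below is invented here just to capture the output for inspection:

```python
import logging

captured = []

class ListHandler(logging.Handler):
    # A throwaway handler that records formatted messages in a list.
    def emit(self, record):
        captured.append(record.getMessage())

logger = logging.getLogger('filter_demo')
logger.propagate = False
logger.addHandler(ListHandler())

# Any callable taking a LogRecord now works as a filter:
logger.addFilter(lambda record: 'secret' not in record.getMessage())

logger.warning('public message')
logger.warning('secret message')   # rejected by the callable filter
print(captured)                    # ['public message']
```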

There were a number of other improvements that add flexibility and simplify configuration. See the module documentation for a full listing of changes in Python 3.2.

csv

The csv module now supports a new dialect, unix_dialect, which applies quoting for all fields and a traditional Unix style with '\n' as the line terminator. The registered dialect name is unix.
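A quick sketch of the new dialect in action, writing to an in-memory buffer for illustration: every field is quoted and records end with '\n'.

```python
import csv, io

buf = io.StringIO()
w = csv.writer(buf, dialect='unix')   # quote all fields, '\n' line terminator
w.writerow(['name', 'dept'])
w.writerow(['tom', 'accounting'])
print(repr(buf.getvalue()))   # '"name","dept"\n"tom","accounting"\n'
```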

The csv.DictWriter has a new method, writeheader(), for writing out an initial row to document the field names:

>>> import csv, sys
>>> w = csv.DictWriter(sys.stdout, ['name', 'dept'], dialect='unix')
>>> w.writeheader()
"name","dept"
>>> w.writerows([{'name': 'tom', 'dept': 'accounting'},
...              {'name': 'susan', 'dept': 'sales'}])
"tom","accounting"
"susan","sales"

(New dialect suggested by Jay Talbot in issue 5975, and the new method suggested by Ed Abraham in issue 1537721.)

contextlib

There is a new and slightly mind-blowing tool, ContextDecorator, that is helpful for creating a context manager that does double duty as a function decorator.

As a convenience, this new functionality is used by contextmanager() so that no extra effort is needed to support both roles.

The basic idea is that both context managers and function decorators can be used for pre-action and post-action wrappers. Context managers wrap a group of statements using a with statement, and function decorators wrap a group of statements enclosed in a function. So, occasionally there is a need to write a pre-action or post-action wrapper that can be used in either role.
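The same dual-role behavior can also be had by subclassing ContextDecorator directly. A minimal sketch follows; the tracked class and its events list are invented for illustration:

```python
from contextlib import ContextDecorator

class tracked(ContextDecorator):
    # __enter__/__exit__ run both for "with" blocks and around any
    # function the instance decorates.
    def __init__(self):
        self.events = []
    def __enter__(self):
        self.events.append('enter')
        return self
    def __exit__(self, *exc):
        self.events.append('exit')
        return False

t = tracked()

@t                       # the same object works as a decorator...
def task():
    t.events.append('body')

task()

with t:                  # ...and as a context manager
    t.events.append('body')

print(t.events)          # ['enter', 'body', 'exit', 'enter', 'body', 'exit']
```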

For example, it is sometimes useful to wrap functions or groups of statements with a logger that can track the time of entry and time of exit. Rather than writing both a function decorator and a context manager for the task, thecontextmanager() provides both capabilities in a single definition:

from contextlib import contextmanager
import logging

logging.basicConfig(level=logging.INFO)

@contextmanager
def track_entry_and_exit(name):
    logging.info('Entering: {}'.format(name))
    yield
    logging.info('Exiting: {}'.format(name))

Formerly, this would have only been usable as a context manager:

with track_entry_and_exit('widget loader'):
    print('Some time consuming activity goes here')
    load_widget()

Now, it can be used as a decorator as well:

@track_entry_and_exit('widget loader')
def activity():
    print('Some time consuming activity goes here')
    load_widget()

Trying to fulfill two roles at once places some limitations on the technique. Context managers normally have the flexibility to return an argument usable by a with statement, but there is no parallel for function decorators.

In the above example, there is not a clean way for the track_entry_and_exit context manager to return a logging instance for use in the body of enclosed statements.

(Contributed by Michael Foord in issue 9110.)

decimal and fractions

Mark Dickinson crafted an elegant and efficient scheme for assuring that different numeric datatypes will have the same hash value whenever their actual values are equal (issue 8188):

assert hash(Fraction(3, 2)) == hash(1.5) == \
       hash(Decimal("1.5")) == hash(complex(1.5, 0))

Some of the hashing details are exposed through a new attribute, sys.hash_info, which describes the bit width of the hash value, the prime modulus, the hash values for infinity and nan, and the multiplier used for the imaginary part of a number:

>>> sys.hash_info
sys.hash_info(width=64, modulus=2305843009213693951, inf=314159, nan=0, imag=1000003)

An early decision to limit the inter-operability of various numeric types has been relaxed. It is still unsupported (and ill-advised) to have implicit mixing in arithmetic expressions such as Decimal('1.1') + float('1.1') because the latter loses information in the process of constructing the binary float. However, since an existing floating-point value can be converted losslessly to either a decimal or rational representation, it makes sense to add them to the constructor and to support mixed-type comparisons.
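A brief sketch of the relaxed mixed-type behavior:

```python
from decimal import Decimal
from fractions import Fraction

# Mixed-type comparisons now work, consistent with the unified hashing above.
print(Decimal('1.5') == 1.5)     # True
print(Fraction(3, 2) == 1.5)     # True
print(Decimal(0.1) == 0.1)       # True: both name the exact binary value of 0.1
```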

Similar changes were made to fractions.Fraction so that the from_float() and from_decimal() methods are no longer needed (issue 8294):

>>> Decimal(1.1)
Decimal('1.100000000000000088817841970012523233890533447265625')
>>> Fraction(1.1)
Fraction(2476979795053773, 2251799813685248)

Another useful change for the decimal module is that the Context.clamp attribute is now public. This is useful in creating contexts that correspond to the decimal interchange formats specified in IEEE 754 (see issue 8540).

(Contributed by Mark Dickinson and Raymond Hettinger.)

ftp

The ftplib.FTP class now supports the context manager protocol to unconditionally consume socket.error exceptions and to close the FTP connection when done:

>>> from ftplib import FTP
>>> with FTP("ftp1.at.proftpd.org") as ftp:
...     ftp.login()
...     ftp.dir()
...

'230 Anonymous login ok, restrictions apply.'
dr-xr-xr-x   9 ftp      ftp           154 May  6 10:43 .
dr-xr-xr-x   9 ftp      ftp           154 May  6 10:43 ..
dr-xr-xr-x   5 ftp      ftp          4096 May  6 10:43 CentOS
dr-xr-xr-x   3 ftp      ftp            18 Jul 10  2008 Fedora

Other file-like objects such as mmap.mmap and fileinput.input()also grew auto-closing context managers:

with fileinput.input(files=('log1.txt', 'log2.txt')) as f:
    for line in f:
        process(line)

(Contributed by Tarek Ziadé and Giampaolo Rodolà in issue 4972, and by Georg Brandl in issue 8046 and issue 1286.)

The FTP_TLS class now accepts a context parameter, which is an ssl.SSLContext object allowing bundling SSL configuration options, certificates and private keys into a single (potentially long-lived) structure.

(Contributed by Giampaolo Rodolà; issue 8806.)

select

The select module now exposes a new, constant attribute, PIPE_BUF, which gives the minimum number of bytes which are guaranteed not to block when select.select() says a pipe is ready for writing.

>>> import select
>>> select.PIPE_BUF
512

(Available on Unix systems. Patch by Sébastien Sablé in issue 9862)

gzip and zipfile

gzip.GzipFile now implements the io.BufferedIOBase abstract base class (except for truncate()). It also has a peek() method and supports unseekable as well as zero-padded file objects.

The gzip module also gains the compress() and decompress() functions for easier in-memory compression and decompression. Keep in mind that text needs to be encoded as bytes before compressing and decompressing:

>>> s = 'Three shall be the number thou shalt count, '
>>> s += 'and the number of the counting shall be three'
>>> b = s.encode()                        # convert to utf-8
>>> len(b)
89
>>> c = gzip.compress(b)
>>> len(c)
77
>>> gzip.decompress(c).decode()[:42]      # decompress and convert to text
'Three shall be the number thou shalt count,'

(Contributed by Anand B. Pillai in issue 3488; and by Antoine Pitrou, Nir Aides and Brian Curtin in issue 9962, issue 1675951, issue 7471 andissue 2846.)

Also, the zipfile.ZipExtFile class was reworked internally to represent files stored inside an archive. The new implementation is significantly faster and can be wrapped in a io.BufferedReader object for more speedups. It also solves an issue where interleaved calls to read and readline gave the wrong results.

(Patch submitted by Nir Aides in issue 7610.)

tarfile

The TarFile class can now be used as a context manager. In addition, its add() method has a new option, filter, that controls which files are added to the archive and allows the file metadata to be edited.

The new filter option replaces the older, less flexible exclude parameter which is now deprecated. If specified, the optional filter parameter needs to be a keyword argument. The user-supplied filter function accepts a TarInfo object and returns an updated TarInfo object, or if it wants the file to be excluded, the function can return None:

import tarfile, glob

def myfilter(tarinfo):
    if tarinfo.isfile():             # only save real files
        tarinfo.uname = 'monty'      # redact the user name
        return tarinfo

>>> with tarfile.open(name='myarchive.tar.gz', mode='w:gz') as tf:
...     for filename in glob.glob('*.txt'):
...         tf.add(filename, filter=myfilter)
...     tf.list()
-rw-r--r-- monty/501        902 2011-01-26 17:59:11 annotations.txt
-rw-r--r-- monty/501        123 2011-01-26 17:59:11 general_questions.txt
-rw-r--r-- monty/501       3514 2011-01-26 17:59:11 prion.txt
-rw-r--r-- monty/501        124 2011-01-26 17:59:11 py_todo.txt
-rw-r--r-- monty/501       1399 2011-01-26 17:59:11 semaphore_notes.txt

(Proposed by Tarek Ziadé and implemented by Lars Gustäbel in issue 6856.)

hashlib

The hashlib module has two new constant attributes listing the hashing algorithms guaranteed to be present in all implementations and those available on the current implementation:

>>> import hashlib

>>> hashlib.algorithms_guaranteed
{'sha1', 'sha224', 'sha384', 'sha256', 'sha512', 'md5'}

>>> hashlib.algorithms_available
{'md2', 'SHA256', 'SHA512', 'dsaWithSHA', 'mdc2', 'SHA224', 'MD4', 'sha256',
 'sha512', 'ripemd160', 'SHA1', 'MDC2', 'SHA', 'SHA384', 'MD2',
 'ecdsa-with-SHA1', 'md4', 'md5', 'sha1', 'DSA-SHA', 'sha224',
 'dsaEncryption', 'DSA', 'RIPEMD160', 'sha', 'MD5', 'sha384'}

(Suggested by Carl Chenet in issue 7418.)

ast

The ast module has a wonderful general-purpose tool for safely evaluating expression strings using the Python literal syntax. The ast.literal_eval() function serves as a secure alternative to the builtin eval() function which is easily abused. Python 3.2 adds bytes and set literals to the list of supported types: strings, bytes, numbers, tuples, lists, dicts, sets, booleans, and None.

from ast import literal_eval

>>> request = "{'req': 3, 'func': 'pow', 'args': (2, 0.5)}"
>>> literal_eval(request)
{'args': (2, 0.5), 'req': 3, 'func': 'pow'}

>>> request = "os.system('do something harmful')"
>>> literal_eval(request)
Traceback (most recent call last):
  ...
ValueError: malformed node or string: <_ast.Call object at 0x101739a10>

(Implemented by Benjamin Peterson and Georg Brandl.)

os

Different operating systems use various encodings for filenames and environment variables. The os module provides two new functions,fsencode() and fsdecode(), for encoding and decoding filenames:

>>> filename = 'Sehenswürdigkeiten'
>>> os.fsencode(filename)
b'Sehensw\xc3\xbcrdigkeiten'

Some operating systems allow direct access to encoded bytes in the environment. If so, the os.supports_bytes_environ constant will be true.

For direct access to encoded environment variables (if available), use the new os.getenvb() function or use os.environb, which is a bytes version of os.environ.

(Contributed by Victor Stinner.)

shutil

The shutil.copytree() function has two new options:

(Contributed by Tarek Ziadé.)

In addition, the shutil module now supports archiving operations for zipfiles, uncompressed tarfiles, gzipped tarfiles, and bzipped tarfiles. And there are functions for registering additional archiving file formats (such as xz compressed tarfiles or custom formats).

The principal functions are make_archive() and unpack_archive(). By default, both operate on the current directory (which can be set by os.chdir()) and on any sub-directories. The archive filename needs to be specified with a full pathname. The archiving step is non-destructive (the original files are left unchanged).

>>> import shutil, pprint

>>> os.chdir('mydata')                               # change to the source directory
>>> f = shutil.make_archive('/var/backup/mydata',
...                         'zip')                   # archive the current directory
>>> f                                                # show the name of archive
'/var/backup/mydata.zip'
>>> os.chdir('tmp')                                  # change to an unpacking directory
>>> shutil.unpack_archive('/var/backup/mydata.zip')  # recover the data

>>> pprint.pprint(shutil.get_archive_formats())      # display known formats
[('bztar', "bzip2'ed tar-file"),
 ('gztar', "gzip'ed tar-file"),
 ('tar', 'uncompressed tar file'),
 ('zip', 'ZIP file')]

>>> shutil.register_archive_format(                  # register a new archive format
...     name='xz',
...     function=xz.compress,                        # callable archiving function
...     extra_args=[('level', 8)],                   # arguments to the function
...     description='xz compression'
... )

(Contributed by Tarek Ziadé.)

sqlite3

The sqlite3 module was updated to pysqlite version 2.6.0. It has two new capabilities: the sqlite3.Connection.in_transaction attribute is true whenever there is an active transaction with uncommitted changes, and the new sqlite3.Connection.enable_load_extension() and sqlite3.Connection.load_extension() methods allow loading SQLite extensions.

(Contributed by R. David Murray and Shashwat Anand; issue 8845.)

html

A new html module was introduced with only a single function, escape(), which is used for escaping reserved characters from HTML markup:

>>> import html
>>> html.escape('x > 2 && x < 7')
'x &gt; 2 &amp;&amp; x &lt; 7'
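escape() also accepts a quote argument (true by default) that controls whether quote characters are translated, which matters when the text is embedded in an HTML attribute value. A quick sketch:

```python
import html

# quote=True (the default) also escapes double and single quotes
assert html.escape('say "hi"') == 'say &quot;hi&quot;'

# quote=False escapes only &, < and >, leaving quotes alone
assert html.escape('say "hi"', quote=False) == 'say "hi"'
```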

socket

The socket module has two new improvements: socket objects now have a detach() method which puts the socket into a closed state without actually closing the underlying file descriptor (so the descriptor can be reused for other purposes), and socket.create_connection() now supports the context management protocol, closing the socket when the with block exits.

ssl

The ssl module added a number of features to satisfy common requirements for secure (encrypted, authenticated) internet connections:

(Contributed by Antoine Pitrou in issue 8850, issue 1589, issue 8322,issue 5639, issue 4870, issue 8484, and issue 8321.)

nntp

The nntplib module has a revamped implementation with better bytes and text semantics as well as more practical APIs. These improvements break compatibility with the nntplib version in Python 3.1, which was partly dysfunctional in itself.

Support for secure connections through both implicit (using nntplib.NNTP_SSL) and explicit (using nntplib.NNTP.starttls()) TLS has also been added.

(Contributed by Antoine Pitrou in issue 9360 and Andrew Vant in issue 1926.)

imaplib

Support for explicit TLS on standard IMAP4 connections has been added through the new imaplib.IMAP4.starttls() method.

(Contributed by Lorenzo M. Catucci and Antoine Pitrou, issue 4471.)

http.client

There were a number of small API improvements in the http.client module. The old-style HTTP 0.9 simple responses are no longer supported and the strict parameter is deprecated in all classes.

The HTTPConnection and HTTPSConnection classes now have a source_address parameter for a (host, port) tuple indicating where the HTTP connection is made from.

Support for certificate checking and HTTPS virtual hosts were added toHTTPSConnection.

The request() method on connection objects allowed an optional body argument so that a file object could be used to supply the content of the request. Conveniently, the body argument now also accepts an iterable object so long as it includes an explicit Content-Length header. This extended interface is much more flexible than before.

To establish an HTTPS connection through a proxy server, there is a new set_tunnel() method that sets the host and port for HTTP Connect tunneling.

To match the behavior of http.server, the HTTP client library now also encodes headers with ISO-8859-1 (Latin-1) encoding. It was already doing that for incoming headers, so now the behavior is consistent for both incoming and outgoing traffic. (See work by Armin Ronacher in issue 10980.)

unittest

The unittest module has a number of improvements supporting test discovery for packages, easier experimentation at the interactive prompt, new testcase methods, improved diagnostic messages for test failures, and better method names.
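For instance, one of the new test-case methods, assertCountEqual(), verifies that two sequences contain the same elements regardless of order. A small sketch (the test-case and variable names here are illustrative):

```python
import unittest

class DemoTest(unittest.TestCase):
    def test_same_elements(self):
        # New in 3.2: compares element counts, ignoring order
        self.assertCountEqual([1, 2, 2, 3], [3, 2, 1, 2])

# Run the case programmatically rather than via unittest.main()
suite = unittest.defaultTestLoader.loadTestsFromTestCase(DemoTest)
result = unittest.TextTestRunner(verbosity=0).run(suite)
assert result.wasSuccessful()
```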

random

The integer methods in the random module now do a better job of producing uniform distributions. Previously, they computed selections with int(n*random()) which had a slight bias whenever n was not a power of two. Now, multiple selections are made from a range up to the next power of two and a selection is kept only when it falls within the range 0 <= x < n. The functions and methods affected are randrange(), randint(), choice(), shuffle() and sample().
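The rejection-sampling idea can be illustrated with a short sketch (this is not CPython's actual implementation): draw integers uniform over the next power of two at or above n, and keep only draws that fall below n.

```python
import random

def unbiased_randrange(n):
    # k bits cover [0, 2**k), the smallest power-of-two range >= n;
    # redrawing instead of wrapping avoids any modulo bias.
    k = n.bit_length()
    while True:
        r = random.getrandbits(k)
        if r < n:
            return r

samples = [unbiased_randrange(6) for _ in range(1000)]
assert all(0 <= s < 6 for s in samples)
```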

(Contributed by Raymond Hettinger; issue 9025.)

poplib

The POP3_SSL class now accepts a context parameter, which is an ssl.SSLContext object allowing bundling SSL configuration options, certificates and private keys into a single (potentially long-lived) structure.

(Contributed by Giampaolo Rodolà; issue 8807.)

asyncore

asyncore.dispatcher now provides a handle_accepted() method which is called with a (sock, addr) pair whenever a connection has actually been established with a new remote endpoint. It is meant to replace the old handle_accept() method and frees the user from having to call accept() directly.

(Contributed by Giampaolo Rodolà; issue 6706.)

tempfile

The tempfile module has a new context manager, TemporaryDirectory, which provides easy deterministic cleanup of temporary directories:

with tempfile.TemporaryDirectory() as tmpdirname:
    print('created temporary dir:', tmpdirname)

(Contributed by Neil Schemenauer and Nick Coghlan; issue 5178.)

inspect

The inspect module has a new function getgeneratorstate() to easily identify the current state of a generator-iterator:

>>> from inspect import getgeneratorstate
>>> def gen():
...     yield 'demo'
...
>>> g = gen()
>>> getgeneratorstate(g)
'GEN_CREATED'
>>> next(g)
'demo'
>>> getgeneratorstate(g)
'GEN_SUSPENDED'
>>> next(g, None)
>>> getgeneratorstate(g)
'GEN_CLOSED'
(Contributed by Rodolpho Eckhardt and Nick Coghlan, issue 10220.)

To support lookups without the possibility of activating a dynamic attribute, the inspect module has a new function, getattr_static(). Unlike hasattr(), this is a true read-only search, guaranteed not to change state while it is searching:

>>> class A:
...     @property
...     def f(self):
...         print('Running')
...         return 10
...
>>> a = A()
>>> getattr(a, 'f')
Running
10
>>> inspect.getattr_static(a, 'f')
<property object at 0x1022bd788>

(Contributed by Michael Foord.)

pydoc

The pydoc module now provides a much-improved Web server interface, as well as a new command-line option -b to automatically open a browser window to display that server.

(Contributed by Ron Adam; issue 2001.)

dis

The dis module gained two new functions for inspecting code, code_info() and show_code(). Both provide detailed code object information for the supplied function, method, source code string or code object. The former returns a string and the latter prints it:

>>> import dis, random
>>> dis.show_code(random.choice)
Name:              choice
Filename:          /Library/Frameworks/Python.framework/Versions/3.2/lib/python3.2/random.py
Argument count:    2
Kw-only arguments: 0
Number of locals:  3
Stack size:        11
Flags:             OPTIMIZED, NEWLOCALS, NOFREE
Constants:
   0: 'Choose a random element from a non-empty sequence.'
   1: 'Cannot choose from an empty sequence'
Names:
   0: _randbelow
   1: len
   2: ValueError
   3: IndexError
Variable names:
   0: self
   1: seq
   2: i

In addition, the dis() function now accepts string arguments so that the common idiom dis(compile(s, '', 'eval')) can be shortened to dis(s):

>>> dis('3*x+1 if x%2==1 else x//2')
  1           0 LOAD_NAME                0 (x)
              3 LOAD_CONST               0 (2)
              6 BINARY_MODULO
              7 LOAD_CONST               1 (1)
             10 COMPARE_OP               2 (==)
             13 POP_JUMP_IF_FALSE       28
             16 LOAD_CONST               2 (3)
             19 LOAD_NAME                0 (x)
             22 BINARY_MULTIPLY
             23 LOAD_CONST               1 (1)
             26 BINARY_ADD
             27 RETURN_VALUE
        >>   28 LOAD_NAME                0 (x)
             31 LOAD_CONST               0 (2)
             34 BINARY_FLOOR_DIVIDE
             35 RETURN_VALUE

Taken together, these improvements make it easier to explore how CPython is implemented and to see for yourself what the language syntax does under-the-hood.

(Contributed by Nick Coghlan in issue 9147.)

dbm

All database modules now support the get() and setdefault() methods.
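A quick sketch of the mapping-style lookups (the file name and key/value pairs here are illustrative; which dbm backend gets used depends on the platform):

```python
import dbm
import os
import shutil
import tempfile

tmp = tempfile.mkdtemp()
db = dbm.open(os.path.join(tmp, 'demo'), 'c')
db['present'] = 'yes'

# get() returns a stored value or a default, never raising KeyError
value = db.get(b'present')            # b'yes'
fallback = db.get(b'missing', b'fallback')

# setdefault() stores the default only when the key is absent
db.setdefault(b'missing', b'filled')
stored = db[b'missing']               # b'filled'

db.close()
shutil.rmtree(tmp)
```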

(Suggested by Ray Allen in issue 9523.)

site

The site module has three new functions useful for reporting on the details of a given Python installation.

>>> import site
>>> site.getsitepackages()
['/Library/Frameworks/Python.framework/Versions/3.2/lib/python3.2/site-packages',
 '/Library/Frameworks/Python.framework/Versions/3.2/lib/site-python',
 '/Library/Python/3.2/site-packages']
>>> site.getuserbase()
'/Users/raymondhettinger/Library/Python/3.2'
>>> site.getusersitepackages()
'/Users/raymondhettinger/Library/Python/3.2/lib/python/site-packages'

Conveniently, some of site’s functionality is accessible directly from the command-line:

$ python -m site --user-base
/Users/raymondhettinger/.local
$ python -m site --user-site
/Users/raymondhettinger/.local/lib/python3.2/site-packages

(Contributed by Tarek Ziadé in issue 6693.)

sysconfig

The new sysconfig module makes it straightforward to discover installation paths and configuration variables that vary across platforms and installations.

The module offers simple access functions for platform and version information.

It also provides access to the paths and variables corresponding to one of seven named schemes used by distutils: posix_prefix, posix_home, posix_user, nt, nt_user, os2, and os2_home.
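As a small sketch, the access functions and the scheme-based path lookup can be exercised directly (the printed platform string varies by system):

```python
import sys
import sysconfig

# Platform and version accessors
assert sysconfig.get_python_version() == '%d.%d' % sys.version_info[:2]
print(sysconfig.get_platform())   # e.g. 'linux-x86_64' or 'win32'

# Installation paths for the currently active scheme
paths = sysconfig.get_paths()
assert 'stdlib' in paths and 'purelib' in paths and 'scripts' in paths
```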

There is also a convenient command-line interface:

C:\Python32>python -m sysconfig
Platform: "win32"
Python version: "3.2"
Current installation scheme: "nt"

Paths:
        data = "C:\Python32"
        include = "C:\Python32\Include"
        platinclude = "C:\Python32\Include"
        platlib = "C:\Python32\Lib\site-packages"
        platstdlib = "C:\Python32\Lib"
        purelib = "C:\Python32\Lib\site-packages"
        scripts = "C:\Python32\Scripts"
        stdlib = "C:\Python32\Lib"

Variables:
        BINDIR = "C:\Python32"
        BINLIBDEST = "C:\Python32\Lib"
        EXE = ".exe"
        INCLUDEPY = "C:\Python32\Include"
        LIBDEST = "C:\Python32\Lib"
        SO = ".pyd"
        VERSION = "32"
        abiflags = ""
        base = "C:\Python32"
        exec_prefix = "C:\Python32"
        platbase = "C:\Python32"
        prefix = "C:\Python32"
        projectbase = "C:\Python32"
        py_version = "3.2"
        py_version_nodot = "32"
        py_version_short = "3.2"
        srcdir = "C:\Python32"
        userbase = "C:\Documents and Settings\Raymond\Application Data\Python"

(Moved out of Distutils by Tarek Ziadé.)

pdb

The pdb debugger module gained a number of usability improvements.

(Contributed by Georg Brandl, Antonio Cuni and Ilya Sandler.)

configparser

The configparser module was modified to improve usability and predictability of the default parser and its supported INI syntax. The old ConfigParser class was removed in favor of SafeConfigParser, which has in turn been renamed to ConfigParser. Support for inline comments is now turned off by default and section or option duplicates are not allowed in a single configuration source.

Config parsers gained a new API based on the mapping protocol:

>>> parser = ConfigParser()
>>> parser.read_string("""
... [DEFAULT]
... location = upper left
... visible = yes
... editable = no
... color = blue
...
... [main]
... title = Main Menu
... color = green
...
... [options]
... title = Options
... """)
>>> parser['main']['color']
'green'
>>> parser['main']['editable']
'no'
>>> section = parser['options']
>>> section['title']
'Options'
>>> section['title'] = 'Options (editable: %(editable)s)'
>>> section['title']
'Options (editable: no)'

The new API is implemented on top of the classical API, so custom parser subclasses should be able to use it without modifications.

The INI file structure accepted by config parsers can now be customized. Users can specify alternative option/value delimiters and comment prefixes, change the name of the DEFAULT section or switch the interpolation syntax.
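For example, a parser can be configured for a hypothetical dialect that uses ':' as the only key/value delimiter and allows ';' inline comments (the section and option names below are illustrative):

```python
from configparser import ConfigParser

parser = ConfigParser(delimiters=(':',),
                      inline_comment_prefixes=(';',))
parser.read_string("""
[paths]
home: /opt/app    ; this trailing comment is stripped
""")
assert parser['paths']['home'] == '/opt/app'
```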

There is support for pluggable interpolation including an additional interpolation handler ExtendedInterpolation:

>>> parser = ConfigParser(interpolation=ExtendedInterpolation())
>>> parser.read_dict({'buildout': {'directory': '/home/ambv/zope9'},
...                   'custom': {'prefix': '/usr/local'}})
>>> parser.read_string("""
... [buildout]
... parts =
...   zope9
...   instance
... find-links =
...   ${buildout:directory}/downloads/dist
...
... [zope9]
... recipe = plone.recipe.zope9install
... location = /opt/zope
...
... [instance]
... recipe = plone.recipe.zope9instance
... zope9-location = ${zope9:location}
... zope-conf = ${custom:prefix}/etc/zope.conf
... """)
>>> parser['buildout']['find-links']
'\n/home/ambv/zope9/downloads/dist'
>>> parser['instance']['zope-conf']
'/usr/local/etc/zope.conf'
>>> instance = parser['instance']
>>> instance['zope-conf']
'/usr/local/etc/zope.conf'
>>> instance['zope9-location']
'/opt/zope'

A number of smaller features were also introduced, like support for specifying encoding in read operations, specifying fallback values for get-functions, or reading directly from dictionaries and strings.

(All changes contributed by Łukasz Langa.)

urllib.parse

A number of usability improvements were made for the urllib.parse module.

The urlparse() function now supports IPv6 addresses as described in RFC 2732:

>>> import urllib.parse
>>> urllib.parse.urlparse('http://[dead:beef:cafe:5417:affe:8FA3:deaf:feed]/foo/')
ParseResult(scheme='http', netloc='[dead:beef:cafe:5417:affe:8FA3:deaf:feed]',
            path='/foo/', params='', query='', fragment='')

The urldefrag() function now returns a named tuple:

>>> r = urllib.parse.urldefrag('http://python.org/about/#target')
>>> r
DefragResult(url='http://python.org/about/', fragment='target')
>>> r[0]
'http://python.org/about/'
>>> r.fragment
'target'

And, the urlencode() function is now much more flexible, accepting either a string or bytes type for the query argument. If it is a string, then the safe, encoding, and errors parameters are sent to quote_plus() for encoding:

>>> urllib.parse.urlencode([
...     ('type', 'telenovela'),
...     ('name', '¿Dónde Está Elisa?')],
...     encoding='latin-1')
'type=telenovela&name=%BFD%F3nde+Est%E1+Elisa%3F'

As detailed in Parsing ASCII Encoded Bytes, all the urllib.parse functions now accept ASCII-encoded byte strings as input, so long as they are not mixed with regular strings. If ASCII-encoded byte strings are given as parameters, the return types will also be ASCII-encoded byte strings:

>>> urllib.parse.urlparse(b'http://www.python.org:80/about/')
ParseResultBytes(scheme=b'http', netloc=b'www.python.org:80',
                 path=b'/about/', params=b'', query=b'', fragment=b'')

(Work by Nick Coghlan, Dan Mahn, and Senthil Kumaran in issue 2987,issue 5468, and issue 9873.)

mailbox

Thanks to a concerted effort by R. David Murray, the mailbox module has been fixed for Python 3.2. The challenge was that mailbox had been originally designed with a text interface, but email messages are best represented with bytes because various parts of a message may have different encodings.

The solution harnessed the email package’s binary support for parsing arbitrary email messages. In addition, the solution required a number of API changes.

As expected, the add() method for mailbox.Mailbox objects now accepts binary input.

StringIO and text file input are deprecated. Also, string input will fail early if non-ASCII characters are used. Previously it would fail when the email was processed in a later step.

There is also support for binary output. The get_file() method now returns a file in binary mode (where it used to incorrectly set the file to text mode). There is also a new get_bytes() method that returns a bytes representation of a message corresponding to a given key.

It is still possible to get non-binary output using the old API's get_string() method, but that approach is not very useful. Instead, it is best to extract messages from a Message object or to load them from binary input.

(Contributed by R. David Murray, with efforts from Steffen Daode Nurpmeso and an initial patch by Victor Stinner in issue 9124.)

turtledemo

The demonstration code for the turtle module was moved from the Demo directory to the main library. It includes over a dozen sample scripts with lively displays. Being on sys.path, it can now be run directly from the command-line:

$ python -m turtledemo

(Moved from the Demo directory by Alexander Belopolsky in issue 10199.)

Multi-threading

Optimizations

A number of small performance enhancements have been added:

There were several other minor optimizations. Set differencing now runs faster when one operand is much larger than the other (patch by Andress Bennetts inissue 8685). The array.repeat() method has a faster implementation (issue 1569291 by Alexander Belopolsky). The BaseHTTPRequestHandlerhas more efficient buffering (issue 3709 by Andrew Schaaf). Theoperator.attrgetter() function has been sped-up (issue 10160 by Christos Georgiou). And ConfigParser loads multi-line arguments a bit faster (issue 7113 by Łukasz Langa).

Unicode

Python has been updated to Unicode 6.0.0. The update to the standard adds over 2,000 new characters including emoji symbols which are important for mobile phones.

In addition, the updated standard has altered the character properties for two Kannada characters (U+0CF1, U+0CF2) and one New Tai Lue numeric character (U+19DA), making the former eligible for use in identifiers while disqualifying the latter. For more information, see Unicode Character Database Changes.

Codecs

Support was added for cp720 Arabic DOS encoding (issue 1616979).

MBCS encoding no longer ignores the error handler argument. In the default strict mode, it raises a UnicodeDecodeError when it encounters an undecodable byte sequence and a UnicodeEncodeError for an unencodable character.

The MBCS codec supports 'strict' and 'ignore' error handlers for decoding, and 'strict' and 'replace' for encoding.

To emulate Python 3.1 MBCS encoding, select the 'ignore' handler for decoding and the 'replace' handler for encoding.

On Mac OS X, Python decodes command line arguments with 'utf-8' rather than the locale encoding.

By default, tarfile uses 'utf-8' encoding on Windows (instead of'mbcs') and the 'surrogateescape' error handler on all operating systems.

Documentation

The documentation continues to be improved.

IDLE

Build and C API Changes

Changes to Python’s build process and to the C API include:

There were a number of other small changes to the C-API. See the Misc/NEWS file for a complete list.

Also, there were a number of updates to the Mac OS X build, see Mac/BuildScript/README.txt for details. For users running a 32/64-bit build, there is a known problem with the default Tcl/Tk on Mac OS X 10.6. Accordingly, we recommend installing an updated alternative such as ActiveState Tcl/Tk 8.5.9. See http://www.python.org/download/mac/tcltk/ for additional details.

Porting to Python 3.2

This section lists previously described changes and other bugfixes that may require changes to your code:

(Contributed by Georg Brandl and Mattias Brändström; appspot issue 53094.)