Beta is here.

Our last release (v0.1.5) was on November 14th, 2016.

We finished, froze, and released v0.1.6 on January 21st, 2017.

A lot has happened since 0.1.5.

API Changes

Model Zoo

A model zoo has been started, with several pre-trained vision models available, such as AlexNet and ResNet50. Downloading and using the models is seamless with a keyword argument.

import torchvision.models as models

# downloads the pre-trained weights on first use and caches them locally
model = models.alexnet(pretrained=True)

The models are hosted on Amazon S3, and we look forward to more models from the community.
Basic documentation is found here:

http://pytorch.org/docs/model_zoo.html

You can find the specific models listed in the READMEs of torchvision and torchtext.
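
If you just need the raw weights, the underlying torch.utils.model_zoo utilities can also be used directly. A minimal sketch (the URL here is a hypothetical placeholder):

import torch.utils.model_zoo as model_zoo

# downloads the file (if not already cached) and deserializes it
state_dict = model_zoo.load_url('https://example.com/alexnet.pth')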

Stochastic Functions in Autograd

We introduced stochastic functions, which need to be provided with a reward before calling backward.
This feature was inspired by Gradient Estimation Using Stochastic Computation Graphs by Schulman et al. and is helpful for implementing reinforcement learning techniques.
Documentation is here: http://pytorch.org/docs/autograd.html#torch.autograd.Variable.reinforce
A showcase of using these nodes is in the REINFORCE example: https://github.com/pytorch/examples/blob/master/reinforcement_learning/reinforce.py#L70
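
For a flavor of the API, here is a minimal sketch with a toy one-layer policy and a hard-coded scalar reward (both illustrative; see the linked REINFORCE example for the full pattern):

import torch
import torch.nn.functional as F
from torch.autograd import Variable

# a toy "policy": two action probabilities from a learnable weight
w = Variable(torch.randn(1, 2), requires_grad=True)
probs = F.softmax(w)

action = probs.multinomial()  # stochastic node: samples an action index
reward = 1.0                  # e.g. the return observed from the environment
action.reinforce(reward)      # register the reward for this sample
action.backward()             # gradients follow the REINFORCE estimator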

Functional interface to nn

PyTorch neural networks have so far been modeled around nn.Module. However, for simple, parameter-free functions such as ReLU, going through a module is a bit cumbersome.
To simplify this, we've introduced a functional interface to nn and modified the tutorials to use this API where appropriate.

For example:

import torch
import torch.nn as nn
import torch.nn.functional as F
from torch.autograd import Variable

x = Variable(torch.randn(2, 3))

# module style
relu = nn.ReLU()
y = relu(x)

# functional style
y = F.relu(x)

The functional style is convenient when using non-parametric and non-learnable functions.
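
In practice the two styles mix naturally: learnable layers stay as modules, while parameter-free ops go through the functional interface. A minimal sketch (layer sizes are illustrative):

import torch.nn as nn
import torch.nn.functional as F

class Net(nn.Module):
    def __init__(self):
        super(Net, self).__init__()
        self.fc1 = nn.Linear(10, 20)  # learnable: keep as a module
        self.fc2 = nn.Linear(20, 2)

    def forward(self, x):
        x = F.relu(self.fc1(x))      # non-learnable: functional style
        return self.fc2(x)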

Documentation for these functions is here: http://pytorch.org/docs/nn.html#torch-nn-functional

Faster GPU code

The initialization of the GPU backend has been made lazy. This means that it is automatically
imported and initialized when needed (and not beforehand). Doing this has improved startup times (especially on multi-GPU systems) and reduced boilerplate code.

We've also integrated support for pinned memory, which accelerates CPU to GPU transfers for specially marked buffers. Using this, we accelerated the multiprocessing data loaders.
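
As a sketch of the pinned-memory path (sizes are illustrative; pin_memory() returns a copy of the tensor in page-locked host memory):

import torch

x = torch.randn(64, 3, 224, 224)
x_pinned = x.pin_memory()  # page-locked copy: CPU-to-GPU copies are faster

if torch.cuda.is_available():
    x_gpu = x_pinned.cuda()  # transfer from pinned memory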

A rich set of examples

With the help of some of you, we've added a rich set of examples from Image Super-resolution to Neural Machine Translation.
You can explore more here: https://github.com/pytorch/examples

API Reference and Notes

We've fleshed out an API reference, now mostly complete, at docs.pytorch.org
Contributions are welcome :)

We've also added notes such as CUDA Semantics, Extending PyTorch, etc.

Multiprocessing support for CUDA

Until now, Tensor sharing via multiprocessing only worked for CPU tensors.
We've now enabled sharing of CUDA tensors when using Python 3.
You can read more notes here: http://pytorch.org/docs/notes/multiprocessing.html
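
A minimal sketch of sharing a CUDA tensor with a worker process (the worker itself is illustrative; the spawn start method is required for CUDA tensors, as the linked notes describe):

import torch
import torch.multiprocessing as mp

def worker(t):
    t += 1  # the child sees (and mutates) the same CUDA storage

if __name__ == '__main__':
    mp.set_start_method('spawn')  # required for CUDA tensors (Python 3)
    x = torch.zeros(4).cuda()
    p = mp.Process(target=worker, args=(x,))
    p.start()
    p.join()
    print(x)  # reflects the in-place update from the child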

Lua Reader

A "lua reader" has been integrated, that can load most LuaTorch .t7 files, including nn models.
nngraph models are not supported.

Example usage can be found here: https://discuss.pytorch.org/t/convert-import-torch-model-to-pytorch/37/2
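
A minimal sketch of reading a Torch7 checkpoint (the file path is a hypothetical placeholder):

from torch.utils.serialization import load_lua

# deserializes most LuaTorch .t7 files, including nn models
model = load_lua('model.t7')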

