
20th February 2017

SNIPPET

Following the Torch documentation, the snippet below implements a simple auto-encoder trained with Torch's nn.StochasticGradient trainer.

auto_encoder_trainer.lua
-- Simple auto-encoder example.
-- Uses the nn.StochasticGradient trainer for training.

require('torch')
require('nn')

N = 1000 -- number of training samples
D = 100  -- dimensionality of each sample

dataset = {}
function dataset:size()
  return N
end

-- Each sample is a pair {input, target}: the target is the constant ones
-- vector, the input a multiplicatively perturbed copy of it.
for i = 1, dataset:size() do
  local output = torch.ones(D)
  local input = torch.cmul(output, torch.randn(D)*0.05 + 1)
  dataset[i] = {input, output}
end

-- Auto-encoder: encode D dimensions down to D/2 and decode back to D.
model = nn.Sequential()
model:add(nn.Linear(D, D/2))
model:add(nn.Tanh())
model:add(nn.Linear(D/2, D))

learningRate = 0.01

criterion = nn.AbsCriterion()
trainer = nn.StochasticGradient(model, criterion)
trainer.learningRate = learningRate
trainer:train(dataset)
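As a quick sanity check (a sketch of mine, not part of the original snippet), one can push a fresh noisy sample through the trained model and report the criterion's error on it; `model:forward` and `criterion:forward` are the standard Torch calls for this:

```lua
-- Illustrative sanity check, run after trainer:train(dataset):
-- reconstruct a fresh noisy sample and print its mean absolute error.
local target = torch.ones(D)
local input = torch.cmul(target, torch.randn(D)*0.05 + 1)

local reconstruction = model:forward(input)
local err = criterion:forward(reconstruction, target)
print(string.format('mean absolute reconstruction error: %f', err))
```

If training converged, this error should be noticeably smaller than the roughly 0.04 mean absolute deviation of the raw noisy input from the target.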

What is your opinion on the code snippet? Does it work for you? Let me know your thoughts in the comments below!