Is it possible in arraymancer to apply a function over slices along an axis? For example
let t = [1,2,3,4].toTensor().reshape(2,2)
assert t.map(max, axis=1) == [2,4].toTensor().reshape(1,2)
reduce does something along these lines, but sums up the results.
import arraymancer
import sugar
# Create a tensor of value between 0 and 10
var tensor = randomTensor[float](2, 2, 10)
echo tensor
tensor[_, 0] = tensor[_, 0].map(x => x*2)
echo tensor
This is the easiest way I found.
Slicing does not actually create a mutable tensor, so this will not work:
# This does not compile
apply_inline(tensor[_, 0]):
  x*2
Otherwise, you can also iterate on an axis (see https://mratsim.github.io/Arraymancer/accessors.html)
I'll try to see if I can provide a better example when I have access to my computer (formatting code on a phone isn't practical).
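For reference, axis iteration along the lines of the accessors page linked above looks roughly like this (a minimal sketch; the per-slice reduction is done manually with `max`, and the exact shapes of the yielded slices should be checked against the docs):

```nim
import arraymancer

let t = [1, 2, 3, 4].toTensor().reshape(2, 2)

# Iterate over the sub-tensors along axis 0 (the rows)
# and reduce each slice manually.
var rowMaxes: seq[int]
for row in t.axis(0):
  rowMaxes.add(row.max())

echo rowMaxes.toTensor()  # one max per row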
You can, by storing the slice in a variable before passing it to apply. I tried to make it possible without the intermediate assignment, but had issues when trying to return var Tensor from slicing.
Example usage for RNN/GRU implementation: https://github.com/mratsim/Arraymancer/blob/1a2422a1/src/arraymancer/nn_primitives/nnp_gru.nim#L76-L84
# Step 2 - Computing reset (r) and update (z) gate
var W2ru = W3x[_, srz] # shape [batch_size, 2*H] - we reuse the previous buffer
apply2_inline(W2ru, U3h[_, srz]):
  sigmoid(x + y)
# Step 3 - Computing candidate hidden state ñ
var n = W3x[_, s] # shape [batch_size, H] - we reuse the previous buffer
apply3_inline(n, W2ru[_, sr], U3h[_, s]):
  tanh(x + y * z)
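Stripped of the RNN context, the pattern is just this (a minimal sketch; it assumes, as in the GRU code above, that a slice shares the underlying buffer with the original tensor, so mutating the stored slice mutates the original):

```nim
import arraymancer

var t = [1.0, 2.0, 3.0, 4.0].toTensor().reshape(2, 2)

# Store the slice in a variable first: apply_inline needs a
# `var Tensor`, which the slicing expression itself cannot provide.
var col = t[_, 0]
apply_inline(col):
  x * 2.0

# If the slice is a view on t's buffer, the first column
# of t is now doubled as well.
echo t
```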