[![Build Status](https://github.com/mcabbott/TensorCast.jl/workflows/CI/badge.svg)](https://github.com/mcabbott/TensorCast.jl/actions?query=workflow%3ACI)
This package lets you work with multi-dimensional arrays in index notation,
by defining a few macros which translate this to broadcasting, permuting, and reducing operations.

The first is `@cast`, which deals both with "casting" into new shapes (including going to and from an array-of-arrays) and with broadcasting:

```julia
@cast A[row][col] := B[row, col]        # slice a matrix B into rows, also @cast A[r] := B[r,:]

@cast C[(i,j), (k,ℓ)] := D.x[i,j,k,ℓ]   # reshape a 4-tensor D.x to give a matrix

@cast E[φ,γ] = F[φ]^2 * exp(G[γ])       # broadcast E .= F.^2 .* exp.(G') into existing E

@cast _[i] := isodd(i) ? log(i) : V[i]  # broadcast a function of the index values

@cast T[x,y,n] := outer(M[:,n])[x,y]    # generalised mapslices, vector -> matrix function
```
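
For a sense of the translation, the first and third lines above correspond roughly to the following base Julia (a sketch; TensorCast's actual output differs in details such as views vs. copies):

```julia
B = rand(3, 4)
A = collect(eachrow(B))             # ≈ @cast A[row][col] := B[row, col]

F, G = rand(3), rand(4)
E = F .^ 2 .* exp.(permutedims(G))  # ≈ @cast E[φ,γ] := F[φ]^2 * exp(G[γ])
```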
Second, `@reduce` takes sums (or other reductions) over the indicated directions. Among such sums is
matrix multiplication, which can be done more efficiently using `@matmul` instead:

```julia
@matmul M[i,j] := sum(k,k′) U[i,k,k′] * V[(k,k′),j]   # matrix multiplication, plus reshape
```
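
A plain reduction, for comparison (an illustrative example):

```julia
using TensorCast
A = rand(2, 3)
@reduce S[i] := sum(j) A[i,j]^2   # S[i] = Σⱼ A[i,j]², i.e. vec(sum(abs2, A; dims=2))
```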

The same notation with `@cast` applies a function accepting the `dims` keyword, without reducing:

```julia
@cast W[i,j,c,n] := cumsum(c) X[c,i,j,n]^2   # permute, broadcast, cumsum(; dims=3)
```

All of these are converted into array commands like `reshape` and `permutedims`,
and `eachslice`, plus a [broadcasting expression](https://julialang.org/blog/2017/01/moredots) if needed,
and `sum`/`sum!`, or `*`/`mul!`. This package just provides a convenient notation.
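
For example, the `cumsum` line above works out to roughly this (a sketch; TensorCast's permutation may be lazy rather than an eager `permutedims`):

```julia
X = rand(2, 3, 4, 5)                                    # X[c,i,j,n]
W = cumsum(permutedims(X, (2, 3, 1, 4)) .^ 2; dims=3)   # W[i,j,c,n]
```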

From version 0.4, it relies on [TransmuteDims.jl](https://github.com/mcabbott/TransmuteDims.jl)
to handle re-ordering of dimensions, and [LazyStack.jl](https://github.com/mcabbott/LazyStack.jl)
to handle slices. It should also now work with [OffsetArrays.jl](https://github.com/JuliaArrays/OffsetArrays.jl):

```julia
using OffsetArrays
@cast R[n,c] := n^2 + rand(3)[c]  (n in -5:5)   # arbitrary indexing starts
```
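
Built by hand, that is roughly (a sketch using OffsetArrays' own constructor):

```julia
using OffsetArrays
R = OffsetArray((-5:5) .^ 2 .+ permutedims(rand(3)), -5:5, 1:3)
axes(R, 1)   # spans -5:5
```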

And it can be used with some packages which modify broadcasting, now with the following notation:

```julia
using Strided, LoopVectorization, LazyArrays
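
# usage sketch (assumed forms): prefixing a second macro applies it
# to the broadcasting expression which @cast / @reduce generate
@cast @strided E[φ,γ] = F[φ]^2 * exp(G[γ])            # Strided.jl, multi-threaded
@reduce @turbo S[i] := sum(n) -P[i,n] * log(P[i,n])   # LoopVectorization.jl, SIMD
@reduce @lazy M[i,j] := sum(k) U[i,k] * V[j,k]        # LazyArrays.jl, not materialised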
```

## Installation

```julia
using Pkg; Pkg.add("TensorCast")
```
The current version requires [Julia 1.4](https://julialang.org/downloads/) or later.
There are a few pages of [documentation](https://mcabbott.github.io/TensorCast.jl/dev).
## Elsewhere

Similar notation is also used by some other packages, although all of them use an implicit sum over
repeated indices. [TensorOperations.jl](https://github.com/Jutho/TensorOperations.jl) performs
Einstein-convention contractions and traces.
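
For instance, summation over the repeated indices `j` and `k` below is implicit (an illustrative sketch of that notation):

```julia
using TensorOperations
B, C, D = rand(2,3), rand(3,4), rand(4)
@tensor A[i] := B[i,j] * C[j,k] * D[k]   # contracts j and k: A == B * (C * D)
```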

Its `@tensor`, like `@ein` from [OMEinsum.jl](https://github.com/under-Peter/OMEinsum.jl), is close to [`einsum`](https://numpy.org/doc/stable/reference/generated/numpy.einsum.html).

This was a holiday project to learn a bit of metaprogramming, originally `TensorSlice.jl`.
But it suffered a little scope creep.