Commit 62a9939 ("readme")
1 parent c876d1a

File tree

2 files changed: +21 -12 lines changed

README.md (+19 -11)
````diff
@@ -6,20 +6,20 @@
 [![Build Status](https://github.com/mcabbott/TensorCast.jl/workflows/CI/badge.svg)](https://github.com/mcabbott/TensorCast.jl/actions?query=workflow%3ACI)
 
 This package lets you work with multi-dimensional arrays in index notation,
-by defining a few macros.
+by defining a few macros which translate this to broadcasting, permuting, and reducing operations.
 
 The first is `@cast`, which deals both with "casting" into new shapes (including going to and from an array-of-arrays) and with broadcasting:
 
 ```julia
-@cast A[row][col] := B[row, col]        # slice a matrix B into rows, also @cast A[r] := B[r,:]
+@cast A[row][col] := B[row, col]        # slice a matrix B into rows, also @cast A[r] := B[r,:]
 
-@cast C[(i,j), (k,ℓ)] := D.x[i,j,k,ℓ]   # reshape a 4-tensor D.x to give a matrix
+@cast C[(i,j), (k,ℓ)] := D.x[i,j,k,ℓ]   # reshape a 4-tensor D.x to give a matrix
 
-@cast E[φ,γ] = F[φ]^2 * exp(G[γ])       # broadcast E .= F.^2 .* exp.(G') into existing E
+@cast E[φ,γ] = F[φ]^2 * exp(G[γ])       # broadcast E .= F.^2 .* exp.(G') into existing E
 
-@cast _[i] := isodd(i) ? log(i) : V[i]  # broadcast a function of the index values
+@cast _[i] := isodd(i) ? log(i) : V[i]  # broadcast a function of the index values
 
-@cast T[x,y,n] := outer(M[:,n])[x,y]    # generalised mapslices, vector -> matrix function
+@cast T[x,y,n] := outer(M[:,n])[x,y]    # generalised mapslices, vector -> matrix function
 ```
 
 Second, `@reduce` takes sums (or other reductions) over the indicated directions. Among such sums is
````
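For intuition, the slicing line in the hunk above can be imitated with `eachslice` — a rough sketch, not the macro's exact expansion:

```julia
B = [10i + j for i in 1:2, j in 1:3]   # 2×3 matrix: [11 12 13; 21 22 23]

# roughly what `@cast A[row][col] := B[row, col]` produces:
A = collect(eachslice(B; dims=1))      # a vector of 2 row-slices

A[1] == [11, 12, 13]                   # true: the first row of B
```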
````diff
@@ -33,7 +33,7 @@ matrix multiplication, which can be done more efficiently using `@matmul` instead:
 @matmul M[i,j] := sum(k,k′) U[i,k,k′] * V[(k,k′),j]  # matrix multiplication, plus reshape
 ```
 
-This notation with `@cast` applies a function which takes the `dims` keyword, without reducing:
+The same notation with `@cast` applies a function accepting the `dims` keyword, without reducing:
 
 ```julia
 @cast W[i,j,c,n] := cumsum(c) X[c,i,j,n]^2  # permute, broadcast, cumsum(; dims=3)
````
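For intuition, the `@matmul` line in the hunk above amounts to fusing the two contracted indices with `reshape` and then calling `*` — a hand-written sketch with made-up sizes, not the macro's exact expansion:

```julia
U = rand(2, 3, 4)        # indices i, k, k′
V = rand(3*4, 5)         # first axis plays the role of the combined (k,k′)

# by hand, what `@matmul M[i,j] := sum(k,k′) U[i,k,k′] * V[(k,k′),j]` computes:
M = reshape(U, 2, 12) * V   # fuse (k,k′) into one axis, then ordinary matrix multiplication

size(M) == (2, 5)           # true
```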
````diff
@@ -43,7 +43,16 @@ All of these are converted into array commands like `reshape` and `permutedims`
 and `eachslice`, plus a [broadcasting expression](https://julialang.org/blog/2017/01/moredots) if needed,
 and `sum` / `sum!`, or `*` / `mul!`. This package just provides a convenient notation.
 
-It can be used with some other packages which modify broadcasting:
+From version 0.4, it relies on [TransmuteDims.jl](https://github.com/mcabbott/TransmuteDims.jl)
+to handle re-ordering of dimensions, and [LazyStack.jl](https://github.com/mcabbott/LazyStack.jl)
+to handle slices. It should also now work with [OffsetArrays.jl](https://github.com/JuliaArrays/OffsetArrays.jl):
+
+```julia
+using OffsetArrays
+@cast R[n,c] := n^2 + rand(3)[c]  (n in -5:5)  # arbitrary indexing starts
+```
+
+And it can be used with some packages which modify broadcasting, now with the following notation:
 
 ```julia
 using Strided, LoopVectorization, LazyArrays
````
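The new OffsetArrays example in the hunk above produces an array whose first axis runs over n = -5:5; a hand-written equivalent looks like this (a sketch of the result, not the macro's actual expansion):

```julia
using OffsetArrays

r = rand(3)
# roughly what `@cast R[n,c] := n^2 + rand(3)[c]  (n in -5:5)` builds:
R = OffsetArray([n^2 + r[c] for n in -5:5, c in 1:3], -5:5, 1:3)

axes(R, 1) == -5:5       # true: the row axis starts at -5
R[-5, 1] == 25 + r[1]    # true
```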
````diff
@@ -55,15 +64,15 @@ using Strided, LoopVectorization, LazyArrays
 ## Installation
 
 ```julia
-] add TensorCast
+using Pkg; Pkg.add("TensorCast")
 ```
 
 The current version requires [Julia 1.4](https://julialang.org/downloads/) or later.
 There are a few pages of [documentation](https://mcabbott.github.io/TensorCast.jl/dev).
 
 ## Elsewhere
 
-Similar notation is used by some other packages, although all of them use an implicit sum over
+Similar notation is also used by some other packages, although all of them use an implicit sum over
 repeated indices. [TensorOperations.jl](https://github.com/Jutho/TensorOperations.jl) performs
 Einstein-convention contractions and traces:
 
````
````diff
@@ -102,4 +111,3 @@ while `@ein` & `@tensor` are closer to [`einsum`](https://numpy.org/doc/stable/r
 This was a holiday project to learn a bit of metaprogramming, originally `TensorSlice.jl`.
 But it suffered a little scope creep.
 
-From version 0.4, it relies on two helper packages: [TransmuteDims.jl](https://github.com/mcabbott/TransmuteDims.jl) handles permutations & reshapes, and [LazyStack.jl](https://github.com/mcabbott/LazyStack.jl) handles slices.
````

docs/src/index.md (+2 -1)

````diff
@@ -21,7 +21,8 @@ Version 0.4 has significant changes:
 
 New features in 0.4:
 - Indices can appear outside of indexing: `@cast A[i,j] = i+j` translates to `A .= axes(A,1) .+ axes(A,2)'`
-- The ternary operator `? :` can appear on the right, and will be broadcast correctly.
+- The ternary operator `? :` can appear on the right, and will be broadcast correctly.
+- All operations should now support [OffsetArrays.jl](https://github.com/JuliaArrays/OffsetArrays.jl).
 
 ## Pages
 
````