Tensors.jl

Efficient computations with symmetric and non-symmetric tensors with support for automatic differentiation.

Introduction

This Julia package provides fast operations with symmetric and non-symmetric tensors of order 1, 2 and 4. The tensors are allocated on the stack, which means that there is no need to preallocate output results for performance. Unicode infix operators are provided so that tensor expressions in the source code closely resemble the corresponding mathematical notation. When possible, the symmetry of tensors is exploited for better performance. The package also supports automatic differentiation to easily compute first and second order derivatives of tensorial functions.
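
As an illustration, here is a minimal sketch of the kind of usage described above: creating small stack-allocated tensors, combining them with Unicode infix operators, and differentiating a scalar-valued tensorial function. The snippet assumes the exported names Tensor, SymmetricTensor, ⊡, ⊗, gradient and hessian; consult the documentation for the authoritative API.

using Tensors

# First and second order tensors of dimension 3, allocated on the stack
x = rand(Tensor{1, 3})            # first order tensor (a vector)
A = rand(SymmetricTensor{2, 3})   # symmetric second order tensor

# Unicode infix operators keep the source close to mathematical notation
b = A ⋅ x    # single contraction:  b_i = A_ij x_j
s = A ⊡ A    # double contraction:  s = A_ij A_ij
D = x ⊗ x    # outer (dyadic) product:  D_ij = x_i x_j

# Automatic differentiation of a scalar-valued tensorial function
ψ(C) = tr(C) + det(C)
g = gradient(ψ, A)   # first derivative: a symmetric second order tensor
H = hessian(ψ, A)    # second derivative: a symmetric fourth order tensor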

Installation

The package can be installed with the Julia package manager. From the Julia REPL, type ] to enter the Pkg REPL mode and run:

pkg> add Tensors

Or, equivalently, via the Pkg API:

julia> import Pkg; Pkg.add("Tensors")

Documentation

  • STABLE: most recently tagged version of the documentation.
  • LATEST: in-development version of the documentation.

Project Status

The package is tested against Julia 1.X on Linux, macOS, and Windows.

Contributing and Questions

Contributions are very welcome, as are feature requests and suggestions. Please open an issue if you encounter any problems.

Things to work on

If you are interested in contributing to Tensors.jl, here are a few topics that can get you started:

  • Implement support for third order tensors. These are used less often than first, second and fourth order tensors, but are still useful in some applications, so it would be good to support them.
  • Find a way to reduce code duplication without sacrificing performance or compilation time. Currently there is quite a lot of code duplication in the implementation of the different operators. It should be possible to build a higher level code generation framework that produces optimized functions from little more than the Einstein summation notation of each operation (a minimal illustrative sketch follows this list).
  • Tensors.jl has been developed mostly with applications in continuum mechanics in mind. Other fields may benefit from additional tensor operations; implement these with good performance, test coverage, and documentation.
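
As a purely illustrative sketch (not part of Tensors.jl), the snippet below shows one way such a code generation step could work: a fully unrolled single contraction C_ij = A_ik B_kj is built as a set of expressions from its index form and then compiled with @eval. A real framework would operate on the package's tensor types and cover arbitrary Einstein summation expressions; all names here are hypothetical.

dim = 3
body = Expr[]
for j in 1:dim, i in 1:dim
    # sum over the repeated index k: C[i, j] = A[i, 1] * B[1, j] + A[i, 2] * B[2, j] + ...
    terms = [:(A[$i, $k] * B[$k, $j]) for k in 1:dim]
    push!(body, :(C[$i, $j] = $(Expr(:call, :+, terms...))))
end

# Compile a fully unrolled kernel from the generated expressions (hypothetical helper)
@eval function unrolled_contract!(C, A, B)
    @inbounds begin
        $(body...)
    end
    return C
end

A = rand(3, 3); B = rand(3, 3); C = zeros(3, 3)
unrolled_contract!(C, A, B)   # equivalent to C .= A * B, but fully unrolled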

Citing Tensors.jl

If you use Tensors.jl for research and publication, please cite the following article:

@article{Tensors.jl,
  title = {Tensors.jl -- Tensor Computations in Julia},
  author = {Carlsson, Kristoffer and Ekre, Fredrik},
  year = {2019},
  journal = {Journal of Open Research Software},
  doi = {10.5334/jors.182},
}

Related packages

Both of the packages below provide a convenience macro for Einstein summation notation with standard Julia Arrays:
