
SciPost Submission Page

The ITensor Software Library for Tensor Network Calculations

by Matthew Fishman, Steven R. White, E. Miles Stoudenmire

Submission summary

Authors (as registered SciPost users): Edwin Miles Stoudenmire
Submission information
Preprint Link: scipost_202007_00059v2  (pdf)
Code repository: https://github.com/ITensor/ITensors.jl
Date accepted: 2022-03-22
Date submitted: 2021-12-19 02:28
Submitted by: Stoudenmire, Edwin Miles
Submitted to: SciPost Physics Codebases
Ontological classification
Academic field: Physics
Specialties:
  • Condensed Matter Physics - Computational
Approach: Computational

Abstract

ITensor is a system for programming tensor network calculations with an interface modeled on tensor diagram notation, which allows users to focus on the connectivity of a tensor network without manually bookkeeping tensor indices. The ITensor interface rules out common programming errors and enables rapid prototyping of tensor network algorithms. After discussing the philosophy behind the ITensor approach, we show examples of each part of the interface including Index objects, the ITensor product operator, tensor factorizations, tensor storage types, algorithms for matrix product state (MPS) and matrix product operator (MPO) tensor networks, quantum number conserving block-sparse tensors, and the NDTensors library. We also review publications that have used ITensor for quantum many-body physics and for other areas where tensor networks are increasingly applied. To conclude we discuss promising features and optimizations to be added in the future.
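As a brief illustration of the interface described in the abstract, here is a minimal sketch assuming the Julia version (ITensors.jl) around the time of this submission; the index names and dimensions are chosen purely for illustration. Tensors that share an Index object are contracted over it by the * product operator, with no positional bookkeeping:

    using ITensors

    # Indices carry their own identity, so contraction is determined by
    # which indices two tensors share, not by their positions.
    i = Index(2, "i")
    j = Index(3, "j")
    k = Index(4, "k")

    A = randomITensor(i, j)
    B = randomITensor(j, k)

    C = A * B   # contracts over the shared index j, leaving indices (i, k)
    @show inds(C)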

Author comments upon resubmission

We thank the referees and editors for their patience while we prepared our resubmission and hope it reflects their many helpful suggestions. Thank you to the referees for the detailed reviews and for taking the time to review such a long manuscript. Below we list the major changes. Of course we have also made many smaller changes that are too numerous to list, such as updating code examples to use newer syntax reflecting updates we have made to ITensor, or updating descriptions of features we are currently working on.

Major changes:

  1. We have added two sections discussing benchmarks of ITensor. The first presents extensive benchmarks of ITensor itself, covering both the C++ and Julia implementations, across multiple algorithms and using various kinds of multi-threaded parallelism. The second set of benchmarks, which compares ITensor to the state-of-the-art TeNPy software, is discussed more briefly in the paper and is hosted externally.

  2. We have revised the introduction to discuss more aspects of the unusual choice to have "intelligent" tensor indices: why it is helpful, some drawbacks it can present, and how to handle them. We have also contrasted the ITensor interface with other tensor libraries and their interfaces in somewhat more depth, while trying to keep the writing broad enough to be "future proof", given that other libraries are always changing and adding features.

  3. We discuss the various ITensor storage types in additional depth, with more implementation details and references to other parts of the paper where they are discussed further.

  4. On a related note, the NDTensors section has been expanded, including a more detailed example of a block-sparse tensor with a figure.

  5. Please note the link to further online code examples, added at the end of Section 2 and the beginning of Appendix A. We are continually updating and expanding these examples.

List of changes

Specific Replies to Referees
--------------------------------------

# Referee 1

Thank you for the suggestion of adding benchmarks, which does make the paper much stronger. We have now added extensive benchmarks of the Julia versus C++ versions of ITensor, showing they have similar performance, with the Julia version being even faster in many cases.

# Referee 2

Thank you for the detailed feedback, and especially for the suggestion of adding benchmarks.

Regarding point (1), we have added new material to the introduction to put ITensor in more context. Of course we feel the intelligent index design is a good one, and we continually find it to have many benefits (for example, as we now work on an automatic fermion system, automatic differentiation tools, etc.). But we certainly agree that there are risks or possible downsides of the design that have to be thought about: for example, we now specifically note how users can gain manual control over index ordering and memory layout when needed, as in the brief sketch below.
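To illustrate the kind of manual control mentioned above, here is a minimal sketch assuming the ITensors.jl interface (the index names and dimensions are hypothetical): permute fixes an explicit index ordering for an ITensor, and converting to a plain Julia Array with a chosen index order exposes the data in that explicit layout.

    using ITensors

    i = Index(2, "i")
    j = Index(3, "j")
    A = randomITensor(i, j)

    # Request a specific index ordering (and hence storage ordering) when needed:
    Ap = permute(A, j, i)

    # Extract the data as a plain Julia array with an explicit index order:
    M = Array(A, j, i)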

To address point (2), we have added references to relevant papers on the conservation of quantum numbers (i.e. symmetric tensors) within the section on QN-conserving ITensors. Thank you for this suggestion; there has of course been much work on this topic in the tensor literature.

Regarding (3), we have now added extensive benchmarks of the C++ versus Julia implementations of ITensor, as well as detailed benchmarks against the TeNPy software hosted online. We agree that the paper greatly benefits from including these.

# Referee 3

Thank you for the detailed review. In the new section on benchmarks, we have also provided links to our ITensorBenchmarks GitHub repository, which includes instructions for running or re-running the benchmark codes with various adjustable options. We plan to continue working on this repository over time, not only adding more benchmarks but also making it easier to use, with more adjustable options.

We have also provided a link to the examples code folder and made it more prominent in our online ITensor documentation. Thank you for this suggestion, as these examples were previously too hard to find.

Thank you for the writing corrections, which we have implemented.

Published as SciPost Phys. Codebases 4-r0.3 (2022), SciPost Phys. Codebases 4 (2022)


Reports on this Submission

Anonymous Report 3 on 2022-3-13 (Invited Report)

  • Cite as: Anonymous, Report on arXiv:scipost_202007_00059v2, delivered 2022-03-13, doi: 10.21468/SciPost.Report.4676

Report

The new version accommodates the requested changes and is generally improved across the whole manuscript. It now meets the requirements to be accepted for publication.

One suggestion: it might be useful to plot the benchmarks using a logarithmic scale for the runtime. This would show the scaling as a function of bond dimension more clearly and, more importantly, would allow a clearer visualisation of the differences between the different data sets at small values of the bond dimension. In the current version, the results collapse at small runtimes due to the range of the scale.

  • validity: -
  • significance: -
  • originality: -
  • clarity: -
  • formatting: -
  • grammar: -

Anonymous Report 2 on 2022-2-15 (Invited Report)

Weaknesses

None

Report

This updated version of the manuscript includes extensive benchmarks of the C++ and Julia versions compared to TeNPy. Also, the NDTensors section has been expanded and a link to example codes has been provided. With these improvements, the manuscript provides a well-written document for the ITensor package.

Requested changes

None

  • validity: top
  • significance: high
  • originality: top
  • clarity: top
  • formatting: excellent
  • grammar: perfect

Report 1 by Johannes Hauschild on 2022-1-10 (Invited Report)

Report

I thank the authors for incorporating the suggested changes.
The software and manuscript now easily meet the acceptance criteria, and I recommend a publication.

Requested changes

The sentences "We are also developing tools for visualizing ... based on Makie.jl" are duplicated in the section "Future Directions".

  • validity: top
  • significance: top
  • originality: top
  • clarity: top
  • formatting: perfect
  • grammar: perfect
