
🚀 XAD: Powerful Automatic Differentiation for C++ & Python

XAD is the ultimate solution for automatic differentiation, combining ease of use with high performance. It's designed to help you differentiate complex applications with speed and precision, whether you're optimizing neural networks, solving scientific problems, or performing financial risk analysis.


🌟 Why XAD?

XAD is trusted by professionals for its speed, flexibility, and scalability in fields ranging from machine learning and scientific computing to financial risk analysis.

Key Features

💻 Example

The example below calculates the first-order derivatives of a function with two inputs and one output, using XAD in adjoint mode.

```c++
Adouble x0 = 1.3;           // initialise inputs
Adouble x1 = 5.2;
tape.registerInput(x0);     // register independent variables
tape.registerInput(x1);     // with the tape
tape.newRecording();        // start recording derivatives
Adouble y = func(x0, x1);   // run main function
tape.registerOutput(y);     // register the output variable
derivative(y) = 1.0;        // seed output adjoint to 1.0
tape.computeAdjoints();     // roll back adjoints to inputs
cout << "dy/dx0=" << derivative(x0) << "\n"
     << "dy/dx1=" << derivative(x1) << "\n";
```
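The snippet above assumes that a tape object and the `Adouble` alias have already been set up. As a rough, self-contained sketch of what that setup might look like, the program below uses XAD's adjoint-mode interface (`xad::adj<double>` with its `active_type` and `tape_type` aliases, as assumed here) and a hypothetical `func` chosen purely for illustration:

```c++
#include <XAD/XAD.hpp>
#include <iostream>

// Hypothetical two-input, one-output function used only for illustration.
template <class T>
T func(const T& x0, const T& x1)
{
    return x0 * x1 + sin(x0);   // sin resolved via XAD's math overloads for active types
}

int main()
{
    // First-order adjoint mode in double precision
    // (type aliases as assumed from XAD's adjoint-mode interface).
    using mode = xad::adj<double>;
    using Adouble = mode::active_type;
    mode::tape_type tape;               // tape that records the operations

    Adouble x0 = 1.3;                   // initialise inputs
    Adouble x1 = 5.2;
    tape.registerInput(x0);             // register independent variables
    tape.registerInput(x1);             // with the tape
    tape.newRecording();                // start recording derivatives

    Adouble y = func(x0, x1);           // run the main function

    tape.registerOutput(y);             // register the output variable
    derivative(y) = 1.0;                // seed the output adjoint to 1.0
    tape.computeAdjoints();             // roll back adjoints to the inputs

    std::cout << "dy/dx0 = " << derivative(x0) << "\n"
              << "dy/dx1 = " << derivative(x1) << "\n";
    return 0;
}
```

Built against XAD, this should print the two partial derivatives of `func` evaluated at (1.3, 5.2); see the Tutorials for the library's own complete examples.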

🚀 Getting Started

```bash
git clone https://github.com/auto-differentiation/xad.git
cd xad
mkdir build
cd build
cmake ..
make
```

For more detailed guides, refer to our Installation Guide and explore the Tutorials.

🤝 Contributing

Want to get involved? We welcome contributors from all backgrounds! Check out our Contributing Guide and join the conversation in our Discussions.

πŸ› Found a Bug?

Please report any issues through our Issue Tracker.