

Leaf

Introduction

Leaf is a Machine Intelligence Framework engineered by software developers, not scientists. It was inspired by the brilliant people behind TensorFlow, Torch, Caffe, Rust and numerous research papers and brings modularity, performance and portability to deep learning. Leaf is lean and tries to introduce minimal technical debt to your stack.

Leaf is a few months old, but thanks to its architecture and Rust, it is already one of the fastest Machine Intelligence Frameworks in the world.



See more deep neural network benchmarks on Deep Learning Benchmarks.

Leaf is portable. Run it on CPUs, GPUs, and FPGAs, on machines with an OS or on machines without one. Run it with OpenCL or CUDA. Credit goes to Collenchyma and Rust.
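A rough sketch of what backend selection looks like with Collenchyma (type names such as Backend and Native are recalled from the Collenchyma API and may differ slightly in your version):

extern crate collenchyma as co;

use co::backend::Backend;
use co::frameworks::Native;

fn main() {
    // Create a backend for the Native (CPU) framework. Swapping Native for a
    // GPU framework such as Cuda, with the matching feature flag enabled,
    // leaves the rest of the code unchanged.
    let _backend = Backend::<Native>::default().unwrap();
}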

Leaf is part of the Autumn Machine Intelligence Platform, which is working on making AI algorithms 100x more computationally efficient. It seeks to bring real-time, offline AI to smartphones and embedded devices.

We see Leaf as the core for constructing high-performance machine intelligence applications. Leaf's design makes it easy to publish independent modules that make, e.g., deep reinforcement learning, visualization and monitoring, network distribution, automated preprocessing, or scalable production deployment easily accessible for everyone.

For more info, refer to the Examples, Ecosystem / Extensions, and Support / Contact sections below.

Disclaimer: Leaf is currently in an early stage of development. If you are experiencing any bugs with features that have been implemented, feel free to create an issue.

Getting Started

If you are new to Rust you can install it as detailed here. We also recommend taking a look at the official Getting Started Guide.

If you're using Cargo, just add Leaf to your Cargo.toml:

[dependencies]
leaf = "0.2.0"

If you're using Cargo Edit, you can call:

cargo add leaf
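With the dependency in place, building a network starts from a layer configuration. The sketch below is based on our recollection of the 0.2 API; module paths and config fields such as SequentialConfig, LinearConfig, and LayerType::Linear are assumptions and may differ slightly:

extern crate leaf;

use leaf::layer::{LayerConfig, LayerType};
use leaf::layers::{LinearConfig, SequentialConfig};

fn main() {
    // Describe a tiny network: an input blob of shape [1, 784] followed by a
    // single fully connected (linear) layer with 10 outputs.
    let mut net_cfg = SequentialConfig::default();
    net_cfg.add_input("data", &vec![1, 784]);
    net_cfg.add_layer(LayerConfig::new(
        "linear",
        LayerType::Linear(LinearConfig { output_size: 10 }),
    ));
    // A Collenchyma backend then turns this config into a runnable layer.
}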

If you are on a machine that doesn't have support for CUDA or OpenCL, you can disable the default features and selectively enable only the backends you want in your Cargo.toml:

[dependencies]
leaf = { version = "0.2.0", default-features = false }

[features]
default = ["native"] # include only the ones you want to use, in this case "native"
native  = ["leaf/native"]
cuda    = ["leaf/cuda"]
opencl  = ["leaf/opencl"]
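Since these are ordinary Cargo features, your own code can be gated on them at compile time. A minimal sketch, assuming the feature names from the Cargo.toml above (the helper function is hypothetical):

// Hypothetical helper, only to illustrate compile-time backend selection
// via the feature flags defined above.
#[cfg(feature = "cuda")]
fn backend_name() -> &'static str { "cuda" }

#[cfg(all(feature = "native", not(feature = "cuda")))]
fn backend_name() -> &'static str { "native" }

fn main() {
    println!("compiled with the {} backend", backend_name());
}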

More information on the use of feature flags in Leaf can be found in FEATURE-FLAGS.md

Examples

We provide a Leaf examples repository, where we and others publish executable machine learning models built with Leaf. It features a CLI for easy usage and has a detailed guide in the project README.md.

Leaf comes with an examples directory as well, which features popular neural networks (e.g. AlexNet, OverFeat, VGG). To run them on your machine, just follow the install guide, clone this repository and then run

# The examples currently require CUDA support.
cargo run --release --no-default-features --features cuda --example benchmarks alexnet

Ecosystem / Extensions

We designed Leaf and the other crates of the Autumn Platform to be as modular and extensible as possible. More helpful crates you can use with Leaf:

  • Cuticula: Preprocessing Framework for Machine Learning
  • Collenchyma: Portable, HPC-Framework on any hardware with CUDA, OpenCL, Rust

Support / Contact

  • With a bit of luck, you can find us online in the #rust-machine-learning IRC channel on irc.mozilla.org,
  • but we are always approachable on Gitter/Leaf
  • For bugs and feature requests, you can create a GitHub issue
  • For more private matters, send us an email straight to our inbox: developers@autumnai.com
  • Refer to Autumn for more information

Contributing

Want to contribute? Awesome! We have instructions to help you get started.

Leaf has a near real-time collaboration culture, and it happens here on GitHub and on the Leaf Gitter Channel.

Unless you explicitly state otherwise, any contribution intentionally submitted for inclusion in the work by you, as defined in the Apache-2.0 license, shall be dual licensed as below, without any additional terms or conditions.

Changelog

You can find the release history at the CHANGELOG.md. We are using Clog, the Rust tool for auto-generating CHANGELOG files.

Q&A

Why Rust?

Hardware has only recently become powerful enough to support real-world usage of machine intelligence, e.g. super-human image recognition, self-driving cars, etc. To take advantage of the computational power of the underlying hardware, from GPUs to clusters, you need a low-level language that allows for control of memory. But to make machine intelligence widely accessible, you want a high-level, comfortable abstraction over the underlying hardware.

Rust allows us to cross this chasm. Rust promises performance comparable to C/C++, but with safe memory control. For now, we can use Rust wrappers around performant C libraries. But in the future, libraries rewritten in Rust will have the advantage of zero-cost, safe memory control, which will make large, parallel learning networks over CPUs and GPUs more feasible and more reliable to develop. The development of such libraries is already under way, e.g. Glium.

On the usability side, Rust offers a trait system that makes it easy for researchers and hobbyists alike to extend and work with Leaf as if it were written in a higher-level language such as Ruby, Python, or Java.
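As a schematic illustration of that idea (the trait and types below are made up for illustration and are not Leaf's actual API), a trait can describe a layer's behaviour once and be implemented for any backend:

// Schematic only: not Leaf's API. A trait describes what a layer does,
// generic over any backend type B.
trait ComputeLayer<B> {
    fn forward(&self, backend: &B, input: &[f32]) -> Vec<f32>;
}

struct Identity;

// The same layer implementation works for every backend.
impl<B> ComputeLayer<B> for Identity {
    fn forward(&self, _backend: &B, input: &[f32]) -> Vec<f32> {
        input.to_vec()
    }
}

fn main() {
    let layer = Identity;
    assert_eq!(layer.forward(&(), &[1.0, 2.0]), vec![1.0, 2.0]);
}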

Who can use Leaf?

We develop Leaf under permissive open source licenses (MIT/Apache-2.0), which, paired with its easy access and performance, makes Leaf a first-choice option for researchers and developers alike.

Why did you open source Leaf?

We believe strongly in machine intelligence and think that it will have a major impact on future innovations, products and our society. At Autumn, we experienced a lack of common, well-engineered tools for machine learning and therefore started to create a modular toolbox for machine learning in Rust. We hope that, by making our work open source, we will speed up research and development of production-ready applications and make that work easier as well.

Who is Autumn?

Autumn is a startup working on automated decision making. Autumn was started by two developers, MJ and Max. The startup is located in Berlin and recently received a pre-seed investment from Axel Springer and Plug&Play.

License

Licensed under either of

  • Apache License, Version 2.0
  • MIT license

at your option.
