OSLO: Open Source framework for Large-scale transformer Optimization
What's New:
February 15, 2022 Added kernel fusion.
February 15, 2022 Released OSLO 2.0.
February 02, 2022 Added activation checkpointing.
January 30, 2022 Released OSLO 2.0 alpha version.
December 30, 2021 Added Deployment Launcher.
December 21, 2021 Released OSLO 1.0.
What is OSLO about?
OSLO is a framework that provides various GPU-based optimization technologies for large-scale modeling. Its key features are 3D parallelism and kernel fusion, which are useful when training a large model like EleutherAI/gpt-j-6B. OSLO makes these technologies easy to use through magical compatibility with Hugging Face Transformers, which is widely considered the de facto standard as of 2021.
Installation
OSLO can be installed easily with the pip package manager. All of its dependencies, such as torch and transformers, are installed automatically by the following command. Note that the PyPI project name includes 'core': the package to install is oslo-core, not oslo.
pip install oslo-core
Basic Usage
Getting started takes only a single line of code. After that, feel free to train and run inference with a large transformer model. 😎
import oslo
model = oslo.initialize(model, "oslo-config.json")
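For a fuller picture, the sketch below shows the same call in context. It assumes a Hugging Face Transformers checkpoint (EleutherAI/gpt-j-6B here, purely as an example) and an oslo-config.json file in the working directory.

# A minimal end-to-end sketch; the checkpoint name is illustrative and
# any Hugging Face Transformers model can be substituted.
from transformers import AutoModelForCausalLM

import oslo

# Load a Hugging Face Transformers model as usual.
model = AutoModelForCausalLM.from_pretrained("EleutherAI/gpt-j-6B")

# One call wraps the model with the optimizations selected in the
# JSON config (e.g. 3D parallelism, kernel fusion).
model = oslo.initialize(model, "oslo-config.json")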
Documents
CONFIGURATION
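The CONFIGURATION document describes the full schema of oslo-config.json. As a rough illustration only, a config file could be generated programmatically as below; the key names used here (model_parallelism, tensor_parallel_size, kernel_fusion) are assumptions for the sketch, not the documented schema.

import json

# A hypothetical configuration sketch: the keys below are assumed for
# illustration only; consult the CONFIGURATION document for the real schema.
config = {
    "model_parallelism": {"enable": True, "tensor_parallel_size": 4},
    "kernel_fusion": {"enable": True},
}

with open("oslo-config.json", "w") as f:
    json.dump(config, f, indent=2)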
Administrative Notes
Citing OSLO
If you find our work useful, please consider citing:
@misc{oslo,
  author       = {Ko, Hyunwoong and Kim, Soohwan and Lee, Yohan and Park, Kyubyong},
  title        = {OSLO: Open Source framework for Large-scale transformer Optimization},
  howpublished = {\url{https://github.com/tunib-ai/oslo}},
  year         = {2021},
}
Licensing
The code of the OSLO project is licensed under the terms of the Apache License 2.0.
Copyright 2021 TUNiB Inc. (http://www.tunib.ai). All Rights Reserved.
Acknowledgements
The OSLO project is built with GPU support from the AICA (Artificial Intelligence Industry Cluster Agency).