BLOOM (BigScience Large Open-science Open-access Multilingual Language Model), a transformer-based large language model created by over 1,000 AI researchers, is released. With 176 billion parameters, it offers a free alternative to OpenAI's GPT-3 and was trained on approximately 366 billion tokens from March to July 2022. BLOOM uses a decoder-only transformer architecture based on Megatron-LM GPT-2. The project involved multiple teams, including HuggingFace, Microsoft DeepSpeed, NVIDIA Megatron-LM, IDRIS/GENCI, PyTorch, and volunteers from the BigScience Engineering workgroup. It was trained on a diverse dataset spanning 46 natural languages and 13 programming languages, amounting to 1.6 terabytes of pre-processed text and 350 billion unique tokens.
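The "decoder-only" design mentioned above means each token can attend only to itself and earlier tokens, which is what lets the model generate text left to right. The sketch below is a minimal, illustrative single-head causal self-attention in NumPy, not BLOOM's actual 176B-parameter implementation; the dimensions and weight matrices are made up for demonstration.

```python
import numpy as np

def causal_self_attention(x, Wq, Wk, Wv):
    """Single-head causal self-attention: position i may attend only to
    positions <= i, the defining property of a decoder-only transformer."""
    q, k, v = x @ Wq, x @ Wk, x @ Wv
    d = q.shape[-1]
    scores = q @ k.T / np.sqrt(d)
    # Causal mask: forbid attention to future positions.
    future = np.triu(np.ones_like(scores, dtype=bool), k=1)
    scores[future] = -np.inf
    # Numerically stable softmax over each row.
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ v

# Toy example: 4 tokens, 8-dimensional embeddings (sizes are arbitrary).
rng = np.random.default_rng(0)
seq_len, d_model = 4, 8
x = rng.normal(size=(seq_len, d_model))
Wq, Wk, Wv = (rng.normal(size=(d_model, d_model)) for _ in range(3))
out = causal_self_attention(x, Wq, Wk, Wv)
print(out.shape)  # (4, 8)
```

Because of the mask, the first token's output depends only on the first token's value vector, so the model can be trained on next-token prediction without leaking information from later positions.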