Meta Platforms Inc. has opened access to a large-scale language model to the artificial intelligence research community in an effort to support unbiased research.
Meta announced the launch of OPT-175B under a non-commercial license on its official Meta AI blog.
Open Pretrained Transformer (OPT-175B) “is a language model with 175 billion parameters trained on publicly available data sets, to allow for more community engagement in understanding this foundational new technology.”
OPT-175B is broadly comparable to OpenAI's GPT-3. It is a natural language processing system pre-trained on massive amounts of text, which allows it to be applied to new tasks without complicated setup or preparation work.
Meta stated that the publication of OPT-175B will improve researchers’ ability to understand how large-scale language models work. According to Meta AI, restricting access to such models has been “hindering progress on efforts to improve their robustness and mitigate known issues such as bias and toxicity.”
However, in the interest of responsible research, access to the model is limited to government, civil society, and academic researchers worldwide. Researchers who wish to use OPT-175B can apply through the request link. The release includes the pre-trained models as well as the code to train and use them.
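As an illustration only (the announcement itself does not describe any programming interface), below is a minimal sketch of how a researcher might experiment with one of the smaller, openly downloadable OPT checkpoints through the Hugging Face Transformers library; the facebook/opt-125m model name is an assumption here, and the full OPT-175B weights remain available only through the request form mentioned above.

    # Minimal sketch: generating text with a small OPT checkpoint via Hugging Face Transformers.
    # Assumes the openly downloadable facebook/opt-125m variant; OPT-175B itself
    # requires approval through Meta's research access request form.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    model_name = "facebook/opt-125m"  # smaller sibling of OPT-175B
    tokenizer = AutoTokenizer.from_pretrained(model_name)
    model = AutoModelForCausalLM.from_pretrained(model_name)

    prompt = "Large language models are"
    inputs = tokenizer(prompt, return_tensors="pt")

    # Produce a short greedy continuation of the prompt.
    output_ids = model.generate(**inputs, max_new_tokens=30, do_sample=False)
    print(tokenizer.decode(output_ids[0], skip_special_tokens=True))

The same pattern would apply to larger OPT variants, subject to the hardware and access constraints noted above.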