Elon Musk, whose ventures Tesla and SpaceX have produced remarkable feats of engineering, is once again in the AI spotlight. His start-up, xAI, has released its large language model Grok-1 on GitHub, making the model open-source. The move sharpens the ongoing debate between open- and closed-source approaches to AI development.
The motives behind the release are still not entirely clear. It comes against the backdrop of Musk's public clash with OpenAI, the research organization he helped found, which states a mission of developing AI for the benefit of the public. OpenAI's ChatGPT has become a household name, and although OpenAI is not a subsidiary of Microsoft, Microsoft is a major investor in the company.
Grok-1 is the kind of model that tech giants usually keep closed-source, as with OpenAI's GPT-3.5 and GPT-4. Such models remain behind the restrictive walls of private laboratories. Grok-1 stands in contrast: a 314-billion-parameter mixture-of-experts model designed for public access and a wide range of applications.
This open-source approach invites researchers and developers to inspect the foundations of Grok-1 directly, a likely source of community-driven enhancements to its capabilities. The code is released under the Apache 2.0 license, whose terms allow users to use, modify, and redistribute it freely, including in commercial products.
The public availability of Grok-1 is a significant moment for the open-source movement. If the community builds on this foundation with sound AI practices, the release could help usher in a new phase of AI technology.