GPT-J – A publicly available open-source alternative to GPT-3

Applied artificial intelligence offers transformative potential, especially in the enterprise. The technology is already being used to power chatbots, convert natural language into structured query languages, create spreadsheets, and improve search results. GPT-3, OpenAI's best-known machine-learning text-generation engine, now runs in more than 300 apps and produces 4.5 billion words per day.

GPT-3 has been described as “one of the most interesting and important AI systems ever produced.” After being trained on 45 terabytes of text data, it can compose essays and write computer code. Its ability has stunned many and prompted a rethink of what machine intelligence can do.

GPT-J Model

EleutherAI's GPT-J model contains roughly 6 billion parameters (the weights learned during training), which is why it is often referred to as GPT-J-6B.
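As a rough sanity check, that 6-billion figure can be reproduced from the model's published architecture (28 transformer layers, a hidden size of 4096, a feed-forward size of 16384, and a 50,400-token vocabulary). The back-of-envelope sketch below ignores biases and layer norms:

```python
# Back-of-envelope parameter count for GPT-J-6B from its published
# architecture; biases and layer norms are ignored for simplicity.
d_model, n_layers, d_ff, vocab = 4096, 28, 16384, 50400

embeddings = vocab * d_model         # input token embeddings
attention  = 4 * d_model * d_model   # Q, K, V and output projections
mlp        = 2 * d_model * d_ff      # up- and down-projections
per_layer  = attention + mlp
lm_head    = d_model * vocab         # untied output projection

total = embeddings + n_layers * per_layer + lm_head
print(f"~{total / 1e9:.2f}B parameters")  # ~6.05B, matching the name GPT-J-6B
```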

GPT-J was trained over the course of five weeks on 400 billion tokens from a dataset created by EleutherAI called The Pile, an 825GB collection of 22 smaller datasets, including academic sources (e.g., arXiv, PubMed), community sites (StackExchange, Wikipedia), code repositories (GitHub), and more. (Tokens are the units a language model reads: pieces of text that can be whole words, characters, or parts of words.)
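To make the notion of tokens concrete, here is a minimal sketch using the Hugging Face transformers library, assuming its hosted GPT-J checkpoint (model id EleutherAI/gpt-j-6B); GPT-J reuses the GPT-2 byte-pair-encoding tokenizer:

```python
# Minimal tokenization sketch (assumes the `transformers` package is
# installed and the "EleutherAI/gpt-j-6B" checkpoint is available).
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("EleutherAI/gpt-j-6B")

text = "GPT-J was trained on The Pile."
token_ids = tokenizer.encode(text)

# Words are often split into several sub-word tokens, so the token
# count usually differs from the word count.
print(tokenizer.convert_ids_to_tokens(token_ids))
print(f"{len(text)} characters -> {len(token_ids)} tokens")
```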

The Pile dataset

The Pile is an 825GB language-modeling dataset that combines data from Wikipedia, GitHub, StackExchange, PubMed, HackerNews, and other sources. Its diversity makes it well suited for training generalized language models across domains. See the paper and the download options.
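The 22 subsets are not simply concatenated: each is sampled with its own weight, so higher-quality sources appear more often during training. Here is an illustrative sketch of that weighted-mixture idea; the subset names are real, but the weights below are made up for the example (see the Pile paper for the actual proportions):

```python
import random

# Illustrative weighted mixture over a few Pile subsets.
# NOTE: weights here are invented for the example, not the paper's values.
sources = {
    "pile-cc":       0.18,
    "pubmed":        0.14,
    "arxiv":         0.09,
    "github":        0.08,
    "stackexchange": 0.05,
    "wikipedia":     0.04,
}

names, weights = zip(*sources.items())

def sample_batch(batch_size: int) -> list[str]:
    """Pick which subset each training document in a batch is drawn from."""
    return random.choices(names, weights=weights, k=batch_size)

print(sample_batch(8))  # e.g. ['pile-cc', 'pubmed', 'pile-cc', ...]
```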

GPT-J-generated texts

Despite our progress towards true AI, GPT-J reminds us that, even by today's standards, "there's much more research to be done". Here is a sample of its output:

Here are the last paragraphs of an article about AI; a hopeful conclusion that AI will be for the betterment of humanity. AI will be able to teach itself and thus improve upon its intelligence. AI will be able to communicate with everyone and thus will be able to understand human nuances. AI can be used to solve all kinds of issues, it can improve the quality of life for all.

But it doesn’t say that we’re there yet. The article gives an optimistic view of what AI will be like in the future, but it sure doesn’t give any concrete evidence or even any specific examples of what that might look like.

AI is here. It’s a field of research that has been growing exponentially for the past three decades. And it’s only getting better. There are now AI systems that can beat the world’s best players at video games such as Go, chess, and poker. There are systems that can recognize faces and translate languages. There are systems that can remember facts for you.

But that’s all the AI we have today. It’s not that AI hasn’t made such notable breakthroughs, it’s that the field is still very young and there’s much more research to be done.

GPT-J online demo

Here’s an online demo of GPT-J. You can tweak the top-p and temperature sampling parameters to play with the system. It’s definitely worth checking out if you don’t have access to OpenAI’s API.
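If you'd rather run it locally, those same two knobs map directly onto the Hugging Face transformers generation API. A minimal sketch, assuming the EleutherAI/gpt-j-6B checkpoint (the full-precision weights are large, so a float16 load is shown):

```python
# Local GPT-J generation sketch (assumes `torch` and `transformers` are
# installed and the "EleutherAI/gpt-j-6B" checkpoint is available).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("EleutherAI/gpt-j-6B")
model = AutoModelForCausalLM.from_pretrained(
    "EleutherAI/gpt-j-6B", torch_dtype=torch.float16
)

prompt = "The future of open-source language models is"
inputs = tokenizer(prompt, return_tensors="pt")

output = model.generate(
    **inputs,
    do_sample=True,     # sample instead of greedy decoding
    top_p=0.9,          # nucleus sampling: keep the smallest token set with 90% probability mass
    temperature=0.8,    # <1 sharpens the distribution, >1 flattens it
    max_new_tokens=50,
)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```

Lower temperature and top-p values make the output more conservative and repetitive; higher values make it more surprising, at the cost of coherence.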
