

What can we do with GPT-3? Here, we’re all about having fun while probing GPT-3’s abilities for creative writing tasks, primarily (but far from limited to) poetry. The scaling of GPT-2-1.5b by 116× to GPT-3-175b has worked surprisingly well and unlocked remarkable flexibility in the form of meta-learning, where GPT-3 can infer new patterns or tasks and follow instructions purely from text fed into it.
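As a toy illustration of what "purely from text fed into it" means (this example follows the few-shot translation demos in the GPT-3 paper, not one of my own samples), a prompt like the following is enough for GPT-3 to infer the task and complete the last line with "fromage":

```
sea otter => loutre de mer
peppermint => menthe poivrée
cheese =>
```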

The latest and greatest neural network for unrestricted natural language generation is OpenAI's GPT-3. Scaling works: quantity is a quality all its own. GPT-3 is like GPT-1 and the GPT-2 I've used extensively before, only much more so, and then goes beyond them in a fascinating new way. I hope you enjoy these samples even a tenth as much as I enjoyed testing GPT-3 and watching the completions scroll across my screen.
This page records GPT-3 samples I generated in my explorations, and thoughts on how to use GPT-3 and its remaining weaknesses. I was impressed by the results reported in the GPT-3 paper, and after spending a week trying it out, I remain impressed. Chatting with GPT-3 feels uncannily like chatting with a human.

I continue my AI poetry generation experiments with OpenAI's 2020 GPT-3, which is 116× larger, and much more powerful, than the 2019 GPT-2. GPT-3, however, is not merely a quantitative tweak yielding "GPT-2 but better": it is qualitatively different, exhibiting eerie runtime learning capabilities that allow even the raw model, with zero finetuning, to "meta-learn" many textual tasks purely by example or instruction. One does not train or program GPT-3 in a normal way; instead, one engages in dialogue and writes prompts to teach GPT-3 what one wants.

Experimenting through the OpenAI Beta API in June 2020, I find that GPT-3 does not just match my finetuned GPT-2-1.5b-poetry for poem-writing quality, but exceeds it, while being versatile in handling poetry, Tom Swifty puns, science fiction, dialogue like Turing's Turing-test dialogue, literary style parodies… As the pièce de résistance, I recreate Stanislaw Lem's Cyberiad's "Trurl's Electronic Bard" poetry using GPT-3. GPT-3's samples are not just close to human level: they are creative, witty, deep, meta, and often beautiful. They demonstrate an ability to handle abstractions, like style parodies, that I have not seen in GPT-2 at all. (Along the way, I document instances of how the BPE text encoding unnecessarily damages GPT-3's performance on a variety of tasks, how best to elicit the highest-quality responses, common errors people make in using GPT-3, and test out GPT-3's improvements in NN weak points like logic or commonsense knowledge.)
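To make that workflow concrete, here is a minimal sketch of a completion request through the Beta API's Python bindings as they existed in 2020; the prompt and sampling parameters are illustrative placeholders, not the exact settings behind the samples below:

```python
import openai  # the OpenAI Beta API's Python bindings (2020-era interface)

openai.api_key = "sk-..."  # your API key

# Prompt programming: show GPT-3 one worked example of the task
# (poem-on-a-topic), then leave the next instance for it to complete.
prompt = """Write a poem on the given topic.

Topic: autumn
Poem: The leaves let go without complaint,
And drift like letters never sent.

Topic: the sea
Poem:"""

response = openai.Completion.create(
    engine="davinci",   # the largest available model
    prompt=prompt,
    max_tokens=60,
    temperature=0.9,    # high temperature for creative, varied completions
    top_p=0.95,
    stop=["\nTopic:"],  # cut off before GPT-3 invents a new topic itself
)
print(response["choices"][0]["text"])
```

Everything here is "programming" by prompt: changing the worked example changes the task, with no gradient updates or finetuning involved.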
