Prompt engineering, or prompt programming, is an interesting way to interact with GPT-3. It involves writing clever text-based prompts that make GPT-3 perform the tasks you want.
What is GPT-3?
GPT-3 is a language model by OpenAI. It generates AI-written text that can be virtually indistinguishable from human-written sentences and paragraphs: articles, short stories, dialogues, lyrics, and many other things.
OpenAI trained GPT-3 on a huge corpus of text, and with more than 175 billion parameters it was the largest language model ever built at the time. In non-technical terms, it was shown how millions of people write and learned to continue patterns. You give it a prompt, which can be as simple as entering some text, and the model generates intelligent text that follows the pattern you submitted.
Programming through dialogue?
GPT-3 is a huge model, both in terms of data and compute. Therefore, it behaves qualitatively differently: unlike other models, it does not need to be retrained with additional data to perform a new task.
Instead, you interact with the model, using natural language descriptions, requests, examples, and other means to communicate the task. Once the prompt has been adjusted until the model understands, the system meta-learns the task based on the high-level abstractions it acquired during pretraining. This kind of “prompt programming” is less like regular programming and more like a coaching exercise. In fact, it resembles coaching athletes: you tell them what they should do and hopefully you get the result you want. And just like in sports, you won’t always get the desired result from GPT-3.
Because you interact with GPT-3 through a prompt, you have to follow a different approach, and it also “feels” totally different. With traditional software that you implement for customers, you have to think things through first; otherwise it won’t work. With deep learning software, your focus is on providing data that in some way represents the correct answer. With GPT-3, however, it’s all about how you describe what you want. In a way, it helps to anthropomorphize GPT-3: just like with people, sometimes you get the right answer simply by asking the right question in the right format.
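To make this concrete, here is a minimal sketch of “describing what you want” in practice, using the completion endpoint of the OpenAI Python library. The summarization task, engine name, and parameters are illustrative assumptions, not a fixed recipe:

```python
# Minimal sketch: instead of retraining the model for a new task, we simply
# describe the task in the prompt. Engine name and parameters are illustrative.
import openai

openai.api_key = "YOUR_API_KEY"  # placeholder

prompt = (
    "Summarize the following customer email in one friendly sentence:\n\n"
    "Email: I ordered the blue jacket two weeks ago and it still hasn't arrived. "
    "Can you tell me what's going on?\n\n"
    "Summary:"
)

response = openai.Completion.create(
    engine="davinci",   # the largest GPT-3 engine at the time
    prompt=prompt,
    max_tokens=60,
    temperature=0.3,    # low temperature keeps the answer focused
    stop=["\n"],
)

print(response.choices[0].text.strip())
```

Switching to a new task means rewriting the prompt text, not collecting data and retraining.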
One example of GPT-3’s capabilities is the emoji transformation. Here, GPT-3 is given a few examples of how to represent film titles as emojis:
Back to Future: 👨👴🚗🕒
Batman: 🤵🦇
Transformers: 🚗🤖
Wonder Woman: 👸🏻👸🏼👸🏽👸🏾👸🏿
Winnie the Pooh: 🐻🐼🐻
The Godfather: 👨👩👧🕵🏻‍♂️👲💥
Game of Thrones: 🏹🗡🗡🏹
When you then enter a new title, GPT-3 completes the pattern and gives you results like these:
Spider-Man: 🕷️🕷️
The Avengers: 👨🏻👨🏽👨🏼👨🏽👨🏼‍♂️
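As a hedged sketch, this is how the emoji prompt above could be sent to GPT-3 via the OpenAI Python library: the seven film titles form the few-shot examples, and the model completes the line for a new title. The engine name and sampling parameters are assumptions for illustration:

```python
# Few-shot prompting: the examples establish the "title: emojis" pattern and
# the model continues it for the new title. Parameters are illustrative.
import openai

openai.api_key = "YOUR_API_KEY"  # placeholder

few_shot_prompt = """Back to Future: 👨👴🚗🕒
Batman: 🤵🦇
Transformers: 🚗🤖
Wonder Woman: 👸🏻👸🏼👸🏽👸🏾👸🏿
Winnie the Pooh: 🐻🐼🐻
The Godfather: 👨👩👧🕵🏻‍♂️👲💥
Game of Thrones: 🏹🗡🗡🏹
Spider-Man:"""

response = openai.Completion.create(
    engine="davinci",
    prompt=few_shot_prompt,
    max_tokens=30,
    temperature=0.8,   # a bit of randomness varies the emoji combinations
    stop=["\n"],       # stop at the end of the line, i.e. one title per call
)

print("Spider-Man:" + response.choices[0].text)
```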
While this is certainly a fun example that shows the capabilities of GPT-3, there is already a range of applications that make use of it. One example is a chatbot for book recommendations, sketched below:
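As a rough illustration, a book-recommendation chatbot can follow the same pattern: a short description of the bot’s role plus a few example turns, with the running conversation appended on every call. Everything below (engine, parameters, example dialogue) is a hypothetical sketch, not a production chatbot:

```python
# Chatbot-style prompt: a role description and example turns steer the model;
# the stop sequence prevents it from writing the user's next message as well.
import openai

openai.api_key = "YOUR_API_KEY"  # placeholder

chat_prompt = """The following is a conversation with a friendly assistant that recommends books.

User: I loved "The Martian". What should I read next?
Bot: If you enjoyed the mix of science and survival, try "Project Hail Mary" by Andy Weir.
User: I'd like something about artificial intelligence, but not too technical.
Bot:"""

response = openai.Completion.create(
    engine="davinci",
    prompt=chat_prompt,
    max_tokens=60,
    temperature=0.7,
    stop=["User:"],   # hand the turn back instead of inventing the user's reply
)

print("Bot:" + response.choices[0].text.strip())
```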
Want to explore more of what’s possible with GPT-3 for your business? Get in touch with us and follow our blog.