
Three new GPT-3-based apps show that large language models (LLMs) are much more than text generators.


Developer Dwarkesh Patel uses OpenAI's embeddings API for a semantic search in eBooks. An embedding is an information-dense representation of the meaning of a piece of text.

Patel takes advantage of this representation for a text search in books, which can find passages based on a scene description ("Character A and Character B meet") or on a question, for example.

The semantic search is much more flexible than the conventional Ctrl+F search function for eBooks, which returns passages only if they match the search term exactly. Patel demonstrates this in a short demo video.


Patel provides a demo version of his embeddings search for eBooks on Google Colab.
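A minimal sketch of how such an embeddings search can be put together, assuming the 2022-era openai Python package and numpy; the model name, file name, and naive passage splitting are illustrative assumptions, not details of Patel's implementation:

```python
# Sketch of an embeddings-based passage search over a book.
# Model name, file name, and paragraph splitting are assumptions.
import numpy as np
import openai

openai.api_key = "YOUR_API_KEY"

def embed(texts, model="text-embedding-ada-002"):
    """Return one embedding vector per input text."""
    resp = openai.Embedding.create(model=model, input=texts)
    return np.array([item["embedding"] for item in resp["data"]])

# Split the book into passages (here: a naive split on blank lines).
passages = open("book.txt", encoding="utf-8").read().split("\n\n")
passage_vecs = embed(passages)

def search(query, top_k=5):
    """Rank passages by cosine similarity to the query embedding."""
    q = embed([query])[0]
    sims = passage_vecs @ q / (
        np.linalg.norm(passage_vecs, axis=1) * np.linalg.norm(q)
    )
    best = np.argsort(-sims)[:top_k]
    return [(float(sims[i]), passages[i]) for i in best]

# A scene description works as a query just as well as a keyword would.
for score, text in search("Character A and Character B meet for the first time"):
    print(f"{score:.3f}  {text[:80]}")
```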

Natural language prompts for Google Sheets

Developer Shubhro Saha demonstrates another use case for GPT-3: he connects the API to Google Sheets. Using natural language prompts in spreadsheet columns, he can hand Sheets tasks that he would otherwise have to express as formulas or code, such as extracting a zip code from an address line. All it takes is the question, "What is the zip code of this address?"

GPT-3 can also write a result directly into a new column in Sheets, based on the contents of other columns. For example, GPT-3 can draft the text for a thank-you card from a name and a short list of things to mention.
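Conceptually, such a formula simply combines the instruction with a cell value and sends it to the completions API. The Python below is only a sketch of that idea; Saha's actual GPT3() function runs as a Google Sheets custom formula, and the model name and prompt format here are assumptions:

```python
# Sketch of the idea behind a GPT3()-style spreadsheet formula:
# combine an instruction with a cell value and call the completions API.
import openai

openai.api_key = "YOUR_API_KEY"

def gpt3(instruction, cell_value, model="text-davinci-002"):
    """Apply a natural language instruction to the contents of one cell."""
    prompt = f"{instruction}\n\n{cell_value}\n\nAnswer:"
    resp = openai.Completion.create(
        model=model,
        prompt=prompt,
        max_tokens=64,
        temperature=0,  # keep extraction-style tasks as deterministic as possible
    )
    return resp["choices"][0]["text"].strip()

print(gpt3("What is the zip code of this address?",
           "1600 Pennsylvania Ave NW, Washington, DC 20500"))
```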

Saha's example, however, also exposes the biggest weakness of large language models besides social and cultural biases: the systems are still too unreliable for tasks where precision is the top priority.

In the zip code example mentioned above, for instance, even the largest GPT-3 model, "text-davinci-002", fails in some cases and hallucinates incorrect numbers into the columns. Purpose-built plugins or regular Google Sheets formulas are more reliable in this scenario.
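For the zip code case specifically, a deterministic approach illustrates the difference: a simple regular expression either finds a code or fails visibly, but it never invents digits. A small Python example of that contrast:

```python
# Deterministic alternative for the zip code case: a regular expression
# either matches a five-digit code or returns None, it never invents digits.
import re

def extract_zip(address):
    match = re.search(r"\b\d{5}(?:-\d{4})?\b", address)
    return match.group(0) if match else None

print(extract_zip("1600 Pennsylvania Ave NW, Washington, DC 20500"))  # 20500
```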


If you are interested in Saha's "GPT3()" software, you can express your interest here. An alternative is a pre-filled Google Doc by Fabian Stelzer, which contains the JavaScript code that connects the GPT-3 API to the Google software. Stelzer says he can't program and had the connection code generated by GPT-3 as well.

Stelzer asked GPT-3 to generate sample code for the API connection to Google Sheets. The code works. | Image: Fabian Stelzer

Zahid Khawaja also takes advantage of the programming capability of the large language model for his "ToolBot," which uses natural language to create a prototype app based on GPT-3.

A user enters an app idea as a prompt; ToolBot then generates a simple user interface with a text input field, which processes further user input in the context of the app's function.

Khawaja says he developed ToolBot for people who want to use GPT-3 for applications but are not familiar with user interface creation or prompt engineering. They can save their created tool and share it via a link.
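ToolBot's internals are not public, but the described flow can be sketched as two steps: ask GPT-3 to turn an app idea into a reusable prompt template, then apply that template to whatever the end user types. The Python below is only a conceptual sketch; the model names and prompt wording are assumptions:

```python
# Conceptual two-step ToolBot-style flow: generate a prompt template from an
# app idea, then run end-user input through that template.
import openai

openai.api_key = "YOUR_API_KEY"

def make_tool(app_idea):
    """Ask GPT-3 to write a prompt template with an {input} placeholder."""
    meta_prompt = (
        "Write a GPT-3 prompt template for the following tool idea. "
        "Use {input} where the user's text should go.\n\n"
        f"Tool idea: {app_idea}\n\nPrompt template:"
    )
    resp = openai.Completion.create(
        model="text-davinci-002", prompt=meta_prompt,
        max_tokens=150, temperature=0.3,
    )
    return resp["choices"][0]["text"].strip()

def run_tool(template, user_input):
    """Run the generated tool on a piece of user input."""
    resp = openai.Completion.create(
        model="text-davinci-002",
        prompt=template.replace("{input}", user_input),
        max_tokens=200,
    )
    return resp["choices"][0]["text"].strip()

template = make_tool("Summarize customer feedback into three bullet points")
print(run_tool(template, "Great app, but it crashes whenever I upload photos."))
```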

Summary
  • Natural language is the programming language for the human brain. And increasingly for computers, as three new GPT-3 apps show.
  • A semantic book search, for example, can search eBooks based on scene descriptions, not just individual words or phrases.
  • GPT-3 prompts embedded in Google Sheets make it easier for people unfamiliar with spreadsheet programming to perform tasks.
  • ToolBot can prototype a GPT-3 application based on a prompt, including the user interface.
Online journalist Matthias is the co-founder and publisher of THE DECODER. He believes that artificial intelligence will fundamentally change the relationship between humans and computers.