Large Language Models (LLMs) like ChatGPT are changing how we work. GPT originally stood for Generative Pre-trained Transformer, a reference to the neural network architecture and training process behind these models. Coincidentally, GPT can also stand for General Purpose Technology, which is very apt. In the same way a pocket calculator enhances our arithmetic ability, an LLM enhances our general reasoning capabilities. Amazingly, the only skill required to leverage the power of an LLM is language. This might not be immediately obvious, but even the software developers building complex applications with LLMs are writing their instructions in plain English.
These instructions are called “prompts” and are becoming an increasingly important ingredient in the development of AI-enabled applications. Because prompts are written in plain English, traditional software engineers can be replaced with “prompt engineers.” Anyone who has ever tried to hire a software engineer would be excited at the prospect of expanding the talent pool. But there is a less obvious, more important second-order effect. Software engineers may be masters of their craft, but we can’t expect them to have domain expertise in every industry vertical. Since anyone who speaks English can be a prompt engineer (a much larger cohort than Python speakers), companies can hire specialists to guide their AIs. Instead of an engineer interviewing an accountant and translating their job into software, we can now hire an accountant to “train” the AI directly.
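To make that concrete, here is a minimal sketch of what “programming in plain English” looks like in practice. It assumes the OpenAI Python SDK; the model name and the accountant’s instructions are invented for illustration. The only part a domain expert needs to write or revise is the plain-English system prompt.

```python
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# The "program" is plain English, written by a domain expert (here, an accountant).
ACCOUNTANT_PROMPT = """You are an assistant for a small accounting firm.
Classify each expense into one of: Travel, Meals, Software, Office Supplies, Other.
If an expense may be only partially deductible (e.g. meals), flag it for review."""

def classify_expense(description: str) -> str:
    # The engineer's code is a thin wrapper; the behavior lives in the prompt above.
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # illustrative model choice
        messages=[
            {"role": "system", "content": ACCOUNTANT_PROMPT},
            {"role": "user", "content": description},
        ],
    )
    return response.choices[0].message.content

print(classify_expense("Team dinner with a client, $212.40"))
```

The engineering scaffolding stays constant; changing how expenses are classified means editing the English, not the Python.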
Some readers will recognize that this sounds like no-code, a category of tools that allows non-technical people to build simple applications. In fact, Kevin and Ivan were early proponents of this approach. At Graphiq, they developed the role of “Knowledge Engineer,” responsible for bringing domain expertise in areas such as Politics, Sports, History, and Finance, knowledge that would eventually power Alexa. There is a catch, however. For no-coders, knowledge engineers, or prompt engineers to do their jobs, they need an interface to control the system’s core functions. At Graphiq, this interface (a web application called the Knowledge Management System) was a key asset that justified the Amazon acquisition. But that interface remains locked up for internal use at Amazon.
PromptLayer has built a tool that enables any company to hire domain experts and leverage their expertise with zero software development. They offer an entire development workflow, including version control, testing, deployment, and more. Prompt engineers using PromptLayer can handle prompt management, LLM evaluations, and LLM observability without depending on the software team. We believe every single industry will have use cases that require domain experts guiding AIs. As Marc Andreessen said, “software is eating the world,” meaning that every company eventually becomes a software company. By the same logic, software companies will eventually become AI companies. In every AI application stack, there will be a layer owned by prompt engineers, and PromptLayer will be the framework of choice.
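As a rough illustration of the pattern this kind of tool enables (not PromptLayer’s actual API), the sketch below separates the prompt from the application code: prompts live in a registry that a domain expert edits and versions, and the application fetches the current approved version by name at runtime, so a prompt change never requires a code deployment. The registry service, its endpoint shape, and the template name are hypothetical.

```python
import os

import requests
from openai import OpenAI

client = OpenAI()
REGISTRY_URL = os.environ["PROMPT_REGISTRY_URL"]  # hypothetical prompt-registry service

def fetch_prompt(name: str, label: str = "production") -> str:
    """Fetch the approved version of a named prompt template.

    The endpoint is invented for illustration; the point is that the prompt
    text is owned and versioned outside the codebase.
    """
    resp = requests.get(
        f"{REGISTRY_URL}/prompts/{name}", params={"label": label}, timeout=10
    )
    resp.raise_for_status()
    return resp.json()["template"]

def answer(question: str) -> str:
    # The accountant (not an engineer) edits "expense-assistant" in the registry UI;
    # the next request picks up the new wording with no redeploy.
    system_prompt = fetch_prompt("expense-assistant")
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # illustrative model choice
        messages=[
            {"role": "system", "content": system_prompt},
            {"role": "user", "content": question},
        ],
    )
    return response.choices[0].message.content
```

The design choice is the same one the Knowledge Management System made at Graphiq: give the domain expert a controlled surface over the system’s behavior, and keep the engineering layer stable underneath it.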