November 7, 2023

Key Takeaways from OpenAI’s Dev Day!


OpenAI’s Dev Day has come and gone, and they did not disappoint. It’s no exaggeration to say that today’s updates will radically change the AI development landscape for the foreseeable future.

@sama came, he saw and he conquered

The event revealed exciting new capabilities that promise to transform AI application development, most notably GPT-4 Turbo, which delivers enhanced capabilities at significantly reduced cost.

Here’s a closer look at some event highlights:

GPT-4 Turbo: “Turbo”-charging development of new AI apps

OpenAI has set the stage for wider accessibility with the release of GPT-4 Turbo. This iteration is not only more affordable but also expands the context window to 128K tokens, accommodating more than 300 pages of text in a single prompt. You can now paste an entire novel or thesis into GPT-4 Turbo and have it summarized.

For developers working on AI apps everywhere, this means delivering richer interactions at a fraction of the cost. While it’s still far too expensive to use in every part of our product, we will likely route more difficult questions, or queries we know are prone to hallucination, to GPT-4 Turbo.
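To make that routing idea concrete, here is a minimal sketch of a cost-aware model router. The topic list, length threshold, and function name are our own illustration, not anything from OpenAI’s API; only the model name "gpt-4-1106-preview" (GPT-4 Turbo’s identifier at launch) comes from the announcement.

```python
# Hypothetical router: send routine queries to gpt-3.5-turbo and escalate
# hard or hallucination-prone ones to GPT-4 Turbo. Topics and thresholds
# here are invented for illustration.

HALLUCINATION_PRONE_TOPICS = {"legal", "medical", "pricing"}

def pick_model(question: str, topic: str) -> str:
    """Return the model name to use for this query."""
    if topic in HALLUCINATION_PRONE_TOPICS or len(question.split()) > 200:
        return "gpt-4-1106-preview"  # GPT-4 Turbo model name at launch
    return "gpt-3.5-turbo"
```

In practice the routing signal could be anything from a topic classifier to a confidence score from a first-pass cheap model; the point is simply that the price drop makes selective escalation economical.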

Function Calling and Seed Parameters: Hallucinations be-gone!

We hope they go away

The function calling feature has been refined to support multiple function calls in a single message (parallel function calling), boosting efficiency and accuracy. Additionally, the introduction of a seed parameter for reproducible outputs is a significant step towards more predictable and debuggable AI behavior.

This is huge for us, as one of the biggest challenges in developing our AI chatbot offering was hallucination. By integrating the seed parameter, we should now be able to consistently reproduce outputs for debugging and unit testing.
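As a sketch of how this might look, the snippet below builds a Chat Completions request that pins a `seed` and declares a tool. It only constructs the payload (no network call is made), and the `get_order_status` tool is a hypothetical example of ours; in real use these keyword arguments would be passed to `client.chat.completions.create(...)` from the `openai` SDK.

```python
# Build a reproducible Chat Completions request. The tool "get_order_status"
# is our own invented example; `seed` is the new reproducibility parameter.

def build_request(user_message: str, seed: int = 42) -> dict:
    return {
        "model": "gpt-4-1106-preview",
        "seed": seed,  # same seed -> (mostly) reproducible output
        "messages": [{"role": "user", "content": user_message}],
        "tools": [{
            "type": "function",
            "function": {
                "name": "get_order_status",
                "description": "Look up the status of a customer order.",
                "parameters": {
                    "type": "object",
                    "properties": {"order_id": {"type": "string"}},
                    "required": ["order_id"],
                },
            },
        }],
    }
```

Note that even with a fixed seed, determinism is best-effort: OpenAI returns a `system_fingerprint` with each response, and outputs are only expected to match while that fingerprint stays the same, so our unit tests will need to account for it.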

This will be a major focus for us to try out in our current product over the coming weeks. We’ll share updates on its efficacy in a future post!

GPTs: Build Your Own Bot

Pretty much what it says on the tin

Earlier this year, OpenAI unveiled "Custom Instructions," a feature enabling users to personalize their GPT models through precise prompt crafting. OpenAI is now taking this to the next level with the launch of the GPT Builder, which facilitates the creation of custom ChatGPT instances for specific uses without coding.

This feature empowers users to customize the AI behavior through a conversational interface. On top of that, the GPT Builder grants users unprecedented control over their AI and new capabilities, allowing them to mold their GPT to understand company-specific documents, browse the internet and generate images using DALL-E 3. This advancement unlocks a wealth of possibilities for community innovation as well as discovery via the GPT store.

Assistants API: A Comprehensive Toolkit for AI Operations

Now we come to a massive announcement for AI developers: the Assistants API. It’s a comprehensive toolkit that simplifies complex language model operations, a unified system that knows when to browse the web, run the code interpreter, and combine information, all tasks that previously required significant effort.

So much time has been spent by so many developers (including us) trying to implement RAG in our AI products. With this update, OpenAI just made it a lot easier to do so.
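As a rough sketch under our own assumptions (the assistant’s name and instructions are invented, and no network call is made), here is the kind of configuration the Assistants API takes, with OpenAI’s managed `retrieval` and `code_interpreter` tools enabled:

```python
# Payload sketch for creating an assistant. In real use this dict would be
# passed to client.beta.assistants.create(...) from the `openai` SDK; the
# name and instructions below are our own example.

def assistant_config() -> dict:
    return {
        "name": "Support Bot",
        "instructions": "Answer questions using the uploaded product docs.",
        "model": "gpt-4-1106-preview",
        "tools": [
            {"type": "retrieval"},         # managed RAG over uploaded files
            {"type": "code_interpreter"},  # sandboxed code execution
        ],
    }
```

Conversations then flow through threads and runs: create a thread, append the user message, start a run, and poll it until it completes, with retrieval handling chunking and embedding behind the scenes. That managed pipeline is exactly the part developers previously hand-rolled.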

While these advancements hold exciting potential for personal or casual use, their implications for our enterprise solutions may be more limited at this stage. Whether the Assistants API can completely replace LangChain or LlamaIndex requires rigorous hands-on testing. Due to its "black box" nature, you cannot implement custom functionalities or handlers as you can with Retrieval-Augmented Generation (RAG) and structured prompt engineering, which afford more granular control. We believe that companies implementing LLM applications at scale will require more precise controls, such as complex routing systems, security and privacy checks, and integration with non-LLM approaches for critical functions.

Nonetheless, the Assistants API could serve as an excellent tool for rapidly developing proofs of concept for validation. Abstracting RAG behind a managed service lets users ship use cases quickly without worrying about the lower-level details. The real measure of its impact will emerge over time as OpenAI allows more customizability over the LLMOps.

Our Thoughts on the Update & the Future of AI Apps

OpenAI has really outdone itself with this latest update. By dramatically reducing costs for its most powerful models (for now) and creating features that cater specifically to AI developers, it has created far more incentive to build new AI apps.

With that, however, one question remains: will OpenAI’s updates spell the end for up-and-coming AI startups, as some have feared?

Community sentiment on X right now

We don’t think so. There was plenty of discussion on social media about this sounding the death knell for startups built on top of the GPT-3.5 API. The fact of the matter is, if your AI app’s differentiation is one PR’s worth of code, it will very quickly become obsolete.

We still see a lot of value in building AI chatbots for small and medium-sized enterprises (SMEs) across the SEA region. Our view is that deep integration with platforms (e.g., Shopify) will continue to be paramount for providing value to SMEs.

The Assistant API, while a boon for developers, does not directly cater to SMEs, which is where our focus remains. We anticipate a new wave of innovation from developers using the Assistant API, which could complement our bespoke solutions.

The developments announced at OpenAI's Dev Day are a testament to the rapid progression of AI capabilities. As we integrate these advancements into our services, we are positioned to offer even more powerful and cost-effective AI solutions to our clients. 

In our view, the future of AI development is not just about technology; it's about the experiences we can create and the value we can unlock for our niche. And with these tools at our disposal, we are more ready than ever to lead the charge.

More AI apps will be developed in the coming months. Here at SUPA, we strongly believe that LLMs at their current stage of development still need human validation. We help customers working with LLMs to build better AI. Learn more about our use cases here.

We’re also leveraging our expertise to help local businesses in SEA with their AI chatbot needs! Sign up for early access here.