ChatGPT Advanced Voice Mode is Rolling Out to All Paid Users

OpenAI's Advanced Voice Mode, released on 30 July 2024 to a limited group of ChatGPT Plus alpha users, is now rolling out to all paid ChatGPT users. Plus and Team subscribers get immediate access, while Enterprise and Edu users will gain access the following week.

Note that ChatGPT Advanced Voice Mode is not yet accessible in certain regions, including the European Union (EU), Switzerland, Iceland, Norway, and Liechtenstein.

Advanced Voice was previously unavailable in the UK as well, but OpenAI has now made it available to all ChatGPT Plus users in the United Kingdom.

The advanced voice feature is not yet available when using custom GPTs; only standard voice mode works with GPTs.

ChatGPT's advanced voice assistant is also not available via the API.

ChatGPT Advanced Voice Mode (AVM) was first announced at OpenAI's May 2024 event alongside the launch of GPT-4o, the multimodal model that powers this advanced voice assistant. It was actually released later, in July of this year, but only to a limited group of ChatGPT Plus users known as alpha users.


What’s New in the Newly Rolled Out Advanced Voice Mode?

OpenAI made several changes and improvements to the Advanced Voice feature before releasing it to all paid subscribers.

Nine Voice Characters

When it was rolled out to ChatGPT Plus alpha users, it had only four voices: Breeze, Cove, Ember, and Juniper. Now, it has five additional voices: Arbor, Maple, Sol, Spruce, and Vale.

Each of these nine advanced voices has its own distinct tone and character: Breeze is animated and earnest, Cove direct and composed, Ember optimistic and confident, Juniper open and upbeat, Arbor easygoing and versatile, Sol relaxed and savvy, Maple candid and cheerful, Spruce calm and affirming, and Vale inquisitive and bright.

So you can get more out of these advanced voices by choosing one whose character and tone suit your use case.

Background Conversations

This advanced voice feature lets you continue your conversation even after you switch to another app or lock your phone screen.

For example, you can start a conversation with ChatGPT’s advanced voice assistant, asking it to play your favorite playlist for a workout. After the music starts, you decide to check your messages and then lock your phone to conserve battery. Even while your phone is locked, you can easily say, “Skip to the next song,” and the assistant will change the track for you, allowing you to stay focused on your exercise without interruption.

To enable this capability of Advanced Voice Mode, go to your ChatGPT app's settings and turn on "Background Conversations".

Memories and Custom Instructions Features

When using OpenAI's advanced voice feature, ChatGPT can create memories, and you can access previous chats to resume your voice conversations. The Custom Instructions feature can also be used to personalize your experience with this new voice assistant.

Daily Usage Limit

ChatGPT's Advanced Voice Mode has a daily usage limit, though OpenAI has not specified its exact duration. Once you reach the limit, your conversation with AVM ends, and you can continue in standard voice mode from there.

However, 15 minutes before you reach your daily limit, you'll receive a notification saying: "Almost at daily limit".

How to Access ChatGPT Advanced Voice Mode?

If you're a ChatGPT Plus user, you can access Advanced Voice Mode through the ChatGPT iOS, Android, and macOS apps.

To access it, make sure you have the latest version of the ChatGPT app installed.

What Safety Concerns Were Addressed Before this Release?

The release of Advanced Voice Mode to all paid users addressed several key safety concerns, starting with copyrighted content such as voices, songs, and music. You can neither generate music or songs in the voice of a popular singer nor have the feature converse with you in the voice of a renowned actor or politician.

Beyond that, the advanced voice system underwent rigorous testing to improve its ability to detect and block inappropriate, violent, vulgar, or harmful requests, ensuring that responses remain safe and aligned with community standards.

Additionally, voice recognition accuracy was a significant focus. The voice recognition algorithms were fine-tuned based on feedback from the alpha testing group, which minimized misunderstandings that could lead to incorrect responses, thereby enhancing the overall user experience.

Albert Haley

Albert Haley, the enthusiastic author and visionary behind ChatGPT4Online, is deeply fueled by his love for everything related to artificial intelligence (AI). Possessing a unique talent for simplifying intricate AI concepts, he is devoted to helping readers of varying expertise levels, whether they are newcomers or seasoned professionals, in navigating the fascinating realm of AI. Albert ensures that readers consistently have access to the latest and most pertinent AI updates, tools, and valuable insights. His commitment to delivering exceptional quality, precise information, and crystal-clear explanations sets his blogs apart, establishing them as a dependable and go-to resource for anyone keen on harnessing the potential of AI.