Siri in iOS 27: Everything We Know - MacRumors

Siri in iOS 27: Everything We Know

We're only months away from our first look at Apple's smarter, redesigned version of Siri. iOS 27, iPadOS 27, and macOS 27 will focus on an entirely revamped version of ‌Siri‌, and rumors about what we can expect are picking up.


There's a chatbot version of ‌Siri‌ in the works that will change the way that we use Apple's personal assistant. ‌Siri‌ will be more like Claude or ChatGPT, marking a major improvement in how ‌Siri‌ works and what it can do.

SiriBot

With iOS 27, Apple is turning Siri into a chatbot. Right now, Siri can answer common questions and complete simple tasks, but you can't engage it in a back-and-forth conversation, get help with multi-step tasks, or combine several requests in a single query.

Based on the ‌Siri‌ chatbot rumors, ‌Siri‌ will be able to do all of that and more with the upcoming upgrade, and it will work like competing chatbots.

Apple wasn't initially planning to introduce a full chatbot like ChatGPT, but chatbots have become too popular for Apple to ignore. Simply adding AI capabilities to apps and features isn't enough for Apple to stay competitive with the way people have embraced chatbots for everything from web searches to coding help.

Google has already integrated Gemini into its Android device lineup, and chatbots like ChatGPT and Claude have hundreds of millions of weekly active users. Apple can't afford not to compete.

Standalone Siri App

When Siri evolves into an Apple-designed chatbot, it will launch alongside a standalone Siri app. The Siri app will look similar to apps from other companies like OpenAI, displaying a grid or list of past Siri conversations.

‌Siri‌ will support text and voice-based conversations, and there will be options to favorite chats, search for content within chats, initiate new chats, and save chats. Conversations with ‌Siri‌ will apparently resemble iMessage conversations, with Apple adopting chat bubbles.

New conversations will start with suggested prompts on what users can ask ‌Siri‌.

Deep Integration

While there will be a standalone Siri app for back-and-forth conversations, Siri will also be deeply integrated into Apple devices at the system level. Siri will be activated the same way as today, by speaking the Siri wake word or pressing the side button on a Siri-enabled device, and it will respond to both voice and text-based requests.

Siri Capabilities

Siri will be able to do what current chatbots can do, such as searching the web with visually rich results, providing summaries, and evaluating uploaded documents. The personal assistant will still be integrated into Apple devices: Siri will replace the current Spotlight search functionality, while Apple plans to keep and expand Siri Suggestions, giving them more access to user data so they can surface more relevant prompts. Based on the rumors so far, Siri will be able to:

  • Search the web for information
  • Generate images
  • Generate content
  • Summarize information
  • Analyze uploaded files
  • Use personal data to complete tasks
  • Ingest information from emails, messages, files, and more
  • Analyze open windows and on-screen content to take action
  • Control device features and settings
  • Search for on-device content, replacing Spotlight

‌Siri‌ will also be integrated into Apple's core apps, including Mail, Messages, Apple TV, Xcode, and Photos. ‌Siri‌ will be able to search for specific images, edit photos, help with coding, make suggestions for TV shows and movies, and send emails.
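Apps already expose actions to Siri and Shortcuts through Apple's existing App Intents framework, and it is plausible (though unconfirmed by any of these rumors) that the chatbot would invoke app features through the same mechanism. A minimal sketch of a hypothetical intent; the type name, parameter, and dialog are illustrative, not from any report:

```swift
import AppIntents

// Hypothetical example: a Photos-style search action exposed via the
// App Intents framework. Nothing confirms the iOS 27 chatbot will call
// app actions through this exact mechanism.
struct FindPhotosIntent: AppIntent {
    static var title: LocalizedStringResource = "Find Photos"

    // A free-form search term the assistant could fill in from the
    // user's request, e.g. "beach photos from last summer".
    @Parameter(title: "Search Term")
    var searchTerm: String

    func perform() async throws -> some IntentResult & ProvidesDialog {
        // A real implementation would query the app's data store here.
        return .result(dialog: "Here are your photos matching \"\(searchTerm)\".")
    }
}
```

Because intents declare typed parameters and titles, an LLM-backed assistant could in principle map a natural-language request onto them, which is one way the rumored deep app integration could work without each app shipping its own AI.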

New Look

Chatbot Siri will have an updated look to go along with the dedicated app. Activating Siri will trigger a new animation that prompts the user to search or ask a question, and Bloomberg says Apple is testing a version of Siri integrated into the Dynamic Island. In Apple's test interface, a glowing Siri icon and a "searching" label appear in the Dynamic Island while Siri is processing a request; once the request completes, Siri expands into a larger translucent panel with the results. Pulling down on the panel opens a full conversation interface.

Apple may also integrate an "Ask ‌Siri‌" button into the menus of other apps, giving users a way to send content directly to ‌Siri‌ alongside a request. The iOS keyboard could get a Write with ‌Siri‌ option that surfaces Writing Tools.

Memory

Claude, ChatGPT, and Gemini can remember past conversations and interactions, retaining a memory of the user. Apple is said to be discussing how much the ‌Siri‌ chatbot will be able to remember.

Apple may limit conversational memory to protect user privacy.

Third-Party Chatbot Integrations

Apple will allow third-party AI chatbots to integrate with Siri in ‌iOS 27‌. Apple already has a partnership with OpenAI that lets ‌Siri‌ hand questions off to ChatGPT, but that integration will expand to chatbots from other companies like Google and Anthropic.

An iPhone user with the Claude or Gemini app installed will be able to send questions from ‌Siri‌ to those chatbots, similar to how the OpenAI integration works today.

iPhone users will be able to choose which services they want to use inside ‌Siri‌ through a new "Extensions" option coming to the ‌Siri‌ and Apple Intelligence section in the Settings app.

Promised iOS 18 Features

‌Apple Intelligence‌ ‌Siri‌ features that were originally planned for iOS 18 will finally be introduced in ‌iOS 27‌, with ‌Siri‌ able to use personal data and context to answer queries. ‌Siri‌ will also be able to do more in and between apps, and will be able to see what's on the user's screen. Apple promised that those features would appear before the end of 2026.

Underlying Architecture and Servers

Apple has inked a deal with Google that will see Gemini powering upcoming versions of ‌Siri‌. Apple plans to use Gemini for the ‌Siri‌ chatbot and the other ‌Siri‌ features coming in ‌iOS 27‌.

"Apple and Google have entered into a multi-year collaboration under which the next generation of Apple Foundation Models will be based on Google's Gemini models and cloud technology," the two companies said in a statement in January.

The Siri chatbot will rely on a custom AI model developed by the Google Gemini team. Bloomberg's Mark Gurman claims that the custom model is comparable to Gemini 3, and that it is more powerful than the models Apple has developed in-house.

Apple and Google are also discussing running the ‌Siri‌ chatbot on Google's servers powered by Tensor Processing Units, probably because Apple doesn't yet have the infrastructure to handle chatbot queries from billions of active devices per day.

Launch Date

Apple is planning to introduce ‌Siri‌'s chatbot capabilities when it announces ‌iOS 27‌, iPadOS 27, and ‌macOS 27‌ at the June Worldwide Developers Conference, which starts on Monday, June 8. It is still unclear which ‌Siri‌ features Apple will be ready to unveil, and some could be held for future updates.

Read More

We have a dedicated iOS 27 roundup that goes into more detail on all of the features that we might see in the ‌iOS 27‌ update.

Related Roundup: iOS 27

Top Rated Comments

HouseLannister
4 hours ago at 11:23 am
We know nothing. We heard some stuff from Gurman, who has been wrong about ALL of Apple Intelligence since launch. He even went so far as to say Prosser was wrong about iOS 26 when he was largely right. There are things we expect and things we suspect, but we know nothing.
Score: 4 Votes
uniquestevejobs
4 hours ago at 11:34 am
It's all rumors, nothing more or less. I don't expect anything.
Score: 2 Votes
GfPQqmcRKUvP
3 hours ago at 12:34 pm
I'm anticipating an absolutely horrible experience compared to using the latest frontier models from OpenAI and Anthropic. These models are changing almost weekly, and Apple has shown no willingness to push out updates to anything, even relatively simple ones, on that kind of schedule. Also, operating frontier models at the scale of Apple's user base would burn through billions of dollars of computing costs per quarter, so I think Apple is likely to use lightweight models that are fast and not too computationally intensive but are poor quality as a result. A lot of the AI skepticism around the internet is because most people mostly interact with these kinds of fast/cheap models: using the default free tier of ChatGPT vs. 5.4 Thinking or 5.4 Pro might as well be different technologies. They're not in the same ballpark.
Score: 1 Votes
titanium alloy
3 hours ago at 12:23 pm
it's sad that apple intelligence isn't performing as expected. i was one of the few that was really excited for what it promised to offer, such as deep app integration and understanding personal context… darn shame
Score: 1 Votes
3 hours ago at 12:10 pm
Things that were supposed to be featured in iOS 18 are finally being released in iOS 27? lol. And I bet by this time next year they will say iOS 28 is most likely.
Score: 1 Votes