Search is key to Apple's iOS/macOS strategy

iOS 14/Big Sur is the beginning of a major convergence for Apple's software platforms.  Along with supporting iOS-native UIKit/SwiftUI apps on macOS, Big Sur also offers a translation layer, Rosetta 2, that enables running Intel apps on Apple silicon Macs.  The ideal outcome here is that folks can build apps once and have them run everywhere with adaptive user interfaces and features.

Apple has a reasonably good track record of supporting software vendors through these sorts of major architectural migrations; it's not just compatibility layers like Rosetta (which was originally built to run PowerPC apps on Intel Macs over a decade ago), but also a significant effort to provide a simple, well-supported developer experience when releasing apps on new platforms.

News also broke this year that Apple is building a search platform that will compete with Google.  Apple and Google share a similar advantage: they prioritize enabling these features at an architectural level for developers.  There are things Apple can do that make search not just an internet tool, but a great general-purpose tool that gets users straight to what they're looking for, whether in an app or on the internet, even if they ask for it via Siri.  My hope is that Apple emphasizes privacy features that afford users more control over their data than existing search systems do.

Good platform-level and third-party app support is something that really sets Apple apart from its competitors; they can build native technologies that support unique caching paradigms and client-side ML.  An app could say "here's the data I've got, and here's how users like to search for it," and have everything automatically fed into the Neural Engine on Apple silicon.  This could work for both native apps and websites via an SDK.  The more resources Apple commits to helping apps and websites adopt this functionality, the better it can build Apple Search into a universal search and analytics platform.
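Apple hasn't published such an SDK, but the closest analogue today is Core Spotlight, which already lets an app donate its content to the on-device system index.  Here's a minimal sketch of that "here's my data, here's how users search for it" pattern; the item data is made up for illustration:

```swift
import CoreSpotlight
import UniformTypeIdentifiers

// Describe a piece of app content ("here's the data I've got").
// All values here are hypothetical illustration data.
let attributes = CSSearchableItemAttributeSet(contentType: .text)
attributes.title = "Blue Bottle Coffee"
attributes.contentDescription = "A visit shared with friends in Oakland"
// Keywords hint at how users might want to search for this item.
attributes.keywords = ["coffee", "oakland", "visits"]

let item = CSSearchableItem(
    uniqueIdentifier: "visit-1234",
    domainIdentifier: "com.example.app.visits",
    attributeSet: attributes
)

// Donate the item to the on-device system index so it can
// surface in Spotlight (and, one hopes, in Apple Search someday).
CSSearchableIndex.default().indexSearchableItems([item]) { error in
    if let error = error {
        print("Indexing failed: \(error.localizedDescription)")
    }
}
```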

One technology and UX pattern that I hope Apple keeps investing in is its contextual search API in apps.  This can be seen in the new Apple Music and Messages apps, which display well-organized search results that seem to tap into metadata better, across different fields.

[Image: Search on macOS Big Sur, Apple Inc.]

A good search bar shouldn't need toggles or filters, and it should return a mixed distribution of result types in a single display, rather than requiring scrolling between different grouped categories.  When working on the UI/UX for the (now discontinued) visit-sharing app Saunter, something we decided was important was providing good contextual search results based on natural language.  This is a general industry trend, with Core Spotlight as a start, but it still needs work to get Search to the point of being a platform-level primitive on iOS and macOS, so that developers don't have to build their own (janky) implementations of in-app search.
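Core Spotlight does already offer a query API that apps can use for in-app search rather than rolling their own.  A sketch of what that looks like, using a hypothetical query against the item attributes indexed above:

```swift
import CoreSpotlight

// Search indexed items across multiple metadata fields at once.
// The query string uses Spotlight's metadata query syntax;
// "cd" makes the match case- and diacritic-insensitive.
let query = CSSearchQuery(
    queryString: "title == \"*coffee*\"cd || keywords == \"*coffee*\"cd",
    attributes: ["title", "contentDescription"]
)

var results: [CSSearchableItem] = []
query.foundItemsHandler = { items in
    results.append(contentsOf: items)
}
query.completionHandler = { error in
    if let error = error {
        print("Search failed: \(error.localizedDescription)")
    } else {
        // Each result carries the attributes requested above.
        for item in results {
            print(item.attributeSet.title ?? item.uniqueIdentifier)
        }
    }
}
query.start()
```

The appeal of a platform-level primitive like this is that the app only declares its data and fields; ranking, matching, and natural-language handling stay the system's job and improve for every app at once.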

Search is a surprisingly uniform problem in a lot of ways (and a surprisingly difficult one in others), and it can be implemented in a way that makes Apple Search groundbreaking for people used to existing search and AI systems.  What Wolfram Alpha did for science and math is something Apple is very well positioned to do with generalized search, eventually even in enterprise and healthcare.  Apple products just work, and customers know it.