Apple is preparing to take the technology stage once again with this year’s WWDC event. This time, however, the gap between expectations and reality may be more noticeable. According to Bloomberg’s Mark Gurman, citing sources inside Apple, the event is expected to have only a limited impact on artificial intelligence. Apple, which trails giants such as OpenAI and Google in the industry, is taking steps to close this gap, but those steps appear limited for now.
The biggest innovation for developers is that Apple will open its on-device foundation models to the outside world. These models, with roughly 3 billion parameters, currently power functions such as text summarization and autocorrect. Developers will soon be able to integrate them into their own applications, a sign that Apple is moving toward a more open position in the artificial intelligence ecosystem.
Apple improves its artificial intelligence platform with in-house models
However, these on-device models still have far more limited capabilities than large cloud-based systems. Models running in OpenAI’s and Google’s vast data centers can handle more complex and creative tasks. Apple’s approach relies on on-device processing because it prioritizes privacy, and that choice means some concessions in functionality.
Other artificial intelligence functions expected at WWDC 2025 are also notable but not groundbreaking. Among them are a new power-saving mode and a redesigned translation app integrated with Siri and AirPods. Some features in apps such as Safari and Photos are planned to be labeled “AI-powered,” but for now it is unclear to what extent these labels reflect real artificial intelligence functionality.
Apple is being more cautious because of artificial intelligence projects it previously announced before they were ready. According to internal sources, some promised Apple Intelligence features have still not shipped. This year, some projects were deliberately postponed to avoid repeating the same mistake and to manage expectations in a more balanced way.
Behind the scenes, however, Apple is developing larger models: versions with 3B, 7B, 33B and 150B parameters are in the testing phase. The largest model in particular is said to rival the latest versions of ChatGPT in quality. Nevertheless, these models are being kept confidential for now because of hallucinations and content-accuracy problems.
Apple’s future plans cover not only user-facing applications but also developer tools. A rich text editor is coming to the SwiftUI framework. In addition, Swift Assist, the AI-powered code completion tool Apple announced last year, is back on the agenda. It is not yet clear when the Anthropic-backed version of this tool will be released.
Nevertheless, Apple is trying to compensate for its artificial intelligence shortfall with these developer-focused additions. AI integration will become more prominent in developer tools such as Xcode, supporting many workflows, from user interface testing to writing code. This could increase productivity across the Apple ecosystem.