Sora, OpenAI's invite-only video generation app, has become one of the most notable artificial intelligence applications of 2025. After drawing wide coverage on social media, it quickly became one of the most downloaded apps on the App Store. That popularity, however, brought an unexpected problem for security and brand integrity: numerous fake applications using the Sora name began appearing in Apple's store.
According to data compiled by Appfigures, at least fifteen fake apps appeared on the App Store immediately after the official Sora app launched. More than half used the name "Sora 2" outright, aiming to stand out in search results. Some of them were older apps that had previously been listed under different names; their developers simply renamed them to capitalize on the hype. That these apps were approved despite Apple's supposedly strict review process exposed weaknesses in the system.
Apple’s review process fails to prevent fake Sora apps
Some of the fake software managed to mislead users despite bearing no technical resemblance to the official application. Their screenshots, descriptions and logos were designed to mimic OpenAI's original branding, and developers packed their app descriptions with phrases like "AI video generator" to gain visibility in search. Although Apple's App Review team tries to catch such violations, delays in the process give opportunists a window to operate. The episode shows how intense user interest, especially around artificial intelligence apps, strains Apple's control mechanisms.
Appfigures reports that the fake "Sora" apps have amassed more than 300,000 downloads in total. That figure may look small next to the official Sora app's one million-plus downloads, but it matters for user security. More than 80,000 of those downloads came in the days immediately after OpenAI's app was released, meaning users chasing the hype unknowingly installed the wrong software. Either way, the numbers make clear that the quality control process in Apple's app store needs tightening.
Apple removed some of the fake applications after criticism mounted, yet others, such as "Sora 2 – AI Video Generator", remained available for a long time. That app racked up more than 50,000 downloads and managed to game the App Store's search algorithm. Apple's failure to intervene quickly only deepened users' concerns about security. It also remains to be seen whether OpenAI will pursue legal action over its trademark rights.
Smaller apps tried to gain visibility by working the name "Sora" into their titles in various ways. "PetReels — Sora for Pets" reached only a few hundred users, and "Viral AI Photo Maker: Vi-sora" failed to attract the attention its makers hoped for. "Sora 2 – Video Generator Ai", by contrast, drew limited but real interest with more than 6,000 downloads. These examples show how widespread the practice of copycat developers exploiting popular brands has become, and the trend now directly affects the App Store's reputation.
According to Appfigures, the fake apps quickly generated more than $160,000 in total revenue, a figure that shows how much users paid without realizing what they were buying. These revenues are believed to come almost entirely from subscriptions and in-app purchases; users apparently paid on the strength of the apps' content promises, believing them to be the genuine product. The episode is another reminder of how important user awareness is.
Apple was asked how the fake "Sora" apps were approved and when the remaining ones would be removed, but the company had not responded by the time this article was published. Its silence sparked fresh debate in technology circles. Apple is nonetheless expected to take stricter measures against similar incidents, particularly automated warning systems for detecting trademark violations.
Beyond all this, the incident clearly reveals the new risks that come with AI-powered applications. When users chase popular apps and download them without verifying their authenticity, they make life easier for malicious developers. Ultimately, informed user behavior remains the most effective line of defense.