Apple has an independent group called App Review that is responsible for moderating apps and updates to the App Store. Google likewise enforces strict policies for analyzing and moderating apps before they are published on the Google Play store.
According to CNBC, Apple's App Review group is responsible for moderating apps and updates to the App Store. App Review's moderation policy is set by the Executive Review Board (ERB). The ERB was created after the App Store launched in 2008, and it makes the final decision on whether an app can remain on the App Store or must be banned or removed.
In addition to its automated systems, App Review also frequently moderates apps manually before they are released.
Every day, App Review employees evaluate between 50 and 100 apps, spending only a few minutes on each one. Apple says that on average about 40% of apps or updates are rejected, for reasons such as phishing, privacy violations, and the like.
On Google's side, the company likewise applies strict policies when analyzing and moderating apps before they are launched on the Google Play store.
Google Play uses a variety of methods to check whether developers are complying with its policies, which fall into three areas: checking developer compliance with Google policies, examining the app itself, and reviewing user ratings.
The process is as follows: Google Play analyzes information from the developer's Google account, their actions and history, payment details, device information, and the contents of the app itself. If anything looks suspicious, Google Play reviews the app manually. In addition, Google Play offers Google Play Protect (GPP), a feature that checks an app and the user's device before the user downloads the app. If GPP detects that an app is malicious or violates Google's policies, it notifies the user to remove it from the device. GPP continues to protect users after an app has been installed by periodically scanning the device.
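Google's actual pipeline is proprietary, but the two-stage flow the article describes (automated signal checks that escalate to manual review, plus a Play Protect-style rescan of installed apps) can be sketched in simplified form. All class names, fields, and thresholds below are hypothetical illustrations, not Google's real implementation:

```python
from dataclasses import dataclass

# Hypothetical submission record; the fields mirror the signals the
# article lists (developer account/history/payments, device info,
# and an automated scan of the app itself).
@dataclass
class Submission:
    developer_history_clean: bool  # account, actions, history, payments
    device_info_consistent: bool   # device information signals
    in_app_scan_clean: bool        # automated check inside the app

def triage(sub: Submission) -> str:
    """Stage 1: if every automated signal is clean, the app proceeds;
    anything suspicious is escalated to a human reviewer."""
    if (sub.developer_history_clean
            and sub.device_info_consistent
            and sub.in_app_scan_clean):
        return "publish"
    return "manual_review"

def periodic_rescan(installed_apps: dict) -> list:
    """Stage 2 (Play Protect-style): periodically re-scan installed
    apps and flag any that now appear malicious, so the user can be
    notified to remove them."""
    return [name for name, is_malicious in installed_apps.items()
            if is_malicious]
```

For example, a submission with a suspicious payment history would be routed to `manual_review` rather than published, and a later rescan that flags an installed app would surface it for removal.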