Apple said it would take more time to collect feedback and improve its proposed child safety features after the system drew criticism on privacy and other grounds from both inside and outside the company.
Apple's promise last month to check customer phones and computers for child sex abuse images sparked a global backlash from a wide range of rights groups, with employees also criticising the plan internally.
The Electronic Frontier Foundation (EFF), at the time, branded it "a backdoor into its data storage system and its messaging system".
"We've said it before, and we'll say it again now: It's impossible to build a client-side scanning system that can only be used for sexually explicit images sent or received by children," India McKinney and Erica Portnoy of the EFF wrote.
"Apple should make the right decision: Keep these backdoors off of users' devices."
Further to that, 31 organisations and nearly 9,000 individuals signed an open letter to Apple demanding that it stop the deployment immediately and re-affirm its commitment to privacy and encryption.
Critics also argued that the feature could be exploited by repressive governments looking to find other material for censorship or arrests, and that it would be impossible for outside researchers to determine whether Apple was checking only a small set of on-device content.
Apple countered that it would allow security researchers to verify its claims, but the company said it would take more time to make changes to the system.
"Based on feedback from customers, advocacy groups, researchers and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features," the company said in a statement.
Matthew Green, a cybersecurity researcher at Johns Hopkins University who had criticised Apple's move, said the delay was "promising."
Green said on Twitter that Apple should "be clear about why you’re scanning and what you’re scanning. Going from scanning nothing (but email attachments) to scanning everyone’s private photo library was an enormous delta. You need to justify escalations like this."
Apple had been playing defense on the plan for weeks, and had already offered a series of explanations and documents to show that the risks of false detections were low.
It had planned to roll out the feature for iPhones, iPads, and Macs with software updates later this year in the United States.
The feature was then to be rolled out to other markets in accordance with the laws of each country in which Apple operated.