The ‘Big’ reason why you must carefully read Facebook and Instagram’s terms and conditions


After years of training its generative AI models on billions of public images from Facebook and Instagram, Meta is reportedly seeking access to billions of photos users haven't publicly uploaded, sparking fresh privacy debates. While the social media giant explicitly states it is not currently training its AI models on these private photos, the company has declined to clarify whether it might do so in the future or what rights it will hold over these images, a report has said.

The new initiative, first reported by TechCrunch on Friday (June 27), sees Facebook users encountering pop-up messages when attempting to post to Stories. These prompts ask users to opt into "cloud processing," which would allow Facebook to "select media from your camera roll and upload it to our cloud on a regular basis." The stated purpose is to generate "ideas like collages, recaps, AI restyling or themes like birthdays or graduations."

The report notes that by agreeing to this feature, users also consent to Meta's AI terms, which permit the analysis of "media and facial features" from these unpublished photos, alongside metadata such as creation dates and the presence of other people or objects. Users also grant Meta the right to "retain and use" this personal information.

Meta used public, not private, data to train its generative AI models

According to The Verge, Meta recently acknowledged that it used data from all public content published on Facebook and Instagram since 2007 to train its generative AI models. Although the company stated it only used public posts from adult users over 18, it has remained vague about the precise definition of "public" and what constituted an "adult user" in 2007.

Meta public affairs manager Ryan Daniels reiterated to the publication that this new "cloud processing" feature is not currently being used to train the company's AI models. "[The story by the publication] implies we are currently training our AI models with these photos, which we aren't. This test doesn't use people's photos to improve or train our AI models," Maria Cubeta, a Meta comms manager, told The Verge. Cubeta also described the feature as "very early," innocuous, and entirely opt-in, stating, "Camera roll media may be used to improve these suggestions, but are not used to improve AI models in this test."

Furthermore, while Meta said that opting in grants permission to retrieve only 30 days' worth of camera roll data at a time, the company's own terms suggest some data retention may be longer: "Camera roll suggestions based on themes, such as pets, weddings and graduations, may include media that is older than 30 days."
