i don’t know why you worded your comment like we are in disagreement haha
samsung is forcibly inserting themselves into the chain of custody. in a world where cell phone video is finally spotlighting existing police brutality, the idea that my evidence could get thrown out because of some MBA’s idea of what constitutes a “real picture” is nightmarish.
It’s not an MBA thing, it’s a technological progress thing.
We’ve gone from photos full of “ghosts” (double exposures, light leaks, poor processing), to photos that took some skill to modify while developing them, to celluloid that could be spliced but took a lot of effort to edit, to Photoshop and video editing software that allowed compositing all sorts of stuff… and now we’re entering an age where everyone will be able to record some cell phone footage, then tell the AI to “remove the stop sign”, “remove the gun from the hand of the guy getting chased, then add one to the cop chasing them”, or “actually, turn them into dancing bears”, and the cell phone will happily oblige.
Right now, watermarking and footage certification legislation is being discussed, because there’s an ice cube’s chance in hell that Samsung or any other phone manufacturer won’t add those AI editing features and market them to oblivion.
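To make “footage certification” concrete, here’s a toy Python sketch of the general idea (not any real standard, key handling hand-waved, file contents and names made up): the device signs the bytes it captures with a key baked into its hardware, so any edit made after capture breaks the signature.

```python
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

# Pretend this keypair lives in the phone's secure hardware and the public half
# is published by the manufacturer (hypothetical setup, not how any specific
# vendor actually does it).
device_key = Ed25519PrivateKey.generate()
manufacturer_pubkey = device_key.public_key()

footage = b"raw video bytes straight off the sensor"  # stand-in for a real file
signature = device_key.sign(footage)                  # done on-device at capture time

# Anyone can later check the file against the capture-time signature:
try:
    manufacturer_pubkey.verify(signature, footage)
    print("intact: this is what the device recorded")
except InvalidSignature:
    print("tampered")

# Change a single byte and the same check fails:
try:
    manufacturer_pubkey.verify(signature, footage[:-1] + b"!")
    print("intact")
except InvalidSignature:
    print("tampered: any edit after capture breaks the signature")
```

The hard part isn’t the crypto, it’s deciding who holds the keys and what counts as a “genuine device”, which is exactly what the legislation would have to pin down.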
In this article, as a preemptive move, Samsung is claiming to “add a watermark” to modified photos, so you can tell them apart from “actual footage”… except it’s BS, because they’re only adding a metadata field, which anyone can easily strip away.
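To show how flimsy that is, here’s a minimal Pillow sketch (filenames are made up, and I’m assuming the flag is stored like any other EXIF/XMP-style tag) that re-saves just the pixels and loses every metadata field in the process:

```python
from PIL import Image

# Copy only the pixel data of the AI-edited photo into a fresh image.
# Whatever metadata field marks it as "AI edited" (the exact tag name is my
# assumption) simply doesn't survive the round trip.
edited = Image.open("edited_photo.jpg")
pixels_only = Image.new(edited.mode, edited.size)
pixels_only.putdata(list(edited.getdata()))
pixels_only.save("looks_original.jpg", quality=95)
```

A screenshot, a re-encode, or a messaging app’s recompression would likely do the same thing by accident.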
TL;DR: thanks to AI, your evidence will get thrown out unless it’s certified to originate from a genuine device and hasn’t been tampered with. Also expect a deluge of fake footage to pop up.
It’s not BS, it’s reality. Photo evidence is only as valid as its chain of custody, and that’s been true since long before AI or even Photoshop:
https://www.openculture.com/2017/08/long-before-photoshop-the-soviets-mastered-the-art-of-erasing-people-from-photographs-and-history-too.html
“it’s not an MBA thing, it’s a technological progress thing”
proceeds to describe how MBAs (Samsung marketers and business leaders) are doing this with technology
again with the acting like i disagree with you? lol