{"id":27312,"date":"2024-04-09T20:15:05","date_gmt":"2024-04-09T14:45:05","guid":{"rendered":"https:\/\/www.kaspersky.co.in\/blog\/?p=27312"},"modified":"2024-04-09T20:18:42","modified_gmt":"2024-04-09T14:48:42","slug":"real-or-fake-image-analysis-and-provenance","status":"publish","type":"post","link":"https:\/\/www.kaspersky.co.in\/blog\/real-or-fake-image-analysis-and-provenance\/27312\/","title":{"rendered":"Watch the (verified) birdie, or new ways to recognize fakes"},"content":{"rendered":"<p>Over the past 18 months or so, we seem to have lost the ability to trust our eyes. Photoshop fakes are nothing new, of course, but the advent of generative artificial intelligence (AI) has taken fakery to a whole new level. Perhaps the first viral AI fake was the 2023 image of the Pope in a white designer puffer jacket, but since then the number of high-quality eye deceivers has skyrocketed into the many thousands. And as AI develops further, we can expect more and more convincing fake videos in the very near future.<\/p>\n<div id=\"attachment_50943\" style=\"width: 416px\" class=\"wp-caption aligncenter\"><a href=\"https:\/\/media.kasperskydaily.com\/wp-content\/uploads\/sites\/36\/2024\/04\/09200315\/real-or-fake-image-analysis-and-provenance-01.jpg\"><img loading=\"lazy\" decoding=\"async\" aria-describedby=\"caption-attachment-50943\" class=\"size-full wp-image-50943\" src=\"https:\/\/media.kasperskydaily.com\/wp-content\/uploads\/sites\/36\/2024\/04\/09200315\/real-or-fake-image-analysis-and-provenance-01.jpg\" alt=\"One of the first deepfakes to go viral worldwide: the Pope sporting a trendy white puffer jacket\" width=\"406\" height=\"500\"><\/a><p id=\"caption-attachment-50943\" class=\"wp-caption-text\">One of the first deepfakes to go viral worldwide: the Pope sporting a trendy white puffer jacket<\/p><\/div>\n<p>This will only exacerbate the already knotty problem of fake news and accompanying images. 
These might show a photo from one event and claim it\u2019s from another, put people who\u2019ve never met in the same picture, and so on.<\/p>\n<p>Image and video spoofing has a direct bearing on cybersecurity. Scammers have been using fake images and videos to trick victims into parting with their cash for years. They might send you a picture of a sad puppy they claim needs help, an image of a celebrity promoting <a href=\"https:\/\/www.kaspersky.com\/blog\/online-investment-dangerous-apps\/50057\/\" target=\"_blank\" rel=\"noopener nofollow\">some shady schemes<\/a>, or even a picture of a credit card they say belongs to someone you know. Fraudsters also use <a href=\"https:\/\/www.kaspersky.com\/blog\/pig-butchering-crypto-investment-scam\/50764\/\" target=\"_blank\" rel=\"noopener nofollow\">AI-generated images for profiles for catfishing<\/a> on dating sites and social media.<\/p>\n<p>The most sophisticated scams make use of <a href=\"https:\/\/www.kaspersky.com\/blog\/how-to-spot-and-prevent-boss-scams\/50861\/\" target=\"_blank\" rel=\"noopener nofollow\">deepfake video and audio of the victim\u2019s boss<\/a> or a relative to get them to do the scammers\u2019 bidding. Just recently, an employee of a financial institution was duped into <a href=\"https:\/\/edition.cnn.com\/2024\/02\/04\/asia\/deepfake-cfo-scam-hong-kong-intl-hnk\/index.html\" target=\"_blank\" rel=\"nofollow noopener\">transferring $25 million<\/a> to cybercrooks! They had set up a video call with the \u201cCFO\u201d and \u201ccolleagues\u201d of the victim \u2014 all deepfakes.<\/p>\n<p>So what can be done to deal with deepfakes or just plain fakes? How can they be detected? This is an extremely complex problem, but one that can be mitigated step by step \u2014 by tracing the <strong>provenance<\/strong> of the image.\n<\/p>\n<h2>Wait\u2026 haven\u2019t I seen that before?<\/h2>\n<p>\nAs mentioned above, there are different kinds of \u201cfakeness\u201d. 
Sometimes the image itself isn\u2019t fake, but it\u2019s used in a misleading way. Maybe a real photo from a warzone is passed off as being from another conflict, or a scene from a movie is presented as documentary footage. In these cases, looking for anomalies in the image itself won\u2019t help much, but you can try searching for copies of the picture online. Luckily, we\u2019ve got tools like <a href=\"https:\/\/support.google.com\/websearch\/answer\/1325808?hl=en&amp;co=GENIE.Platform=Desktop\" target=\"_blank\" rel=\"nofollow noopener\">Google Reverse Image Search<\/a> and <a href=\"https:\/\/tineye.com\/\" target=\"_blank\" rel=\"nofollow noopener\">TinEye<\/a>, which can help us do just that.<\/p>\n<p>If you\u2019ve any doubts about an image, just upload it to one of these tools and see what comes up. You might find that the same picture of a family made homeless by fire, or a group of shelter dogs, or victims of some other tragedy has been making the rounds online for years. Incidentally, when it comes to false fundraising, there are a few other <a href=\"https:\/\/www.kaspersky.com\/blog\/fake-charity-scam\/28496\/\" target=\"_blank\" rel=\"noopener nofollow\">red flags<\/a> to watch out for besides the images themselves.<\/p>\n<div id=\"attachment_50942\" style=\"width: 1506px\" class=\"wp-caption aligncenter\"><a href=\"https:\/\/media.kasperskydaily.com\/wp-content\/uploads\/sites\/36\/2024\/04\/09200325\/real-or-fake-image-analysis-and-provenance-02.jpg\"><img loading=\"lazy\" decoding=\"async\" aria-describedby=\"caption-attachment-50942\" class=\"size-full wp-image-50942\" src=\"https:\/\/media.kasperskydaily.com\/wp-content\/uploads\/sites\/36\/2024\/04\/09200325\/real-or-fake-image-analysis-and-provenance-02.jpg\" alt=\"Dog from a shelter? No, from a photo stock\" width=\"1496\" height=\"903\"><\/a><p id=\"caption-attachment-50942\" class=\"wp-caption-text\">Dog from a shelter? No, from a photo stock<\/p><\/div>\n<h2>Photoshopped? 
We\u2019ll soon know.<\/h2>\n<p>\nSince photoshopping has been around for a while, mathematicians, engineers, and image experts have long been working on ways to detect altered images automatically. Some popular methods include image metadata analysis and <a href=\"https:\/\/en.wikipedia.org\/wiki\/Error_level_analysis\" target=\"_blank\" rel=\"nofollow noopener\">error level analysis (ELA)<\/a>, which checks for JPEG compression artifacts to identify modified portions of an image. Many popular image analysis tools, such as <a href=\"https:\/\/www.fakeimagedetector.com\/\" target=\"_blank\" rel=\"nofollow noopener\">Fake Image Detector<\/a>, apply these techniques.<\/p>\n<div id=\"attachment_50941\" style=\"width: 619px\" class=\"wp-caption aligncenter\"><a href=\"https:\/\/media.kasperskydaily.com\/wp-content\/uploads\/sites\/36\/2024\/04\/09200338\/real-or-fake-image-analysis-and-provenance-03.jpg\"><img loading=\"lazy\" decoding=\"async\" aria-describedby=\"caption-attachment-50941\" class=\"size-full wp-image-50941\" src=\"https:\/\/media.kasperskydaily.com\/wp-content\/uploads\/sites\/36\/2024\/04\/09200338\/real-or-fake-image-analysis-and-provenance-03.jpg\" alt=\"Fake Image Detector warns that the Pope probably didn't wear this on Easter Sunday... Or ever\" width=\"609\" height=\"810\"><\/a><p id=\"caption-attachment-50941\" class=\"wp-caption-text\">Fake Image Detector warns that the Pope probably didn\u2019t wear this on Easter Sunday\u2026 Or ever<\/p><\/div>\n<p>With the emergence of generative AI, we\u2019ve also seen new AI-based methods for detecting generated content, but none of them are perfect. 
Here are some of the relevant developments: detection of <a href=\"https:\/\/www.hhi.fraunhofer.de\/en\/departments\/vit\/research-groups\/computer-vision-graphics\/research-topics\/detection-of-face-morphing-attacks.html\" target=\"_blank\" rel=\"nofollow noopener\">face morphing<\/a>; detection of AI-generated images and <a href=\"https:\/\/www.aiornot.com\/#pricing\" target=\"_blank\" rel=\"nofollow noopener\">determining the AI model used to generate them<\/a>; and an open <a href=\"https:\/\/huggingface.co\/Organika\/sdxl-detector\" target=\"_blank\" rel=\"nofollow noopener\">AI model<\/a> for the same purposes.<\/p>\n<p>With all these approaches, the key problem is that none gives you 100% certainty about the provenance of the image, guarantees that the image is free of modifications, or makes it possible to verify any such modifications.\n<\/p>\n<h2>WWW to the rescue: verifying content provenance<\/h2>\n<p>\nWouldn\u2019t it be great if there were an easier way for regular users to check if an image is the real deal? Imagine clicking on a picture and seeing something like: \u201cJohn took this photo with an iPhone on March 20\u201d, \u201cAnn cropped the edges and increased the brightness on March 22\u201d, \u201cPeter re-saved this image with high compression on March 23\u201d, or \u201cNo changes were made\u201d \u2014 and all such data would be impossible to fake. Sounds like a dream, right? Well, that\u2019s exactly what the Coalition for <a href=\"https:\/\/c2pa.org\/\" target=\"_blank\" rel=\"nofollow noopener\">Content Provenance and Authenticity (C2PA)<\/a> is aiming for. 
C2PA includes some major players from the computer, photography, and media industries: Canon, Nikon, Sony, Adobe, AWS, Microsoft, Google, Intel, BBC, Associated Press, and about a hundred other members \u2014 in short, all the companies that could be involved at virtually any step of an image\u2019s life, from creation to online publication.<\/p>\n<p>The <a href=\"https:\/\/c2pa.org\/specifications\/specifications\/1.3\/index.html\" target=\"_blank\" rel=\"nofollow noopener\">C2PA standard<\/a> developed by this coalition is already out there and has even reached version 1.3, and now we\u2019re starting to see the pieces of the industrial puzzle necessary to use it fall into place. Nikon is <a href=\"https:\/\/www.nikon.com\/company\/news\/2024\/0109_imaging_02.html\" target=\"_blank\" rel=\"nofollow noopener\">planning<\/a> to make C2PA-compatible cameras, and the BBC has already <a href=\"https:\/\/www.bbc.co.uk\/rd\/blog\/2024-03-c2pa-verification-news-journalism-credentials\" target=\"_blank\" rel=\"nofollow noopener\">published<\/a> its first articles with verified images.<\/p>\n<div id=\"attachment_50940\" style=\"width: 1258px\" class=\"wp-caption aligncenter\"><a href=\"https:\/\/media.kasperskydaily.com\/wp-content\/uploads\/sites\/36\/2024\/04\/09200348\/real-or-fake-image-analysis-and-provenance-04.jpg\"><img loading=\"lazy\" decoding=\"async\" aria-describedby=\"caption-attachment-50940\" class=\"wp-image-50940 size-full\" src=\"https:\/\/media.kasperskydaily.com\/wp-content\/uploads\/sites\/36\/2024\/04\/09200348\/real-or-fake-image-analysis-and-provenance-04.jpg\" alt=\"BBC talks about how images and videos in its articles are verified\" width=\"1248\" height=\"702\"><\/a><p id=\"caption-attachment-50940\" class=\"wp-caption-text\">BBC talks about how images and videos in its articles are verified<\/p><\/div>\n<p>The idea is that when responsible media outlets and big companies switch to publishing images in verified form, 
you\u2019ll be able to check the provenance of any image directly in the browser. You\u2019ll see a little \u201cverified image\u201d label, and when you click on it, a bigger window will pop up showing you which images served as the source, and what edits were made at each stage, by whom, and when, before the image appeared in the browser. You\u2019ll even be able to see all the intermediate versions of the image.<\/p>\n<div id=\"attachment_50939\" style=\"width: 2090px\" class=\"wp-caption aligncenter\"><a href=\"https:\/\/media.kasperskydaily.com\/wp-content\/uploads\/sites\/36\/2024\/04\/09200403\/real-or-fake-image-analysis-and-provenance-05.jpg\"><img loading=\"lazy\" decoding=\"async\" aria-describedby=\"caption-attachment-50939\" class=\"size-full wp-image-50939\" src=\"https:\/\/media.kasperskydaily.com\/wp-content\/uploads\/sites\/36\/2024\/04\/09200403\/real-or-fake-image-analysis-and-provenance-05.jpg\" alt=\"History of image creation and editing\" width=\"2080\" height=\"756\"><\/a><p id=\"caption-attachment-50939\" class=\"wp-caption-text\">History of image creation and editing<\/p><\/div>\n<p>This approach isn\u2019t just for cameras; it can work for other ways of creating images too. 
Services like DALL-E and Midjourney can also label their creations.<\/p>\n<div id=\"attachment_50938\" style=\"width: 1174px\" class=\"wp-caption aligncenter\"><a href=\"https:\/\/media.kasperskydaily.com\/wp-content\/uploads\/sites\/36\/2024\/04\/09200414\/real-or-fake-image-analysis-and-provenance-06.jpg\"><img loading=\"lazy\" decoding=\"async\" aria-describedby=\"caption-attachment-50938\" class=\"size-full wp-image-50938\" src=\"https:\/\/media.kasperskydaily.com\/wp-content\/uploads\/sites\/36\/2024\/04\/09200414\/real-or-fake-image-analysis-and-provenance-06.jpg\" alt=\"This was clearly created in Adobe Photoshop\" width=\"1164\" height=\"977\"><\/a><p id=\"caption-attachment-50938\" class=\"wp-caption-text\">This was clearly created in Adobe Photoshop<\/p><\/div>\n<p>The verification process is based on public-key cryptography similar to the protection used in web server certificates for establishing a secure HTTPS connection. The idea is that every image creator \u2014 be it Joe Bloggs with a particular type of camera, or Angela Smith with a Photoshop license \u2014 will need to obtain an X.509 certificate from a trusted certificate authority. This certificate can be hardwired directly into the camera at the factory, while for software products it can be issued upon activation. When processing images with provenance tracking, each new version of the file will contain a large amount of extra information: the date, time, and location of the edits, thumbnails of the original and edited versions, and so on. All this will be digitally signed by the author or editor of the image. 
This way, a verified image file will have a chain of all its previous versions, each signed by the person who edited it.<\/p>\n<div id=\"attachment_50937\" style=\"width: 742px\" class=\"wp-caption aligncenter\"><a href=\"https:\/\/media.kasperskydaily.com\/wp-content\/uploads\/sites\/36\/2024\/04\/09200425\/real-or-fake-image-analysis-and-provenance-07.jpg\"><img loading=\"lazy\" decoding=\"async\" aria-describedby=\"caption-attachment-50937\" class=\"size-full wp-image-50937\" src=\"https:\/\/media.kasperskydaily.com\/wp-content\/uploads\/sites\/36\/2024\/04\/09200425\/real-or-fake-image-analysis-and-provenance-07.jpg\" alt=\"This video contains AI-generated content\" width=\"732\" height=\"412\"><\/a><p id=\"caption-attachment-50937\" class=\"wp-caption-text\">This video contains AI-generated content<\/p><\/div>\n<p>The authors of the specification were also concerned with privacy features. Sometimes, journalists can\u2019t reveal their sources. For situations like that, there\u2019s a special type of edit called \u201credaction\u201d. This allows someone to replace some of the information about the image creator with zeros and then sign that change with their own certificate.<\/p>\n<p>To showcase the capabilities of C2PA, a collection of <a href=\"https:\/\/github.com\/c2pa-org\/public-testfiles?tab=readme-ov-file\" target=\"_blank\" rel=\"nofollow noopener\">test images and videos<\/a> was created. 
You can check out the <a href=\"https:\/\/contentcredentials.org\/verify\" target=\"_blank\" rel=\"nofollow noopener\">Content Credentials website<\/a> to see the credentials, creation history, and editing history of these images.<\/p>\n<div id=\"attachment_50936\" style=\"width: 1910px\" class=\"wp-caption aligncenter\"><a href=\"https:\/\/media.kasperskydaily.com\/wp-content\/uploads\/sites\/36\/2024\/04\/09200435\/real-or-fake-image-analysis-and-provenance-08.jpg\"><img loading=\"lazy\" decoding=\"async\" aria-describedby=\"caption-attachment-50936\" class=\"size-full wp-image-50936\" src=\"https:\/\/media.kasperskydaily.com\/wp-content\/uploads\/sites\/36\/2024\/04\/09200435\/real-or-fake-image-analysis-and-provenance-08.jpg\" alt=\"The Content Credentials website reveals the full background of C2PA images\" width=\"1900\" height=\"1286\"><\/a><p id=\"caption-attachment-50936\" class=\"wp-caption-text\">The Content Credentials website reveals the full background of C2PA images<\/p><\/div>\n<h2>Natural limitations<\/h2>\n<p>\nUnfortunately, digital signatures for images won\u2019t solve the fakes problem overnight. After all, there are already billions of images online that haven\u2019t been signed by anyone and aren\u2019t going anywhere. However, as more and more reputable information sources switch to publishing only signed images, any photo without a digital signature will start to be viewed with suspicion. Real photos and videos with timestamps and location data will be almost impossible to pass off as something else, and AI-generated content will be easier to spot.<\/p>\n","protected":false},"excerpt":{"rendered":"<p>How to tell a real photo or video from a fake, and trace its provenance. 
<\/p>\n","protected":false},"author":2722,"featured_media":27314,"comment_status":"closed","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"_acf_changed":false,"footnotes":""},"categories":[5],"tags":[1094,1095,3270,3342,2925,76,701,321,1898],"class_list":{"0":"post-27312","1":"post","2":"type-post","3":"status-publish","4":"format-standard","5":"has-post-thumbnail","7":"category-news","8":"tag-ai","9":"tag-artificial-intelligence","10":"tag-deepfake","11":"tag-fakes","12":"tag-images","13":"tag-phishing","14":"tag-scam","15":"tag-technology","16":"tag-tips"},"hreflang":[{"hreflang":"en-in","url":"https:\/\/www.kaspersky.co.in\/blog\/real-or-fake-image-analysis-and-provenance\/27312\/"},{"hreflang":"en-ae","url":"https:\/\/me-en.kaspersky.com\/blog\/real-or-fake-image-analysis-and-provenance\/22606\/"},{"hreflang":"ar","url":"https:\/\/me.kaspersky.com\/blog\/real-or-fake-image-analysis-and-provenance\/11577\/"},{"hreflang":"en-us","url":"https:\/\/usa.kaspersky.com\/blog\/real-or-fake-image-analysis-and-provenance\/29968\/"},{"hreflang":"en-gb","url":"https:\/\/www.kaspersky.co.uk\/blog\/real-or-fake-image-analysis-and-provenance\/27469\/"},{"hreflang":"es-mx","url":"https:\/\/latam.kaspersky.com\/blog\/real-or-fake-image-analysis-and-provenance\/27262\/"},{"hreflang":"es","url":"https:\/\/www.kaspersky.es\/blog\/real-or-fake-image-analysis-and-provenance\/29956\/"},{"hreflang":"it","url":"https:\/\/www.kaspersky.it\/blog\/real-or-fake-image-analysis-and-provenance\/28683\/"},{"hreflang":"ru","url":"https:\/\/www.kaspersky.ru\/blog\/real-or-fake-image-analysis-and-provenance\/37264\/"},{"hreflang":"tr","url":"https:\/\/www.kaspersky.com.tr\/blog\/real-or-fake-image-analysis-and-provenance\/12254\/"},{"hreflang":"x-default","url":"https:\/\/www.kaspersky.com\/blog\/real-or-fake-image-analysis-and-provenance\/50932\/"},{"hreflang":"fr","url":"https:\/\/www.kaspersky.fr\/blog\/real-or-fake-image-analysis-and-provenance\/21787\/"},{"href
lang":"pt-br","url":"https:\/\/www.kaspersky.com.br\/blog\/real-or-fake-image-analysis-and-provenance\/22495\/"},{"hreflang":"de","url":"https:\/\/www.kaspersky.de\/blog\/real-or-fake-image-analysis-and-provenance\/31189\/"},{"hreflang":"ja","url":"https:\/\/blog.kaspersky.co.jp\/real-or-fake-image-analysis-and-provenance\/36220\/"},{"hreflang":"ru-kz","url":"https:\/\/blog.kaspersky.kz\/real-or-fake-image-analysis-and-provenance\/27628\/"},{"hreflang":"en-au","url":"https:\/\/www.kaspersky.com.au\/blog\/real-or-fake-image-analysis-and-provenance\/33473\/"},{"hreflang":"en-za","url":"https:\/\/www.kaspersky.co.za\/blog\/real-or-fake-image-analysis-and-provenance\/33100\/"}],"acf":[],"banners":"","maintag":{"url":"https:\/\/www.kaspersky.co.in\/blog\/tag\/images\/","name":"images"},"_links":{"self":[{"href":"https:\/\/www.kaspersky.co.in\/blog\/wp-json\/wp\/v2\/posts\/27312","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.kaspersky.co.in\/blog\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.kaspersky.co.in\/blog\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.kaspersky.co.in\/blog\/wp-json\/wp\/v2\/users\/2722"}],"replies":[{"embeddable":true,"href":"https:\/\/www.kaspersky.co.in\/blog\/wp-json\/wp\/v2\/comments?post=27312"}],"version-history":[{"count":3,"href":"https:\/\/www.kaspersky.co.in\/blog\/wp-json\/wp\/v2\/posts\/27312\/revisions"}],"predecessor-version":[{"id":27316,"href":"https:\/\/www.kaspersky.co.in\/blog\/wp-json\/wp\/v2\/posts\/27312\/revisions\/27316"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/www.kaspersky.co.in\/blog\/wp-json\/wp\/v2\/media\/27314"}],"wp:attachment":[{"href":"https:\/\/www.kaspersky.co.in\/blog\/wp-json\/wp\/v2\/media?parent=27312"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.kaspersky.co.in\/blog\/wp-json\/wp\/v2\/categories?post=27312"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.kaspersky.co.in
\/blog\/wp-json\/wp\/v2\/tags?post=27312"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}