{"id":27035,"date":"2024-02-05T13:27:02","date_gmt":"2024-02-05T18:27:02","guid":{"rendered":"https:\/\/www.kaspersky.co.in\/blog\/ambient-light-sensor-privacy\/27035\/"},"modified":"2024-03-21T15:38:46","modified_gmt":"2024-03-21T10:08:46","slug":"ambient-light-sensor-privacy","status":"publish","type":"post","link":"https:\/\/www.kaspersky.co.in\/blog\/ambient-light-sensor-privacy\/27035\/","title":{"rendered":"Ambient light sensor as a spy tool"},"content":{"rendered":"<p><a href=\"https:\/\/www.science.org\/doi\/10.1126\/sciadv.adj3608\" target=\"_blank\" rel=\"nofollow noopener\">An article in Science Advances<\/a> published in mid-January describes a non-trivial method of snooping on smartphone users through an ambient light sensor. All smartphones and tablets have this component built-in \u2014 as do many laptops and TVs. Its primary task is to sense the amount of ambient light in the device\u2019s surroundings and to adjust the brightness of the display accordingly.<\/p>\n<p>But first we need to explain why a threat actor would use a tool ill-suited for capturing footage instead of the target device\u2019s regular camera. The reason is that such \u201cill-suited\u201d sensors are usually totally unprotected. Let\u2019s imagine an attacker tricked a user into installing a malicious program on their smartphone. The malware will struggle to gain access to oft-targeted components, such as the microphone or camera. But to the light sensor? Easy as pie.<\/p>\n<p>So, the researchers proved that this ambient light sensor can be used instead of a camera: for example, to get a snapshot of the user\u2019s hand entering a PIN on a virtual keyboard. In theory, by analyzing such data, it\u2019s possible to reconstruct the password itself. 
This post explains the ins and outs in plain language.<\/p>\n<div id=\"attachment_50474\" style=\"width: 1107px\" class=\"wp-caption aligncenter\"><a href=\"https:\/\/media.kasperskydaily.com\/wp-content\/uploads\/sites\/36\/2024\/02\/05235732\/ambient-light-sensor-privacy-01.jpg\"><img loading=\"lazy\" decoding=\"async\" aria-describedby=\"caption-attachment-50474\" class=\"size-full wp-image-50474\" src=\"https:\/\/media.kasperskydaily.com\/wp-content\/uploads\/sites\/36\/2024\/02\/05235732\/ambient-light-sensor-privacy-01.jpg\" alt=\"\" width=\"1097\" height=\"413\"><\/a><p id=\"caption-attachment-50474\" class=\"wp-caption-text\">\u201cTaking shots\u201d with a light sensor. <a href=\"https:\/\/www.science.org\/doi\/epdf\/10.1126\/sciadv.adj3608\" target=\"_blank\" rel=\"noopener nofollow\">Source<\/a><\/p><\/div>\n<p>A light sensor is a rather primitive piece of technology. It\u2019s a light-sensitive photocell for measuring the brightness of ambient light several times per second. Digital cameras use very similar (albeit smaller) light sensors, but they contain many millions of them. The lens projects an image onto this photocell matrix, the brightness of each element is measured, and the result is a digital photograph. Thus, you could describe a light sensor as the most primitive digital camera there is: its resolution is exactly one pixel. How could such a thing ever capture what\u2019s going on around the device?<\/p>\n<p>The researchers used the <a href=\"https:\/\/en.wikipedia.org\/wiki\/Helmholtz_reciprocity\" target=\"_blank\" rel=\"nofollow noopener\">Helmholtz reciprocity principle<\/a>, formulated back in the mid-19<sup>th<\/sup> century. This principle is widely used in computer graphics, for example, where it greatly simplifies calculations. 
In 2005, the principle formed the basis of the proposed method of <a href=\"https:\/\/graphics.stanford.edu\/papers\/dual_photography\/DualPhotography.pdf\" target=\"_blank\" rel=\"nofollow noopener\">dual photography<\/a>. Let\u2019s take an illustration from this paper to help explain:<\/p>\n<div id=\"attachment_50475\" style=\"width: 823px\" class=\"wp-caption aligncenter\"><a href=\"https:\/\/media.kasperskydaily.com\/wp-content\/uploads\/sites\/36\/2024\/02\/05235739\/ambient-light-sensor-privacy-02.jpg\"><img loading=\"lazy\" decoding=\"async\" aria-describedby=\"caption-attachment-50475\" class=\"size-full wp-image-50475\" src=\"https:\/\/media.kasperskydaily.com\/wp-content\/uploads\/sites\/36\/2024\/02\/05235739\/ambient-light-sensor-privacy-02.jpg\" alt=\"On the left is a real photograph of the object. On the right is an image calculated from the point of view of the light source.\" width=\"813\" height=\"480\"><\/a><p id=\"caption-attachment-50475\" class=\"wp-caption-text\">On the left is a real photograph of the object. On the right is an image calculated from the point of view of the light source. <a href=\"https:\/\/graphics.stanford.edu\/papers\/dual_photography\/DualPhotography.pdf\" target=\"_blank\" rel=\"noopener nofollow\">Source<\/a><\/p><\/div>\n<p>Imagine you\u2019re photographing objects on a table. A lamp shines on the objects, the reflected light hits the camera lens, and the result is a photograph. Nothing out of the ordinary. In the illustration above, the image on the left is precisely that \u2014 a regular photo. Next, in greatly simplified terms, the researchers began to alter the brightness of the lamp and record the changes in illumination. As a result, they collected enough information to reconstruct the image on the right \u2014 taken as if from the point of view of the lamp. 
There\u2019s no camera in this position and never was, but based on the measurements, the scene was successfully reconstructed.<\/p>\n<p>Most interesting of all is that this trick doesn\u2019t even require a camera. A simple photoresistor will do\u2026 just like the one in an ambient light sensor. A photoresistor (or \u201csingle-pixel camera\u201d) measures changes in the light reflected from objects, and this data is used to construct a photograph of them. The quality of the image will be low, and many measurements must be taken \u2014 numbering in the hundreds or thousands.<\/p>\n<div id=\"attachment_50476\" style=\"width: 590px\" class=\"wp-caption aligncenter\"><a href=\"https:\/\/media.kasperskydaily.com\/wp-content\/uploads\/sites\/36\/2024\/02\/05235750\/ambient-light-sensor-privacy-03.jpg\"><img loading=\"lazy\" decoding=\"async\" aria-describedby=\"caption-attachment-50476\" class=\"wp-image-50476 size-full\" src=\"https:\/\/media.kasperskydaily.com\/wp-content\/uploads\/sites\/36\/2024\/02\/05235750\/ambient-light-sensor-privacy-03.jpg\" alt=\"Experimental setup\" width=\"580\" height=\"421\"><\/a><p id=\"caption-attachment-50476\" class=\"wp-caption-text\">Experimental setup: a Samsung Galaxy View tablet and a mannequin hand. <a href=\"https:\/\/www.science.org\/doi\/epdf\/10.1126\/sciadv.adj3608\" target=\"_blank\" rel=\"noopener nofollow\">Source<\/a><\/p><\/div>\n<p>Let\u2019s return to the study and the light sensor. The authors of the paper used a fairly large Samsung Galaxy View tablet with a 17-inch display. Various patterns of black and white rectangles were displayed on the tablet\u2019s screen. A mannequin was positioned facing the screen in the role of a user entering something on the on-screen keyboard. The light sensor captured changes in brightness. After several hundred measurements like this, an image of the mannequin\u2019s hand was produced. 
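<\/p>
<p>To get a feel for why a stream of one-pixel readings can add up to a picture, the measurement principle can be reduced to a toy linear-algebra problem: every displayed pattern yields one scalar sensor reading, and enough such pattern\u2013reading pairs pin down the unknown scene. Below is a minimal numpy sketch of this idea; the scene size, the random patterns, and the noise level are illustrative assumptions, not the setup from the paper.<\/p>

```python
# Toy single-pixel camera: the scene is an unknown 8x8 brightness grid,
# the sensor reports one number per displayed pattern (total reflected
# light), and the scene is recovered by solving the linear system.
# Illustrative sketch only; not the pipeline from the paper.
import numpy as np

rng = np.random.default_rng(0)
side = 8
scene = np.zeros((side, side))
scene[2:6, 3:5] = 1.0                 # a simple bright patch as the target
x_true = scene.ravel()

n_pix = side * side                   # 64 unknowns
n_meas = 2 * n_pix                    # 128 pattern-and-reading pairs
patterns = rng.integers(0, 2, size=(n_meas, n_pix)).astype(float)

# Each measurement: one pattern is shown, the sensor sums reflected light.
noise = rng.normal(0.0, 0.01, n_meas)
y = patterns @ x_true + noise

# Least squares inverts the measurements back into an image.
x_hat, *_ = np.linalg.lstsq(patterns, y, rcond=None)
print(np.abs(x_hat - x_true).max())   # reconstruction error stays small
```

<p>In this idealized model, 128 scalar readings recover the 64-pixel scene almost exactly; a real light sensor is far noisier and slower, which is why the researchers needed several hundred measurements and machine-learning cleanup to get a usable image.<\/p>
<p>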
That is, the authors applied the Helmholtz reciprocity principle to get a photograph of the hand, taken as if from the point of view of the screen. The researchers effectively turned the tablet display into an extremely low-quality camera.<\/p>\n<div id=\"attachment_50477\" style=\"width: 718px\" class=\"wp-caption aligncenter\"><a href=\"https:\/\/media.kasperskydaily.com\/wp-content\/uploads\/sites\/36\/2024\/02\/05235756\/ambient-light-sensor-privacy-04.jpg\"><img loading=\"lazy\" decoding=\"async\" aria-describedby=\"caption-attachment-50477\" class=\"size-full wp-image-50477\" src=\"https:\/\/media.kasperskydaily.com\/wp-content\/uploads\/sites\/36\/2024\/02\/05235756\/ambient-light-sensor-privacy-04.jpg\" alt=\"Comparing real objects in front of the tablet with what the light sensor captured.\" width=\"708\" height=\"422\"><\/a><p id=\"caption-attachment-50477\" class=\"wp-caption-text\">Comparing real objects in front of the tablet with what the light sensor captured. <a href=\"https:\/\/www.science.org\/doi\/epdf\/10.1126\/sciadv.adj3608\" target=\"_blank\" rel=\"noopener nofollow\">Source<\/a><\/p><\/div>\n<p>True, not the sharpest image. The above-left picture shows what needed to be captured: in one case, the open palm of the mannequin; in the other, how the \u201cuser\u201d appears to tap something on the display. The images in the center are a reconstructed \u201cphoto\u201d at 32\u00d732 pixel resolution, in which almost nothing is visible \u2014 too much noise in the data. But with the help of machine-learning algorithms, the noise was filtered out to produce the images on the right, where we can distinguish one hand position from the other. The authors of the paper give other examples of typical gestures that people make when using a tablet touchscreen. 
Or rather, examples of how they managed to \u201cphotograph\u201d them:<\/p>\n<div id=\"attachment_50478\" style=\"width: 1216px\" class=\"wp-caption aligncenter\"><a href=\"https:\/\/media.kasperskydaily.com\/wp-content\/uploads\/sites\/36\/2024\/02\/05235805\/ambient-light-sensor-privacy-05.jpg\"><img loading=\"lazy\" decoding=\"async\" aria-describedby=\"caption-attachment-50478\" class=\"size-full wp-image-50478\" src=\"https:\/\/media.kasperskydaily.com\/wp-content\/uploads\/sites\/36\/2024\/02\/05235805\/ambient-light-sensor-privacy-05.jpg\" alt=\"Capturing various hand positions using a light sensor.\" width=\"1206\" height=\"831\"><\/a><p id=\"caption-attachment-50478\" class=\"wp-caption-text\">Capturing various hand positions using a light sensor. <a href=\"https:\/\/www.science.org\/doi\/epdf\/10.1126\/sciadv.adj3608\" target=\"_blank\" rel=\"noopener nofollow\">Source<\/a><\/p><\/div>\n<p>So can we apply this method in practice? Is it possible to monitor how the user interacts with the touchscreen of a tablet or smartphone? How they enter text on the on-screen keyboard? How they enter credit card details? How they open apps? Fortunately, it\u2019s not that straightforward. Note the captions above the \u201cphotographs\u201d in the illustration: they show how slowly this method works. In the best-case scenario, the researchers were able to reconstruct a \u201cphoto\u201d of the hand in just over three minutes. The image in the previous illustration took 17 minutes to capture. Real-time surveillance at such speeds is out of the question. It\u2019s also clear now why most of the experiments featured a mannequin\u2019s hand: a human being simply can\u2019t hold their hand motionless for that long.<\/p>\n<p>But that doesn\u2019t rule out the possibility of the method being improved. 
Let\u2019s ponder the worst-case scenario: if each hand image can be obtained not in three minutes, but in, say, half a second; if the on-screen output is not some strange black-and-white patterns, but a video or set of pictures or animation of interest to the user; and if the user does something worth spying on \u2014 then the attack would make sense. But even then \u2014 not much sense. All the researchers\u2019 efforts are undermined by the fact that if an attacker managed to slip malware onto the victim\u2019s device, there are many easier ways to trick them into entering a password or credit card number. Perhaps for the first time in covering such papers (examples: <a href=\"https:\/\/www.kaspersky.com\/blog\/side-eye-attack\/49361\/\" target=\"_blank\" rel=\"noopener nofollow\">one<\/a>, <a href=\"https:\/\/www.kaspersky.com\/blog\/led-data-exfiltration\/48523\/\" target=\"_blank\" rel=\"noopener nofollow\">two<\/a>, <a href=\"https:\/\/www.kaspersky.com\/blog\/pc-speaker-data-exfiltration\/47737\/\" target=\"_blank\" rel=\"noopener nofollow\">three<\/a>, <a href=\"https:\/\/www.kaspersky.com\/blog\/wi-peep-wireless-localization\/46611\/\" target=\"_blank\" rel=\"noopener nofollow\">four<\/a>), we are struggling even to imagine a real-life scenario for such an attack.<\/p>\n<p>All we can do is marvel at the beauty of the proposed method. This research serves as another reminder that the seemingly familiar, inconspicuous devices we are surrounded by can harbor unusual, lesser-known functionalities. That said, for those concerned about this potential violation of privacy, the solution is simple. Such low-quality images are due to the fact that the light sensor takes measurements quite infrequently: 10\u201320 times per second. The output data also lacks precision. However, that\u2019s only relevant for turning the sensor into a camera. For the main task \u2014 measuring ambient light \u2014 this rate is even too high. 
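<\/p>
<p>As a rough illustration (the rates and lux values here are assumptions for the sketch, not a real platform API), such rate-limiting could be as simple as averaging raw readings before they ever reach an app:<\/p>

```python
# Hypothetical sketch: coarsen a 20 Hz light-sensor stream to 5 Hz by
# averaging every four raw readings. Plenty for brightness control,
# far too little for image reconstruction. Values are illustrative.
import numpy as np

raw_rate_hz = 20
target_rate_hz = 5
factor = raw_rate_hz // target_rate_hz   # 4 raw samples per delivered reading

raw = np.array([102, 98, 101, 99, 240, 250, 245, 248], dtype=float)  # lux
coarse = raw.reshape(-1, factor).mean(axis=1)
print(coarse)   # two averaged readings instead of eight raw ones
```

<p>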
We can \u201ccoarsen\u201d the data even more \u2014 transmitting it, say, five times per second instead of 20. For matching the screen brightness to the level of ambient light, this is more than enough. But spying through the sensor \u2014 already improbable \u2014 would become impossible. Perhaps for the best.<\/p>\n","protected":false},"excerpt":{"rendered":"<p>A fresh study of some unexpected properties of a standard feature of all modern smartphones and tablets. <\/p>\n","protected":false},"author":665,"featured_media":27037,"comment_status":"closed","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"_acf_changed":false,"footnotes":""},"categories":[2036,2609,2610],"tags":[43,706,45],"class_list":{"0":"post-27035","1":"post","2":"type-post","3":"status-publish","4":"format-standard","5":"has-post-thumbnail","7":"category-business","8":"category-enterprise","9":"category-smb","10":"tag-privacy","11":"tag-research","12":"tag-smartphones"},"hreflang":[{"hreflang":"en-in","url":"https:\/\/www.kaspersky.co.in\/blog\/ambient-light-sensor-privacy\/27035\/"},{"hreflang":"en-ae","url":"https:\/\/me-en.kaspersky.com\/blog\/ambient-light-sensor-privacy\/22347\/"},{"hreflang":"en-us","url":"https:\/\/usa.kaspersky.com\/blog\/ambient-light-sensor-privacy\/29704\/"},{"hreflang":"en-gb","url":"https:\/\/www.kaspersky.co.uk\/blog\/ambient-light-sensor-privacy\/27202\/"},{"hreflang":"es-mx","url":"https:\/\/latam.kaspersky.com\/blog\/ambient-light-sensor-privacy\/27073\/"},{"hreflang":"it","url":"https:\/\/www.kaspersky.it\/blog\/ambient-light-sensor-privacy\/28576\/"},{"hreflang":"ru","url":"https:\/\/www.kaspersky.ru\/blog\/ambient-light-sensor-privacy\/36932\/"},{"hreflang":"tr","url":"https:\/\/www.kaspersky.com.tr\/blog\/ambient-light-sensor-privacy\/12101\/"},{"hreflang":"x-default","url":"https:\/\/www.kaspersky.com\/blog\/ambient-light-sensor-privacy\/50473\/"},{"hreflang":"fr","url":"https:\/\/www.kaspersky.fr\/blog\/ambient-light-sensor-p
rivacy\/21582\/"},{"hreflang":"pt-br","url":"https:\/\/www.kaspersky.com.br\/blog\/ambient-light-sensor-privacy\/22294\/"},{"hreflang":"de","url":"https:\/\/www.kaspersky.de\/blog\/ambient-light-sensor-privacy\/30980\/"},{"hreflang":"ru-kz","url":"https:\/\/blog.kaspersky.kz\/ambient-light-sensor-privacy\/27422\/"},{"hreflang":"en-au","url":"https:\/\/www.kaspersky.com.au\/blog\/ambient-light-sensor-privacy\/33219\/"},{"hreflang":"en-za","url":"https:\/\/www.kaspersky.co.za\/blog\/ambient-light-sensor-privacy\/32843\/"}],"acf":[],"banners":"","maintag":{"url":"https:\/\/www.kaspersky.co.in\/blog\/tag\/smartphones\/","name":"smartphones"},"_links":{"self":[{"href":"https:\/\/www.kaspersky.co.in\/blog\/wp-json\/wp\/v2\/posts\/27035","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.kaspersky.co.in\/blog\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.kaspersky.co.in\/blog\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.kaspersky.co.in\/blog\/wp-json\/wp\/v2\/users\/665"}],"replies":[{"embeddable":true,"href":"https:\/\/www.kaspersky.co.in\/blog\/wp-json\/wp\/v2\/comments?post=27035"}],"version-history":[{"count":3,"href":"https:\/\/www.kaspersky.co.in\/blog\/wp-json\/wp\/v2\/posts\/27035\/revisions"}],"predecessor-version":[{"id":27198,"href":"https:\/\/www.kaspersky.co.in\/blog\/wp-json\/wp\/v2\/posts\/27035\/revisions\/27198"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/www.kaspersky.co.in\/blog\/wp-json\/wp\/v2\/media\/27037"}],"wp:attachment":[{"href":"https:\/\/www.kaspersky.co.in\/blog\/wp-json\/wp\/v2\/media?parent=27035"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.kaspersky.co.in\/blog\/wp-json\/wp\/v2\/categories?post=27035"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.kaspersky.co.in\/blog\/wp-json\/wp\/v2\/tags?post=27035"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}