
5 Nov 2018

Achieving Angle of Light Detection: Silicon Nanowires Emulate a Gecko’s Ears

Angular detection is difficult to accomplish with modern sensors. What could this functionality offer? And what does it have to do with gecko ears?

Researchers from Stanford University have created an experimental setup that may see future cameras and other light-detecting systems record both the intensity and the angle of incoming light.

The Problem of Angular Detection

All consumer cameras on the market use image sensors (CCD or CMOS) to record still images or video. Image capture is accomplished by recording the intensity of incoming photons.

The angle at which these photons enter the camera is not recorded. Such data, however, could be very useful for one particular application: focusing.

A camera that can record both the intensity and the angle of incoming light could use that data to focus an image in post (i.e., after the image has been taken). It could also use angular information for on-the-fly focusing via triangulation: two angle detectors separated by a known distance can determine the distance to a light source using the sine rule from trigonometry, as the sketch below illustrates.
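
As a concrete illustration, here is a minimal sketch of that triangulation, assuming two idealized detectors on a baseline of known length, each reporting the interior angle it measures to the source. The function name and the numbers in the example are illustrative only, not from the research:

```python
import math

def triangulate_distance(baseline_m, angle_left_deg, angle_right_deg):
    """Perpendicular distance from the baseline to a light source, given
    the interior angle each detector measures between the baseline and
    its line of sight to the source (sine rule / triangulation)."""
    a = math.radians(angle_left_deg)   # angle at the left detector
    b = math.radians(angle_right_deg)  # angle at the right detector
    c = math.pi - a - b                # remaining angle, at the source
    # Sine rule: (left detector -> source) / sin(b) = baseline / sin(c)
    side_left = baseline_m * math.sin(b) / math.sin(c)
    # Resolve that side into the distance perpendicular to the baseline
    return side_left * math.sin(a)

# Illustrative values: detectors 10 cm apart, source slightly off-center
print(f"{triangulate_distance(0.10, 80.0, 82.0):.2f} m")  # ~0.32 m
```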

Detecting the angle of incoming light, however, is complex and requires equipment such as multiple lenses. While a nano-sensor would be useful (as it could be grown directly on the camera's sensor), there is an issue with “sub-wavelength” detection. To better understand this problem in action, we can look to the animal kingdom and how animals detect and localize sound.

Angle of Light and Gecko Ears

Animals whose ears are spaced farther apart than typical sound wavelengths (roughly 8–30 cm) can determine the direction of incoming sound from the time difference between the sound wave reaching each ear.

For example, a sound wave that arrives at the right ear before the left ear must have originated from a direction towards the right ear. This type of position detection is only possible because sound waves take time to propagate (they travel at roughly 340 m/s in air) and because neural processing is fast enough to register the arrival at one ear before the wave reaches the other. Animals that are much smaller than these common wavelengths are said to be “sub-wavelength” and cannot use this technique to determine the direction of a sound source. Most of these animals instead determine direction using a cavity that acoustically connects the two eardrums.
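
To put a number on that time difference: for ears separated by a distance d, a distant source at angle θ from straight ahead arrives at one ear earlier than the other by roughly d·sin(θ)/c. A minimal sketch with illustrative ear spacings and angle (none of these values come from the article):

```python
import math

SPEED_OF_SOUND = 343.0  # m/s in air at room temperature

def interaural_time_difference(ear_spacing_m, source_angle_deg):
    """Approximate difference in arrival time between two ears for a
    distant source at the given angle from straight ahead."""
    theta = math.radians(source_angle_deg)
    return ear_spacing_m * math.sin(theta) / SPEED_OF_SOUND

# Human-scale ear spacing (~20 cm), source 30 degrees to the right
print(f"{interaural_time_difference(0.20, 30.0) * 1e6:.0f} us")  # ~292 us

# Gecko-scale spacing (~2 cm): the difference shrinks to ~29 us, which is
# why sub-wavelength animals need a different trick entirely
print(f"{interaural_time_difference(0.02, 30.0) * 1e6:.0f} us")
```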

When a sound wave arrives at one eardrum first, it changes the pressure in the cavity between the two eardrums, which lessens the detection capability of the other eardrum. Even though each eardrum receives a signal of essentially identical amplitude, the eardrum that detects it first suppresses the response of the other, and this difference is easily detected. One creature in particular that uses this method is the gecko, whose acoustic cavity linking both eardrums allows it to determine the direction of a sound source.



So, can this technique of coupling be used to determine the angle of incoming light with sensors that are considered “sub-wavelength”? Stanford University has just answered this question!

Nanowires and Angular Detection

Researchers from Stanford University have created an experimental setup in which they are able to determine the incoming angle of light. The setup relies on the coupling of two silicon nanowires that interfere with each other when they receive incoming photons. The two wires, each 100 nm in width and height, are much smaller than the wavelength of the incoming photons and are positioned 100 nm from each other.

When incoming photons arrive at one of the wires first, Mie scattering occurs, and the absorption capability of the second wire is reduced. Because the two wires are optically coupled, the difference between their photocurrents varies with the angle of the incoming light, so the angle can be determined from a differential current measurement.
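
To make that readout concrete, here is a minimal sketch of converting a differential photocurrent into an angle, assuming a small-angle regime in which the normalized current difference grows linearly with angle. The function name, the current values, and the calibration constant are all hypothetical, not taken from the paper:

```python
def angle_from_photocurrents(i_wire1, i_wire2, sensitivity_per_deg=0.02):
    """Estimate the angle of incidence from the two wires' photocurrents.

    Assumes the normalized differential signal (I1 - I2) / (I1 + I2) is
    roughly linear in angle near normal incidence; sensitivity_per_deg is
    a hypothetical calibration constant found by sweeping a known source."""
    contrast = (i_wire1 - i_wire2) / (i_wire1 + i_wire2)
    return contrast / sensitivity_per_deg  # degrees off normal incidence

# Hypothetical readings: wire 1 sees slightly more current, so the
# source is tilted toward it
print(f"{angle_from_photocurrents(105.0, 95.0):.1f} degrees")  # 2.5 degrees
```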

The same experiment was conducted with a wire separation of 2 µm to prove that it is the close proximity that couples the wires together; at that spacing, no coupling was observed.


Nanowires as pictured in Stanford's 2012 announcement of welding nanowires with light. Image from Stanford University.

The researchers, however, took their experiment a step further and built two angle detectors. The two detectors were separated by a known distance, and using the differential current readings from each sensor, the researchers were able to triangulate the light source and thereby determine its distance. According to their triangulation experiment, the distance to a light source can be determined with an accuracy of a centimeter within a range of 10 meters. Interestingly, this method of range finding is considerably less complex than using high-speed electronics that fire a laser beam and then time the return journey.

Potential Applications: Cameras, Machine Vision, Augmented Reality

The use of nanowire sensors for angular detection could affect camera sensors in a number of scenarios that require angular or distance detection, all without complex hardware.

For example, LiDAR systems use a rotating mirror and a laser, along with high-speed electronics, to time the laser pulse's return journey. While this method is reliable and already in use, it generally requires bulky parts (such as motors and mirrors) and has a minimum detection distance. The back-of-the-envelope numbers below show why the timing electronics must be so fast.
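
A minimal sketch of the time-of-flight arithmetic (standard physics, not from the article): light covers one centimeter of round trip in about 67 picoseconds, so centimeter accuracy demands picosecond-scale timing.

```python
SPEED_OF_LIGHT = 299_792_458.0  # m/s

def round_trip_time(distance_m):
    """Time for a laser pulse to travel to a target and back."""
    return 2.0 * distance_m / SPEED_OF_LIGHT

# Timer resolution needed to resolve a 1 cm change in distance
print(f"{round_trip_time(0.01) * 1e12:.0f} ps")  # ~67 ps
# Total flight time at 10 m, the range quoted for the nanowire experiment
print(f"{round_trip_time(10.0) * 1e9:.0f} ns")   # ~67 ns
```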

Nanowires, however, may not have a minimum detection distance, because they exploit the physical behavior of the photons themselves rather than a CPU and a counter. A LiDAR system built around nanowires would still need a rotating mirror and a laser, but there would be no need for a CPU with a high-speed timer, and the results could be read by even the simplest microcontroller. A fixed laser could also be used, acting as a laser range-finder; in that case, the entire sensor-and-laser setup could easily fit into a single IC package.

Angular detection, as stated before, could be useful in photography. While professional photographers typically use manual focus, most novice users rely on autofocus, which can be achieved using several methods. A simple example is contrast detection: an object to be focused should show a sharp change in contrast against its background, so the lens is adjusted until the largest change is detected, at which point the camera considers the object in focus (see the sketch below).
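
A minimal sketch of that contrast-detection loop, assuming a hypothetical camera interface (`set_focus_position`, `capture_frame`) and using the mean squared difference between adjacent pixels as a stand-in sharpness score:

```python
def sharpness(image):
    """Crude contrast score: mean squared difference between horizontally
    adjacent pixels. Sharper focus produces stronger edges."""
    total, count = 0.0, 0
    for row in image:
        for a, b in zip(row, row[1:]):
            total += (a - b) ** 2
            count += 1
    return total / count

def autofocus(camera, positions):
    """Sweep hypothetical lens positions and keep the sharpest one.
    'camera' is assumed to expose set_focus_position() and capture_frame(),
    the latter returning a 2D list of pixel intensities."""
    best_pos, best_score = None, -1.0
    for pos in positions:
        camera.set_focus_position(pos)
        score = sharpness(camera.capture_frame())
        if score > best_score:
            best_pos, best_score = pos, score
    camera.set_focus_position(best_pos)
    return best_pos
```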

Angular detection sensors, however, could provide angle information from which the subject's distance follows directly, telling the camera exactly how far away the subject is. Instead of guessing whether the image is in focus, the camera could set the focus directly (focus settings are often expressed as a distance to the subject). This could even provide a path towards lens-less cameras.
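
Mapping a measured subject distance to a focus setting is then straightforward optics: for an idealized thin lens, 1/f = 1/d_o + 1/d_i. A minimal sketch (the focal length and subject distance are arbitrary example values):

```python
def lens_to_sensor_distance(focal_length_mm, subject_distance_mm):
    """Thin-lens equation 1/f = 1/d_o + 1/d_i, solved for d_i: the
    lens-to-sensor distance that brings the subject into focus."""
    return 1.0 / (1.0 / focal_length_mm - 1.0 / subject_distance_mm)

# A 50 mm lens focusing on a subject 2 m away
print(f"{lens_to_sensor_distance(50.0, 2000.0):.2f} mm")  # ~51.28 mm
```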

This functionality also has ramifications for machine vision, providing additional data for processors to use in, for example, autonomous vehicle guidance. Augmented reality, which relies on sensor data to overlay graphics on the existing environment, could likewise see a revolution as more advanced focusing and distance detection allow for more immersive augmented experiences.

You can read more about the research in the journal Nature Nanotechnology.


The featured image includes an image of nanowires, used courtesy of Stanford University.
