I am very new to the Android Camera API. My question is a bit similar to this one, but with a different intent.
In astrophotography, long exposures are very common for capturing faint Deep Sky Objects (DSOs) like nebulae, galaxies and star clusters. But when we take sky shots with long exposures (say 30 s), the stars appear as lines (star trails) instead of points, due to the continuous rotation of the Earth. Astrophotography therefore depends heavily on tracking mounts for cameras and telescopes, which cancel this effect by continuously rotating (after polar alignment) in the direction opposite to the Earth's rotation.
I am trying to find out whether it's possible to develop an algorithm that achieves this "de-rotation" in software instead.
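To make the idea concrete, here is a minimal sketch of the geometry I have in mind. It assumes the celestial pole projects to a known pixel position `(cx, cy)` in the frame; `derotate` and `SIDEREAL_RATE_DEG_PER_SEC` are names I made up for illustration, not part of any Android API:

```java
// Hypothetical de-rotation geometry: map a pixel captured t seconds after
// the reference frame back to where it was in the reference frame, by
// rotating about the projected celestial pole (cx, cy).
public class Derotation {
    // The sky rotates 360 degrees per sidereal day (~86164.1 s).
    static final double SIDEREAL_RATE_DEG_PER_SEC = 360.0 / 86164.1;

    static double[] derotate(double x, double y,
                             double cx, double cy, double tSeconds) {
        // Rotate by the negative of the accumulated sky rotation.
        double theta = Math.toRadians(-SIDEREAL_RATE_DEG_PER_SEC * tSeconds);
        double dx = x - cx, dy = y - cy;
        double xr = cx + dx * Math.cos(theta) - dy * Math.sin(theta);
        double yr = cy + dx * Math.sin(theta) + dy * Math.cos(theta);
        return new double[] { xr, yr };
    }
}
```

In practice the pole position would have to come from calibration (e.g. plate-solving two frames), which is the hard part I'm unsure about.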
Remember that we can record videos for planetary imaging (planets are very bright). Stacking software like RegiStax is then used to stack the good frames into a detailed final image. But the same technique cannot be used for DSOs: they are too faint, and at 30 FPS each frame gets only 1/30 s of exposure, so the sensor won't record enough photons per frame to distinguish them from the background glow.
So my question is: can we stream raw data from the sensor using the Android Camera API to a program that takes care of the de-rotation, continuously adding the light to the SAME pixels instead of letting it drift onto adjacent pixels as the Earth rotates?
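The accumulation step I'm imagining would look something like the sketch below. This is only the math core under my own assumptions (known pole position, short per-frame exposures, nearest-neighbour resampling); the actual frame grabbing would presumably use a Camera2 `ImageReader` in a RAW format, which I've omitted because I don't know that part yet:

```java
// Hypothetical accumulator: each incoming frame is de-rotated about the
// projected pole and summed into a running float buffer, so light from a
// given star keeps landing on the same accumulator pixel.
public class Stacker {
    static final double SIDEREAL_RATE_DEG_PER_SEC = 360.0 / 86164.1;
    final float[] acc;               // running sum, same size as one frame
    final int width, height;
    final double poleX, poleY;       // assumed projected pole, in pixels

    Stacker(int width, int height, double poleX, double poleY) {
        this.width = width; this.height = height;
        this.poleX = poleX; this.poleY = poleY;
        this.acc = new float[width * height];
    }

    // Add one grayscale frame captured tSeconds after the reference frame.
    void addFrame(float[] frame, double tSeconds) {
        double theta = Math.toRadians(-SIDEREAL_RATE_DEG_PER_SEC * tSeconds);
        double c = Math.cos(theta), s = Math.sin(theta);
        for (int y = 0; y < height; y++) {
            for (int x = 0; x < width; x++) {
                double dx = x - poleX, dy = y - poleY;
                // Nearest-neighbour deposit into the reference orientation.
                int xr = (int) Math.round(poleX + dx * c - dy * s);
                int yr = (int) Math.round(poleY + dx * s + dy * c);
                if (xr >= 0 && xr < width && yr >= 0 && yr < height) {
                    acc[yr * width + xr] += frame[y * width + x];
                }
            }
        }
    }
}
```

At 30 s total integration the per-frame rotation is tiny, so I suspect the resampling error matters less than the noise, but I'd appreciate corrections on that.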
Many Thanks,
Ahmed
(The attached image is a 30 s exposure taken with a Xiaomi Redmi Note 9S.)
Orion and Pleiades: