This is the first of a series of articles. As I'm always experimenting and tuning my setup, I'm not sure how many more articles I'll be writing.
A few years ago I got into astrophotography. I had some fairly nice equipment back then: a SkyWatcher HEQ5 mount, a Meade ACF 8", a guide scope and camera, a borrowed DSLR, a laptop, and a 12V car battery.
Although this is pretty much entry-level equipment, barely sufficient to get started, it was already quite expensive (almost 2000€ just for the scope and mount, even though the scope was second hand), bulky, and heavy. I ended up barely using it, both because of a relatively steep learning curve and because, honestly, I was getting tired of carrying around 20-30 kg of equipment with barely any tangible result.
Then a few things happened: the mount was stolen, I sold the optical tube, and ended up moving to London, where I embraced a new "astronomical philosophy": the lighter, the better.
I was also lucky that this was around the time when fancy new products like the Star Adventurer started to go "viral", which brought lower prices, good support, and good publicity too, so it wasn't long until I got mine as well.
Of course, the Star Adventurer is only half of the story: you also need optics, and some kind of camera. The Star Adventurer is often seen as the best pal for DSLR cameras (it can even trigger the shutter through a dedicated cable), but what about other cameras, like CMOS/CCD astronomy cameras? I wanted to use a mono camera with a filter wheel, specifically an ASI 1600MM. This means you need a laptop to drive the camera, download the images (there's no SD card slot), rotate the filter wheel, and so on. This might not seem like a complicated addition; after all, everyone has a laptop nowadays. The problem is that a laptop's battery, out in the cold nights, doesn't usually last long. You'd need to bring some sort of power source, like a 12V car battery (heavy, bulky equipment again). You might also want a table and chairs, as it's probably not a great idea to just leave a laptop on the wet grass while shooting.
Long story short, this is when a second "viral" product category comes in handy: Single Board Computers, whose most famous example is the Raspberry Pi.
I'm now trying various alternative boards, but right now the Raspberry Pi (specifically the third version) is by far the most reliable, and the one I'm still actively using.
This is my typical setup/workflow:
The ASI camera and filter wheel are connected directly to the Raspberry Pi. The Raspberry Pi is usually strapped nearby: to the mount, the counterweight, or the scope tube. Being very lightweight, it doesn't really affect balancing or the mount load.
The Raspberry Pi is powered by a 20Ah power bank, the same kind you'd normally use to charge a mobile phone (I use the 20Ah version to get more "juice", since the Raspberry Pi, despite being a very low-power device compared to a laptop, is still relatively power hungry).
For the software part, the Raspberry Pi runs an INDI server, plus your client of choice to manage the imaging. You can use KStars/Ekos, which is usually the best choice among INDI clients, and very nice software indeed, but I'm developing my own scripts instead (which will soon become a webapp). You can have a look at my repo here: https://github.com/GuLinux/indi-lite-tools, but I'll write a specific post about it later on. An easier, but less efficient and more power-hungry, alternative is to use a desktop version of Raspbian (or Ubuntu), and simply use VNC to remotely control your Raspberry Pi.
Finally, you'll still need a laptop: pointing at your target, adjusting the field of view, focusing, and setting the exposure all require you to see what your camera is currently pointing at. But here comes a little trick: you only need the laptop for these initial steps, which with practice can take just 15-30 minutes. You won't need any large battery for your laptop, simply because you won't be using it for more than half an hour. And to get the images, you don't need to connect your laptop to the camera (and filter wheel), nor do you need an ethernet cable from the Raspberry Pi to the laptop: the Raspberry Pi 3 has a built-in WiFi interface that can also act as an Access Point.
You can then simply connect your laptop to your Raspberry Pi's WiFi, use KStars/Ekos to connect to the INDI server running on the Raspberry Pi, and get the images wirelessly. Then you start the sequence on the Raspberry Pi itself, turn off your laptop, and... just enjoy the night sky :)
Or if it's particularly cold, and/or you're tired and want to rest, you can go inside (your house, your car, tent, or whatever), and wirelessly check on your sequence.
In summary, these are the advantages of this setup:
- Low power requirement (a large capacity mobile phone powerbank is more than enough to run it for multiple nights).
- Extremely lightweight: it's even possible to bring your astro equipment with you on a plane, effortlessly. Even the 20Ah power bank weighs less than half a kilogram.
- Hardware compatibility (INDI can support lots of devices).
- "Plugin friendly": a Raspberry Pi can be expanded with more hardware, through either the USB ports or the GPIO header. I tried connecting an RTC module, an OLED display (showing the current sequence progress), a buzzer to warn me if an error occurred, a GPS module to get exact coordinates, etc.
And the disadvantages:
- Setup can be very difficult for people not used to the Linux command line (although the VNC method described above is fairly easy). I'm working on a "provisioning script" that can easily set up the Raspberry Pi in just a couple of simple steps.
- Low transfer speed: the Raspberry Pi 3 still uses USB 2.0. With my setup, I usually take short exposures (under 60 seconds, sometimes even just 15 seconds), and the Raspberry Pi takes up to 4 seconds to save an image before shooting the next one, so a significant portion of the shooting time is wasted waiting for images to save: with 15-second exposures, roughly a fifth of the session. The low USB speed also significantly increases amp glow. This is why I'm currently experimenting with some USB3 boards instead of the Raspberry Pi.
Last Saturday, after lots of garden testing and software checks, I was finally able to drive to a dark place for a few deep sky shots.
The driving itself was the most "scary" part, as I'm still new to driving on the "wrong" side of the road... I'm getting the hang of it, though.
I chose to go observing with the HantsAstro stargazing group. They meet at a quite dark site (at least for somewhere not too far from London), and their website and Facebook pages really did inspire me. I'm really glad I joined them, as it was a very pleasant evening, with lots of nice people.
My target for the evening was the centre of the Cygnus constellation, between Deneb and Sadr. It's an area full of nebulae, perfect for a wide field lens. Technical data, together with star and object names, can be found on the AstroBin technical page.
Last week, a CalSky alert email reminded me about a close pass of the International Space Station near the bright Arcturus, in the Boötes constellation.
Alessia was here, so we took this opportunity to do some "garden astronomy" together, watching the passage while also trying to record it on camera.
The idea was to do two shots: a wide field, with my large sensor ASI1600mm and an 85mm lens, and a narrow field with the telescope.
It was a beautiful, almost hot evening. Unfortunately, not everything went as planned: the ISS passed a bit farther away than expected, since I had forgotten to update my location coordinates in CalSky, so the telescope shot was missed; and a few technical issues, plus me choosing the wrong recording duration in the shooting program, almost made me miss the passage even on the wide field.
But after a few minutes, without even knowing if the recording had actually been successful, looking through the frames I was able to spot this bright strip moving through the stars. Although this was meant to be just a "backup shot", it's still a good catch. We also recorded a hint of a plane passing through the field at the end of the passage.
As I wrote in my previous post, exceptionally good weather kept me outside pretty much every night, just when Jupiter was at its best.
On April 7th, during its opposition, I was able to capture a sequence of 4 sets of video captures, each one in RGB. I tried to optimize my timings as much as possible, in order to keep rotational differences between frames under control. This will probably be even easier in a future Planetary Imager release, once I implement a scripting interface.
The results are even better than the previous evening.
I was able to take 4 images, and create an animation displaying Jupiter's rotation and its satellites.
Click here for a webp animation: much higher quality, but currently only working in Google Chrome.
These are the best two frames of the animation, so you can better view the features:
During the following night I optimized my capture speed even further, so I could take many more frames (up to 15). Unfortunately I couldn't use all of them due to the usual tree in front of my garden, but the result is still pretty good. The resolution is possibly a little worse, maybe due to worse seeing or focusing issues, but the animation is much smoother now.
Click here for a webp animation: much higher quality, but currently only working in Google Chrome.
And here, again, are a few interesting frames from the sequence:
And finally, a little treat for Alessia, who wasn't with me but would have liked to be, particularly given this little incursion by a curious fox:
A few more summer-like days, and a few more astronomical shots.
It was sunny, with very good seeing from Thursday to Saturday night, just in time for Jupiter's opposition, when it's closest to Earth, and therefore bigger and easier to capture.
But since I wasn't very happy with my previous Jupiter shots, on the first of these three nights I mainly took pictures of the moon.
I began by using my newest camera, an ASI 1600mm. It's more of a deep sky camera, not very suitable for planets and the moon: it has a 3.8 µm pixel size instead of the 2.4 µm of my other camera, an ASI 178mm, and bigger pixels mean lower resolution. It also has a much wider sensor, which slows down capturing (and a fast framerate is a key element in getting high resolution images), but that is also an advantage from another point of view: I was able to capture the whole moon disk in a single shot, instead of the usual mosaic.
This is the result, in my opinion one of my best looking images ever:
Please click the "Original version" button to get the high resolution image.
But after a couple of full moon shots, I also wanted to see the difference in resolution with my ASI 178mm, so I swapped cameras and started capturing frames near the terminator.
After stacking and stitching everything, this is the result:
Of course, the previous image looks a bit better aesthetically: having the full disk is surely more pleasing to the eye, and it's also a bit less grainy, due to the lower resolution.
But if you look at both of them at full size (again, open the "Original version" and zoom to 100%), this second mosaic clearly shows a lot more detail.
Again, I am quite happy with the result. It could surely have been better if I had taken more images to cover the full disk, and there are some stitching issues here and there (I'll try reprocessing it soon), but for such a small telescope (a 5" Maksutov) I couldn't have hoped for better images.
Finally, when Jupiter rose a bit higher, I decided to stay outside a little longer, despite having work the following day, and tried an RGB shot, although it was still very low on the horizon (only 25°). While shooting, the images didn't look bad at all, but I wasn't quite prepared for the result I got after processing the RGB set:
Again, for such a small scope, and such a low object, the amount of detail is impressive, particularly compared to my previous Jupiter shot. It's almost as good as the images taken with my 8" telescope back in Milan, but with colour this time!
During the evening I also invited my neighbours, a pleasant young couple from New Zealand, to have a quick look through the eyepiece... It's always nice to see the reaction of someone watching the moon through a telescope for the first time; sometimes you can really feel their wonder and awe.
Happy about the results, I then decided to try a full weekend of imaging, weather permitting. And I was lucky. I took lots more pictures during Friday and Saturday nights.
But I still have to finish processing them, so stay tuned until the next post... :)
The weather in London this weekend was quite amazing: sunny, a bit too windy, but the sky was almost perfect. The seeing forecast was also encouraging, so on Friday evening I took the chance to shoot Jupiter.
It was a bit of an unlucky evening: first I discovered that I had left my red dot finder on, so the battery was totally drained. After struggling for a while trying to align my GoTo mount without it, I decided it was worth leaving the telescope alone for a few minutes (my garden is easily visible from the street... I didn't want to do it unless absolutely necessary) and went back inside to find new batteries.
After everything was aligned, and I was ready to observe and record my images, I noticed that the view through the eyepiece wasn't exactly satisfying. When I replaced the eyepiece with the camera, the very unfocused image revealed why: some tree branches were in the way, and of course the image was degraded by the interference!
I looked around to see if I could find a better spot for my scope, but with no luck. I decided to try anyway, taking multiple shots, hoping that in some of them I might get an almost clear picture.
This is the best result I could get:
Of course, the difference from my previous shots taken with a bigger 200mm SC is quite visible, but I think with better conditions this new telescope can do much more.
Since I made four sets of images, spanning a bit more than 40 minutes, I was also able to create an animation showing Jupiter's rotation:
- Celestron Nexstar SLT 127 Maksutov
- ZWO ASI 178mm with LRGB filters
- Software: my Planetary Imager for shooting; AutoStakkert!2, RegiStax, Siril and GIMP for image processing.
Luminance channel: 4500 frames, best 20% used. R/G/B channels: 1000 frames, best 40% used.
I'm currently living in a house with a very nice backyard, right outside London. There's still a lot of light pollution, but it's manageable, and useful for testing my equipment before heading to darker locations.
This was meant to be an (L)RGB shot, but light pollution and humidity made the blue and green channels pretty useless, while the red channel produced quite good results.
Last Friday night the sky was finally very clear, so I took a few shots.
The brilliant Orion Nebula (M42) is very well defined, but also the Flame nebula (NGC 2024) is quite conspicuous on the left. And very close to the Flame nebula, a tiny Horsehead nebula can be spotted too.
For a simple test shot, I must say I'm very happy with the result, and I can't wait for better conditions to try RGB.
I recently bought a new smartphone, a OnePlus 3, since my Nexus 6 was pretty much dead, turning off almost every time I did anything other than keep it in my pocket.
I was particularly interested in the "manual controls" feature of the OnePlus, including the ability to take exposures of up to 30 seconds, which is particularly convenient if you want to use it for astronomy.
It turned out this really works out well!
I just did a few tests, but the results are very promising.
The phone can also save images in RAW format, which means that if I can also program a sequence of shots, I should be able to stack pictures to produce even better results.
It's been a while since my last update. I recently moved to London, changed jobs, and... changed astronomical instruments too. I'm planning to buy an ultra-portable 300mm Dobsonian soon, probably the best choice since I would like to travel with it as much as possible - back to Italy, or maybe to the southern skies.
But I also wanted a "quick" telescope, lightweight and simple, something I could bring out into the garden in a very few minutes, and, even better, ready for planetary imaging; so in the meantime I also bought a small Maksutov-Cassegrain telescope, a Celestron Nexstar SLT 127. I also got a new camera, a ZWO ASI 178mm: USB3, with a very high resolution and a wider field than my previous one; still a monochrome camera, but this time with an RGB filter set.
Given the smaller aperture, I'm not expecting really great shots, but my previous setup was much heavier, so I spent very little time observing and shooting, which in turn means I never really got it to its maximum potential.
My first shot with the new setup is a very difficult one: Saturn, at an unfavourable time, quite far from its opposition, and very low in the UK skies.
Given the small aperture, and the difficult target, I can say I'm really satisfied with it. I also tried a few new software tools for image processing, particularly ImPPG, with a very handy Lucy-Richardson deconvolution filter, which greatly improved the image over my first processing attempt.
A very convenient technique in C++ programming is the one known by many names: "d-pointer" (found in Qt/KDE contexts), shadow pointer, "pimpl", opaque pointer. The basic idea is to hide all the private details of a class behind a forward-declared private pointer, which will be the only real private member of the class. Since this member never changes, this guarantees binary compatibility between different versions of a library.
But there are other advantages to using d-pointers. One is compilation speedup during development: usually if you change a private member of a class, the header changes, and you have to recompile all units using that header, while with a d-pointer you change only the .cpp file. Another is code cleanup: you get very compact header files describing your class's public interface, without private pollution. Also, you may sometimes want to put the d-pointer definition in a separate header file, ending up with three well defined files:
- myclass.h: class declaration, public interface only
- myclass_p.hpp: private class declaration: only private stuff, no implementations
- myclass.cpp: only implementations
The classical approach is to declare a plain raw pointer to a forward-declared class, initialize it in the constructor, and delete it in the destructor. A nice addition is to make the private class a nested type, so that you avoid polluting your IDE's class list.
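Here's a minimal sketch of this classical approach, with the three-file split shown as commented sections (the class and member names are just illustrative):

```cpp
#include <string>

// --- myclass.h: public interface only; no private details leak out ---
class MyClass {
public:
    explicit MyClass(const std::string &name);
    ~MyClass();
    std::string name() const;
private:
    struct Private;   // nested, forward-declared private class
    Private *d;       // the d-pointer: the only real private member
};

// --- myclass_p.hpp: the private class declaration ---
struct MyClass::Private {
    std::string name;
};

// --- myclass.cpp: implementations only ---
MyClass::MyClass(const std::string &name) : d(new Private{name}) {}
MyClass::~MyClass() { delete d; }
std::string MyClass::name() const { return d->name; }
```

Adding or renaming members of MyClass::Private now touches only the _p.hpp and .cpp files, so client code needn't be recompiled. Note that with a raw pointer like this, copying a MyClass would double-delete d: a real class should disable copying (or implement it properly), or use a smart pointer instead.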