How We Sharpened the James Webb Telescope's Vision from a Million Kilometers Away
The James Webb Space Telescope's Journey to Clarity
The James Webb Space Telescope, a $10 billion marvel, captivated our family during its nail-biting launch in 2021. It marked the biggest leap in space telescope technology since the Hubble Space Telescope's launch in 1990. Webb's journey to deployment was fraught with risk, requiring it to navigate 344 potential single points of failure. Thankfully, the launch exceeded expectations, and we could finally exhale in relief.
Six months later, Webb unveiled its first images, revealing the most distant galaxies ever seen. For our team in Australia, however, the real work was just beginning: we set out to sharpen Webb's vision using its highest-resolution mode, the Aperture Masking Interferometer (AMI).
AMI: Australia's Contribution to Webb's Vision
AMI, a tiny piece of precisely machined metal, slots into one of Webb's cameras and significantly boosts its resolution. Designed by astronomer Peter Tuthill, it is the only Australian hardware on board. Its purpose is to diagnose and measure any blur in Webb's images: even nanometers of distortion across the telescope's 18 hexagonal primary mirror segments and internal surfaces could hinder the study of planets or black holes.
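To give a feel for how a masked aperture can sharpen and diagnose an image, here is a toy sketch in Python with JAX. It relies on the standard optics result that a telescope's image of a point source is the squared magnitude of the Fourier transform of its pupil; the hole sizes and positions below are invented for illustration and are not AMI's real layout.

```python
import jax.numpy as jnp

def mask_psf(n=256, hole_radius=5.0, centers=((-40, 0), (40, 0), (0, 60))):
    """Point-spread function of a pupil pierced by a few small holes.

    Toy aperture mask: the image of a point source is |FFT(pupil)|^2,
    so a handful of holes turns a star into a crisp fringe pattern.
    Hole geometry here is made up, not AMI's actual pattern.
    """
    idx = jnp.arange(n) - n / 2
    xx, yy = jnp.meshgrid(idx, idx)
    pupil = jnp.zeros((n, n))
    for cx, cy in centers:
        hole = ((xx - cx) ** 2 + (yy - cy) ** 2) <= hole_radius ** 2
        pupil = jnp.where(hole, 1.0, pupil)
    field = jnp.fft.fftshift(jnp.fft.fft2(jnp.fft.ifftshift(pupil)))
    return jnp.abs(field) ** 2

psf = mask_psf()
print(psf.shape, float(psf.max()))
```

The fringes in such a pattern encode structure finer than an ordinary image would show, and distortions in the fringes betray errors in the optics, which is what makes a simple piece of metal so useful as a diagnostic.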
Unveiling the Blurry Pixels
Our initial observations with AMI revealed a challenge. At the pixel level, every image was slightly blurry because of an electronic effect in which charge from brighter pixels bleeds into their darker neighbors. This isn't a manufacturing flaw but an inherent feature of infrared cameras, yet it posed a significant problem for Webb's sensitivity and resolution.
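As a rough picture of how this kind of blur arises, here is a deliberately simple toy model, written in Python with JAX for consistency with the sketch further below. It is not the team's code and not the real detector physics: each pixel simply leaks a brightness-dependent fraction of its charge to its four nearest neighbors, and the function name `bleed` and parameter `alpha` are invented for illustration.

```python
import jax.numpy as jnp

def bleed(image, alpha=1e-5):
    # Brightness-dependent fraction of charge that leaks out of each pixel,
    # capped so the toy model stays physical. alpha is an invented parameter.
    frac = jnp.clip(alpha * image, 0.0, 0.25)
    leaked = frac * image          # charge leaving each pixel
    stays = image - leaked         # charge that stays put
    # Redistribute the leaked charge equally to the four nearest neighbors.
    quarter = leaked / 4.0
    neighbors = (jnp.roll(quarter, 1, axis=0) + jnp.roll(quarter, -1, axis=0) +
                 jnp.roll(quarter, 1, axis=1) + jnp.roll(quarter, -1, axis=1))
    return stays + neighbors

# A single bright "star" on a dark background: after bleeding, its peak drops
# and its neighbors brighten -- a blur akin to the one AMI revealed.
star = jnp.zeros((5, 5)).at[2, 2].set(50_000.0)
print(bleed(star))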
Correcting the Vision
We tackled this problem head-on. In a new paper led by University of Sydney Ph.D. student Louis Desdoigts, we built a computer model that simulates AMI's optical physics, accounting for the shapes of the mirrors and apertures and the colors of the stars observed. We then connected this to a machine learning model, forming an 'effective detector model' trained to reproduce the observed data as faithfully as possible.
After training and validation, this setup allowed us to calculate the blur and undo it in other data, restoring AMI's full capability. Nothing changes on the telescope itself: the correction is applied to the data on the ground, during processing.
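The sketch below is a heavily simplified illustration of that idea, not the published pipeline. A differentiable optical forward model (here just a Gaussian blob standing in for a real simulation of Webb's mirrors and mask) feeds into a tiny neural network playing the part of the effective detector model, and gradient descent tunes both until the output reproduces a calibration frame. Every name, shape and parameter is invented for the example; `optax` is a standard optimizer library for JAX.

```python
import jax
import jax.numpy as jnp
import optax  # standard JAX optimizer library

def optics_model(params, coords):
    # Stand-in optical forward model: a circular Gaussian whose flux and
    # width are free parameters. The real model simulates Webb's mirrors,
    # the AMI mask and the color of the star instead.
    flux, width = params
    r2 = coords[..., 0] ** 2 + coords[..., 1] ** 2
    return flux * jnp.exp(-r2 / (2.0 * width ** 2))

def detector_model(net_params, image):
    # Tiny per-pixel neural network standing in for the learned
    # "effective detector model" (the real one is more elaborate).
    w1, b1, w2, b2 = net_params
    h = jax.nn.relu(image[..., None] @ w1 + b1)
    return (h @ w2 + b2)[..., 0]

def forward(all_params, coords):
    optics_params, net_params = all_params
    return detector_model(net_params, optics_model(optics_params, coords))

def loss(all_params, coords, data):
    # Mean squared difference between the model and a calibration frame.
    return jnp.mean((forward(all_params, coords) - data) ** 2)

# --- Toy calibration frame: a blurred star plus a little noise -------------
key = jax.random.PRNGKey(0)
xs = jnp.linspace(-3.0, 3.0, 32)
coords = jnp.stack(jnp.meshgrid(xs, xs, indexing="ij"), axis=-1)
data = optics_model((jnp.array(1.0), jnp.array(0.7)), coords)
data = data + 0.01 * jax.random.normal(key, data.shape)

# Initial guesses: wrong optics parameters, randomly initialized network.
net_params = (0.1 * jax.random.normal(key, (1, 8)), jnp.zeros(8),
              0.1 * jax.random.normal(key, (8, 1)), jnp.zeros(1))
params = ((jnp.array(2.0), jnp.array(1.0)), net_params)

optimizer = optax.adam(1e-2)
opt_state = optimizer.init(params)

@jax.jit
def step(params, opt_state):
    grads = jax.grad(loss)(params, coords, data)
    updates, opt_state = optimizer.update(grads, opt_state)
    return optax.apply_updates(params, updates), opt_state

for _ in range(500):
    params, opt_state = step(params, opt_state)
print("final loss:", loss(params, coords, data))
```

Once a model like this reproduces the calibration data well, the fitted detector component can be used to estimate the blur in science frames and remove it during processing, which is the step that restores the lost resolution.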
Real-World Impact
The results were striking. We could now observe the birthplaces of planets and material being sucked into black holes. The correction let us clearly see faint planets around stars such as HD 206893, which were known but had previously been out of AMI's reach. We also turned to Jupiter's moon Io, tracking its volcanoes over an hour-long timelapse, and observed the jet from the black hole at the center of the galaxy NGC 1068 with remarkable clarity.
Expanding Horizons
In a companion paper led by Max Charles, we applied this correction to complex imaging at Webb's highest resolution, revisiting well-studied targets to test the telescope's performance. With the new correction, the timelapse of Io's volcanoes came into sharp focus, and the ribbon of dust around WR 137 matched theoretical predictions.
Looking Ahead
The code built for AMI serves as a demonstration for more complex cameras on Webb and on its follow-up, the Roman Space Telescope. These instruments demand optical calibration to within a fraction of a nanometer, a precision that pushes the limits of known materials.
Our work demonstrates that by measuring, controlling, and correcting for the imperfections of the materials we do have, we can still hope to find Earth-like planets in the far reaches of our galaxy.