As progress in AI and ML accelerates, the gap between human and automated pattern recognition capabilities is narrowing.

Astronomy has a rich history of data gathering and record keeping. Most ancient civilizations developed their own version of astronomy, in which significant solar and celestial events were used to establish calendars and support navigation, and played a strong cultural and spiritual role. Oral traditions, such as those of the indigenous inhabitants of Australia, allowed celestial information to be preserved across millennia. Architectural marvels, including Stonehenge, the Thirteen Towers of Chankillo in Peru, and various temples in Egypt, all show strong alignments with the solstice sunrise or sunset position. These demonstrate a sophisticated understanding of the annual motion of the Sun along the horizon, which would likely have taken decades, or even centuries, of monitoring and record keeping to determine.

Fast forward several thousand years, and modern-day astronomers collect data about celestial objects using an assortment of telescopes and particle detectors. This observational data provides astronomers with information on the position, size, mass, and chemical composition of a myriad of astronomical phenomena such as planets, stars, pulsars, black holes, and galaxies. Our understanding of the Universe is further enhanced through the use of computer simulations, generating even more data for modelling, predicting, and supporting analysis of the observational data.

With access to datasets whose sizes are measured in petabytes and, soon, exabytes, astronomers have been turning to machine learning (automated processes that learn by example) and artificial intelligence, or AI (computers making decisions or discoveries that would usually require human intelligence), to help sift through the data….
