
NASA partners with IBM to build AI foundation models to advance climate science


U.S. space agency NASA isn’t just concerned with exploring outer space; it also wants to help humanity learn more about planet Earth and the impacts of climate change.

Today, NASA and IBM announced a partnership to develop new artificial intelligence (AI) foundation models for analyzing geospatial satellite data, in a bid to better understand and act on climate change. To date, NASA has largely relied on building its own bespoke AI models for specific use cases. The promise of the foundation model approach, popularized by large language models (LLMs), is a single model trained on vast amounts of data that can serve as a more general-purpose system and be customized as needed.
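To make that customization step concrete, the following is a minimal PyTorch sketch of adapting a general pretrained model to a specific task: the pretrained backbone is frozen and only a small task-specific head is trained. All names, shapes and data here are illustrative assumptions, not the actual NASA/IBM model.

```python
import torch
import torch.nn as nn

# Hypothetical stand-in for a pretrained foundation-model backbone; in practice
# this would be loaded from a checkpoint trained on large volumes of data.
backbone = nn.Sequential(nn.Linear(128, 256), nn.ReLU(), nn.Linear(256, 256))

# Freeze the general-purpose backbone; only the small task-specific head is trained.
for p in backbone.parameters():
    p.requires_grad = False

head = nn.Linear(256, 4)  # e.g. a 4-class downstream classification task
optimizer = torch.optim.AdamW(head.parameters(), lr=1e-3)

# Synthetic stand-in data for the downstream task.
x = torch.randn(32, 128)
y = torch.randint(0, 4, (32,))

for _ in range(10):
    optimizer.zero_grad()
    with torch.no_grad():
        features = backbone(x)          # reuse the general features as-is
    loss = nn.functional.cross_entropy(head(features), y)
    loss.backward()
    optimizer.step()
```

The point of the pattern is that the expensive pretraining happens once, while each new use case only needs a lightweight, task-specific adaptation.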

Among the initial goals of the partnership is to train a foundation model on NASA’s Harmonized Landsat Sentinel-2 (HLS) dataset, which has petabytes of data collected from space about land use changes on Earth.

Beyond just helping to improve the state of climate analysis on Earth, IBM is hopeful that the new foundation model that it develops jointly with NASA will have broader applicability and a positive impact for enterprise use cases of AI as well.

“What we’re doing with NASA is going to help us push innovation all the way from infrastructure and hardware up through distributed systems platforms, middleware and the applications themselves,” Priya Nagpurkar, VP, hybrid cloud platform and developer productivity at IBM Research, said during a press briefing announcing the partnership. “And it will include driving advances in AI architectures, and even data management techniques.”

Houston, we have a (big data) problem

To put it mildly, NASA has a lot of data.

Rahul Ramachandran, senior research scientist at NASA’s Marshall Space Flight Center in Huntsville, Alabama, explained during the press briefing that NASA has the largest collection of Earth observation data. That data has been collected to support NASA’s science mission to understand planet Earth as a complex system. It comes from a variety of instruments and currently fills an archive of 70 petabytes, which is projected to grow to 250 petabytes within a few years.

“Clearly, given the scale of the data that we have, we have a big data problem,” Ramachandran said. “Our goal is to make our data discoverable, accessible and usable for broad scientific use in applications worldwide.”

Ramachandran added that NASA is always looking for new approaches and technologies that will help streamline the research process, as well as lower the barrier to entry for end users to utilize the complex science data held by the space agency. That’s where the development of foundation models comes into play to make it easier to benefit from the data that NASA has collected. 

The foundation model that NASA is building with IBM could prove life-changing for humanity.

For example, Ramachandran said that a foundation model trained on satellite image data could make it easier for someone in a disaster area to identify the extent of flooding, with the model automatically mapping where flooding is occurring. Another example could be identifying damage in a hurricane zone.
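In machine learning terms, that flood-mapping use case amounts to per-pixel classification (semantic segmentation) of satellite tiles. The toy PyTorch sketch below shows the shape of such a pipeline; the 6-band input and the tiny network are purely illustrative assumptions, not the model NASA and IBM are building.

```python
import torch
import torch.nn as nn

# Toy stand-in for a flood-mapping model: per-pixel binary classification
# ("flooded" vs. "not flooded") over a multispectral satellite tile.
model = nn.Sequential(
    nn.Conv2d(6, 16, kernel_size=3, padding=1),
    nn.ReLU(),
    nn.Conv2d(16, 1, kernel_size=1),  # one logit per pixel
)

tile = torch.randn(1, 6, 256, 256)    # a single 256x256 multispectral tile
with torch.no_grad():
    flood_mask = torch.sigmoid(model(tile)) > 0.5   # boolean flood map

print(f"Flooded pixels: {int(flood_mask.sum())} of {flood_mask.numel()}")
```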

PyTorch and open-source AI will also benefit

On the technology side, IBM will make extensive use of a range of technologies, including Red Hat OpenShift for running the AI training workloads, as well as open-source machine learning frameworks, most notably PyTorch.

The open-source PyTorch machine learning framework was started at Facebook (now known as Meta) and spun off into the independent PyTorch Foundation in September 2022. IBM Research has been an active contributor to PyTorch, integrating capabilities into the PyTorch 1.13 release to help run large workloads on commodity hardware.
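As a rough illustration of how PyTorch spreads large training workloads across ordinary machines, here is a minimal distributed data-parallel sketch. It uses a single process and the CPU-friendly gloo backend so it runs anywhere; it is a generic example of PyTorch's distributed training API, not IBM's specific contribution to the 1.13 release.

```python
import os
import torch
import torch.distributed as dist
from torch.nn.parallel import DistributedDataParallel as DDP

def main():
    # Single-process group for illustration; real jobs launch one process per worker.
    os.environ.setdefault("MASTER_ADDR", "127.0.0.1")
    os.environ.setdefault("MASTER_PORT", "29500")
    dist.init_process_group("gloo", rank=0, world_size=1)

    model = torch.nn.Sequential(
        torch.nn.Linear(32, 64), torch.nn.ReLU(), torch.nn.Linear(64, 1)
    )
    ddp_model = DDP(model)  # gradients are synchronized across workers on backward

    opt = torch.optim.AdamW(ddp_model.parameters(), lr=1e-3)
    x, y = torch.randn(16, 32), torch.randn(16, 1)  # synthetic stand-in data
    for _ in range(5):
        opt.zero_grad()
        loss = torch.nn.functional.mse_loss(ddp_model(x), y)
        loss.backward()
        opt.step()

    dist.destroy_process_group()

if __name__ == "__main__":
    main()
```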

Raghu Ganti, principal researcher at IBM Research, said PyTorch is a core element of IBM’s AI strategy.

“We solely rely on PyTorch for training all our foundation models,” Ganti said. 

Ganti added that IBM will continue to contribute back to the PyTorch community as it continues to innovate on the technology to build increasingly powerful foundation models. In Ganti’s view, the joint effort with NASA to build foundation models will have multiple applications and broad impact.

“I think it will augment and accelerate the scientific process in terms of building and solving specific science problems,” he said. “Instead of people having to build their own individual machine learning pipeline, starting from collecting the large volumes of training data.”

