DeepModeling

Define the future of scientific computing together

Pre-trained models are sweeping the AI field: by extracting representative information from large-scale unlabeled data and then performing supervised learning on small-scale labeled downstream tasks, they have become the de facto solution in many application scenarios. In drug design, however, there is still no consensus on the best way to represent molecules, and molecular representation is just as central to property prediction in materials chemistry. Mainstream molecular pre-training models typically start from one-dimensional sequences or two-dimensional graph structures, yet molecular structures are inherently three-dimensional. Constructing pre-trained models directly from three-dimensional information to obtain better molecular representations has therefore become an important and meaningful problem. To further promote research on molecular representation and pre-trained models, Uni-Mol will join the DeepModeling community and work with community developers to advance a three-dimensional molecular representation pre-training framework.

Read more »

On the journey toward a Large Atom Model (LAM), the core Deep Potential development team has launched the OpenLAM initiative for the community. OpenLAM's slogan is "Conquer the Periodic Table!" The project aims to build an open-source ecosystem centered on microscale large models, providing new infrastructure for microscopic scientific research and driving transformative advances in microscale industrial design across fields such as materials, energy, and biopharmaceuticals.

Read more »

From the software ecosystems for electronic structure calculations and molecular dynamics, to the systematic evaluation of large models such as OpenLAM, and on to scientific and industrial R&D problems such as biological simulation, drug design, and molecular property prediction, a series of AI4Science scientific computing software packages and models is advancing rapidly. This progress depends on better research infrastructure, of which the Dflow project is a key component.

Read more »

The tight-binding model, formulated in second quantization, is widely used in condensed matter physics. In this model:

  • Atoms in a lattice are represented as discrete points with a specific number of electrons.
  • Each electron occupies a corresponding atomic orbital.
  • Using creation and annihilation operators, electron transitions between atomic orbitals are described in the second quantization framework.
  • The Hamiltonian comprises (a minimal form is written out after this list):
    • Hopping (transition) terms between atomic orbitals.
    • On-site energy levels of the orbitals.
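
For reference, a minimal Hamiltonian of this kind takes the standard textbook form (generic notation, not specific to TBPLaS):

    H = \sum_{i} \varepsilon_i \, c_i^\dagger c_i + \sum_{i \neq j} t_{ij} \, c_i^\dagger c_j

where \varepsilon_i is the on-site energy of orbital i, t_{ij} is the hopping (transition) amplitude between orbitals i and j, and c_i^\dagger, c_i are the corresponding creation and annihilation operators.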

Project on GitHub: https://github.com/deepmodeling/tbplas

Read more »

The slogan for OpenLAM is "Conquer the Periodic Table!" We hope to provide a new infrastructure for microscale scientific research and drive the transformation of microscale industrial design in fields such as materials, energy, and biopharmaceuticals by establishing an open-source ecosystem around large microscale models. Relevant models, data, and workflows will be consolidated around the AIS Square; related software development will take place in the DeepModeling open-source community. At the same time, we welcome open interaction from different communities in model development, data sharing, evaluation, and testing.

See AIS Square for more details.

Read more »

"The integration of machine learning and physical modeling is revolutionizing the paradigm of scientific research. People aiming to push the boundaries of science and solve challenging problems through computational modeling are coming together in unprecedented ways." Recently, the DeepModeling open-source community has welcomed a new member in the field of macro-scale computation. To further advance the development of the JAX-FEM project, a differentiable finite element method library, JAX-FEM will join the DeepModeling community. Together with developers and users in the community, it aims to expand the frontiers of finite element methods in the AI4Science era.

Community project homepage:
https://github.com/deepmodeling/jax-fem

Read more »

Peter Thiel once said, "We wanted flying cars, instead we got 140 characters (Twitter)." Over the past decade, we have made great strides at the level of bits (the internet), but progress at the level of atoms (hard technology) has been relatively slow.

The accumulation of linguistic data propelled the development of machine learning and ultimately led to the emergence of Large Language Models (LLMs). Driven by AI, progress at the level of atoms is also accelerating. Methods such as Deep Potential, by learning from quantum mechanical data, have extended the accessible spatial and temporal scales of microscopic simulation by several orders of magnitude and have made significant progress in fields such as drug design, materials design, and chemical engineering.

The accumulation of quantum mechanical data is gradually covering the entire periodic table, and the Deep Potential team has begun putting the DPA pre-trained models into practice. By analogy with the progress of LLMs, we are on the eve of the emergence of a general Large Atom Model (LAM). At the same time, we believe that open source and openness will play an increasingly important role in the development of LAM.

Against this backdrop, the core Deep Potential developer team is launching the OpenLAM Initiative for the community. The plan is still in the draft stage and is set to officially start on January 1, 2024. We warmly and openly welcome opinions and support from all parties.

The slogan for OpenLAM is "Conquer the Periodic Table!" We hope to provide a new infrastructure for microscale scientific research and drive the transformation of microscale industrial design in fields such as materials, energy, and biopharmaceuticals by establishing an open-source ecosystem around large microscale models. Relevant models, data, and workflows will be consolidated around the AIS Square; related software development will take place in the DeepModeling open-source community. At the same time, we welcome open interaction from different communities in model development, data sharing, evaluation, and testing.

OpenLAM's goals for the next three years are: in 2024, to effectively cover the periodic table with first-principles data and achieve universal property-learning capability; in 2025, to combine large-scale experimental characterization data and literature data to achieve universal cross-modal capability; and in 2026, to realize target-oriented, atomic-scale universal generation and planning capability. Ultimately, within 5-10 years, we aim to achieve "Large Atom Embodied Intelligence" for atomic-scale intelligent scientific discovery and synthetic design.

OpenLAM's specific plans for 2024 include:

  • Model Update and Evaluation Report Release:

    • Starting from January 1, 2024, driven by the Deep Potential team, with participation from all LAM developers welcomed.
    • Every three months, a major model version update will take place, with updates that may include model architecture, related data, training strategies, and evaluation test criteria.
  • AIS Cup Competition:

    • Initiated by the Deep Potential team and supported by the Bohrium Cloud Platform, starting in March 2024 and concluding at the end of the year;
    • The goal is to promote the creation of a benchmarking system focused on several application-oriented metrics.
  • Domain Data Contribution:

    • Seeking collaboration with domain developers to establish "LAM-ready" datasets for pre-training and evaluation.
    • Domain datasets for iterative training of the latest models will be updated every three months.
  • Domain Application and Evaluation Workflow Contribution:

    • The domain application and evaluation workflows will be updated and released every three months.
  • Education and Training:

    • Planning a series of educational and training events aimed at LAM developers, domain developers, and users to encourage advancement in the field.
  • How to Contact Us:

    • Direct discussions are encouraged in the DeepModeling community.
    • For more complex inquiries, please contact the project leads, Han Wang (王涵, wang_han@iapcm.ac.cn) and Linfeng Zhang (张林峰, zhanglf@aisi.ac.cn), for the new future of Science!

Introducing LibRI: Advancing Computational Methods for DFT

Development and Features

Dr. Peize Lin and the research group led by Xinguo Ren at the Institute of Physics, Chinese Academy of Sciences, have developed the open-source library LibRI. The library is designed for highly efficient, highly parallelized resolution-of-identity (RI) calculations and has already integrated several advanced electronic structure methods.
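
For orientation, the core idea behind RI-type methods is to expand products of basis functions in a smaller auxiliary basis. A generic textbook form is sketched below; the precise (localized) formulation used in LibRI may differ:

    \varphi_\mu(\mathbf{r}) \, \varphi_\nu(\mathbf{r}) \approx \sum_{P} C^{P}_{\mu\nu} \, \xi_P(\mathbf{r})

where \varphi_\mu and \varphi_\nu are atomic basis functions, \xi_P are auxiliary basis functions, and C^{P}_{\mu\nu} are expansion coefficients. Four-center two-electron integrals then reduce to combinations of much cheaper three-center and two-center quantities.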

Joining the DeepModeling Community

To accelerate its development and broaden its impact, LibRI has joined the DeepModeling community. This collaboration will:

  • Support advanced methods that go beyond conventional DFT, enabling the further development of RI methods.
  • Provide more efficient and accurate computational capabilities for the domestic DFT software ABACUS, boosting its performance and efficiency.
  • Contribute to AI-assisted, next-generation electronic structure algorithms.

Read more »

Lecture 1: Deep Potential Method for Molecular Simulation, Roberto Car

Lecture 2: Deep Potential at Scale, Linfeng Zhang

Lecture 3: Towards a Realistic Description of H3O+ and OH- Transport, Robert A. DiStasio Jr.

Lecture 4: Next Generation Quantum and Deep Learning Potentials, Darrin York

Lecture 5: Linear Response Theory of Transport in Condensed Matter, Stefano Baroni

Lecture 6: Deep Modeling with Long-Range Electrostatic Interactions, Chunyi Zhang

Hands-on session 4: Machine learning of Wannier centers and dipoles

Hands-on session 5: Long range electrostatic interactions with DPLR

Hands-on session 6: Concurrent learning with DP-GEN
