# Graphormer Hands-On Tutorial: Fine-Tuning a Model on PCQM4M Data Loaded with the ogb Library
## 1. Introduction

Graphormer is a molecular property prediction model built on a pure Transformer architecture for graphs. It is designed to model the global structure of molecular graphs (atoms and bonds) for property prediction, and it has performed strongly on molecular benchmarks such as OGB and PCQM4M, substantially outperforming traditional GNN models.

This tutorial walks you through loading the PCQM4M dataset with the ogb library and fine-tuning a Graphormer model from scratch. You will learn the complete workflow of how to:

- prepare an environment for molecular graph data
- load and process the PCQM4M dataset with the ogb library
- configure and fine-tune a Graphormer model
- evaluate model performance

## 2. Environment Setup

### 2.1 System Requirements

- Python 3.8+
- CUDA 11.3 (recommended)
- At least 16 GB of RAM
- A GPU supported by PyTorch (RTX 3090 or better recommended)

### 2.2 Installing Dependencies

```bash
conda create -n graphormer python=3.9
conda activate graphormer
pip install torch==1.12.0+cu113 torchvision==0.13.0+cu113 torchaudio==0.12.0 --extra-index-url https://download.pytorch.org/whl/cu113
pip install ogb rdkit-pypi torch-geometric
```

### 2.3 Installing Graphormer

```bash
git clone https://github.com/microsoft/Graphormer.git
cd Graphormer
pip install -e .
```

## 3. Data Preparation and Processing

### 3.1 The PCQM4M Dataset

PCQM4M is a large-scale molecular property prediction dataset from OGB (Open Graph Benchmark), containing roughly 3.8 million molecular structures together with their HOMO-LUMO energy gap values.

### 3.2 Loading the Dataset

```python
from ogb.lsc import PCQM4Mv2Dataset

dataset = PCQM4Mv2Dataset(root="dataset/")
print(dataset)
print(dataset[0])  # inspect the first sample
```

### 3.3 Data Preprocessing

Graphormer expects a specific input format, so we need to convert each molecular graph into features the model can consume:

```python
from ogb.utils.features import atom_to_feature_vector, bond_to_feature_vector
from rdkit import Chem

def smiles2graph(smiles_string):
    mol = Chem.MolFromSmiles(smiles_string)

    # Atom features
    atom_features_list = []
    for atom in mol.GetAtoms():
        atom_features_list.append(atom_to_feature_vector(atom))

    # Bond features (each bond is added in both directions: undirected graph)
    edges = []
    edge_features_list = []
    for bond in mol.GetBonds():
        i = bond.GetBeginAtomIdx()
        j = bond.GetEndAtomIdx()
        edge_feature = bond_to_feature_vector(bond)
        edges.append((i, j))
        edge_features_list.append(edge_feature)
        edges.append((j, i))
        edge_features_list.append(edge_feature)

    # Handle molecules with no bonds by adding self-loops with zero features
    if len(edges) == 0:
        num_bond_features = 3  # length of an ogb bond feature vector
        for i in range(len(atom_features_list)):
            edges.append((i, i))
            edge_features_list.append([0] * num_bond_features)

    return atom_features_list, edges, edge_features_list
```
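Beyond these node and edge features, Graphormer's distinguishing ingredient is its structural encodings: in particular, the spatial encoding biases attention by the shortest-path distance between every pair of atoms. As a minimal illustrative sketch (not the library's own implementation), the distance matrix can be computed from the undirected edge list produced by `smiles2graph` with plain BFS:

```python
from collections import deque

def shortest_path_matrix(num_nodes, edges, unreachable=-1):
    """All-pairs shortest-path distances (in hops) via BFS from each node.

    `edges` is an undirected edge list in the style of smiles2graph,
    i.e. each bond appears in both directions.
    """
    adj = [[] for _ in range(num_nodes)]
    for i, j in edges:
        adj[i].append(j)

    dist = [[unreachable] * num_nodes for _ in range(num_nodes)]
    for src in range(num_nodes):
        dist[src][src] = 0
        queue = deque([src])
        while queue:
            u = queue.popleft()
            for v in adj[u]:
                if dist[src][v] == unreachable:
                    dist[src][v] = dist[src][u] + 1
                    queue.append(v)
    return dist

# Example: a 3-atom chain 0-1-2 (each bond listed in both directions)
print(shortest_path_matrix(3, [(0, 1), (1, 0), (1, 2), (2, 1)]))
# → [[0, 1, 2], [1, 0, 1], [2, 1, 0]]
```

In Graphormer these pairwise distances index a learned bias table that is added to the attention scores; disconnected pairs get a dedicated sentinel value (here `-1`).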
## 4. Model Configuration and Fine-Tuning

### 4.1 Loading the Pretrained Model

```python
from graphormer import Graphormer

model = Graphormer(
    n_layers=12,
    num_heads=32,
    hidden_dim=512,
    dropout_rate=0.1,
    input_dropout_rate=0.1,
    ffn_dim=512,
    dataset_name="pcqm4m",
)
```

### 4.2 Setting Up the Data Loaders

```python
from torch_geometric.loader import DataLoader

# Split into training and validation sets
split_idx = dataset.get_idx_split()
train_dataset = dataset[split_idx["train"]]
valid_dataset = dataset[split_idx["valid"]]

# Create the data loaders
train_loader = DataLoader(train_dataset, batch_size=32, shuffle=True)
valid_loader = DataLoader(valid_dataset, batch_size=32, shuffle=False)
```

### 4.3 Training Configuration

```python
import torch
import torch.nn as nn
from torch.optim import AdamW

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
model = model.to(device)

criterion = nn.MSELoss()
optimizer = AdamW(model.parameters(), lr=1e-4, weight_decay=0.01)
scheduler = torch.optim.lr_scheduler.CosineAnnealingLR(optimizer, T_max=10)
```

## 5. Training and Evaluation

### 5.1 Training Loop

```python
def train():
    model.train()
    total_loss = 0
    for batch in train_loader:
        batch = batch.to(device)
        optimizer.zero_grad()

        # Forward pass
        out = model(batch)
        loss = criterion(out, batch.y.view(-1, 1))

        # Backward pass
        loss.backward()
        optimizer.step()

        total_loss += loss.item()
    return total_loss / len(train_loader)
```

### 5.2 Validation Loop

```python
@torch.no_grad()
def validate():
    model.eval()
    total_loss = 0
    for batch in valid_loader:
        batch = batch.to(device)
        out = model(batch)
        loss = criterion(out, batch.y.view(-1, 1))
        total_loss += loss.item()
    return total_loss / len(valid_loader)
```

### 5.3 Main Training Loop

```python
best_val_loss = float("inf")
patience = 5
counter = 0

for epoch in range(1, 101):
    train_loss = train()
    val_loss = validate()
    scheduler.step()

    print(f"Epoch: {epoch:03d}, Train Loss: {train_loss:.4f}, Val Loss: {val_loss:.4f}")

    if val_loss < best_val_loss:
        best_val_loss = val_loss
        torch.save(model.state_dict(), "best_model.pt")
        counter = 0
    else:
        counter += 1
        if counter >= patience:
            print("Early stopping!")
            break
```
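Transformer-based models like Graphormer are usually trained with a linear warmup phase before the cosine decay, since `CosineAnnealingLR` alone starts at the full learning rate. A minimal sketch of such a schedule as a pure multiplier function (the warmup length and step counts below are illustrative assumptions, not values from the Graphormer paper):

```python
import math

def warmup_cosine_multiplier(step, warmup_steps, total_steps):
    """LR multiplier: linear warmup to 1.0, then cosine decay toward 0."""
    if step < warmup_steps:
        return (step + 1) / warmup_steps  # linear warmup
    progress = (step - warmup_steps) / max(1, total_steps - warmup_steps)
    return 0.5 * (1.0 + math.cos(math.pi * progress))  # cosine decay

# Multiplier rises during warmup, peaks at 1.0 at step 10, then decays to 0
print([round(warmup_cosine_multiplier(s, 10, 100), 2) for s in (0, 5, 10, 55, 100)])
# → [0.1, 0.6, 1.0, 0.5, 0.0]
```

To plug this into the training configuration above, wrap it with `torch.optim.lr_scheduler.LambdaLR(optimizer, lambda s: warmup_cosine_multiplier(s, warmup_steps, total_steps))` and call `scheduler.step()` once per optimizer step rather than once per epoch.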
## 6. Model Evaluation and Application

### 6.1 Test Set Evaluation

```python
# Note: in PCQM4Mv2 the held-out test splits are named "test-dev" and
# "test-challenge", and their labels are withheld by OGB
test_dataset = dataset[split_idx["test-dev"]]
test_loader = DataLoader(test_dataset, batch_size=32, shuffle=False)

@torch.no_grad()
def test():
    model.eval()
    total_loss = 0
    for batch in test_loader:
        batch = batch.to(device)
        out = model(batch)
        loss = criterion(out, batch.y.view(-1, 1))
        total_loss += loss.item()
    return total_loss / len(test_loader)

test_loss = test()
print(f"Test Loss: {test_loss:.4f}")
```

### 6.2 Predicting New Molecules

```python
from torch_geometric.data import Data

def predict_smiles(smiles):
    model.eval()
    atom_features, edges, edge_features = smiles2graph(smiles)

    # Convert to the model's input format
    data = Data(
        x=torch.tensor(atom_features, dtype=torch.long),
        edge_index=torch.tensor(edges, dtype=torch.long).t().contiguous(),
        edge_attr=torch.tensor(edge_features, dtype=torch.long),
    )
    data = data.to(device)
    with torch.no_grad():
        pred = model(data)
    return pred.item()

# Example prediction
smiles = "CCO"  # ethanol
prediction = predict_smiles(smiles)
print(f"Predicted HOMO-LUMO gap for {smiles}: {prediction:.4f}")
```

## 7. Summary

In this tutorial we:

- set up the Graphormer environment and installed the required dependencies
- loaded and processed the PCQM4M dataset with the ogb library
- configured a Graphormer model and implemented the fine-tuning workflow
- evaluated the model on the validation and test sets
- implemented property prediction for new molecules

As a Transformer-based graph neural network, Graphormer shows strong capability on molecular property prediction tasks. After working through this tutorial, you should have a grasp of the basic workflow of molecular property prediction with the ogb library and Graphormer.