


TRAIN. DEPLOY. BUILD AI WITH PYTORCH, LIGHTNING FAST
The platform for teams to build AI, without the headaches

Get started free



47+ MILLION Downloads
pip install lightning — 24,850
17,000+ Projects use Lightning



PyTorch Lightning Platform

Build foundation models on your data, in your cloud.

🔬
Develop
Where teams develop models and AI products without cloud headaches.
🧠
Train
Train LLMs, diffusion models, and any other model at scale, with fault tolerance (see the Trainer sketch after this list).
🚀
Deploy
Deploy high-availability, scalable models.
🗄️
Your data
Use your own data across your favorite services like S3, Snowflake, BigQuery and
more.
🔒
Your environment
Everything runs in your own cloud account, inside your private VPC.
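
A rough sketch of what scaled-out training looks like with the open-source Trainer; the accelerator, device count, strategy, and checkpoint path below are illustrative assumptions, not settings from this page:

import lightning as L

# Assumption: 8-GPU distributed data-parallel training. Fault tolerance in
# practice comes from checkpointing and resuming from the last checkpoint.
trainer = L.Trainer(accelerator="gpu", devices=8, strategy="ddp", max_epochs=10)
# trainer.fit(model, train_dataloader, ckpt_path="last")  # resume if a checkpoint exists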

Get started free

Open Source AI

Fast and minimal libraries
to train and deploy AI models

PyTorch Lightning
Train and deploy any PyTorch model including LLMs, transformers and Stable
Diffusion without the boilerplate.

Learn more



Lightning Fabric
Scale foundation models with expert-level control.

Learn more
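
A minimal sketch of a single Fabric training step; the toy model, data, and device settings here are illustrative assumptions:

import torch
import torch.nn as nn
from lightning.fabric import Fabric

model = nn.Linear(32, 2)                                  # toy model (assumption)
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)

fabric = Fabric(accelerator="auto", devices=1)            # choose hardware/strategy here
fabric.launch()
model, optimizer = fabric.setup(model, optimizer)         # wrap both for that setup

batch, target = torch.randn(8, 32), torch.randint(0, 2, (8,))
loss = nn.functional.cross_entropy(model(batch), target)
fabric.backward(loss)                                     # replaces loss.backward()
optimizer.step()
optimizer.zero_grad()

Because Fabric owns only device placement, precision, and the backward call, the training loop itself stays plain PyTorch under your control.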



TorchMetrics
90+ easy-to-use PyTorch metrics, optimized for scale.

Learn more
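
A small sketch of the metric API; the task type and class count are illustrative assumptions:

import torch
import torchmetrics

# Assumption: a 10-class classification task with random batch data.
accuracy = torchmetrics.Accuracy(task="multiclass", num_classes=10)

preds = torch.randn(64, 10).softmax(dim=-1)
target = torch.randint(0, 10, (64,))

accuracy.update(preds, target)   # metrics accumulate state across batches
print(accuracy.compute())        # and reduce it (across devices, if distributed)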



Lightning Apps
Deploy and ship full-stack AI products. Example: an auto-scaling Stable
Diffusion server.

Learn more
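
A minimal sketch of the Lightning Apps pattern; the component names and what the work does are illustrative assumptions:

import lightning as L

class Server(L.LightningWork):
    def run(self):
        # Assumption: a long-running unit of work, e.g. serving a model.
        print("serving...")

class Root(L.LightningFlow):
    def __init__(self):
        super().__init__()
        self.server = Server()

    def run(self):
        # The flow orchestrates works; each work can run on its own machine.
        self.server.run()

app = L.LightningApp(Root())

An app like this is launched from the CLI with lightning run app app.py.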



Example (PyTorch Lightning): train an autoencoder on MNIST.

import os, torch, torch.nn as nn, torch.utils.data as data, torchvision as tv
import lightning as L

encoder = nn.Sequential(nn.Linear(28 * 28, 128), nn.ReLU(), nn.Linear(128, 3))
decoder = nn.Sequential(nn.Linear(3, 128), nn.ReLU(), nn.Linear(128, 28 * 28))

class LitAutoEncoder(L.LightningModule):
    def __init__(self, encoder, decoder):
        super().__init__()
        self.encoder, self.decoder = encoder, decoder

    def training_step(self, batch, batch_idx):
        # Flatten each image, compress to 3 dims, reconstruct, and score with MSE.
        x, y = batch
        x = x.view(x.size(0), -1)
        z = self.encoder(x)
        x_hat = self.decoder(z)
        loss = nn.functional.mse_loss(x_hat, x)
        self.log("train_loss", loss)
        return loss

    def configure_optimizers(self):
        return torch.optim.Adam(self.parameters(), lr=1e-3)

dataset = tv.datasets.MNIST(".", download=True, transform=tv.transforms.ToTensor())
trainer = L.Trainer()
trainer.fit(LitAutoEncoder(encoder, decoder), data.DataLoader(dataset, batch_size=64))

Lightning powers AI across 10,000+ organizations




About: Features · Pricing · Terms of Service · Privacy Policy
Community: Forums · Discord · GitHub
Resources: AI Education · Careers · Policies
Docs: Lightning Apps · PyTorch Lightning · Fabric · TorchMetrics
