From de7568651486a4d2e45b589090d2b328e2f8f3f7 Mon Sep 17 00:00:00 2001
From: michal pitr
Date: Thu, 9 Jan 2025 05:19:44 +0100
Subject: [PATCH] Fix typos in pytorch on xla docs (#8543)

---
 docs/source/learn/pytorch-on-xla-devices.md | 4 ++--
 1 file changed, 2 insertions(+), 2 deletions(-)

diff --git a/docs/source/learn/pytorch-on-xla-devices.md b/docs/source/learn/pytorch-on-xla-devices.md
index 5bf4953a6ce..5307ee4fb77 100644
--- a/docs/source/learn/pytorch-on-xla-devices.md
+++ b/docs/source/learn/pytorch-on-xla-devices.md
@@ -186,9 +186,9 @@ doc will talk about the device independent bits of multi-host training
 and will use the TPU + PJRT runtime(currently available on 1.13 and 2.x
 releases) as an example.
 
-Before you being, please take a look at our user guide at
+Before you begin, please take a look at our user guide at
 [here](https://cloud.google.com/tpu/docs/run-calculation-pytorch) which
-will explain some Google Cloud basis like how to use `gcloud` command
+will explain some Google Cloud basics like how to use `gcloud` command
 and how to setup your project. You can also check
 [here](https://cloud.google.com/tpu/docs/how-to) for all Cloud TPU
 Howto. This doc will focus on the PyTorch/XLA perspective of the Setup.
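
For context on the passage this patch touches: the doc's multi-host discussion assumes a working PyTorch/XLA installation under the PJRT runtime. Below is a minimal sketch of running a computation on an XLA device, assuming `torch` and `torch_xla` are installed on a Cloud TPU VM and `PJRT_DEVICE=TPU` is set in the environment; the tensor shapes and variable names are illustrative only and are not part of the patch.

```python
# Minimal PyTorch/XLA sketch: run one computation on an XLA device via PJRT.
# Assumes torch and torch_xla are installed and PJRT_DEVICE=TPU is exported
# (e.g. `export PJRT_DEVICE=TPU` on a Cloud TPU VM).
import torch
import torch_xla.core.xla_model as xm

device = xm.xla_device()           # acquire the XLA device (a TPU core here)
a = torch.randn(2, 2, device=device)
b = torch.randn(2, 2, device=device)
c = a @ b                          # recorded lazily in the XLA graph
xm.mark_step()                     # flush the lazy graph and execute on device
print(c)
```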