From 8c8507efb78c02c7fb43caf7a28fb3d1ca82efc7 Mon Sep 17 00:00:00 2001
From: Shorthills AI <141953346+ShorthillsAI@users.noreply.github.com>
Date: Thu, 14 Mar 2024 17:31:56 +0530
Subject: [PATCH] Update MODEL_CARD.md

---
 MODEL_CARD.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/MODEL_CARD.md b/MODEL_CARD.md
index 370807880..d58833656 100644
--- a/MODEL_CARD.md
+++ b/MODEL_CARD.md
@@ -38,7 +38,7 @@ Llama 2|*A new mix of publicly available online data*|70B|4k|✔|2.0T|1.5 x
 **Note: Developers may fine-tune Llama 2 models for languages beyond English provided they comply with the Llama 2 Community License and the Acceptable Use Policy.

 # **Hardware and Software**
-**Training Factors** We used custom training libraries, Meta's Research Super Cluster, and production clusters for pretraining. Fine-tuning, annotation, and evaluation were also performed on third-party cloud compute.
+**Training Factors** We used custom training libraries, Meta's Research Super Cluster, and production clusters for pretraining. Fine-tuning, annotation, and evaluation were also performed on third-party cloud computing.

 **Carbon Footprint** Pretraining utilized a cumulative 3.3M GPU hours of computation on hardware of type A100-80GB (TDP of 350-400W). Estimated total emissions were 539 tCO2eq, 100% of which were offset by Meta’s sustainability program.