Fix transformers examples #174

Merged (3 commits, Apr 25, 2024)
2 changes: 1 addition & 1 deletion guides/get-started/Comet_Quickstart.ipynb
@@ -426,7 +426,7 @@
},
"outputs": [],
"source": [
-"%pip install -U comet_ml torch datasets transformers scikit-learn accelerate"
+"%pip install -U comet_ml torch datasets \"transformers<4.40.0\" scikit-learn accelerate"
]
},
{
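The recurring change in this PR is pinning `transformers` below 4.40.0 in every example. As a side note on what such a bound admits, here is a minimal sketch of the version comparison for plain `X.Y.Z` version strings (real installers follow PEP 440 ordering; this toy parser ignores pre-releases and local versions):

```python
# Toy illustration of which releases a "<4.40.0" bound accepts.
# Only handles plain numeric X.Y.Z versions, unlike a real PEP 440 resolver.
def parse(version: str) -> tuple:
    return tuple(int(part) for part in version.split("."))

# Releases before the pin are accepted...
assert parse("4.39.3") < parse("4.40.0")
# ...and comparison is numeric per component, not lexicographic:
assert parse("4.9.0") < parse("4.40.0")
# The excluded release and anything later is rejected:
assert not parse("4.40.1") < parse("4.40.0")
```

Note that in a shell the specifier must be quoted (as the diff does), since a bare `<` would be interpreted as input redirection.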
@@ -88,7 +88,7 @@
"metadata": {},
"outputs": [],
"source": [
-"%pip install -r alpaca-lora/requirements.txt scipy"
+"%pip install -r alpaca-lora/requirements.txt scipy \"transformers<4.40.0\""
]
},
{
@@ -55,7 +55,7 @@
},
"outputs": [],
"source": [
-"%pip install -U comet_ml gradio altair torch torchvision transformers requests Pillow"
+"%pip install -U comet_ml gradio altair torch torchvision \"transformers<4.40.0\" requests Pillow"
]
},
{
@@ -118,7 +118,9 @@
"from PIL import Image\n",
"from torchvision import transforms\n",
"\n",
-"torch.hub.download_url_to_file(\"https://github.com/pytorch/hub/raw/master/images/dog.jpg\", \"dog.jpg\")\n",
+"torch.hub.download_url_to_file(\n",
+"    \"https://github.com/pytorch/hub/raw/master/images/dog.jpg\", \"dog.jpg\"\n",
+")\n",
"\n",
"model = torch.hub.load(\"pytorch/vision:v0.6.0\", \"resnet18\", pretrained=True).eval()\n",
"\n",
@@ -138,9 +140,7 @@
"inputs = gr.Image()\n",
"outputs = gr.Label(num_top_classes=3)\n",
"\n",
-"io = gr.Interface(\n",
-"    fn=predict, inputs=inputs, outputs=outputs, examples=[\"dog.jpg\"]\n",
-")\n",
+"io = gr.Interface(fn=predict, inputs=inputs, outputs=outputs, examples=[\"dog.jpg\"])\n",
"io.launch(inline=False, share=True)\n",
"\n",
"experiment = comet_ml.Experiment()\n",
@@ -200,7 +200,7 @@
" \"max_length\": 50,\n",
" \"temperature\": 0.7,\n",
" \"top_k\": 50,\n",
-"    \"no_repeat_ngram_size\": 2\n",
+"    \"no_repeat_ngram_size\": 2,\n",
"}\n",
"model = model.to(device)\n",
"tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)\n",
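The hunk above only adds a trailing comma after the last dictionary entry. Python permits trailing commas in multi-line literals, and adding one means a future entry touches a single diff line. A standalone sketch (the keys mirror the generation config in the notebook, but this snippet is illustrative, not from the PR):

```python
# Trailing commas are legal in Python dict/list/tuple literals; putting one
# after the last entry keeps later additions to a one-line diff.
generation_config = {
    "max_length": 50,
    "temperature": 0.7,
    "top_k": 50,
    "no_repeat_ngram_size": 2,  # trailing comma: valid and diff-friendly
}
assert generation_config["no_repeat_ngram_size"] == 2
assert len(generation_config) == 4
```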
@@ -220,17 +220,14 @@
" )\n",
" return \".\".join(output.split(\".\")[:-1]) + \".\"\n",
"\n",
"\n",
"input_text = gr.Textbox(label=\"Input Prompt\")\n",
"output_text = gr.Textbox(label=\"Generated Output\")\n",
"io = gr.Interface(\n",
" generate_text,\n",
" inputs=input_text,\n",
" outputs=output_text,\n",
-"    examples=[\n",
-"        [\n",
-"            \"The dectective looked at the room full of suspects and said, \"\n",
-"        ]\n",
-"    ],\n",
+"    examples=[[\"The dectective looked at the room full of suspects and said, \"]],\n",
")\n",
"io.launch(inline=False, share=True)\n",
"\n",
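The `generate_text` cell in this hunk trims model output to the last complete sentence via `\".\".join(output.split(\".\")[:-1]) + \".\"`. A standalone sketch of that string idiom (the helper name is hypothetical, not from the notebook):

```python
# Sketch of the sentence-trimming idiom used in generate_text:
# drop everything after the last period, then restore the period.
def trim_to_last_sentence(text: str) -> str:
    return ".".join(text.split(".")[:-1]) + "."

assert trim_to_last_sentence("One. Two. Three is cut of") == "One. Two."
# Caveat of the idiom: with no period at all, everything is dropped.
assert trim_to_last_sentence("no period here") == "."
```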
@@ -273,9 +270,7 @@
"do_lower_case = True\n",
"model_version = \"distilbert-base-uncased-distilled-squad\"\n",
"\n",
-"tokenizer = AutoTokenizer.from_pretrained(\n",
-"    model_version, do_lower_case=do_lower_case\n",
-")\n",
+"tokenizer = AutoTokenizer.from_pretrained(model_version, do_lower_case=do_lower_case)\n",
"tokenizer.pad_token = \"[PAD]\"\n",
"model = AutoModelForQuestionAnswering.from_pretrained(\n",
" model_version, output_attentions=True, pad_token_id=tokenizer.eos_token_id\n",
@@ -286,23 +281,28 @@
"\n",
"def qa_func(context, question):\n",
" prediction = qa(question=question, context=context)\n",
-"    answer = prediction['answer']\n",
+"    answer = prediction[\"answer\"]\n",
"\n",
" return answer\n",
"\n",
"\n",
"io = gr.Interface(\n",
" qa_func,\n",
" inputs=[\n",
" gr.Textbox(lines=7, label=\"Context\"),\n",
" gr.Textbox(label=\"Question\"),\n",
" ],\n",
" outputs=[gr.Textbox(label=\"Answer\")],\n",
-"    examples=[[\"\"\"A Moon landing is the arrival of a spacecraft on the surface of the Moon.\n",
+"    examples=[\n",
+"        [\n",
+"            \"\"\"A Moon landing is the arrival of a spacecraft on the surface of the Moon.\n",
" This includes both crewed and robotic missions. The first human-made object to touch the Moon was the Soviet Union's Luna 2, on 13 September 1959.\n",
" The United States' Apollo 11 was the first crewed mission to land on the Moon, on 20 July 1969. \n",
" There were six crewed U.S. landings between 1969 and 1972, and numerous uncrewed landings, with no soft landings happening between 22 August 1976 and 14 December 2013.\n",
-"    \"\"\", \"What year did the first crewed mission land on the moon?\"]\n",
-"    ]\n",
+"    \"\"\",\n",
+"            \"What year did the first crewed mission land on the moon?\",\n",
+"        ]\n",
+"    ],\n",
")\n",
"io.launch(inline=False, share=True)\n",
"\n",
@@ -42,7 +42,7 @@
},
"outputs": [],
"source": [
-"%pip install comet_ml torch transformers \"gradio>=4.0\" shap"
+"%pip install comet_ml torch \"transformers<4.40.0\" \"gradio>=4.0\" shap"
]
},
{
@@ -11,11 +11,13 @@
},
{
"cell_type": "markdown",
-"metadata": {},
+"metadata": {
+  "id": "5L-2VqFDWgGx"
+},
"source": [
-"[Hugging Face](https://huggingface.co/docs) is a community and data science platform that provides tools that enable users to build, train and deploy ML models based on open source (OS) code and technologies. Primarily known for their `transformers` library, Hugging Face has helped democratized access to these models by providing a unified API to train and evaluate a number of popular models for NLP. \n",
+"[Hugging Face](https://huggingface.co/docs) is a community and data science platform that provides tools that enable users to build, train and deploy ML models based on open source (OS) code and technologies. Primarily known for their `transformers` library, Hugging Face has helped democratized access to these models by providing a unified API to train and evaluate a number of popular models for NLP.\n",
"\n",
-"Comet integrates with Hugging Face's `Trainer` object, allowing you to log your model parameters, metrics, and assets such as model checkpoints. Learn more about our integration [here](https://www.comet.com/docs/v2/integrations/ml-frameworks/huggingface/) \n",
+"Comet integrates with Hugging Face's `Trainer` object, allowing you to log your model parameters, metrics, and assets such as model checkpoints. Learn more about our integration [here](https://www.comet.com/docs/v2/integrations/ml-frameworks/huggingface/)\n",
"\n",
"Curious about how Comet can help you build better models, faster? Find out more about [Comet](https://www.comet.com/site/products/ml-experiment-tracking/?utm_campaign=transformers&utm_medium=colab) and our [other integrations](https://www.comet.ml/docs/v2/integrations/overview/)\n",
"\n",
@@ -82,7 +84,7 @@
},
"outputs": [],
"source": [
-"PRE_TRAINED_MODEL_NAME = \"distilbert-base-uncased\"\n",
+"PRE_TRAINED_MODEL_NAME = \"distilbert/distilroberta-base\"\n",
"SEED = 42"
]
},
@@ -138,7 +140,11 @@
"outputs": [],
"source": [
"def tokenize_function(examples):\n",
-"    return tokenizer(examples[\"text\"], padding=\"max_length\", truncation=True)\n",
+"    return tokenizer(\n",
+"        examples[\"text\"],\n",
+"        padding=\"max_length\",\n",
+"        truncation=True,\n",
+"    )\n",
"\n",
"\n",
"tokenized_datasets = raw_datasets.map(tokenize_function, batched=True)"
@@ -314,19 +320,11 @@
")\n",
"trainer.train()"
]
-},
-{
-"cell_type": "code",
-"execution_count": null,
-"metadata": {},
-"outputs": [],
-"source": []
-}
],
"metadata": {
"accelerator": "GPU",
"colab": {
-"collapsed_sections": [],
"name": "Comet with Hugging Face",
"provenance": []
},
@@ -345,9 +343,9 @@
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
-"version": "3.11.3"
+"version": "3.10.12"
}
},
"nbformat": 4,
-"nbformat_minor": 1
+"nbformat_minor": 4
}
@@ -3,4 +3,4 @@ comet_ml
pandas
scikit-learn
torch
-transformers
+transformers<4.40.0