{"id":36139,"date":"2024-11-01T09:46:03","date_gmt":"2024-11-01T09:46:03","guid":{"rendered":"http:\/\/atmokpo.com\/w\/?p=36139"},"modified":"2024-11-01T09:46:03","modified_gmt":"2024-11-01T09:46:03","slug":"hugging-face-transformers-tutorial-transferred-to-gpu","status":"publish","type":"post","link":"https:\/\/atmokpo.com\/w\/36139\/","title":{"rendered":"Hugging Face Transformers Tutorial, Transferred to GPU"},"content":{"rendered":"<p><body><\/p>\n<p>Deep learning and natural language processing (NLP) have recently gained significant attention in the field of artificial intelligence. Among them, Hugging Face provides user-friendly Transformer models that help researchers and developers easily perform NLP tasks. In this course, we will explain in detail how to use basic Transformer models utilizing the Hugging Face library and how to improve performance through GPU acceleration.<\/p>\n<h2>1. What is Hugging Face Transformers?<\/h2>\n<p>Hugging Face Transformers is a library that provides pre-trained models for various natural language processing tasks. These models can be utilized in various fields such as language understanding, text generation, translation, question-answering, and more. The Hugging Face library is designed to provide an easy-to-use API to facilitate the simple use of complex deep learning models.<\/p>\n<h2>2. Environment Setup<\/h2>\n<p>To use Hugging Face Transformers, you need to install Python and pip, and install the necessary libraries. Let&#8217;s install them using the command below.<\/p>\n<pre><code>pip install transformers torch<\/code><\/pre>\n<p>The above command installs the Transformers library and PyTorch. 
<p>Next, run the following code to check whether a GPU is available:</p>
<pre><code>
import torch

print("CUDA available:", torch.cuda.is_available())
print("Current CUDA device:", torch.cuda.get_device_name(0) if torch.cuda.is_available() else "None")
</code></pre>
<p>This prints whether CUDA is available and, if so, the name of the GPU in use.</p>
<h2>3. Loading the Model</h2>
<p>Now let's load and use a model. The <code>transformers</code> library provides many pre-trained models; here we use BERT for text classification as an example.</p>
<pre><code>
import torch
from transformers import BertTokenizer, BertForSequenceClassification

# Load the BERT tokenizer and model
tokenizer = BertTokenizer.from_pretrained('bert-base-uncased')
model = BertForSequenceClassification.from_pretrained('bert-base-uncased')

# Move the model to the GPU if one is available
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
model.to(device)
</code></pre>
<p>This loads the BERT model and tokenizer, and transfers the model to the GPU when one is available.</p>
<h2>4. Text Data Preprocessing</h2>
<p>The data must be preprocessed before it is fed to the model. Here we tokenize a sentence and build the input tensors:</p>
<pre><code>
# Input sentence
text = "Hugging Face's Transformers provide powerful natural language processing technology."

# Tokenize and convert to index tensors on the chosen device
inputs = tokenizer(text, return_tensors="pt").to(device)
</code></pre>
<p>Here <code>return_tensors="pt"</code> asks the tokenizer to return PyTorch tensors. The input is now ready to be passed to the model.</p>
<h2>5. Model Prediction</h2>
<p>Making a prediction with the model works as follows.
We pass the input tensors to the model and read the predicted class off its logits.</p>
<pre><code>
# Model prediction
with torch.no_grad():
    outputs = model(**inputs)

# Interpret the logits
logits = outputs.logits
predicted_class = logits.argmax(dim=1).item()
print("Predicted class:", predicted_class)
</code></pre>
<p>Running this prints the class the model predicts for the input sentence.</p>
<h2>6. Batch Processing of Data</h2>
<p>In real applications it is common to process many sentences at once. Here is how to run a batch of sentences through the model:</p>
<pre><code>
texts = [
    "This is the first sentence.",
    "This is the second sentence.",
    "This is the third sentence."
]

# Tokenize the batch; pad and truncate so all rows have the same length
inputs = tokenizer(texts, padding=True, truncation=True, return_tensors="pt").to(device)

# Model prediction
with torch.no_grad():
    outputs = model(**inputs)

# One predicted class per input sentence
logits = outputs.logits
predicted_classes = logits.argmax(dim=1).tolist()
print("Predicted classes:", predicted_classes)
</code></pre>
<p>Processing sentences in batches like this makes obtaining predictions much more efficient.</p>
<h2>7. Optimization and GPU Utilization</h2>
<p>When working with large datasets, it is important to use the GPU to speed up training. The following code shows a minimal training example.
In this example we use the AdamW optimizer.</p>
<pre><code>
from torch.optim import AdamW

# Optimizer setup
optimizer = AdamW(model.parameters(), lr=5e-5)

# Dummy data and labels
train_texts = ["This is a positive sentence.", "This is a negative sentence."]
train_labels = [1, 0]

# Batch the inputs and move everything to the chosen device
train_inputs = tokenizer(train_texts, padding=True, truncation=True, return_tensors="pt").to(device)
train_labels = torch.tensor(train_labels).to(device)

# Training loop
model.train()
for epoch in range(3):  # number of epochs
    optimizer.zero_grad()
    outputs = model(**train_inputs, labels=train_labels)
    loss = outputs.loss
    loss.backward()
    optimizer.step()
    print(f"Epoch {epoch + 1}, Loss: {loss.item()}")
</code></pre>
<p>This trains the model on two toy sentences and prints the loss at each epoch so you can monitor training progress.</p>
<h2>8. Saving and Loading the Model</h2>
<p>A trained model can be saved and reloaded later:</p>
<pre><code>
# Save the model and tokenizer
model.save_pretrained("./model_directory")
tokenizer.save_pretrained("./model_directory")

# Load them back
model = BertForSequenceClassification.from_pretrained("./model_directory")
tokenizer = BertTokenizer.from_pretrained("./model_directory")
model.to(device)
</code></pre>
<p>Save the model and tokenizer together, and load them whenever they are needed.</p>
<h2>9. Conclusion</h2>
<p>In this course we showed how to perform NLP tasks with the BERT model through the Hugging Face Transformers library and how to optimize performance by using the GPU. As deep learning grows in importance, the ability to use such tools and libraries becomes essential.
We hope to see further advancements in the fields of AI and NLP.</p>
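<p>As a closing aside: the <code>argmax</code> used in sections 5 and 6 discards confidence information, while a softmax over the same logits yields class probabilities. A minimal, model-free sketch (the logits values below are invented purely for illustration):</p>

```python
import torch
import torch.nn.functional as F

# Hypothetical logits for a batch of two examples and two classes.
logits = torch.tensor([[2.0, -1.0], [0.5, 1.5]])

probs = F.softmax(logits, dim=1)            # each row now sums to 1
predicted = probs.argmax(dim=1).tolist()    # same classes as logits.argmax
print(predicted)  # [0, 1]
```

<p>Because softmax is monotonic, the predicted classes are unchanged; only the interpretation gains a probability for each class.</p>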